WorldWideScience

Sample records for biologically realistic model

  1. A Model of Biological Attacks on a Realistic Population

    Science.gov (United States)

    Carley, Kathleen M.; Fridsma, Douglas; Casman, Elizabeth; Altman, Neal; Chen, Li-Chiou; Kaminsky, Boris; Nave, Demian; Yahja, Alex

    The capability to assess the impacts of large-scale biological attacks and the efficacy of containment policies is critical and requires knowledge-intensive reasoning about social response and disease transmission within a complex social system. There is a close linkage among social networks, transportation networks, disease spread, and early detection. Spatial dimensions related to public gathering places such as hospitals, nursing homes, and restaurants can play a major role in epidemics [Klovdahl et al. 2001]. Like natural epidemics, bioterrorist attacks unfold within spatially defined, complex social systems, and the societal and networked response can have profound effects on their outcome. This paper focuses on bioterrorist attacks, but the model has been applied to emergent and familiar diseases as well.
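
    The record above is narrative, but its core mechanism, disease spreading over a social contact network, can be sketched in a few lines. The following is a minimal illustration only; the contact structure, infection probability, and recovery time are hypothetical placeholders, not parameters of the model described in the paper:

```python
import random

def simulate_sir(contacts, p_infect=0.05, t_recover=5, steps=30, seed=1):
    """Discrete-time SIR epidemic on a contact network.

    contacts: dict mapping each person to a list of their contacts.
    Returns the number of people ever infected (infected or recovered).
    """
    rng = random.Random(seed)
    status = {p: "S" for p in contacts}   # everyone starts susceptible
    days_infected = {}
    patient_zero = next(iter(contacts))
    status[patient_zero] = "I"
    days_infected[patient_zero] = 0
    for _ in range(steps):
        newly = []
        for p, s in status.items():
            if s != "I":
                continue
            # each infectious contact transmits independently
            for q in contacts[p]:
                if status[q] == "S" and rng.random() < p_infect:
                    newly.append(q)
            days_infected[p] += 1
            if days_infected[p] >= t_recover:
                status[p] = "R"           # recovered and immune
        for q in newly:
            if status[q] == "S":
                status[q] = "I"
                days_infected[q] = 0
    return sum(1 for s in status.values() if s != "S")
```

    For example, on a ring of 50 people where each person meets two neighbours, `p_infect=1.0` eventually infects the whole ring, while `p_infect=0.0` leaves only the index case.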

  2. A Novel CPU/GPU Simulation Environment for Large-Scale Biologically-Realistic Neural Modeling

    Directory of Open Access Journals (Sweden)

    Roger V Hoang

    2013-10-01

    Computational neuroscience is an emerging field that provides unique opportunities to study complex brain structures through realistic neural simulations. However, as biological details are added to models, the execution time for the simulation becomes longer. Graphics processing units (GPUs) are now being utilized to accelerate simulations due to their ability to perform computations in parallel, and they have shown significant improvements in execution time compared to central processing units (CPUs). Most neural simulators utilize either multiple CPUs or a single GPU for better performance, but still show limitations in execution time when biological details are not sacrificed. Therefore, we present a novel CPU/GPU simulation environment for large-scale biological networks, the NeoCortical Simulator version 6 (NCS6). NCS6 is a free, open-source, parallelizable, and scalable simulator designed to run on clusters of multiple machines, potentially with high-performance computing devices in each of them. It has built-in leaky integrate-and-fire (LIF) and Izhikevich (IZH) neuron models, but users also have the capability to design their own plug-in interface for different neuron types as desired. NCS6 is currently able to simulate one million cells and 100 million synapses in quasi real time by distributing data across these heterogeneous clusters of CPUs and GPUs.
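
    As a sketch of what a built-in neuron model such as the Izhikevich (IZH) unit computes, here is the standard two-variable Izhikevich model integrated with forward Euler. The parameter values are the published regular-spiking defaults; nothing here is taken from the NCS6 code base:

```python
def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, t_max=1000.0, dt=0.25):
    """Single Izhikevich neuron driven by a constant current I.

    v' = 0.04 v^2 + 5 v + 140 - u + I,   u' = a (b v - u);
    on v >= 30 mV: record a spike, reset v <- c, u <- u + d.
    a, b, c, d are the published regular-spiking values.
    Returns (voltage trace, spike times in ms).
    """
    n = int(t_max / dt)
    v, u = c, b * c
    vs, spikes = [], []
    for k in range(n):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:               # spike cutoff and reset
            spikes.append(k * dt)
            v, u = c, u + d
        vs.append(min(v, 30.0))     # clip the recorded overshoot at 30 mV
    return vs, spikes
```

    With this constant drive the quadratic nullclines have no fixed point, so the cell fires repetitively; a simulator like the one described would evaluate this update for millions of cells in parallel.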

  3. Realistic Material Appearance Modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Filip, Jiří; Hatka, Martin

    2010-01-01

    Roč. 2010, č. 81 (2010), s. 13-14 ISSN 0926-4981 R&D Projects: GA ČR GA102/08/0593 Institutional research plan: CEZ:AV0Z10750506 Keywords: bidirectional texture function * texture modelling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2010/RO/haindl-realistic material appearance modelling.pdf

  4. Larval dispersal modeling of pearl oyster Pinctada margaritifera following realistic environmental and biological forcing in Ahe atoll lagoon.

    Directory of Open Access Journals (Sweden)

    Yoann Thomas

    Studying the larval dispersal of bottom-dwelling species is necessary to understand their population dynamics and optimize their management. The black-lip pearl oyster (Pinctada margaritifera) is cultured extensively to produce black pearls, especially in French Polynesia's atoll lagoons. This aquaculture relies on spat collection, a process that can be optimized by understanding which factors influence larval dispersal. Here, we investigate the sensitivity of the P. margaritifera larval dispersal kernel to both physical and biological factors in the lagoon of Ahe atoll. Specifically, using a validated 3D larval dispersal model, the variability of lagoon-scale connectivity is investigated against wind forcing, depth and location of larval release, destination location, vertical swimming behavior and pelagic larval duration (PLD) factors. The potential connectivity was spatially weighted according to both the natural and cultivated broodstock densities to provide a realistic view of connectivity. We found that the mean pattern of potential connectivity was driven by the southwest and northeast main barotropic circulation structures, with high retention levels in both. Destination locations, spawning sites and PLD were the main drivers of potential connectivity, explaining respectively 26%, 59% and 5% of the variance. Differences between potential and realistic connectivity showed the significant contribution of the pearl oyster broodstock location to its own dynamics. Realistic connectivity showed larger larval supply in the western destination locations, which are preferentially used by farmers for spat collection. In addition, larval supply in the same sectors was enhanced during summer wind conditions. These results provide new cues to understanding the dynamics of bottom-dwelling populations in atoll lagoons, and show how to take advantage of numerical models for pearl oyster management.

  5. Realistic biological approaches for improving thermoradiotherapy

    DEFF Research Database (Denmark)

    Horsman, Michael R

    2016-01-01

    There is now definitive clinical evidence that hyperthermia can successfully improve the response of certain human tumour types to radiation therapy, but there is still a need for improvement. From a biological standpoint this can be achieved by targeting either the cellular or vascular ... or radiation in preclinical models, with clear benefits in tumour response observed, but few of these methods have actually been combined with thermoradiotherapy. Furthermore, very few combinations have been tested in relevant normal tissue studies, despite the fact that it is the normal tissue response ... that controls the maximal heat or radiation treatment that can be applied. Here we review the most clinically relevant biological approaches that have been shown to enhance thermoradiotherapy, or have the potential to be applied in this context, and suggest how these should be moved forward into the clinic.

  6. Development of a realistic human airway model.

    Science.gov (United States)

    Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav

    2012-03-01

    Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper airway geometry, except for the lack of an oral cavity, has been created which proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries. Two optically transparent models, one with and one without the oral cavity, were constructed for flow velocity measurements; two realistic segmented models, one with and one without the oral cavity, were constructed for particle deposition measurements; and a semi-realistic model with glass cylindrical airways was developed for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model, and good agreement was obtained.

  7. Comparing Realistic Subthalamic Nucleus Neuron Models

    Science.gov (United States)

    Njap, Felix; Claussen, Jens C.; Moser, Andreas; Hofmann, Ulrich G.

    2011-06-01

    The mechanism of action of clinically effective electrical high frequency stimulation is still under debate. However, recent evidence points at the specific activation of GABA-ergic ion channels. Using a computational approach, we analyze temporal properties of the spike trains emitted by biologically realistic neurons of the subthalamic nucleus (STN) as a function of GABA-ergic synaptic input conductances. Our contribution is based on a model proposed by Rubin and Terman and exhibits a wide variety of different firing patterns: silent, low-spiking, moderate-spiking and intense-spiking activity. We observed that most of the cells in our network turn to silent mode when we increase the GABA-A input conductance above the threshold of 3.75 mS/cm². On the other hand, insignificant changes in firing activity are observed when the input conductance is low or close to zero. We thus reproduce Rubin's model with vanishing synaptic conductances. To quantitatively compare spike trains from the original model with the modified model at different conductance levels, we apply four different (dis)similarity measures between them. We observe that the Mahalanobis distance, Victor-Purpura metric, and interspike interval distribution are sensitive to different firing regimes, whereas mutual information appears insensitive to these functional changes.
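
    Of the (dis)similarity measures mentioned, the Victor-Purpura metric has a compact dynamic-programming form: moving a spike by Δt costs q·|Δt|, and inserting or deleting a spike costs 1. A minimal sketch (not the authors' code):

```python
def victor_purpura(s1, s2, q):
    """Victor-Purpura distance between two spike trains (lists of times).

    q is the cost per unit time of shifting a spike; insert/delete cost 1.
    Computed by edit-distance-style dynamic programming.
    """
    n, m = len(s1), len(s2)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        D[i][0] = float(i)          # delete all i spikes of s1
    for j in range(m + 1):
        D[0][j] = float(j)          # insert all j spikes of s2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(D[i - 1][j] + 1.0,
                          D[i][j - 1] + 1.0,
                          D[i - 1][j - 1] + q * abs(s1[i - 1] - s2[j - 1]))
    return D[n][m]
```

    For identical trains the distance is zero; in the limit q = 0 it reduces to the difference in spike counts, and for very large q it approaches the sum of the two counts.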

  8. Realistic molecular model of kerogen's nanostructure.

    Science.gov (United States)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit

    2016-05-01

    Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp²/sp³ hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  9. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore, there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate TMS models due to the difficulties of realistically modeling the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head with isotropic and anisotropic electrical conductivities in five different tissues of the head, and the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  10. Creating a Structurally Realistic Finite Element Geometric Model of a Cardiomyocyte to Study the Role of Cellular Architecture in Cardiomyocyte Systems Biology.

    Science.gov (United States)

    Rajagopal, Vijay; Bass, Gregory; Ghosh, Shouryadipta; Hunt, Hilary; Walker, Cameron; Hanssen, Eric; Crampin, Edmund; Soeller, Christian

    2018-04-18

    With the advent of three-dimensional (3D) imaging technologies such as electron tomography, serial-block-face scanning electron microscopy and confocal microscopy, the scientific community has unprecedented access to large datasets at sub-micrometer resolution that characterize the architectural remodeling that accompanies changes in cardiomyocyte function in health and disease. However, these datasets have been under-utilized for investigating the role of cellular architecture remodeling in cardiomyocyte function. The purpose of this protocol is to outline how to create an accurate finite element model of a cardiomyocyte using high-resolution electron microscopy and confocal microscopy images. A detailed and accurate model of cellular architecture has significant potential to provide new insights into cardiomyocyte biology, beyond what experiments alone can provide. The power of this method lies in its ability to computationally fuse information from two disparate imaging modalities of cardiomyocyte ultrastructure to develop one unified and detailed model of the cardiomyocyte. This protocol outlines steps to integrate electron tomography and confocal microscopy images of adult male Wistar (a strain of albino rat) rat cardiomyocytes to develop a half-sarcomere finite element model of the cardiomyocyte. The procedure generates a 3D finite element model that contains an accurate, high-resolution depiction (on the order of ~35 nm) of the distribution of mitochondria, myofibrils and ryanodine receptor clusters that release the necessary calcium for cardiomyocyte contraction from the sarcoplasmic reticular network (SR) into the myofibril and cytosolic compartment. The model generated here as an illustration does not incorporate details of the transverse-tubule architecture or the sarcoplasmic reticular network and is therefore a minimal model of the cardiomyocyte. Nevertheless, the model can already be applied in simulation-based investigations into the

  11. 'Semi-realistic' F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

    We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models, their requirements and tools and methods for building inflation models

  12. Biophysically realistic minimal model of dopamine neuron

    Science.gov (United States)

    Oprisan, Sorinel

    2008-03-01

    We proposed and studied a new biophysically relevant computational model of dopaminergic neurons. Midbrain dopamine neurons are involved in motivation and the control of movement, and have been implicated in various pathologies such as Parkinson's disease, schizophrenia, and drug abuse. The model we developed is a single-compartment Hodgkin-Huxley (HH)-type parallel conductance membrane model. The model captures the essential mechanisms underlying the slow oscillatory potentials and plateau potential oscillations. The main currents involved are: 1) a voltage-dependent fast calcium current, 2) a small-conductance potassium current that is modulated by the cytosolic concentration of calcium, and 3) a slow voltage-activated potassium current. We developed multidimensional bifurcation diagrams and extracted the effective domains of sustained oscillations. The model includes a calcium balance due to the fundamental importance of calcium influx, as demonstrated by simultaneous electrophysiological and calcium imaging procedures. Although there is significant evidence to suggest a partially electrogenic calcium pump, all previous models considered only non-electrogenic pumps. We investigated the effect of the electrogenic calcium pump on the bifurcation diagram of the model and compared our findings against the experimental results.
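
    The distinguishing ingredient described above, a partially electrogenic calcium pump, can be sketched schematically: an electrogenic pump term must appear both in the membrane charge balance and in the calcium balance. The fragment below is illustrative only; all parameter values and functional forms are placeholders, not those of the published model:

```python
# Schematic only: shows how a partially electrogenic calcium pump couples
# the voltage and calcium equations. The same pump term i_pump appears in
# BOTH balances: it carries net charge across the membrane and it
# extrudes cytosolic calcium. All parameter values are placeholders.

C_M   = 1.0     # membrane capacitance (uF/cm^2)
G_CA  = 0.1     # calcium conductance (mS/cm^2)
E_CA  = 100.0   # calcium reversal potential (mV)
K_PMP = 0.5     # pump rate constant
F_CYT = 0.01    # fraction of free cytosolic calcium
ALPHA = 0.02    # current-to-flux conversion factor

def step(v, ca, dt=0.01):
    """One forward-Euler step of the coupled voltage/calcium system."""
    i_ca = G_CA * (v - E_CA)      # inward Ca current (negative for v < E_CA)
    i_pump = K_PMP * ca           # electrogenic pump current
    dv = -(i_ca + i_pump) / C_M                  # pump current loads the membrane...
    dca = -F_CYT * (ALPHA * i_ca + K_PMP * ca)   # ...and removes cytosolic calcium
    return v + dt * dv, ca + dt * dca

v, ca = -60.0, 0.1
trace = []
for _ in range(1000):
    v, ca = step(v, ca)
    trace.append((v, ca))
```

    In a non-electrogenic model, i_pump would be dropped from dv but kept in dca; comparing the two variants is exactly the kind of bifurcation study the abstract describes.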

  13. Metastable cosmic strings in realistic models

    International Nuclear Information System (INIS)

    Holman, R.

    1992-01-01

    The stability of the electroweak Z-string is investigated at high temperatures. The results show that, while finite temperature corrections can improve the stability of the Z-string, their effect is not strong enough to stabilize the Z-string in the standard electroweak model. Consequently, the Z-string will be unstable even under the conditions present during the electroweak phase transition. Phenomenologically viable models based on the gauge group SU(2)_L × SU(2)_R × U(1)_(B-L) are then considered, and it is shown that metastable strings exist and are stable to small perturbations for a large region of the parameter space for these models. It is also shown that these strings are superconducting with bosonic charge carriers. The string superconductivity may be able to stabilize segments and loops against dynamical contraction. Possible implications of these strings for cosmology are discussed

  14. Physically realistic modeling of maritime training simulation

    OpenAIRE

    Cieutat, Jean-Marc

    2003-01-01

    Maritime training simulation is an important aspect of maritime education, requiring a wide range of scientific and technical skills. In this framework, where the real-time constraint has to be maintained, not all physical phenomena can be studied; only the most visually significant phenomena, relating to the natural elements and ship behaviour, are reproduced. Our swell model, based on a surface wave simulation approach, makes it possible to simulate the shape and propagation of a regular train of waves f...

  15. Realistic modeling of radiation transmission inspection systems

    International Nuclear Information System (INIS)

    Sale, K.E.

    1993-01-01

    We have applied Monte Carlo particle transport methods to assess a proposed neutron transmission inspection system for checked luggage. The geometry of the system and the time, energy and angle dependence of the source have been modeled in detail. A pulsed deuteron beam incident on a thick Be target generates a neutron pulse with a very broad energy spectrum, which is detected after passage through the luggage item by a plastic scintillator detector operating in current mode (as opposed to pulse counting mode). The measured time dependence of the neutron transmission is used to infer the densities of hydrogen, carbon, oxygen and nitrogen in the volume sampled. The measured elemental densities can be compared to signatures for explosives or other contraband. By using such computational modeling it is possible to optimize many aspects of the design of an inspection system without costly and time-consuming prototyping experiments, or to determine that a proposed scheme will not work. The methods applied here can be used to evaluate neutron or photon schemes based on transmission, scattering or reaction techniques

  16. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    International Nuclear Information System (INIS)

    Aristovich, K Y; Khan, S H

    2010-01-01

    Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data, and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data, and the material properties from Diffusion Tensor MRI (DTMRI) data. The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used in a wide range of methods of analysis, such as the finite element method (FEM), the boundary element method (BEM), Monte Carlo simulations, etc. The generic model building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

  17. Interferometric data modelling: issues in realistic data generation

    International Nuclear Information System (INIS)

    Mukherjee, Soma

    2004-01-01

    This study describes algorithms developed for modelling interferometric noise in a realistic manner, i.e. incorporating the non-stationarity that can be seen in the data from the present generation of interferometers. The noise model is based on individual component models (ICMs) with the application of auto-regressive moving average (ARMA) models. The data obtained from the model are validated by standard statistical tests, e.g. the Kolmogorov-Smirnov (KS) test and the Akaike minimum criterion. The results indicate a very good fit. The advantage of using ARMA for ICMs is that the model parameters can be controlled, and hence injection and efficiency studies can be conducted in a more controlled environment. This realistic non-stationary noise generator is intended to be integrated within the data monitoring tool framework
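
    The ICM building block described above, an ARMA process, can be generated directly from its defining recursion. A minimal sketch; the coefficients in the usage example are arbitrary, not fitted interferometer parameters:

```python
import random

def arma_noise(phi, theta, n, sigma=1.0, seed=0, burn=500):
    """Draw n samples from an ARMA(p, q) process

        x[t] = sum_i phi[i] x[t-1-i] + e[t] + sum_j theta[j] e[t-1-j],

    with e ~ N(0, sigma^2). A burn-in prefix is discarded so the
    returned samples are approximately from the stationary process.
    """
    rng = random.Random(seed)
    p, q = len(phi), len(theta)
    total = n + burn
    e = [rng.gauss(0.0, sigma) for _ in range(total)]
    x = [0.0] * total
    for t in range(total):
        ar = sum(phi[i] * x[t - 1 - i] for i in range(min(p, t)))
        ma = sum(theta[j] * e[t - 1 - j] for j in range(min(q, t)))
        x[t] = ar + e[t] + ma
    return x[burn:]
```

    For example, `arma_noise([0.5], [0.3], 2000)` yields a zero-mean ARMA(1,1) series with strong positive lag-1 autocorrelation; non-stationarity of the kind the paper models could be mimicked by varying the coefficients slowly over time.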

  18. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal/hydraulic code is the engine of the methodology, but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.

  19. Building Realistic Mobility Models for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Adrian Pullin

    2018-04-01

    A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node could act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs, with the aim of producing more realistic simulation models and thereby more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied. To
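
    The random waypoint (RWP) model criticized above is simple to state: each node repeatedly picks a uniform random destination and speed, then walks there in a straight line. A minimal trace generator (a sketch, not the ns-2 implementation, and without the common pause-time extension):

```python
import random

def random_waypoint(n_nodes, width, height, v_min, v_max,
                    duration, dt=1.0, seed=42):
    """Random waypoint mobility traces for n_nodes nodes.

    Each node picks a uniform random destination in the width x height
    world and a uniform random speed in [v_min, v_max], walks straight
    to it, then repeats. Returns {node: [(t, x, y), ...]}.
    """
    rng = random.Random(seed)
    traces = {}
    for node in range(n_nodes):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        dest = (rng.uniform(0, width), rng.uniform(0, height))
        speed = rng.uniform(v_min, v_max)
        pts, t = [], 0.0
        while t <= duration:
            pts.append((t, x, y))
            dx, dy = dest[0] - x, dest[1] - y
            dist = (dx * dx + dy * dy) ** 0.5
            if dist <= speed * dt:          # arrived: pick a new waypoint
                x, y = dest
                dest = (rng.uniform(0, width), rng.uniform(0, height))
                speed = rng.uniform(v_min, v_max)
            else:                           # advance one step toward dest
                x += speed * dt * dx / dist
                y += speed * dt * dy / dist
            t += dt
        traces[node] = pts
    return traces
```

    The paper's point is precisely that such a generator, with an arbitrary world size and node count, says little about a battlefield or the Irish Sea; trace-driven or animated models replace these synthetic waypoints with realistic ones.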

  20. Realistic modeling of chamber transport for heavy-ion fusion

    International Nuclear Information System (INIS)

    Sharp, W.M.; Grote, D.P.; Callahan, D.A.; Tabak, M.; Henestroza, E.; Yu, S.S.; Peterson, P.F.; Welch, D.R.; Rose, D.V.

    2003-01-01

    Transport of intense heavy-ion beams to an inertial-fusion target after final focus is simulated here using a realistic computer model. It is found that passing the beam through a rarefied plasma layer before it enters the fusion chamber can largely neutralize the beam space charge and lead to a usable focal spot for a range of ion species and input conditions

  1. A scan for models with realistic fermion mass patterns

    International Nuclear Information System (INIS)

    Bijnens, J.; Wetterich, C.

    1986-03-01

    We consider models which have no small Yukawa couplings unrelated to symmetry. This situation is generic in higher-dimensional unification, where Yukawa couplings are predicted to have strength similar to the gauge couplings. Generations then have to be differentiated by their symmetry properties, and the structure of the fermion mass matrices is given in terms of quantum numbers alone. We scan possible symmetries leading to realistic mass matrices. (orig.)

  2. Development of vortex model with realistic axial velocity distribution

    International Nuclear Information System (INIS)

    Ito, Kei; Ezure, Toshiki; Ohshima, Hiroyuki

    2014-01-01

    A vortex is considered one of the significant phenomena which may cause gas entrainment (GE) and/or vortex cavitation in sodium-cooled fast reactors. In our past studies, the vortex was assumed to be approximated by the well-known Burgers vortex model. However, the Burgers vortex model makes the simple but unrealistic assumption that the axial velocity component is horizontally constant, whereas in reality the free-surface vortex has an axial velocity distribution with a large radial gradient near the vortex center. In this study, a new vortex model with a realistic axial velocity distribution is proposed. This model is derived from the steady axisymmetric Navier-Stokes equation, as is the Burgers vortex model, but a realistic radial distribution of the axial velocity is considered, defined to be zero at the vortex center and to approach zero asymptotically at infinity. As verification, the new vortex model is applied to the evaluation of a simple vortex experiment, and shows good agreement with the experimental data in terms of the circumferential velocity distribution and the free surface shape. In addition, it is confirmed that the Burgers vortex model fails to calculate an accurate velocity distribution under the assumption of uniform axial velocity. However, the calculation accuracy of the Burgers vortex model can be enhanced close to that of the new vortex model by considering the effective axial velocity, calculated as the average value only in the vicinity of the vortex center. (author)
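
    For reference, the circumferential velocity of the Burgers vortex used as the baseline above has a closed form; it is the axial component, assumed horizontally constant in the Burgers model, that the new model replaces. A sketch of the circumferential profile, where gamma and r0 denote the circulation at infinity and the viscous core radius:

```python
import math

def burgers_v_theta(r, gamma, r0):
    """Circumferential velocity of a Burgers vortex at radius r.

    Near the axis the profile is solid-body rotation; far from the
    axis it tends to the free-vortex form gamma / (2*pi*r).
    Note: in the Burgers model the AXIAL velocity is independent of r,
    which is precisely the assumption the new model relaxes.
    """
    if r == 0.0:
        return 0.0
    return gamma / (2.0 * math.pi * r) * (1.0 - math.exp(-(r / r0) ** 2))
```

    The profile rises roughly linearly inside the core, peaks near r0, and decays as 1/r outside, which is the behaviour validated against the circumferential velocity measurements mentioned above.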

  3. Gauge coupling unification in realistic free-fermionic string models

    International Nuclear Information System (INIS)

    Dienes, K.R.; Faraggi, A.E.

    1995-01-01

    We discuss the unification of gauge couplings within the framework of a wide class of realistic free-fermionic string models which have appeared in the literature, including the flipped SU(5), SO(6)xSO(4), and various SU(3)xSU(2)xU(1) models. If the matter spectrum below the string scale is that of the Minimal Supersymmetric Standard Model (MSSM), then string unification is in disagreement with experiment. We therefore examine several effects that may modify the minimal string predictions. First, we develop a systematic procedure for evaluating the one-loop heavy string threshold corrections in free-fermionic string models, and we explicitly evaluate these corrections for each of the realistic models. We find that these string threshold corrections are small, and we provide general arguments explaining why such threshold corrections are suppressed in string theory. Thus heavy thresholds cannot resolve the disagreement with experiment. We also study the effect of non-standard hypercharge normalizations, light SUSY thresholds, and intermediate-scale gauge structure, and similarly conclude that these effects cannot resolve the disagreement with low-energy data. Finally, we examine the effects of additional color triplets and electroweak doublets beyond the MSSM. Although not required in ordinary grand unification scenarios, such states generically appear within the context of certain realistic free-fermionic string models. We show that if these states exist at the appropriate thresholds, then the gauge couplings will indeed unify at the string scale. Thus, within these string models, string unification can be in agreement with low-energy data. (orig.)

  4. Finite Time Blowup in a Realistic Food-Chain Model

    KAUST Repository

    Parshad, Rana; Ait Abderrahmane, Hamid; Upadhyay, Ranjit Kumar; Kumari, Nitu

    2013-01-01

    We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other known models in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in any parameter regime. We also provide estimates on its fractal dimension as well as numerical simulations to visualise the spatiotemporal chaos.
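
    The model class described above (a modified Leslie-Gower scheme with a generalist top predator) can be written down and integrated directly. The sketch below uses illustrative parameter values, not those of the paper, and forward Euler rather than a production ODE solver; with these gentle parameters the short trajectory stays bounded, while the paper shows that other parameter ranges blow up in finite time:

```python
# Illustrative parameters (placeholders, not the paper's values)
A1, B1, W0, D0 = 2.0, 0.05, 1.0, 10.0           # prey u: logistic + predation
A2, W1, D1, W2, D2 = 1.0, 2.0, 10.0, 1.0, 10.0  # intermediate predator v
C0, W3, D3 = 0.03, 1.0, 20.0                    # generalist top predator r

def rhs(u, v, r):
    """Right-hand side of the three-species food chain; the r-equation
    carries the modified Leslie-Gower term r^2 / (v + D3)."""
    du = A1 * u - B1 * u * u - W0 * u * v / (u + D0)
    dv = -A2 * v + W1 * u * v / (u + D1) - W2 * v * r / (v + D2)
    dr = C0 * r * r - W3 * r * r / (v + D3)
    return du, dv, dr

def integrate(u, v, r, t_max=5.0, dt=0.001):
    """Forward-Euler trajectory from the initial state (u, v, r)."""
    traj = [(u, v, r)]
    for _ in range(int(t_max / dt)):
        du, dv, dr = rhs(u, v, r)
        u, v, r = u + dt * du, v + dt * dv, r + dt * dr
        traj.append((u, v, r))
    return traj
```

    Note the blowup mechanism is visible in the r-equation: whenever C0 exceeds W3 / (v + D3), the top predator obeys dr/dt ≈ const · r², whose solutions escape to infinity in finite time.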

  5. Finite Time Blowup in a Realistic Food-Chain Model

    KAUST Repository

    Parshad, Rana

    2013-05-19

    We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other models known in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in every parameter regime. We also provide estimates of its fractal dimension as well as numerical simulations to visualise the spatiotemporal chaos.
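Models of this family (generalist top predator with modified Leslie-Gower terms) can be explored numerically with a fixed-step integrator. The right-hand side below is one commonly cited form of such a three-species food chain, and all parameter values are illustrative placeholders, not the regimes analyzed in the paper; for the parameters chosen here the trajectory stays bounded over the integration window.

```python
import math

def rhs(u, v, w, p):
    """One common modified Leslie-Gower three-species food chain
    (prey u, middle predator v, generalist top predator w).
    Parameters p are illustrative, not fitted values from the paper."""
    du = p['a1']*u - p['b1']*u*u - p['w0']*u*v/(u + p['d0'])
    dv = -p['a2']*v + p['w1']*u*v/(u + p['d1']) - p['w2']*v*w/(v + p['d2'])
    dw = p['c']*w*w - p['w3']*w*w/(v + p['d3'])
    return du, dv, dw

def rk4(u, v, w, p, dt, steps):
    """Fixed-step fourth-order Runge-Kutta integration of the food chain."""
    for _ in range(steps):
        k1 = rhs(u, v, w, p)
        k2 = rhs(u + dt/2*k1[0], v + dt/2*k1[1], w + dt/2*k1[2], p)
        k3 = rhs(u + dt/2*k2[0], v + dt/2*k2[1], w + dt/2*k2[2], p)
        k4 = rhs(u + dt*k3[0], v + dt*k3[1], w + dt*k3[2], p)
        u += dt/6*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
        v += dt/6*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
        w += dt/6*(k1[2] + 2*k2[2] + 2*k3[2] + k4[2])
    return u, v, w

params = dict(a1=2.0, b1=0.05, w0=1.0, d0=10.0, a2=1.0, w1=2.0, d1=10.0,
              w2=0.5, d2=10.0, c=0.01, w3=1.0, d3=20.0)
state = rk4(10.0, 5.0, 2.0, params, dt=0.01, steps=500)  # integrate to t = 5
```

The blowup discussed in the abstract corresponds to parameter choices where the quadratic growth term c w^2 of the generalist predator outruns its saturating loss term.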

  6. An inexpensive yet realistic model for teaching vasectomy

    Directory of Open Access Journals (Sweden)

    Taylor M. Coe

    2015-04-01

    Full Text Available Purpose Teaching the no-scalpel vasectomy is important, since vasectomy is a safe, simple, and cost-effective method of contraception. This minimally invasive vasectomy technique involves delivering the vas through the skin with specialized tools. This technique is associated with fewer complications than the traditional incisional vasectomy (1). One of the most challenging steps is the delivery of the vas through a small puncture in the scrotal skin, and there is a need for a realistic and inexpensive scrotal model on which beginning learners can practice this step. Materials and Methods After careful observation using several scrotal models while teaching residents and senior trainees, we developed a simplified scrotal model that uses only three components: a bicycle inner tube, latex tubing, and a Penrose drain. Results This model is remarkably realistic and allows learners to practice a challenging step in the no-scalpel vasectomy. The low cost and simple construction of the model allow wide dissemination of training in this important technique. Conclusions We propose a simple, inexpensive model that will enable learners to master the hand movements involved in delivering the vas through the skin while mitigating the risks of learning on patients.

  7. Realistic shell-model calculations for Sn isotopes

    International Nuclear Information System (INIS)

    Covello, A.; Andreozzi, F.; Coraggio, L.; Gargano, A.; Porrino, A.

    1997-01-01

    We report on a shell-model study of the Sn isotopes in which a realistic effective interaction derived from the Paris free nucleon-nucleon potential is employed. The calculations are performed within the framework of the seniority scheme by making use of the chain-calculation method. This provides practically exact solutions while cutting down the amount of computational work required by a standard seniority-truncated calculation. The behavior of the energy of several low-lying states in the isotopes with A ranging from 122 to 130 is presented and compared with the experimental one. (orig.)

  8. Electron percolation in realistic models of carbon nanotube networks

    International Nuclear Information System (INIS)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-01-01

    The influence of penetrable and curved carbon nanotubes (CNTs) on charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower aspect ratio CNTs are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNTs to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNTs in the random networks leads to an increased percolation threshold and to a decreased electrical conductivity at saturation. The waviness of CNTs decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulations in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.

  9. Electron percolation in realistic models of carbon nanotube networks

    Science.gov (United States)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-09-01

    The influence of penetrable and curved carbon nanotubes (CNTs) on charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower aspect ratio CNTs are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNTs to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNTs in the random networks leads to an increased percolation threshold and to a decreased electrical conductivity at saturation. The waviness of CNTs decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulations in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.
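The core of such a percolation study is a connectivity test over randomly placed objects. As a minimal sketch (not the authors' code), the snippet below uses penetrable spheres instead of spherocylindrical CNTs for brevity, and a union-find structure with two virtual wall nodes to test whether a cluster spans the unit cube in x:

```python
import random

def find(parent, i):
    """Path-compressed union-find root lookup."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(parent, a, b):
    parent[find(parent, a)] = find(parent, b)

def percolates(n, r, seed=0):
    """Monte-Carlo spanning test: n penetrable ('soft-shell') spheres of
    radius r in a unit cube; spheres stand in for the CNTs of the paper."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random(), rng.random()) for _ in range(n)]
    parent = list(range(n + 2))
    left, right = n, n + 1                     # virtual nodes: x=0 and x=1 walls
    for i, (x, y, z) in enumerate(pts):
        if x < r:
            union(parent, i, left)
        if x > 1.0 - r:
            union(parent, i, right)
        for j in range(i):
            dx, dy, dz = x - pts[j][0], y - pts[j][1], z - pts[j][2]
            if dx*dx + dy*dy + dz*dz < (2.0*r)**2:   # overlapping soft shells
                union(parent, i, j)
    return find(parent, left) == find(parent, right)
```

Sweeping the number density n (or the hard-core/soft-shell ratio, by shrinking the connection distance) and recording the spanning probability locates the percolation threshold.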

  10. Modeling and Analysis of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.

    2015-01-01

    An accidental fire inside a spacecraft is an unlikely, but very real emergency situation that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments and detailed numerical modeling to provide the data necessary for vehicle and safety system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large eddy simulation form of the Navier-Stokes equations. FDS provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where the detailed treatment of the mass and energy transport are unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict maximum survivable fires aboard spacecraft. This one-dimensional model simplifies the treatment of heat and mass transfer as well as toxic species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (having already been used to help in the hatch design of the Multi-Purpose Crew Vehicle, MPCV).

  11. Toward developing more realistic groundwater models using big data

    Science.gov (United States)

    Vahdat Aboueshagh, H.; Tsai, F. T. C.; Bhatta, D.; Paudel, K.

    2017-12-01

    Rich geological data is the backbone of developing realistic groundwater models for groundwater resources management. However, constructing realistic groundwater models can be challenging due to inconsistency between different sources of geological, hydrogeological and geophysical data and difficulty in processing big data to characterize the subsurface environment. This study develops a framework to utilize a big geological dataset to create a groundwater model for the Chicot Aquifer in southwestern Louisiana, which borders the Gulf of Mexico to the south. The Chicot Aquifer is the principal source of fresh water in southwest Louisiana, underlying an area of about 9,000 square miles. Agriculture is the largest groundwater consumer in this region, and overpumping has caused significant groundwater head decline and saltwater intrusion from the Gulf and deep formations. A hydrostratigraphy model was constructed using around 29,000 electrical logs and drillers' logs as well as screen lengths of pumping wells through a natural neighbor interpolation method. These sources of information carry different weights in terms of accuracy and trustworthiness. A data prioritization procedure was developed to filter untrustworthy log information, eliminate redundant data, and establish consensus among the various lithological data. The constructed hydrostratigraphy model shows 40% sand facies, which is consistent with the well log data. The hydrostratigraphy model confirms outcrop areas of the Chicot Aquifer in the north of the study region. The aquifer sand formation is thinning eastward to merge into the Atchafalaya River alluvial aquifer and coalesces with the underlying Evangeline aquifer. A grid generator was used to convert the hydrostratigraphy model into a MODFLOW grid with 57 layers. A Chicot groundwater model was constructed using the available hydrologic and hydrogeological data for 2004-2015. Pumping rates for irrigation wells were estimated using the crop type and acreage
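The record describes interpolating unequally trusted log data onto a model grid. Natural neighbor interpolation requires a Voronoi construction, so as a much simpler stand-in the sketch below uses inverse-distance weighting with an extra per-source trust factor to illustrate the idea of combining data of unequal reliability; the function, the trust factors, and the sample values are all hypothetical, not from the study:

```python
def idw(query, samples, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at `query` from (x, y, value, trust)
    tuples. IDW is a simpler stand-in for the study's natural neighbor
    interpolation; the `trust` factor mimics a data-prioritization step that
    down-weights less reliable log types."""
    num = den = 0.0
    for x, y, value, trust in samples:
        d2 = (query[0] - x)**2 + (query[1] - y)**2
        w = trust / (d2 + eps) ** (power / 2.0)   # nearer and more trusted wins
        num += w * value
        den += w
    return num / den

# Hypothetical sand-fraction picks: two trusted electrical logs (trust 1.0)
# and one less trusted driller's log (trust 0.4).
logs = [(0.0, 0.0, 0.6, 1.0), (1.0, 0.0, 0.2, 0.4), (0.0, 1.0, 0.5, 1.0)]
estimate = idw((0.1, 0.1), logs)
```

The estimate is pulled strongly toward the nearby trusted log; lowering a source's trust has the same effect as moving it farther away.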

  12. Electron distribution in polar heterojunctions within a realistic model

    Energy Technology Data Exchange (ETDEWEB)

    Tien, Nguyen Thanh, E-mail: thanhtienctu@gmail.com [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Thao, Dinh Nhu [Center for Theoretical and Computational Physics, College of Education, Hue University, 34 Le Loi Street, Hue City (Viet Nam); Thao, Pham Thi Bich [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Quang, Doan Nhat [Institute of Physics, Vietnamese Academy of Science and Technology, 10 Dao Tan Street, Hanoi (Viet Nam)

    2015-12-15

    We present a theoretical study of the electron distribution, i.e., the two-dimensional electron gas (2DEG), in polar heterojunctions (HJs) within a realistic model. The 2DEG is confined along the growth direction by a triangular quantum well with a finite potential barrier and a bent band shaped by all confinement sources. Therein, interface polarization charges play a double role: they induce a confining potential and, furthermore, they can make some change in other confinements, e.g., in the Hartree potential from ionized impurities and the 2DEG. Confinement by positive interface polarization charges is necessary for the ground state of the 2DEG to exist at a high sheet density. The 2DEG bulk density is found to be increased in the barrier, so that the scattering occurring in this layer (from interface polarization charges and alloy disorder) becomes paramount in a polar modulation-doped HJ.

  13. Realistic camera noise modeling with application to improved HDR synthesis

    Science.gov (United States)

    Goossens, Bart; Luong, Hiêp; Aelterman, Jan; Pižurica, Aleksandra; Philips, Wilfried

    2012-12-01

    Due to the ongoing miniaturization of digital camera sensors and the steady increase in the number of megapixels, individual sensor elements of the camera become more sensitive to noise, deteriorating the final image quality. To work around this problem, sophisticated processing algorithms in the devices can help to maximally exploit knowledge of the sensor characteristics (e.g., in terms of noise) and offer a better image reconstruction. Although a lot of research focuses on rather simplistic noise models, such as stationary additive white Gaussian noise, only limited attention has gone to more realistic digital camera noise models. In this article, we first present a digital camera noise model that takes several processing steps in the camera into account, such as sensor signal amplification, clipping, and post-processing. We then apply this noise model to the reconstruction problem of high dynamic range (HDR) images from a small set of low dynamic range (LDR) exposures of a static scene. In the literature, HDR reconstruction is mostly performed by computing a weighted average, in which the weights are directly related to the observed pixel intensities of the LDR images. In this work, we derive a Bayesian probabilistic formulation of a weighting function that is near-optimal in the MSE sense (or SNR sense) of the reconstructed HDR image, by assuming exponentially distributed irradiance values. We define the weighting function as the probability that the observed pixel intensity is approximately unbiased. The weighting function can be computed directly from the noise model parameters, which gives rise to different symmetric and asymmetric shapes when electronic noise or photon noise is dominant. We also explain how to deal with the case that some of the noise model parameters are unknown and how the camera response function can be estimated using the presented noise model. Finally, experimental results are provided to support our findings.
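The weighted-average HDR merge the abstract refers to is straightforward to sketch. The snippet below uses a simple "hat" weight that vanishes at the underexposure and clipping ends of the intensity range; this is a generic choice standing in for the paper's noise-model-derived Bayesian weights, and the numbers are synthetic:

```python
def merge_hdr(pixels, exposures, z_max=255.0):
    """Merge LDR samples of one pixel into a radiance estimate via a weighted
    average of z_i / t_i. The 'hat' weight z(z_max - z) is a generic stand-in
    for the paper's Bayesian weighting function: it gives low trust to values
    near underexposure (z ~ 0) and near clipping (z ~ z_max)."""
    num = den = 0.0
    for z, t in zip(pixels, exposures):
        w = z * (z_max - z)
        num += w * (z / t)
        den += w
    return num / den if den > 0 else 0.0

# Noiseless synthetic check: scene irradiance 50, four exposure times, no clipping.
times = (0.5, 1.0, 2.0, 4.0)
obs = [min(50.0 * t, 255.0) for t in times]
radiance = merge_hdr(obs, times)
```

In the noiseless, unclipped case every sample votes for the same ratio z/t, so the merge recovers the irradiance exactly; the weighting only matters once noise and clipping make the samples disagree.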

  14. Convective aggregation in idealised models and realistic equatorial cases

    Science.gov (United States)

    Holloway, Chris

    2015-04-01

    Idealised explicit convection simulations of the Met Office Unified Model are shown to exhibit spontaneous self-aggregation in radiative-convective equilibrium, as seen previously in other models in several recent studies. This self-aggregation is linked to feedbacks between radiation, surface fluxes, and convection, and the organization is intimately related to the evolution of the column water vapour (CWV) field. To investigate the relevance of this behaviour to the real world, these idealized simulations are compared with five 15-day cases of real organized convection in the tropics, including multiple simulations of each case testing sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. Despite similar large-scale forcing via lateral boundary conditions, systematic differences in mean CWV, CWV distribution shape, and the length scale of CWV features are found between the different sensitivity runs, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations.

  15. Improved transcranial magnetic stimulation coil design with realistic head modeling

    Science.gov (United States)

    Crowther, Lawrence; Hadimani, Ravi; Jiles, David

    2013-03-01

    We are investigating transcranial magnetic stimulation (TMS), a noninvasive technique based on electromagnetic induction that stimulates neurons in the brain. TMS can be used as a pain-free alternative to conventional electroconvulsive therapy (ECT), which is still widely implemented for treatment of major depression. Development of improved TMS coils capable of stimulating subcortical regions could also allow TMS to replace invasive deep brain stimulation (DBS), which requires surgical implantation of electrodes in the brain. Our new designs allow new applications of the technique to be established for a variety of diagnostic and therapeutic applications in psychiatric disorders and neurological diseases. Calculation of the fields generated inside the head is vital for the use of this method for treatment. In prior work we implemented a realistic head model, incorporating inhomogeneous tissue structures and electrical conductivities, allowing the site of neuronal activation to be accurately calculated. We will show how we utilize this model in the development of novel TMS coil designs to improve the depth of penetration and localization of stimulation produced by stimulator coils.
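The field calculation behind coil design starts from the coil's magnetic vector potential, since for a sinusoidal drive the primary induced electric field scales as -dA/dt. As a rough illustration (not the authors' solver), the sketch below evaluates the Biot-Savart line integral for a discretized circular coil in free space; a realistic head model would add the conductivity-dependent secondary field, which is omitted here, and all geometry values are arbitrary:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def vector_potential(coil_pts, point, current=1.0):
    """Magnetostatic vector potential A at `point` from a closed coil loop,
    discretized as straight segments: A = (mu0 I / 4 pi) * sum dl / |r - r'|,
    with r' taken at each segment midpoint."""
    ax = ay = az = 0.0
    n = len(coil_pts)
    for i in range(n):
        x1, y1, z1 = coil_pts[i]
        x2, y2, z2 = coil_pts[(i + 1) % n]
        mid = ((x1 + x2) / 2, (y1 + y2) / 2, (z1 + z2) / 2)
        k = MU0 * current / (4.0 * math.pi * math.dist(point, mid))
        ax += k * (x2 - x1); ay += k * (y2 - y1); az += k * (z2 - z1)
    return (ax, ay, az)

# Circular coil, radius 4 cm, in the z=0 plane, 200 segments (arbitrary sizes).
coil = [(0.04 * math.cos(2 * math.pi * i / 200),
         0.04 * math.sin(2 * math.pi * i / 200), 0.0) for i in range(200)]
# |A| under the winding at two depths: stimulation strength falls off with depth,
# which is the depth-of-penetration problem the abstract describes.
a_shallow = vector_potential(coil, (0.04, 0.0, 0.01))
a_deep = vector_potential(coil, (0.04, 0.0, 0.05))
```

Comparing candidate winding geometries by how slowly |A| (and hence the induced field) decays with depth is the essence of the coil-design trade-off between penetration and focality.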

  16. A Fibrocontractive Mechanochemical Model of Dermal Wound Closure Incorporating Realistic Growth Factor Kinetics

    KAUST Repository

    Murphy, Kelly E.

    2012-01-13

    Fibroblasts and their activated phenotype, myofibroblasts, are the primary cell types involved in the contraction associated with dermal wound healing. Recent experimental evidence indicates that the transformation from fibroblasts to myofibroblasts involves two distinct processes: The cells are stimulated to change phenotype by the combined actions of transforming growth factor β (TGFβ) and mechanical tension. This observation indicates a need for a detailed exploration of the effect of the strong interactions between the mechanical changes and growth factors in dermal wound healing. We review the experimental findings in detail and develop a model of dermal wound healing that incorporates these phenomena. Our model includes the interactions between TGFβ and collagenase, providing a more biologically realistic form for the growth factor kinetics than those included in previous mechanochemical descriptions. A comparison is made between the model predictions and experimental data on human dermal wound healing and all the essential features are well matched. © 2012 Society for Mathematical Biology.

  17. A Fibrocontractive Mechanochemical Model of Dermal Wound Closure Incorporating Realistic Growth Factor Kinetics

    KAUST Repository

    Murphy, Kelly E.; Hall, Cameron L.; Maini, Philip K.; McCue, Scott W.; McElwain, D. L. Sean

    2012-01-01

    Fibroblasts and their activated phenotype, myofibroblasts, are the primary cell types involved in the contraction associated with dermal wound healing. Recent experimental evidence indicates that the transformation from fibroblasts to myofibroblasts involves two distinct processes: The cells are stimulated to change phenotype by the combined actions of transforming growth factor β (TGFβ) and mechanical tension. This observation indicates a need for a detailed exploration of the effect of the strong interactions between the mechanical changes and growth factors in dermal wound healing. We review the experimental findings in detail and develop a model of dermal wound healing that incorporates these phenomena. Our model includes the interactions between TGFβ and collagenase, providing a more biologically realistic form for the growth factor kinetics than those included in previous mechanochemical descriptions. A comparison is made between the model predictions and experimental data on human dermal wound healing and all the essential features are well matched. © 2012 Society for Mathematical Biology.

  18. Bayesian inversion using a geologically realistic and discrete model space

    Science.gov (United States)

    Jaeggli, C.; Julien, S.; Renard, P.

    2017-12-01

    Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, there are still several major issues to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts of geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior in the underground. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specifically designed to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to resolve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient data handling of ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool, but the method is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametric systems are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.
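A McMC search over a discrete model space can be illustrated in miniature. The toy below (not the paper's algorithm) samples a 1D binary facies vector by single-cell flips with the Metropolis acceptance rule against a misfit to sparse observations; a real application would instead propose geologically realistic fields from an MPS engine, and all names and values here are hypothetical:

```python
import math
import random

def misfit(model, observations):
    """Count the observed cells that the candidate facies model disagrees with."""
    return sum(1 for idx, facies in observations if model[idx] != facies)

def metropolis_facies(n_cells, observations, n_iter=2000, temperature=0.5, seed=0):
    """Toy McMC over a discrete model space: a binary facies vector updated by
    single-cell flips, accepted with the Metropolis rule. Returns the
    best-misfit model visited."""
    rng = random.Random(seed)
    model = [rng.randint(0, 1) for _ in range(n_cells)]
    best = model[:]
    for _ in range(n_iter):
        i = rng.randrange(n_cells)
        candidate = model[:]
        candidate[i] ^= 1                      # flip one facies cell
        delta = misfit(candidate, observations) - misfit(model, observations)
        if delta <= 0 or rng.random() < math.exp(-delta / temperature):
            model = candidate
            if misfit(model, observations) < misfit(best, observations):
                best = model[:]
    return best

obs = [(0, 1), (3, 0), (7, 1)]                 # hypothetical borehole picks
best_model = metropolis_facies(10, obs, seed=1)
```

The ensemble-style efficiency the abstract claims comes precisely from avoiding this one-flip-at-a-time exploration while retaining an acceptance step.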

  19. Biochemical transport modeling, estimation, and detection in realistic environments

    Science.gov (United States)

    Ortner, Mathias; Nehorai, Arye

    2006-05-01

    Early detection and estimation of the spread of a biochemical contaminant are major issues for homeland security applications. We present an integrated approach combining the measurements given by an array of biochemical sensors with a physical model of the dispersion and statistical analysis to solve these problems and provide system performance measures. We approximate the dispersion model of the contaminant in a realistic environment through numerical simulations of reflected stochastic diffusions describing the microscopic transport phenomena due to wind and chemical diffusion, using the Feynman-Kac formula. We consider arbitrary complex geometries and account for wind turbulence. Localizing the dispersive sources is useful for decontamination purposes and estimation of the cloud evolution. To solve the associated inverse problem, we propose a Bayesian framework based on a random field that is particularly powerful for localizing multiple sources with small amounts of measurements. We also develop a sequential detector using the numerical transport model we propose. Sequential detection allows on-line analysis and detection of whether a change has occurred. We first focus on the formulation of a suitable sequential detector that overcomes the presence of unknown parameters (e.g., release time, intensity, and location). We compute a bound on the expected delay before false detection in order to decide the threshold of the test. For a fixed false-alarm rate, we obtain the detection probability of a substance release as a function of its location and initial concentration. Numerical examples are presented for two real-world scenarios: an urban area and an indoor ventilation duct.
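The on-line change detection idea can be shown with the textbook one-sided CUSUM statistic, which is a generic sequential detector, much simpler than the transport-model-based test the paper formulates; the drift, threshold, and readings below are arbitrary:

```python
def cusum(samples, drift=0.5, threshold=3.0):
    """One-sided CUSUM sequential change detector: accumulate positive
    excursions of the samples above `drift` and raise an alarm at the first
    index where the statistic exceeds `threshold`. Returns the alarm index,
    or None if no change is detected."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + x - drift)   # reset to zero while at background level
        if s > threshold:
            return i
    return None

# Noiseless illustration: background concentration 0, a release at sample 50
# raises the sensor mean to 1; the detector fires a few samples later.
readings = [0.0] * 50 + [1.0] * 20
alarm_at = cusum(readings)
```

The trade-off the abstract quantifies is exactly the one visible here: raising the threshold lengthens the detection delay but lowers the false-alarm rate on noisy background readings.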

  20. Realistic modelling of observed seismic motion in complex sedimentary basins

    International Nuclear Information System (INIS)

    Faeh, D.; Panza, G.F.

    1994-03-01

    Three applications of a numerical technique are illustrated to model realistically the seismic ground motion for complex two-dimensional structures. First we consider a sedimentary basin in the Friuli region, and we model strong motion records from an aftershock of the 1976 earthquake. Then we simulate the ground motion caused in Rome by the 1915 Fucino (Italy) earthquake, and we compare our modelling with the damage distribution observed in the town. Finally we deal with the interpretation of ground motion recorded in Mexico City as a consequence of earthquakes in the Mexican subduction zone. The synthetic signals explain the major characteristics (relative amplitudes, spectral amplification, frequency content) of the considered seismograms, and the space distribution of the available macroseismic data. For the sedimentary basin in the Friuli area, parametric studies demonstrate the relevant sensitivity of the computed ground motion to small changes in the subsurface topography of the sedimentary basin, and in the velocity and quality factor of the sediments. The total energy of ground motion, determined from our numerical simulation in Rome, is in very good agreement with the distribution of damage observed during the Fucino earthquake. For epicentral distances in the range 50km-100km, the source location, and not only the local soil conditions, controls the local effects. For Mexico City, the observed ground motion can be explained as resonance effects and as excitation of local surface waves, and the theoretical and the observed maximum spectral amplifications are very similar. In general, our numerical simulations permit the estimation of the maximum and average spectral amplification for specific sites, i.e., they are a very powerful tool for accurate micro-zonation. (author). 38 refs, 19 figs, 1 tab

  1. From Realistic to Primitive Models: A Primitive Model of Methanol

    Czech Academy of Sciences Publication Activity Database

    Vlček, Lukáš; Nezbeda, Ivo

    2003-01-01

    Roč. 101, č. 19 (2003), s. 2987-2996 ISSN 0026-8976 R&D Projects: GA AV ČR IAA4072303; GA AV ČR IAA4072309 Grant - others:NATO(XX) PST.CLG 978178/6343 Institutional research plan: CEZ:AV0Z4072921 Keywords : primitive model * methanol Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 1.591, year: 2003

  2. Neutron star models with realistic high-density equations of state

    International Nuclear Information System (INIS)

    Malone, R.C.; Johnson, M.B.; Bethe, H.A.

    1975-01-01

    We calculate neutron star models using four realistic high-density models of the equation of state. We conclude that the maximum mass of a neutron star is unlikely to exceed 2 M/sub sun/. All of the realistic models are consistent with current estimates of the moment of inertia of the Crab pulsar

  3. Models for synthetic biology.

    Science.gov (United States)

    Kaznessis, Yiannis N

    2007-11-06

    Synthetic biological engineering is emerging from biology as a distinct discipline based on quantification. The technologies propelling synthetic biology are not new, nor is the concept of designing novel biological molecules. What is new is the emphasis on system behavior. The objective is the design and construction of new biological devices and systems to deliver useful applications. Numerous synthetic gene circuits have been created in the past decade, including bistable switches, oscillators, and logic gates, and possible applications abound, including biofuels, detectors for biochemical and chemical weapons, disease diagnosis, and gene therapies. More than fifty years after the discovery of the molecular structure of DNA, molecular biology is mature enough for real quantification that is useful for biological engineering applications, similar to the revolution in modeling in chemistry in the 1950s. With the excitement that synthetic biology is generating, the engineering and biological science communities appear remarkably willing to cross disciplinary boundaries toward a common goal.

  4. Realistic full wave modeling of focal plane array pixels.

    Energy Technology Data Exchange (ETDEWEB)

    Campione, Salvatore [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Warne, Larry K. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Jorgenson, Roy E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Davids, Paul [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.; Peters, David W. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.

    2017-11-01

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.

  5. Laboratory of Biological Modeling

    Data.gov (United States)

    Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to a...

  6. Realistic edge field model code REFC for designing and study of isochronous cyclotron

    International Nuclear Information System (INIS)

    Ismail, M.

    1989-01-01

    The focussing properties and the requirements for isochronism in a cyclotron magnet configuration are well-known in the hard edge field model. The fact that they quite often change considerably in realistic fields can be attributed mainly to the influence of the edge field. A solution to this problem requires a field model which allows a simple construction of equilibrium orbits and yields simple formulae. This can be achieved by using a fitted realistic edge field (Hudson et al 1975) in the region of the pole edge, and such a field model is therefore called a realistic edge field model. A code REFC based on the realistic edge field model has been developed to design the cyclotron sectors, and the code FIELDER has been used to study the beam properties. In this report the REFC code is described along with some relevant explanation of the FIELDER code. (author). 11 refs., 6 figs

  7. Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention

    National Research Council Canada - National Science Library

    Itti, L; Dhavale, N; Pighin, F

    2003-01-01

    We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained...

  8. Development of Realistic Head Models for Electromagnetic Source Imaging of the Human Brain

    National Research Council Canada - National Science Library

    Akalin, Z

    2001-01-01

    In this work, a methodology is developed to solve the forward problem of electromagnetic source imaging using realistic head models. For this purpose, first segmentation of the 3 dimensional MR head...

  9. Simulation of photon transport in a realistic human body model

    International Nuclear Information System (INIS)

    Baccarne, V.; Turzo, A.; Bizais, Y.; Farine, M.

    1997-01-01

    A Monte-Carlo photon transport code to simulate scintigraphy is developed. Scintigraphy consists of injecting a patient with a radioactive tracer (Tc, a 140 keV photon emitter) attached to a biologically active molecule. The complicated physical phenomena (photon interactions) occurring between emission from the radioactive source and detection of the photon on the gamma camera require an accurate description. All these phenomena are very sensitive to the characteristics of human tissues, and we had to use segmented computerized tomography slices. A preliminary theoretical study of the physical characteristics (rather poorly known) of the biological tissues resulted in a two-family classification: soft and bone tissues. Using this Monte-Carlo simulator, a systematic investigation was carried out concerning the relative weight of the different types of interaction taking place in the traversed tissues. The importance of bone tissues was evidenced in comparison with the soft tissues, as well as the instability of these phenomena as a function of the patient morphology. This information is crucial in the elaboration and validation of correction techniques applied to the diagnostic images of clinical examinations
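The basic building block of such a transport code is sampling exponential free paths against a tissue attenuation coefficient. As a minimal sketch (far short of a scintigraphy simulator, which would add Compton scattering, photoelectric absorption, and the voxelized soft-tissue/bone map discussed above), the snippet below estimates unscattered transmission through a homogeneous slab; the coefficient 0.15/cm is roughly a soft-tissue value at 140 keV, used here as an assumed input:

```python
import math
import random

def transmitted_fraction(mu, thickness, n_photons=20000, seed=0):
    """Monte-Carlo estimate of unscattered transmission through a homogeneous
    slab: sample an exponential free path for each photon with linear
    attenuation coefficient mu (1/cm) and count photons whose first
    interaction lies beyond the slab thickness (cm)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_photons):
        path = -math.log(1.0 - rng.random()) / mu   # exponential free path
        if path > thickness:
            hits += 1
    return hits / n_photons

# mu * thickness ~ 1, so roughly 37% of photons escape uncollided
estimate = transmitted_fraction(mu=0.15, thickness=6.67)
```

The estimate converges to the Beer-Lambert value exp(-mu * thickness), which gives a direct sanity check before layering in the interaction physics.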

  10. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Fro...... that there is indeed a constructive role for a wide suite of ecosystem models to evaluate fishing strategies in an ecosystem context...

  11. Low-energy phenomenology of a realistic composite model

    International Nuclear Information System (INIS)

    Korpa, C.; Ryzak, Z.

    1986-01-01

The low-energy limit of the strongly coupled standard model (Abbott-Farhi composite model) is analyzed. The effects of the excited W isotriplet and isoscalar bosons are investigated and compared with experimental data. As a result, constraints on the parameters (masses, coupling constants, etc.) of these vector bosons are obtained. They are not severe enough (certain cancellations are possible) to exclude the model on an experimental basis

  12. More-Realistic Digital Modeling of a Human Body

    Science.gov (United States)

    Rogge, Renee

    2010-01-01

    A MATLAB computer program has been written to enable improved (relative to an older program) modeling of a human body for purposes of designing space suits and other hardware with which an astronaut must interact. The older program implements a kinematic model based on traditional anthropometric measurements that do provide important volume and surface information. The present program generates a three-dimensional (3D) whole-body model from 3D body-scan data. The program utilizes thin-plate spline theory to reposition the model without need for additional scans.

  13. A Simple, Realistic Stochastic Model of Gastric Emptying.

    Directory of Open Access Journals (Sweden)

    Jiraphat Yokrattanasak

Full Text Available Several models of Gastric Emptying (GE) have been employed in the past to represent the rate of delivery of stomach contents to the duodenum and jejunum. These models have all used a deterministic form (algebraic equations or ordinary differential equations), considering GE as a continuous, smooth process in time. However, GE is known to occur as a sequence of spurts, irregular both in size and in timing. Hence, we formulate a simple stochastic process model, able to represent the irregular decrements of gastric contents after a meal. The model is calibrated on existing literature data and provides consistent predictions of the observed variability in the emptying trajectories. This approach may be useful in metabolic modeling, since it describes well and explains the apparently heterogeneous GE experimental results in situations where common gastric mechanics across subjects would be expected.
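The contrast with deterministic emptying curves can be sketched with a toy stochastic process in which gastric content decreases by randomly timed, randomly sized spurts (all parameter values below are illustrative, not the calibrated values from the paper):

```python
import random

def simulate_ge(content=500.0, spurt_rate=0.5, mean_spurt=20.0,
                t_end=120.0, dt=0.5, seed=42):
    """Simulate gastric content (in grams) emptied by discrete spurts.

    Spurts arrive approximately as a Poisson process (probability
    spurt_rate * dt per time step); each spurt removes an exponentially
    distributed amount. Parameters are illustrative only.
    """
    rng = random.Random(seed)
    t, trajectory = 0.0, [(0.0, content)]
    while t < t_end and content > 0.0:
        t += dt
        if rng.random() < spurt_rate * dt:  # a spurt occurs this step
            content = max(0.0, content - rng.expovariate(1.0 / mean_spurt))
        trajectory.append((t, content))
    return trajectory

trajectory = simulate_ge()
# The trajectory is a non-increasing staircase rather than a smooth curve.
print(len(trajectory), trajectory[-1])
```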

  14. A Realistic $U(2)$ Model of Flavor arXiv

    CERN Document Server

    Linster, Matthias

    We propose a simple $U(2)$ model of flavor compatible with an $SU(5)$ GUT structure. All hierarchies in fermion masses and mixings arise from powers of two small parameters that control the $U(2)$ breaking. In contrast to previous $U(2)$ models this setup can be realized without supersymmetry and provides an excellent fit to all SM flavor observables including neutrinos. We also consider a variant of this model based on a $D_6 \\times U(1)_F$ flavor symmetry, which closely resembles the $U(2)$ structure, but allows for Majorana neutrino masses from the Weinberg operator. Remarkably, in this case one naturally obtains large mixing in the lepton sector from small mixing in the quark sector. The model also offers a natural option for addressing the Strong CP Problem and Dark Matter by identifying the Goldstone boson of the $U(1)_F$ factor as the QCD axion.

  15. Trends in hydrodesulfurization catalysis based on realistic surface models

    DEFF Research Database (Denmark)

    Moses, P.G.; Grabow, L.C.; Fernandez Sanchez, Eva

    2014-01-01

    elementary reactions in HDS of thiophene. These linear correlations are used to develop a simple kinetic model, which qualitatively describes experimental trends in activity. The kinetic model identifies the HS-binding energy as a descriptor of HDS activity. This insight contributes to understanding...... the effect of promotion and structure-activity relationships. Graphical Abstract: [Figure not available: see fulltext.] © 2014 Springer Science+Business Media New York....

  16. Model of Ni-63 battery with realistic PIN structure

    Science.gov (United States)

    Munson, Charles E.; Arif, Muhammad; Streque, Jeremy; Belahsene, Sofiane; Martinez, Anthony; Ramdane, Abderrahim; El Gmili, Youssef; Salvestrini, Jean-Paul; Voss, Paul L.; Ougazzaden, Abdallah

    2015-09-01

    GaN, with its wide bandgap of 3.4 eV, has emerged as an efficient material for designing high-efficiency betavoltaic batteries. An important part of designing efficient betavoltaic batteries involves a good understanding of the full process, from the behavior of the nuclear material and the creation of electron-hole pairs all the way through the collection of photo-generated carriers. This paper presents a detailed model based on Monte Carlo and Silvaco for a GaN-based betavoltaic battery device, modeled after Ni-63 as an energy source. The accuracy of the model is verified by comparing it with experimental values obtained for a GaN-based p-i-n structure under scanning electron microscope illumination.

  17. Model of Ni-63 battery with realistic PIN structure

    Energy Technology Data Exchange (ETDEWEB)

    Munson, Charles E.; Voss, Paul L.; Ougazzaden, Abdallah, E-mail: aougazza@georgiatech-metz.fr [Georgia Tech Lorraine, Georgia Tech-C.N.R.S., UMI2958, 2-3 rue Marconi, 57070 Metz (France); School of Electrical and Computer Engineering, Georgia Institute of Technology, 777 Atlantic Drive NW, 30332-0250 Atlanta (United States); Arif, Muhammad; Salvestrini, Jean-Paul [Georgia Tech Lorraine, Georgia Tech-C.N.R.S., UMI2958, 2-3 rue Marconi, 57070 Metz (France); Université de Lorraine, CentraleSupélec, LMOPS, EA 4423, 2 rue E. Belin, 57070 Metz (France); Streque, Jeremy; El Gmili, Youssef [Georgia Tech Lorraine, Georgia Tech-C.N.R.S., UMI2958, 2-3 rue Marconi, 57070 Metz (France); Belahsene, Sofiane; Martinez, Anthony; Ramdane, Abderrahim [Laboratory for Photonics and Nanostructures, CNRS, Route de Nozay, 91460 Marcoussis (France)

    2015-09-14

    GaN, with its wide bandgap of 3.4 eV, has emerged as an efficient material for designing high-efficiency betavoltaic batteries. An important part of designing efficient betavoltaic batteries involves a good understanding of the full process, from the behavior of the nuclear material and the creation of electron-hole pairs all the way through the collection of photo-generated carriers. This paper presents a detailed model based on Monte Carlo and Silvaco for a GaN-based betavoltaic battery device, modeled after Ni-63 as an energy source. The accuracy of the model is verified by comparing it with experimental values obtained for a GaN-based p-i-n structure under scanning electron microscope illumination.

  18. Model of Ni-63 battery with realistic PIN structure

    International Nuclear Information System (INIS)

    Munson, Charles E.; Voss, Paul L.; Ougazzaden, Abdallah; Arif, Muhammad; Salvestrini, Jean-Paul; Streque, Jeremy; El Gmili, Youssef; Belahsene, Sofiane; Martinez, Anthony; Ramdane, Abderrahim

    2015-01-01

    GaN, with its wide bandgap of 3.4 eV, has emerged as an efficient material for designing high-efficiency betavoltaic batteries. An important part of designing efficient betavoltaic batteries involves a good understanding of the full process, from the behavior of the nuclear material and the creation of electron-hole pairs all the way through the collection of photo-generated carriers. This paper presents a detailed model based on Monte Carlo and Silvaco for a GaN-based betavoltaic battery device, modeled after Ni-63 as an energy source. The accuracy of the model is verified by comparing it with experimental values obtained for a GaN-based p-i-n structure under scanning electron microscope illumination

  19. Towards a realistic composite model of quarks and leptons

    International Nuclear Information System (INIS)

    Li Xiaoyuan; Marshak, R.E.

    1985-06-01

Within the context of the 't Hooft anomaly matching scheme, some guiding principles for model building are discussed with an eye to low-energy phenomenology. It is argued that Λ_ch (the chiral-symmetry-breaking scale of the global color-flavor group G_CF) is proportional to Λ_MC (the metacolor scale), and that Λ_g_CF (the unification scale of the gauge subgroup of G_CF) ≲ Λ_ch. As illustrations of the method, two composite models are suggested that can give rise to three or four generations of ordinary quarks and leptons without exotic fermions. (orig.)

  20. The search for a realistic flipped SU(5) string model

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, J.L. (Center for Theoretical Physics, Texas A and M Univ., College Station, TX (United States) Astroparticle Physics Group, Houston Advanced Research Center (HARC), The Woodlands, TX (United States)); Nanopoulos, D.V. (Center for Theoretical Physics, Texas A and M Univ., College Station, TX (United States) Astroparticle Physics Group, Houston Advanced Research Center (HARC), The Woodlands, TX (United States)); Yuan, K. (Department of Physics and Astronomy, University of Alabama, Tuscaloosa, AL (United States))

    1993-07-05

We present an extensive search for a class of flipped SU(5) models built within the free fermionic formulation of the heterotic string. We describe a set of algorithms which constitute the basis for a computer program capable of systematically generating the massless spectrum and the superpotential of all possible models within the class we consider. Our search through the huge parameter space to be explored is simplified considerably by the constraint of N=1 spacetime supersymmetry and the need for extra Q, anti-Q representations beyond the standard ones in order to possibly achieve string gauge coupling unification at scales of O(10^18 GeV). Our results are remarkably simple and evidence the large degree of redundancy in this kind of construction. We find one model with gauge group SU(5)xU(1)_Ỹ xSO(10)_h xSU(4)_h xU(1)^5 and fairly acceptable phenomenological properties. We study the D- and F-flatness constraints and the symmetry-breaking pattern in this model and conclude that string gauge coupling unification is quite possible. (orig.)

  1. Development of realistic concrete models including scaling effects

    International Nuclear Information System (INIS)

    Carpinteri, A.

    1989-09-01

Progressive cracking in structural elements of concrete is considered. Two simple models are applied which, even though different, lead to similar predictions for the fracture behaviour. Both the Virtual Crack Propagation Model and Cohesive Limit Analysis (Section 2) show a trend towards brittle behaviour and catastrophic events for large structural sizes. A numerical Cohesive Crack Model is proposed (Section 3) to describe strain softening and strain localization in concrete. Such a model is able to predict the size effects of fracture mechanics accurately. Whereas for Mode I only untying of the finite element nodes is applied to simulate crack growth, for Mixed Mode a topological variation is required at each step (Section 4). In the case of the four-point shear specimen, the load vs. deflection diagrams reveal snap-back instability for large sizes. With increasing specimen size, such instability tends to reproduce the classical LEFM instability. Remarkable size effects are theoretically predicted and experimentally confirmed also for reinforced concrete (Section 5). The brittleness of flexural members increases with increasing size and/or decreasing steel content. On the basis of these results, the empirical code rules regarding the minimum amount of reinforcement could be considerably revised

  2. Anisotropic, nonsingular early universe model leading to a realistic cosmology

    International Nuclear Information System (INIS)

    Dechant, Pierre-Philippe; Lasenby, Anthony N.; Hobson, Michael P.

    2009-01-01

    We present a novel cosmological model in which scalar field matter in a biaxial Bianchi IX geometry leads to a nonsingular 'pancaking' solution: the hypersurface volume goes to zero instantaneously at the 'big bang', but all physical quantities, such as curvature invariants and the matter energy density remain finite, and continue smoothly through the big bang. We demonstrate that there exist geodesics extending through the big bang, but that there are also incomplete geodesics that spiral infinitely around a topologically closed spatial dimension at the big bang, rendering it, at worst, a quasiregular singularity. The model is thus reminiscent of the Taub-NUT vacuum solution in that it has biaxial Bianchi IX geometry and its evolution exhibits a dimensionality reduction at a quasiregular singularity; the two models are, however, rather different, as we will show in a future work. Here we concentrate on the cosmological implications of our model and show how the scalar field drives both isotropization and inflation, thus raising the question of whether structure on the largest scales was laid down at a time when the universe was still oblate (as also suggested by [T. S. Pereira, C. Pitrou, and J.-P. Uzan, J. Cosmol. Astropart. Phys. 9 (2007) 6.][C. Pitrou, T. S. Pereira, and J.-P. Uzan, J. Cosmol. Astropart. Phys. 4 (2008) 4.][A. Guemruekcueoglu, C. Contaldi, and M. Peloso, J. Cosmol. Astropart. Phys. 11 (2007) 005.]). We also discuss the stability of our model to small perturbations around biaxiality and draw an analogy with cosmological perturbations. We conclude by presenting a separate, bouncing solution, which generalizes the known bouncing solution in closed FRW universes.

  3. A realistic model for quantum theory with a locality property

    International Nuclear Information System (INIS)

    Eberhard, P.H.

    1987-04-01

    A model reproducing the predictions of relativistic quantum theory to any desired degree of accuracy is described in this paper. It involves quantities that are independent of the observer's knowledge, and therefore can be called real, and which are defined at each point in space, and therefore can be called local in a rudimentary sense. It involves faster-than-light, but not instantaneous, action at distance

  4. Flow visualization through particle image velocimetry in realistic model of rhesus monkey's upper airway.

    Science.gov (United States)

    Kim, Ji-Woong; Phuong, Nguyen Lu; Aramaki, Shin-Ichiro; Ito, Kazuhide

    2018-05-01

Studies concerning inhalation toxicology and respiratory drug-delivery systems require biological testing involving experiments performed on animals. Particle image velocimetry (PIV) is an effective in vitro technique that reveals detailed inhalation flow patterns, thereby assisting analyses of inhalation exposure to various substances. A realistic model of a rhesus-monkey upper airway was developed to investigate flow patterns in its oral and nasal cavities through PIV experiments performed under steady-state constant inhalation conditions at various flow rates (4, 10, and 20 L/min). The flow rate of the fluid passing through the inlet into the trachea was measured to obtain characteristic flow mechanisms, and flow phenomena in the model were confirmed via the characterized flow fields. It was observed that an increase in flow rate leads to constant velocity profiles in the upper and lower trachea regions. The results of this study are expected to contribute to future validation of studies aimed at developing in silico models, especially those involving computational fluid dynamics (CFD) analysis.

  5. Applying a realistic evaluation model to occupational safety interventions

    DEFF Research Database (Denmark)

    Pedersen, Louise Møller

    2018-01-01

Background: Recent literature characterizes occupational safety interventions as complex social activities, applied in complex and dynamic social systems. Hence, the actual outcomes of an intervention will vary, depending on the intervention, the implementation process, context, personal characteristics...... and qualitative methods. This revised model has, however, not been applied in a real-life context. Method: The model is applied in a controlled, four-component, integrated behaviour-based and safety culture-based safety intervention study (2008-2010) in a medium-sized wood manufacturing company. The interventions involve the company’s safety committee, safety manager, safety groups and 130 workers. Results: The model provides a framework for more valid evidence of what works within injury prevention. Affective commitment and role behaviour among key actors are identified as crucial for the implementation

  6. D-term Spectroscopy in Realistic Heterotic-String Models

    CERN Document Server

    Dedes, Athanasios

    2000-01-01

The emergence of free fermionic string models with solely the MSSM charged spectrum below the string scale provides further evidence for the assertion that the true string vacuum is connected to the Z_2 x Z_2 orbifold in the vicinity of the free fermionic point in the Narain moduli space. An important property of the Z_2 x Z_2 orbifold is the cyclic permutation symmetry between the three twisted sectors. If preserved in the three-generation models, the cyclic permutation symmetry results in a family-universal anomalous U(1)_A, which is instrumental in explaining squark degeneracy, provided that the dominant component of supersymmetry breaking arises from the U(1)_A D-term. Interestingly, the contribution of the family-universal D_A-term to the squark masses may be intra-family non-universal, and may differ from the usual (universal) boundary conditions assumed in the MSSM. We contemplate how D_A-term spectroscopy may be instrumental in studying superstring models irrespective of our ignorance of the details ...

  7. Realistic Modeling of Seismic Wave Ground Motion in Beijing City

    Science.gov (United States)

    Ding, Z.; Romanelli, F.; Chen, Y. T.; Panza, G. F.

    Algorithms for the calculation of synthetic seismograms in laterally heterogeneous anelastic media have been applied to model the ground motion in Beijing City. The synthetic signals are compared with the few available seismic recordings (1998, Zhangbei earthquake) and with the distribution of observed macroseismic intensity (1976, Tangshan earthquake). The synthetic three-component seismograms have been computed for the Xiji area and Beijing City. The numerical results show that the thick Tertiary and Quaternary sediments are responsible for the severe amplification of the seismic ground motion. Such a result is well correlated with the abnormally high macroseismic intensity zone in the Xiji area associated with the 1976 Tangshan earthquake as well as with the ground motion recorded in Beijing city in the wake of the 1998 Zhangbei earthquake.

  8. Realistic modeling of seismic wave ground motion in Beijing City

    International Nuclear Information System (INIS)

    Ding, Z.; Chen, Y.T.; Romanelli, F.; Panza, G.F.

    2002-05-01

Advanced algorithms for the calculation of synthetic seismograms in laterally heterogeneous anelastic media have been applied to model the ground motion in Beijing City. The synthetic signals are compared with the few available seismic recordings (1998, Zhangbei earthquake) and with the distribution of the observed macroseismic intensity (1976, Tangshan earthquake). The synthetic 3-component seismograms have been computed in the Xiji area and in Beijing town. The numerical results show that the thick Tertiary and Quaternary sediments are responsible for the severe amplification of the seismic ground motion. Such a result is well correlated with the abnormally high macroseismic intensity zone (Xiji area) associated with the 1976 Tangshan earthquake and with the records in Beijing town associated with the 1998 Zhangbei earthquake. (author)

  9. Toward the M(F)--Theory Embedding of Realistic Free-Fermion Models

    CERN Document Server

    Berglund, P; Faraggi, A E; Nanopoulos, Dimitri V; Qiu, Z; Berglund, Per; Ellis, John; Faraggi, Alon E.; Qiu, Zongan

    1998-01-01

We construct a Landau-Ginzburg model with the same data and symmetries as a $Z_2\times Z_2$ orbifold that corresponds to a class of realistic free-fermion models. Within the class of interest, we show that this orbifolding connects different $Z_2\times Z_2$ orbifold models and commutes with the mirror symmetry. Our work suggests that duality symmetries previously discussed in the context of specific $M$ and $F$ theory compactifications may be extended to the special $Z_2\times Z_2$ orbifold that characterizes realistic free-fermion models.

  10. On the Realistic Stochastic Model of GPS Observables: Implementation and Performance

    Science.gov (United States)

    Zangeneh-Nejad, F.; Amiri-Simkooei, A. R.; Sharifi, M. A.; Asgari, J.

    2015-12-01

High-precision GPS positioning requires a realistic stochastic model of the observables. A realistic GPS stochastic model should take into account different variances for different observation types, correlations among different observables, the satellite-elevation dependence of the observables' precision, and the temporal correlation of observables. Least-squares variance component estimation (LS-VCE) is applied to GPS observables using the geometry-based observation model (GBOM). To model the satellite-elevation dependence of the GPS observables' precision, an exponential model depending on the elevation angles of the satellites is also employed. Temporal correlation of the GPS observables is modelled using a first-order autoregressive noise model. An important step in high-precision GPS positioning is double-difference integer ambiguity resolution (IAR). The fraction or percentage of successes among a number of integer ambiguity fixings is called the success rate. A realistic estimate of the GNSS observables' covariance matrix plays an important role in IAR. We consider the ambiguity resolution success rate for two cases, namely a nominal and a realistic stochastic model of the GPS observables, using two GPS data sets collected by the Trimble R8 receiver. The results confirm that applying a more realistic stochastic model can significantly improve the IAR success rate on individual frequencies, either on L1 or on L2; an improvement of 20% was achieved in the empirical success rate. The results also indicate that introducing the realistic stochastic model leads to a larger standard deviation for the baseline components, by a factor of about 2.6, on the data sets considered.
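Two of the ingredients named above, an elevation-dependent variance and first-order autoregressive temporal correlation, can be sketched as follows (the numeric parameters are hypothetical illustrations, not the values estimated by LS-VCE in the study):

```python
import math

def elevation_variance(elev_deg, sigma0=0.003, k=8.0, e0=20.0):
    """Exponential elevation-dependent variance of an observable (m^2).

    Low-elevation satellites get inflated noise; sigma0, k and e0 are
    hypothetical illustration values, not LS-VCE estimates.
    """
    return (sigma0 * (1.0 + k * math.exp(-elev_deg / e0))) ** 2

def ar1_covariance(n_epochs, variance, phi=0.7):
    """Covariance matrix of a first-order autoregressive noise process:
    C[i][j] = variance * phi**|i - j| (phi is illustrative)."""
    return [[variance * phi ** abs(i - j) for j in range(n_epochs)]
            for i in range(n_epochs)]

v10, v80 = elevation_variance(10.0), elevation_variance(80.0)
C = ar1_covariance(4, v10)
print(v10 > v80)           # a low-elevation observable is noisier
print(C[0][1] == C[1][0])  # the temporal covariance matrix is symmetric
```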

  11. Workshop Introduction: Systems Biology and Biological Models

    Science.gov (United States)

    As we consider the future of toxicity testing, the importance of applying biological models to this problem is clear. Modeling efforts exist along a continuum with respect to the level of organization (e.g. cell, tissue, organism) linked to the resolution of the model. Generally,...

  12. Simulation of size-dependent aerosol deposition in a realistic model of the upper human airways

    NARCIS (Netherlands)

    Frederix, E.M.A.; Kuczaj, Arkadiusz K.; Nordlund, Markus; Belka, M.; Lizal, F.; Elcner, J.; Jicha, M.; Geurts, Bernardus J.

    An Eulerian internally mixed aerosol model is used for predictions of deposition inside a realistic cast of the human upper airways. The model, formulated in the multi-species and compressible framework, is solved using the sectional discretization of the droplet size distribution function to

  13. Plasticity-modulated seizure dynamics for seizure termination in realistic neuronal models

    NARCIS (Netherlands)

    Koppert, M.M.J.; Kalitzin, S.; Lopes da Silva, F.H.; Viergever, M.A.

    2011-01-01

    In previous studies we showed that autonomous absence seizure generation and termination can be explained by realistic neuronal models eliciting bi-stable dynamics. In these models epileptic seizures are triggered either by external stimuli (reflex epilepsies) or by internal fluctuations. This
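The bi-stable mechanism can be illustrated by a minimal double-well toy model, in which noise occasionally drives the state across the barrier between a 'normal' and a 'seizure-like' attractor (a schematic sketch, not the realistic neuronal models of the study):

```python
import math
import random

def count_transitions(steps=20_000, dt=0.01, noise=0.8, seed=3):
    """Euler-Maruyama simulation of dx = (x - x**3) dt + noise dW.

    The drift has stable fixed points at x = -1 and x = +1 separated by
    a barrier at x = 0; noise-driven barrier crossings play the role of
    internally triggered seizure onsets and terminations.
    """
    rng = random.Random(seed)
    x, side, transitions = -1.0, -1, 0
    for _ in range(steps):
        x += (x - x ** 3) * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if x * side < 0:  # the state crossed the barrier
            transitions += 1
            side = -side
    return transitions

print(count_transitions())  # spontaneous switches between the two states
```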

  14. Correcting electrode modelling errors in EIT on realistic 3D head models.

    Science.gov (United States)

    Jehl, Markus; Avery, James; Malone, Emma; Holder, David; Betcke, Timo

    2015-12-01

Electrical impedance tomography (EIT) is a promising medical imaging technique which could aid differentiation of haemorrhagic from ischaemic stroke in an ambulance. One challenge in EIT is the ill-posed nature of the image reconstruction, i.e., that small measurement or modelling errors can result in large image artefacts. It is therefore important that reconstruction algorithms are improved with regard to stability to modelling errors. We identify that wrongly modelled electrode positions constitute one of the biggest sources of image artefacts in head EIT. Therefore, the use of the Fréchet derivative on the electrode boundaries in a realistic three-dimensional head model is investigated, in order to reconstruct electrode movements simultaneously with conductivity changes. We show a fast implementation and analyse the performance of electrode position reconstructions in time-difference and absolute imaging for simulated and experimental voltages. Reconstructing the electrode positions and conductivities simultaneously increased the image quality significantly in the presence of electrode movement.
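The idea of reconstructing electrode movements simultaneously with conductivity changes can be sketched as a linearized joint inversion: the Jacobian with respect to conductivity is augmented with the derivative with respect to electrode positions, and both parameter sets are solved for together. Below is a toy example with random matrices standing in for the real EIT Jacobians, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

n_meas, n_cond, n_elec = 12, 5, 4  # toy problem sizes (hypothetical)
J_sigma = rng.normal(size=(n_meas, n_cond))  # stand-in conductivity Jacobian
J_elec = rng.normal(size=(n_meas, n_elec))   # stand-in electrode-position derivative
J = np.hstack([J_sigma, J_elec])             # augmented Jacobian

# Simulate a voltage change caused by both conductivity and electrode shifts.
x_true = np.concatenate([rng.normal(size=n_cond), 0.01 * rng.normal(size=n_elec)])
dv = J @ x_true + 1e-3 * rng.normal(size=n_meas)

# Tikhonov-regularized joint update for [d_conductivity; d_electrode].
lam = 1e-3
x_hat = np.linalg.solve(J.T @ J + lam * np.eye(n_cond + n_elec), J.T @ dv)
d_sigma, d_elec = x_hat[:n_cond], x_hat[n_cond:]
print(np.max(np.abs(d_sigma - x_true[:n_cond])))  # small recovery error
```

The design point is that solving for both unknowns in one regularized system prevents electrode movement from being misattributed to conductivity change, which is the source of the artefacts the paper describes.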

  15. A realistic neural mass model of the cortex with laminar-specific connections and synaptic plasticity - evaluation with auditory habituation.

    Directory of Open Access Journals (Sweden)

    Peng Wang

Full Text Available In this work we propose a biologically realistic local cortical circuit model (LCCM), based on neural masses, that incorporates important aspects of the functional organization of the brain that have not been covered by previous models: (1) activity-dependent plasticity of excitatory synaptic couplings via depletion and recycling of neurotransmitters and (2) realistic inter-laminar dynamics via laminar-specific distribution of, and connections between, neural populations. The potential of the LCCM was demonstrated by accounting for the process of auditory habituation. The model parameters were specified using Bayesian inference. It was found that: (1) besides the major serial excitatory information pathway (layer 4 to layer 2/3 to layer 5/6), there exists a parallel "short-cut" pathway (layer 4 to layer 5/6); (2) the excitatory signal flow from the pyramidal cells to the inhibitory interneurons seems to be mostly intra-laminar while, in contrast, the inhibitory signal flow from inhibitory interneurons to the pyramidal cells seems to be both intra- and inter-laminar; and (3) the habituation rates of the connections are asymmetrical: forward connections (from layer 4 to layer 2/3) are more strongly habituated than backward connections (from layer 5/6 to layer 4). Our evaluation demonstrates that the novel features of the LCCM are of crucial importance for mechanistic explanations of brain function. The incorporation of these features into a mass model makes them applicable to modeling based on macroscopic data (like EEG or MEG), which are usually available in human experiments. Our LCCM is therefore a valuable building block for future realistic models of human cognitive function.

  16. Realistic Gamow shell model for resonance and continuum in atomic nuclei

    Science.gov (United States)

    Xu, F. R.; Sun, Z. H.; Wu, Q.; Hu, B. S.; Dai, S. J.

    2018-02-01

The Gamow shell model can describe resonance and continuum states in atomic nuclei. The model is established in the complex-momentum (complex-k) plane of the Berggren coordinates, in which bound, resonant and continuum states are treated on an equal footing self-consistently. In the present work, the realistic nuclear force CD-Bonn has been used. We have developed the full \hat{Q}-box folded-diagram method to derive the realistic effective interaction in the model space, which is nondegenerate and contains resonance and continuum channels. The CD-Bonn potential is renormalized using the V_low-k method. Choosing 16O as the inert core, we have applied the Gamow shell model to the oxygen isotopes.

  17. Modeling of ultrasonic wave propagation in composite laminates with realistic discontinuity representation.

    Science.gov (United States)

    Zelenyak, Andreea-Manuela; Schorer, Nora; Sause, Markus G R

    2018-02-01

This paper presents a method for embedding realistic defect geometries of a fiber-reinforced material in a finite element modeling environment in order to simulate active ultrasonic inspection. When ultrasonic inspection is used experimentally to investigate the presence of defects in composite materials, the microscopic defect geometry may cause signal characteristics that are difficult to interpret. Hence, modeling this interaction is key to improving our understanding and interpretation of the acquired ultrasonic signals. To model the true interaction of the ultrasonic wave field with defect structures such as pores, cracks or delaminations, a realistic three-dimensional geometry reconstruction is required. We present a 3D-image-based reconstruction process which converts computed tomography data into adequate surface representations ready to be embedded for processing with finite element methods. Subsequent modeling using these geometries applies a multi-scale and multi-physics simulation approach which results in quantitative A-scan ultrasonic signals that can be directly compared with experimental signals. Therefore, besides the properties of the composite material, a full transducer implementation, piezoelectric conversion and simultaneous modeling of the attached circuit are applied. Comparison between simulated and experimental signals shows very good agreement in electrical voltage amplitude and signal arrival time, and thus validates the proposed modeling approach. Simulating ultrasound wave propagation in a medium with a realistic geometry clearly shows a difference in how the disturbance of the waves takes place and finally allows more realistic modeling of A-scans.

  18. Electromagnetic forward modelling for realistic Earth models using unstructured tetrahedral meshes and a meshfree approach

    Science.gov (United States)

    Farquharson, C.; Long, J.; Lu, X.; Lelievre, P. G.

    2017-12-01

    makes the process of building a geophysical Earth model from a geological model much simpler. In this presentation we will explore the issues that arise when working with realistic Earth models and when synthesizing geophysical electromagnetic data for them. We briefly consider meshfree methods as a possible means of alleviating some of these issues.

  19. Integrative computational models of cardiac arrhythmias -- simulating the structurally realistic heart

    Science.gov (United States)

    Trayanova, Natalia A; Tice, Brock M

    2009-01-01

    Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585

  20. A realistic extension of gauge-mediated SUSY-breaking model with superconformal hidden sector

    International Nuclear Information System (INIS)

    Asano, Masaki; Hisano, Junji; Okada, Takashi; Sugiyama, Shohei

    2009-01-01

    The sequestering of supersymmetry (SUSY) breaking parameters, which is induced by the superconformal hidden sector, is one of the solutions to the μ/Bμ problem in the gauge-mediated SUSY-breaking scenario. However, it is found that the minimal messenger model does not yield the correct electroweak symmetry breaking. In this Letter we present a model which has the coupling of the messengers with the SO(10) GUT-symmetry breaking Higgs fields. The model is one of the realistic extensions of the gauge mediation model with a superconformal hidden sector. It is shown that the extension is applicable for a broad range of conformality breaking scales.

  1. Evolutionary approaches for the reverse-engineering of gene regulatory networks: A study on a biologically realistic dataset

    Directory of Open Access Journals (Sweden)

    Gidrol Xavier

    2008-02-01

    Background: Inferring gene regulatory networks from data requires the development of algorithms devoted to structure extraction. When only static data are available, gene interactions may be modelled by a Bayesian network (BN) that represents the presence of direct interactions from regulators to regulees by conditional probability distributions. We used enhanced evolutionary algorithms to stochastically evolve a set of candidate BN structures and found the model that best fits the data without prior knowledge. Results: We proposed various evolutionary strategies suitable for the task and tested our choices using simulated data drawn from a given bio-realistic network of 35 nodes, the so-called insulin network, which has been used in the literature for benchmarking. We assessed the inferred models against this reference to obtain statistical performance results. We then compared the performance of evolutionary algorithms using two kinds of recombination operators that operate at different scales in the graphs. We introduced a niching strategy that reinforces diversity across the population and avoids trapping the algorithm in a local minimum in the early steps of learning. We show the limited effect of the mutation operator when niching is applied. Finally, we compared our best evolutionary approach with various well-known learning algorithms (MCMC, K2, greedy search, TPDA, MMHC) devoted to BN structure learning. Conclusion: We studied the behaviour of an evolutionary approach enhanced by niching for learning gene regulatory networks with BNs. We show that this approach outperforms classical structure learning methods in recovering the original model. These results were obtained for the learning of a bio-realistic network and, more importantly, on various small datasets. This is a suitable approach for learning transcriptional regulatory networks from real datasets without prior knowledge.
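
    The evolutionary idea can be caricatured in a few lines. The sketch below is a deliberately reduced toy, not the authors' algorithm: it uses a hypothetical 4-gene binary network instead of the 35-node insulin network, mutation-only search without the recombination and niching operators discussed above, and a plain BIC score.

```python
import itertools, math, random
random.seed(0)

N, SAMPLES = 4, 300

def simulate(n):
    # Hypothetical ground truth: gene0 activates gene1 and gene2; gene3 needs both.
    data = []
    for _ in range(n):
        g0 = random.random() < 0.5
        g1 = random.random() < (0.9 if g0 else 0.1)
        g2 = random.random() < (0.8 if g0 else 0.2)
        g3 = random.random() < (0.9 if (g1 and g2) else 0.1)
        data.append((g0, g1, g2, g3))
    return data

DATA = simulate(SAMPLES)

def is_acyclic(adj):
    # Kahn's algorithm: a DAG can be fully peeled off by in-degree elimination.
    indeg = [sum(adj[i][j] for i in range(N)) for j in range(N)]
    stack, seen = [j for j in range(N) if indeg[j] == 0], 0
    while stack:
        u = stack.pop()
        seen += 1
        for v in range(N):
            if adj[u][v]:
                indeg[v] -= 1
                if indeg[v] == 0:
                    stack.append(v)
    return seen == N

def bic_score(adj):
    # Log-likelihood of each node given its parents, minus a BIC penalty.
    score = 0.0
    for node in range(N):
        parents = [p for p in range(N) if adj[p][node]]
        for pa in itertools.product((False, True), repeat=len(parents)):
            rows = [r for r in DATA if all(r[p] == v for p, v in zip(parents, pa))]
            if not rows:
                continue
            k1 = sum(r[node] for r in rows)
            for c in (k1, len(rows) - k1):
                if c:
                    score += c * math.log(c / len(rows))
        score -= 0.5 * math.log(SAMPLES) * 2 ** len(parents)
    return score

def mutate(adj):
    # Flip one random edge, retrying until the child is still acyclic.
    child = [row[:] for row in adj]
    while True:
        i, j = random.sample(range(N), 2)
        child[i][j] ^= 1
        if is_acyclic(child):
            return child
        child[i][j] ^= 1

pop = [[[0] * N for _ in range(N)] for _ in range(16)]   # start from empty DAGs
for _ in range(30):
    pop.sort(key=bic_score, reverse=True)
    pop = pop[:8] + [mutate(random.choice(pop[:8])) for _ in range(8)]

best = max(pop, key=bic_score)
print("recovered edges:", [(i, j) for i in range(N) for j in range(N) if best[i][j]])
```

    Because several DAGs can encode the same set of conditional independencies, recovered edges should be judged up to Markov equivalence, which is one reason structural evaluation against a reference network is used for benchmarking.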

  2. Issues in Biological Shape Modelling

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen

    This talk reflects parts of the current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations, modifications, and applications of the elements of constructing models of shape or appearance.

  3. Simplified realistic human head model for simulating Tumor Treating Fields (TTFields).

    Science.gov (United States)

    Wenger, Cornelia; Bomzon, Ze'ev; Salvador, Ricardo; Basser, Peter J; Miranda, Pedro C

    2016-08-01

    Tumor Treating Fields (TTFields) are alternating electric fields in the intermediate frequency range (100-300 kHz) and of low intensity (1-3 V/cm). TTFields are an anti-mitotic treatment against solid tumors, approved for Glioblastoma Multiforme (GBM) patients. These electric fields are induced non-invasively by transducer arrays placed directly on the patient's scalp. Cell culture experiments showed that treatment efficacy is dependent on the induced field intensity. In clinical practice, a software package called NovoTal™ uses head measurements to estimate the optimal array placement that maximizes the electric field delivery to the tumor. Computational studies predict an increase in the tumor's electric field strength when transducer arrays are adapted to its location. Ideally, a personalized head model could be created for each patient, to calculate the electric field distribution for the specific situation. Thus, the optimal transducer layout could be inferred from field calculation rather than from distance measurements. Nonetheless, creating realistic head models of patients is time-consuming and often needs user interaction, because automated image segmentation is prone to failure. This study presents a first approach to creating simplified head models consisting of convex hulls of the tissue layers. The model is able to account for anisotropic conductivity in the cortical tissues by using a tensor representation estimated from Diffusion Tensor Imaging. The induced electric field distribution is compared in the simplified and realistic head models. The average field intensities in the brain and tumor are generally slightly higher in the realistic head model, with a maximal ratio of 114% for a simplified model with reasonable layer thicknesses. Thus, the present pipeline is a fast and efficient means towards personalized head models, with less complexity involved in characterizing tissue interfaces, while enabling accurate predictions of electric field distribution.

  4. The effect of a realistic thermal diffusivity on numerical model of a subducting slab

    Science.gov (United States)

    Maierova, P.; Steinle-Neumann, G.; Cadek, O.

    2010-12-01

    A number of numerical studies of subducting slabs assume simplified (constant or only depth-dependent) models of thermal conductivity. The available mineral physics data indicate, however, that thermal diffusivity is strongly temperature- and pressure-dependent and may also vary among different mantle materials. In the present study, we examine the influence of realistic thermal properties of mantle materials on the thermal state of the upper mantle and the dynamics of subducting slabs. On the basis of the data published in the mineral physics literature we compile analytical relationships that approximate the pressure and temperature dependence of thermal diffusivity for major mineral phases of the mantle (olivine, wadsleyite, ringwoodite, garnet, clinopyroxenes, stishovite and perovskite). We propose a simplified composition of mineral assemblages predominating in the subducting slab and the surrounding mantle (pyrolite, mid-ocean ridge basalt, harzburgite) and we estimate their thermal diffusivity using the Hashin-Shtrikman bounds. The resulting complex formula for the diffusivity of each aggregate is then approximated by a simpler analytical relationship that is used in our numerical model as an input parameter. For the numerical modeling we use the Elmer software (open source finite element software for multiphysical problems, see http://www.csc.fi/english/pages/elmer). We set up a 2D Cartesian thermo-mechanical steady-state model of a subducting slab. The model is partly kinematic as the flow is driven by a boundary condition on velocity that is prescribed on the top of the subducting lithospheric plate. The rheology of the material is non-linear and is coupled with the thermal equation. Using the realistic relationship for the thermal diffusivity of mantle materials, we compute the thermal and flow fields for different input velocities and ages of the subducting plate, and we compare the results against models assuming a constant thermal diffusivity.
The importance of the
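
    For an isotropic two-phase aggregate, the Hashin-Shtrikman step mentioned above reduces to a closed-form pair of bounds on the effective conductivity-like property. The sketch below uses invented conductivities and volume fractions purely for illustration; the study's actual inputs are pressure- and temperature-dependent mineral data.

```python
def hashin_shtrikman(k1, f1, k2, f2):
    """Hashin-Shtrikman bounds on the effective conductivity of an isotropic
    two-phase aggregate. Requires k1 <= k2 and f1 + f2 = 1."""
    lower = k1 + f2 / (1.0 / (k2 - k1) + f1 / (3.0 * k1))
    upper = k2 + f1 / (1.0 / (k1 - k2) + f2 / (3.0 * k2))
    return lower, upper

# Illustrative, assumed values (W m^-1 K^-1), e.g. a stiffer phase dispersed
# in a softer matrix -- not the paper's mineral data.
lo, hi = hashin_shtrikman(k1=3.0, f1=0.6, k2=5.0, f2=0.4)
print(f"effective conductivity bounded between {lo:.3f} and {hi:.3f}")
```

    An effective thermal diffusivity then follows by dividing the bounded conductivity by the aggregate's volumetric heat capacity, κ = k/(ρc_p); note how tightly the two bounds bracket the effective value for moderate phase contrast.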

  5. Characterization of photomultiplier tubes with a realistic model through GPU-boosted simulation

    Science.gov (United States)

    Anthony, M.; Aprile, E.; Grandi, L.; Lin, Q.; Saldanha, R.

    2018-02-01

    The accurate characterization of a photomultiplier tube (PMT) is crucial in a wide variety of applications. However, current methods do not give fully accurate representations of the response of a PMT, especially at very low light levels. In this work, we present a new and more realistic model of the response of a PMT, called the cascade model, and use it to characterize two different PMTs at various voltages and light levels. The cascade model is shown to outperform the more common Gaussian model in almost all circumstances and to agree well with a newly introduced model-independent approach. The technical and computational challenges of this model are also presented, along with the employed solution of developing a robust GPU-based analysis framework for this and other non-analytical models.
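
    A minimal Monte Carlo sketch of the cascade idea (our own toy with assumed light level and dynode gains; the paper's actual model and GPU fitting procedure are far more involved): photoelectrons are Poisson-distributed, and each electron is multiplied through successive stages whose gains are themselves Poisson-distributed, so the output charge distribution is strongly over-dispersed relative to a single Gaussian.

```python
import math, random
random.seed(1)

MU = 2.0                  # mean photoelectrons per pulse (assumed light level)
GAINS = [4.0, 3.0, 3.0]   # assumed mean gains of three dynode stages

def poisson(lam):
    # Knuth's algorithm, adequate for the small means used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def cascade_pulse():
    n = poisson(MU)                     # photoelectrons at the cathode
    for g in GAINS:                     # each stage multiplies every electron
        n = sum(poisson(g) for _ in range(n))
    return n

pulses = [cascade_pulse() for _ in range(5000)]
mean = sum(pulses) / len(pulses)
var = sum((x - mean) ** 2 for x in pulses) / len(pulses)
print(f"mean = {mean:.1f} (expected {MU * 4 * 3 * 3:.0f}), variance = {var:.0f}")
```

    Fitting a distribution with this branching structure to measured charge spectra, rather than a sum of Gaussians, is what distinguishes a cascade-type model from the common Gaussian single-photoelectron model.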

  6. IBM parameters derived from realistic shell-model Hamiltonian via Hn-cooling method

    International Nuclear Information System (INIS)

    Nakada, Hitoshi

    1997-01-01

    There is a certain influence of non-collective degrees of freedom even in the lowest-lying states of medium-heavy nuclei. This influence seems to be significant for some of the IBM parameters. In order to take it into account, several renormalization approaches have been applied. It has been shown in previous studies that the influence of the G-pairs is important, but does not fully account for the fitted values. The influence of the non-collective components may be more serious when we take a realistic effective nucleonic interaction. To incorporate this influence into the IBM parameters, we employ the recently developed Hn-cooling method. This method is applied to renormalize the wave functions of the states consisting of the SD-pairs, for the Cr-Fe nuclei. On this ground, the IBM Hamiltonian and transition operators are derived from the corresponding realistic shell-model operators for the Cr-Fe nuclei. Together with some features of the realistic interaction, the effects of the non-SD degrees of freedom are presented. (author)

  7. The Realistic Versus the Spherical Head Model in EEG Dipole Source Analysis in the Presence of Noise

    National Research Council Canada - National Science Library

    Vanrumste, Bart

    2001-01-01

    … For 27 electrodes, an EEG epoch of one time sample, and spatially white Gaussian noise, we found that the advantage of the realistic head model over the spherical head model diminishes as the noise level increases.

  8. More Realistic Face Model Surface Improves Relevance of Pediatric In-Vitro Aerosol Studies.

    Science.gov (United States)

    Amirav, Israel; Halamish, Asaf; Gorenberg, Miguel; Omar, Hamza; Newhouse, Michael T

    2015-01-01

    Various hard face models are commonly used to evaluate the efficiency of aerosol face masks. Softer, more realistic "face" surface materials, like skin, deform upon mask application and should provide more relevant in-vitro tests. Studies that simultaneously take into consideration many of the factors characteristic of the in vivo face are lacking; these include airways, various application forces, comparison of various devices, comparison with a hard-surface model, and use of a more representative model face based on large numbers of actual faces. The objective was to compare mask-to-"face" seal and aerosol delivery of two pediatric masks using a soft vs. a hard, appropriately representative, pediatric face model under various applied forces. Two identical face models and upper airway replicas were constructed, the only difference being the suppleness and compressibility of the surface layer of the "face." Integrity of the seal and aerosol delivery of two different masks [AeroChamber (AC) and SootherMask (SM)] were compared using a breath simulator, filter collection, and realistic applied forces. The soft "face" significantly increased the delivery efficiency and the sealing characteristics of both masks. Aerosol delivery with the soft "face" was significantly greater for the SM than for the AC, while no significant difference between the masks was observed with the hard "face." The material and pliability of the model "face" surface have a significant influence on both the seal and the delivery efficiency of face masks. This finding should be taken into account during in-vitro aerosol studies.

  9. Realistic modeling of seismic input for megacities and large urban areas

    International Nuclear Information System (INIS)

    Panza, Giuliano F.; Alvarez, Leonardo; Aoudia, Abdelkrim

    2002-06-01

    The project addressed the problem of pre-disaster orientation: hazard prediction, risk assessment, and hazard mapping, in connection with seismic activity and man-induced vibrations. The definition of realistic seismic input has been obtained from the computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different source and structural models. The innovative modeling technique, which constitutes the common tool of the entire project, takes into account source, propagation and local site effects. This is done using first principles of physics about wave generation and propagation in complex media, and does not require resorting to convolutive approaches, which have proven quite unreliable, mainly when dealing with complex geological structures, the most interesting from the practical point of view. In fact, several techniques that have been proposed to empirically estimate site effects, using observations convolved with theoretically computed signals corresponding to simplified models, supply reliable information about the site response only for non-interfering seismic phases. They are not adequate in most real cases, when the seismic signal is formed by several interfering waves. The availability of realistic numerical simulations enables us to reliably estimate the amplification effects even in complex geological structures, exploiting the available geotechnical, lithological, and geophysical parameters, the topography of the medium, tectonic, historical, and palaeoseismological data, and seismotectonic models. The realistic modeling of the ground motion is a very important base of knowledge for the preparation of ground-shaking scenarios, which represent a valid and economic tool for seismic microzonation.
This knowledge can be very fruitfully used by civil engineers in the design of new seismo-resistant constructions and in the reinforcement of the existing built environment, and, therefore

  10. Radiation Damage to Nervous System: Designing Optimal Models for Realistic Neuron Morphology in Hippocampus

    Science.gov (United States)

    Batmunkh, Munkhbaatar; Bugay, Alexander; Bayarchimeg, Lkhagvaa; Lkhagva, Oidov

    2018-02-01

    The present study is focused on the development of optimal models of neuron morphology for Monte Carlo microdosimetry simulations of the initial radiation-induced events of heavy charged particles in specific cell types of the hippocampus, which is the most radiation-sensitive structure of the central nervous system. The neuron geometry and particle track structures were simulated with the Geant4/Geant4-DNA Monte Carlo toolkits. The calculations were made for beams of protons and heavy ions with different energies and doses corresponding to real fluxes of galactic cosmic rays. A simple compartmental model and a complex model with realistic morphology extracted from experimental data were constructed and compared. We estimated the distribution of energy deposition events and the production of reactive chemical species within the developed models of CA3/CA1 pyramidal neurons and DG granule cells of the rat hippocampus under exposure to different particles at the same dose. Similar distributions of energy deposition events and concentrations of some oxidative radical species were obtained in both the simplified and realistic neuron models.

  11. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axons, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and the biological questions that require the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use codes to test biological hypotheses.

  12. EIT forward problem parallel simulation environment with anisotropic tissue and realistic electrode models.

    Science.gov (United States)

    De Marco, Tommaso; Ries, Florian; Guermandi, Marco; Guerrieri, Roberto

    2012-05-01

    Electrical impedance tomography (EIT) is an imaging technology based on impedance measurements. To retrieve meaningful insights from these measurements, EIT relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of current flows therein. The nonhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which at present severely limits the capabilities of EIT. This work presents a complete algorithmic flow for an accurate EIT modeling environment featuring high anatomical fidelity with a spatial resolution equal to that provided by an MRI and a novel realistic complete electrode model implementation. At the same time, we demonstrate that current graphics processing unit (GPU)-based platforms provide enough computational power that a domain discretized with five million voxels can be numerically modeled in about 30 s.
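
    At its core, the EIT forward problem solves for the potential field in a discretized conductivity map. The essence can be sketched on a tiny 2D voxel grid with Jacobi iteration (a deliberate oversimplification with isotropic conductivity, point electrodes and Dirichlet conditions, far from the paper's anisotropic complete electrode model and GPU implementation):

```python
# Toy 2D forward problem: solve div(sigma grad phi) = 0 on a voxel grid
# with two point "electrodes" held at +1 V and -1 V. All values assumed.
N = 20
sigma = [[1.0] * N for _ in range(N)]
for i in range(7, 13):
    for j in range(7, 13):
        sigma[i][j] = 0.2                      # poorly conducting inclusion

phi = [[0.0] * N for _ in range(N)]
SRC, SNK = (0, 4), (N - 1, 15)                 # electrode voxel positions

for _ in range(800):                           # Jacobi sweeps
    new = [row[:] for row in phi]
    for i in range(N):
        for j in range(N):
            if (i, j) in (SRC, SNK):
                new[i][j] = 1.0 if (i, j) == SRC else -1.0
                continue
            wsum = vsum = 0.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < N and 0 <= nj < N:
                    # harmonic-mean conductance between adjacent voxels
                    w = 2.0 * sigma[i][j] * sigma[ni][nj] / (sigma[i][j] + sigma[ni][nj])
                    wsum += w
                    vsum += w * phi[ni][nj]
            new[i][j] = vsum / wsum
    phi = new

vals = [v for row in phi for v in row]
print("potential range: [%.3f, %.3f]" % (min(vals), max(vals)))
```

    Every voxel update is independent of the others within a sweep, which is exactly the structure that maps well onto a GPU; at MRI resolution the grid simply has millions of voxels instead of a few hundred.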

  13. Magnetic reconnection in the low solar chromosphere with a more realistic radiative cooling model

    Science.gov (United States)

    Ni, Lei; Lukin, Vyacheslav S.; Murphy, Nicholas A.; Lin, Jun

    2018-04-01

    Magnetic reconnection is the most likely mechanism responsible for the high temperature events that are observed in strongly magnetized locations around the temperature minimum in the low solar chromosphere. This work improves upon our previous work [Ni et al., Astrophys. J. 852, 95 (2018)] by using a more realistic radiative cooling model computed from the OPACITY project and the CHIANTI database. We find that the rate of ionization of the neutral component of the plasma is still faster than recombination within the current sheet region. For low β plasmas, the ionized and neutral fluid flows are well-coupled throughout the reconnection region, resembling the single-fluid Sweet-Parker model dynamics. Decoupling of the ion and neutral inflows appears in the higher β case with β₀ = 1.46, which leads to a reconnection rate about three times faster than the rate predicted by the Sweet-Parker model. In all cases, the plasma temperature increases with time inside the current sheet, and the maximum value is above 2 × 10⁴ K when the reconnection magnetic field strength is greater than 500 G. While the more realistic radiative cooling model does not result in qualitative changes of the characteristics of magnetic reconnection, it is necessary for studying the variations of the plasma temperature and ionization fraction inside current sheets in strongly magnetized regions of the low solar atmosphere. It is also important for studying energy conversion during the magnetic reconnection process when the hydrogen-dominated plasma approaches full ionization.
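
    For reference, the Sweet-Parker rate against which the decoupled case above is compared scales as the inverse square root of the Lundquist number. A worked example with assumed, purely illustrative chromospheric numbers (not values taken from the paper):

```python
import math

# Illustrative assumptions, not the paper's parameters:
L_cs  = 1.0e5          # current-sheet length scale [m]
B     = 0.05           # reconnecting field: 500 G = 0.05 T
rho   = 1.0e-6         # mass density [kg m^-3]
eta_m = 1.0e3          # magnetic diffusivity [m^2 s^-1]
mu0   = 4.0e-7 * math.pi

v_A  = B / math.sqrt(mu0 * rho)   # Alfven speed
S    = L_cs * v_A / eta_m         # Lundquist number
M_sp = S ** -0.5                  # Sweet-Parker rate (v_inflow / v_A)

print(f"v_A = {v_A:.3g} m/s, S = {S:.3g}, M_SP = {M_sp:.3g}")
```

    With these numbers S is of order 10⁶ and the Sweet-Parker rate is of order 10⁻⁴ to 10⁻³, so a factor-of-three enhancement from ion-neutral decoupling is a substantial departure from the single-fluid prediction.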

  14. 3D realistic head model simulation based on transcranial magnetic stimulation.

    Science.gov (United States)

    Yang, Shuo; Xu, Guizhi; Wang, Lei; Chen, Yong; Wu, Huanli; Li, Ying; Yang, Qingxin

    2006-01-01

    Transcranial magnetic stimulation (TMS) is a powerful non-invasive tool for investigating functions in the brain. The target inside the head is stimulated with eddy currents induced in the tissue by the time-varying magnetic field. Precise spatial localization of the stimulation sites is the key to efficient functional magnetic stimulation. Much previous work has been devoted to magnetic field analysis in empty free space. In this paper, a realistic head model for use with the finite element method has been developed, and the magnetic field induced in the head by TMS has been analysed. This three-dimensional simulation is useful for the spatial localization of stimulation.

  15. Semantic modeling for theory clarification: The realist vs liberal international relations perspective

    Energy Technology Data Exchange (ETDEWEB)

    Bray, O.H. [Sandia National Labs., Albuquerque, NM (United States)]|[Univ. of New Mexico, Albuquerque, NM (United States). Political Science Dept.

    1994-04-01

    This paper describes a natural language based, semantic information modeling methodology and explores its use and value in clarifying and comparing political science theories and frameworks. As an example, the paper uses this methodology to clarify and compare some of the basic concepts and relationships in the realist (e.g. Waltz) and the liberal (e.g. Rosenau) paradigms of international relations. The methodology can provide three types of benefits: (1) it can clarify and make explicit exactly what is meant by a concept; (2) it can often identify unanticipated implications and consequences of concepts and relationships; and (3) it can help in identifying and operationalizing testable hypotheses.

  16. Modeling the Earth's magnetospheric magnetic field confined within a realistic magnetopause

    Science.gov (United States)

    Tsyganenko, N. A.

    1995-01-01

    Empirical data-based models of the magnetospheric magnetic field have been widely used during recent years. However, the existing models (Tsyganenko, 1987, 1989a) have three serious deficiencies: (1) an unstable de facto magnetopause, (2) a crude parametrization by the K(sub p) index, and (3) inaccuracies in the equatorial magnetotail B(sub z) values. This paper describes a new approach to the problem; the essential new features are (1) a realistic shape and size of the magnetopause, based on fits to a large number of observed crossings (allowing a parametrization by the solar wind pressure), (2) fully controlled shielding of the magnetic field produced by all magnetospheric current systems, (3) new flexible representations for the tail and ring currents, and (4) a new directional criterion for fitting the model field to spacecraft data, providing improved accuracy for field line mapping. Results are presented from initial efforts to create models assembled from these modules and calibrated against spacecraft data sets.

  17. Realistic modelling of the seismic input: Site effects and parametric studies

    International Nuclear Information System (INIS)

    Romanelli, F.; Vaccari, F.; Panza, G.F.

    2002-11-01

    We illustrate the work done in the framework of a large international cooperation, showing the very recent numerical experiments carried out within the framework of the EC project 'Advanced methods for assessing the seismic vulnerability of existing motorway bridges' (VAB) to assess the importance of non-synchronous seismic excitation of long structures. The definition of the seismic input at the Warth bridge site, i.e. the determination of the seismic ground motion due to an earthquake with a given magnitude and epicentral distance from the site, has been done following a theoretical approach. In order to perform an accurate and realistic estimate of site effects and of differential motion it is necessary to make a parametric study that takes into account the complex combination of the source and propagation parameters, in realistic geological structures. The computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different sources and structural models, allows the construction of damage scenarios that are out of the reach of stochastic models, at a very low cost/benefit ratio. (author)

  18. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    Science.gov (United States)

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  19. A Local-Realistic Model of Quantum Mechanics Based on a Discrete Spacetime

    Science.gov (United States)

    Sciarretta, Antonio

    2018-01-01

    This paper presents a realistic, stochastic, and local model that reproduces nonrelativistic quantum mechanics (QM) results without using its mathematical formulation. The proposed model uses only integer-valued quantities and operations on probabilities, in particular assuming a discrete spacetime in the form of a Euclidean lattice. Individual (spinless) particle trajectories are described as random walks. Transition probabilities are simple functions of a few quantities that are either randomly associated with the particles during their preparation, or stored in the lattice nodes they visit during the walk. QM predictions are retrieved as probability distributions of similarly-prepared ensembles of particles. The scenarios considered to assess the model comprise the free particle, a constant external force, the harmonic oscillator, the particle in a box, the delta potential, the particle on a ring, and the particle on a sphere, and include quantization of energy levels and angular momentum, as well as momentum entanglement.
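
    The general flavor of the approach, retrieving predictions as statistics over an ensemble of lattice random walks, can be illustrated with a plain one-dimensional walk. This toy deliberately omits the model's actual transition rules and therefore only exhibits classical diffusive spreading, not quantum behavior:

```python
import random
random.seed(3)

STEPS, WALKERS = 100, 20000        # illustrative ensemble parameters
positions = []
for _ in range(WALKERS):
    x = 0
    for _ in range(STEPS):
        x += random.choice((-1, 1))   # unbiased unit steps on the lattice
    positions.append(x)

mean = sum(positions) / WALKERS
var = sum((x - mean) ** 2 for x in positions) / WALKERS
print(f"ensemble mean = {mean:.2f}, variance = {var:.1f} (theory: 0, {STEPS})")
```

    In the paper's model the walkers carry additional prepared quantities and read information from lattice nodes, which is what reshapes these ensemble distributions into the quantum ones.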

  20. Use of realistic anthropomorphic models for calculation of radiation dose in nuclear medicine

    International Nuclear Information System (INIS)

    Stabin, Michael G.; Emmons, Mary A.; Fernald, Michael J.; Brill, A.B.; Segars, W. Paul

    2008-01-01

    Anthropomorphic phantoms based on simple geometric structures have been used in radiation dose calculations for many years. We have now developed a series of anatomically realistic phantoms representing adults and children, using body models based on non-uniform rational B-splines (NURBS), with organ and body masses based on the reference values given in ICRP Publication 89. Age-dependent models were scaled and shaped to represent the reference individuals described in ICRP 89 (male and female adults, newborns, 1-, 5-, 10- and 15-year-olds), using a software tool developed in Visual C++. Voxel-based versions of these models were used with GEANT4 radiation transport codes for the calculation of specific absorbed fractions (SAFs) for internal sources of photons and electrons, using standard starting energy values. Organ masses in the models were within a few percent of the ICRP reference masses, and physicians reviewed the models for anatomical realism. Development of individual phantoms was much faster than manual segmentation of medical images, and resulted in a very uniform, standardized phantom series. SAFs were calculated on the Vanderbilt multi-node computing network (ACCRE). Photon and electron SAFs were calculated for all organs in all models and were compared to values from similar phantoms developed by others. Agreement was very good in most cases; some differences were seen, due to differences in organ mass and geometry. This realistic phantom series represents a possible replacement for the Cristy/Eckerman series of the 1980s. Both phantom sets will be included in the next release of the OLINDA/EXM personal computer code, and the new phantoms will be made generally available to the research community for other uses. Calculated radiation doses for diagnostic and therapeutic radiopharmaceuticals will be compared with previous values. (author)

  1. Cellular potts models multiscale extensions and biological applications

    CERN Document Server

    Scianna, Marco

    2013-01-01

    A flexible, cell-level, and lattice-based technique, the cellular Potts model accurately describes the phenomenological mechanisms involved in many biological processes. Cellular Potts Models: Multiscale Extensions and Biological Applications gives an interdisciplinary, accessible treatment of these models, from the original methodologies to the latest developments. The book first explains the biophysical bases, main merits, and limitations of the cellular Potts model. It then proposes several innovative extensions, focusing on ways to integrate and interface the basic cellular Potts model at the mesoscopic scale with approaches that accurately model microscopic dynamics. These extensions are designed to create a nested and hybrid environment, where the evolution of a biological system is realistically driven by the constant interplay and flux of information between the different levels of description. Through several biological examples, the authors demonstrate a qualitative and quantitative agreement with t...
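
    A minimal sketch of the basic (single-scale) cellular Potts dynamics the book builds on: a lattice of spins labeling cells, a Hamiltonian with adhesion and volume-constraint terms, and Metropolis-accepted spin-copy attempts. All parameters below are illustrative assumptions, not values from the book.

```python
import math, random
random.seed(2)

L, TEMP, LAM = 20, 4.0, 1.0
TARGET_V = 49                          # target cell volume (the initial size)
J = {(0, 1): 8, (0, 2): 8, (1, 2): 6}  # contact energies: 0 = medium, 1/2 = cells

def contact(a, b):
    return 0 if a == b else J[tuple(sorted((a, b)))]

grid = [[0] * L for _ in range(L)]     # two square cells embedded in medium
for i in range(4, 11):
    for j in range(4, 11):
        grid[i][j] = 1
for i in range(4, 11):
    for j in range(11, 18):
        grid[i][j] = 2
vol = {1: 49, 2: 49}

def neighbors(i, j):
    return [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= i + di < L and 0 <= j + dj < L]

def delta_H(i, j, new):
    old = grid[i][j]
    dH = sum(contact(new, grid[ni][nj]) - contact(old, grid[ni][nj])
             for ni, nj in neighbors(i, j))
    for cell, dv in ((old, -1), (new, +1)):      # volume-constraint change
        if cell != 0:
            dH += LAM * ((vol[cell] + dv - TARGET_V) ** 2
                         - (vol[cell] - TARGET_V) ** 2)
    return dH

for _ in range(100000):                # Metropolis spin-copy attempts
    i, j = random.randrange(L), random.randrange(L)
    ni, nj = random.choice(neighbors(i, j))
    new = grid[ni][nj]
    if new == grid[i][j]:
        continue
    dH = delta_H(i, j, new)
    if dH <= 0 or random.random() < math.exp(-dH / TEMP):
        if grid[i][j] != 0:
            vol[grid[i][j]] -= 1
        if new != 0:
            vol[new] += 1
        grid[i][j] = new

print("cell volumes after relaxation:", vol)
```

    The multiscale extensions discussed in the book layer additional descriptions (e.g. intracellular dynamics) on top of this lattice core; the volume term above keeps each cell fluctuating around its target size while the adhesion term shapes the interfaces.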

  2. Mesoscopic models of biological membranes

    DEFF Research Database (Denmark)

    Venturoli, M.; Sperotto, Maria Maddalena; Kranenburg, M.

    2006-01-01

    Phospholipids are the main components of biological membranes, and dissolved in water these molecules self-assemble into closed structures, of which bilayers are the most relevant from a biological point of view. Lipid bilayers are often used, both in experimental and in theoretical investigations… to coarse-grain a biological membrane. The conclusion of this comparison is that there can be many valid different strategies, but that the results obtained by the various mesoscopic models are surprisingly consistent. A second objective of this review is to illustrate how mesoscopic models can be used…

  3. Satisfaction and sustainability: a realist review of decentralized models of perinatal surgery for rural women.

    Science.gov (United States)

    Kornelsen, Jude; McCartney, Kevin; Williams, Kim

    2016-01-01

    This article was developed as part of a larger realist review investigating the viability and efficacy of decentralized models of perinatal surgical services for rural women in the context of recent and ongoing service centralization witnessed in many developed nations. The larger realist review was commissioned by the British Columbia Ministry of Health and Perinatal Services of British Columbia, Canada. Findings from that review are addressed in this article specific to the sustainability of rural perinatal surgical sites and the satisfaction of providers that underpins their recruitment to and retention at such sites. A realist method was used in the selection and analysis of literature with the intention to iteratively develop a sophisticated understanding of how perinatal surgical services can best meet the needs of women who live in rural and remote environments. The goal of a realist review is to examine what works for whom under what circumstances and why. The high sensitivity search used language (English) and year (since 1990) limiters in keeping with both a realist and rapid review tradition of using reasoned contextual boundaries. No exclusions were made based on methodology or methodological approach in keeping with a realist review. Databases searched included MEDLINE, PubMed, EBSCO, CINAHL, EBM Reviews, NHS Economic Evaluation Database and PAIS International for literature in December 2013. Database searching produced 103 included academic articles. A further 59 resources were added through pearling and 13 grey literature reports were added on recommendation from the commissioner. A total of 42 of these 175 articles were included in this article as specific to provider satisfaction and service sustainability. Operative perinatal practice was found to be a lynchpin of sustainable primary and surgical services in rural communities. 
Rural shortages of providers, including challenges with recruitment and retention, were found to be a complex issue, with

  4. Improvement of Modeling Scheme of the Safety Injection Tank with Fluidic Device for Realistic LBLOCA Calculation

    International Nuclear Information System (INIS)

    Bang, Young Seok; Cheong, Aeju; Woo, Sweng Woong

    2014-01-01

    Confirmation of the performance of the SIT with FD should be based on thermal-hydraulic analysis of LBLOCA, and an adequate physical model simulating the SIT/FD should be used in the LBLOCA calculation. To develop such a physical model of the SIT/FD, simulation of the major phenomena, including flow distribution by the standpipe and FD, should be justified by full-scale experiment and/or plant preoperational testing. The authors' previous study indicated that an approximation of SIT/FD phenomena could be obtained with a typical system transient code, MARS-KS, using the 'accumulator' component model, but that additional improvement of the modeling scheme of the FD and standpipe flow paths was needed for a reasonable prediction. One problem was the depressurizing behavior after switchover to the low-flow injection phase. There is also a concern about the potential release of nitrogen gas from the SIT to the downstream pipe, and then to the reactor core, through the flow paths of the FD and standpipe. The intrusion of noncondensible gas may have an effect on the LBLOCA thermal response. Therefore, a more reliable model of the SIT/FD has been requested to obtain a more accurate prediction and confidence in the evaluation of LBLOCA. The present paper discusses an improvement of the modeling scheme from the previous study. Compared to the existing modeling, the effect of the present modeling scheme on LBLOCA cladding thermal response is discussed. The present study discussed the modeling scheme of the SIT with FD for a realistic simulation of the LBLOCA of APR1400. Currently, the SIT blowdown test can be best simulated by the modeling scheme using a 'pipe' component with dynamic area reduction. The LBLOCA analysis adopting this modeling scheme showed a PCT increase of 23 K when compared to the case of the 'accumulator' component model, which was due to the flow rate decrease at the transition to the low-flow injection phase and the intrusion of nitrogen gas into the core. Accordingly, the effect of SIT/FD modeling

  5. Robust mode space approach for atomistic modeling of realistically large nanowire transistors

    Science.gov (United States)

    Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard

    2018-01-01

    Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight binding non-equilibrium Green's function simulation of nanowire metal-oxide-semiconductor field-effect transistor (MOSFET) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.
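
The mode-space (MS) reduction described in this record projects large atomistic matrices onto a small orthonormal basis, shrinking the matrices used in transport calculations. A generic sketch of that projection step, with random matrices standing in for the tight-binding Hamiltonian (the paper's basis-optimization algorithm is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 200, 12                      # full atomistic size vs. mode-space size
H = rng.standard_normal((n, n))
H = (H + H.T) / 2                   # Hermitian stand-in for the Hamiltonian
# Orthonormal mode basis: QR of a random n x m block (a real MS basis
# would be optimized, as in the paper).
U, _ = np.linalg.qr(rng.standard_normal((n, m)))
H_ms = U.T @ H @ U                  # reduced (mode-space) Hamiltonian, m x m
print(H_ms.shape)
```

Green's function operations on the m×m reduced matrix scale far better than on the full n×n one, which is the source of the numerical savings the abstract refers to.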

  6. Stroke type differentiation using spectrally constrained multifrequency EIT: evaluation of feasibility in a realistic head model

    International Nuclear Information System (INIS)

    Malone, Emma; Jehl, Markus; Arridge, Simon; Betcke, Timo; Holder, David

    2014-01-01

    We investigate the application of multifrequency electrical impedance tomography (MFEIT) to imaging the brain in stroke patients. The use of MFEIT could enable early diagnosis and thrombolysis of ischaemic stroke, and therefore improve the outcome of treatment. Recent advances in the imaging methodology suggest that the use of spectral constraints could allow for the reconstruction of a one-shot image. We performed a simulation study to investigate the feasibility of imaging stroke in a head model with realistic conductivities. We introduced increasing levels of modelling errors to test the robustness of the method to the most common sources of artefact. We considered the case of errors in the electrode placement, spectral constraints, and contact impedance. The results indicate that errors in the position and shape of the electrodes can affect image quality, although our imaging method was successful in identifying tissues with sufficiently distinct spectra. (paper)

  7. Fault-Tolerant Robot Programming through Simulation with Realistic Sensor Models

    Directory of Open Access Journals (Sweden)

    Axel Waggershauser

    2008-11-01

    Full Text Available We introduce a simulation system for mobile robots that allows a realistic interaction of multiple robots in a common environment. The simulated robots are closely modeled after robots from the EyeBot family and have an identical application programmer interface. The simulation supports driving commands at two levels of abstraction as well as numerous sensors such as shaft encoders, infrared distance sensors, and compass. Simulation of on-board digital cameras via synthetic images allows the use of image processing routines for robot control within the simulation. Specific error models for actuators, distance sensors, camera sensor, and wireless communication have been implemented. Progressively increasing error levels for an application program allows for testing and improving its robustness and fault-tolerance.
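
A sensor error model of the kind the record describes can be as simple as additive Gaussian noise clamped to the sensor's valid range. A hedged sketch (function name, noise level, and range are invented for illustration; the EyeBot simulator's actual error models are richer):

```python
import random

random.seed(42)

def ir_distance_reading(true_cm, sigma_cm=0.5, max_range_cm=80.0):
    """Noisy, range-clamped infrared distance reading (illustrative values)."""
    noisy = random.gauss(true_cm, sigma_cm)       # zero-mean Gaussian error
    return min(max(noisy, 0.0), max_range_cm)     # clamp to sensor range

reading = ir_distance_reading(30.0)
print(0.0 <= reading <= 80.0)
```

Raising `sigma_cm` progressively, as the abstract suggests, lets an application program be stress-tested against increasingly unreliable sensors.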

  8. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
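
One common way to synthesize ultrasound-like speckle, per tissue type, is multiplicative Rayleigh noise whose scale varies with the tissue class. The phantom in the record is more sophisticated; this sketch (all function names and parameter values invented) only illustrates the tissue-dependent idea:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_speckle(echogenicity, tissue_labels, scale_per_tissue):
    """echogenicity: 2D float array; tissue_labels: int array, same shape."""
    sigma = np.vectorize(scale_per_tissue.get)(tissue_labels)
    noise = rng.rayleigh(scale=sigma)   # per-pixel Rayleigh draw
    return echogenicity * noise         # multiplicative speckle

img = np.full((64, 64), 0.5)            # uniform tissue reflectivity
labels = np.zeros((64, 64), dtype=int)
labels[32:, :] = 1                      # e.g. fat (0) vs. muscle (1) regions
speckled = add_speckle(img, labels, {0: 0.8, 1: 1.2})
print(speckled.shape)
```

Statistics of the resulting texture (e.g. its first-order histogram per region) can then be compared against real B-mode images, as the abstract's validation step suggests.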

  9. Calculation of electrical potentials on the surface of a realistic head model by finite differences

    International Nuclear Information System (INIS)

    Lemieux, L.; McBride, A.; Hand, J.W.

    1996-01-01

    We present a method for the calculation of electrical potentials at the surface of realistic head models from a point dipole generator based on a 3D finite-difference algorithm. The model was validated by comparing calculated values with those obtained algebraically for a three-shell spherical model. For a 1.25 mm cubic grid size, the mean error was 4.9% for a superficial dipole (3.75 mm from the inner surface of the skull) pointing in the radial direction. The effect of generator discretization and node spacing on the accuracy of the model was studied. Three values of the node spacing were considered: 1, 1.25 and 1.5 mm. The mean relative errors were 4.2, 6.3 and 9.3%, respectively. The quality of the approximation of a point dipole by an array of nodes in a spherical neighbourhood did not depend significantly on the number of nodes used. The application of the method to a conduction model derived from MRI data is demonstrated. (author)
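
The core of the finite-difference approach above can be shown in a heavily simplified sketch: a homogeneous conductor on a uniform cubic grid with grounded boundaries, the dipole discretized as an adjacent source/sink node pair, and Jacobi relaxation of the resulting Poisson equation. None of this matches the paper's multi-tissue realistic head model; it is only the numerical skeleton:

```python
import numpy as np

n = 17
phi = np.zeros((n, n, n))        # potential, zero Dirichlet boundary
source = np.zeros_like(phi)
source[8, 8, 9] = 1.0            # current source (arbitrary units)
source[8, 8, 7] = -1.0           # current sink: together a discretized dipole

for _ in range(300):             # Jacobi relaxation of the Poisson equation
    phi[1:-1, 1:-1, 1:-1] = (
        phi[2:, 1:-1, 1:-1] + phi[:-2, 1:-1, 1:-1]
        + phi[1:-1, 2:, 1:-1] + phi[1:-1, :-2, 1:-1]
        + phi[1:-1, 1:-1, 2:] + phi[1:-1, 1:-1, :-2]
        + source[1:-1, 1:-1, 1:-1]
    ) / 6.0

# The potential is antisymmetric about the plane between source and sink.
print(round(float(phi[8, 8, 9]), 4), round(float(phi[8, 8, 7]), 4))
```

The realistic model replaces the uniform 1/6 stencil with conductivity-weighted coefficients per tissue compartment, which is where the head geometry enters.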

  10. Investigations of sensitivity and resolution of ECG and MCG in a realistically shaped thorax model

    International Nuclear Information System (INIS)

    Mäntynen, Ville; Konttila, Teijo; Stenroos, Matti

    2014-01-01

    Solving the inverse problem of electrocardiography (ECG) and magnetocardiography (MCG) is often referred to as cardiac source imaging. Spatial properties of ECG and MCG as imaging systems are, however, not well known. In this modelling study, we investigate the sensitivity and point-spread function (PSF) of ECG, MCG, and combined ECG+MCG as a function of source position and orientation, globally around the ventricles: signal topographies are modelled using a realistically-shaped volume conductor model, and the inverse problem is solved using a distributed source model and linear source estimation with minimal use of prior information. The results show that the sensitivity depends not only on the modality but also on the location and orientation of the source and that the sensitivity distribution is clearly reflected in the PSF. MCG can better characterize tangential anterior sources (with respect to the heart surface), while ECG excels with normally-oriented and posterior sources. Compared to either modality used alone, the sensitivity of combined ECG+MCG is less dependent on source orientation per source location, leading to better source estimates. Thus, for maximal sensitivity and optimal source estimation, the electric and magnetic measurements should be combined. (paper)
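
For linear source estimation of the kind used in this study, the point-spread function of a source is commonly read off the resolution matrix R = G⁺G, where G is the lead-field (gain) matrix and G⁺ a minimum-norm inverse operator. A generic sketch with a random matrix standing in for the authors' realistic volume-conductor lead field:

```python
import numpy as np

rng = np.random.default_rng(2)
G = rng.standard_normal((20, 60))   # 20 sensors, 60 candidate sources
G_pinv = np.linalg.pinv(G)          # minimum-norm (pseudoinverse) operator
R = G_pinv @ G                      # resolution matrix
psf_0 = R[:, 0]                     # how a unit source at index 0 is imaged
print(R.shape)
```

Column i of R shows how a point source at location i spreads across the estimate, which is exactly the PSF the abstract analyzes as a function of source position and modality.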

  11. Implications of introducing realistic fire response traits in a Dynamic Global Vegetation Model

    Science.gov (United States)

    Kelley, D.; Harrison, S. P.; Prentice, I. C.

    2013-12-01

    Bark thickness is a key trait protecting woody plants against fire damage, while the ability to resprout is a trait that confers competitive advantage over non-resprouting individuals in fire-prone landscapes. Neither trait is well represented in fire-enabled dynamic global vegetation models (DGVMs). Here we describe a version of the Land Processes and eXchanges (LPX-Mv1) DGVM that incorporates both of these traits in a realistic way. From a synthesis of a large number of field studies, we show there is considerable innate variability in bark thickness between species within a plant-functional type (PFT). Furthermore, bark thickness is an adaptive trait at ecosystem level, increasing with fire frequency. We use the data to specify the range of bark thicknesses characteristic of each model PFT. We allow this distribution to change dynamically: thinner-barked trees are killed preferentially by fire, shifting the distribution of bark thicknesses represented in a model grid cell. We use the PFT-specific bark-thickness probability range for saplings during re-establishment. Since it is rare to destroy all trees in a grid cell, this treatment results in average bark thickness increasing with fire frequency and intensity. Resprouting is a prominent adaptation of temperate and tropical trees in fire-prone areas. The ability to resprout from above-ground tissue (apical or epicormic resprouting) results in the fastest recovery of total biomass after disturbance; resprouting from basal or below-ground meristems results in slower recovery, while non-resprouting species must regenerate from seed and therefore take the longest time to recover. Our analyses show that resprouting species have thicker bark than non-resprouting species. Investment in resprouting is accompanied by reduced efficacy of regeneration from seed. 
We introduce resprouting PFTs in LPX-Mv1 by specifying an appropriate range of bark thickness, allowing resprouters to survive fire and regenerate vegetatively in
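
The mortality mechanism described above, where fire preferentially kills thin-barked trees and thereby shifts the surviving stand's bark-thickness distribution upward, can be illustrated with a toy sketch (the uniform distribution, survival threshold, and all numbers are invented; LPX-Mv1 uses PFT-specific distributions and fire intensity):

```python
import random

random.seed(0)
stand = [random.uniform(0.2, 3.0) for _ in range(10_000)]  # bark thickness (cm)

def survives(bark_cm, threshold_cm=1.0):
    # Simplistic rule: only trees with bark thicker than the threshold survive.
    return bark_cm > threshold_cm

survivors = [b for b in stand if survives(b)]

def mean(xs):
    return sum(xs) / len(xs)

print(mean(survivors) > mean(stand))
```

Repeating such a cull each fire event reproduces, qualitatively, the abstract's result that average bark thickness increases with fire frequency and intensity.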

  12. Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgindy, Tarek [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rossol, Michael N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Vergara, Claudio [MIT; Domingo, Carlos Mateo [IIT Comillas; Postigo, Fernando [IIT Comillas; de Cuadra, Fernando [IIT Comillas; Gomez, Tomas [IIT Comillas; Duenas, Pablo [MIT; Luke, Max [MIT; Li, Vivian [MIT; Vinoth, Mohan [GE Grid Solutions; Kadankodu, Sree [GE Grid Solutions

    2017-08-09

    The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and co-ordination with GRID DATA repository teams to house these datasets for public access will also be discussed.

  13. Explicit all-atom modeling of realistically sized ligand-capped nanocrystals

    KAUST Repository

    Kaushik, Ananth P.

    2012-01-01

    We present a study of an explicit all-atom representation of nanocrystals of experimentally relevant sizes (up to 6 nm), capped with alkyl chain ligands, in vacuum. We employ all-atom molecular dynamics simulation methods in concert with a well-tested intermolecular potential model, MM3 (molecular mechanics 3), for the studies presented here. These studies include determining the preferred conformation of an isolated single nanocrystal (NC), pairs of isolated NCs, and (presaging studies of superlattice arrays) unit cells of NC superlattices. We observe that very small NCs (3 nm) behave differently in a superlattice as compared to larger NCs (6 nm and above) due to the conformations adopted by the capping ligands on the NC surface. Short ligands adopt a uniform distribution of orientational preferences, including some that lie against the face of the nanocrystal. In contrast, longer ligands prefer to interdigitate. We also study the effect of changing ligand length and ligand coverage on the NCs on the preferred ligand configurations. Since explicit all-atom modeling constrains the maximum system size that can be studied, we discuss issues related to coarse-graining the representation of the ligands, including a comparison of two commonly used coarse-grained models. We find that care has to be exercised in the choice of coarse-grained model. The data provided by these realistically sized ligand-capped NCs, determined using explicit all-atom models, should serve as a reference standard for future models of coarse-graining ligands using united atom models, especially for self-assembly processes. © 2012 American Institute of Physics.

  14. Comparative study of non-premixed and partially-premixed combustion simulations in a realistic Tay model combustor

    OpenAIRE

    Zhang, K.; Ghobadian, A.; Nouri, J. M.

    2017-01-01

    A comparative study of two combustion models based on non-premixed and partially-premixed assumptions, using the overall models of the Zimont Turbulent Flame Speed Closure Method (ZTFSC) and the Extended Coherent Flamelet Method (ECFM), is conducted through Reynolds stress turbulence modelling of the Tay model gas turbine combustor for the first time. The Tay model combustor retains all essential features of a realistic gas turbine combustor. It is seen that the non-premixed combustion model fa...

  15. Atomic level insights into realistic molecular models of dendrimer-drug complexes through MD simulations

    Science.gov (United States)

    Jain, Vaibhav; Maiti, Prabal K.; Bharatam, Prasad V.

    2016-09-01

    Computational studies performed on dendrimer-drug complexes usually consider 1:1 stoichiometry, which is far from reality, since in experiments many more drug molecules are encapsulated inside a dendrimer. In the present study, molecular dynamics (MD) simulations were implemented to characterize more realistic molecular models of dendrimer-drug complexes (1:n stoichiometry), in order to understand the effect of high drug loading on the structural properties and also to unveil the atomistic-level details. For this purpose, possible inclusion complexes of the model drug Nateglinide (Ntg) (antidiabetic, belongs to Biopharmaceutics Classification System class II) with amine- and acetyl-terminated G4 poly(amidoamine) (G4 PAMAM(NH2) and G4 PAMAM(Ac)) dendrimers at neutral and low pH conditions are explored in this work. MD simulation analysis of the dendrimer-drug complexes revealed that the drug encapsulation efficiency of G4 PAMAM(NH2) and G4 PAMAM(Ac) dendrimers at neutral pH was 6 and 5, respectively, while at low pH it was 12 and 13, respectively. Center-of-mass distance analysis showed that most of the drug molecules are located in the interior hydrophobic pockets of G4 PAMAM(NH2) at both pH conditions; while in the case of G4 PAMAM(Ac), most of them are distributed near the surface at neutral pH and in the interior hydrophobic pockets at low pH. Structural properties such as radius of gyration, shape, radial density distribution, and solvent accessible surface area of the dendrimer-drug complexes were also assessed and compared with those of the drug-unloaded dendrimers. Further, binding energy calculations using the molecular mechanics Poisson-Boltzmann surface area approach revealed that the location of drug molecules in the dendrimer is not the decisive factor for the higher or lower binding affinity of the complex, but the charged state of dendrimer and drug, intermolecular interactions, pH-induced conformational changes, and surface groups of the dendrimer do play an

  16. PIV-measured versus CFD-predicted flow dynamics in anatomically realistic cerebral aneurysm models.

    Science.gov (United States)

    Ford, Matthew D; Nikolov, Hristo N; Milner, Jaques S; Lownie, Stephen P; Demont, Edwin M; Kalata, Wojciech; Loth, Francis; Holdsworth, David W; Steinman, David A

    2008-04-01

    Computational fluid dynamics (CFD) modeling of nominally patient-specific cerebral aneurysms is increasingly being used as a research tool to further understand the development, prognosis, and treatment of brain aneurysms. We have previously developed virtual angiography to indirectly validate CFD-predicted gross flow dynamics against the routinely acquired digital subtraction angiograms. Toward a more direct validation, here we compare detailed, CFD-predicted velocity fields against those measured using particle imaging velocimetry (PIV). Two anatomically realistic flow-through phantoms, one a giant internal carotid artery (ICA) aneurysm and the other a basilar artery (BA) tip aneurysm, were constructed of a clear silicone elastomer. The phantoms were placed within a computer-controlled flow loop, programed with representative flow rate waveforms. PIV images were collected on several anterior-posterior (AP) and lateral (LAT) planes. CFD simulations were then carried out using a well-validated, in-house solver, based on micro-CT reconstructions of the geometries of the flow-through phantoms and inlet/outlet boundary conditions derived from flow rates measured during the PIV experiments. PIV and CFD results from the central AP plane of the ICA aneurysm showed a large stable vortex throughout the cardiac cycle. Complex vortex dynamics, captured by PIV and CFD, persisted throughout the cardiac cycle on the central LAT plane. Velocity vector fields showed good overall agreement. For the BA aneurysm, agreement was more compelling, with both PIV and CFD similarly resolving the dynamics of counter-rotating vortices on both AP and LAT planes. Despite the imposition of periodic flow boundary conditions for the CFD simulations, cycle-to-cycle fluctuations were evident in the BA aneurysm simulations, which agreed well, in terms of both amplitudes and spatial distributions, with cycle-to-cycle fluctuations measured by PIV in the same geometry. The overall good agreement

  17. Mathematical models in biological discovery

    CERN Document Server

    Walter, Charles

    1977-01-01

    When I was asked to help organize an American Association for the Advancement of Science symposium about how mathematical models have contributed to biology, I agreed immediately. The subject is of immense importance and widespread interest. However, too often it is discussed in biologically sterile environments by "mutual admiration society" groups of "theoreticians", many of whom have never seen, and most of whom have never done, an original scientific experiment with the biological materials they attempt to describe in abstract (and often prejudiced) terms. The opportunity to address the topic during an annual meeting of the AAAS was irresistible. In order to try to maintain the integrity of the original intent of the symposium, it was entitled, "Contributions of Mathematical Models to Biological Discovery". This symposium was organized by Daniel Solomon and myself, held during the 141st annual meeting of the AAAS in New York during January, 1975, sponsored by sections G and N (Biological and Medic...

  18. Rapidly re-computable EEG (electroencephalography) forward models for realistic head shapes

    International Nuclear Information System (INIS)

    Ermer, J.J.; Mosher, J.C.; Baillet, S.; Leahy, R.M.

    2001-01-01

    Solution of the EEG source localization (inverse) problem utilizing model-based methods typically requires a significant number of forward model evaluations. For subspace-based inverse methods like MUSIC (6), the total number of forward model evaluations can often approach an order of 10³ or 10⁴. Techniques based on least-squares minimization may require significantly more evaluations. The observed set of measurements over an M-sensor array is often expressed as a linear forward spatio-temporal model of the form: F = GQ + N (1), where the observed forward field F (M sensors × N time samples) can be expressed in terms of the forward model G, a set of dipole moment(s) Q (3P dipole components × N time samples), and additive noise N. Because of their simplicity, ease of computation, and relatively good accuracy, multi-layer spherical models (7) (or fast approximations described in (1), (7)) have traditionally been the 'forward model of choice' for approximating the human head. However, approximation of the human head via a spherical model does have several key drawbacks. By its very shape, the use of a spherical model distorts the true distribution of passive currents in the skull cavity. Spherical models also require that the sensor positions be projected onto the fitted sphere (Fig. 1), resulting in a distortion of the true sensor-dipole spatial geometry (and ultimately the computed surface potential). The use of a single 'best-fitted' sphere has the added drawback of incomplete coverage of the inner skull region, often ignoring areas such as the frontal cortex. In practice, this problem is typically countered by fitting additional sphere(s) to those region(s) not covered by the primary sphere. The use of these additional spheres results in added complication to the forward model. Using high-resolution spatial information obtained via X-ray CT or MR imaging, a realistic head model can be formed by tessellating the head into a set of contiguous regions (typically the scalp
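
The linear model F = GQ + N in equation (1) above admits a direct least-squares estimate of the dipole moments Q when the forward model G is known. A small sketch with synthetic data (dimensions, seed, and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
M, P, Nt = 32, 2, 50                        # sensors, dipoles, time samples

G = rng.standard_normal((M, 3 * P))         # forward model (gain matrix)
Q_true = rng.standard_normal((3 * P, Nt))   # dipole moment time series
F = G @ Q_true + 0.01 * rng.standard_normal((M, Nt))  # noisy measurements

# Least-squares estimate of Q given G and the observed field F
Q_hat, *_ = np.linalg.lstsq(G, F, rcond=None)
rel_err = np.linalg.norm(Q_hat - Q_true) / np.linalg.norm(Q_true)
print(rel_err)
```

In practice G itself is the expensive part: each candidate dipole position requires a forward model evaluation, which is why the abstract emphasizes making G rapidly re-computable for realistic head shapes.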

  19. Simulating the value of electric-vehicle-grid integration using a behaviourally realistic model

    Science.gov (United States)

    Wolinetz, Michael; Axsen, Jonn; Peters, Jotham; Crawford, Curran

    2018-02-01

    Vehicle-grid integration (VGI) uses the interaction between electric vehicles and the electrical grid to provide benefits that may include reducing the cost of using intermittent renewable electricity or providing a financial incentive for electric vehicle ownership. However, studies that estimate the value of VGI benefits have largely ignored how consumer behaviour will affect the magnitude of the impact. Here, we simulate the long-term impact of VGI using behaviourally realistic and empirically derived models of vehicle adoption and charging combined with an electricity system model. We focus on the case where a central entity manages the charging rate and timing for participating electric vehicles. VGI is found not to increase the adoption of electric vehicles, but it does have a small beneficial impact on electricity prices. By 2050, VGI reduces wholesale electricity prices by 0.6-0.7% (0.7 MWh⁻¹, 2010 CAD) relative to an equivalent scenario without VGI. Excluding consumer behaviour from the analysis inflates the value of VGI.

  20. 3D Realistic Radiative Hydrodynamic Modeling of a Moderate-Mass Star: Effects of Rotation

    Science.gov (United States)

    Kitiashvili, Irina; Kosovichev, Alexander G.; Mansour, Nagi N.; Wray, Alan A.

    2018-01-01

    Recent progress in stellar observations opens new perspectives in understanding stellar evolution and structure. However, complex interactions in the turbulent radiating plasma together with effects of magnetic fields and rotation make inferences of stellar properties uncertain. The standard 1D mixing-length-based evolutionary models are not able to capture many physical processes of stellar interior dynamics, but they provide an initial approximation of the stellar structure that can be used to initialize 3D time-dependent radiative hydrodynamics simulations, based on first physical principles, that take into account the effects of turbulence, radiation, and others. In this presentation we will show simulation results from a 3D realistic modeling of an F-type main-sequence star with mass 1.47 Msun, in which the computational domain includes the upper layers of the radiation zone, the entire convection zone, and the photosphere. The simulation results provide new insight into the formation and properties of the convective overshoot region, the dynamics of the near-surface, highly turbulent layer, the structure and dynamics of granulation, and the excitation of acoustic and gravity oscillations. We will discuss the thermodynamic structure, oscillations, and effects of rotation on the dynamics of the star across these layers.

  1. A realistic pattern of fermion masses from a five-dimensional SO(10) model

    International Nuclear Information System (INIS)

    Feruglio, Ferruccio; Patel, Ketan M.; Vicino, Denise

    2015-01-01

    We provide a unified description of fermion masses and mixing angles in the framework of a supersymmetric grand unified SO(10) model with anarchic Yukawa couplings of order unity. The space-time is five-dimensional and the extra flat spatial dimension is compactified on the orbifold S¹/(Z₂×Z₂′), leading to Pati-Salam gauge symmetry on the boundary where Yukawa interactions are localised. The gauge symmetry breaking is completed by means of a rather economical scalar sector, avoiding the doublet-triplet splitting problem. The matter fields live in the bulk and their massless modes get exponential profiles, which naturally explain the mass hierarchy of the different fermion generations. Quark and lepton properties are naturally reproduced by a mechanism, first proposed by Kitano and Li, that lifts the SO(10) degeneracy of bulk masses in terms of a single parameter. The model provides a realistic pattern of fermion masses and mixing angles for large values of tan β. It favours a normally ordered neutrino mass spectrum with the lightest neutrino mass below 0.01 eV and no preference for leptonic CP-violating phases. The right-handed neutrino mass spectrum is very hierarchical and does not allow for thermal leptogenesis. We analyse several variants of the basic framework and find that the results concerning the fermion spectrum are remarkably stable.

  2. Development of Realistic Head Models for Electromagnetic Source Imaging of the Human Brain

    National Research Council Canada - National Science Library

    Akalin, Z

    2001-01-01

    ... images is performed. Then triangular, quadratic meshes are formed for the interfaces of the tissues. Thus, realistic meshes, representing scalp, skull, CSF, brain and eye tissues, are formed. At least...

  3. Successful N2 leptogenesis with flavour coupling effects in realistic unified models

    International Nuclear Information System (INIS)

    Bari, Pasquale Di; King, Stephen F.

    2015-01-01

    In realistic unified models involving so-called SO(10)-inspired patterns of Dirac and heavy right-handed (RH) neutrino masses, the lightest right-handed neutrino N1 is too light to yield successful thermal leptogenesis, barring highly fine-tuned solutions, while the second heaviest right-handed neutrino N2 is typically in the correct mass range. We show that flavour coupling effects in the Boltzmann equations may be crucial to the success of such N2-dominated leptogenesis, by helping to ensure that the flavour asymmetries produced at the N2 scale survive N1 washout. To illustrate these effects we focus on N2-dominated leptogenesis in an existing model, the A to Z of flavour with Pati-Salam, where the neutrino Dirac mass matrix may be equal to an up-type quark mass matrix and has a particular constrained structure. The numerical results, supported by analytical insight, show that achieving successful N2 leptogenesis consistent with neutrino phenomenology requires a "flavour swap scenario" together with a less hierarchical pattern of RH neutrino masses than naively expected, at the expense of some mild fine-tuning. In the considered A to Z model, neutrino masses are predicted to be normally ordered, with an atmospheric neutrino mixing angle well into the second octant and the Dirac phase δ ≅ 20°, a set of predictions that will be tested in the coming years in neutrino oscillation experiments. Flavour coupling effects may be relevant for other SO(10)-inspired unified models where N2 leptogenesis is necessary.

  4. Cardiac autonomic functions and the emergence of violence in a highly realistic model of social conflict in humans.

    Directory of Open Access Journals (Sweden)

    Jozsef eHaller

    2014-10-01

    Among the multitude of factors that can transform human social interactions into violent conflicts, biological features have received much attention in recent years as correlates of decision making and aggressiveness, especially in critical situations. We present here a highly realistic new model of human aggression and violence, in which genuine acts of aggression are readily performed and which at the same time allows the parallel recording of biological concomitants. Particularly, we studied police officers trained at the International Training Centre (Budapest, Hungary), who are prepared to perform operations under extreme conditions of stress. We found that aggressive arousal can transform a basically peaceful social encounter into a violent conflict. Autonomic recordings show that this change is accompanied by an increased heart rate, which was associated earlier with reduced cognitive complexity of perceptions (attentional myopia) and promotes a bias towards hostile attributions and aggression. We also observed reduced heart rate variability in violent subjects, which is believed to signal poor functioning of prefrontal-subcortical inhibitory circuits and to reduce self-control. Importantly, these autonomic particularities were observed already at the beginning of social encounters, i.e. before aggressive acts were initiated, suggesting that individual characteristics of the stress response define the way in which social pressure affects social behavior, particularly the way in which this develops into violence. Taken together, these findings suggest that cardiac autonomic functions are valuable external symptoms of internal motivational states and decision-making processes, and raise the possibility that behavior under social pressure can be predicted by the individual characteristics of stress responsiveness.

  5. Turbulent transport measurements in a cold model of GT-burner at realistic flow rates

    Directory of Open Access Journals (Sweden)

    Gobyzov Oleg

    2016-01-01

    In the present work, simultaneous velocity field and passive admixture concentration field measurements at realistic flow-rate conditions are reported for a non-reacting flow in a model combustion chamber with an industrial mixing device. In the experiments, for safety reasons, the real fuel (natural gas) was replaced with neon gas to simulate stratification in a strongly swirling flow. Measurements were performed by means of planar laser-induced fluorescence (PLIF) and particle image velocimetry (PIV) at a Reynolds number, based on the mean flow rate and nozzle diameter, of ≈300 000. Details on the experimental technique, features of the experimental setup, image and data preprocessing procedures, and results of the performed measurements are given in the paper. In addition to the raw velocity and admixture concentration data, in-depth evaluation approaches aimed at estimating the turbulent kinetic energy (TKE) components, assessing the turbulent Schmidt number, and analysing the gradient closure hypothesis from experimental data are presented.
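
    The gradient-closure quantities mentioned here (turbulent flux, eddy diffusivity, turbulent Schmidt number) can be estimated from paired velocity and concentration fields, in the spirit of combined PIV/PLIF processing. The sketch below uses synthetic fluctuation fields with a known closure built in; all profiles and coefficients are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
nt, ny = 5000, 64            # snapshots x wall-normal points

dUdy, dCdy = 2.0, -0.5       # assumed constant mean gradients
nu_t_true, D_t_true = 2e-3, 1.6e-3   # built-in closure, so Sc_t = 1.25

# Synthetic fluctuations constructed so that, on average,
#   <u'v'> = -nu_t * dU/dy   and   <v'c'> = -D_t * dC/dy.
sig = 0.1
v = rng.normal(0.0, sig, (nt, ny))
u = (-nu_t_true * dUdy / sig**2) * v + rng.normal(0.0, 0.05, (nt, ny))
c = (-D_t_true * dCdy / sig**2) * v + rng.normal(0.0, 0.05, (nt, ny))

uv = (u * v).mean(axis=0)    # Reynolds shear stress profile <u'v'>
vc = (v * c).mean(axis=0)    # turbulent scalar flux profile <v'c'>

nu_t = (-uv / dUdy).mean()   # eddy viscosity via gradient closure
D_t = (-vc / dCdy).mean()    # eddy diffusivity via gradient closure
Sc_t = nu_t / D_t            # turbulent Schmidt number
```

    With real PIV/PLIF data the mean gradients would come from the measured profiles rather than being prescribed, and the scatter of Sc_t across the field is itself a check on the gradient closure hypothesis.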

  6. Downscaling Ocean Conditions: Initial Results using a Quasigeostrophic and Realistic Ocean Model

    Science.gov (United States)

    Katavouta, Anna; Thompson, Keith

    2014-05-01

    Previous theoretical work (Henshaw et al., 2003) has shown that the small-scale modes of variability of solutions of the unforced, incompressible Navier-Stokes equation, and Burgers' equation, can be reconstructed with surprisingly high accuracy from the time history of a few of the large-scale modes. Motivated by this theoretical work, we first describe a straightforward method for assimilating information on the large scales in order to recover the small-scale oceanic variability. The method is based on nudging in specific wavebands and frequencies and is similar to the so-called spectral nudging method that has been used successfully for atmospheric downscaling with limited-area models (e.g. von Storch et al., 2000). The validity of the method is tested using a quasigeostrophic model configured to simulate a double ocean gyre separated by an unstable mid-ocean jet. It is shown that important features of the ocean circulation, including the position of the meandering mid-ocean jet and associated pinch-off eddies, can indeed be recovered from the time history of a small number of large-scale modes. The benefit of assimilating additional time series of observations from a limited number of locations, that alone are too sparse to significantly improve the recovery of the small scales using traditional assimilation techniques, is also demonstrated using several twin experiments. The final part of the study outlines the application of the approach using a realistic high-resolution (1/36 degree) model, based on the NEMO (Nucleus for European Modelling of the Ocean) modelling framework, configured for the Scotian Shelf off the east coast of Canada. The large-scale conditions used in this application are obtained from the HYCOM (HYbrid Coordinate Ocean Model) + NCODA (Navy Coupled Ocean Data Assimilation) global 1/12 degree analysis product. Henshaw, W., Kreiss, H.-O., Ystrom, J., 2003.
Numerical experiments on the interaction between the larger- and the small-scale motion of
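
    The waveband-restricted nudging idea can be sketched in one dimension: only Fourier modes below a cut-off wavenumber are relaxed toward a large-scale reference, leaving the small scales free to evolve. This is a schematic toy, not the quasigeostrophic or NEMO implementation:

```python
import numpy as np

def spectral_nudge(field, reference, kmax, alpha):
    """Relax only the large-scale Fourier modes (k <= kmax) of `field`
    toward `reference`; smaller scales are left untouched."""
    fk = np.fft.rfft(field)
    rk = np.fft.rfft(reference)
    band = np.arange(fk.size) <= kmax
    fk[band] += alpha * (rk[band] - fk[band])
    return np.fft.irfft(fk, n=field.size)

rng = np.random.default_rng(0)
n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

reference = np.sin(x) + 0.5 * np.cos(2.0 * x)            # "truth" large scales
field = 0.2 * np.sin(x) + 0.05 * rng.standard_normal(n)  # drifted model state

for _ in range(200):
    field = spectral_nudge(field, reference, kmax=4, alpha=0.05)

# Residual mismatch in the nudged (large-scale) band:
large_scale_err = np.abs(np.fft.rfft(field - reference)[:5]).max()
```

    Repeated application drives the selected band toward the reference geometrically (factor 1 - alpha per step) while the unnudged small-scale content is preserved, which is the property that makes the method attractive for downscaling.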

  7. A realistic closed-form radiobiological model of clinical tumor-control data incorporating intertumor heterogeneity

    International Nuclear Information System (INIS)

    Roberts, Stephen A.; Hendry, Jolyon H.

    1998-01-01

    Purpose: To investigate the role of intertumor heterogeneity in clinical tumor-control datasets and its relationship to in vitro measurements of tumor biopsy samples; specifically, to develop a modified linear-quadratic (LQ) model incorporating such heterogeneity that is practical to fit to clinical tumor-control datasets. Methods and Materials: We developed a modified version of the LQ model for tumor control, incorporating a (lagged) time factor to allow for tumor cell repopulation. We explicitly took into account the interpatient heterogeneity in clonogen number, radiosensitivity, and repopulation rate. Using this model, we could generate realistic TCP curves using parameter estimates consistent with those reported from in vitro studies, subject to the inclusion of a radiosensitivity (or dose)-modifying factor. We then demonstrated that the model was dominated by the heterogeneity in α (tumor radiosensitivity) and derived an approximate simplified model incorporating this heterogeneity. This simplified model is expressible in a compact closed form that is practical to fit to clinical datasets. Using two previously analysed datasets, we fit the model using direct maximum-likelihood techniques and obtained parameter estimates that were, again, consistent with the experimental data on the radiosensitivity of primary human tumor cells. This heterogeneity model includes the same number of adjustable parameters as the standard LQ model. Results: The modified model provides parameter estimates that can easily be reconciled with the in vitro measurements. The simplified (approximate) form of the heterogeneity model is a compact, closed-form probit function that can readily be fitted to clinical series by conventional maximum-likelihood methodology. This heterogeneity model provides a slightly better fit to the datasets than the conventional LQ model, with the same number of fitted parameters. The parameter estimates of the clinically
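
    The central effect modelled here, interpatient heterogeneity in α flattening the population dose-response curve, can be sketched numerically. The parameter values below are illustrative assumptions, not the paper's fitted estimates:

```python
import numpy as np

def tcp_poisson(D, n_frac, alpha, beta, clonogens):
    """Poisson TCP for the LQ model, total dose D in n_frac equal fractions."""
    d = D / n_frac
    log_sf = -(alpha * d + beta * d**2) * n_frac   # total log cell survival
    return np.exp(-clonogens * np.exp(log_sf))

def tcp_population(D, n_frac, alpha_mean, alpha_sd, beta, clonogens,
                   n_patients=20000, seed=1):
    """Average the individual TCP over Gaussian interpatient spread in alpha."""
    rng = np.random.default_rng(seed)
    alphas = rng.normal(alpha_mean, alpha_sd, n_patients)
    alphas = alphas[alphas > 0.0]                  # drop unphysical values
    return tcp_poisson(D, n_frac, alphas, beta, clonogens).mean()

# Homogeneous vs heterogeneous dose-response at three dose levels (Gy):
doses = (40.0, 60.0, 80.0)
homog = [tcp_poisson(D, 30, 0.3, 0.03, 1e7) for D in doses]
heterog = [tcp_population(D, 30, 0.3, 0.1, 0.03, 1e7) for D in doses]
```

    Heterogeneity flattens the curve: the population response is higher at low dose and lower at high dose than the single-α prediction, which is why fits that ignore it return effective radiosensitivities hard to reconcile with in vitro data.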

  8. How realistic are air quality hindcasts driven by forcings from climate model simulations?

    Science.gov (United States)

    Lacressonnière, G.; Peuch, V.-H.; Arteta, J.; Josse, B.; Joly, M.; Marécal, V.; Saint Martin, D.; Déqué, M.; Watson, L.

    2012-12-01

    Predicting how European air quality could evolve over the next decades in the context of a changing climate requires the use of climate models to produce results that can be averaged in a climatologically and statistically sound manner. This is a very different approach from the one generally used for air quality hindcasts for the present period, in which analysed meteorological fields are used to represent each specific date and hour. Differences arise both from the fact that a climate model run results in pure model output, with no influence from observations (which are useful to correct for a range of errors), and from the fact that in a "climate" set-up, simulations on a given day, month or even season cannot be related to any specific period of time (but can only be interpreted in a climatological sense). Hence, although an air quality model can be thoroughly validated in a "realistic" set-up using analysed meteorological fields, the question remains of how far its outputs can be interpreted in a "climate" set-up. For this purpose, we focus on Europe and on the current decade using three 5-yr simulations performed with the multiscale chemistry-transport model MOCAGE, using meteorological forcings either from operational meteorological analyses or from climate simulations. We investigate how statistical skill indicators compare in the different simulations, discriminating also the effects of meteorology on atmospheric fields (winds, temperature, humidity, pressure, etc.) and on the dependent emissions and deposition processes (volatile organic compound emissions, deposition velocities, etc.). Our results show in particular how differing boundary layer heights and deposition velocities affect horizontal and vertical distributions of species. When the model is driven by operational analyses, the simulation accurately reproduces the observed values of O3, NOx, SO2 and, with some bias that can be explained by the set-up, PM10.
We study how the simulations driven by climate
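
    A minimal sketch of the kind of statistical skill comparison described here: an analysis-driven run tracks observations hour by hour, while a climate-driven run can only be expected to reproduce the distribution, not the timing. The data below are synthetic placeholders:

```python
import numpy as np

def skill(model, obs):
    """Bias, RMSE and correlation: basic hindcast skill indicators."""
    bias = float(np.mean(model - obs))
    rmse = float(np.sqrt(np.mean((model - obs) ** 2)))
    corr = float(np.corrcoef(model, obs)[0, 1])
    return bias, rmse, corr

rng = np.random.default_rng(0)
obs = 30.0 + 10.0 * rng.standard_normal(1000)               # e.g. hourly O3, ppb

analysis_run = obs + 2.0 + 3.0 * rng.standard_normal(1000)  # follows each hour
climate_run = 30.0 + 10.0 * rng.standard_normal(1000)       # right climate only

bias_a, rmse_a, corr_a = skill(analysis_run, obs)
bias_c, rmse_c, corr_c = skill(climate_run, obs)
```

    corr_a is high because the analysis-driven run is tied to each date and hour; corr_c is near zero even though the climate-driven run has the right mean and variance, so the latter must be evaluated climatologically (distributions, seasonal means) rather than pointwise.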

  9. Realistic modeling of seismic input for megacities and large urban areas

    Science.gov (United States)

    Panza, G. F.; Unesco/Iugs/Igcp Project 414 Team

    2003-04-01

    The project addressed the problem of pre-disaster orientation: hazard prediction, risk assessment, and hazard mapping, in connection with seismic activity and man-induced vibrations. The definition of realistic seismic input has been obtained from the computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different source and structural models. The innovative modelling technique, which constitutes the common tool of the entire project, takes into account source, propagation and local site effects. This is done using first principles of physics about wave generation and propagation in complex media, and does not require resorting to convolutive approaches, which have proven quite unreliable, mainly when dealing with complex geological structures, the most interesting ones from the practical point of view. In fact, several techniques that have been proposed to empirically estimate site effects, using observations convolved with theoretically computed signals corresponding to simplified models, supply reliable information about the site response to non-interfering seismic phases. They are not adequate in most real cases, when the seismic signal is formed by several interfering waves. The availability of realistic numerical simulations enables us to reliably estimate the amplification effects even in complex geological structures, exploiting the available geotechnical, lithological and geophysical parameters, the topography of the medium, tectonic, historical and palaeoseismological data, and seismotectonic models. The realistic modelling of the ground motion is a very important base of knowledge for the preparation of ground-shaking scenarios, which represent a valid and economic tool for seismic microzonation.
This knowledge can be very fruitfully used by civil engineers in the design of new seismo-resistant constructions and in the reinforcement of the existing built environment, and, therefore

  10. Mathematics Instructional Model Based on Realistic Mathematics Education to Promote Problem Solving Ability at Junior High School Padang

    OpenAIRE

    Edwin Musdi

    2016-01-01

    This research aims to develop a mathematics instructional model based on realistic mathematics education (RME) to promote students' problem-solving abilities. The design research used the Plomp model, which consists of a preliminary phase, a development or prototyping phase, and an assessment phase. In this study, only the first two phases were conducted. The first phase, a preliminary investigation, was carried out with a literature study to examine the theory-based RME instructional learning model, characterist...

  11. Model-based dose calculations for COMS eye plaque brachytherapy using an anatomically realistic eye phantom.

    Science.gov (United States)

    Lesperance, Marielle; Inglis-Whalen, M; Thomson, R M

    2014-02-01

    To investigate the effects of the composition and geometry of ocular media and tissues surrounding the eye on dose distributions for COMS eye plaque brachytherapy with (125)I, (103)Pd, or (131)Cs seeds, and to investigate doses to ocular structures. An anatomically and compositionally realistic voxelized eye model with a medial tumor is developed based on a literature review. Mass energy absorption and attenuation coefficients for ocular media are calculated. Radiation transport and dose deposition are simulated using the EGSnrc Monte Carlo user-code BrachyDose for a fully loaded COMS eye plaque within a water phantom and our full eye model for the three radionuclides. A TG-43 simulation with the same seed configuration in a water phantom, neglecting the plaque and interseed effects, is also performed. The impact on dose distributions of varying tumor position, as well as tumor and surrounding tissue media, is investigated. Each simulation and radionuclide is compared using isodose contours, dose volume histograms for the lens and tumor, maximum, minimum, and average doses to structures of interest, and doses to voxels of interest within the eye. Mass energy absorption and attenuation coefficients of the ocular media differ from those of water by as much as 12% within the 20-30 keV photon energy range. For all radionuclides studied, average doses to the tumor and lens regions in the full eye model differ from those for the plaque in water by 8%-10% and 13%-14%, respectively; the average doses to the tumor and lens regions differ between the full eye model and the TG-43 simulation by 2%-17% and 29%-34%, respectively. Replacing the surrounding tissues in the eye model with water increases the maximum and average doses to the lens by 2% and 3%, respectively. Substituting the tumor medium in the eye model for water, soft tissue, or an alternate melanoma composition affects tumor dose compared to the default eye model simulation by up to 16%. In the full eye model

  13. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database, and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated in some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being industrially used in a wide range of irradiation conditions. (authors)
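
    The realistic methodology described, Monte Carlo random sampling of all relevant input variables, can be sketched generically. Everything below (the response function, the distributions, the 95th-percentile margin) is a placeholder for illustration, not GALILEO's models or uncertainties:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Hypothetical uncertain inputs, as relative multipliers around best estimate.
gap_conductance = rng.normal(1.0, 0.05, n)   # model uncertainty
power = rng.normal(1.0, 0.02, n)             # local power uncertainty
conductivity = rng.normal(1.0, 0.04, n)      # fuel conductivity uncertainty

# Placeholder response: a peak-temperature-like figure of merit.
response = 1000.0 * power / (conductivity * gap_conductance**0.5)

best_estimate = 1000.0
p95 = float(np.percentile(response, 95))     # upper-tolerance-type value
margin_penalty = p95 - best_estimate         # uncertainty allowance
```

    The design margin is then evaluated against the upper percentile of the sampled response rather than against a stack of worst-case penalties, which is the practical benefit of the Monte Carlo approach.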

  14. Time lags in biological models

    CERN Document Server

    MacDonald, Norman

    1978-01-01

    In many biological models it is necessary to allow the rates of change of the variables to depend on the past history, rather than only the current values, of the variables. The models may require discrete lags, with the use of delay-differential equations, or distributed lags, with the use of integro-differential equations. In these lecture notes I discuss the reasons for including lags, especially distributed lags, in biological models. These reasons may be inherent in the system studied, or may be the result of simplifying assumptions made in the model used. I examine some of the techniques available for studying the solution of the equations. A large proportion of the material presented relates to a special method that can be applied to a particular class of distributed lags. This method uses an extended set of ordinary differential equations. I examine the local stability of equilibrium points, and the existence and frequency of periodic solutions. I discuss the qualitative effects of lags, and how these...
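
    The "extended set of ordinary differential equations" method for a particular class of distributed lags is commonly known as the linear chain trick: a gamma-distributed delay kernel is represented exactly by a cascade of first-order stages. A minimal sketch for logistic growth with a delayed crowding term (parameter values are illustrative):

```python
import numpy as np

def delayed_logistic(r=0.2, K=1.0, a=1.0, stages=3, T=200.0, dt=0.01):
    """Logistic growth whose crowding term sees N through a gamma(stages, 1/a)
    memory kernel, rewritten via the linear chain trick as ODEs:
        N'   = r * N * (1 - y_m / K)
        y_1' = a * (N - y_1)
        y_j' = a * (y_{j-1} - y_j),   j = 2..stages
    Integrated with forward Euler for simplicity.
    """
    N = 0.1
    y = np.full(stages, 0.1)
    for _ in range(int(T / dt)):
        dN = r * N * (1.0 - y[-1] / K)
        dy = a * (np.concatenate(([N], y[:-1])) - y)
        N += dt * dN
        y += dt * dy
    return N, y

N_final, y_final = delayed_logistic()
```

    With mean delay stages/a = 3 and r = 0.2, the product of growth rate and delay is well below the oscillatory threshold, so the chain damps and N settles at the carrying capacity; increasing r or the delay pushes the system into sustained oscillations, one of the qualitative effects of lags discussed in the notes.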

  15. A realistic intersecting D6-brane model after the first LHC run

    Science.gov (United States)

    Li, Tianjun; Nanopoulos, D. V.; Raza, Shabbar; Wang, Xiao-Chuan

    2014-08-01

    With the Higgs boson mass around 125 GeV and the LHC supersymmetry search constraints, we revisit a three-family Pati-Salam model from intersecting D6-branes in Type IIA string theory on the T^6/(ℤ2 × ℤ2) orientifold, which has realistic phenomenology. We systematically scan the parameter space for μ > 0 and μ < 0, and find that the gravitino mass is generically heavier than about 2 TeV in both cases due to the Higgs mass lower bound of 123 GeV. In particular, we identify a region of parameter space with electroweak fine-tuning as small as Δ_EW ~ 24-32 (3-4%). In the viable parameter space consistent with all current constraints, the mass ranges for the gluino, the first two-generation squarks, and the sleptons are [3, 18] TeV, [3, 16] TeV, and [2, 7] TeV, respectively. For the third-generation sfermions, the light stop satisfying the 5σ WMAP bounds via neutralino-stop coannihilation has mass from 0.5 to 1.2 TeV, and the light stau can be as light as 800 GeV. We also show various coannihilation and resonance scenarios through which the observed dark matter relic density is achieved. Interestingly, certain portions of the parameter space have excellent t-b-τ and b-τ Yukawa coupling unification. Three regions of parameter space are highlighted as well where the dominant component of the lightest neutralino is a bino, wino, or higgsino. We discuss various scenarios in which such solutions may evade recent astrophysical bounds in case they satisfy or exceed the observed relic density bounds. Prospects for finding a higgsino-like neutralino in direct and indirect searches are also studied, and we display six tables of benchmark points depicting various interesting features of our model. Noting that the lightest neutralino can be as heavy as 2.8 TeV, and that there exists a natural region of parameter space (by the low-energy fine-tuning definition) with heavy gluino and first two-generation squarks/sleptons, we point out that the 33 TeV and 100 TeV proton-proton colliders are indeed

  16. Modelling the performance of interferometric gravitational-wave detectors with realistically imperfect optics

    Science.gov (United States)

    Bochner, Brett

    1998-12-01

    The LIGO project is part of a world-wide effort to detect the influx of gravitational waves upon the earth from astrophysical sources, via their interaction with laser beams in interferometric detectors that are designed for extraordinarily high sensitivity. Central to the successful performance of LIGO detectors is the quality of their optical components, and the efficient optimization of interferometer configuration parameters. To predict LIGO performance with optics possessing realistic imperfections, we have developed a numerical simulation program to compute the steady-state electric fields of a complete, coupled-cavity LIGO interferometer. The program can model a wide variety of deformations, including laser beam mismatch and/or misalignment, finite mirror size, mirror tilts, curvature distortions, mirror surface roughness, and substrate inhomogeneities. Important interferometer parameters are automatically optimized during program execution to achieve the best possible sensitivity for each new set of perturbed mirrors. This thesis includes investigations of two interferometer designs: the initial LIGO system, and an advanced LIGO configuration called Dual Recycling. For Initial-LIGO simulations, the program models carrier and sideband frequency beams to compute the explicit shot-noise-limited gravitational wave sensitivity of the interferometer. It is demonstrated that optics of exceptional quality (root-mean-square deformations of less than ~1 nm in the central mirror regions) are necessary to meet Initial-LIGO performance requirements, but that this quality can feasibly be achieved. It is also shown that improvements in mirror quality can substantially increase LIGO's sensitivity to selected astrophysical sources. For Dual Recycling, the program models gravitational-wave-induced sidebands over a range of frequencies to demonstrate that the tuned and narrow-banded signal responses predicted for this configuration can be achieved with imperfect optics. Dual Recycling

  17. Realistic modelling of external flooding scenarios - A multi-disciplinary approach

    International Nuclear Information System (INIS)

    Brinkman, J.L.

    2014-01-01

    against flooding and timing of the events into account as a basis for the development and screening of flooding scenarios. Realistic modelling of external flooding scenarios in a PSA requires a multi-disciplinary approach. In addition to being thoroughly familiar with the plant's design features against flooding, such as the critical elevations of safety(-related) equipment and the strength of buildings, additional knowledge is necessary on the design of flood protection measures such as dikes and dunes, and on their failure behaviour and modelling. The approach does not change the basic flooding scenarios (the event tree structure) itself, but impacts the initiating event of the specific flooding scenarios. (authors)

  18. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo; Artina, Marco; Foransier, Massimo; Markowich, Peter A.

    2015-01-01

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation

  19. Modelling Analysis of Echo Signature and Target Strength of a Realistically Modelled Ship Wake for a Generic Forward Looking Active Sonar

    NARCIS (Netherlands)

    Schippers, P.

    2009-01-01

    The acoustic modelling in TNO's ALMOST (Acoustic Loss Model for Operational Studies and Tasks) uses a bubble migration model as realistic input for wake modelling. The modelled bubble cloud represents the actual ship wake. Ship hull, propeller and bow wave are the main generators of bubbles in the

  20. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    International Nuclear Information System (INIS)

    Ehlert, Kurt; Loewe, Laurence

    2014-01-01

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method, and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10x faster simulations for its use-cases, with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise.
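
    The idea can be sketched on top of Gillespie's Direct Method: reactions whose propensities depend on a hub species use a cached hub count that is refreshed only when the true count drifts past a threshold. The three-reaction system below is a toy; for clarity the sketch recomputes the whole propensity vector each step, whereas a real implementation would also skip unaffected propensities via a dependency graph:

```python
import numpy as np

def simulate(t_end=5.0, threshold=0.05, seed=1):
    """Toy SSA with Lazy Updating of a hub species H (e.g. ATP-like)."""
    rng = np.random.default_rng(seed)
    H, A, B = 10_000, 500, 0
    k = np.array([50.0, 1e-5, 0.1])    # rate constants of the three reactions
    H_cached = H                       # stale hub value used in propensities
    hub_refreshes = 0
    t = 0.0
    while t < t_end:
        # Propensities for:  0 -> H;   H + A -> B;   B -> A + H
        a = np.array([k[0], k[1] * H_cached * A, k[2] * B])
        a_sum = a.sum()
        t += rng.exponential(1.0 / a_sum)
        r = rng.choice(3, p=a / a_sum)
        if r == 0:
            H += 1
        elif r == 1:
            H -= 1; A -= 1; B += 1
        else:
            B -= 1; A += 1; H += 1
        # Lazy Updating: refresh the cached hub only on sufficient drift.
        if abs(H - H_cached) > threshold * H_cached:
            H_cached = H
            hub_refreshes += 1
    return (H, A, B), hub_refreshes

counts, refreshes = simulate()
H_end, A_end, B_end = counts
```

    Because the hub count is large, using a slightly stale value perturbs the hub-dependent propensities by at most the threshold fraction, which is the controlled accuracy loss the authors report.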

  1. Track structure in biological models.

    Science.gov (United States)

    Curtis, S B

    1986-01-01

    High-energy heavy ions in the galactic cosmic radiation (HZE particles) may pose a special risk during long term manned space flights outside the sheltering confines of the earth's geomagnetic field. These particles are highly ionizing, and they and their nuclear secondaries can penetrate many centimeters of body tissue. The three dimensional patterns of ionizations they create as they lose energy are referred to as their track structure. Several models of biological action on mammalian cells attempt to treat track structure or related quantities in their formulation. The methods by which they do this are reviewed. The proximity function is introduced in connection with the theory of Dual Radiation Action (DRA). The ion-gamma kill (IGK) model introduces the radial energy-density distribution, which is a smooth function characterizing both the magnitude and extension of a charged particle track. The lethal, potentially lethal (LPL) model introduces lambda, the mean distance between relevant ion clusters or biochemical species along the track. Since very localized energy depositions (within approximately 10 nm) are emphasized, the proximity function as defined in the DRA model is not of utility in characterizing track structure in the LPL formulation.

  2. Integrating systems biology models and biomedical ontologies.

    Science.gov (United States)

    Hoehndorf, Robert; Dumontier, Michel; Gennari, John H; Wimalaratne, Sarala; de Bono, Bernard; Cook, Daniel L; Gkoutos, Georgios V

    2011-08-11

    Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms.

  3. Learning from Nature - Mapping of Complex Hydrological and Geomorphological Process Systems for More Realistic Modelling of Hazard-related Maps

    Science.gov (United States)

    Chifflard, Peter; Tilch, Nils

    2010-05-01

    , where the fluvial bank erosion only plays a minor role as an initiating factor. On the other hand, fluvial bank erosion does appear to be a cause of smaller mass movements in their final stage which develop spontaneously, most noticeably in regions of gravel-rich (coarse-grained) soils and of shallow weathered material (several decimetres). - Numerous marks of surface runoff were found over the entire catchment area, with highly variable extent and intensity. In the more eastern parts of the catchment, these signs can be linked especially to anthropogenic concentrated inputs of surface discharge, e.g. the drainage system of streets. Their spread is limited, but usually associated with huge erosion channels of up to 2 m depth. In the western parts of the catchment, however, signs of surface discharge are more commonly found in forests. Depending on their location, they can be a result of an up-hill infiltration surplus in areas of fields and pastures, or of an infiltration surplus in the forest itself. In many places, rapid interflow takes place through biologically created macropores and often re-emerges at the surface in the form of return flow. In general, it is noticeable that marks of surface runoff often terminate at the scarps of landslides that were not caused by fluvial bank erosion. The excess water produces strong local saturation of the ground, which increases the landslide susceptibility of the embankment. Future work: Based on the acquired field knowledge, it was possible to distinguish areas of different heterogeneity/homogeneity of the dominant process chains for several micro-scale parts of the catchment area. Subsequently, conceptual slope profiles should be derived from the detailed field data, including information on the dominant and complex process systems. This forms an essential starting point for realistically considering relevant hazard-related processes as part of process-oriented modelling.

  4. Spatial Modeling Tools for Cell Biology

    National Research Council Canada - National Science Library

    Przekwas, Andrzej; Friend, Tom; Teixeira, Rodrigo; Chen, Z. J; Wilkerson, Patrick

    2006-01-01

    .... Scientific potential and military relevance of computational biology and bioinformatics have inspired DARPA/IPTO's visionary BioSPICE project to develop a computational framework and modeling tools for cell biology...

  5. A realistic approach to modeling an in-duct desulfurization process based on an experimental pilot plant study

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, F.J.G.; Ollero, P. [University of Seville, Seville (Spain)

    2008-07-15

    This paper provides a realistic approach to modeling an in-duct desulfurization process, motivated by the disagreement between the results predicted by published kinetic models of the reaction between hydrated lime and SO{sub 2} at low temperature and the experimental results obtained in pilot plants where this process takes place. Results were obtained from an experimental program carried out in a 3-MWe pilot plant. Additionally, five kinetic models from the literature for the sulfation of Ca(OH){sub 2} at low temperatures were assessed by simulation; the desulfurization efficiencies they predict are clearly lower than those obtained experimentally in our own pilot plant as well as in others. Next, a general model was fitted by minimizing the difference between the calculated results and the experimental results from the pilot plant, using Matlab{sup TM}; the number of parameters was reduced as much as possible, to only two. Finally, after implementing this model in a simulation tool of the in-duct sorbent injection process, it was validated and shown to yield a realistic approach useful both for analyzing results and for aiding in the design of an in-duct desulfurization process.

  6. Regional 3-D Modeling of Ground Geoelectric Field for the Northeast United States due to Realistic Geomagnetic Disturbances

    Science.gov (United States)

    Ivannikova, E.; Kruglyakov, M.; Kuvshinov, A. V.; Rastaetter, L.; Pulkkinen, A. A.; Ngwira, C. M.

    2017-12-01

    During extreme space weather events, electric currents in the Earth's magnetosphere and ionosphere experience large variations, which lead to dramatic intensification of the fluctuating magnetic field at the surface of the Earth. According to Faraday's law of induction, the fluctuating geomagnetic field in turn induces an electric field that generates harmful currents (so-called "geomagnetically induced currents", GICs) in grounded technological systems. Understanding (via modeling) the spatio-temporal evolution of the geoelectric field during enhanced geomagnetic activity is therefore a key consideration in estimating the hazard to technological systems from space weather. We present the results of ground geoelectric field modeling for the Northeast United States, performed with our novel numerical tool based on an integral equation approach. The tool exploits realistic regional three-dimensional (3-D) models of the Earth's electrical conductivity and realistic global models of the spatio-temporal evolution of the magnetospheric and ionospheric current systems responsible for geomagnetic disturbances. We also explore in detail the manifestation of the coastal effect (anomalous intensification of the geoelectric field near the coasts) in this region.
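
    The paper's tool solves a full 3-D integral-equation problem; as a much simpler point of reference, the classical 1-D plane-wave (magnetotelluric) approximation relates the surface geoelectric field to the disturbing magnetic field through a half-space impedance. A sketch with illustrative values for the disturbance amplitude, period and ground conductivity (these numbers are assumptions, not the paper's):

```python
import cmath, math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def surface_impedance(omega, sigma):
    """1-D half-space surface impedance Z = sqrt(i*omega*mu0/sigma) (ohm)."""
    return cmath.sqrt(1j * omega * MU0 / sigma)

def e_amplitude(b_amplitude, period_s, sigma):
    """|E| (V/m) at the surface for a sinusoidal horizontal B disturbance."""
    omega = 2.0 * math.pi / period_s
    h = b_amplitude / MU0  # horizontal magnetic field H = B / mu0 (A/m)
    return abs(surface_impedance(omega, sigma)) * h

# Illustrative storm-time numbers: 500 nT at a 10-minute period, sigma = 0.01 S/m:
e = e_amplitude(500e-9, 600.0, 0.01)
```

The 1-D formula cannot reproduce 3-D features such as the coastal effect, which is precisely why the paper needs realistic regional conductivity models.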

  7. A biological compression model and its applications.

    Science.gov (United States)

    Cao, Minh Duc; Dix, Trevor I; Allison, Lloyd

    2011-01-01

    A biological compression model, the expert model, is presented that is superior to existing compression algorithms in both compression performance and speed. The model is able to compress whole eukaryotic genomes. Most importantly, it provides a framework for knowledge discovery from biological data, and can be used for repeat element discovery, sequence alignment and phylogenetic analysis. We demonstrate that the model can handle statistically biased sequences and distantly related sequences where conventional knowledge discovery tools often fail.
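
    The expert model combines several statistical experts; as a hedged illustration of the underlying idea only, a single adaptive context model already assigns short code lengths to repetitive DNA and longer ones to less structured sequences:

```python
import math
from collections import defaultdict

def encoded_bits(seq, order=2):
    """Ideal code length (bits) of a DNA sequence under an adaptive order-k
    context model with a Laplace prior over {A, C, G, T}. This is a toy
    stand-in for one 'expert', not the paper's full expert model."""
    counts = defaultdict(lambda: {c: 1 for c in "ACGT"})
    bits = 0.0
    for i, symbol in enumerate(seq):
        table = counts[seq[max(0, i - order):i]]
        bits += -math.log2(table[symbol] / sum(table.values()))
        table[symbol] += 1  # adapt after coding, as a real compressor would
    return bits

repetitive = "ACGT" * 32                             # highly structured
random_ish = "AACGTGCATTGACCGTAGGCTAACGGTTCAGA" * 4  # less structured
```

Comparing `encoded_bits` on the two equal-length strings shows how structure translates directly into compressibility, which is what makes compression usable for repeat discovery.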

  8. Credit Card Fraud Detection: A Realistic Modeling and a Novel Learning Strategy.

    Science.gov (United States)

    Dal Pozzolo, Andrea; Boracchi, Giacomo; Caelen, Olivier; Alippi, Cesare; Bontempi, Gianluca

    2017-09-14

    Detecting frauds in credit card transactions is perhaps one of the best testbeds for computational intelligence algorithms. In fact, this problem involves a number of relevant challenges, namely: concept drift (customers' habits evolve and fraudsters change their strategies over time), class imbalance (genuine transactions far outnumber frauds), and verification latency (only a small set of transactions are timely checked by investigators). However, the vast majority of learning algorithms that have been proposed for fraud detection rely on assumptions that hardly hold in a real-world fraud-detection system (FDS). This lack of realism concerns two main aspects: 1) the way and timing with which supervised information is provided and 2) the measures used to assess fraud-detection performance. This paper has three major contributions. First, we propose, with the help of our industrial partner, a formalization of the fraud-detection problem that realistically describes the operating conditions of FDSs that analyze massive streams of credit card transactions every day. We also illustrate the most appropriate performance measures to be used for fraud-detection purposes. Second, we design and assess a novel learning strategy that effectively addresses class imbalance, concept drift, and verification latency. Third, in our experiments, we demonstrate the impact of class imbalance and concept drift in a real-world data stream containing more than 75 million transactions, authorized over a time window of three years.
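
    One of the challenges named above, class imbalance, is commonly countered by undersampling the genuine class before training. A minimal sketch of that step (not the paper's learning strategy, which additionally handles drift and verification latency):

```python
import random

def undersample(transactions, labels, ratio=1.0, seed=0):
    """Rebalance a fraud dataset by undersampling the genuine (majority) class.
    `ratio` is the desired genuine:fraud ratio after sampling."""
    rng = random.Random(seed)
    frauds = [t for t, y in zip(transactions, labels) if y == 1]
    genuine = [t for t, y in zip(transactions, labels) if y == 0]
    keep = rng.sample(genuine, min(len(genuine), int(ratio * len(frauds))))
    data = [(t, 1) for t in frauds] + [(t, 0) for t in keep]
    rng.shuffle(data)
    return data

# 10 frauds hidden among 1000 genuine transactions -> a balanced 20-sample set:
balanced = undersample(list(range(1010)), [1] * 10 + [0] * 1000, ratio=1.0)
```

Undersampling changes the class priors seen by the classifier, which is one reason the paper argues that evaluation measures must be chosen with the real operating conditions in mind.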

  9. Realistic modelling of the effects of asynchronous motion at the base of bridge piers

    International Nuclear Information System (INIS)

    Romanelli, F.; Panza, G.F.; Vaccari, F.

    2002-11-01

    Frequently long-span bridges provide deep valley crossings, which require special consideration due to the possibility of local amplification of the ground motion as a consequence of topographical irregularities and local soil conditions. This does in fact cause locally enhanced seismic input with the possibility for the bridge piers to respond asynchronously. This introduces special design requirements so that possible out-of-phase ground displacements and the associated large relative displacements of adjacent piers can be accommodated without excessive damage. Assessment of the local variability of the ground motion due to local lateral heterogeneities and to attenuation properties is thus crucial toward the realistic definition of the asynchronous motion at the base of the bridge piers. We illustrate the work done in the framework of a large international cooperation to assess the importance of non-synchronous seismic excitation of long structures. To accomplish this task we compute complete synthetic accelerograms using as input a set of parameters that describes, to the best of our knowledge, the geological structure and seismotectonic setting of the investigated area. (author)

  10. Computation of Surface Laplacian for tri-polar ring electrodes on high-density realistic geometry head model.

    Science.gov (United States)

    Junwei Ma; Han Yuan; Sunderam, Sridhar; Besio, Walter; Lei Ding

    2017-07-01

    Neural activity inside the human brain generates electrical signals that can be detected on the scalp. Electroencephalography (EEG) is one of the most widely utilized techniques helping physicians and researchers to diagnose and understand various brain diseases. EEG signals have very high temporal resolution but poor spatial resolution. To achieve higher spatial resolution, a novel tri-polar concentric ring electrode (TCRE) has been developed to directly measure the Surface Laplacian (SL). The objective of the present study is to accurately calculate the SL for TCRE based on a realistic geometry head model. A locally dense mesh was proposed to represent the head surface, with the locally dense parts matching the small structural components of the TCRE; other areas were left without dense mesh to reduce the computational load. We conducted computer simulations to evaluate the performance of the proposed mesh and assessed possible numerical errors against a low-density model. Finally, with the achieved accuracy, we present the computed forward lead field of the SL for TCRE for the first time in a realistic geometry head model and demonstrate that it has better spatial resolution than the SL computed from classic EEG recordings.
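
    The study computes the SL on a realistic head mesh; purely for intuition (not the paper's boundary-element computation), the Laplacian of a potential sampled on a regular grid can be estimated with the textbook five-point finite-difference stencil:

```python
def laplacian_5pt(v, i, j, h):
    """Five-point finite-difference estimate of the Laplacian of a potential
    sampled on a regular grid with spacing h."""
    return (v[i + 1][j] + v[i - 1][j] + v[i][j + 1] + v[i][j - 1]
            - 4.0 * v[i][j]) / h ** 2

# Check against v(x, y) = x^2 + y^2, whose Laplacian is exactly 4 everywhere:
h = 0.1
grid = [[(i * h) ** 2 + (j * h) ** 2 for j in range(5)] for i in range(5)]
val = laplacian_5pt(grid, 2, 2, h)
```

The TCRE's appeal is that it estimates this second spatial derivative directly from concentric ring potentials instead of differencing a grid of conventional electrodes.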

  11. On the impacts of coarse-scale models of realistic roughness on a forward-facing step turbulent flow

    International Nuclear Information System (INIS)

    Wu, Yanhua; Ren, Huiying

    2013-01-01

    Highlights: ► Discrete wavelet transform was used to produce coarse-scale models of roughness. ► PIV measurements were performed in a forward-facing step flow with roughness of different scales. ► Impacts of roughness scales on various turbulence statistics were studied. -- Abstract: The present work explores the impacts of coarse-scale models of realistic roughness on turbulent boundary layers over forward-facing steps. Surface topographies of different scale resolutions were obtained from a novel multi-resolution analysis using the discrete wavelet transform. PIV measurements were performed in the streamwise–wall-normal (x–y) planes at two different spanwise positions in turbulent boundary layers at Re{sub h} = 3450 and δ/h = 8, where h is the mean step height and δ is the incoming boundary layer thickness. It was observed that large-scale but low-amplitude roughness scales had small effects on the forward-facing step turbulent flow. For the higher-resolution model of the roughness, the turbulence characteristics within 2h downstream of the steps were distinct from those over the original realistic rough step at the measurement position where the roughness profile has a positive slope immediately after the step's front. Much smaller differences exist in the flow characteristics at the other measurement position, whose roughness profile has a negative slope following the step's front.
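
    The multi-resolution decomposition of roughness can be illustrated with the simplest wavelet, the Haar DWT: discarding detail coefficients and reconstructing yields a coarse-scale model of a 1-D roughness profile. A sketch under that assumption (the abstract does not state which wavelet family was actually used):

```python
def haar_step(signal):
    """One level of the Haar DWT: (approximation, detail) coefficient lists.
    Assumes the signal length is even."""
    r = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / r for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / r for i in range(0, len(signal), 2)]
    return approx, detail

def coarse_model(signal, levels):
    """Coarse-scale roughness model: drop `levels` levels of Haar detail,
    then reconstruct with the discarded detail set to zero."""
    r = 2 ** 0.5
    for _ in range(levels):
        signal, _detail = haar_step(signal)
    for _ in range(levels):
        signal = [a / r for a in signal for _ in (0, 1)]
    return signal

# A roughness profile whose fine-scale oscillation is removed at level 1:
profile = coarse_model([1.0, 3.0, 1.0, 3.0], levels=1)
```

Each additional level removes a finer band of roughness scales, mirroring the paper's sequence of progressively smoothed step surfaces.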

  12. Dose related risk and effect assessment model (DREAM) -- A more realistic approach to risk assessment of offshore discharges

    International Nuclear Information System (INIS)

    Johnsen, S.; Furuholt, E.

    1995-01-01

    Risk assessment of discharges from offshore oil and gas production to the marine environment features determination of potential environmental concentration (PEC) levels and no observed effect concentration (NOEC) levels. The PEC values are normally based on dilution of chemical components in the actual discharge source in the recipient, while the NOEC values are determined by applying a safety factor to acute toxic effects from laboratory tests. The DREAM concept focuses on realistic exposure doses as a function of contact time and dilution, rather than fixed exposure concentrations of chemicals in long-time exposure regimes. In its present state, the DREAM model is based on a number of assumptions with respect to the link between real-life exposure doses and effects observed in laboratory tests. A research project has recently been initiated to develop the concept further, with special focus on chronic effects of different chemical compounds on the marine ecosystem. One of the questions that will be addressed is the link between exposure time, dose, concentration and effect. Validation of the safety factors applied for transforming acute toxic data into NOEC values will also be included. The DREAM model has been used by Statoil for risk assessment of discharges from new and existing offshore oil and gas production fields, and has been found to give much more realistic results than conventional risk assessment tools. The presentation outlines the background for the DREAM approach, describes the model in its present state, discusses further developments and applications, and shows a number of examples of the performance of DREAM.

  13. Continuum Modeling of Biological Network Formation

    KAUST Repository

    Albi, Giacomo; Burger, Martin; Haskovec, Jan; Markowich, Peter A.; Schlottbom, Matthias

    2017-01-01

    We present an overview of recent analytical and numerical results for the elliptic–parabolic system of partial differential equations proposed by Hu and Cai, which models the formation of biological transportation networks. The model describes

  14. The photometric evolution of dissolving star clusters. II. Realistic models. Colours and M/L ratios

    NARCIS (Netherlands)

    Anders, P.; Lamers, H.J.G.L.M.; BAumgardt, H.

    2009-01-01

    Evolutionary synthesis models are the primary means of constructing spectrophotometric models of stellar populations, and deriving physical parameters from observations compared with these models. One of the basic assumptions of evolutionary synthesis models has been the time-independence of the

  15. Using remotely sensed data and stochastic models to simulate realistic flood hazard footprints across the continental US

    Science.gov (United States)

    Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.

    2017-12-01

    Remotely sensed data has transformed the field of large scale hydraulic modelling. New digital elevation, hydrography and river width data has allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent has allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event `footprint'. Despite this, many national scale risk analyses still use `constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge to gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in
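
    The paper fits a conditional multivariate extremes model to gauged flows; as a much simpler stand-in, a Gaussian copula between two gauges already produces event footprints whose return period varies from site to site. A sketch with an assumed inter-gauge correlation (illustrative only, not the authors' method):

```python
import math, random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def simulate_footprints(rho, n_events, seed=0):
    """Synthetic event footprints at two gauges: correlated standard normals
    (Gaussian copula) mapped to per-gauge return periods T = 1/(1 - Phi(z))."""
    rng = random.Random(seed)
    events = []
    for _ in range(n_events):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        events.append((1.0 / (1.0 - phi(z1)), 1.0 / (1.0 - phi(z2))))
    return events

# An assumed inter-gauge correlation of 0.7: the same event has different
# return periods at the two sites, unlike a 'constant in space' hazard layer.
events = simulate_footprints(rho=0.7, n_events=10000)
```

Note that a Gaussian copula understates joint tail dependence, which is exactly why the paper uses a dedicated conditional extremes approach for the real gauge network.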

  16. A Biologically Realistic Cortical Model of Eye Movement Control in Reading

    Science.gov (United States)

    Heinzle, Jakob; Hepp, Klaus; Martin, Kevan A. C.

    2010-01-01

    Reading is a highly complex task involving a precise integration of vision, attention, saccadic eye movements, and high-level language processing. Although there is a long history of psychological research in reading, it is only recently that imaging studies have identified some neural correlates of reading. Thus, the underlying neural mechanisms…

  17. Mathematics Instructional Model Based on Realistic Mathematics Education to Promote Problem Solving Ability at Junior High School Padang

    Directory of Open Access Journals (Sweden)

    Edwin Musdi

    2016-02-01

    Full Text Available This research aims to develop a mathematics instructional model based on realistic mathematics education (RME) to promote students' problem-solving abilities. The design research used the Plomp model, which consists of a preliminary phase, a development or prototyping phase, and an assessment phase. In this study, only the first two phases were conducted. The first phase, a preliminary investigation, was carried out with a literature study to examine theory-based RME instructional models, the characteristics of learners, descriptions of learning management by junior high school mathematics teachers, and relevant research. The development phase was carried out by developing a draft model (an early prototype model) that consists of the syntax, the social system, the principle of reaction, support systems, and the impact and effects of instructional support. The early prototype model contains a draft model, lesson plans, worksheets, and assessments. Tessmer's formative evaluation model was used to revise the model; in this study only the one-to-one evaluation phase was conducted. The preliminary phase produced a theory-based RME learning model, a description of the characteristics of grade VIII learners at Junior High School Padang, and a description of teachers' classroom teaching. The results showed that most students were still not able to solve non-routine problems, and that teachers did not optimally facilitate the development of students' problem-solving skills. It is recommended that the model be applied in the classroom.

  18. Design and validation of realistic breast models for use in multiple alternative forced choice virtual clinical trials.

    Science.gov (United States)

    Elangovan, Premkumar; Mackenzie, Alistair; Dance, David R; Young, Kenneth C; Cooke, Victoria; Wilkinson, Louise; Given-Wilson, Rosalind M; Wallis, Matthew G; Wells, Kevin

    2017-04-07

    A novel method has been developed for generating quasi-realistic voxel phantoms which simulate the compressed breast in mammography and digital breast tomosynthesis (DBT). The models are suitable for use in virtual clinical trials requiring realistic anatomy which use the multiple alternative forced choice (AFC) paradigm and patches from the complete breast image. The breast models are produced by extracting features of breast tissue components from DBT clinical images including skin, adipose and fibro-glandular tissue, blood vessels and Cooper's ligaments. A range of different breast models can then be generated by combining these components. Visual realism was validated using a receiver operating characteristic (ROC) study of patches from simulated images calculated using the breast models and from real patient images. Quantitative analysis was undertaken using fractal dimension and power spectrum analysis. The average areas under the ROC curves for 2D and DBT images were 0.51 ± 0.06 and 0.54 ± 0.09 demonstrating that simulated and real images were statistically indistinguishable by expert breast readers (7 observers); errors represented as one standard error of the mean. The average fractal dimensions (2D, DBT) for real and simulated images were (2.72 ± 0.01, 2.75 ± 0.01) and (2.77 ± 0.03, 2.82 ± 0.04) respectively; errors represented as one standard error of the mean. Excellent agreement was found between power spectrum curves of real and simulated images, with average β values (2D, DBT) of (3.10 ± 0.17, 3.21 ± 0.11) and (3.01 ± 0.32, 3.19 ± 0.07) respectively; errors represented as one standard error of the mean. These results demonstrate that radiological images of these breast models realistically represent the complexity of real breast structures and can be used to simulate patches from mammograms and DBT images that are indistinguishable from
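
    The AUC values near 0.5 quoted above are the indistinguishability criterion; the statistic itself is the Mann-Whitney probability that a real patch is ranked above a simulated one. A minimal implementation:

```python
def auc(scores_real, scores_sim):
    """Mann-Whitney AUC: probability that a randomly chosen 'real' patch
    receives a higher realism score than a randomly chosen simulated one
    (ties count 1/2). AUC near 0.5 means observers cannot tell them apart."""
    wins = 0.0
    for r in scores_real:
        for s in scores_sim:
            wins += 1.0 if r > s else 0.5 if r == s else 0.0
    return wins / (len(scores_real) * len(scores_sim))
```

With identically distributed observer scores the AUC is 0.5, which is the behaviour the reported 0.51 and 0.54 values approximate.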

  19. Effective electric fields along realistic DTI-based neural trajectories for modelling the stimulation mechanisms of TMS

    International Nuclear Information System (INIS)

    De Geeter, N; Crevecoeur, G; Dupré, L; Leemans, A

    2015-01-01

    In transcranial magnetic stimulation (TMS), an applied alternating magnetic field induces an electric field in the brain that can interact with the neural system. It is generally assumed that this induced electric field is the crucial effect exciting a certain region of the brain. More specifically, it is the component of this field parallel to the neuron's local orientation, the so-called effective electric field, that can initiate neuronal stimulation. Deeper insights into the stimulation mechanisms can be acquired through extensive TMS modelling. Most models study simple representations of neurons with assumed geometries, whereas we embed realistic neural trajectories computed using tractography based on diffusion tensor images. This way of modelling ensures a more accurate spatial distribution of the effective electric field that is, in addition, patient and case specific. The case study of this paper focuses on the single-pulse stimulation of the left primary motor cortex with a standard figure-of-eight coil. Including realistic neural geometry in the model demonstrates the strong and localized variations of the effective electric field between the tracts themselves and along them, due to the interplay of factors such as the tract's position and orientation in relation to the TMS coil, the neural trajectory and its course along the white and grey matter interface. Furthermore, the influence of changes in the coil orientation is studied. Investigating the impact of tissue anisotropy confirms that its contribution is not negligible; moreover, assuming isotropic tissues leads to errors of the same size as rotating or tilting the coil by 10 degrees. In contrast, the model proves to be less sensitive to the poorly known tissue conductivity values. (paper)
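
    The effective electric field is the component of the induced field parallel to the local fibre direction. Given a tract as a polyline and any field function, it can be sketched as a per-segment projection (illustrative only; in the paper the field itself comes from the coil and head model):

```python
def effective_field(points, e_field):
    """Tangential ('effective') component of an electric field along a tract.
    points: (x, y, z) vertices of the tract polyline;
    e_field: function mapping a position to a field vector (V/m).
    Returns the projection at each segment midpoint."""
    out = []
    for p, q in zip(points, points[1:]):
        t = [qi - pi for pi, qi in zip(p, q)]
        norm = sum(ti * ti for ti in t) ** 0.5
        t = [ti / norm for ti in t]                      # unit tangent
        mid = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]  # sample point
        out.append(sum(ei * ti for ei, ti in zip(e_field(mid), t)))
    return out

# A uniform 100 V/m field along x drives only the x-aligned part of a bent tract:
tract = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
eff = effective_field(tract, lambda pos: (100.0, 0.0, 0.0))
```

The sign and magnitude changes of this projection along a curving tract are exactly the localized variations the abstract describes.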

  20. A Realistic Process Example for MIMO MPC based on Autoregressive Models

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted; Jørgensen, John Bagterp

    2014-01-01

    for advanced control design development which may be used by non-experts in control theory. This paper presents and illustrates the use of a simple methodology to design an offset-free MPC based on ARX models; hence a mechanistic process model is not required. The forced circulation evaporator by Newell and Lee is used to illustrate the offset-free MPC based on ARX models for a nonlinear multivariate process.
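
    An ARX model needs only input/output data, not a mechanistic model. A sketch of the multi-step ARX prediction an MPC would use internally (the coefficients below are illustrative, not identified from the evaporator):

```python
def arx_predict(y_hist, u_hist, a, b, horizon):
    """Multi-step prediction with a SISO ARX model
    y[k] = -a[0]*y[k-1] - ... - a[na-1]*y[k-na]
           + b[0]*u[k-1] + ... + b[nb-1]*u[k-nb],
    holding the input at its last known value (zero-order hold)."""
    y, u = list(y_hist), list(u_hist)
    preds = []
    for _ in range(horizon):
        yk = (-sum(ai * y[-i - 1] for i, ai in enumerate(a))
              + sum(bi * u[-i - 1] for i, bi in enumerate(b)))
        preds.append(yk)
        y.append(yk)
        u.append(u[-1])
    return preds

# Illustrative first-order plant y[k] = 0.5*y[k-1] + u[k-1] (a = [-0.5], b = [1]):
preds = arx_predict([0.0], [1.0], a=[-0.5], b=[1.0], horizon=3)
```

An MPC would repeat this prediction for candidate input sequences and pick the one minimizing its cost; offset-free behaviour additionally requires a disturbance/noise model, which the paper's methodology supplies.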

  1. Realistic Modeling and Animation of Human Body Based on Scanned Data

    Institute of Scientific and Technical Information of China (English)

    Yong-You Ma; Hui Zhang; Shou-Wei Jiang

    2004-01-01

    In this paper we propose a novel method for building an animation model of a real human body from surface scanned data. The human model is represented by a triangular mesh and described as a layered geometric model. The model consists of two layers: the control skeleton, generating body animation from motion capture data, and the simplified surface model, providing an efficient representation of the skin surface shape. The skeleton is generated automatically from surface scanned data using feature extraction, and then a point-to-line mapping is used to map the surface model onto the underlying skeleton. The resulting model enables real-time and smooth animation by manipulation of the skeleton while maintaining the surface detail. Compared with earlier approaches, the principal advantages of our approach are the automated generation of body control skeletons from the scanned data for real-time animation, and the automatic mapping and animation of the captured human surface shape. The human model constructed in this work can be used in applications such as ergonomic design, garment CAD, and real-time simulation of humans in virtual reality environments.
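
    The point-to-line mapping that binds the surface mesh to the skeleton amounts to projecting each skin vertex onto its nearest skeleton segment. A minimal sketch of that projection:

```python
def project_to_segment(p, a, b):
    """Closest point to p on segment ab, plus the normalized position t in [0, 1].
    Used to bind a skin vertex to a bone of the control skeleton."""
    ab = [bi - ai for ai, bi in zip(a, b)]
    ap = [pi - ai for ai, pi in zip(a, p)]
    t = sum(x * y for x, y in zip(ap, ab)) / sum(x * x for x in ab)
    t = max(0.0, min(1.0, t))  # clamp to the segment ends
    return [ai + t * x for ai, x in zip(a, ab)], t

# A skin vertex above the middle of a horizontal bone maps to the bone's midpoint:
point, t = project_to_segment((1.0, 1.0, 0.0), (0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
```

Once each vertex carries a bone index and a parameter t, transforming the bones transforms the skin with it, which is what makes the real-time animation possible.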

  2. A more realistic estimate of the variances and systematic errors in spherical harmonic geomagnetic field models

    DEFF Research Database (Denmark)

    Lowes, F.J.; Olsen, Nils

    2004-01-01

    Most modern spherical harmonic geomagnetic models based on satellite data include estimates of the variances of the spherical harmonic coefficients of the model; these estimates are based on the geometry of the data and the fitting functions, and on the magnitude of the residuals. However...

  3. Quasi-realistic distribution of interaction fields leading to a variant of Ising spin glass model

    International Nuclear Information System (INIS)

    Tanasa, Radu; Enachescu, Cristian; Stancu, Alexandru; Linares, Jorge; Varret, Francois

    2004-01-01

    The distribution of interaction fields of an Ising-like system, obtained by Monte Carlo entropic sampling is used for modeling the hysteretic behavior of patterned media made of magnetic particles with a common anisotropy axis; a variant of the canonical Edwards-Anderson Ising spin glass model is introduced
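
    The record samples interaction-field distributions with Monte Carlo entropic sampling; plain Metropolis dynamics on a small Edwards-Anderson chain with random couplings illustrates the kind of Ising-like system involved (a sketch of the model class, not the authors' sampling scheme):

```python
import math, random

def metropolis_sweeps(J, spins, T, sweeps, seed=0):
    """Metropolis dynamics for a 1-D Edwards-Anderson chain: energy
    E = -sum_i J[i]*s[i]*s[i+1], free ends, quenched random couplings J."""
    rng = random.Random(seed)
    n = len(spins)
    for _ in range(sweeps):
        for i in range(n):
            # Energy change for flipping spin i (bonds to both neighbours).
            dE = 0.0
            if i > 0:
                dE += 2.0 * J[i - 1] * spins[i - 1] * spins[i]
            if i < n - 1:
                dE += 2.0 * J[i] * spins[i] * spins[i + 1]
            if dE <= 0.0 or rng.random() < math.exp(-dE / T):
                spins[i] = -spins[i]
    energy = -sum(J[i] * spins[i] * spins[i + 1] for i in range(n - 1))
    return spins, energy

rng = random.Random(1)
J = [rng.uniform(-1.0, 1.0) for _ in range(49)]  # quenched disorder
e0 = -sum(J)                                     # energy of the all-up start
spins, e_final = metropolis_sweeps(J, [1] * 50, T=1e-9, sweeps=20, seed=2)
```

In a chain the random-sign couplings can all be satisfied, but in the patterned-media setting of the record the distribution of interaction fields plays the role of the coupling disorder.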

  4. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  5. Investigation of realistic PET simulations incorporating tumor patient's specificity using anthropomorphic models: Creation of an oncology database

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, Panagiotis; Efthimiou, Nikos; Nikiforidis, George C.; Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 265 04 (Greece); Loudos, George [Department of Biomedical Engineering, Technological Educational Institute of Athens, Ag. Spyridonos Street, Egaleo GR 122 10, Athens (Greece); Le Maitre, Amandine; Hatt, Mathieu; Tixier, Florent; Visvikis, Dimitris [Medical Information Processing Laboratory (LaTIM), National Institute of Health and Medical Research (INSERM), 29609 Brest (France)

    2013-11-15

    Purpose: The GATE Monte Carlo simulation toolkit is used for the implementation of realistic PET simulations incorporating tumor heterogeneous activity distributions. The reconstructed patient images include noise from the acquisition process and the imaging system's performance restrictions, and have limited spatial resolution. For those reasons, the measured intensity cannot simply be introduced in GATE simulations to reproduce clinical data. Investigation of the heterogeneity distribution within tumors applying partial volume correction (PVC) algorithms was assessed. The purpose of the present study was to create a simulated oncology database based on clinical data with realistic intratumor uptake heterogeneity properties. Methods: PET/CT data of seven oncology patients were used in order to create a realistic tumor database investigating the heterogeneity activity distribution of the simulated tumors. The anthropomorphic models (NURBS-based cardiac torso and Zubal phantoms) were adapted to the CT data of each patient, and the activity distribution was extracted from the respective PET data. The patient-specific models were simulated with the Monte Carlo Geant4 Application for Tomographic Emission (GATE) at three different levels for each case: (a) using homogeneous activity within the tumor, (b) using heterogeneous activity distribution in every voxel within the tumor as extracted from the PET image, and (c) using heterogeneous activity distribution corresponding to the clinical image following PVC. The three different types of simulated data in each case were reconstructed with two iterations and filtered with a 3D Gaussian postfilter, in order to simulate the intratumor heterogeneous uptake. Heterogeneity in all generated images was quantified using textural-feature-derived parameters in 3D according to the ground truth of the simulation, and compared to clinical measurements. Finally, profiles were plotted in central slices of the tumors, across lines

  7. Validation of Tilt Gain under Realistic Path Loss Model and Network Scenario

    DEFF Research Database (Denmark)

    Nguyen, Huan Cong; Rodriguez, Ignacio; Sørensen, Troels Bundgaard

    2013-01-01

    Despite being a simple and commonly-applied radio optimization technique, the impact on practical network performance from base station antenna downtilt is not well understood. Most published studies based on empirical path loss models report tilt angles and performance gains that are far higher...... than practical experience suggests. We motivate in this paper, based on a practical LTE scenario, that the discrepancy partly lies in the path loss model, and show that a more detailed semi-deterministic model leads to both lower gains in terms of SINR, outage probability and downlink throughput...... settings, including the use of electrical and/or mechanical antenna downtilt, and therefore it is possible to find multiple optimum tilt profiles in a practical case. A broader implication of this study is that care must be taken when using the 3GPP model to evaluate advanced adaptive antenna techniques...

  8. A space-fractional Monodomain model for cardiac electrophysiology combining anisotropy and heterogeneity on realistic geometries

    Science.gov (United States)

    Cusimano, N.; Gerardo-Giorda, L.

    2018-06-01

    Classical models of electrophysiology do not typically account for the effects of high structural heterogeneity in the spatio-temporal description of excitation wave propagation. We consider a modification of the Monodomain model obtained by replacing the diffusive term of the classical formulation with a fractional power of the operator, defined in the spectral sense. The resulting nonlocal model describes different levels of tissue heterogeneity as the fractional exponent is varied. The numerical method for the solution of the fractional Monodomain relies on an integral representation of the nonlocal operator combined with a finite element discretisation in space, making it possible to handle bounded domains in more than one spatial dimension in a natural way. Numerical tests in two spatial dimensions illustrate the features of the model. Activation times, action potential duration and its dispersion throughout the domain are studied as a function of the fractional parameter: the expected peculiar behaviour driven by tissue heterogeneities is recovered.
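
The spectral definition of the fractional operator, raising each eigenvalue of the Laplacian to the power s, can be illustrated on a 1D grid; this toy eigendecomposition stands in for the paper's integral representation and finite element discretisation.

```python
import numpy as np

def fractional_laplacian_1d(u, s, h=1.0):
    """Apply the spectral fractional Laplacian (-Delta)^s on a 1D grid with
    homogeneous Dirichlet boundaries: raise each eigenvalue of the discrete
    Laplacian to the power s in its eigenbasis."""
    n = len(u)
    L = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2   # symmetric positive definite
    lam, V = np.linalg.eigh(L)
    return V @ (lam**s * (V.T @ u))              # sum_i lam_i^s <u, v_i> v_i

u = np.sin(np.linspace(0, np.pi, 50))
classical = fractional_laplacian_1d(u, 1.0)      # s = 1: ordinary diffusion operator
nonlocal_half = fractional_laplacian_1d(u, 0.5)  # s = 1/2: stronger nonlocality
```

For s = 1 this reduces exactly to the standard discrete Laplacian; intermediate exponents interpolate toward the identity, mimicking the varying levels of tissue heterogeneity in the model.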

  9. SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan; Hodge, Bri-Mathias

    2017-01-26

    This presentation provides a Smart-DS project overview and status update for the ARPA-e GRID DATA program meeting 2017, including distribution systems, models, and scenarios, as well as opportunities for GRID DATA collaborations.

  10. Toward synthesizing executable models in biology.

    Science.gov (United States)

    Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav

    2014-01-01

    Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
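
The core idea of synthesis, searching for update rules consistent with experimental observations, can be sketched as a brute-force enumeration over a tiny Boolean network; real synthesis tools use symbolic methods rather than this exhaustive loop, and the network below is purely hypothetical.

```python
from itertools import product

def synthesize_rules(transitions, n):
    """Per node, enumerate all Boolean truth tables consistent with the
    observed synchronous transitions (brute force; feasible only for tiny n)."""
    consistent = []
    for node in range(n):
        tables = []
        for table in product([0, 1], repeat=2 ** n):
            # table[i] = node's next value when the global state encodes to i
            ok = all(table[int("".join(map(str, s)), 2)] == t[node]
                     for s, t in transitions)
            if ok:
                tables.append(table)
        consistent.append(tables)
    return consistent

# Two observed transitions of a hypothetical 2-node toggle network
obs = [((0, 1), (1, 0)), ((1, 0), (0, 1))]
rules = synthesize_rules(obs, 2)  # remaining candidates reflect unconstrained states
```

The number of surviving truth tables per node measures how ambiguous the data still are, which is exactly what the "disambiguating experiments" mentioned above aim to reduce.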

  11. Towards Synthesizing Executable Models in Biology

    Directory of Open Access Journals (Sweden)

    Jasmin eFisher

    2014-12-01

    Full Text Available Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell’s behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modelling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.

  12. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2014-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating

  13. Realistic modeling of deep brain stimulation implants for electromagnetic MRI safety studies.

    Science.gov (United States)

    Guerin, Bastien; Serano, Peter; Iacono, Maria Ida; Herrington, Todd M; Widge, Alik S; Dougherty, Darin D; Bonmassar, Giorgio; Angelone, Leonardo M; Wald, Lawrence L

    2018-05-04

    We propose a framework for electromagnetic (EM) simulation of deep brain stimulation (DBS) patients in radiofrequency (RF) coils. We generated a model of a DBS patient using post-operative head and neck computed tomography (CT) images stitched together into a 'virtual CT' image covering the entire length of the implant. The body was modeled as homogeneous. The implant path extracted from the CT data contained self-intersections, which we corrected automatically using an optimization procedure. Using the CT-derived DBS path, we built a model of the implant including electrodes, helicoidal internal conductor wires, loops, extension cables, and the implanted pulse generator. We also built four simplified models with straight wires, no extension cables and no loops to assess the impact of these simplifications on safety predictions. We simulated EM fields induced by the RF birdcage body coil in the body model, including at the DBS lead tip at both 1.5 Tesla (64 MHz) and 3 Tesla (123 MHz). We also assessed the robustness of our simulation results by systematically varying the EM properties of the body model and the position and length of the DBS implant (sensitivity analysis). The topology correction algorithm corrected all self-intersection and curvature violations of the initial path while introducing minimal deformations (open-source code available at http://ptx.martinos.org/index.php/Main_Page). The unaveraged lead-tip peak SAR predicted by the five DBS models (0.1 mm resolution grid) ranged from 12.8 kW kg⁻¹ (full model, helicoidal conductors) to 43.6 kW kg⁻¹ (no loops, straight conductors) at 1.5 T (3.4-fold variation) and 18.6 kW kg⁻¹ (full model, straight conductors) to 73.8 kW kg⁻¹ (no loops, straight conductors) at 3 T (4.0-fold variation). At 1.5 T and 3 T, the variability of lead-tip peak SAR with respect to the conductivity ranged between 18% and 30%. Variability with respect to the position and length of the DBS implant ranged between 9.5% and 27.6%.

  14. Realistic modeling of deep brain stimulation implants for electromagnetic MRI safety studies

    Science.gov (United States)

    Guerin, Bastien; Serano, Peter; Iacono, Maria Ida; Herrington, Todd M.; Widge, Alik S.; Dougherty, Darin D.; Bonmassar, Giorgio; Angelone, Leonardo M.; Wald, Lawrence L.

    2018-05-01

    We propose a framework for electromagnetic (EM) simulation of deep brain stimulation (DBS) patients in radiofrequency (RF) coils. We generated a model of a DBS patient using post-operative head and neck computed tomography (CT) images stitched together into a ‘virtual CT’ image covering the entire length of the implant. The body was modeled as homogeneous. The implant path extracted from the CT data contained self-intersections, which we corrected automatically using an optimization procedure. Using the CT-derived DBS path, we built a model of the implant including electrodes, helicoidal internal conductor wires, loops, extension cables, and the implanted pulse generator. We also built four simplified models with straight wires, no extension cables and no loops to assess the impact of these simplifications on safety predictions. We simulated EM fields induced by the RF birdcage body coil in the body model, including at the DBS lead tip at both 1.5 Tesla (64 MHz) and 3 Tesla (123 MHz). We also assessed the robustness of our simulation results by systematically varying the EM properties of the body model and the position and length of the DBS implant (sensitivity analysis). The topology correction algorithm corrected all self-intersection and curvature violations of the initial path while introducing minimal deformations (open-source code available at http://ptx.martinos.org/index.php/Main_Page). The unaveraged lead-tip peak SAR predicted by the five DBS models (0.1 mm resolution grid) ranged from 12.8 kW kg‑1 (full model, helicoidal conductors) to 43.6 kW kg‑1 (no loops, straight conductors) at 1.5 T (3.4-fold variation) and 18.6 kW kg‑1 (full model, straight conductors) to 73.8 kW kg‑1 (no loops, straight conductors) at 3 T (4.0-fold variation). At 1.5 T and 3 T, the variability of lead-tip peak SAR with respect to the conductivity ranged between 18% and 30%. Variability with respect to the position and length of the DBS implant ranged between 9

  15. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions can be estimated or bound by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies, (2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers

  16. Investigation of tDCS volume conduction effects in a highly realistic head model

    Science.gov (United States)

    Wagner, S.; Rampersad, S. M.; Aydin, Ü.; Vorwerk, J.; Oostendorp, T. F.; Neuling, T.; Herrmann, C. S.; Stegeman, D. F.; Wolters, C. H.

    2014-02-01

    Objective. We investigate volume conduction effects in transcranial direct current stimulation (tDCS) and present a guideline for efficient and yet accurate volume conductor modeling in tDCS using our newly-developed finite element (FE) approach. Approach. We developed a new, accurate and fast isoparametric FE approach for high-resolution geometry-adapted hexahedral meshes and tissue anisotropy. To attain a deeper insight into tDCS, we performed computer simulations, starting with a homogenized three-compartment head model and extending this step by step to a six-compartment anisotropic model. Main results. We are able to demonstrate important tDCS effects. First, we find channeling effects of the skin, the skull spongiosa and the cerebrospinal fluid compartments. Second, current vectors tend to be oriented towards the closest higher conducting region. Third, anisotropic WM conductivity causes current flow in directions more parallel to the WM fiber tracts. Fourth, the highest cortical current magnitudes are not only found close to the stimulation sites. Fifth, the median brain current density decreases with increasing distance from the electrodes. Significance. Our results allow us to formulate a guideline for volume conductor modeling in tDCS. We recommend accurately modeling the major tissues between the stimulating electrodes and the target areas; for efficient yet accurate modeling, an exact representation of other tissues is less important. Because for the low-frequency regime in electrophysiology the quasi-static approach is justified, our results should also be valid for at least low-frequency (e.g., below 100 Hz) transcranial alternating current stimulation.

  17. An Eulerian two-phase flow model for sediment transport under realistic surface waves

    Science.gov (United States)

    Hsu, T. J.; Kim, Y.; Cheng, Z.; Chauchat, J.

    2017-12-01

    Wave-driven sediment transport is of major importance in driving beach morphology. However, the complex mechanisms associated with unsteadiness, free-surface effects, and wave-breaking turbulence have not been fully understood. In particular, most existing models for sediment transport adopt a bottom boundary layer approximation that mimics the flow condition in an oscillating water tunnel (U-tube). However, it is well-known that there are key differences in sediment transport when compared to large wave flume datasets, although the number of wave flume experiments is relatively limited despite their importance. Thus, a numerical model which can resolve the entire water column from the bottom boundary layer to the free surface can be a powerful tool. This study reports an on-going effort to better understand and quantify sediment transport under shoaling and breaking surface waves through the creation of open-source numerical models in the OpenFOAM framework. An Eulerian two-phase flow model, SedFoam (Cheng et al., 2017, Coastal Eng.) is fully coupled with a volume-of-fluid solver, interFoam/waves2Foam (Jacobsen et al., 2011, Int. J. Num. Fluid). The fully coupled model, named SedWaveFoam, regards the air and water phases as two immiscible fluids with the interface evolution resolved, and the sediment particles as a dispersed phase. We carried out model-data comparisons with the large wave flume sheet flow data for nonbreaking waves reported by Dohmen-Janssen and Hanes (2002, J. Geophysical Res.) and good agreements were obtained for sediment concentration and net transport rate. By further simulating a case without a free surface (mimicking the U-tube condition), the effects of the free surface, most notably the boundary layer streaming effect on total transport, can be quantified.

  18. Bringing a Realistic Global Climate Modeling Experience to a Broader Audience

    Science.gov (United States)

    Sohl, L. E.; Chandler, M. A.; Zhou, J.

    2010-12-01

    EdGCM, the Educational Global Climate Model, was developed with the goal of helping students learn about climate change and climate modeling by giving them the ability to run a genuine NASA global climate model (GCM) on a desktop computer. Since EdGCM was first publicly released in January 2005, tens of thousands of users on seven continents have downloaded the software. EdGCM has been utilized by climate science educators from middle school through graduate school levels, and on occasion even by researchers who otherwise do not have ready access to climate models at national labs in the U.S. and elsewhere. The EdGCM software is designed to walk users through the same process a climate scientist would use in designing and running simulations, and analyzing and visualizing GCM output. Although the current interface design gives users a clear view of some of the complexities involved in using a climate model, it can be daunting for users whose main focus is on climate science rather than modeling per se. As part of the work funded by NASA’s Global Climate Change Education (GCCE) program, we will begin modifications to the user interface that will improve the accessibility of EdGCM to a wider array of users, especially at the middle school and high school levels, by: 1) Developing an automated approach (a “wizard”) to simplify the user experience in setting up new climate simulations; 2) Producing a catalog of “rediscovery experiments” that allow users to reproduce published climate model results, and in some cases compare model projections to real-world data; and 3) Enhancing distance learning and online learning opportunities through the development of a web-based interface. The prototypes for these modifications will then be presented to educators belonging to an EdGCM Users Group for feedback, so that we can further refine the EdGCM software, and thus deliver the tools and materials educators want and need across a wider range of learning environments.

  19. Sensing of complex buildings and reconstruction into photo-realistic 3D models

    NARCIS (Netherlands)

    Heredia Soriano, F.J.

    2012-01-01

    The 3D reconstruction of indoor and outdoor environments has received an interest only recently, as companies began to recognize that using reconstructed models is a way to generate revenue through location-based services and advertisements. A great amount of research has been done in the field of

  20. A realistic solvable model for the Coulomb dissociation of neutron halo nuclei

    International Nuclear Information System (INIS)

    Baur, G.; Hencken, K.; Trautmann, D.

    2003-01-01

    As a model of a neutron halo nucleus we consider a neutron bound to an inert core by a zero range force. We study the breakup of this simple nucleus in the Coulomb field of a target nucleus. In the post-form DWBA (or, in our simple model, CWBA ("Coulomb wave Born approximation")) an analytic solution for the T-matrix is known. We study limiting cases of this T-matrix. As it should be, we recover the Born approximation for weak Coulomb fields (i.e., for the relevant Coulomb parameters much smaller than 1). For strong Coulomb fields, high beam energies, and scattering to the forward region we find a result which is very similar to the Born result. It is only modified by a relative phase (close to 0) between the two terms and a prefactor (close to 1). A similar situation exists for bremsstrahlung emission. This formula can be related to the first order semiclassical treatment of the electromagnetic dissociation. Since our CWBA model contains the electromagnetic interaction between the core and the target nucleus to all orders, this means that higher order effects (including postacceleration effects) are small in the case of high beam energies and forward scattering. Our model also predicts a scaling behavior of the differential cross section, that is, different systems (with different binding energies, beam energies and scattering angles) show the same dependence on two variables x and y. (orig.)

  1. A new theoretical model for inelastic tunneling in realistic systems : comparing STM simulations with experiments

    NARCIS (Netherlands)

    Rossen, E.T.R.

    2012-01-01

    This thesis has been dedicated to modeling the electron transport in tunnel junctions in order to efficiently describe and predict inelastic effects that occur when electrons pass a tunnel junction. These inelastic effects can be considered at several levels of sophistication, from very simple to

  2. An approach to creating a more realistic working model from a protein data bank entry.

    Science.gov (United States)

    Brandon, Christopher J; Martin, Benjamin P; McGee, Kelly J; Stewart, James J P; Braun-Sand, Sonja B

    2015-01-01

    An accurate model of three-dimensional protein structure is important in a variety of fields such as structure-based drug design and mechanistic studies of enzymatic reactions. While the entries in the Protein Data Bank (http://www.pdb.org) provide valuable information about protein structures, a small fraction of the PDB structures were found to contain anomalies not reported in the PDB file. The semiempirical PM7 method in MOPAC2012 was used for identifying anomalously short hydrogen bonds, C-H⋯O/C-H⋯N interactions, non-bonding close contacts, and unrealistic covalent bond lengths in recently published Protein Data Bank files. It was also used to generate new structures with these faults removed. When the semiempirical models were compared to those of PDB_REDO (http://www.cmbi.ru.nl/pdb_redo/), the clashscores, as defined by MolProbity (http://molprobity.biochem.duke.edu/), were better in about 50% of the structures. The semiempirical models also had a lower root-mean-square-deviation value in nearly all cases than those from PDB_REDO, indicative of a better conservation of the tertiary structure. Finally, the semiempirical models were found to have lower clashscores than the initial PDB file in all but one case. Because this approach maintains as much of the original tertiary structure as possible while improving anomalous interactions, it should be useful to theoreticians, experimentalists, and crystallographers investigating the structure and function of proteins.
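
Flagging anomalously short non-bonding contacts of the kind described above reduces to a distance check against covalent radii; the radii, tolerance, and toy coordinates below are illustrative, not the PM7 or MolProbity criteria.

```python
import numpy as np
from itertools import combinations

# Approximate covalent radii in angstroms (illustrative values only)
RADII = {"C": 0.77, "N": 0.70, "O": 0.66, "H": 0.33}

def find_close_contacts(atoms, bonded, tol=0.4):
    """Flag non-bonded atom pairs closer than the sum of their covalent
    radii plus a tolerance. atoms: (name, element, xyz); bonded: index pairs."""
    clashes = []
    for i, j in combinations(range(len(atoms)), 2):
        if (i, j) in bonded or (j, i) in bonded:
            continue
        d = float(np.linalg.norm(np.subtract(atoms[i][2], atoms[j][2])))
        if d < RADII[atoms[i][1]] + RADII[atoms[j][1]] + tol:
            clashes.append((i, j, round(d, 2)))
    return clashes

atoms = [("CA", "C", (0.0, 0.0, 0.0)),
         ("O",  "O", (1.2, 0.0, 0.0)),   # unrealistically close to CA
         ("N",  "N", (5.0, 0.0, 0.0))]
clashes = find_close_contacts(atoms, bonded=set())
```

Declaring the close pair bonded suppresses the flag, which is the distinction between a genuine covalent bond and the anomalous close contacts the study corrects.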

  3. Quantum Hall conductivity in a Landau type model with a realistic geometry

    International Nuclear Information System (INIS)

    Chandelier, F.; Georgelin, Y.; Masson, T.; Wallet, J.-C.

    2003-01-01

    In this paper, we revisit some quantum mechanical aspects related to the quantum Hall effect. We consider a Landau type model, paying a special attention to the experimental and geometrical features of quantum Hall experiments. The resulting formalism is then used to compute explicitly the Hall conductivity from a Kubo formula

  4. Realistic D-brane models on warped throats: Fluxes, hierarchies and moduli stabilization

    International Nuclear Information System (INIS)

    Cascales, J.F.G.; Garcia del Moral, M.P.; Quevedo, F.; Uranga, A.

    2004-01-01

    We describe the construction of string theory models with semirealistic spectrum in a sector of (anti) D3-branes located at an orbifold singularity at the bottom of a highly warped throat geometry, which is a generalisation of the Klebanov-Strassler deformed conifold. These models realise the Randall-Sundrum proposal to naturally generate the Planck/electroweak hierarchy in a concrete string theory embedding, and yielding interesting chiral open string spectra. We describe examples with Standard Model gauge group (or left-right symmetric extensions) and three families of SM fermions, with correct quantum numbers including hypercharge. The dilaton and complex structure moduli of the geometry are stabilised by the 3-form fluxes required to build the throat. We describe diverse issues concerning the stabilisation of geometric Kahler moduli, like blow-up modes of the orbifold singularities, via D term potentials and gauge theory non-perturbative effects, like gaugino condensation. This local geometry, once embedded in a full compactification, could give rise to models with all moduli stabilised, and with the potential to lead to de Sitter vacua. Issues of gauge unification, proton stability, supersymmetry breaking and Yukawa couplings are also discussed. (author)

  5. Modeling the transport and fate of radioactive noble gases in very dry desert alluvium: Realistic scenarios

    International Nuclear Information System (INIS)

    Lindstrom, F.T.; Cawlfield, D.E.; Donahue, M.E.; Emer, D.F.; Shott, G.J.

    1992-01-01

    US DOE Order 5820.2A (1988) requires that a performance assessment of all new and existing low-level radioactive waste management sites be made. An integral part of every performance assessment is the mathematical modeling of the transport and fate of noble gas radionuclides in the gas phase. Ongoing in-depth site characterization of the high desert alluvium in Area 5 of the Nevada Test Site (NTS) is showing that the alluvium is very dry all the way to the water table (240 meters below land surface). The potential for radioactive noble gas (e.g., Rn-220 and Rn-222) transport to the atmosphere from shallow land burial of thorium and uranium waste is very high. Objectives of this modeling effort include: construct a physics-based, site-specific noble gas transport model; include induced advection due to barometric pressure changes at the interface between the thin atmospheric boundary layer and the dry desert alluvium; provide a user-selected option for use of NOAA barometric pressure or a 'home brewed' barometric pressure wave made up of up to 15 sinusoids and cosinusoids; and use the model to help make engineering decisions on the design of the burial pits and associated closure caps
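
The 'home brewed' barometric forcing described above, a mean pressure plus up to 15 sinusoidal terms, can be sketched directly; the amplitudes, periods, and mean pressure below are illustrative diurnal/semidiurnal values, not NTS site data.

```python
import numpy as np

def barometric_pressure(t_hours, p0=86.0, components=None):
    """Synthetic barometric pressure (kPa) as a mean plus up to 15
    sinusoidal terms, each (amplitude kPa, period h, phase rad).
    Defaults are illustrative atmospheric tides, not site measurements."""
    if components is None:
        components = [(0.15, 24.0, 0.0), (0.10, 12.0, 1.2)]
    t = np.asarray(t_hours, dtype=float)
    p = np.full_like(t, p0)
    for amp, period, phase in components[:15]:   # cap at 15 terms
        p += amp * np.sin(2.0 * np.pi * t / period + phase)
    return p

t = np.linspace(0.0, 48.0, 97)   # two days at 30-minute resolution
p = barometric_pressure(t)       # boundary forcing for a gas-phase model
```

A phase-shifted sine term is equivalent to a sinusoid-plus-cosinusoid pair, so this parameterization covers both term types mentioned in the record.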

  6. Towards realistic threat modeling : attack commodification, irrelevant vulnerabilities, and unrealistic assumptions

    NARCIS (Netherlands)

    Allodi, L.; Etalle, S.

    2017-01-01

    Current threat models typically consider all possible ways an attacker can penetrate a system and assign probabilities to each path according to some metric (e.g. time-to-compromise). In this paper we discuss how this view hinders the realness of both technical (e.g. attack graphs) and strategic

  7. Can we trust climate models to realistically represent severe European windstorms?

    Science.gov (United States)

    Trzeciak, Tomasz M.; Knippertz, Peter; Owen, Jennifer S. R.

    2014-05-01

    Despite the enormous advances made in climate change research, robust projections of the position and the strength of the North Atlantic stormtrack are not yet possible. In particular with respect to damaging windstorms, this uncertainty bears enormous risks to European societies and the (re)insurance industry. Previous studies have addressed the problem of climate model uncertainty through statistical comparisons of simulations of the current climate with (re-)analysis data and found that there is large disagreement between different climate models, different ensemble members of the same model and observed climatologies of intense cyclones. One weakness of such statistical evaluations lies in the difficulty to separate influences of the climate model's basic state from the influence of fast processes on the development of the most intense storms. Compensating effects between the two might conceal errors and suggest higher reliability than there really is. A possible way to separate influences of fast and slow processes in climate projections is through a "seamless" approach of hindcasting historical, severe storms with climate models started from predefined initial conditions and run in a numerical weather prediction mode on the time scale of several days. Such a cost-effective case-study approach, which draws from and expands on the concepts from the Transpose-AMIP initiative, has recently been undertaken in the SEAMSEW project at the University of Leeds funded by the AXA Research Fund. Key results from this work focusing on 20 historical storms and using different lead times and horizontal and vertical resolutions include: (a) Tracks are represented reasonably well by most hindcasts. (b) Sensitivity to vertical resolution is low.
(c) There is a systematic underprediction of cyclone depth for a coarse resolution of T63, but surprisingly no systematic bias is found for higher-resolution runs using T127, showing that climate models are in fact able to represent the

  8. STEPS: efficient simulation of stochastic reaction–diffusion models in realistic morphologies

    Directory of Open Access Journals (Sweden)

    Hepburn Iain

    2012-05-01

    Full Text Available Abstract Background Models of cellular molecular systems are built from components such as biochemical reactions (including interactions between ligands and membrane-bound proteins), conformational changes and active and passive transport. A discrete, stochastic description of the kinetics is often essential to capture the behavior of the system accurately. Where spatial effects play a prominent role the complex morphology of cells may have to be represented, along with aspects such as chemical localization and diffusion. This high level of detail makes efficiency a particularly important consideration for software that is designed to simulate such systems. Results We describe STEPS, a stochastic reaction–diffusion simulator developed with an emphasis on simulating biochemical signaling pathways accurately and efficiently. STEPS supports all the above-mentioned features, and well-validated support for SBML allows many existing biochemical models to be imported reliably. Complex boundaries can be represented accurately in externally generated 3D tetrahedral meshes imported by STEPS. The powerful Python interface facilitates model construction and simulation control. STEPS implements the composition and rejection method, a variation of the Gillespie SSA, supporting diffusion between tetrahedral elements within an efficient search and update engine. Additional support for well-mixed conditions and for deterministic model solution is implemented. Solver accuracy is confirmed with an original and extensive validation set consisting of isolated reaction, diffusion and reaction–diffusion systems. Accuracy imposes upper and lower limits on tetrahedron sizes, which are described in detail. By comparing to Smoldyn, we show how the voxel-based approach in STEPS is often faster than particle-based methods, with increasing advantage in larger systems, and by comparing to MesoRD we show the efficiency of the STEPS implementation. Conclusion STEPS simulates
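
The Gillespie direct method that composition-rejection variants accelerate can be written in a few lines for a well-mixed system; this minimal sketch omits diffusion between tetrahedral elements entirely and the reaction system is a hypothetical example.

```python
import random

def gillespie(propensity_fns, stoich, state, t_end, seed=0):
    """Minimal Gillespie direct-method SSA for a well-mixed reaction system.
    propensity_fns: list of functions state -> rate; stoich: per-reaction
    state-change vectors. The last recorded event may land past t_end."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, tuple(state))]
    while t < t_end:
        rates = [f(state) for f in propensity_fns]
        total = sum(rates)
        if total == 0:
            break                                # no reaction can fire
        t += rng.expovariate(total)              # exponential waiting time
        r = rng.random() * total                 # choose reaction by propensity
        acc, idx = 0.0, 0
        for idx, a in enumerate(rates):
            acc += a
            if r < acc:
                break
        state = [s + d for s, d in zip(state, stoich[idx])]
        traj.append((t, tuple(state)))
    return traj

# Irreversible decay A -> 0 with rate k * [A]
k = 0.5
traj = gillespie([lambda s: k * s[0]], [[-1]], [100], t_end=50.0)
```

Composition-rejection replaces the linear scan over propensities with grouped sampling, which is what makes the method scale to the large reaction sets STEPS targets.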

  9. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  10. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
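
    Cell Collective models are logical (Boolean) networks, so a simulation step amounts to synchronously applying each node's update rule to the current state. A toy sketch with a hypothetical three-node activation/differentiation motif (the node names and rules are illustrative, not taken from the platform):

    ```python
    def step(state, rules):
        """Synchronously update a Boolean network: each node's next value is
        its rule applied to the current state."""
        return {node: rule(state) for node, rule in rules.items()}

    # Hypothetical 3-node toy: antigen signal drives activation, activation
    # drives differentiation, and differentiation latches and shuts
    # activation off again.
    rules = {
        "antigen":         lambda s: s["antigen"],                      # held constant
        "activation":      lambda s: s["antigen"] and not s["differentiation"],
        "differentiation": lambda s: s["activation"] or s["differentiation"],
    }

    state = {"antigen": True, "activation": False, "differentiation": False}
    trajectory = [state]
    for _ in range(4):
        state = step(state, rules)
        trajectory.append(state)
    # differentiation switches on and stays on; activation is transient
    assert trajectory[-1]["differentiation"] is True
    ```

    "Breaking" a model in the classroom sense is then just editing one of these rules and watching the trajectory change.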

  11. Toward the realistic three-generation model in the (2,0) heterotic string compactification

    International Nuclear Information System (INIS)

    Asatryan, H.M.; Murayama, A.

    1992-01-01

In this paper, the three-generation models with SUSY SO(10) or SU(5) GUTs derived from the (2,0) compactification of the E_8 × E'_8 heterotic string, the massless matter field spectra at the GUT scale M_X, and the breaking directions of GUT symmetries are discussed. A pseudo-left-right symmetric Pati-Salam model is naturally deduced in the SUSY SO(10) GUT and shown to have an interesting property, M_X ≅ M_Pl, M_R ≅ 10^10 GeV and M_S (the scale of superpartner masses) ≅ 10^4 GeV, as a result of the renormalization group equation analysis using the new precise LEP data
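
    The renormalization group analysis mentioned here runs gauge couplings between scales. As a hedged illustration of the technique only (standard one-loop MSSM running with approximate inputs, not the paper's own spectrum or thresholds):

    ```python
    import math

    # One-loop running: 1/alpha_i(mu) = 1/alpha_i(MZ) - (b_i / 2*pi) * ln(mu/MZ).
    # MSSM one-loop beta coefficients in GUT normalization (illustrative values;
    # the paper's matter content gives different coefficients and scales).
    MZ = 91.19                        # GeV
    b = (33.0 / 5.0, 1.0, -3.0)       # (U(1)_Y, SU(2)_L, SU(3)_c)
    inv_alpha_mz = (59.0, 29.6, 8.5)  # approximate 1/alpha_i at MZ

    def inv_alpha(i, mu):
        return inv_alpha_mz[i] - b[i] / (2 * math.pi) * math.log(mu / MZ)

    # Scale where alpha_1 and alpha_2 meet:
    log_ratio = 2 * math.pi * (inv_alpha_mz[0] - inv_alpha_mz[1]) / (b[0] - b[1])
    mu_gut = MZ * math.exp(log_ratio)
    assert 1e15 < mu_gut < 1e17   # ~2e16 GeV, the usual MSSM unification scale
    ```

    The same one-loop machinery, with intermediate-scale thresholds such as M_R and M_S inserted, is what ties the quoted scales to the LEP data.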

  12. Hydrogen Balmer alpha intensity distributions and line profiles from multiple scattering theory using realistic geocoronal models

    Science.gov (United States)

    Anderson, D. E., Jr.; Meier, R. R.; Hodges, R. R., Jr.; Tinsley, B. A.

    1987-01-01

    The H Balmer alpha nightglow is investigated by using Monte Carlo models of asymmetric geocoronal atomic hydrogen distributions as input to a radiative transfer model of solar Lyman-beta radiation in the thermosphere and atmosphere. It is shown that it is essential to include multiple scattering of Lyman-beta radiation in the interpretation of Balmer alpha airglow data. Observations of diurnal variation in the Balmer alpha airglow showing slightly greater intensities in the morning relative to evening are consistent with theory. No evidence is found for anything other than a single sinusoidal diurnal variation of exobase density. Dramatic changes in effective temperature derived from the observed Balmer alpha line profiles are expected on the basis of changing illumination conditions in the thermosphere and exosphere as different regions of the sky are scanned.

  13. From theoretical stellar spectra to realistic models of the Milky Way : a never ending Odyssey

    OpenAIRE

    Ammon, Karin

    2007-01-01

    The last chapter is dedicated to the compilation of the results and the discussion about the success of - but also about the problems that have arisen during - and in part also survived - this work. The main goal of this thesis was, firstly, to convert the stellar parameters given by galaxy models into observables, and then to compare these theoretical stellar distributions in different viewing directions with real observational data to check, if it is possible to find a best-fitt...

  14. Design and Modeling of Turbine Airfoils with Active Flow Control in Realistic Engine Conditions

    Science.gov (United States)

    2008-07-16

for cylinders make using a simple 2D model less meaningful. The solver used for the cylinder cases was SFELES, a quasi-3D large eddy simulation that...would take into account the 3D aspects of the flow. This is appropriate because the upstream flow in the tunnel is essentially laminar and at the...H2O Druck pressure transducer to measure the local cp distribution. The cp is calculated by taking the inlet total pressure from an upstream pitot

  15. Thermophysical Properties of Fluids: From Realistic to Simple Models and their Applications

    Czech Academy of Sciences Publication Activity Database

    Nezbeda, Ivo; Vlček, Lukáš

    2004-01-01

    Roč. 25, č. 4 (2004), s. 1037-1049 ISSN 0195-928X. [Symposium on Thermophysical Properties /15./. Boulder CO, 22.06.2003-27.06.2003] R&D Projects: GA ČR GA203/02/0764; GA AV ČR IAA4072303 Institutional research plan: CEZ:AV0Z4072921 Keywords : association fluids * perturbation expansion * primitive model Subject RIV: CH - Nuclear ; Quantum Chemistry Impact factor: 0.846, year: 2004

  16. Modeling of pulsatile flow-dependent nitric oxide regulation in a realistic microvascular network.

    Science.gov (United States)

    Wang, Ruofan; Pan, Qing; Kuebler, Wolfgang M; Li, John K-J; Pries, Axel R; Ning, Gangmin

    2017-09-01

Hemodynamic pulsatility has been reported to regulate microcirculatory function. To quantitatively assess the impact of flow pulsatility on the microvasculature, a mathematical model was first developed to simulate the regulation of NO production by pulsatile flow in the microcirculation. Shear stress and pressure pulsatility were selected as regulators of endothelial NO production, and NO-dependent vessel dilation acted as feedback to control microvascular hemodynamics. The model was then applied to a real microvascular network of the rat mesentery consisting of 546 microvessels. As compared to steady flow conditions, pulsatile flow increased the average NO concentration in arterioles from 256.8 ± 93.1 nM to 274.8 ± 101.1 nM (P < …). Network perfusion and flow heterogeneity were improved under pulsatile flow conditions, and vasodilation within the network was more sensitive to heart rate changes than to pulse pressure amplitude. The proposed model simulates the role of flow pulsatility in the regulation of a complex microvascular network in terms of NO concentration and hemodynamics under varied physiological conditions. Copyright © 2017 Elsevier Inc. All rights reserved.
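
    The feedback loop described, shear stress driving endothelial NO production while NO-dependent dilation in turn lowers shear, can be sketched as a fixed-point iteration. All parameter values below are illustrative placeholders, not taken from the paper:

    ```python
    import math

    def no_feedback_equilibrium(Q=1.0e-12, mu=3.5e-3, d0=20e-6,
                                k_no=1.0, k_deg=0.5, s=0.15, K=2.0,
                                n_iter=200):
        """Fixed-point iteration of a toy shear -> NO -> dilation loop.
        Poiseuille wall shear in a tube: tau = 32*mu*Q / (pi*d^3).
        Q (flow, m^3/s), mu (viscosity, Pa*s), d0 (baseline diameter, m)
        and the NO kinetics are all assumed, illustrative values."""
        d = d0
        for _ in range(n_iter):
            tau = 32 * mu * Q / (math.pi * d ** 3)     # wall shear stress, Pa
            no = k_no * tau / k_deg                    # steady NO ~ production/degradation
            d_target = d0 * (1 + s * no / (no + K))    # saturating NO-dependent dilation
            d += 0.1 * (d_target - d)                  # relax toward target diameter
        return d, no

    d_eq, no_eq = no_feedback_equilibrium()
    assert d_eq > 20e-6   # the vessel settles at a dilated diameter
    ```

    The negative feedback is visible in the iteration: dilation lowers shear (tau falls as d^-3), which lowers NO, so the loop converges to a stable equilibrium rather than running away.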

  17. Setting Parameters for Biological Models With ANIMO

    NARCIS (Netherlands)

    Schivo, Stefano; Scholma, Jetse; Karperien, Hermanus Bernardus Johannes; Post, Janine Nicole; van de Pol, Jan Cornelis; Langerak, Romanus; André, Étienne; Frehse, Goran

    2014-01-01

ANIMO (Analysis of Networks with Interactive MOdeling) is software for modeling biological networks, such as signaling, metabolic or gene networks. An ANIMO model is essentially the sum of a network topology and a number of interaction parameters. The topology describes the interactions

  18. Field Distribution of Transcranial Static Magnetic Stimulation in Realistic Human Head Model.

    Science.gov (United States)

    Tharayil, Joseph J; Goetz, Stefan M; Bernabei, John M; Peterchev, Angel V

    2017-10-10

The objective of this work was to characterize the magnetic field (B-field) that arises in a human brain model from the application of transcranial static magnetic field stimulation (tSMS). The spatial distribution of the B-field magnitude and gradient of a cylindrical, 5.08 cm × 2.54 cm NdFeB magnet were simulated in air and in a human head model using the finite element method and calibrated with measurements in air. The B-field was simulated for magnet placements over prefrontal, motor, sensory, and visual cortex targets. The impact of magnetic susceptibility of head tissues on the B-field was quantified. Peak B-field magnitude and gradient respectively ranged from 179-245 mT and from 13.3-19.0 T/m across the cortical targets. B-field magnitude, focality, and gradient decreased with magnet-cortex distance. The variation in B-field strength and gradient across the anatomical targets largely arose from the magnet-cortex distance. Head magnetic susceptibilities had negligible impact on the B-field characteristics. The half-maximum focality of the tSMS B-field ranged from 7-12 cm^3. This is the first presentation and characterization of the three-dimensional (3D) spatial distribution of the B-field generated in a human brain model by tSMS. These data can provide quantitative dosing guidance for tSMS applications across various cortical targets and subjects. The finding that the B-field gradient is high near the magnet edges should be considered in studies where neural tissue is placed close to the magnet. The observation that susceptibility has negligible effects confirms assumptions in the literature. © 2017 International Neuromodulation Society.
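
    The reported field magnitudes can be sanity-checked against the standard on-axis formula for a uniformly magnetized cylinder. The remanence value below is an assumed N52-grade NdFeB figure, not a parameter from the paper:

    ```python
    import math

    def b_axial(z, Br=1.45, L=0.0254, R=0.0254):
        """On-axis flux density (tesla) of a uniformly magnetized cylinder of
        remanence Br, thickness L and radius R, at distance z from its face.
        Br = 1.45 T is an assumed N52-grade NdFeB remanence (illustrative)."""
        return (Br / 2.0) * ((z + L) / math.sqrt((z + L) ** 2 + R ** 2)
                             - z / math.sqrt(z ** 2 + R ** 2))

    # A 5.08 cm diameter x 2.54 cm thick magnet ~2 cm from the cortex gives a
    # field on the order of the 179-245 mT peak values reported across targets.
    b = b_axial(0.02)
    assert 0.1 < b < 0.3
    ```

    The rapid falloff of this expression with z is why magnet-cortex distance dominates the variation across anatomical targets.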

  19. Realistic Creativity Training for Innovation Practitioners: The Know-Recognize-React Model

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

As creativity becomes increasingly recognized as important raw material for innovation, the importance of identifying ways to increase practitioners’ creativity through rigorously designed creativity training programs is highlighted. Therefore we set out to design a creativity training program...... the transdisciplinary study described in this paper. Co-creation was employed as a method to ensure the three layers of focus would be taken into consideration. The result is a program called Creative Awareness Training which is based on the new Know-Recognize-React model....

  20. A Clinically Realistic Large Animal Model of Intra-Articular Fracture

    Science.gov (United States)

    2014-12-01

    of the left hock using a purpose-designed “offset” impaction technique (Figure 3).1,2 First, the distal impact “tripod” is anchored to the talus...injury to PTOA. While development of PTOA in the human ankle is often reported to occur very quickly (approximately 2 years after injury), even this...vinylpyridine. Anal Biochem 1980; 106(1): 207-12. Distribution and Progression of Chondrocyte Damage in a Whole-Organ Model of Human Ankle Intra

  1. Towards realistic models from Higher-Dimensional theories with Fuzzy extra dimensions

    CERN Document Server

    Gavriil, D.; Zoupanos, G.

    2014-01-01

We briefly review the Coset Space Dimensional Reduction (CSDR) programme and the best model constructed so far, and then we present some details of the corresponding programme in the case that the extra dimensions are considered to be fuzzy. In particular, we present a four-dimensional $\mathcal{N} = 4$ Super Yang-Mills theory, orbifolded by $\mathbb{Z}_3$, which mimics the behaviour of a dimensionally reduced $\mathcal{N} = 1$, 10-dimensional gauge theory over a set of fuzzy spheres at intermediate high scales and leads to the trinification GUT $SU(3)^3$ at a slightly lower scale, which in turn can be spontaneously broken to the MSSM at low scales.

  2. A Computational, Tissue-Realistic Model of Pressure Ulcer Formation in Individuals with Spinal Cord Injury.

    Directory of Open Access Journals (Sweden)

    Cordelia Ziraldo

    2015-06-01

Full Text Available People with spinal cord injury (SCI) are predisposed to pressure ulcers (PU). PU remain a significant burden in cost of care and quality of life despite improved mechanistic understanding and advanced interventions. An agent-based model (ABM) of ischemia/reperfusion-induced inflammation and PU (the PUABM) was created, calibrated to serial images of post-SCI PU, and used to investigate potential treatments in silico. Tissue-level features of the PUABM recapitulated visual patterns of ulcer formation in individuals with SCI. These morphological features, along with simulated cell counts and mediator concentrations, suggested that the influence of inflammatory dynamics caused simulations to be committed to "better" vs. "worse" outcomes by 4 days of simulated time and prior to ulcer formation. Sensitivity analysis of model parameters suggested that increasing oxygen availability would reduce PU incidence. Using the PUABM, in silico trials of anti-inflammatory treatments such as corticosteroids and a neutralizing antibody targeted at Damage-Associated Molecular Pattern molecules (DAMPs) suggested that, at best, early application at a sufficiently high dose could attenuate local inflammation and reduce pressure-associated tissue damage, but could not reduce PU incidence. The PUABM thus shows promise as an adjunct for mechanistic understanding, diagnosis, and design of therapies in the setting of PU.

  3. A Computational, Tissue-Realistic Model of Pressure Ulcer Formation in Individuals with Spinal Cord Injury.

    Science.gov (United States)

    Ziraldo, Cordelia; Solovyev, Alexey; Allegretti, Ana; Krishnan, Shilpa; Henzel, M Kristi; Sowa, Gwendolyn A; Brienza, David; An, Gary; Mi, Qi; Vodovotz, Yoram

    2015-06-01

    People with spinal cord injury (SCI) are predisposed to pressure ulcers (PU). PU remain a significant burden in cost of care and quality of life despite improved mechanistic understanding and advanced interventions. An agent-based model (ABM) of ischemia/reperfusion-induced inflammation and PU (the PUABM) was created, calibrated to serial images of post-SCI PU, and used to investigate potential treatments in silico. Tissue-level features of the PUABM recapitulated visual patterns of ulcer formation in individuals with SCI. These morphological features, along with simulated cell counts and mediator concentrations, suggested that the influence of inflammatory dynamics caused simulations to be committed to "better" vs. "worse" outcomes by 4 days of simulated time and prior to ulcer formation. Sensitivity analysis of model parameters suggested that increasing oxygen availability would reduce PU incidence. Using the PUABM, in silico trials of anti-inflammatory treatments such as corticosteroids and a neutralizing antibody targeted at Damage-Associated Molecular Pattern molecules (DAMPs) suggested that, at best, early application at a sufficiently high dose could attenuate local inflammation and reduce pressure-associated tissue damage, but could not reduce PU incidence. The PUABM thus shows promise as an adjunct for mechanistic understanding, diagnosis, and design of therapies in the setting of PU.

  4. Parallel Solver for Diffuse Optical Tomography on Realistic Head Models With Scattering and Clear Regions.

    Science.gov (United States)

    Placati, Silvio; Guermandi, Marco; Samore, Andrea; Scarselli, Eleonora Franchi; Guerrieri, Roberto

    2016-09-01

Diffuse optical tomography is an imaging technique, based on evaluation of how light propagates within the human head, used to obtain functional information about the brain. Precision in reconstructing such a map of optical properties is highly affected by the accuracy of the light propagation model implemented, which needs to take into account the presence of clear and scattering tissues. We present a numerical solver based on the radiosity-diffusion model, integrating the anatomical information provided by a structural MRI. The solver is designed to run on parallel heterogeneous platforms based on multiple GPUs and CPUs. We demonstrate how the solver provides a 7 times speed-up over an isotropic-scattered parallel Monte Carlo engine based on a radiative transport equation for a domain composed of 2 million voxels, along with a significant improvement in accuracy. The speed-up greatly increases for larger domains, allowing us to compute the light distribution of a full human head (≈3 million voxels) in 116 s for the platform used.

  5. A Computational, Tissue-Realistic Model of Pressure Ulcer Formation in Individuals with Spinal Cord Injury

    Science.gov (United States)

    Ziraldo, Cordelia; Solovyev, Alexey; Allegretti, Ana; Krishnan, Shilpa; Henzel, M. Kristi; Sowa, Gwendolyn A.; Brienza, David; An, Gary; Mi, Qi; Vodovotz, Yoram

    2015-01-01

    People with spinal cord injury (SCI) are predisposed to pressure ulcers (PU). PU remain a significant burden in cost of care and quality of life despite improved mechanistic understanding and advanced interventions. An agent-based model (ABM) of ischemia/reperfusion-induced inflammation and PU (the PUABM) was created, calibrated to serial images of post-SCI PU, and used to investigate potential treatments in silico. Tissue-level features of the PUABM recapitulated visual patterns of ulcer formation in individuals with SCI. These morphological features, along with simulated cell counts and mediator concentrations, suggested that the influence of inflammatory dynamics caused simulations to be committed to “better” vs. “worse” outcomes by 4 days of simulated time and prior to ulcer formation. Sensitivity analysis of model parameters suggested that increasing oxygen availability would reduce PU incidence. Using the PUABM, in silico trials of anti-inflammatory treatments such as corticosteroids and a neutralizing antibody targeted at Damage-Associated Molecular Pattern molecules (DAMPs) suggested that, at best, early application at a sufficiently high dose could attenuate local inflammation and reduce pressure-associated tissue damage, but could not reduce PU incidence. The PUABM thus shows promise as an adjunct for mechanistic understanding, diagnosis, and design of therapies in the setting of PU. PMID:26111346

  6. CFD Modelling of Abdominal Aortic Aneurysm on Hemodynamic Loads Using a Realistic Geometry with CT

    Directory of Open Access Journals (Sweden)

    Eduardo Soudah

    2013-01-01

Full Text Available The objective of this study is to find a correlation between the abdominal aortic aneurysm (AAA) geometric parameters, wall shear stress (WSS), abdominal flow patterns, intraluminal thrombus (ILT), and AAA arterial wall rupture using computational fluid dynamics (CFD). Real AAA 3D models were created by three-dimensional (3D) reconstruction of in vivo acquired computed tomography (CT) images from 5 patients. Based on 3D AAA models, high quality volume meshes were created using an optimal tetrahedral aspect ratio for the whole domain. In order to quantify the WSS and the recirculation inside the AAA, a 3D CFD using finite elements analysis was used. The CFD computation was performed assuming that the arterial wall is rigid and the blood is considered a homogeneous Newtonian fluid with a density of 1050 kg/m3 and a dynamic viscosity of 4×10^-3 Pa·s. Parallelization procedures were used in order to increase the performance of the CFD calculations. A relation between AAA geometric parameters (asymmetry index (β), saccular index (γ), deformation diameter ratio (χ), and tortuosity index (ε)) and hemodynamic loads was observed, and it could be used as a potential predictor of AAA arterial wall rupture and potential ILT formation.
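
    The inverse-cube dependence of wall shear stress on diameter explains why WSS collapses inside a dilated sac. A crude Poiseuille estimate with an assumed flow rate (the study itself resolves WSS with 3D finite-element CFD, so this is only an order-of-magnitude sketch):

    ```python
    import math

    def poiseuille_wss(Q, d, mu=4.0e-3):
        """Wall shear stress (Pa) for steady Poiseuille flow of a Newtonian
        fluid in a straight tube: tau = 32*mu*Q / (pi*d^3). A crude estimate
        only; patient-specific 3D CFD captures recirculation that this cannot."""
        return 32.0 * mu * Q / (math.pi * d ** 3)

    Q = 2.0e-5        # assumed mean abdominal aortic flow, m^3/s (~1.2 L/min)
    tau_healthy = poiseuille_wss(Q, d=0.02)   # ~2 cm healthy aorta
    tau_aneurysm = poiseuille_wss(Q, d=0.05)  # ~5 cm aneurysmal sac
    # WSS falls as d^-3, so dilation slashes shear in the sac, a low-shear
    # condition associated with recirculation and ILT formation.
    assert tau_aneurysm < tau_healthy / 10
    ```

    In a real AAA the asymmetry, saccular and tortuosity indices modulate this baseline picture, which is precisely the correlation the study quantifies.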

  7. Photo Realistic 3d Modeling with Uav: GEDİK Ahmet Pasha Mosque in AFYONKARAHİSAR

    Science.gov (United States)

    Uysal, M.; Toprak, A. S.; Polat, N.

    2013-07-01

Many of the cultural heritages in the world have been totally or partly destroyed by natural events and human activities such as earthquakes, floods and fires. Cultural heritage is a legacy for us and a trust for the next generation; to deliver it to future generations, it has to be protected and registered. There are different methods for this registration, but Photogrammetry is the most accurate and rapid one. Photogrammetry enables us to register cultural heritage and to generate photo-realistic 3D models. Nowadays, 3D models are being used in various fields such as education and tourism. In the registration of complex and tall constructions by Photogrammetry, there are some problems in data acquisition and processing; especially for photographs of tall constructions, additional equipment such as balloons and lifters is required. In recent years, Unmanned Aerial Vehicles (UAVs) have come into common use in different fields for different goals. In Photogrammetry, UAVs are used particularly for data acquisition. It is not always easy to capture data due to the situation of historical places and their neighbourhood, and the use of UAVs for documentation of cultural heritage can make an important contribution. The main goals of this study are to survey cultural heritage by Photogrammetry and to investigate the potential of UAVs in 3D modelling. For this purpose we surveyed Gedik Ahmet Pasha Mosque photogrammetrically by UAV and produced a photo-realistic 3D model. Gedik Ahmet Pasha, the Grand Vizier of Fatih Sultan Mehmet, was in Afyonkarahisar during the campaign to Karaman between the years 1472-1473. Out of admiration for this city, he asked Architect Ayaz Agha to build a complex of a bathhouse, a mosque and a madrasah here in Afyon. Gedik Ahmet Pasha Mosque is in the centre of this complex. 
Gedik Ahmet Pasha Mosque is popularly known as Imaret Mosque among the people of Afyon

  8. PHOTO REALISTIC 3D MODELING WITH UAV: GEDİK AHMET PASHA MOSQUE IN AFYONKARAHİSAR

    Directory of Open Access Journals (Sweden)

    M. Uysal

    2013-07-01

Full Text Available Many of the cultural heritages in the world have been totally or partly destroyed by natural events and human activities such as earthquakes, floods and fires. Cultural heritage is a legacy for us and a trust for the next generation; to deliver it to future generations, it has to be protected and registered. There are different methods for this registration, but Photogrammetry is the most accurate and rapid one. Photogrammetry enables us to register cultural heritage and to generate photo-realistic 3D models. Nowadays, 3D models are being used in various fields such as education and tourism. In the registration of complex and tall constructions by Photogrammetry, there are some problems in data acquisition and processing; especially for photographs of tall constructions, additional equipment such as balloons and lifters is required. In recent years, Unmanned Aerial Vehicles (UAVs) have come into common use in different fields for different goals. In Photogrammetry, UAVs are used particularly for data acquisition. It is not always easy to capture data due to the situation of historical places and their neighbourhood, and the use of UAVs for documentation of cultural heritage can make an important contribution. The main goals of this study are to survey cultural heritage by Photogrammetry and to investigate the potential of UAVs in 3D modelling. For this purpose we surveyed Gedik Ahmet Pasha Mosque photogrammetrically by UAV and produced a photo-realistic 3D model. Gedik Ahmet Pasha, the Grand Vizier of Fatih Sultan Mehmet, was in Afyonkarahisar during the campaign to Karaman between the years 1472–1473. Out of admiration for this city, he asked Architect Ayaz Agha to build a complex of a bathhouse, a mosque and a madrasah here in Afyon. Gedik Ahmet Pasha Mosque is in the centre of this complex. 
Gedik Ahmet Pasha Mosque is popularly known as Imaret Mosque among

  9. Modeling Thermal Transport and Surface Deformation on Europa using Realistic Rheologies

    Science.gov (United States)

    Linneman, D.; Lavier, L.; Becker, T. W.; Soderlund, K. M.

    2017-12-01

Most existing studies of Europa's icy shell model the ice as a Maxwell visco-elastic solid or viscous fluid. However, these approaches do not allow for modeling of localized deformation of the brittle part of the ice shell, which is important for understanding the satellite's evolution and unique geology. Here, we model the shell as a visco-elasto-plastic material, with a brittle Mohr-Coulomb elasto-plastic layer on top of a convective Maxwell viscoelastic layer, to investigate how thermal transport processes relate to the observed deformation and topography on Europa's surface. We use the Fast Lagrangian Analysis of Continua (FLAC) code, which employs an explicit time-stepping algorithm to simulate deformation processes in Europa's icy shell. Heat transfer drives surface deformation within the icy shell through convection and tidal dissipation due to its elliptical orbit around Jupiter. We first analyze the visco-elastic behavior of a convecting ice layer and the parameters that govern this behavior. The regime of deformation depends on the magnitude of the stress (diffusion creep at low stresses, grain-size-sensitive creep at intermediate stresses, dislocation creep at high stresses), so we calculate the effective viscosity at each time step using the constitutive stress-strain equation and a combined flow law that accounts for all types of deformation. The tidal dissipation rate is calculated as a function of the temperature-dependent Maxwell relaxation time and the square of the second invariant of the strain rate averaged over each orbital period. After initiating convection in the viscoelastic layer with an initial temperature perturbation, we add an elastoplastic layer on top of the convecting layer and analyze how the brittle ice reacts to stresses from below and any resulting topography. We also take into account shear heating along fractures in the brittle layer. 
We vary factors such as total shell thickness and minimum viscosity, as these parameters are

  10. Towards realistic modelling of spectral line formation - lessons learnt from red giants

    Science.gov (United States)

    Lind, Karin

    2015-08-01

    Many decades of quantitative spectroscopic studies of red giants have revealed much about the formation histories and interlinks between the main components of the Galaxy and its satellites. Telescopes and instrumentation are now able to deliver high-resolution data of superb quality for large stellar samples and Galactic archaeology has entered a new era. At the same time, we have learnt how simplifying physical assumptions in the modelling of spectroscopic data can bias the interpretations, in particular one-dimensional homogeneity and local thermodynamic equilibrium (LTE). I will present lessons learnt so far from non-LTE spectral line formation in 3D radiation-hydrodynamic atmospheres of red giants, the smaller siblings of red supergiants.

  11. Realistic Creativity Training for Innovation Practitioners: The Know-Recognize-React Model

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

As creativity becomes increasingly recognized as important raw material for innovation, the importance of identifying ways to increase practitioners’ creativity through rigorously designed creativity training programs is highlighted. Therefore we set out to design a creativity training program sp...... the transdisciplinary study described in this paper. Co-creation was employed as a method to ensure the three layers of focus would be taken into consideration. The result is a program called Creative Awareness Training which is based on the new Know-Recognize-React model.

  12. Toward computational cumulative biology by combining models of biological datasets.

    Science.gov (United States)

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations: for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database.
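
    The combination model decomposes a new dataset into non-negative contributions from previously fitted models. A generic sketch of that idea using projected gradient descent on a least-squares objective (not the authors' exact algorithm; the profiles and data are toy vectors):

    ```python
    import numpy as np

    def decompose(new_data, model_profiles, n_iter=500, lr=0.01):
        """Decompose a new dataset's feature vector into non-negative
        contributions from earlier models' profiles by projected gradient
        descent on ||x - M w||^2 subject to w >= 0. A generic sketch of the
        combination-model idea, not the paper's exact method."""
        M = np.asarray(model_profiles, dtype=float).T   # features x models
        x = np.asarray(new_data, dtype=float)
        w = np.full(M.shape[1], 1.0 / M.shape[1])
        for _ in range(n_iter):
            grad = M.T @ (M @ w - x)
            w = np.clip(w - lr * grad, 0.0, None)       # project onto w >= 0
        return w

    # Toy check: a dataset that is a mix of model 0 and model 2.
    profiles = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 1]]
    x = 0.7 * np.array(profiles[0]) + 0.3 * np.array(profiles[2])
    w = decompose(x, profiles)
    assert w[0] > w[1] and w[2] > w[1]   # the relevant models get the weight
    ```

    Ranking earlier datasets by their weight in this decomposition is what turns a library of per-dataset models into a retrieval engine.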

  13. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  14. Statistical multi-path exposure method for assessing the whole-body SAR in a heterogeneous human body model in a realistic environment.

    Science.gov (United States)

    Vermeeren, Günter; Joseph, Wout; Martens, Luc

    2013-04-01

    Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied to the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with respect to the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.

  15. Modeling biology using relational databases.

    Science.gov (United States)

    Peitzsch, Robert M

    2003-02-01

    There are several different methodologies that can be used for designing a database schema; no one is the best for all occasions. This unit demonstrates two different techniques for designing relational tables and discusses when each should be used. These two techniques presented are (1) traditional Entity-Relationship (E-R) modeling and (2) a hybrid method that combines aspects of data warehousing and E-R modeling. The method of choice depends on (1) how well the information and all its inherent relationships are understood, (2) what types of questions will be asked, (3) how many different types of data will be included, and (4) how much data exists.

  16. Modelling effects of diquat under realistic exposure patterns in genetically differentiated populations of the gastropod Lymnaea stagnalis.

    Science.gov (United States)

    Ducrot, Virginie; Péry, Alexandre R R; Lagadic, Laurent

    2010-11-12

    Pesticide use leads to complex exposure and response patterns in non-target aquatic species, so that the analysis of data from standard toxicity tests may result in unrealistic risk forecasts. Developing models that are able to capture such complexity from toxicity test data is thus a crucial issue for pesticide risk assessment. In this study, freshwater snails from two genetically differentiated populations of Lymnaea stagnalis were exposed to repeated acute applications of environmentally realistic concentrations of the herbicide diquat, from the embryo to the adult stage. Hatching rate, embryonic development duration, juvenile mortality, feeding rate and age at first spawning were investigated during both exposure and recovery periods. Effects of diquat on mortality were analysed using a threshold hazard model accounting for time-varying herbicide concentrations. All endpoints were significantly impaired at environmentally realistic diquat concentrations in both populations. Snail evolutionary history had no significant impact on their sensitivity and responsiveness to diquat, whereas food acted as a modulating factor of toxicant-induced mortality. The time course of effects was adequately described by the model, which thus appears suitable for analysing long-term effects of complex exposure patterns based on full life-cycle experiment data. The model outputs obtained (e.g. no-effect concentrations) could be used directly for chemical risk assessment.
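The threshold hazard structure described in the abstract (mortality accrues only while the time-varying concentration exceeds a no-effect threshold) can be sketched in a few lines. All parameter values and the pulse pattern below are hypothetical, not the fitted values from the study:

```python
import numpy as np

# Threshold hazard model with time-varying exposure (illustrative parameters):
#   hazard  h(t) = kk * max(0, C(t) - NEC)
#   survival S(t) = exp(-integral of h from 0 to t)
def survival(conc, dt, nec, kk):
    hazard = kk * np.maximum(0.0, conc - nec)
    return np.exp(-np.cumsum(hazard) * dt)

dt = 0.1                           # time step, days
t = np.arange(0.0, 30.0, dt)
conc = np.zeros_like(t)
conc[50:70] = 2.0                  # first acute pulse (around days 5-7)
conc[150:170] = 2.0                # second acute pulse (around days 15-17)

S = survival(conc, dt, nec=0.5, kk=0.2)
```

Between pulses the concentration is below the threshold, so the hazard is zero and survival stays flat; mortality is incurred only during the exposure windows, which is the behaviour that distinguishes this model from a simple time-independent dose-response curve.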

  17. A realistic multimodal modeling approach for the evaluation of distributed source analysis: application to sLORETA.

    Science.gov (United States)

    Cosandier-Rimélé, D; Ramantani, G; Zentner, J; Schulze-Bonhage, A; Dümpelmann, M

    2017-10-01

    Electrical source localization (ESL) deriving from scalp EEG and, in recent years, from intracranial EEG (iEEG), is an established method in epilepsy surgery workup. We aimed to validate the distributed ESL derived from scalp EEG and iEEG, particularly regarding the spatial extent of the source, using a realistic epileptic spike activity simulator. ESL was applied to the averaged scalp EEG and iEEG spikes of two patients with drug-resistant structural epilepsy. The ESL results for both patients were used to outline the location and extent of epileptic cortical patches, which served as the basis for designing a spatiotemporal source model. EEG signals for both modalities were then generated for different anatomic locations and spatial extents. ESL was subsequently performed on simulated signals with sLORETA, a commonly used distributed algorithm. ESL accuracy was quantitatively assessed for iEEG and scalp EEG. The source volume was overestimated by sLORETA at both EEG scales, with the error increasing with source size, particularly for iEEG. For larger sources, ESL accuracy drastically decreased, and reconstruction volumes shifted to the center of the head for iEEG, while remaining stable for scalp EEG. Overall, the mislocalization of the reconstructed source was more pronounced for iEEG. We present a novel multiscale framework for the evaluation of distributed ESL, based on realistic multiscale EEG simulations. Our findings support that reconstruction results for scalp EEG are often more accurate than for iEEG, owing to the superior 3D coverage of the head. Particularly the iEEG-derived reconstruction results for larger, widespread generators should be treated with caution.

  18. A realistic multimodal modeling approach for the evaluation of distributed source analysis: application to sLORETA

    Science.gov (United States)

    Cosandier-Rimélé, D.; Ramantani, G.; Zentner, J.; Schulze-Bonhage, A.; Dümpelmann, M.

    2017-10-01

    Objective. Electrical source localization (ESL) deriving from scalp EEG and, in recent years, from intracranial EEG (iEEG), is an established method in epilepsy surgery workup. We aimed to validate the distributed ESL derived from scalp EEG and iEEG, particularly regarding the spatial extent of the source, using a realistic epileptic spike activity simulator. Approach. ESL was applied to the averaged scalp EEG and iEEG spikes of two patients with drug-resistant structural epilepsy. The ESL results for both patients were used to outline the location and extent of epileptic cortical patches, which served as the basis for designing a spatiotemporal source model. EEG signals for both modalities were then generated for different anatomic locations and spatial extents. ESL was subsequently performed on simulated signals with sLORETA, a commonly used distributed algorithm. ESL accuracy was quantitatively assessed for iEEG and scalp EEG. Main results. The source volume was overestimated by sLORETA at both EEG scales, with the error increasing with source size, particularly for iEEG. For larger sources, ESL accuracy drastically decreased, and reconstruction volumes shifted to the center of the head for iEEG, while remaining stable for scalp EEG. Overall, the mislocalization of the reconstructed source was more pronounced for iEEG. Significance. We present a novel multiscale framework for the evaluation of distributed ESL, based on realistic multiscale EEG simulations. Our findings support that reconstruction results for scalp EEG are often more accurate than for iEEG, owing to the superior 3D coverage of the head. Particularly the iEEG-derived reconstruction results for larger, widespread generators should be treated with caution.

  19. ON THE REQUIREMENTS FOR REALISTIC MODELING OF NEUTRINO TRANSPORT IN SIMULATIONS OF CORE-COLLAPSE SUPERNOVAE

    Energy Technology Data Exchange (ETDEWEB)

    Lentz, Eric J. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996-1200 (United States); Mezzacappa, Anthony; Hix, W. Raphael [Physics Division, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6354 (United States); Messer, O. E. Bronson [Computer Science and Mathematics Division, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6164 (United States); Liebendoerfer, Matthias [Department of Physics, University of Basel, Klingelbergstrasse 82, CH-4056 Basel (Switzerland); Bruenn, Stephen W., E-mail: elentz@utk.edu, E-mail: mezzacappaa@ornl.gov [Department of Physics, Florida Atlantic University, 777 Glades Road, Boca Raton, FL 33431-0991 (United States)

    2012-03-01

    We have conducted a series of numerical experiments with the spherically symmetric, general relativistic, neutrino radiation hydrodynamics code AGILE-BOLTZTRAN to examine the effects of several approximations used in multidimensional core-collapse supernova simulations. Our code permits us to examine the effects of these approximations quantitatively by removing, or substituting for, the pieces of supernova physics of interest. These approximations include: (1) using Newtonian versus general relativistic gravity, hydrodynamics, and transport; (2) using a reduced set of weak interactions, including the omission of non-isoenergetic neutrino scattering, versus the current state-of-the-art; and (3) omitting the velocity-dependent terms, or observer corrections, from the neutrino Boltzmann kinetic equation. We demonstrate that each of these changes has noticeable effects on the outcomes of our simulations. Of these, we find that the omission of observer corrections is particularly detrimental to the potential for neutrino-driven explosions and exhibits a failure to conserve lepton number. Finally, we discuss the impact of these results on our understanding of current, and the requirements for future, multidimensional models.

  20. Accessory enzymes influence cellulase hydrolysis of the model substrate and the realistic lignocellulosic biomass.

    Science.gov (United States)

    Sun, Fubao Fuebiol; Hong, Jiapeng; Hu, Jinguang; Saddler, Jack N; Fang, Xu; Zhang, Zhenyu; Shen, Song

    2015-11-01

    The potential of cellulase enzymes in the developing and ongoing "biorefinery" industry has provided great motivation to develop an efficient cellulase mixture. Recent work has shown how important a role the so-called accessory enzymes can play in effective enzymatic hydrolysis. In this study, the three newest Novozymes Cellic CTec cellulase preparations (CTec 1/2/3) were compared in hydrolyzing steam-pretreated lignocellulosic substrates and model substrates at an identical FPA loading. These cellulase preparations were found to display significantly different hydrolytic performances irrespective of the FPA, and this difference was observed even on the filter paper itself when the FPA-based assay was revisited. The analysis of specific enzyme activities in the cellulase preparations demonstrated that different accessory enzymes were mainly responsible for the discrepancy in enzymatic hydrolysis between diversified substrates and the various cellulases. This active role of the accessory enzymes present in cellulase preparations was finally verified by supplementation with β-glucosidase, xylanase and lytic polysaccharide monooxygenases (AA9). This paper provides new insights into the role of accessory enzymes, which can further serve as a useful reference for the rational customization of cellulase cocktails in order to realize efficient conversion of natural lignocellulosic substrates. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. ON THE REQUIREMENTS FOR REALISTIC MODELING OF NEUTRINO TRANSPORT IN SIMULATIONS OF CORE-COLLAPSE SUPERNOVAE

    International Nuclear Information System (INIS)

    Lentz, Eric J.; Mezzacappa, Anthony; Hix, W. Raphael; Messer, O. E. Bronson; Liebendörfer, Matthias; Bruenn, Stephen W.

    2012-01-01

    We have conducted a series of numerical experiments with the spherically symmetric, general relativistic, neutrino radiation hydrodynamics code AGILE-BOLTZTRAN to examine the effects of several approximations used in multidimensional core-collapse supernova simulations. Our code permits us to examine the effects of these approximations quantitatively by removing, or substituting for, the pieces of supernova physics of interest. These approximations include: (1) using Newtonian versus general relativistic gravity, hydrodynamics, and transport; (2) using a reduced set of weak interactions, including the omission of non-isoenergetic neutrino scattering, versus the current state-of-the-art; and (3) omitting the velocity-dependent terms, or observer corrections, from the neutrino Boltzmann kinetic equation. We demonstrate that each of these changes has noticeable effects on the outcomes of our simulations. Of these, we find that the omission of observer corrections is particularly detrimental to the potential for neutrino-driven explosions and exhibits a failure to conserve lepton number. Finally, we discuss the impact of these results on our understanding of current, and the requirements for future, multidimensional models.

  2. Electron scattering data as the basis for kinetic models -- what can we realistically provide, and how?

    Science.gov (United States)

    Buckman, Stephen

    2009-10-01

    It is unlikely that anyone would dispute the important role that the availability of accurate data can play in the modeling and simulation of low temperature plasmas. Fundamental measurements of collision processes, from the relatively simple (e.g. elastic scattering) to the complex (e.g. molecular dissociation), are critical to developing an understanding of discharge and plasma behaviour. While there has been a healthy relationship between the data users and data gatherers at meetings such as GEC for many years, there are often misunderstandings about the capabilities that reside in each of these areas, and how best to maintain and strengthen the communication between them. This paper will attempt to summarise those electron-driven processes that are accessible, in a quantitative sense, in modern scattering experiments. Advances in treating reactive and excited species will also be discussed, as will the potential to push our measurement technologies further. An inescapable conclusion is that the collision community can best contribute through a strategic alliance between experiment and theory. Theory should be benchmarked against experiment for those processes and targets that are accessible, and used wisely for those processes where experiment cannot contribute.

  3. Structured population models in biology and epidemiology

    CERN Document Server

    Ruan, Shigui

    2008-01-01

    This book consists of six chapters written by leading researchers in mathematical biology. These chapters present recent and important developments in the study of structured population models in biology and epidemiology. Topics include population models structured by age, size, and spatial position; size-structured models for metapopulations, macroparasitic diseases, and prion proliferation; models for transmission of microparasites between host populations living on non-coincident spatial domains; spatiotemporal patterns of disease spread; method of aggregation of variables in population dynamics; and biofilm models. It is suitable as a textbook for a mathematical biology course or a summer school at the advanced undergraduate and graduate level. It can also serve as a reference book for researchers looking for either interesting and specific problems to work on or useful techniques and discussions of some particular problems.

  4. From Realistic to Simple Models of Associating Fluids. II. Primitive Models of Ammonia, Ethanol and Models of Water Revisited

    Czech Academy of Sciences Publication Activity Database

    Vlček, Lukáš; Nezbeda, Ivo

    2004-01-01

    Roč. 102, č. 5 (2004), s. 485-497 ISSN 0026-8976 R&D Projects: GA ČR GA203/02/0764; GA AV ČR IAA4072303; GA AV ČR IAA4072309 Institutional research plan: CEZ:AV0Z4072921 Keywords : primitive model * association fluids * ethanol Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 1.406, year: 2004

  5. Normal and Pathological NCAT Image and Phantom Data Based on Physiologically Realistic Left Ventricle Finite-Element Models

    International Nuclear Information System (INIS)

    Veress, Alexander I.; Segars, W. Paul; Weiss, Jeffrey A.; Tsui, Benjamin M.W.; Gullberg, Grant T.

    2006-01-01

    The 4D NURBS-based Cardiac-Torso (NCAT) phantom, which provides a realistic model of the normal human anatomy and cardiac and respiratory motions, is used in medical imaging research to evaluate and improve imaging devices and techniques, especially dynamic cardiac applications. One limitation of the phantom is that it lacks the ability to accurately simulate altered functions of the heart that result from cardiac pathologies such as coronary artery disease (CAD). The goal of this work was to enhance the 4D NCAT phantom by incorporating a physiologically based, finite-element (FE) mechanical model of the left ventricle (LV) to simulate both normal and abnormal cardiac motions. The geometry of the FE mechanical model was based on gated high-resolution x-ray multi-slice computed tomography (MSCT) data of a healthy male subject. The myocardial wall was represented as transversely isotropic hyperelastic material, with the fiber angle varying from -90 degrees at the epicardial surface, through 0 degrees at the mid-wall, to 90 degrees at the endocardial surface. A time varying elastance model was used to simulate fiber contraction, and physiological intraventricular systolic pressure-time curves were applied to simulate the cardiac motion over the entire cardiac cycle. To demonstrate the ability of the FE mechanical model to accurately simulate the normal cardiac motion as well as abnormal motions indicative of CAD, a normal case and two pathologic cases were simulated and analyzed. In the first pathologic model, a subendocardial anterior ischemic region was defined. A second model was created with a transmural ischemic region defined in the same location. The FE based deformations were incorporated into the 4D NCAT cardiac model through the control points that define the cardiac structures in the phantom which were set to move according to the predictions of the mechanical model. A simulation study was performed using the FE-NCAT combination to investigate how the
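The time-varying elastance idea used to drive fiber contraction can be sketched compactly: ventricular pressure follows P(t) = E(t) · (V(t) − V0), where E(t) ramps between a diastolic and a systolic elastance. The activation shape, timing, and all parameter values below (Emax, Emin, V0, volumes) are illustrative placeholders, not those of the NCAT/FE model:

```python
import numpy as np

# Minimal time-varying elastance sketch: P(t) = E(t) * (V(t) - V0).
def elastance(t, T=0.8, Ts=0.3, Emax=2.5, Emin=0.06):
    # normalized activation: sin^2 pulse during systole, zero in diastole
    phase = t % T
    act = np.where(phase < Ts, np.sin(np.pi * phase / Ts) ** 2, 0.0)
    return Emin + (Emax - Emin) * act            # mmHg/mL

t = np.linspace(0.0, 0.8, 401)                   # one cardiac cycle, s
phase = t % 0.8
# prescribed LV volume: ejection synchronized with activation (a simplification)
V = 120.0 - 50.0 * np.where(phase < 0.3, np.sin(np.pi * phase / 0.3) ** 2, 0.0)  # mL
P = elastance(t) * (V - 10.0)                    # LV pressure, mmHg, V0 = 10 mL
```

With these placeholder numbers the sketch yields a peak systolic pressure of 150 mmHg at peak activation and a low diastolic pressure, i.e. the qualitative pressure-volume behaviour that such an elastance model is meant to reproduce.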

  6. The effects of simulating a realistic eye model on the eye dose of an adult male undergoing head computed tomography.

    Science.gov (United States)

    Akhlaghi, Parisa; Ebrahimi-Khankook, Atiyeh; Vejdani-Noghreiyan, Alireza

    2017-05-01

    In head computed tomography, irradiation of the eye lens (an organ of high radiosensitivity) may cause lenticular opacity and cataracts. Therefore, quantitative assessment of the dose to the eye lens and surrounding tissue is a matter of concern. For this purpose, an accurate eye model with realistic geometry and shape, in which the different eye substructures are considered, is needed. To calculate the absorbed radiation dose of the visual organs during head computed tomography scans, in this study an existing sophisticated eye model was inserted at the corresponding location in the head of the reference adult male phantom recommended by the International Commission on Radiological Protection (ICRP). Absorbed doses and distributions of energy deposition in different parts of this eye model were then calculated and compared with those based on a previous simple eye model. All calculations were done using the Monte Carlo code MCNP4C for tube voltages of 80, 100, 120 and 140 kVp. Although the total dose to the eye lens was similar for both eye models, the dose delivered to the sensitive zone, which plays an important role in the induction of cataracts, was on average 3% higher for the sophisticated model than for the simple model. As the tube voltage increases, the difference in the total eye-lens dose between the two models decreases to 1%. Given this level of agreement, use of the sophisticated eye model is not necessary for patient dosimetry. However, it remains helpful for estimating the doses received by the different eye substructures separately.

  7. Normal and Pathological NCAT Image and PhantomData Based onPhysiologically Realistic Left Ventricle Finite-Element Models

    Energy Technology Data Exchange (ETDEWEB)

    Veress, Alexander I.; Segars, W. Paul; Weiss, Jeffrey A.; Tsui,Benjamin M.W.; Gullberg, Grant T.

    2006-08-02

    The 4D NURBS-based Cardiac-Torso (NCAT) phantom, which provides a realistic model of the normal human anatomy and cardiac and respiratory motions, is used in medical imaging research to evaluate and improve imaging devices and techniques, especially dynamic cardiac applications. One limitation of the phantom is that it lacks the ability to accurately simulate altered functions of the heart that result from cardiac pathologies such as coronary artery disease (CAD). The goal of this work was to enhance the 4D NCAT phantom by incorporating a physiologically based, finite-element (FE) mechanical model of the left ventricle (LV) to simulate both normal and abnormal cardiac motions. The geometry of the FE mechanical model was based on gated high-resolution x-ray multi-slice computed tomography (MSCT) data of a healthy male subject. The myocardial wall was represented as transversely isotropic hyperelastic material, with the fiber angle varying from -90 degrees at the epicardial surface, through 0 degrees at the mid-wall, to 90 degrees at the endocardial surface. A time varying elastance model was used to simulate fiber contraction, and physiological intraventricular systolic pressure-time curves were applied to simulate the cardiac motion over the entire cardiac cycle. To demonstrate the ability of the FE mechanical model to accurately simulate the normal cardiac motion as well as abnormal motions indicative of CAD, a normal case and two pathologic cases were simulated and analyzed. In the first pathologic model, a subendocardial anterior ischemic region was defined. A second model was created with a transmural ischemic region defined in the same location. The FE based deformations were incorporated into the 4D NCAT cardiac model through the control points that define the cardiac structures in the phantom which were set to move according to the predictions of the mechanical model. A simulation study was performed using the FE-NCAT combination to investigate how the differences in contractile function

  8. Magnetism of metallacrown single-molecule magnets: From a simplest model to realistic systems

    Science.gov (United States)

    Pavlyukh, Y.; Rentschler, E.; Elmers, H. J.; Hübner, W.; Lefkidis, G.

    2018-06-01

    Electronic and magnetic properties of molecular nanomagnets are determined by competing energy scales due to the crystal field splitting, the exchange interactions between transition metal atoms, and relativistic effects. We present a comprehensive theory embracing all these phenomena based on first-principles calculations. In order to achieve this goal, we start from the FeNi4 cluster as a paradigm. The system can be accurately described on the ab initio level, yielding all expected electronic states in a range of multiplicities from 1 to 9, with a ferromagnetic ground state. By adding the spin-orbit coupling between them, we obtain the zero-field splitting. This allows us to introduce a spin Hamiltonian of a giant spin model, which operates on a smaller energy scale. We compare the computed parameters of this Hamiltonian with the experimental and theoretical magnetic anisotropy energies of the monolayer Ni/Cu(001). In line with these, we find that the anisotropy originates almost entirely from the second-order spin-orbit coupling; the spin-spin coupling constitutes only a small fraction. Finally, we include the ligand atoms in our consideration. This component has a decisive role in the stabilization of molecules during experimental synthesis and characterization, and also substantially complicates the theory by bringing superexchange mechanisms into play. Since these are higher-order effects involving two hopping matrix elements, not every theory can describe them. Our generalization of the corresponding perturbation theory substantiates the use of complete active space methods for the description of superexchange. At the same time, our numerical results for the {CuFe4} system demonstrate that the Goodenough-Kanamori rules, which are often used to determine the sign of these exchange interactions, cannot deliver quantitative predictions due to the interplay of other mechanisms, e.g., those involving multicenter Coulomb integrals.
We conclude by comparing ab initio values
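The giant-spin picture above can be made concrete with a small numerical sketch: for a ferromagnetic nonet ground state (2S+1 = 9, i.e. S = 4), the standard zero-field-splitting Hamiltonian H = D·Sz² + E·(Sx² − Sy²) is diagonalized in the |S, m⟩ basis. The D and E values below are illustrative placeholders, not parameters computed in the paper:

```python
import numpy as np

# Build spin operators in the |S, m> basis, m = S..-S (hbar = 1).
def spin_matrices(S):
    m = np.arange(S, -S - 1, -1)
    dim = len(m)
    Sz = np.diag(m).astype(complex)
    Sp = np.zeros((dim, dim), dtype=complex)       # raising operator S+
    for i in range(1, dim):
        mm = m[i]
        # <m+1| S+ |m> = sqrt(S(S+1) - m(m+1))
        Sp[i - 1, i] = np.sqrt(S * (S + 1) - mm * (mm + 1))
    Sx = (Sp + Sp.conj().T) / 2
    Sy = (Sp - Sp.conj().T) / 2j
    return Sx, Sy, Sz

S = 4                          # nonet ground state (2S+1 = 9)
D, E = -0.5, 0.05              # illustrative axial/rhombic ZFS, arbitrary units
Sx, Sy, Sz = spin_matrices(S)
H = D * (Sz @ Sz) + E * (Sx @ Sx - Sy @ Sy)
levels = np.linalg.eigvalsh(H) # the 2S+1 zero-field-split levels
```

For D < 0 (easy axis) the two lowest levels lie near D·S², with the rhombic E term mixing states of Δm = ±2 and producing only small additional shifts; this is the "smaller energy scale" on which the giant-spin Hamiltonian operates.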

  9. A Framework for Realistic Modeling and Display of Object Surface Appearance

    Science.gov (United States)

    Darling, Benjamin A.

    With advances in screen and video hardware technology, the type of content presented on computers has progressed from text and simple shapes to high-resolution photographs, photorealistic renderings, and high-definition video. At the same time, there have been significant advances in the area of content capture, with the development of devices and methods for creating rich digital representations of real-world objects. Unlike photo or video capture, which provide a fixed record of the light in a scene, these new technologies provide information on the underlying properties of the objects, allowing their appearance to be simulated for novel lighting and viewing conditions. These capabilities provide an opportunity to continue the computer display progression, from high-fidelity image presentations to digital surrogates that recreate the experience of directly viewing objects in the real world. In this dissertation, a framework was developed for representing objects with complex color, gloss, and texture properties and displaying them onscreen to appear as if they are part of the real-world environment. At its core, there is a conceptual shift from a traditional image-based display workflow to an object-based one. Instead of presenting the stored patterns of light from a scene, the objective is to reproduce the appearance attributes of a stored object by simulating its dynamic patterns of light for the real viewing and lighting geometry. This is accomplished using a computational approach where the physical light sources are modeled and the observer and display screen are actively tracked. Surface colors are calculated for the real spectral composition of the illumination with a custom multispectral rendering pipeline. In a set of experiments, the accuracy of color and gloss reproduction was evaluated by measuring the screen directly with a spectroradiometer. 
Gloss reproduction was assessed by comparing gonio measurements of the screen output to measurements of the

  10. Unified data model for biological data

    International Nuclear Information System (INIS)

    Idrees, M.

    2014-01-01

    A data model empowers us to store, retrieve and manipulate data in a unified way. We consider biological data consisting of DNA (deoxyribonucleic acid), RNA (ribonucleic acid) and protein structures. In our Bioinformatics Lab (Bioinformatics Lab, Alkhawarizmi Institute of Computer Science, University of Engineering and Technology, Lahore, Pakistan), we have already proposed two data models, for DNA and protein structures individually. In this paper, we propose a unified data model by using the data model of TOS (Temporal Object Oriented System), after making some necessary modifications to it and to our two previously proposed data models. This proposed unified data model can be used for modeling and maintaining biological data (i.e. DNA, RNA and protein structures) in a single unified way. (author)

  11. Laser interaction with biological material mathematical modeling

    CERN Document Server

    Kulikov, Kirill

    2014-01-01

    This book covers the principles of laser interaction with biological cells and tissues of varying degrees of organization. Problems of biomedical diagnostics are considered. The scattering of laser irradiation by blood cells is modeled for biological structures (dermis, epidermis, vascular plexus). An analytic theory is provided which is based on solving the wave equation for the electromagnetic field. It allows the accurate analysis of interference effects arising from the partial superposition of scattered waves. Topics treated in the mathematical modeling include: optical characterization of biological tissue with large-scale and small-scale inhomogeneities in the layers, heating of blood vessels under laser irradiation incident on the outer surface of the skin, and thermo-chemical denaturation of biological structures using the example of human skin.

  12. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. When targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding among potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to choose the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
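The core Information Retrieval mechanism (score each model's annotations against a query, then return a relevance ranking) can be illustrated with a toy TF-IDF sketch. The model identifiers, annotation texts, and scoring below are invented for illustration and are far simpler than the actual BioModels Database engine:

```python
import math
from collections import Counter

# Toy annotation corpus: identifier -> free-text annotation terms (invented).
models = {
    "BIOMD01": "calcium oscillation signalling hepatocyte",
    "BIOMD02": "glycolysis yeast oscillation metabolism",
    "BIOMD03": "cell cycle cyclin cdk regulation",
}

def tfidf_vectors(docs):
    # term frequency * smoothed inverse document frequency, per document
    N = len(docs)
    df = Counter(term for text in docs.values() for term in set(text.split()))
    vecs = {}
    for name, text in docs.items():
        tf = Counter(text.split())
        vecs[name] = {t: tf[t] * math.log((1 + N) / (1 + df[t])) for t in tf}
    return vecs

def rank(query, docs):
    # rank documents by (query-matched weight) / (document vector norm)
    vecs = tfidf_vectors(docs)
    q = set(query.split())
    def score(name):
        v = vecs[name]
        dot = sum(w for t, w in v.items() if t in q)
        norm = math.sqrt(sum(w * w for w in v.values())) or 1.0
        return dot / norm
    return sorted(docs, key=score, reverse=True)

ranked = rank("oscillation calcium", models)
```

The point of the ranking is ordering, not filtering: a model matching only the common term "oscillation" still appears in the result list, just below the model that also matches the rarer, higher-IDF term "calcium".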

  13. Simulation of microarray data with realistic characteristics

    Directory of Open Access Journals (Sweden)

    Lehmussola Antti

    2006-07-01

    Full Text Available Abstract Background Microarray technologies have become common tools in biological research. As a result, a need for effective computational methods for data analysis has emerged. Numerous different algorithms have been proposed for analyzing the data. However, an objective evaluation of the proposed algorithms is not possible due to the lack of biological ground truth information. To overcome this fundamental problem, the use of simulated microarray data for algorithm validation has been proposed. Results We present a microarray simulation model which can be used to validate different kinds of data analysis algorithms. The proposed model is unique in the sense that it includes all the steps that affect the quality of real microarray data. These steps include the simulation of biological ground truth data, applying biological and measurement technology specific error models, and finally simulating the microarray slide manufacturing and hybridization. After all these steps are taken into account, the simulated data has realistic biological and statistical characteristics. The applicability of the proposed model is demonstrated by several examples. Conclusion The proposed microarray simulation model is modular and can be used in different kinds of applications. It includes several error models that have been proposed earlier and it can be used with different types of input data. The model can be used to simulate both spotted two-channel and oligonucleotide based single-channel microarrays. All this makes the model a valuable tool, for example, in the validation of data analysis algorithms.
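
    The layered simulation idea (biological ground truth, then a biological error model, then a measurement error model) can be sketched as follows. This is a toy reduction, not the published model; all parameter values are illustrative:

```python
import random
import math

def simulate_two_channel(n_genes=1000, frac_diff=0.1, fold=4.0,
                         bio_sd=0.2, meas_sd=0.1, seed=1):
    """Simulate spotted two-channel microarray log2-ratios in three
    layers: ground truth -> biological noise -> measurement noise."""
    rng = random.Random(seed)
    truth, ratios = [], []
    for g in range(n_genes):
        diff = g < int(frac_diff * n_genes)          # ground-truth label
        true_lr = math.log2(fold) if diff else 0.0   # true log2 ratio
        lr = true_lr + rng.gauss(0.0, bio_sd)        # biological variation
        lr += rng.gauss(0.0, meas_sd)                # hybridization/scanner noise
        truth.append(diff)
        ratios.append(lr)
    return truth, ratios
```

Because the ground-truth labels are known, a downstream detection algorithm can be scored exactly — which is the point the abstract makes about simulated data enabling objective evaluation.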

  14. The Strategies of Modeling in Biology Education

    Science.gov (United States)

    Svoboda, Julia; Passmore, Cynthia

    2013-01-01

    Modeling, like inquiry more generally, is not a single method, but rather a complex suite of strategies. Philosophers of biology, citing the diverse aims, interests, and disciplinary cultures of biologists, argue that modeling is best understood in the context of its epistemic aims and cognitive payoffs. In the science education literature,…

  15. Introduction to stochastic models in biology

    DEFF Research Database (Denmark)

    Ditlevsen, Susanne; Samson, Adeline

    2013-01-01

    This chapter is concerned with continuous time processes, which are often modeled as a system of ordinary differential equations (ODEs). These models assume that the observed dynamics are driven exclusively by internal, deterministic mechanisms. However, real biological systems will always be exp...

  16. Agent-based modeling traction force mediated compaction of cell-populated collagen gels using physically realistic fibril mechanics.

    Science.gov (United States)

    Reinhardt, James W; Gooch, Keith J

    2014-02-01

    Agent-based modeling was used to model collagen fibrils, composed of a string of nodes serially connected by links that act as Hookean springs. Bending mechanics are implemented as torsional springs that act upon each set of three serially connected nodes as a linear function of angular deflection about the central node. These fibrils were evaluated under conditions that simulated axial extension, simple three-point bending and an end-loaded cantilever. The deformation of fibrils under axial loading varied <0.001% from the analytical solution for linearly elastic fibrils. For fibrils between 100 μm and 200 μm in length experiencing small deflections, differences between simulated deflections and their analytical solutions were <1% for fibrils experiencing three-point bending and <7% for fibrils experiencing cantilever bending. When these new rules for fibril mechanics were introduced into a model that allowed for cross-linking of fibrils to form a network and the application of cell traction force, the fibrous network underwent macroscopic compaction and aligned between cells. Further, fibril density increased between cells to a greater extent than that observed macroscopically and appeared similar to matrical tracks that have been observed experimentally in cell-populated collagen gels. This behavior is consistent with observations in previous versions of the model that did not allow for the physically realistic simulation of fibril mechanics. The significance of the torsional spring constant value was then explored to determine its impact on remodeling of the simulated fibrous network. Although a stronger torsional spring constant reduced the degree of quantitative remodeling that occurred, the inclusion of torsional springs in the model was not necessary for the model to reproduce key qualitative aspects of remodeling, indicating that the presence of Hookean springs is essential for this behavior. These results suggest that traction force mediated matrix
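
    The fibril mechanics described above (Hookean axial links, plus torsional springs acting on each triplet of serially connected nodes as a linear function of angular deflection about the central node) can be sketched in 2D as follows. This is a minimal illustration of the two force rules, not the full agent-based model:

```python
import math

def axial_force(p1, p2, rest_len, k_ax):
    """Hookean spring force exerted on node p2 by the link p1-p2."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(dx, dy)
    f = -k_ax * (length - rest_len)       # restoring force magnitude
    return (f * dx / length, f * dy / length)

def torsional_moment(p0, p1, p2, k_tor):
    """Torsional-spring moment about the central node p1, linear in
    the angular deflection from a straight (pi-radian) configuration."""
    a = math.atan2(p0[1] - p1[1], p0[0] - p1[0])
    b = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    theta = (b - a) % (2.0 * math.pi)     # included angle at p1
    return -k_tor * (theta - math.pi)     # zero when the chain is straight
```

Under pure axial extension the force is exactly Hookean (consistent with the <0.001% deviation reported above), and a straight three-node chain experiences zero torsional moment.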

  17. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  18. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram

    2016-01-01

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
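
    One way to combine per-aspect similarities into a single score, as proposed above, is a weighted mean of set-overlap (Jaccard) measures computed per aspect. The aspect keys below are illustrative placeholders, not a defined schema:

```python
def jaccard(a, b):
    """Jaccard similarity of two sets (1.0 for two empty sets)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

def model_similarity(m1, m2, weights):
    """Weighted combination of per-aspect Jaccard similarities.
    m1, m2: dicts mapping aspect name -> set of features
    weights: dict mapping aspect name -> non-negative weight."""
    total = sum(weights.values())
    return sum(w * jaccard(m1.get(aspect, []), m2.get(aspect, []))
               for aspect, w in weights.items()) / total
```

Adjusting the weights is the "flexible, problem-specific" part: a curator searching by annotation can up-weight the annotation aspect, while someone merging kinetic models can up-weight parameters and network structure.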

  19. Bending and Twisting the Embryonic Heart: A Computational Model for C-Looping Based on Realistic Geometry

    Directory of Open Access Journals (Sweden)

    Yunfei Shi

    2014-08-01

    Full Text Available The morphogenetic process of cardiac looping transforms the straight heart tube into a curved tube that resembles the shape of the future four-chambered heart. Although great progress has been made in identifying the molecular and genetic factors involved in looping, the physical mechanisms that drive this process have remained poorly understood. Recent work, however, has shed new light on this complicated problem. After briefly reviewing the current state of knowledge, we propose a relatively comprehensive hypothesis for the mechanics of the first phase of looping, termed c-looping, as the straight heart tube deforms into a c-shaped tube. According to this hypothesis, differential hypertrophic growth in the myocardium supplies the main forces that cause the heart tube to bend ventrally, while regional growth and contraction in the omphalomesenteric veins (primitive atria) and compressive loads exerted by the splanchnopleuric membrane drive rightward torsion. A computational model based on realistic embryonic heart geometry is used to test this hypothesis. The behavior of the model is in reasonable agreement with available experimental data from control and perturbed embryos, offering support for our hypothesis. The results also suggest, however, that several other mechanisms contribute secondarily to normal looping, and we speculate that these mechanisms play backup roles when looping is perturbed. Finally, some outstanding questions are discussed for future study.

  20. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

    In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of the order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights on the ASTRUM statistical approach, with a more in-depth analysis of pros and cons of the order statistics and of the Westinghouse approach in the implementation of this statistical methodology. (authors)
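
    The 124-run figure follows directly from order statistics. A sketch of the calculation, using the standard formula for simultaneously bounding the 95th percentile of p outputs at 95% confidence with the p largest observed values (the single-output case reduces to the familiar Wilks result of 59 runs):

```python
from math import comb

def coverage(n, p_out, gamma=0.95):
    """Confidence that the p_out largest of n Monte Carlo runs bound
    the gamma-quantile of each of p_out outputs (order statistics)."""
    return sum(comb(n, k) * gamma**k * (1.0 - gamma)**(n - k)
               for k in range(0, n - p_out + 1))

def min_runs(p_out, gamma=0.95, beta=0.95):
    """Smallest n achieving confidence beta for p_out outputs."""
    n = p_out
    while coverage(n, p_out, gamma) < beta:
        n += 1
    return n
```

With three acceptance criteria (PCT, LMO, CWO), `min_runs(3)` yields 124, matching the number of simulations executed in the ASTRUM analysis; `min_runs(1)` yields the classical 59.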

  1. Spiral-wave dynamics in ionically realistic mathematical models for human ventricular tissue: the effects of periodic deformation.

    Science.gov (United States)

    Nayak, Alok R; Pandit, Rahul

    2014-01-01

    We carry out an extensive numerical study of the dynamics of spiral waves of electrical activation, in the presence of periodic deformation (PD) in two-dimensional simulation domains, in the biophysically realistic mathematical models of human ventricular tissue due to (a) ten-Tusscher and Panfilov (the TP06 model) and (b) ten-Tusscher, Noble, Noble, and Panfilov (the TNNP04 model). We first consider simulations in cable-type domains, in which we calculate the conduction velocity θ and the wavelength λ of a plane wave; we show that PD leads to a periodic, spatial modulation of θ and a temporally periodic modulation of λ; both these modulations depend on the amplitude and frequency of the PD. We then examine three types of initial conditions for both TP06 and TNNP04 models and show that the imposition of PD leads to a rich variety of spatiotemporal patterns in the transmembrane potential including states with a single rotating spiral (RS) wave, a spiral-turbulence (ST) state with a single meandering spiral, an ST state with multiple broken spirals, and a state SA in which all spirals are absorbed at the boundaries of our simulation domain. We find, for both TP06 and TNNP04 models, that spiral-wave dynamics depends sensitively on the amplitude and frequency of PD and the initial condition. We examine how these different types of spiral-wave states can be eliminated in the presence of PD by the application of low-amplitude pulses by square- and rectangular-mesh suppression techniques. We suggest specific experiments that can test the results of our simulations.
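
    The TP06 and TNNP04 models are far too detailed to reproduce here, but the cable-type measurement of conduction velocity described above can be illustrated with a generic excitable-medium stand-in. The sketch below uses a FitzHugh-Nagumo cable (dimensionless units, illustrative parameters) and estimates the conduction velocity from activation times at two interior sites:

```python
def fhn_cable_cv(n=150, dx=0.5, dt=0.01, steps=15000,
                 D=1.0, eps=0.08, a=0.7, b=0.8):
    """Plane-wave conduction velocity in a 1D FitzHugh-Nagumo cable
    (a toy stand-in for the ionically detailed TP06/TNNP04 models)."""
    v = [-1.2] * n                 # membrane variable at rest
    w = [-0.62] * n                # recovery variable at rest
    for i in range(8):             # supra-threshold stimulus at the left end
        v[i] = 2.0
    t_act = [None] * n
    for step in range(steps):
        lap = [0.0] * n
        for i in range(1, n - 1):
            lap[i] = (v[i-1] - 2.0*v[i] + v[i+1]) / dx**2
        lap[0] = 2.0 * (v[1] - v[0]) / dx**2        # no-flux boundaries
        lap[-1] = 2.0 * (v[-2] - v[-1]) / dx**2
        for i in range(n):
            dv = v[i] - v[i]**3 / 3.0 - w[i] + D * lap[i]
            dw = eps * (v[i] + a - b * w[i])
            v[i] += dt * dv
            w[i] += dt * dw
            if t_act[i] is None and v[i] > 0.0:
                t_act[i] = step * dt               # first upstroke crossing
    i1, i2 = 40, 120
    if t_act[i1] is None or t_act[i2] is None:
        return float("nan")
    return (i2 - i1) * dx / (t_act[i2] - t_act[i1])
```

Imposing a periodic deformation would amount to modulating dx in time; the abstract's observation is that this modulates both the measured velocity and the wavelength of the plane wave.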

  2. Spiral-Wave Dynamics in Ionically Realistic Mathematical Models for Human Ventricular Tissue: The Effects of Periodic Deformation

    Directory of Open Access Journals (Sweden)

    Alok Ranjan Nayak

    2014-06-01

    Full Text Available We carry out an extensive numerical study of the dynamics of spiral waves of electrical activation, in the presence of periodic deformation (PD) in two-dimensional simulation domains, in the biophysically realistic mathematical models of human ventricular tissue due to (a) ten-Tusscher and Panfilov (the TP06 model) and (b) ten-Tusscher, Noble, Noble, and Panfilov (the TNNP04 model). We first consider simulations in cable-type domains, in which we calculate the conduction velocity $CV$ and the wavelength $\lambda$ of a plane wave; we show that PD leads to a periodic, spatial modulation of $CV$ and a temporally periodic modulation of $\lambda$; both these modulations depend on the amplitude and frequency of the PD. We then examine three types of initial conditions for both TP06 and TNNP04 models and show that the imposition of PD leads to a rich variety of spatiotemporal patterns in the transmembrane potential, including states with a single rotating spiral (RS) wave, a spiral-turbulence (ST) state with a single meandering spiral, an ST state with multiple broken spirals, and a state SA in which all spirals are absorbed at the boundaries of our simulation domain. We find, for both TP06 and TNNP04 models, that spiral-wave dynamics depends sensitively on the amplitude and frequency of PD and the initial condition. We examine how these different types of spiral-wave states can be eliminated in the presence of PD by the application of low-amplitude pulses on square and rectangular control meshes. We suggest specific experiments that can test the results of our simulations.

  3. Experimental Section: On the magnetic field distribution generated by a dipolar current source situated in a realistically shaped compartment model of the head

    NARCIS (Netherlands)

    Meijs, J.W.H.; Bosch, F.G.C.; Peters, M.J.; Lopes da Silva, F.H.

    1987-01-01

    The magnetic field distribution around the head is simulated using a realistically shaped compartment model of the head. The model is based on magnetic resonance images. The 3 compartments describe the brain, the skull and the scalp. The source is represented by a current dipole situated in the

  4. Skin dose in longitudinal and transverse linac-MRIs using Monte Carlo and realistic 3D MRI field models.

    Science.gov (United States)

    Keyvanloo, A; Burke, B; Warkentin, B; Tadic, T; Rathee, S; Kirkby, C; Santos, D M; Fallone, B G

    2012-10-01

    The magnetic fields of linac-MR systems modify the path of contaminant electrons in photon beams, which alters patient skin dose. To accurately quantify the magnitude of changes in skin dose, the authors use Monte Carlo calculations that incorporate realistic 3D magnetic field models of longitudinal and transverse linac-MR systems. The finite element method (FEM) is used to generate complete 3D magnetic field maps for 0.56 T longitudinal and transverse linac-MR magnet assemblies, as well as for representative 0.5 and 1.0 T Helmholtz MRI systems. EGSnrc simulations implementing these 3D magnetic fields are performed. The geometry for the BEAMnrc simulations incorporates the Varian 600C 6 MV linac, magnet poles, the yoke, and the magnetic shields of the linac-MRIs. Resulting phase-space files are used to calculate the central axis percent depth-doses in a water phantom and 2D skin dose distributions for 70 μm entrance and exit layers using DOSXYZnrc. For comparison, skin doses are also calculated in the absence of a magnetic field, and using a 1D magnetic field with an unrealistically large fringe field. The effects of photon field size, air gap (longitudinal configuration), and angle of obliquity (transverse configuration) are also investigated. Realistic modeling of the 3D magnetic fields shows that fringe fields decay rapidly and have a very small magnitude at the linac head. As a result, longitudinal linac-MR systems mostly confine contaminant electrons that are generated in the air gap and have an insignificant effect on electrons produced further upstream. The increase in the skin dose for the longitudinal configuration compared to the zero B-field case varies from ∼1% to ∼14% for air gaps of 5-31 cm, respectively. (All dose changes are reported as a percentage of Dmax.) The increase is also field-size dependent, ranging from ∼3% at 20 × 20 cm² to ∼11% at 5 × 5 cm². The small changes in skin dose are in contrast to significant increases that are
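
    The confinement of contaminant electrons by a longitudinal field can be illustrated with a back-of-envelope gyroradius estimate. The sketch below uses standard constants and a representative kinetic energy (1 MeV is an assumption, chosen as typical for contaminant electrons in a 6 MV beam); at 0.56 T the gyroradius is on the order of millimetres, so the electrons spiral tightly along the field lines:

```python
import math

def gyroradius_mm(kinetic_MeV, B_tesla):
    """Relativistic electron gyroradius r = p/(qB) in millimetres,
    worst case with all momentum perpendicular to B."""
    mc2 = 0.511                                          # rest energy, MeV
    pc = math.sqrt(kinetic_MeV**2 + 2.0 * kinetic_MeV * mc2)   # MeV
    p = pc * 1e6 * 1.602e-19 / 2.998e8                   # kg m/s
    r = p / (1.602e-19 * B_tesla)                        # metres
    return r * 1000.0
```

Even at 6 MeV the radius stays below ~5 cm, consistent with the abstract's finding that the longitudinal configuration largely confines air-gap electrons.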

  5. Functional model of biological neural networks.

    Science.gov (United States)

    Lo, James Ting-Ho

    2010-12-01

    A functional model of biological neural networks, called temporal hierarchical probabilistic associative memory (THPAM), is proposed in this paper. THPAM comprises functional models of dendritic trees for encoding inputs to neurons, a first type of neuron for generating spike trains, a second type of neuron for generating graded signals to modulate neurons of the first type, supervised and unsupervised Hebbian learning mechanisms for easy learning and retrieval, an arrangement of dendritic trees for maximizing generalization, hardwiring for rotation-translation-scaling invariance, and feedback connections with different delay durations for neurons to make full use of present and past information generated by neurons in the same and higher layers. These functional models and their processing operations have many functions of biological neural networks that have not been achieved by other models in the open literature and provide logically coherent answers to many long-standing neuroscientific questions. However, biological justifications of these functional models and their processing operations are required for THPAM to qualify as a macroscopic model (or low-order approximation) of biological neural networks.

  6. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

    There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input, (2) the iModel Tool, a platform for users to upload their own models to compose, and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well.

  7. Hemodynamic Changes Caused by Flow Diverters in Rabbit Aneurysm Models: Comparison of Virtual and Realistic FD Deployments Based on Micro-CT Reconstruction

    Science.gov (United States)

    Fang, Yibin; Yu, Ying; Cheng, Jiyong; Wang, Shengzhang; Wang, Kuizhong; Liu, Jian-Min; Huang, Qinghai

    2013-01-01

    Adjusting hemodynamics via flow diverter (FD) implantation is emerging as a novel method of treating cerebral aneurysms. However, most previous FD-related hemodynamic studies were based on virtual FD deployment, which may produce different hemodynamic outcomes than realistic (in vivo) FD deployment. We compared hemodynamics between virtual FD and realistic FD deployments in rabbit aneurysm models using computational fluid dynamics (CFD) simulations. FDs were implanted for aneurysms in 14 rabbits. Vascular models based on rabbit-specific angiograms were reconstructed for CFD studies. Real FD configurations were reconstructed based on micro-CT scans after sacrifice, while virtual FD configurations were constructed with SolidWorks software. Hemodynamic parameters before and after FD deployment were analyzed. According to the metal coverage (MC) of implanted FDs calculated based on micro-CT reconstruction, 14 rabbits were divided into two groups (A, MC >35%; B, MC 0.05). The normalized mean WSS in Group A after realistic FD implantation was significantly lower than that of Group B. All parameters in Group B exhibited no significant difference between realistic and virtual FDs. This study confirmed MC-correlated differences in hemodynamic parameters between realistic and virtual FD deployment. PMID:23823503

  8. Creation of a realistic model for removal of a metallic corneal foreign body for less than $75.

    Directory of Open Access Journals (Sweden)

    Sayegh, Julie Sami

    2017-01-01

    Full Text Available Metallic corneal foreign bodies (MCFBs) are one of the most common causes of ocular injury presenting to the emergency department. Delays in removal, or forceful attempts to remove the MCFB, can lead to infection, further injury to the eye, and worsening of vision. In order to prevent these complications, it is imperative for the medical provider to properly master this technique. As current trends in simulation become more focused on patient safety, task-trainers can provide an invaluable learning experience for residents, medical students and physicians. Models made from bovine eyes, agar plates, gelatin, and corneas created from glass and paraffin wax have previously been created. One study also used a rubber glove filled with water to simulate intraocular pressure measurement with a Tonopen. However, the use of corneas created from ballistics gel for MCFB removal and intraocular pressure measurement has not been studied. We propose a realistic, sustainable, cost-effective MCFB task-trainer to introduce the fundamental skills required for MCFB removal and measurement of intraocular pressure with a Tonopen. A pilot survey study performed on medical students and emergency medicine resident physicians showed an increase in comfort levels performing both MCFB removal and measurement of intraocular pressure with a Tonopen after using this task-trainer.

  9. Calculations of the response functions of Bonner spheres with a spherical 3He proportional counter using a realistic detector model

    International Nuclear Information System (INIS)

    Wiegel, B.; Alevra, A.V.; Siebert, B.R.L.

    1994-11-01

    A realistic geometry model of a Bonner sphere system with a spherical 3He-filled proportional counter and 12 polyethylene moderating spheres with diameters ranging from 7.62 cm (3'') to 45.72 cm (18'') is introduced. The MCNP Monte Carlo computer code is used to calculate the responses of this Bonner sphere system to monoenergetic neutrons in the energy range between 1 meV and 20 MeV. The relative uncertainties of the responses due to the Monte Carlo calculations are less than 1% for spheres up to 30.48 cm (12'') in diameter and less than 2% for the 15'' and 18'' spheres. Resonances in the carbon cross section are seen as significant structures in the response functions. Additional calculations were made to study the influence of the 3He number density and the polyethylene mass density on the response, as well as the angular dependence of the Bonner sphere system. The calculated responses can be adjusted to a large set of calibration measurements with only a single fit factor common to all sphere diameters and energies.
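
    The single-factor adjustment mentioned in the last sentence amounts to a one-parameter least-squares fit of calculated responses to calibration measurements. A sketch, with purely synthetic numbers:

```python
def single_fit_factor(measured, calculated):
    """Least-squares scale factor f minimizing sum_i (m_i - f*c_i)^2,
    shared across all sphere diameters and energies."""
    num = sum(m * c for m, c in zip(measured, calculated))
    den = sum(c * c for c in calculated)
    return num / den
```

Because one factor must serve every sphere and energy, a good fit is strong evidence that the simulated response functions have the correct shape and only the overall normalization needs adjustment.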

  10. Magnetic drug targeting through a realistic model of human tracheobronchial airways using computational fluid and particle dynamics.

    Science.gov (United States)

    Pourmehran, Oveis; Gorji, Tahereh B; Gorji-Bandpy, Mofid

    2016-10-01

    Magnetic drug targeting (MDT) is a local drug delivery system which aims to concentrate a pharmacological agent at its site of action in order to minimize undesired side effects due to systemic distribution in the organism. Using magnetic drug particles under the influence of an external magnetic field, the drug particles are navigated toward the target region. Herein, computational fluid dynamics was used to simulate the air flow and magnetic particle deposition in a realistic human airway geometry obtained from CT scan images. Using discrete phase modeling and one-way coupling of particle-fluid phases, a Lagrangian approach for particle tracking in the presence of an external non-uniform magnetic field was applied. Polystyrene (PMS40) particles were utilized as the magnetic drug carrier. A parametric study was conducted, and the influence of particle diameter, magnetic source position, magnetic field strength and inhalation condition on the particle transport pattern and deposition efficiency (DE) was reported. Overall, the results show considerable promise of MDT in deposition enhancement at the target region (i.e., left lung). However, the positive effect of increasing particle size on DE enhancement was evident at smaller magnetic field strengths (Mn ≤ 1.5 T), whereas, at higher applied magnetic field strengths, increasing particle size has an inverse effect on DE. This implies that for efficient MDT in the human respiratory system, an optimal combination of magnetic drug carrier characteristics and magnetic field strength has to be achieved.
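
    A crude sketch of the one-way-coupled Lagrangian tracking idea, with Stokes drag toward the air velocity and a magnetic pull toward a point magnet. The real study uses a CT-derived geometry and a non-uniform field; every number below (particle size, force magnitude, magnet position) is an illustrative assumption:

```python
import math

def track_particle(x0, y0, vx_air, d_p=5e-6, rho=1050.0,
                   magnet=(0.0, 0.01), f_mag=5e-12,
                   mu=1.8e-5, dt=1e-5, steps=2000):
    """One-way-coupled Lagrangian tracking in 2D: Stokes drag relaxes
    the particle toward the air velocity while a constant-magnitude
    force (a stand-in for the real gradient force) pulls it toward
    the magnet position. Returns the final (x, y)."""
    m = rho * math.pi / 6.0 * d_p**3          # particle mass
    x, y, vx, vy = x0, y0, 0.0, 0.0
    for _ in range(steps):
        fdx = 3.0 * math.pi * mu * d_p * (vx_air - vx)   # Stokes drag
        fdy = 3.0 * math.pi * mu * d_p * (0.0 - vy)
        dx, dy = magnet[0] - x, magnet[1] - y
        r = math.hypot(dx, dy)
        fmx, fmy = f_mag * dx / r, f_mag * dy / r        # pull toward magnet
        vx += dt * (fdx + fmx) / m
        vy += dt * (fdy + fmy) / m
        x += dt * vx
        y += dt * vy
    return x, y
```

With airflow along x and the magnet placed above the particle, the trajectory is carried downstream while drifting toward the magnet — the qualitative mechanism behind the deposition enhancement reported above.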

  11. Notions of similarity for systems biology models.

    Science.gov (United States)

    Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knüpfer, Christian; Liebermeister, Wolfram; Waltemath, Dagmar

    2018-01-01

    Systems biology models are rapidly increasing in complexity, size and numbers. When building large models, researchers rely on software tools for the retrieval, comparison, combination and merging of models, as well as for version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of 'similarity' may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here we survey existing methods for the comparison of models, introduce quantitative measures for model similarity, and discuss potential applications of combined similarity measures. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on a combination of different model aspects. The six aspects that we define as potentially relevant for similarity are underlying encoding, references to biological entities, quantitative behaviour, qualitative behaviour, mathematical equations and parameters, and network structure. We argue that future similarity measures will benefit from combining these model aspects in flexible, problem-specific ways to mimic users' intuition about model similarity, and to support complex model searches in databases. © The Author 2016. Published by Oxford University Press.

  12. A realistic large-scale model of the cerebellum granular layer predicts circuit spatio-temporal filtering properties

    Directory of Open Access Journals (Sweden)

    Sergio Solinas

    2010-05-01

    Full Text Available The way the cerebellar granular layer transforms incoming mossy fiber signals into new spike patterns to be relayed to Purkinje cells is not yet clear. Here, a realistic computational model of the granular layer was developed and used to address four main functional hypotheses: center-surround organization, time-windowing, high-pass filtering in responses to spike bursts, and coherent oscillations in response to diffuse random activity. The model network was activated using patterns inspired by those recorded in vivo. Burst stimulation of a small mossy fiber bundle resulted in granule cell bursts delimited in time (time windowing) and space (center-surround) by network inhibition. This burst-burst transmission showed marked frequency dependence, configuring a high-pass filter with cut-off frequency around 100 Hz. The contrast between center and surround properties was regulated by the excitatory-inhibitory balance. The stronger excitation made the center more responsive to 10-50 Hz input frequencies and enhanced the granule cell output (with spikes occurring earlier and with higher frequency and number compared to the surround). Finally, over a certain level of mossy fiber background activity, the circuit generated coherent oscillations in the theta-frequency band. All these processes were fine-tuned by NMDA and GABA-A receptor activation and neurotransmitter vesicle cycling in the cerebellar glomeruli. This model shows that available knowledge on cellular mechanisms is sufficient to unify the main functional hypotheses on the cerebellum granular layer and suggests that this network can behave as an adaptable spatio-temporal filter coordinated by theta-frequency oscillations.
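
    The reported high-pass behaviour (cut-off near 100 Hz) can be pictured with a first-order filter magnitude response. This is a conceptual analogy only, not the network model itself:

```python
import math

def highpass_gain(f_hz, f_cut=100.0):
    """First-order high-pass magnitude response |H(f)| = (f/fc) /
    sqrt(1 + (f/fc)^2): an analogy for burst-burst transmission
    with a cut-off near 100 Hz."""
    x = f_hz / f_cut
    return x / math.sqrt(1.0 + x * x)
```

Bursts paced well below the cut-off are strongly attenuated, while those near or above it pass almost unchanged — the qualitative frequency dependence the model network exhibits.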

  13. Ultrafast spectroscopy of model biological membranes

    NARCIS (Netherlands)

    Ghosh, Avishek

    2009-01-01

    In this PhD thesis, I have described the novel time-resolved sum-frequency generation (TR-SFG) spectroscopic technique that I developed during the course of my PhD research and used to study the ultrafast vibrational, structural and orientational dynamics of water molecules at model biological

  14. Prospective Tests on Biological Models of Acupuncture

    Directory of Open Access Journals (Sweden)

    Charles Shang

    2009-01-01

    Full Text Available The biological effects of acupuncture include the regulation of a variety of neurohumoral factors and growth control factors. In science, models or hypotheses with confirmed predictions are considered more convincing than models solely based on retrospective explanations. Literature review showed that two biological models of acupuncture have been prospectively tested with independently confirmed predictions: The neurophysiology model on the long-term effects of acupuncture emphasizes the trophic and anti-inflammatory effects of acupuncture. Its prediction on the peripheral effect of endorphin in acupuncture has been confirmed. The growth control model encompasses the neurophysiology model and suggests that a macroscopic growth control system originates from a network of organizers in embryogenesis. The activity of the growth control system is important in the formation, maintenance and regulation of all the physiological systems. Several phenomena of acupuncture such as the distribution of auricular acupuncture points, the long-term effects of acupuncture and the effect of multimodal non-specific stimulation at acupuncture points are consistent with the growth control model. The following predictions of the growth control model have been independently confirmed by research results in both acupuncture and conventional biomedical sciences: (i) Acupuncture has extensive growth control effects. (ii) Singular points and separatrices exist in morphogenesis. (iii) Organizers have high electric conductance, high current density and high density of gap junctions. (iv) A high density of gap junctions is distributed as separatrices or boundaries at the body surface after early embryogenesis. (v) Many acupuncture points are located at transition points or boundaries between different body domains or muscles, coinciding with the connective tissue planes. (vi) Some morphogens and organizers continue to function after embryogenesis. Current acupuncture research suggests a

  15. Agent-based modelling in synthetic biology.

    Science.gov (United States)

    Gorochowski, Thomas E

    2016-11-30

Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span from the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).
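The core agent-based idea in the review, collective behaviour emerging from local per-cell rules, can be sketched in a few lines. Everything below (cell count, rate constants, threshold) is an assumed toy setup, not code or parameters from the article: cells on a ring secrete a diffusing quorum-sensing signal and irreversibly switch "on" once the local concentration crosses a threshold.

```python
import numpy as np

# Toy quorum-sensing agent-based model (illustrative assumptions throughout):
# every cell secretes signal at a constant rate, the signal diffuses to
# neighbours and degrades, and a cell switches on above a threshold.
n_cells, steps = 50, 200
signal = np.zeros(n_cells)
on = np.zeros(n_cells, dtype=bool)

for _ in range(steps):
    signal += 0.05                        # constitutive secretion by each cell
    # crude diffusion: average each cell's signal with its two ring neighbours
    signal = 0.25 * np.roll(signal, 1) + 0.5 * signal + 0.25 * np.roll(signal, -1)
    signal *= 0.9                         # first-order degradation
    on |= signal > 0.4                    # irreversible threshold switch

print(bool(on.all()))  # at this density, every cell eventually switches on
```

The population-level switch is not written anywhere in the rules; it emerges because secretion outpaces degradation at sufficient cell density, which is exactly the kind of design question such simulations let one explore.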

  16. Bio-heat transfer model of electroconvulsive therapy: Effect of biological properties on induced temperature variation.

    Science.gov (United States)

    de Oliveira, Marilia M; Wen, Paul; Ahfock, Tony

    2016-08-01

A realistic human head model consisting of six tissue layers was constructed to investigate the behavior of the temperature profile and its magnitude when applying electroconvulsive therapy stimulation with different biological properties. The thermo-electrical model was built from the bio-heat transfer equation and the Laplace equation. Three different electrode montages were analyzed, as well as the influence of blood perfusion, metabolic heat, and the electrical and thermal conductivity of the scalp. The effect of including a fat layer was also investigated. The results showed that the temperature increase is inversely proportional to the increase in electrical and thermal conductivity. Furthermore, the inclusion of blood perfusion slightly lowers the peak temperature. Finally, including fat is highly recommended in order to obtain more realistic results from thermo-electrical models.
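The two governing equations named in the abstract have standard forms; as a hedged sketch (generic textbook formulation, not transcribed from the paper), the Laplace equation for the stimulation potential and the Pennes bio-heat equation with Joule heating read:

```latex
\nabla \cdot (\sigma \nabla \phi) = 0, \qquad
\rho c \,\frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T)
  + \rho_b c_b \omega_b (T_b - T) + Q_m + \sigma \lvert \nabla \phi \rvert^{2}
```

Here σ and k are the electrical and thermal conductivities, ω_b the blood perfusion rate, Q_m the metabolic heat source, and σ|∇φ|² the Joule heating term. The reported inverse dependence of the temperature rise on the two conductivities and the slight cooling from perfusion are consistent with the roles of these terms.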

  17. From Biology to Mathematical Models and Back: Teaching Modeling to Biology Students, and Biology to Math and Engineering Students

    Science.gov (United States)

    Chiel, Hillel J.; McManus, Jeffrey M.; Shaw, Kendrick M.

    2010-01-01

    We describe the development of a course to teach modeling and mathematical analysis skills to students of biology and to teach biology to students with strong backgrounds in mathematics, physics, or engineering. The two groups of students have different ways of learning material and often have strong negative feelings toward the area of knowledge…

  18. Numerical investigation of inspiratory airflow in a realistic model of the human tracheobronchial airways and a comparison with experimental results.

    Science.gov (United States)

    Elcner, Jakub; Lizal, Frantisek; Jedelsky, Jan; Jicha, Miroslav; Chovancova, Michaela

    2016-04-01

In this article, the results of numerical simulations using computational fluid dynamics (CFD) and a comparison with experiments performed with phase Doppler anemometry are presented. The simulations and experiments were conducted in a realistic model of the human airways, which comprised the throat, trachea and tracheobronchial tree up to the fourth generation. A full inspiration/expiration breathing cycle was used with tidal volumes of 0.5 and 1 L, corresponding to a sedentary regime and a deep breath, respectively. The length of the entire breathing cycle was 4 s, with inspiration and expiration each lasting 2 s. As a boundary condition for the CFD simulations, the experimentally obtained flow rate distribution in 10 terminal airways was used, with zero pressure resistance at the throat inlet. The CCM+ CFD code (Adapco) was used with an SST k-ω low-Reynolds-number RANS model. The total number of polyhedral control volumes was 2.6 million, with a time step of 0.001 s. Comparisons were made at several points in eight cross sections selected according to the experiments in the trachea and the left and right bronchi. The results agree well with the experiments involving the oscillation (temporal relocation) of flow structures in the majority of the cross sections and individual local positions. Velocity field simulation in several cross sections shows a very unstable flow field, which originates in the tracheal laryngeal jet and propagates far downstream with the formation of separation zones in both left and right airways. The RANS simulation agrees with the experiments in almost all the cross sections and shows unstable local flow structures and a quantitatively acceptable solution for the time-averaged flow field.

  19. Modeling human risk: Cell & molecular biology in context

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

It is anticipated that early in the next century manned missions into outer space will occur, with a mission to Mars scheduled between 2015 and 2020. However, before such missions can be undertaken, a realistic estimation of the potential risks to the flight crews is required. One of the uncertainties remaining in this risk estimation is that posed by the effects of exposure to the radiation environment of outer space. Although the composition of this environment is fairly well understood, the biological effects arising from exposure to it are not. The reasons for this are three-fold: (1) A small but highly significant component of the radiation spectrum in outer space consists of highly charged, high energy (HZE) particles which are not routinely experienced on earth, and for which there are insufficient data on biological effects; (2) Most studies on the biological effects of radiation to date have been high-dose, high dose-rate, whereas in space, with the exception of solar particle events, radiation exposures will be low-dose, low dose-rate; (3) Although it has been established that the virtual absence of gravity in space has a profound effect on human physiology, it is not clear whether these effects will act synergistically with those of radiation exposure. A select panel will evaluate the use of experiments and models to accurately predict the risks associated with exposure to HZE particles. Topics of research include cellular and tissue response, health effects associated with radiation damage, model animal systems, and critical markers of radiation response.

  20. Modeling human risk: Cell ampersand molecular biology in context

    International Nuclear Information System (INIS)

    1997-06-01

It is anticipated that early in the next century manned missions into outer space will occur, with a mission to Mars scheduled between 2015 and 2020. However, before such missions can be undertaken, a realistic estimation of the potential risks to the flight crews is required. One of the uncertainties remaining in this risk estimation is that posed by the effects of exposure to the radiation environment of outer space. Although the composition of this environment is fairly well understood, the biological effects arising from exposure to it are not. The reasons for this are three-fold: (1) A small but highly significant component of the radiation spectrum in outer space consists of highly charged, high energy (HZE) particles which are not routinely experienced on earth, and for which there are insufficient data on biological effects; (2) Most studies on the biological effects of radiation to date have been high-dose, high dose-rate, whereas in space, with the exception of solar particle events, radiation exposures will be low-dose, low dose-rate; (3) Although it has been established that the virtual absence of gravity in space has a profound effect on human physiology, it is not clear whether these effects will act synergistically with those of radiation exposure. A select panel will evaluate the use of experiments and models to accurately predict the risks associated with exposure to HZE particles. Topics of research include cellular and tissue response, health effects associated with radiation damage, model animal systems, and critical markers of radiation response.

  1. Structural Identifiability of Dynamic Systems Biology Models.

    Science.gov (United States)

    Villaverde, Alejandro F; Barreiro, Antonio; Papachristodoulou, Antonis

    2016-10-01

    A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas.

  2. Realistic Visualization of Virtual Views

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

that can be impractical and sometimes impossible. In addition, the artificial nature of data often makes visualized virtual scenarios not realistic enough. Not realistic in the sense that a synthetic scene is easy to discriminate visually from a natural scene. A new field of research has consequently...... developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high fidelity representation of virtual scenarios while easing modeling and physical phenomena simulation. In particular, realism is achieved by the transfer to the novel view of all the physical...... phenomena captured in the reference photographs, (i.e. the transfer of photographic-realism). An overview of most prominent approaches in realistic virtual view synthesis will be presented and briefly discussed. Applications of proposed methods to visual survey, virtual cinematography, as well as mobile...

  3. Study of cyclic and steady particle motion in a realistic human airway model using phase-Doppler anemometry

    Science.gov (United States)

    Jedelský, Jan; Lízal, František; Jícha, Miroslav

    2012-04-01

Transport and deposition of particles in human airways has been of research interest for many years. Various experimental methods such as constant temperature anemometry, particle image velocimetry and laser-Doppler based techniques were employed for the study of aerosol transport in the past. We use a Phase-Doppler Particle Analyser (P/DPA) for time-resolved size and velocity measurement of liquid aerosol particles in the size range 1 to 8 μm. The di-2-ethylhexyl sebacate (DEHS) particles were produced by a condensation monodisperse aerosol generator. A thin-wall transparent model of human airways with non-symmetric bifurcations and non-planar geometry, containing parts from the throat to the 3rd-4th generation of bronchi, was fabricated for the study. Several cyclic (sinusoidal) breathing regimes were simulated using a pneumatic breathing mechanism. Analogous steady-flow regimes were also investigated and used for comparison. An analysis of the particle velocity data was performed with the aim of gaining a deeper understanding of the transport phenomena in the realistic bifurcating airway system. The flow of particles of different sizes in the range 1 - 10 μm was found to differ slightly only at extremely high Stokes numbers. Differences in steady and cyclic turbulence intensities were documented in the paper. Systematically higher turbulence intensity was found for cyclic flows, mainly in the expiration breathing phase. Negligible differences were found in the behaviour of different particle size classes in the inspected range 1 to 8 μm. The possibility of estimating air-flow velocity spectra from the P/DPA data is discussed.
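The finding that particle sizes behave almost identically is what a quick Stokes-number estimate would predict. The calculation below uses assumed, order-of-magnitude input values (DEHS density, air viscosity, a tracheal-scale velocity and diameter), not data from the paper:

```python
# Particle Stokes number: Stk = rho_p * d_p**2 * U / (18 * mu * D).
# All inputs are assumed order-of-magnitude figures for illustration.
rho_p = 912.0     # DEHS density, kg/m^3
mu = 1.8e-5       # dynamic viscosity of air, Pa*s
U = 1.0           # characteristic airway velocity, m/s
D = 0.015         # characteristic diameter, m (roughly tracheal)

stokes = {}
for d_p in (1e-6, 8e-6):  # the two ends of the measured size range
    stokes[d_p] = rho_p * d_p**2 * U / (18 * mu * D)
    print(f"{d_p * 1e6:.0f} um: Stk = {stokes[d_p]:.1e}")
```

Both values come out well below unity, i.e. even the largest particles in the 1-8 μm range largely follow the air flow, which is consistent with the negligible size-class differences reported above.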

  4. Study of cyclic and steady particle motion in a realistic human airway model using phase-Doppler anemometry

    Directory of Open Access Journals (Sweden)

    Jícha Miroslav

    2012-04-01

Full Text Available Transport and deposition of particles in human airways has been of research interest for many years. Various experimental methods such as constant temperature anemometry, particle image velocimetry and laser-Doppler based techniques were employed for the study of aerosol transport in the past. We use a Phase-Doppler Particle Analyser (P/DPA) for time-resolved size and velocity measurement of liquid aerosol particles in the size range 1 to 8 μm. The di-2-ethylhexyl sebacate (DEHS) particles were produced by a condensation monodisperse aerosol generator. A thin-wall transparent model of human airways with non-symmetric bifurcations and non-planar geometry, containing parts from the throat to the 3rd-4th generation of bronchi, was fabricated for the study. Several cyclic (sinusoidal) breathing regimes were simulated using a pneumatic breathing mechanism. Analogous steady-flow regimes were also investigated and used for comparison. An analysis of the particle velocity data was performed with the aim of gaining a deeper understanding of the transport phenomena in the realistic bifurcating airway system. The flow of particles of different sizes in the range 1 – 10 μm was found to differ slightly only at extremely high Stokes numbers. Differences in steady and cyclic turbulence intensities were documented in the paper. Systematically higher turbulence intensity was found for cyclic flows, mainly in the expiration breathing phase. Negligible differences were found in the behaviour of different particle size classes in the inspected range 1 to 8 μm. The possibility of estimating air-flow velocity spectra from the P/DPA data is discussed.

  5. Automatic skull segmentation from MR images for realistic volume conductor models of the head: Assessment of the state-of-the-art

    DEFF Research Database (Denmark)

    Nielsen, Jesper Duemose; Madsen, Kristoffer Hougaard; Puonti, Oula

    2018-01-01

Anatomically realistic volume conductor models of the human head are important for accurate forward modeling of the electric field during transcranial brain stimulation (TBS), electro- (EEG) and magnetoencephalography (MEG). In particular, the skull compartment exerts a strong influence...... local defects. In contrast to FSL BET2, the SPM12-based segmentation with extended spatial tissue priors and the BrainSuite-based segmentation provide coarse reconstructions of the vertebrae, enabling the construction of volume conductor models that include the neck. We exemplarily demonstrate

  6. Institute for Multiscale Modeling of Biological Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Paulaitis, Michael E; Garcia-Moreno, Bertrand; Lenhoff, Abraham

    2009-12-26

    The Institute for Multiscale Modeling of Biological Interactions (IMMBI) has two primary goals: Foster interdisciplinary collaborations among faculty and their research laboratories that will lead to novel applications of multiscale simulation and modeling methods in the biological sciences and engineering; and Building on the unique biophysical/biology-based engineering foundations of the participating faculty, train scientists and engineers to apply computational methods that collectively span multiple time and length scales of biological organization. The success of IMMBI will be defined by the following: Size and quality of the applicant pool for pre-doctoral and post-doctoral fellows; Academic performance; Quality of the pre-doctoral and post-doctoral research; Impact of the research broadly and to the DOE (ASCR program) mission; Distinction of the next career step for pre-doctoral and post-doctoral fellows; and Faculty collaborations that result from IMMBI activities. Specific details about accomplishments during the three years of DOE support for IMMBI have been documented in Annual Progress Reports (April 2005, June 2006, and March 2007) and a Report for a National Academy of Sciences Review (October 2005) that were submitted to DOE on the dates indicated. An overview of these accomplishments is provided.

  7. Long term contaminant migration and impacts from uranium mill tailings. Comparison of computer models using a realistic dataset

    Energy Technology Data Exchange (ETDEWEB)

Camus, H. [CEA Centre d'Etudes Nucleaires de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire] [and others]

    1996-08-01

    This is the final report of the Working Group describing: the enhancement of the previously devised V1 scenario to produce a V2 scenario which includes more detailed source term and other site specific data; the application of models in deterministic and probabilistic mode to calculate contaminant concentrations in biosphere media, and related radiation doses, contaminant intakes and health risks, including estimates of uncertainties; the comparison and analysis of the resulting calculations. A series of scenarios was developed based on data provided by Working Group members from a range of actual tailings disposal sites, culminating in the V2.2 and V2.3 scenarios. The V2.2 and V2.3 scenarios are identical in all respects, except that the V2.2 considers radioactive (U-238 chain) contaminants, whilst the V2.3 considers stable elements (As, Ni, Pb). Since the scenarios are based on data obtained from a range of actual sites, they should be considered to be generically realistic rather than representative of a particular single site. In both scenarios, the contaminants of interest are assumed to be released in leachate from a tailings pile into an underlying aquifer. They are transported in groundwater through the aquifer to a well. Water is abstracted from the well and used for: watering beef cattle; human consumption; and irrigating leafy vegetables. The beef and leafy vegetables are consumed by humans living in the area. The same contaminants are also released into the atmosphere due to the wind erosion of the pile and then deposited upon the soil, pasture and leafy vegetables. In addition, for the V2.2 scenario, Rn-222 is assumed to be released to atmosphere from the pile. Unlike the V1 scenario, no consideration is given to surface water exposure pathways. Results show that there is exceedingly good agreement between participants' deterministic and probabilistic estimates of total dose or intake. They agree within a factor of two to three for both scenarios

  8. Long term contaminant migration and impacts from uranium mill tailings. Comparison of computer models using a realistic dataset

    International Nuclear Information System (INIS)

    Camus, H.

    1996-08-01

    This is the final report of the Working Group describing: the enhancement of the previously devised V1 scenario to produce a V2 scenario which includes more detailed source term and other site specific data; the application of models in deterministic and probabilistic mode to calculate contaminant concentrations in biosphere media, and related radiation doses, contaminant intakes and health risks, including estimates of uncertainties; the comparison and analysis of the resulting calculations. A series of scenarios was developed based on data provided by Working Group members from a range of actual tailings disposal sites, culminating in the V2.2 and V2.3 scenarios. The V2.2 and V2.3 scenarios are identical in all respects, except that the V2.2 considers radioactive (U-238 chain) contaminants, whilst the V2.3 considers stable elements (As, Ni, Pb). Since the scenarios are based on data obtained from a range of actual sites, they should be considered to be generically realistic rather than representative of a particular single site. In both scenarios, the contaminants of interest are assumed to be released in leachate from a tailings pile into an underlying aquifer. They are transported in groundwater through the aquifer to a well. Water is abstracted from the well and used for: watering beef cattle; human consumption; and irrigating leafy vegetables. The beef and leafy vegetables are consumed by humans living in the area. The same contaminants are also released into the atmosphere due to the wind erosion of the pile and then deposited upon the soil, pasture and leafy vegetables. In addition, for the V2.2 scenario, Rn-222 is assumed to be released to atmosphere from the pile. Unlike the V1 scenario, no consideration is given to surface water exposure pathways. Results show that there is exceedingly good agreement between participants' deterministic and probabilistic estimates of total dose or intake. They agree within a factor of two to three for both scenarios. 
Even

  9. Realistic simulation of reduced-dose CT with noise modeling and sinogram synthesis using DICOM CT images

    International Nuclear Information System (INIS)

    Won Kim, Chang; Kim, Jong Hyo

    2014-01-01

Purpose: Reducing the patient dose while maintaining the diagnostic image quality during CT exams is the subject of a growing number of studies, in which simulations of reduced-dose CT with patient data have been used as an effective technique when exploring the potential of various dose reduction techniques. Difficulties in accessing raw sinogram data, however, have restricted the use of this technique to a limited number of institutions. Here, we present a novel reduced-dose CT simulation technique which provides realistic low-dose images without the requirement of raw sinogram data. Methods: Two key characteristics of CT systems, the noise equivalent quanta (NEQ) and the algorithmic modulation transfer function (MTF), were measured for various combinations of object attenuation and tube currents by analyzing the noise power spectrum (NPS) of CT images obtained with a set of phantoms. Those measurements were used to develop a comprehensive CT noise model covering the reduced x-ray photon flux, object attenuation, system noise, and bow-tie filter, which was then employed to generate a simulated noise sinogram for the reduced-dose condition with the use of a synthetic sinogram generated from a reference CT image. The simulated noise sinogram was filtered with the algorithmic MTF and back-projected to create a noise CT image, which was then added to the reference CT image, finally providing a simulated reduced-dose CT image. The simulation performance was evaluated in terms of the degree of NPS similarity, the noise magnitude, the bow-tie filter effect, and the streak noise pattern at photon starvation sites with the set of phantom images. Results: The simulation results showed good agreement with actual low-dose CT images in terms of their visual appearance and in a quantitative evaluation test. The magnitude and shape of the NPS curves of the simulated low-dose images agreed well with those of real low-dose images, showing discrepancies of less than ±3.2% in
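The scaling law at the heart of such reduced-dose simulations is simple even though the authors' full NEQ/MTF sinogram pipeline is not: quantum noise variance is inversely proportional to tube current, so a scan at a fraction r of the reference dose can be mimicked by adding zero-mean noise with variance sigma_ref² · (1/r − 1). The sketch below illustrates only this core idea, with assumed numbers, not the paper's method or data:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_reduced_dose(image, sigma_ref, r):
    """Add noise so the result has the variance expected at dose r * D_ref."""
    extra_sd = sigma_ref * np.sqrt(1.0 / r - 1.0)
    return image + rng.normal(0.0, extra_sd, size=image.shape)

# Reference "image": pure noise with sigma = 10 for demonstration purposes.
ref = rng.normal(0.0, 10.0, size=(256, 256))
low = simulate_reduced_dose(ref, sigma_ref=10.0, r=0.25)  # quarter dose

print(round(low.std() / ref.std(), 1))  # ~2.0: quartering the dose doubles the noise
```

Real implementations, like the one described above, must additionally shape this noise with the measured NPS/MTF and inject it in the sinogram domain so that streaks and the bow-tie filter effect are reproduced.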

  10. From biology to mathematical models and back: teaching modeling to biology students, and biology to math and engineering students.

    Science.gov (United States)

    Chiel, Hillel J; McManus, Jeffrey M; Shaw, Kendrick M

    2010-01-01

    We describe the development of a course to teach modeling and mathematical analysis skills to students of biology and to teach biology to students with strong backgrounds in mathematics, physics, or engineering. The two groups of students have different ways of learning material and often have strong negative feelings toward the area of knowledge that they find difficult. To give students a sense of mastery in each area, several complementary approaches are used in the course: 1) a "live" textbook that allows students to explore models and mathematical processes interactively; 2) benchmark problems providing key skills on which students make continuous progress; 3) assignment of students to teams of two throughout the semester; 4) regular one-on-one interactions with instructors throughout the semester; and 5) a term project in which students reconstruct, analyze, extend, and then write in detail about a recently published biological model. Based on student evaluations and comments, an attitude survey, and the quality of the students' term papers, the course has significantly increased the ability and willingness of biology students to use mathematical concepts and modeling tools to understand biological systems, and it has significantly enhanced engineering students' appreciation of biology.

  11. From Biology to Mathematical Models and Back: Teaching Modeling to Biology Students, and Biology to Math and Engineering Students

    Science.gov (United States)

    McManus, Jeffrey M.; Shaw, Kendrick M.

    2010-01-01

    We describe the development of a course to teach modeling and mathematical analysis skills to students of biology and to teach biology to students with strong backgrounds in mathematics, physics, or engineering. The two groups of students have different ways of learning material and often have strong negative feelings toward the area of knowledge that they find difficult. To give students a sense of mastery in each area, several complementary approaches are used in the course: 1) a “live” textbook that allows students to explore models and mathematical processes interactively; 2) benchmark problems providing key skills on which students make continuous progress; 3) assignment of students to teams of two throughout the semester; 4) regular one-on-one interactions with instructors throughout the semester; and 5) a term project in which students reconstruct, analyze, extend, and then write in detail about a recently published biological model. Based on student evaluations and comments, an attitude survey, and the quality of the students' term papers, the course has significantly increased the ability and willingness of biology students to use mathematical concepts and modeling tools to understand biological systems, and it has significantly enhanced engineering students' appreciation of biology. PMID:20810957

  12. Modeling the Biological Diversity of Pig Carcasses

    DEFF Research Database (Denmark)

    Erbou, Søren Gylling Hemmingsen

This thesis applies methods from medical image analysis for modeling the biological diversity of pig carcasses. The Danish meat industry is very focused on improving product quality and productivity by optimizing the use of the carcasses and increasing productivity in the abattoirs. In order...... equipment is investigated, without the need for a calibration against a less accurate manual dissection. The rest of the contributions regard the construction and use of point distribution models (PDM). PDMs are able to capture the shape variation of a population of shapes, in this case a 3D surface...

  13. Biologic Constraints on Modelling Virus Assembly

    Directory of Open Access Journals (Sweden)

    Robert L. Garcea

    2008-01-01

Full Text Available The mathematical modelling of icosahedral virus assembly has drawn increasing interest because of the symmetric geometry of the outer shell structures. Many models involve equilibrium expressions of subunit binding, with reversible subunit additions forming various intermediate structures. The underlying assumption is that a final lowest energy state drives the equilibrium toward assembly. In their simplest forms, these models have explained why high subunit protein concentrations and strong subunit association constants can result in kinetic traps forming off pathway partial and aberrant structures. However, the cell biology of virus assembly is exceedingly complex. The biochemistry and biology of polyoma and papillomavirus assembly described here illustrates many of these specific issues. Variables include the use of cellular ‘chaperone’ proteins as mediators of assembly fidelity, the coupling of assembly to encapsidation of a specific nucleic acid genome, the use of cellular structures as ‘workbenches’ upon which assembly occurs, and the underlying problem of making a capsid structure that is metastable and capable of rapid disassembly upon infection. Although formidable to model, incorporating these considerations could advance the relevance of mathematical models of virus assembly to the real world.
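The equilibrium subunit-binding models mentioned above can be sketched in their simplest sequential form. The code below is an illustrative toy, not a model from the paper: each step S_i + S_1 ⇌ S_(i+1) shares one association constant K, so at equilibrium c_i = K^(i−1) · c_1^i, and c_1 is fixed by subunit conservation (solved here by bisection).

```python
import numpy as np

def equilibrium_fractions(c_tot, K, n):
    """Fraction of subunits in each species 1..n for the sequential model."""
    def total(c1):
        i = np.arange(1, n + 1)
        return np.sum(i * K**(i - 1) * c1**i)   # conservation of subunits
    lo, hi = 0.0, c_tot
    for _ in range(200):                        # bisect total(c1) = c_tot
        mid = 0.5 * (lo + hi)
        if total(mid) < c_tot:
            lo = mid
        else:
            hi = mid
    c1 = 0.5 * (lo + hi)
    i = np.arange(1, n + 1)
    c = K**(i - 1) * c1**i
    return i * c / c_tot                        # subunit mass fractions

frac_weak = equilibrium_fractions(1.0, K=1.0, n=12)
frac_strong = equilibrium_fractions(1.0, K=100.0, n=12)
print(bool(frac_strong[-1] > frac_weak[-1]))  # stronger binding favors the full shell
```

Even this toy shows the qualitative point of the abstract: raising the association constant shifts subunit mass toward the largest species, and kinetic (rather than equilibrium) treatments of the same scheme are what expose the off-pathway traps.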

  14. Proposed actions are no actions: re-modeling an ontology design pattern with a realist top-level ontology.

    Science.gov (United States)

    Seddig-Raufie, Djamila; Jansen, Ludger; Schober, Daniel; Boeker, Martin; Grewe, Niels; Schulz, Stefan

    2012-09-21

Ontology Design Patterns (ODPs) are representational artifacts devised to offer solutions for recurring ontology design problems. They promise to enhance the ontology building process in terms of flexibility, re-usability and expansion, and to make the result of ontology engineering more predictable. In this paper, we analyze ODP repositories and investigate their relation with upper-level ontologies. In particular, we compare the BioTop upper ontology to the Action ODP from the NeOn ODP repository. In view of the differences in the respective approaches, we investigate whether the Action ODP can be embedded into BioTop. We demonstrate that this requires re-interpreting the meaning of classes of the NeOn Action ODP in the light of the precepts of realist ontologies. As a result, the re-design required clarifying the ontological commitment of the ODP classes by assigning them to top-level categories. Thus, ambiguous definitions are avoided. Classes of real entities are clearly distinguished from classes of information artifacts. The proposed approach avoids the commitment to the existence of unclear future entities which underlies the NeOn Action ODP. Our re-design is parsimonious in the sense that existing BioTop content proved to be largely sufficient to define the different types of actions and plans. The proposed model demonstrates that an expressive upper-level ontology provides enough resources and expressivity to represent even complex ODPs, here shown with the different flavors of Action as proposed in the NeOn ODP. The advantage of ODP inclusion into a top-level ontology is the given predetermined dependency of each class, an existing backbone structure and well-defined relations. Our comparison shows that the use of some ODPs is more likely to cause problems for ontology developers, rather than to guide them. Besides the structural properties, the explanations of classification results were particularly hard to grasp for 'self-sufficient' ODPs as

  15. Mathematical modeling in biology: A critical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Buiatti, M. [Florence, Univ. (Italy). Dipt. di Biologia Animale e Genetica

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, inborn difficulties of the mathematisation of biological objects and processes, derived from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at the use of the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning against the derived dangers (reification of monodisciplinary metaphors, lack of falsification, etc.) is given. Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples.

  16. Mathematical modeling in biology: A critical assessment

    International Nuclear Information System (INIS)

    Buiatti, M.

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, inborn difficulties of the mathematisation of biological objects and processes, derived from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at the use of the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning against the derived dangers (reification of monodisciplinary metaphors, lack of falsification, etc.) is given. Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples.

  17. Continuum Modeling of Biological Network Formation

    KAUST Repository

    Albi, Giacomo

    2017-04-10

    We present an overview of recent analytical and numerical results for the elliptic–parabolic system of partial differential equations proposed by Hu and Cai, which models the formation of biological transportation networks. The model describes the pressure field using a Darcy type equation and the dynamics of the conductance network under pressure force effects. Randomness in the material structure is represented by a linear diffusion term and conductance relaxation by an algebraic decay term. We first introduce micro- and mesoscopic models and show how they are connected to the macroscopic PDE system. Then, we provide an overview of analytical results for the PDE model, focusing mainly on the existence of weak and mild solutions and analysis of the steady states. The analytical part is complemented by extensive numerical simulations. We propose a discretization based on finite elements and study the qualitative properties of network structures for various parameter values.
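
    The Darcy pressure solve followed by a conductance update can be sketched in one space dimension. The following is a schematic discretization under simplifying assumptions (Dirichlet boundaries, explicit Euler time stepping, illustrative parameter values), not the finite-element scheme used in the paper:

```python
import numpy as np

def solve_pressure(m, s, dx, r=0.01):
    """Darcy solve: -(d/dx)((r + m^2) dp/dx) = s, with p = 0 at both ends.
    m lives on the N-1 edges of an N-node grid, s on the nodes."""
    n = len(s)
    perm = r + m**2                          # permeability r + m^2 per edge
    A = np.zeros((n - 2, n - 2))
    b = s[1:-1].copy()
    for k in range(n - 2):
        i = k + 1                            # global node index
        A[k, k] = (perm[i - 1] + perm[i]) / dx**2
        if k > 0:
            A[k, k - 1] = -perm[i - 1] / dx**2
        if k < n - 3:
            A[k, k + 1] = -perm[i] / dx**2
    p = np.zeros(n)
    p[1:-1] = np.linalg.solve(A, b)
    return p

def step_conductance(m, p, dx, dt, D2=1e-3, c2=1.0, gamma=0.75):
    """Explicit update: linear diffusion + activation by |dp/dx|^2
    + algebraic decay, mirroring the structure of the Hu-Cai system."""
    gradp = np.diff(p) / dx                  # pressure gradient on edges
    lap = np.zeros_like(m)
    lap[1:-1] = (m[2:] - 2 * m[1:-1] + m[:-2]) / dx**2
    decay = np.abs(m) ** (2 * (gamma - 1)) * m
    return m + dt * (D2 * lap + c2 * gradp**2 * m - decay)
```

    Alternating the two steps relaxes the conductance field toward a network-like steady state, the qualitative behaviour the paper studies numerically.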

  18. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that expresses scientific models.
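
    A model validation test of the kind described differs from a conventional unit test in that it checks a simulated observable against an experimentally accepted range rather than an exact value. A minimal sketch (the function name, value and range are illustrative, not taken from OpenWorm):

```python
# Model validation test sketch: the simulated observable is compared
# against an experimentally plausible range, not an exact number.

def simulate_resting_potential():
    """Stand-in for a real simulation; returns membrane potential in mV."""
    return -68.2

def test_resting_potential_in_physiological_range():
    """Validation test: the model neuron should rest near -70 mV."""
    v = simulate_resting_potential()
    assert -80.0 <= v <= -60.0, f"resting potential {v} mV out of range"
```

    Under test-driven development, such a test can be written before the model exists, from the experimental literature alone, and then drives the implementation.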

  19. Converting differential-equation models of biological systems to membrane computing.

    Science.gov (United States)

    Muniyandi, Ravie Chandren; Zin, Abdullah Mohd; Sanders, J W

    2013-12-01

    This paper presents a method to convert the deterministic, continuous representation of a biological system by ordinary differential equations into a non-deterministic, discrete membrane computation. The dynamics of the membrane computation are governed by rewrite rules operating at certain rates. This has the advantage of applying accurately to small systems, and of expressing rates of change that are determined locally, by region, but not necessarily globally. Such spatial information augments the standard differentiable approach to provide a more realistic model. A biological case study of the ligand-receptor network of the protein TGF-β is used to validate the effectiveness of the conversion method. It demonstrates the sense in which the behaviours and properties of the system are better preserved in the membrane computing model, suggesting that the proposed conversion method may prove useful for biological systems in particular. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
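
    The core of such a conversion can be illustrated on a single binding reaction: the continuous view integrates an ODE, while the discrete view repeatedly applies a rewrite rule at a propensity-derived rate (a Gillespie-style scheme). The species and rates below are illustrative, not those of the TGF-β case study:

```python
import random

# One binding reaction L + R -> C, in both views. Rate constant k is
# illustrative.

def ode_step(l, r, c, k=0.001, dt=0.01):
    """Continuous view: dL/dt = dR/dt = -k*L*R, dC/dt = +k*L*R."""
    flux = k * l * r * dt
    return l - flux, r - flux, c + flux

def gillespie(l, r, c, k=0.001, t_end=10.0, seed=1):
    """Discrete view: the rewrite rule L, R -> C fires with
    propensity k*L*R; waiting times are exponentially distributed."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end and l > 0 and r > 0:
        a = k * l * r                        # propensity of the rule
        t += rng.expovariate(a)              # time to next application
        if t < t_end:
            l, r, c = l - 1, r - 1, c + 1    # apply the rewrite once
    return l, r, c
```

    The discrete run conserves molecule counts exactly and is stochastic, which is why it remains accurate for the small copy numbers where the ODE approximation degrades.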

  20. Evaluation of biological models using Spacelab

    Science.gov (United States)

    Tollinger, D.; Williams, B. A.

    1980-01-01

    Biological models of hypogravity effects are described, including the cardiovascular-fluid shift, musculoskeletal, embryological and space sickness models. These models predict such effects as loss of extracellular fluid and electrolytes, decrease in red blood cell mass, and the loss of muscle and bone mass in weight-bearing portions of the body. Experimentation in Spacelab by the use of implanted electromagnetic flow probes, by fertilizing frog eggs in hypogravity and fixing the eggs at various stages of early development and by assessing the role of the vestibulocular reflex arc in space sickness is suggested. It is concluded that the use of small animals eliminates the uncertainties caused by corrective or preventive measures employed with human subjects.

  1. Water versus DNA A new deal for proton transport modeling in biological matter

    International Nuclear Information System (INIS)

    Champion, C; Quinto, M A; Monti, J M; Galassi, M E; Fojón, O A; Hanssen, J; Rivarola, R D; Week, P F

    2015-01-01

    Water vapor is a common surrogate of DNA for modeling the proton-induced ionizing processes in living tissue exposed to radiations. The present study aims at scrutinizing the validity of this approximation and then revealing new insights into proton-induced energy transfers by a comparative analysis between water and realistic biological medium. In this context, self-consistent quantum mechanical modeling of the ionization and electron capture processes is reported within the continuum distorted wave-eikonal initial state framework for both isolated water molecules and DNA components impacted by proton beams. (paper)

  2. Neural network models: from biology to many - body phenomenology

    International Nuclear Information System (INIS)

    Clark, J.W.

    1993-01-01

    Theoretical work in neural networks has a strange feel for most physicists. In some cases the aspect of design becomes paramount. More comfortable ground at least for many body theorists may be found in realistic biological simulation, although the complexity of most problems is so awesome that incisive results will be hard won. It has also shown the impressive capabilities of artificial networks in pattern recognition and classification may be exploited to solve management problems in experimental physics and for discovery of radically new theoretical description of physical systems. This advance represents an important step towards the ultimate goal of neuro biological paradigm. (A.B.)

  3. Modeling biological pathway dynamics with timed automata.

    Science.gov (United States)

    Schivo, Stefano; Scholma, Jetse; Wanders, Brend; Urquidi Camacho, Ricardo A; van der Vet, Paul E; Karperien, Marcel; Langerak, Rom; van de Pol, Jaco; Post, Janine N

    2014-05-01

    Living cells are constantly subjected to a plethora of environmental stimuli that require integration into an appropriate cellular response. This integration takes place through signal transduction events that form tightly interconnected networks. The understanding of these networks requires capturing their dynamics through computational support and models. ANIMO (Analysis of Networks with Interactive Modeling) is a tool that enables the construction and exploration of executable models of biological networks, helping to derive hypotheses and to plan wet-lab experiments. The tool is based on the formalism of Timed Automata, which can be analyzed via the UPPAAL model checker. Thanks to Timed Automata, we can provide a formal semantics for the domain-specific language used to represent signaling networks. This enforces precision and uniformity in the definition of signaling pathways, contributing to the integration of isolated signaling events into complex network models. We propose an approach to discretization of reaction kinetics that allows us to efficiently use UPPAAL as the computational engine to explore the dynamic behavior of the network of interest. A user-friendly interface hides the use of Timed Automata from the user, while keeping the expressive power intact. Abstraction to single-parameter kinetics speeds up construction of models that remain faithful enough to provide meaningful insight. The resulting dynamic behavior of the network components is displayed graphically, allowing for an intuitive and interactive modeling experience.
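
    The discretization idea, integer activity levels updated by interactions that fire after delays set by a single kinetic parameter, can be sketched with a simple event queue. This is an illustrative approximation of the Timed Automata semantics, not ANIMO's actual implementation:

```python
import heapq

# Each node holds an integer activity in [0, levels]; each edge is a
# timed transition that fires every 1/k time units, nudging its target
# up or down by one level. All names and numbers are illustrative.

def simulate(edges, levels=10, t_end=100.0):
    """edges: list of (source, target, k, sign); returns final activities."""
    act = {n: levels // 2 for e in edges for n in (e[0], e[1])}
    queue = [(1.0 / k, i) for i, (_, _, k, _) in enumerate(edges)]
    heapq.heapify(queue)
    while queue and queue[0][0] < t_end:
        t, i = heapq.heappop(queue)
        src, dst, k, sign = edges[i]
        if act[src] > 0:                         # active source drives target
            act[dst] = max(0, min(levels, act[dst] + sign))
        heapq.heappush(queue, (t + 1.0 / k, i))  # re-arm the timed transition
    return act
```

    Because state and time are both discrete, a model checker like UPPAAL can exhaustively explore the reachable behaviours instead of integrating ODEs.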

  4. Spherical Cancer Models in Tumor Biology

    Directory of Open Access Journals (Sweden)

    Louis-Bastien Weiswald

    2015-01-01

    Full Text Available Three-dimensional (3D) in vitro models have been used in cancer research as an intermediate model between in vitro cancer cell line cultures and in vivo tumors. Spherical cancer models represent major 3D in vitro models that have been described over the past 4 decades. These models have gained popularity in cancer stem cell research using tumorospheres. Thus, it is crucial to define and clarify the different spherical cancer models thus far described. Here, we focus on in vitro multicellular spheres used in cancer research. All these sphere-like structures are characterized by their well-rounded shape, the presence of cancer cells, and their capacity to be maintained as free-floating cultures. We propose a rational classification of the four most commonly used spherical cancer models in cancer research, based on the culture methods for obtaining them and on subsequent differences in sphere biology: the multicellular tumor spheroid model, first described in the early 70s and obtained by culture of cancer cell lines under nonadherent conditions; tumorospheres, a model of cancer stem cell expansion established in a serum-free medium supplemented with growth factors; and tissue-derived tumor spheres and organotypic multicellular spheroids, obtained by mechanical dissociation and cutting of tumor tissue. In addition, we describe their applications to and interest in cancer research; in particular, we describe their contribution to chemoresistance, radioresistance, tumorigenicity, and invasion and migration studies. Although these models share a common 3D conformation, each displays its own intrinsic properties. Therefore, the most relevant spherical cancer model must be carefully selected, as a function of the study aim and cancer type.

  5. The importance of realistic dispersal models in conservation planning: application of a novel modelling platform to evaluate management scenarios in an Afrotropical biodiversity hotspot.

    Science.gov (United States)

    Aben, Job; Bocedi, Greta; Palmer, Stephen C F; Pellikka, Petri; Strubbe, Diederik; Hallmann, Caspar; Travis, Justin M J; Lens, Luc; Matthysen, Erik

    2016-08-01

    As biodiversity hotspots are often characterized by high human population densities, implementation of conservation management practices that focus only on the protection and enlargement of pristine habitats is potentially unrealistic. An alternative approach to curb species extinction risk involves improving connectivity among existing habitat patches. However, evaluation of spatially explicit management strategies is challenging, as predictive models must account for the process of dispersal, which is difficult in terms of both empirical data collection and modelling. Here, we use a novel, individual-based modelling platform that couples demographic and mechanistic dispersal models to evaluate the effectiveness of realistic management scenarios tailored to conserve forest birds in a highly fragmented biodiversity hotspot. Scenario performance is evaluated based on the spatial population dynamics of a well-studied forest bird species. The largest population increase was predicted to occur under scenarios increasing habitat area. However, the effectiveness was sensitive to spatial planning. Compared to adding one large patch to the habitat network, adding several small patches yielded mixed benefits: although overall population sizes increased, specific newly created patches acted as dispersal sinks, which compromised population persistence in some existing patches. Increasing matrix connectivity by the creation of stepping stones is likely to result in enhanced dispersal success and occupancy of smaller patches. Synthesis and applications. We show that the effectiveness of spatial management is strongly driven by patterns of individual dispersal across landscapes. For species conservation planning, we advocate the use of models that incorporate adequate realism in demography and, particularly, in dispersal behaviours.

  6. Modeling individual movement decisions of brown hare (Lepus europaeus) as a key concept for realistic spatial behavior and exposure: A population model for landscape-level risk assessment.

    Science.gov (United States)

    Kleinmann, Joachim U; Wang, Magnus

    2017-09-01

    Spatial behavior is of crucial importance for the risk assessment of pesticides and for the assessment of effects of agricultural practice or multiple stressors, because it determines field use, exposition, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, "probabilistic walk," was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities being determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation of the model demonstrated that it can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these important parameters for the risk assessment may, however, vary in different landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.
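
    The "probabilistic walk" decision step can be sketched as scoring each neighbouring cell from local information and sampling the next position from the resulting probabilities. The weights and scoring function below are illustrative, not those of the published hare model:

```python
# One movement decision on a grid landscape. grid maps each cell to
# (food, cover, conspecifics); weights are hypothetical.

def movement_step(pos, grid, rng, w_food=1.0, w_cover=0.5, w_con=-0.8):
    """Score the 8 neighbours of pos and sample the next cell."""
    x, y = pos
    neighbours = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                  if (dx, dy) != (0, 0) and (x + dx, y + dy) in grid]
    scores = []
    for cell in neighbours:
        food, cover, con = grid[cell]
        # Conspecifics repel (negative weight); food and cover attract.
        scores.append(max(1e-6, w_food * food + w_cover * cover + w_con * con))
    # Roulette-wheel selection proportional to the scores.
    total = sum(scores)
    r, acc = rng.random() * total, 0.0
    for cell, s in zip(neighbours, scores):
        acc += s
        if r <= acc:
            return cell
    return neighbours[-1]
```

    Chaining such steps over a day, and logging which cells are crop fields, is what yields simulated PT values comparable to field observations.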

  7. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2008-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards in the interior of the object. In this abstract, we describe a simple algorithm for triangulating k-guardable polygons. Our algorithm, which is easily implementable, takes

  8. ACTIVE AND PARTICIPATORY METHODS IN BIOLOGY: MODELING

    Directory of Open Access Journals (Sweden)

    Brînduşa-Antonela SBÎRCEA

    2011-01-01

    Full Text Available By using active and participatory methods it is hoped that pupils will not only come to a deeper understanding of the issues involved, but also that their motivation will be heightened. Pupil involvement in their learning is essential. Moreover, by using a variety of teaching techniques, we can help students make sense of the world in different ways, increasing the likelihood that they will develop a conceptual understanding. The teacher must be a good facilitator, monitoring and supporting group dynamics. Modeling is an instructional strategy in which the teacher demonstrates a new concept or approach to learning and pupils learn by observing. In the teaching of biology the didactic materials are fundamental tools in the teaching-learning process. Reading about scientific concepts or having a teacher explain them is not enough. Research has shown that modeling can be used across disciplines and in all grade and ability level classrooms. Using this type of instruction, teachers encourage learning.

  9. Documentation of TRU biological transport model (BIOTRAN)

    Energy Technology Data Exchange (ETDEWEB)

    Gallegos, A.F.; Garcia, B.J.; Sutton, C.M.

    1980-01-01

    Inclusive of Appendices, this document describes the purpose, rationale, construction, and operation of a biological transport model (BIOTRAN). This model is used to predict the flow of transuranic elements (TRU) through specified plant and animal environments using biomass as a vector. The appendices are: (A) Flows of moisture, biomass, and TRU; (B) Intermediate variables affecting flows; (C) Mnemonic equivalents (code) for variables; (D) Variable library (code); (E) BIOTRAN code (Fortran); (F) Plants simulated; (G) BIOTRAN code documentation; (H) Operating instructions for BIOTRAN code. The main text is presented with a specific format which uses a minimum of space, yet is adequate for tracking most relationships from their first appearance to their formulation in the code. Because relationships are treated individually in this manner, and rely heavily on Appendix material for understanding, it is advised that the reader familiarize himself with these materials before proceeding with the main text.

  10. Documentation of TRU biological transport model (BIOTRAN)

    International Nuclear Information System (INIS)

    Gallegos, A.F.; Garcia, B.J.; Sutton, C.M.

    1980-01-01

    Inclusive of Appendices, this document describes the purpose, rationale, construction, and operation of a biological transport model (BIOTRAN). This model is used to predict the flow of transuranic elements (TRU) through specified plant and animal environments using biomass as a vector. The appendices are: (A) Flows of moisture, biomass, and TRU; (B) Intermediate variables affecting flows; (C) Mnemonic equivalents (code) for variables; (D) Variable library (code); (E) BIOTRAN code (Fortran); (F) Plants simulated; (G) BIOTRAN code documentation; (H) Operating instructions for BIOTRAN code. The main text is presented with a specific format which uses a minimum of space, yet is adequate for tracking most relationships from their first appearance to their formulation in the code. Because relationships are treated individually in this manner, and rely heavily on Appendix material for understanding, it is advised that the reader familiarize himself with these materials before proceeding with the main text

  11. Monte Carlo simulation of near-infrared light propagation in realistic adult head models with hair follicles

    Science.gov (United States)

    Pan, Boan; Fang, Xiang; Liu, Weichao; Li, Nanxi; Zhao, Ke; Li, Ting

    2018-02-01

    Near infrared spectroscopy (NIRS) and diffuse correlation spectroscopy (DCS) have been used to measure brain activation, which is clinically important. Monte Carlo simulation has been applied to model near-infrared light propagation in biological tissue, and can predict photon diffusion and brain activation. However, previous studies have rarely considered hair and hair follicles as a contributing factor. Here, we attempt to use MCVM (Monte Carlo simulation based on 3D voxelized media) to examine light transmission, absorption, fluence, spatial sensitivity distribution (SSD), and brain-activation judgement in the presence or absence of hair follicles. The data in this study are a series of high-resolution cryosectional color photographs of a standing Chinese male adult. We found that as the density of hair follicles increases, the number of photons transmitted under the scalp decreases dramatically, and the number of photons reaching the detector also decreases. Without hair follicles, these quantities reach their maximum values. Meanwhile, the light distribution and brain-activation judgement change steadily with hair-follicle density. The findings indicate that hair follicles influence NIRS light distribution and brain-activation judgement.
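
    The voxel Monte Carlo principle can be illustrated with a heavily simplified one-dimensional version: photons take exponentially distributed steps, deposit a fraction of their weight at each interaction, and scatter. Real codes such as MCVM use 3D voxels and an anisotropic phase function; all optical properties here are illustrative:

```python
import random

def propagate(mu_a, mu_s, n_photons=1000, depth=30, seed=0):
    """1D slab of thickness `depth`; mu_a/mu_s are absorption/scattering
    coefficients. Returns the mean depth at which weight is absorbed."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s                           # total interaction coeff.
    absorbed_at, weight_sum = 0.0, 0.0
    for _ in range(n_photons):
        z, direction, w = 0.0, 1.0, 1.0
        while 0.0 <= z < depth and w > 1e-4:
            z += direction * rng.expovariate(mu_t)   # free path length
            if not 0.0 <= z < depth:
                break                            # photon exits the slab
            dep = w * mu_a / mu_t                # fraction absorbed here
            absorbed_at += dep * z
            weight_sum += dep
            w -= dep
            direction = rng.choice((-1.0, 1.0))  # "isotropic" in 1D
    return absorbed_at / weight_sum
```

    Raising the absorption coefficient of a superficial layer, as denser hair follicles effectively do, pulls the absorbed-weight distribution toward the surface and reduces the photons reaching a detector, the qualitative effect reported above.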

  12. Simple and Realistic Data Generation

    DEFF Research Database (Denmark)

    Pedersen, Kenneth Houkjær; Torp, Kristian; Wind, Rico

    2006-01-01

    This paper presents a generic, DBMS independent, and highly extensible relational data generation tool. The tool can efficiently generate realistic test data for OLTP, OLAP, and data streaming applications. The tool uses a graph model to direct the data generation. This model makes it very simple to generate data even for large database schemas with complex inter- and intra-table relationships. The model also makes it possible to generate data with very accurate characteristics.
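
    The graph-directed idea can be sketched as follows: a schema graph records each table's foreign-key parent, and rows are generated parents-first so that every child row references an existing key. This toy version handles a two-level hierarchy only, and the table names are illustrative:

```python
import random

def generate(schema, counts, seed=42):
    """schema: {table: parent_table_or_None}; counts: {table: n_rows}.
    Returns {table: [row dicts]} with valid foreign keys."""
    rng = random.Random(seed)
    data = {}
    # Visit parents before children so every FK target already exists.
    for table in sorted(schema, key=lambda t: 0 if schema[t] is None else 1):
        parent = schema[table]
        rows = []
        for i in range(counts[table]):
            row = {"id": i}
            if parent is not None:
                # Child rows reference a random existing parent key.
                row[parent + "_id"] = rng.randrange(counts[parent])
            rows.append(row)
        data[table] = rows
    return data
```

    A full tool would walk the schema graph in topological order and attach value distributions per column; the ordering constraint shown here is the part the graph model makes trivial.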

  13. At the biological modeling and simulation frontier.

    Science.gov (United States)

    Hunt, C Anthony; Ropella, Glen E P; Lam, Tai Ning; Tang, Jonathan; Kim, Sean H J; Engelberg, Jesse A; Sheikh-Bahaei, Shahab

    2009-11-01

    We provide a rationale for and describe examples of synthetic modeling and simulation (M&S) of biological systems. We explain how synthetic methods are distinct from familiar inductive methods. Synthetic M&S is a means to better understand the mechanisms that generate normal and disease-related phenomena observed in research, and how compounds of interest interact with them to alter phenomena. An objective is to build better, working hypotheses of plausible mechanisms. A synthetic model is an extant hypothesis: execution produces an observable mechanism and phenomena. Mobile objects representing compounds carry information enabling components to distinguish between them and react accordingly when different compounds are studied simultaneously. We argue that the familiar inductive approaches contribute to the general inefficiencies being experienced by pharmaceutical R&D, and that use of synthetic approaches accelerates and improves R&D decision-making and thus the drug development process. A reason is that synthetic models encourage and facilitate abductive scientific reasoning, a primary means of knowledge creation and creative cognition. When synthetic models are executed, we observe different aspects of knowledge in action from different perspectives. These models can be tuned to reflect differences in experimental conditions and individuals, making translational research more concrete while moving us closer to personalized medicine.

  14. Nonlinear Rheology in a Model Biological Tissue

    Science.gov (United States)

    Matoz-Fernandez, D. A.; Agoritsas, Elisabeth; Barrat, Jean-Louis; Bertin, Eric; Martens, Kirsten

    2017-04-01

    The rheological response of dense active matter is a topic of fundamental importance for many processes in nature such as the mechanics of biological tissues. One prominent way to probe mechanical properties of tissues is to study their response to externally applied forces. Using a particle-based model featuring random apoptosis and environment-dependent division rates, we evidence a crossover from linear flow to a shear-thinning regime with an increasing shear rate. To rationalize this nonlinear flow we derive a theoretical mean-field scenario that accounts for the interplay of mechanical and active noise in local stresses. These noises are, respectively, generated by the elastic response of the cell matrix to cell rearrangements and by the internal activity.

  15. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  16. On the limitations of standard statistical modeling in biological systems: a full Bayesian approach for biology.

    Science.gov (United States)

    Gomez-Ramirez, Jaime; Sanz, Ricardo

    2013-09-01

    One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Biologically based multistage modeling of radiation effects

    Energy Technology Data Exchange (ETDEWEB)

    William Hazelton; Suresh Moolgavkar; E. Georg Luebeck

    2005-08-30

    This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis to study the effects of low-dose ionizing radiation in mouse intestinal crypts, as well as in other tissues. We have one paper accepted for publication in 'Advances in Space Research', and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop on 'Biologically Based Modeling of Human Health Effects of Low Dose Ionizing Radiation', July 28-29, 2005, at Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation. First, a number of

  18. Model checking biological systems described using ambient calculus

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Priami, Corrado; Qualia, Paola

    2005-01-01

    Model checking biological systems described using ambient calculus. In Proc. of the second International Workshop on Computational Methods in Systems Biology (CMSB04), Lecture Notes in Bioinformatics 3082:85-103, Springer, 2005.

  19. Modeling of nonlinear biological phenomena modeled by S-systems.

    Science.gov (United States)

    Mansouri, Majdi M; Nounou, Hazem N; Nounou, Mohamed N; Datta, Aniruddha A

    2014-03-01

    A central challenge in computational modeling of biological systems is the determination of the model parameters. In such cases, estimating these variables or parameters from other easily obtained measurements can be extremely useful. For example, time-series dynamic genomic data can be used to develop models representing dynamic genetic regulatory networks, which can be used to design intervention strategies to cure major diseases and to better understand the behavior of biological systems. Unfortunately, biological measurements are usually highly infected by errors that hide the important characteristics in the data. Therefore, these noisy measurements need to be filtered to enhance their usefulness in practice. This paper addresses the problem of state and parameter estimation of biological phenomena modeled by S-systems using Bayesian approaches, where the nonlinear observed system is assumed to progress according to a probabilistic state space model. The performances of various conventional and state-of-the-art state estimation techniques are compared. These techniques include the extended Kalman filter (EKF), unscented Kalman filter (UKF), particle filter (PF), and the developed variational Bayesian filter (VBF). Specifically, two comparative studies are performed. In the first comparative study, the state variables (the enzyme CadA, the model cadBA, the cadaverine Cadav and the lysine Lys for a model of the Cad System in Escherichia coli (CSEC)) are estimated from noisy measurements of these variables, and the various estimation techniques are compared by computing the estimation root mean square error (RMSE) with respect to the noise-free data. In the second comparative study, the state variables as well as the model parameters are simultaneously estimated. In this case, in addition to comparing the performances of the various state estimation techniques, the effect of the number of estimated model parameters on the accuracy and convergence of these
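
    Of the estimators compared, the particle filter is the simplest to sketch. Below is a bootstrap particle filter tracking a one-variable S-system dx/dt = a*x^g - b*x^h from noisy observations; all parameter values are illustrative, not those of the E. coli Cad system model:

```python
import random, math

def ssystem_step(x, dt=0.1, a=2.0, g=0.5, b=1.0, h=1.0):
    """One Euler step of the S-system power-law dynamics."""
    return max(1e-6, x + dt * (a * x**g - b * x**h))

def particle_filter(obs, n=500, obs_std=0.2, proc_std=0.05, seed=0):
    """Bootstrap filter: returns the posterior-mean state per observation."""
    rng = random.Random(seed)
    parts = [rng.uniform(0.5, 3.0) for _ in range(n)]
    estimates = []
    for y in obs:
        # Propagate each particle through the dynamics plus process noise.
        parts = [max(1e-6, ssystem_step(p) + rng.gauss(0.0, proc_std))
                 for p in parts]
        # Weight by the Gaussian observation likelihood.
        ws = [math.exp(-0.5 * ((y - p) / obs_std) ** 2) + 1e-12 for p in parts]
        total = sum(ws)
        estimates.append(sum(w * p for w, p in zip(ws, parts)) / total)
        # Resample to concentrate particles on probable states.
        parts = rng.choices(parts, weights=ws, k=n)
    return estimates
```

    Joint state-and-parameter estimation, the second comparison in the paper, follows the same pattern with the unknown parameters appended to each particle's state vector.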

  20. A novel model to assess the efficacy of steam surface pasteurization of cooked surimi gels inoculated with realistic levels of Listeria innocua.

    Science.gov (United States)

    Skåra, Torstein; Valdramidis, Vasilis P; Rosnes, Jan Thomas; Noriega, Estefanía; Van Impe, Jan F M

    2014-12-01

    Steam surface pasteurization is a promising decontamination technology for reducing pathogenic bacteria in different stages of food production. The effect of the artificial inoculation type and initial microbial load, however, has not been thoroughly assessed in the context of inactivation studies. In order to optimize the efficacy of the technology, the aim of this study was to design and validate a model system for steam surface pasteurization, assessing different inoculation methods and realistic microbial levels. More specifically, the response of Listeria innocua, a surrogate organism for Listeria monocytogenes, on a model fish product, and the effect of different inoculation levels following treatments with a steam surface pasteurization system were investigated. The variation in the resulting inoculation level on the samples was too large (77%) for the contact inoculation procedure to be further considered. In contrast, the variation of a drop inoculation procedure was 17%. Inoculation with high levels showed a rapid 1-2 log decrease after 3-5 s, and then no further inactivation beyond 20 s. A low-level inoculation study was performed by analysing the treated samples using a novel contact plating approach, which can be performed without sample homogenization and dilution. Using logistic regression, results from this method were used to model the binary responses of Listeria on surfaces with realistic inoculation levels. According to this model, a treatment time of 23 s will result in a 1 log reduction (for P = 0.1). Copyright © 2014 Elsevier Ltd. All rights reserved.
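
    The logistic-regression step can be illustrated by inverting a fitted dose-response curve. The coefficients below are hypothetical, chosen only so that the inverted model returns roughly the 23 s treatment time quoted for P = 0.1; they are not the study's fitted values.

```python
import math

# Hypothetical logistic-regression coefficients (illustrative only).
beta0, beta1 = 2.0, -0.18   # intercept; slope per second of steam exposure

def p_growth(t_s):
    """Modeled probability of Listeria growth after t_s seconds of steam."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * t_s)))

def time_for_probability(p):
    """Invert the logistic model: exposure time at which P(growth) = p."""
    return (math.log(p / (1.0 - p)) - beta0) / beta1

t_10pct = time_for_probability(0.1)   # roughly 23 s with these coefficients
```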

  1. The ultimate intrinsic signal-to-noise ratio of loop- and dipole-like current patterns in a realistic human head model.

    Science.gov (United States)

    Pfrommer, Andreas; Henning, Anke

    2018-03-13

    The ultimate intrinsic signal-to-noise ratio (UISNR) represents an upper bound for the achievable SNR of any receive coil. To reach this threshold a complete basis set of equivalent surface currents is required. This study systematically investigated to what extent either loop- or dipole-like current patterns are able to reach the UISNR threshold in a realistic human head model between 1.5 T and 11.7 T. Based on this analysis, we derived guidelines for coil designers to choose the best array element at a given field strength. Moreover, we present ideal current patterns yielding the UISNR in a realistic body model. We distributed generic current patterns on a cylindrical and helmet-shaped surface around a realistic human head model. We excited electromagnetic fields in the human head by using eigenfunctions of the spherical and cylindrical Helmholtz operator. The electromagnetic field problem was solved by a fast volume integral equation solver. At 7 T and above, adding curl-free current patterns to divergence-free current patterns substantially increased the SNR in the human head (locally >20%). This was true for the helmet-shaped and the cylindrical surface. On the cylindrical surface, dipole-like current patterns had high SNR performance in central regions at ultra-high field strength. The UISNR increased superlinearly with B0 in most parts of the cerebrum but only sublinearly in the periphery of the human head. The combination of loop and dipole elements could enhance the SNR performance in the human head at ultra-high field strength. © 2018 International Society for Magnetic Resonance in Medicine.

  2. Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.

    Science.gov (United States)

    Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J

    2015-08-21

    In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).
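
    As a rough illustration of what generated output in this format looks like, the sketch below assembles a minimal SBML-like skeleton with Python's standard library. The model id and species names are invented for illustration, and the document omits many elements (compartments, reactions, annotations) that a real iBioSim export would contain.

```python
import xml.etree.ElementTree as ET

SBML_NS = "http://www.sbml.org/sbml/level3/version1/core"

# Bare-bones SBML Level 3 skeleton; ids below are hypothetical.
sbml = ET.Element("sbml", {"xmlns": SBML_NS, "level": "3", "version": "1"})
model = ET.SubElement(sbml, "model", {"id": "AND_sensor"})
species_list = ET.SubElement(model, "listOfSpecies")
for sid in ("input_A", "input_B", "reporter"):
    ET.SubElement(species_list, "species",
                  {"id": sid, "compartment": "cell", "constant": "false"})

doc = ET.tostring(sbml, encoding="unicode")
```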

  3. Details of regional particle deposition and airflow structures in a realistic model of human tracheobronchial airways: two-phase flow simulation.

    Science.gov (United States)

    Rahimi-Gorji, Mohammad; Gorji, Tahereh B; Gorji-Bandpy, Mofid

    2016-07-01

    In the present investigation, detailed two-phase flow modeling of airflow, transport and deposition of micro-particles (1-10 µm) in a realistic tracheobronchial airway geometry based on CT scan images under various breathing conditions (i.e., 10-60 L/min) was considered. Lagrangian particle tracking has been used to investigate the particle deposition patterns in a model comprising the mouth up to generation G6 of the tracheobronchial airways. The results demonstrated that during all breathing patterns, the maximum velocity change occurred in the narrow throat region (larynx). Because a realistic geometry was used for the simulations, many irregularities and bending deflections exist in the airway model. Therefore, at higher inhalation rates, these areas are prone to vortical effects which tend to entrap the inhaled particles. According to the results, the deposition fraction has a direct relationship with particle aerodynamic diameter (for dp = 1-10 µm). Increasing the inhalation flow rate and particle size greatly increases the inertial force; consequently, more particle deposition is evident, suggesting that inertial impaction is the dominant deposition mechanism in the tracheobronchial airways. Copyright © 2016 Elsevier Ltd. All rights reserved.
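
    The dominance of inertial impaction is commonly gauged with the Stokes number, Stk = rho_p * dp^2 * U / (18 * mu * D). The sketch below uses assumed values for the tracheal diameter, air viscosity and particle density (not values from the paper) to show how Stk grows quadratically with particle size and linearly with flow rate.

```python
import math

MU_AIR = 1.8e-5      # dynamic viscosity of air, Pa*s (assumed)
RHO_P = 1000.0       # unit-density particle, kg/m^3 (assumed)
D_TRACHEA = 0.018    # tracheal diameter, m (assumed)

def stokes_number(dp_m, q_lpm):
    """Stk = rho_p * dp^2 * U / (18 * mu * D), with mean velocity U = Q/A."""
    q = q_lpm / 1000.0 / 60.0                    # L/min -> m^3/s
    u = q / (math.pi * D_TRACHEA ** 2 / 4.0)     # mean airway velocity
    return RHO_P * dp_m ** 2 * u / (18.0 * MU_AIR * D_TRACHEA)

stk_1um = stokes_number(1e-6, 60.0)    # small particle, high flow
stk_10um = stokes_number(10e-6, 60.0)  # large particle, high flow
```

    Larger Stk means particles deviate more from streamlines at bends and bifurcations, consistent with the deposition trends reported above.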

  4. Development of a Shipboard Remote Control and Telemetry Experimental System for Large-Scale Model's Motions and Loads Measurement in Realistic Sea Waves.

    Science.gov (United States)

    Jiao, Jialong; Ren, Huilong; Adenya, Christiaan Adika; Chen, Chaohe

    2017-10-29

    Wave-induced motion and load responses are important criteria for ship performance evaluation. Physical experiments have long been an indispensable tool in the predictions of ship's navigation state, speed, motions, accelerations, sectional loads and wave impact pressure. Currently, majority of the experiments are conducted in laboratory tank environment, where the wave environments are different from the realistic sea waves. In this paper, a laboratory tank testing system for ship motions and loads measurement is reviewed and reported first. Then, a novel large-scale model measurement technique is developed based on the laboratory testing foundations to obtain accurate motion and load responses of ships in realistic sea conditions. For this purpose, a suite of advanced remote control and telemetry experimental system was developed in-house to allow for the implementation of large-scale model seakeeping measurement at sea. The experimental system includes a series of technique sensors, e.g., the Global Position System/Inertial Navigation System (GPS/INS) module, course top, optical fiber sensors, strain gauges, pressure sensors and accelerometers. The developed measurement system was tested by field experiments in coastal seas, which indicates that the proposed large-scale model testing scheme is capable and feasible. Meaningful data including ocean environment parameters, ship navigation state, motions and loads were obtained through the sea trial campaign.

  5. INTERVAL OBSERVER FOR A BIOLOGICAL REACTOR MODEL

    Directory of Open Access Journals (Sweden)

    T. A. Kharkovskaia

    2014-05-01

    Full Text Available The method of interval observer design for nonlinear systems with parametric uncertainties is considered. The interval observer synthesis problem for systems with varying parameters is as follows: given uncertainty bounds on the system state, which constrain the initial conditions and the set of admissible values for the vector of unknown parameters and inputs, the interval estimates of the system state variables must contain the actual state at every time over the whole considered time segment. Conditions for interval observer design for the considered class of systems are given: boundedness of the input and state, the existence of a majorizing function defining the uncertainty vector for the system, Lipschitz continuity or boundedness of this function, and the existence of an observer gain with a suitable Lyapunov matrix. The main design condition is cooperativity of the interval estimation error dynamics. The problem of selecting an individual observer gain matrix is also considered. In order to ensure cooperativity of the interval estimation error dynamics, a static coordinate transformation is proposed. The proposed algorithm is demonstrated by computer simulation of a biological reactor. Possible applications of such interval estimation systems include robust control, where various types of uncertainty in the system dynamics are assumed; biotechnological and environmental systems and processes; mechatronics and robotics; etc.
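
    A minimal sketch of the interval-observer idea, for a scalar system with a bounded disturbance rather than the paper's bioreactor model: the two "framer" trajectories replace the unknown disturbance with its known bound, and cooperativity (trivially satisfied here, since the scalar dynamics are monotone) keeps the true state between the bounds at all times.

```python
import math

# Scalar system x' = -x + d(t) with unknown but bounded disturbance
# |d(t)| <= d_bar (illustrative values, not the reactor model).
dt, steps = 0.01, 1000
d_bar = 0.5
x = 0.2              # true state, known only to start inside [xl, xu]
xl, xu = 0.0, 1.0    # interval observer bounds on x

for k in range(steps):
    d = d_bar * math.sin(0.3 * k * dt)   # unknown to the observer
    x_new = x + dt * (-x + d)
    xl = xl + dt * (-xl - d_bar)         # lower framer: worst-case input
    xu = xu + dt * (-xu + d_bar)         # upper framer: worst-case input
    x = x_new
```

    The interval width converges to 2*d_bar, the tightest bound achievable without knowing d(t).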

  6. Toward University Modeling Instruction—Biology: Adapting Curricular Frameworks from Physics to Biology

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence. PMID:23737628

  7. Toward university modeling instruction--biology: adapting curricular frameworks from physics to biology.

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-06-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence.

  8. Physical models of biological information and adaptation.

    Science.gov (United States)

    Stuart, C I

    1985-04-07

    The bio-informational equivalence asserts that biological processes reduce to processes of information transfer. In this paper, that equivalence is treated as a metaphor with deeply anthropomorphic content of a sort that resists constitutive-analytical definition, including formulation within mathematical theories of information. It is argued that continuance of the metaphor, as a quasi-theoretical perspective in biology, must entail a methodological dislocation between biological and physical science. It is proposed that a general class of functions, drawn from classical physics, can serve to eliminate the anthropomorphism. Further considerations indicate that the concept of biological adaptation is central to the general applicability of the informational idea in biology; a non-anthropomorphic treatment of adaptive phenomena is suggested in terms of variational principles.

  9. Oscillation and stability of delay models in biology

    CERN Document Server

    Agarwal, Ravi P; Saker, Samir H

    2014-01-01

    Environmental variation plays an important role in many biological and ecological dynamical systems. This monograph focuses on the study of oscillation and the stability of delay models occurring in biology. The book presents recent research results on the qualitative behavior of mathematical models under different physical and environmental conditions, covering dynamics including the distribution and consumption of food. Researchers in the fields of mathematical modeling, mathematical biology, and population dynamics will be particularly interested in this material.
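
    The classic example of such a delay model is Hutchinson's delayed logistic equation dN/dt = r*N(t)*(1 - N(t - tau)/K), whose positive equilibrium K loses stability when r*tau > pi/2, producing sustained oscillations. A minimal simulation with illustrative parameters:

```python
# Hutchinson's delayed logistic equation, forward-Euler with a history buffer.
r, K, tau, dt = 1.0, 1.0, 2.0, 0.01      # r*tau = 2 > pi/2: oscillation
lag = int(tau / dt)
hist = [0.5] * (lag + 1)                 # constant history N(t) = 0.5, t <= 0

for _ in range(20000):                   # integrate to t = 200
    N, N_delayed = hist[-1], hist[-1 - lag]
    hist.append(N + dt * r * N * (1.0 - N_delayed / K))

tail = hist[10000:]                      # discard the transient
# Count crossings of the carrying capacity as evidence of oscillation.
crossings = sum(1 for u, v in zip(tail, tail[1:]) if (u - K) * (v - K) < 0)
```

    With r*tau below pi/2 the same code converges monotonically to K, illustrating the stability boundary the monograph analyzes.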

  10. Combining NMR ensembles and molecular dynamics simulations provides more realistic models of protein structures in solution and leads to better chemical shift prediction

    International Nuclear Information System (INIS)

    Lehtivarjo, Juuso; Tuppurainen, Kari; Hassinen, Tommi; Laatikainen, Reino; Peräkylä, Mikael

    2012-01-01

    While chemical shifts are invaluable for obtaining structural information from proteins, they also offer one of the rare ways to obtain information about protein dynamics. A necessary tool in transforming chemical shifts into structural and dynamic information is chemical shift prediction. In our previous work we developed a method for 4D prediction of protein 1H chemical shifts in which molecular motions, the 4th dimension, were modeled using molecular dynamics (MD) simulations. Although the approach clearly improved the prediction, the X-ray structures and single NMR conformers used in the model cannot be considered fully realistic models of protein in solution. In this work, NMR ensembles (NMRE) were used to expand the conformational space of proteins (e.g. side chains, flexible loops, termini), followed by MD simulations for each conformer to map the local fluctuations. Compared with the non-dynamic model, the NMRE+MD model gave 6-17% lower root-mean-square (RMS) errors for different backbone nuclei. The improved prediction indicates that NMR ensembles with MD simulations can be used to obtain a more realistic picture of protein structures in solutions and moreover underlines the importance of short and long time-scale dynamics for the prediction. The RMS errors of the NMRE+MD model were 0.24, 0.43, 0.98, 1.03, 1.16 and 2.39 ppm for 1Hα, 1HN, 13Cα, 13Cβ, 13CO and backbone 15N chemical shifts, respectively. The model is implemented in the prediction program 4DSPOT, available at http://www.uef.fi/4dspot.

  11. Combining NMR ensembles and molecular dynamics simulations provides more realistic models of protein structures in solution and leads to better chemical shift prediction

    Energy Technology Data Exchange (ETDEWEB)

    Lehtivarjo, Juuso, E-mail: juuso.lehtivarjo@uef.fi; Tuppurainen, Kari; Hassinen, Tommi; Laatikainen, Reino [University of Eastern Finland, School of Pharmacy (Finland); Peraekylae, Mikael [University of Eastern Finland, Institute of Biomedicine (Finland)

    2012-03-15

    While chemical shifts are invaluable for obtaining structural information from proteins, they also offer one of the rare ways to obtain information about protein dynamics. A necessary tool in transforming chemical shifts into structural and dynamic information is chemical shift prediction. In our previous work we developed a method for 4D prediction of protein 1H chemical shifts in which molecular motions, the 4th dimension, were modeled using molecular dynamics (MD) simulations. Although the approach clearly improved the prediction, the X-ray structures and single NMR conformers used in the model cannot be considered fully realistic models of protein in solution. In this work, NMR ensembles (NMRE) were used to expand the conformational space of proteins (e.g. side chains, flexible loops, termini), followed by MD simulations for each conformer to map the local fluctuations. Compared with the non-dynamic model, the NMRE+MD model gave 6-17% lower root-mean-square (RMS) errors for different backbone nuclei. The improved prediction indicates that NMR ensembles with MD simulations can be used to obtain a more realistic picture of protein structures in solutions and moreover underlines the importance of short and long time-scale dynamics for the prediction. The RMS errors of the NMRE+MD model were 0.24, 0.43, 0.98, 1.03, 1.16 and 2.39 ppm for 1Hα, 1HN, 13Cα, 13Cβ, 13CO and backbone 15N chemical shifts, respectively. The model is implemented in the prediction program 4DSPOT, available at http://www.uef.fi/4dspot.

  12. Computerised modelling for developmental biology : an exploration with case studies

    NARCIS (Netherlands)

    Bertens, Laura M.F.

    2012-01-01

    Many studies in developmental biology rely on the construction and analysis of models. This research presents a broad view of modelling approaches for developmental biology, with a focus on computational methods. An overview of modelling techniques is given, followed by several case studies.

  13. Morphogenesis and pattern formation in biological systems experiments and models

    CERN Document Server

    Noji, Sumihare; Ueno, Naoto; Maini, Philip

    2003-01-01

    A central goal of current biology is to decode the mechanisms that underlie the processes of morphogenesis and pattern formation. Concerned with the analysis of those phenomena, this book covers a broad range of research fields, including developmental biology, molecular biology, plant morphogenesis, ecology, epidemiology, medicine, paleontology, evolutionary biology, mathematical biology, and computational biology. In Morphogenesis and Pattern Formation in Biological Systems: Experiments and Models, experimental and theoretical aspects of biology are integrated for the construction and investigation of models of complex processes. This collection of articles on the latest advances by leading researchers not only brings together work from a wide spectrum of disciplines, but also provides a stepping-stone to the creation of new areas of discovery.

  14. The Development and Validation of an In Vitro Airway Model to Assess Realistic Airway Deposition and Drug Permeation Behavior of Orally Inhaled Products Across Synthetic Membranes.

    Science.gov (United States)

    Huynh, Bao K; Traini, Daniela; Farkas, Dale R; Longest, P Worth; Hindle, Michael; Young, Paul M

    2018-04-01

    Current in vitro approaches to assess lung deposition, dissolution, and cellular transport behavior of orally inhaled products (OIPs) have relied on compendial impactors to collect drug particles that are likely to deposit in the airway; however, the main drawback with this approach is that these impactors do not reflect the airway and may not necessarily represent drug deposition behavior in vivo. The aim of this article is to describe the development and method validation of a novel hybrid in vitro approach to assess drug deposition and permeation behavior in a more representative airway model. The medium-sized Virginia Commonwealth University (VCU) mouth-throat (MT) and tracheal-bronchial (TB) realistic upper airway models were used in this study as representative models of the upper airway. The TB model was modified to accommodate two Snapwell® inserts above the first TB airway bifurcation region to collect deposited nebulized ciprofloxacin-hydrochloride (CIP-HCL) droplets as a model drug aerosol system. Permeation characteristics of deposited nebulized CIP-HCL droplets were assessed across different synthetic membranes using the Snapwell test system. The Snapwell test system demonstrated reproducible and discriminatory drug permeation profiles for already dissolved and nebulized CIP-HCL droplets through a range of synthetic permeable membranes under different test conditions. The rate and extent of drug permeation depended on the permeable membrane material used, presence of a stirrer in the receptor compartment, and, most importantly, the drug collection method. This novel hybrid in vitro approach, which incorporates a modified version of a realistic upper airway model, coupled with the Snapwell test system holds great potential to evaluate postairway deposition characteristics, such as drug permeation and particle dissolution behavior of OIPs. Future studies will expand this approach using a cell culture-based setup instead of synthetic membranes.

  15. Dynamics of leaf gas exchange, xylem and phloem transport, water potential and carbohydrate concentration in a realistic 3-D model tree crown.

    Science.gov (United States)

    Nikinmaa, Eero; Sievänen, Risto; Hölttä, Teemu

    2014-09-01

    Tree models simulate productivity using general gas exchange responses and structural relationships, but they rarely check whether leaf gas exchange and resulting water and assimilate transport and driving pressure gradients remain within acceptable physical boundaries. This study presents an implementation of the cohesion-tension theory of xylem transport and the Münch hypothesis of phloem transport in a realistic 3-D tree structure and assesses the gas exchange and transport dynamics. A mechanistic model of xylem and phloem transport was used, together with a tested leaf assimilation and transpiration model in a realistic tree architecture to simulate leaf gas exchange and water and carbohydrate transport within an 8-year-old Scots pine tree. The model solved the dynamics of the amounts of water and sucrose solute in the xylem, cambium and phloem using a fine-grained mesh with a system of coupled ordinary differential equations. The simulations predicted the observed patterns of pressure gradients and sugar concentration. Diurnal variation of environmental conditions influenced tree-level gradients in turgor pressure and sugar concentration, which are important drivers of carbon allocation. The results and between-shoot variation were sensitive to structural and functional parameters such as tree-level scaling of conduit size and phloem unloading. Linking whole-tree-level water and assimilate transport, gas exchange and sink activity opens a new avenue for plant studies, as features that are difficult to measure can be studied dynamically with the model. Tree-level responses to local and external conditions can be tested, thus making the approach described here a good test-bench for studies of whole-tree physiology.

  16. Genome-scale biological models for industrial microbial systems.

    Science.gov (United States)

    Xu, Nan; Ye, Chao; Liu, Liming

    2018-04-01

    The primary aims and challenges associated with microbial fermentation include achieving faster cell growth, higher productivity, and more robust production processes. Genome-scale biological models, which predict the interactions among genetic material, enzymes, and metabolites, constitute a systematic and comprehensive platform to analyze and optimize microbial growth and the production of biological products. Genome-scale biological models can help optimize microbial growth-associated traits by simulating biomass formation, predicting growth rates, and identifying the requirements for cell growth. With regard to microbial product biosynthesis, genome-scale biological models can be used to design product biosynthetic pathways, accelerate production efficiency, and reduce metabolic side effects, leading to improved production performance. The present review discusses the development of microbial genome-scale biological models since their emergence and emphasizes their pertinent application in improving industrial microbial fermentation of biological products.
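
    The workhorse computation behind many genome-scale models is flux balance analysis: maximize a biomass flux subject to the steady-state stoichiometric constraint S·v = 0 and flux bounds. The sketch below uses a hypothetical three-reaction toy network (not a genome-scale reconstruction) and assumes SciPy is available.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix: rows are metabolites A and B, columns are
# reactions v1 (uptake -> A), v2 (A -> B), v3 (B -> biomass drain).
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

bounds = [(0.0, 10.0), (0.0, 1000.0), (0.0, 1000.0)]  # uptake capped at 10
c = [0.0, 0.0, -1.0]           # maximize v3 by minimizing -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
biomass_flux = res.x[2]        # limited by the uptake bound, i.e. 10
```

    In a real genome-scale model S has thousands of reactions, but the linear program has exactly this shape.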

  17. OFFl Models: Novel Schema for Dynamical Modeling of Biological Systems.

    Science.gov (United States)

    Ogbunugafor, C Brandon; Robinson, Sean P

    2016-01-01

    Flow diagrams are a common tool used to help build and interpret models of dynamical systems, often in biological contexts such as consumer-resource models and similar compartmental models. Typically, their usage is intuitive and informal. Here, we present a formalized version of flow diagrams as a kind of weighted directed graph which follow a strict grammar, which translate into a system of ordinary differential equations (ODEs) by a single unambiguous rule, and which have an equivalent representation as a relational database. (We abbreviate this schema of "ODEs and formalized flow diagrams" as OFFL.) Drawing a diagram within this strict grammar encourages a mental discipline on the part of the modeler in which all dynamical processes of a system are thought of as interactions between dynamical species that draw parcels from one or more source species and deposit them into target species according to a set of transformation rules. From these rules, the net rate of change for each species can be derived. The modeling schema can therefore be understood as both an epistemic and practical heuristic for modeling, serving both as an organizational framework for the model building process and as a mechanism for deriving ODEs. All steps of the schema beyond the initial scientific (intuitive, creative) abstraction of natural observations into model variables are algorithmic and easily carried out by a computer, thus enabling the future development of a dedicated software implementation. Such tools would empower the modeler to consider significantly more complex models than practical limitations might have otherwise proscribed, since the modeling framework itself manages that complexity on the modeler's behalf. In this report, we describe the chief motivations for OFFL, carefully outline its implementation, and utilize a range of classic examples from ecology and epidemiology to showcase its features.
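
    The translation rule can be sketched directly: each flow draws a parcel from its source species and deposits it into its target, and the net rate of change of each species is the sum of its inflows minus its outflows. A minimal SIR-style example with illustrative rates (this is a sketch of the schema's spirit, not the authors' implementation):

```python
# Species and formalized flows: (source, target, rate function of state).
N = 1000.0
state = {"S": 990.0, "I": 10.0, "R": 0.0}
flows = [
    ("S", "I", lambda y: 0.3 * y["S"] * y["I"] / N),   # infection
    ("I", "R", lambda y: 0.1 * y["I"]),                # recovery
]

def derivatives(y):
    """Apply the single translation rule: net rate = inflows - outflows."""
    d = {k: 0.0 for k in y}
    for src, tgt, rate in flows:
        f = rate(y)
        d[src] -= f
        d[tgt] += f
    return d

dt = 0.1
for _ in range(2000):                     # forward Euler to t = 200
    d = derivatives(state)
    state = {k: v + dt * d[k] for k, v in state.items()}
```

    Because every flow is conservative by construction, the total population is preserved automatically, one of the bookkeeping guarantees the formalized diagram provides.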

  18. OFFl Models: Novel Schema for Dynamical Modeling of Biological Systems.

    Directory of Open Access Journals (Sweden)

    C Brandon Ogbunugafor

    Full Text Available Flow diagrams are a common tool used to help build and interpret models of dynamical systems, often in biological contexts such as consumer-resource models and similar compartmental models. Typically, their usage is intuitive and informal. Here, we present a formalized version of flow diagrams as a kind of weighted directed graph which follow a strict grammar, which translate into a system of ordinary differential equations (ODEs) by a single unambiguous rule, and which have an equivalent representation as a relational database. (We abbreviate this schema of "ODEs and formalized flow diagrams" as OFFL.) Drawing a diagram within this strict grammar encourages a mental discipline on the part of the modeler in which all dynamical processes of a system are thought of as interactions between dynamical species that draw parcels from one or more source species and deposit them into target species according to a set of transformation rules. From these rules, the net rate of change for each species can be derived. The modeling schema can therefore be understood as both an epistemic and practical heuristic for modeling, serving both as an organizational framework for the model building process and as a mechanism for deriving ODEs. All steps of the schema beyond the initial scientific (intuitive, creative) abstraction of natural observations into model variables are algorithmic and easily carried out by a computer, thus enabling the future development of a dedicated software implementation. Such tools would empower the modeler to consider significantly more complex models than practical limitations might have otherwise proscribed, since the modeling framework itself manages that complexity on the modeler's behalf. In this report, we describe the chief motivations for OFFL, carefully outline its implementation, and utilize a range of classic examples from ecology and epidemiology to showcase its features.

  19. Protocol for an HTA report: Does therapeutic writing help people with long-term conditions? Systematic review, realist synthesis and economic modelling.

    Science.gov (United States)

    Meads, C; Nyssen, O P; Wong, G; Steed, L; Bourke, L; Ross, C A; Hayman, S; Field, V; Lord, J; Greenhalgh, T; Taylor, S J C

    2014-02-18

    Long-term medical conditions (LTCs) cause reduced health-related quality of life and considerable health service expenditure. Writing therapy has potential to improve physical and mental health in people with LTCs, but its effectiveness is not established. This project aims to establish the clinical and cost-effectiveness of therapeutic writing in LTCs by systematic review and economic evaluation, and to evaluate context and mechanisms by which it might work, through realist synthesis. Included are any comparative study of therapeutic writing compared with no writing, waiting list, attention control or placebo writing in patients with any diagnosed LTCs that report at least one of the following: relevant clinical outcomes; quality of life; health service use; psychological, behavioural or social functioning; adherence or adverse events. Searches will be conducted in the main medical databases including MEDLINE, EMBASE, PsycINFO, The Cochrane Library and Science Citation Index. For the realist review, further purposive and iterative searches through snowballing techniques will be undertaken. Inclusions, data extraction and quality assessment will be in duplicate with disagreements resolved through discussion. Quality assessment will include using Grading of Recommendations Assessment, Development and Evaluation (GRADE) criteria. Data synthesis will be narrative and tabular with meta-analysis where appropriate. De novo economic modelling will be attempted in one clinical area if sufficient evidence is available and performed according to the National Institute for Health and Care Excellence (NICE) reference case.

  20. Protocol for an HTA report: Does therapeutic writing help people with long-term conditions? Systematic review, realist synthesis and economic modelling

    Science.gov (United States)

    Meads, C; Nyssen, O P; Wong, G; Steed, L; Bourke, L; Ross, C A; Hayman, S; Field, V; Lord, J; Greenhalgh, T; Taylor, S J C

    2014-01-01

    Introduction Long-term medical conditions (LTCs) cause reduced health-related quality of life and considerable health service expenditure. Writing therapy has potential to improve physical and mental health in people with LTCs, but its effectiveness is not established. This project aims to establish the clinical and cost-effectiveness of therapeutic writing in LTCs by systematic review and economic evaluation, and to evaluate context and mechanisms by which it might work, through realist synthesis. Methods Included are any comparative study of therapeutic writing compared with no writing, waiting list, attention control or placebo writing in patients with any diagnosed LTCs that report at least one of the following: relevant clinical outcomes; quality of life; health service use; psychological, behavioural or social functioning; adherence or adverse events. Searches will be conducted in the main medical databases including MEDLINE, EMBASE, PsycINFO, The Cochrane Library and Science Citation Index. For the realist review, further purposive and iterative searches through snowballing techniques will be undertaken. Inclusions, data extraction and quality assessment will be in duplicate with disagreements resolved through discussion. Quality assessment will include using Grading of Recommendations Assessment, Development and Evaluation (GRADE) criteria. Data synthesis will be narrative and tabular with meta-analysis where appropriate. De novo economic modelling will be attempted in one clinical area if sufficient evidence is available and performed according to the National Institute for Health and Care Excellence (NICE) reference case. PMID:24549165

  1. Evaluation of a micro-scale wind model's performance over realistic building clusters using wind tunnel experiments

    Science.gov (United States)

    Zhang, Ning; Du, Yunsong; Miao, Shiguang; Fang, Xiaoyi

    2016-08-01

    The simulation performance over complex building clusters of a wind simulation model (Wind Information Field Fast Analysis model, WIFFA) in a micro-scale air pollutant dispersion model system (Urban Microscale Air Pollution dispersion Simulation model, UMAPS) is evaluated using various wind tunnel experimental data, including the CEDVAL (Compilation of Experimental Data for Validation of Micro-Scale Dispersion Models) wind tunnel experiment data and the NJU-FZ experiment data (Nanjing University-Fang Zhuang neighborhood wind tunnel experiment data). The results show that the wind model reproduces well the vortexes triggered by urban buildings, and that the flow patterns in urban street canyons and building clusters are also well represented. Due to the complex shapes of buildings and their distributions, discrepancies between the simulations and the measurements are usually caused by the simplification of the building shapes and the determination of the key zone sizes. The computational efficiencies of different cases are also discussed in this paper. The model has a high computational efficiency compared to traditional numerical models that solve the Navier-Stokes equations, and can produce very high-resolution (1-5 m) wind fields for a complex neighborhood-scale urban building canopy (~1 km × 1 km) in less than 3 min when run on a personal computer.

  2. Inner-shell corrections to the Bethe stopping-power formula evaluated from a realistic atomic model

    International Nuclear Information System (INIS)

    Inokuti, M.; Manson, S.T.

    1985-01-01

    Generalized oscillator strengths for K- and L-shell ionization have been calculated using a central potential derived from the Hartree-Slater model. In cases in which the ejected electron carries low kinetic energy, sizable differences from hydrogenic-model calculations are evident

  3. Solar system tests for realistic f(T) models with non-minimal torsion-matter coupling

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Rui-Hui; Zhai, Xiang-Hua; Li, Xin-Zhou [Shanghai Normal University, Shanghai United Center for Astrophysics (SUCA), Shanghai (China)

    2017-08-15

    In a previous paper, we constructed two f(T) models with a non-minimal torsion-matter coupling extension, which are successful in describing the evolution history of the Universe, including the radiation-dominated era, the matter-dominated era, and the present accelerating expansion. Meanwhile, the significant advantage of these models is that they can avoid the cosmological constant problem of ΛCDM. However, the non-minimal coupling between matter and torsion will affect the Solar system tests. In this paper, we study the Solar system effects in these models, including gravitational redshift, the geodetic effect and perihelion precession. We find that Model I can pass all three of the Solar system tests. For Model II, the parameter is constrained by the uncertainties of the planets' estimated perihelion precessions. (orig.)

  4. Empirical assessment of the validity limits of the surface wave full ray theory using realistic 3-D Earth models

    KAUST Repository

    Parisi, Laura

    2016-02-10

    The surface wave full ray theory (FRT) is an efficient tool to calculate synthetic waveforms of surface waves. It combines the concept of local modes with exact ray tracing as a function of frequency, providing a more complete description of surface wave propagation than the widely used great circle approximation (GCA). The purpose of this study is to evaluate the ability of the FRT approach to model teleseismic long-period surface waveforms (T ∼ 45–150 s) in the context of current 3-D Earth models to empirically assess its validity domain and its scope for future studies in seismic tomography. To achieve this goal, we compute vertical and horizontal component fundamental mode synthetic Rayleigh waveforms using the FRT, which are compared with calculations using the highly accurate spectral element method. We use 13 global earth models including 3-D crustal and mantle structure, which are derived by successively varying the strength and lengthscale of heterogeneity in current tomographic models. For completeness, GCA waveforms are also compared with the spectral element method. We find that the FRT accurately predicts the phase and amplitude of long-period Rayleigh waves (T ∼ 45–150 s) for almost all the models considered, with errors in the modelling of the phase (amplitude) of Rayleigh waves being smaller than 5 per cent (10 per cent) in most cases. The largest errors in phase and amplitude are observed for T ∼ 45 s and for the three roughest earth models considered that exhibit shear wave anomalies of up to ∼20 per cent, which is much larger than in current global tomographic models. In addition, we find that overall the GCA does not predict Rayleigh wave amplitudes well, except for the longest wave periods (T ∼ 150 s) and the smoothest models considered. Although the GCA accurately predicts Rayleigh wave phase for current earth models such as S20RTS and S40RTS, FRT's phase errors are smaller, notably for the shortest wave periods considered (T

  5. Preservice Biology Teachers' Conceptions about the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-01-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories…

  6. Biochemical Space: A Framework for Systemic Annotation of Biological Models

    Czech Academy of Sciences Publication Activity Database

    Klement, M.; Děd, T.; Šafránek, D.; Červený, Jan; Müller, Stefan; Steuer, Ralf

    2014-01-01

    Roč. 306, JUL (2014), s. 31-44 ISSN 1571-0661 R&D Projects: GA MŠk(CZ) EE2.3.20.0256 Institutional support: RVO:67179843 Keywords : biological models * model annotation * systems biology * cyanobacteria Subject RIV: EH - Ecology, Behaviour

  7. Review of "Stochastic Modelling for Systems Biology" by Darren Wilkinson

    Directory of Open Access Journals (Sweden)

    Bullinger Eric

    2006-12-01

    Full Text Available Abstract "Stochastic Modelling for Systems Biology" by Darren Wilkinson introduces the peculiarities of stochastic modelling in biology. The book is particularly well suited as a textbook or for self-study, especially for readers with a theoretical background.

  8. Coupling a Mesoscale Numerical Weather Prediction Model with Large-Eddy Simulation for Realistic Wind Plant Aerodynamics Simulations (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C.; Churchfield, M.; Mirocha, J.; Lee, S.; Lundquist, J.; Michalakes, J.; Moriarty, P.; Purkayastha, A.; Sprague, M.; Vanderwende, B.

    2014-06-01

    Wind plant aerodynamics are influenced by a combination of microscale and mesoscale phenomena. Incorporating mesoscale atmospheric forcing (e.g., diurnal cycles and frontal passages) into wind plant simulations can lead to a more accurate representation of microscale flows, aerodynamics, and wind turbine/plant performance. Our goal is to couple a numerical weather prediction model that can represent mesoscale flow [specifically the Weather Research and Forecasting model] with a microscale LES model (OpenFOAM) that can predict microscale turbulence and wake losses.

  9. Kuhn: Realist or Antirealist?

    Directory of Open Access Journals (Sweden)

    Michel Ghins

    1998-06-01

    Full Text Available Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn and its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.

  10. CONSTRAINING MODELS OF TWIN-PEAK QUASI-PERIODIC OSCILLATIONS WITH REALISTIC NEUTRON STAR EQUATIONS OF STATE

    Energy Technology Data Exchange (ETDEWEB)

    Török, Gabriel; Goluchová, Katerina; Urbanec, Martin, E-mail: gabriel.torok@gmail.com, E-mail: katka.g@seznam.cz, E-mail: martin.urbanec@physics.cz [Research Centre for Computational Physics and Data Processing, Institute of Physics, Faculty of Philosophy and Science, Silesian University in Opava, Bezručovo nám. 13, CZ-746, 01 Opava (Czech Republic); and others

    2016-12-20

    Twin-peak quasi-periodic oscillations (QPOs) are observed in the X-ray power-density spectra of several accreting low-mass neutron star (NS) binaries. In our previous work we have considered several QPO models. We have identified and explored mass–angular-momentum relations implied by individual QPO models for the atoll source 4U 1636-53. In this paper we extend our study and confront QPO models with various NS equations of state (EoS). We start with simplified calculations assuming Kerr background geometry and then present results of detailed calculations considering the influence of NS quadrupole moment (related to rotationally induced NS oblateness) assuming Hartle–Thorne spacetimes. We show that the application of concrete EoS together with a particular QPO model yields a specific mass–angular-momentum relation. However, we demonstrate that the degeneracy in mass and angular momentum can be removed when the NS spin frequency inferred from the X-ray burst observations is considered. We inspect a large set of EoS and discuss their compatibility with the considered QPO models. We conclude that when the NS spin frequency in 4U 1636-53 is close to 580 Hz, we can exclude 51 of the 90 considered combinations of EoS and QPO models. We also discuss additional restrictions that may exclude even more combinations. Namely, 13 EoS are compatible with the observed twin-peak QPOs and the relativistic precession model. However, when considering the low-frequency QPOs and Lense–Thirring precession, only 5 EoS are compatible with the model.
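
    The relativistic precession model mentioned above identifies the upper and lower twin-peak QPO frequencies with the Keplerian orbital frequency ν_φ and the periastron-precession frequency ν_φ − ν_r of a circular equatorial geodesic. As a minimal sketch (not the authors' code), the standard Kerr-geometry formulas can be written as follows; the mass, spin, and radius in the usage line are illustrative values only:

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8      # speed of light, m/s
MSUN = 1.989e30  # solar mass, kg

def orbital_frequencies(mass_msun, a, x):
    """Keplerian and radial epicyclic frequencies (Hz) for a circular
    equatorial orbit in Kerr geometry, with dimensionless radius
    x = r c^2 / (G M) and dimensionless spin a = J c / (G M^2)."""
    m = mass_msun * MSUN
    # Keplerian (azimuthal) frequency
    nu_phi = C**3 / (2.0 * math.pi * G * m * (x**1.5 + a))
    # Radial epicyclic frequency; vanishes at the ISCO (x = 6 for a = 0)
    factor = 1.0 - 6.0 / x + 8.0 * a * x**-1.5 - 3.0 * a**2 / x**2
    nu_r = nu_phi * math.sqrt(max(factor, 0.0))
    return nu_phi, nu_r

# Relativistic precession model: upper QPO = nu_phi, lower = nu_phi - nu_r
nu_u, nu_r = orbital_frequencies(1.8, 0.0, 6.0)  # Schwarzschild ISCO
```

    Confronting such frequency pairs with observed twin peaks is what yields the mass–angular-momentum relations discussed in the abstract.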

  11. Modelling hen harrier dynamics to inform human-wildlife conflict resolution: a spatially-realistic, individual-based approach.

    Directory of Open Access Journals (Sweden)

    Johannes P M Heinonen

    Full Text Available Individual-based models have gained popularity in ecology, and enable simultaneous incorporation of spatial explicitness and population dynamic processes to understand spatio-temporal patterns of populations. We introduce an individual-based model for understanding and predicting spatial hen harrier (Circus cyaneus) population dynamics in Great Britain. The model uses a landscape with habitat, prey and game management indices. The hen harrier population was initialised according to empirical census estimates for 1988/89 and simulated until 2030, and predictions for 1998, 2004 and 2010 were compared to empirical census estimates for respective years. The model produced a good qualitative match to overall trends between 1989 and 2010. Parameter explorations revealed relatively high elasticity in particular to demographic parameters such as juvenile male mortality. This highlights the need for robust parameter estimates from empirical research. There are clearly challenges for replication of real-world population trends, but this model provides a useful tool for increasing understanding of drivers of hen harrier dynamics and focusing research efforts in order to inform conflict management decisions.

  12. Modelling hen harrier dynamics to inform human-wildlife conflict resolution: a spatially-realistic, individual-based approach.

    Science.gov (United States)

    Heinonen, Johannes P M; Palmer, Stephen C F; Redpath, Steve M; Travis, Justin M J

    2014-01-01

    Individual-based models have gained popularity in ecology, and enable simultaneous incorporation of spatial explicitness and population dynamic processes to understand spatio-temporal patterns of populations. We introduce an individual-based model for understanding and predicting spatial hen harrier (Circus cyaneus) population dynamics in Great Britain. The model uses a landscape with habitat, prey and game management indices. The hen harrier population was initialised according to empirical census estimates for 1988/89 and simulated until 2030, and predictions for 1998, 2004 and 2010 were compared to empirical census estimates for respective years. The model produced a good qualitative match to overall trends between 1989 and 2010. Parameter explorations revealed relatively high elasticity in particular to demographic parameters such as juvenile male mortality. This highlights the need for robust parameter estimates from empirical research. There are clearly challenges for replication of real-world population trends, but this model provides a useful tool for increasing understanding of drivers of hen harrier dynamics and focusing research efforts in order to inform conflict management decisions.

  13. Local air gap thickness and contact area models for realistic simulation of human thermo-physiological response

    Science.gov (United States)

    Psikuta, Agnes; Mert, Emel; Annaheim, Simon; Rossi, René M.

    2018-02-01

    To evaluate the quality of new energy-saving and performance-supporting building and urban settings, thermal sensation and comfort models are often used. The accuracy of these models is related to accurate prediction of the human thermo-physiological response that, in turn, is highly sensitive to the local effect of clothing. This study aimed at the development of an empirical regression model of the air gap thickness and the contact area in clothing to accurately simulate human thermal and perceptual response. The statistical model reliably predicted both parameters for 14 body regions based on the clothing ease allowances. The effect of the standard error in air gap prediction on the thermo-physiological response was lower than the differences between healthy humans. It was demonstrated that currently used assumptions and methods for determination of the air gap thickness can produce a substantial error for all global, mean, and local physiological parameters, and hence, lead to false estimation of the resultant physiological state of the human body, thermal sensation, and comfort. Thus, this model may help researchers to strive for improvement of human thermal comfort, health, productivity, safety, and overall sense of well-being with simultaneous reduction of energy consumption and costs in the built environment.

  14. FDTD Modeling of LEMP Propagation in the Earth-Ionosphere Waveguide With Emphasis on Realistic Representation of Lightning Source

    Science.gov (United States)

    Tran, Thang H.; Baba, Yoshihiro; Somu, Vijaya B.; Rakov, Vladimir A.

    2017-12-01

    The finite difference time domain (FDTD) method in the 2-D cylindrical coordinate system was used to compute the nearly full-frequency-bandwidth vertical electric field and azimuthal magnetic field waveforms produced on the ground surface by lightning return strokes. The lightning source was represented by the modified transmission-line model with linear current decay with height, which was implemented in the FDTD computations as an appropriate vertical phased-current-source array. The conductivity of the atmosphere was assumed to increase exponentially with height, with different conductivity profiles being used for daytime and nighttime conditions. The fields were computed at distances ranging from 50 to 500 km. Sky waves (reflections from the ionosphere) were identified in computed waveforms and used for estimation of apparent ionospheric reflection heights. It was found that our model reproduces reasonably well the daytime electric field waveforms measured at different distances and simulated (using a more sophisticated propagation model) by Qin et al. (2017). Sensitivity of model predictions to changes in the parameters of atmospheric conductivity profile, as well as influences of the lightning source characteristics (current waveshape parameters, return-stroke speed, and channel length) and ground conductivity were examined.
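
    The modified transmission-line model with linear current decay (MTLL) used as the lightning source can be sketched as follows: the channel-base current propagates upward at the return-stroke speed while its amplitude decreases linearly with height. The channel height, speed, and the double-exponential channel-base current below are illustrative assumptions, not the parameter values used in the paper:

```python
import math

H = 7500.0   # channel height (m), assumed for illustration
V = 1.5e8    # return-stroke speed (m/s), assumed for illustration

def base_current(t):
    """Channel-base current (A): a simple double-exponential stand-in
    for the waveshapes used in actual return-stroke studies."""
    if t < 0.0:
        return 0.0
    return 1.1e4 * (math.exp(-t / 70e-6) - math.exp(-t / 0.6e-6))

def mtll_current(z, t):
    """MTLL model: I(z, t) = (1 - z/H) * I(0, t - z/V).
    The current is zero ahead of the upward-moving return-stroke front
    and decays linearly to zero at the channel top."""
    if t < z / V or z > H:
        return 0.0
    return (1.0 - z / H) * base_current(t - z / V)
```

    In an FDTD grid this distribution is realized as the vertical phased-current-source array mentioned in the abstract: each cell at height z carries a source delayed by z/V and scaled by (1 − z/H).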

  15. Towards a Cognitively Realistic Computational Model of Team Problem Solving Using ACT-R Agents and the ELICIT Experimentation Framework

    Science.gov (United States)

    2014-06-01

    reasons for selecting ACT-R, in this respect, relate to its widespread use in psychological modeling. As mentioned above, ACT-R has a long history of use...cognitive architecture," in 6th International Conference on Advanced Cognitive Technologies and Applications (COGNITIVE’14), Venice, Italy, 2014. [44

  16. The Effects of Realistic Geological Heterogeneity on Seismic Modeling: Applications in Shear Wave Generation and Near-Surface Tunnel Detection

    Science.gov (United States)

    Sherman, Christopher Scott

    Naturally occurring geologic heterogeneity is an important, but often overlooked, aspect of seismic wave propagation. This dissertation presents a strategy for modeling the effects of heterogeneity using a combination of geostatistics and Finite Difference simulation. In the first chapter, I discuss my motivations for studying geologic heterogeneity and seismic wave propagation. Models based upon fractal statistics are powerful tools in geophysics for modeling heterogeneity. The important features of these fractal models are illustrated using borehole log data from an oil well and geomorphological observations from a site in Death Valley, California. A large part of the computational work presented in this dissertation was completed using the Finite Difference code E3D. I discuss the Python-based user interface for E3D and the computational strategies for working with heterogeneous models developed over the course of this research. The second chapter explores a phenomenon observed for wave propagation in heterogeneous media - the generation of unexpected shear wave phases in the near-source region. In spite of their popularity amongst seismic researchers, approximate methods for modeling wave propagation in these media, such as the Born and Rytov methods or Radiative Transfer Theory, are incapable of explaining these shear waves. This is primarily due to these methods' assumptions regarding the coupling of near-source terms with the heterogeneities and mode conversion. To determine the source of these shear waves, I generate a suite of 3D synthetic heterogeneous fractal geologic models and use E3D to simulate the wave propagation for a vertical point force on the surface of the models. I also present a methodology for calculating the effective source radiation patterns from the models. The numerical results show that, due to a combination of mode conversion and coupling with near-source heterogeneity, shear wave energy on the order of 10% of the
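
    Heterogeneity models with fractal statistics, as described above, are commonly built by spectral synthesis: summing Fourier modes whose amplitudes follow a power-law spectrum and whose phases are random. A minimal 1-D sketch (not the E3D workflow itself; the exponent values are illustrative, with a smaller spectral exponent giving a rougher medium):

```python
import math
import random

def fractal_field(n, beta, seed=0):
    """1-D random field with power-law spectrum P(k) ~ k^(-beta),
    built by summing Fourier modes with random phases."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n // 2)]
    field = []
    for i in range(n):
        x = i / n
        # amplitude of mode k is sqrt(P(k)) = k^(-beta/2)
        s = sum(k ** (-beta / 2.0) * math.cos(2.0 * math.pi * k * x + phi)
                for k, phi in enumerate(phases, start=1))
        field.append(s)
    return field

rough = fractal_field(128, beta=1.0)   # slowly decaying spectrum: rough medium
smooth = fractal_field(128, beta=3.0)  # rapidly decaying spectrum: smooth medium
```

    In 3-D the same idea applies mode by mode on a Fourier grid; the exponent beta controls the fractal dimension, which is the knob varied when generating suites of synthetic geologic models.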

  17. Accurate hardening modeling as basis for the realistic simulation of sheet forming processes with complex strain-path changes

    International Nuclear Information System (INIS)

    Levkovitch, Vladislav; Svendsen, Bob

    2007-01-01

    Sheet metal forming involves large strains and severe strain-path changes. In many metals, large plastic strains lead to the development of persistent dislocation structures, resulting in strong flow anisotropy. This induced anisotropic behavior manifests itself after a strain-path change through very different stress-strain responses, depending on the type of change. While many metals exhibit a drop of the yield stress after a load reversal (the Bauschinger effect), some metals show an increase of the yield stress after an orthogonal strain-path change (so-called cross hardening). Kinematic hardening has been used successfully for years to model the Bauschinger effect. However, kinematic hardening also automatically produces a drop of the yield stress after an orthogonal strain-path change, contradicting tests that exhibit the cross-hardening effect. Another effect not accounted for in classical elasto-plasticity is the difference between tensile and compressive strength exhibited, e.g., by some steel materials. In this work we present a phenomenological material model whose structure is motivated by polycrystalline modeling that takes into account the evolution of polarized dislocation structures on the grain level - the main cause of the induced flow anisotropy on the macroscopic level. In addition to the movement of the yield surface and its proportional expansion, as in conventional plasticity, the model considers changes of the yield surface shape (distortional hardening) and accounts for the pressure dependence of the flow stress. All these additional attributes turn out to be essential for modeling the stress-strain response of dual-phase high-strength steels subjected to non-proportional loading

  18. Accurate Hardening Modeling As Basis For The Realistic Simulation Of Sheet Forming Processes With Complex Strain-Path Changes

    International Nuclear Information System (INIS)

    Levkovitch, Vladislav; Svendsen, Bob

    2007-01-01

    Sheet metal forming involves large strains and severe strain-path changes. In many metals, large plastic strains lead to the development of persistent dislocation structures, resulting in strong flow anisotropy. This induced anisotropic behavior manifests itself after a strain-path change through very different stress-strain responses, depending on the type of change. While many metals exhibit a drop of the yield stress after a load reversal (the Bauschinger effect), some metals show an increase of the yield stress after an orthogonal strain-path change (so-called cross hardening). Kinematic hardening has been used successfully for years to model the Bauschinger effect. However, kinematic hardening also automatically produces a drop of the yield stress after an orthogonal strain-path change, contradicting tests that exhibit the cross-hardening effect. Another effect not accounted for in classical elasto-plasticity is the difference between tensile and compressive strength exhibited, e.g., by some steel materials. In this work we present a phenomenological material model whose structure is motivated by polycrystalline modeling that takes into account the evolution of polarized dislocation structures on the grain level - the main cause of the induced flow anisotropy on the macroscopic level. In addition to the movement of the yield surface and its proportional expansion, as in conventional plasticity, the model considers changes of the yield surface shape (distortional hardening) and accounts for the pressure dependence of the flow stress. All these additional attributes turn out to be essential for modeling the stress-strain response of dual-phase high-strength steels subjected to non-proportional loading

  19. Application of realistic (best- estimate) methodologies for large break loss of coolant (LOCA) safety analysis: licensing of Westinghouse ASTRUM evaluation model in Spain

    International Nuclear Information System (INIS)

    Lage, Carlos; Frepoli, Cesare

    2010-01-01

    When the LOCA Final Acceptance Criteria for Light Water Reactors was issued in Appendix K of 10 CFR 50, both the USNRC and the industry recognized that the rule was highly conservative. At that time, however, the degree of conservatism in the analysis could not be quantified. As a result, the USNRC began a research program to identify the degree of conservatism in those models permitted in the Appendix K rule and to develop improved thermal-hydraulic computer codes so that realistic accident analysis calculations could be performed. The overall results of this research program quantified the conservatism in the Appendix K rule and confirmed that some relaxation of the rule can be made without a loss in safety to the public. Also, from a risk-informed perspective it is recognized that conservatism is not always a complete defense for lack of sophistication in models. In 1988, as a result of the improved understanding of LOCA phenomena, the USNRC staff amended the requirements of 10 CFR 50.46 and Appendix K, 'ECCS Evaluation Models', so that a realistic evaluation model may be used to analyze the performance of the ECCS during a hypothetical LOCA. Under the amended rules, best-estimate plus uncertainty (BEPU) thermal-hydraulic analysis may be used in place of the overly prescriptive set of models mandated by the Appendix K rule. Further guidance for the use of best-estimate codes was provided in Regulatory Guide 1.157. To demonstrate use of the revised ECCS rule, the USNRC and its consultants developed a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology as an approach for defining and qualifying a best-estimate thermal-hydraulic code and quantifying the uncertainties in a LOCA analysis. More recently the CSAU principles have been generalized in the Evaluation Model Development and Assessment Process (EMDAP) of Regulatory Guide 1.203. ASTRUM is the Westinghouse Best Estimate Large Break LOCA evaluation model applicable to two-, three

  20. Towards realistic flow modelling. Creation and evaluation of two-dimensional simulated porous media: An image analysis approach

    Science.gov (United States)

    Anguy, Yannick; Bernard, Dominique; Ehrlich, Robert

    1996-05-01

    This work is part of an attempt to quantify the relationship between the permeability tensor (K) and the micro-structure of natural porous media. A brief account is first provided of popular theories used to relate the micro-structure to K, and reasons for the lack of predictive power and restricted generality of current models are discussed. An alternative is an empirically based implicit model wherein K is expressed as a consequence of a few "pore-types" arising from the dynamics of depositional processes. The analytical form of that implicit model arises from evidence of a universal association between pore type and throat size in sandstones and carbonates. An explicit model, relying on the local change-of-scale technique, is then addressed. That explicit model allows K to be calculated explicitly from knowledge of the three-dimensional micro-geometry, without recourse to any constitutive assumptions; its predictive and general character is underlined. The relevance of the change-of-scale technique is contingent on the availability of rock-like three-dimensional synthetic media. A stationary ergodic random process is developed that allows us to generate three-dimensional synthetic media from a two-dimensional autocorrelation function r(λ_x, λ_y) and an associated probability density function ε_β measured on a single binary image. The focus of this work is to ensure the rock-like character of those synthetic media. This is done first through a direct approach: n two-dimensional synthetic media derived from a single set (ε_β, r(λ_x, λ_y)) yield n permeability tensors K_i (i = 1, …, n), calculated by the local change of scale, of the same order. This is a necessary condition to ensure that r(λ_x, λ_y) and ε_β carry all structural information relevant to K. The limits of this direct approach, in terms of the required central processing unit (CPU) time and memory, are underlined, raising the need for an alternative. This is done by
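
    The autocorrelation measured on a binary image can be illustrated with the two-point probability S2(λ): the probability that two points at a given separation both lie in the pore phase (the normalized autocorrelation then follows as r(λ) = (S2(λ) − ε²)/(ε − ε²), with ε the porosity). A toy sketch on a synthetic striped image, not the authors' image-analysis pipeline:

```python
def two_point_probability(img, dx, dy):
    """S2(dx, dy): probability that two points separated by (dx, dy)
    both fall in the pore phase (value 1) of a binary image,
    using periodic wrap-around at the image edges."""
    ny, nx = len(img), len(img[0])
    hits = 0
    for y in range(ny):
        for x in range(nx):
            if img[y][x] == 1 and img[(y + dy) % ny][(x + dx) % nx] == 1:
                hits += 1
    return hits / (nx * ny)

# toy binary "image": vertical pore stripes of width 2 with period 4
img = [[1 if x % 4 < 2 else 0 for x in range(8)] for y in range(8)]
porosity = two_point_probability(img, 0, 0)  # S2(0, 0) equals the porosity
```

    Evaluating S2 over a grid of separations (dx, dy) yields the two-dimensional autocorrelation function from which the synthetic media described above are generated.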

  1. Dipole estimation errors due to not incorporating anisotropic conductivities in realistic head models for EEG source analysis

    Science.gov (United States)

    Hallez, Hans; Staelens, Steven; Lemahieu, Ignace

    2009-10-01

    EEG source analysis is a valuable tool for brain functionality research and for diagnosing neurological disorders, such as epilepsy. It requires a geometrical representation of the human head or a head model, which is often modeled as an isotropic conductor. However, it is known that some brain tissues, such as the skull or white matter, have an anisotropic conductivity. Many studies reported that the anisotropic conductivities have an influence on the calculated electrode potentials. However, few studies have assessed the influence of anisotropic conductivities on the dipole estimations. In this study, we want to determine the dipole estimation errors due to not taking into account the anisotropic conductivities of the skull and/or brain tissues. Therefore, head models are constructed with the same geometry, but with an anisotropically conducting skull and/or brain tissue compartment. These head models are used in simulation studies where the dipole location and orientation error is calculated due to neglecting anisotropic conductivities of the skull and brain tissue. Results show that not taking into account the anisotropic conductivities of the skull yields a dipole location error between 2 and 25 mm, with an average of 10 mm. When the anisotropic conductivities of the brain tissues are neglected, the dipole location error ranges between 0 and 5 mm. In this case, the average dipole location error was 2.3 mm. In all simulations, the dipole orientation error was smaller than 10°. We can conclude that the anisotropic conductivities of the skull have to be incorporated to improve the accuracy of EEG source analysis. The results of the simulation, as presented here, also suggest that incorporation of the anisotropic conductivities of brain tissues is not necessary. However, more studies are needed to confirm these suggestions.
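
    To see why the assumed conductivities matter for the computed electrode potentials, consider the textbook expression for a current dipole in an infinite homogeneous conductor, V = p·r / (4πσ|r|³). This idealization ignores the skull, the head geometry, and anisotropy entirely; it is only meant to show the direct dependence of potentials on the conductivity σ (the numbers in the usage line are illustrative):

```python
import math

def dipole_potential(p, r, sigma):
    """Potential (V) of a current dipole p (A·m) at displacement r (m)
    in an infinite homogeneous conductor of conductivity sigma (S/m):
    V = (p · r) / (4 * pi * sigma * |r|^3)."""
    px, py, pz = p
    x, y, z = r
    dist = math.sqrt(x * x + y * y + z * z)
    return (px * x + py * y + pz * z) / (4.0 * math.pi * sigma * dist ** 3)

# illustrative: 10 nA·m dipole, 5 cm away along its axis, sigma ~ brain tissue
v_brain = dipole_potential((0.0, 0.0, 1e-8), (0.0, 0.0, 0.05), 0.33)
```

    Since the potential scales as 1/σ (and, in anisotropic tissue, differently along different directions), mis-modeling the conductivity distorts the forward solution and hence biases the inverse dipole estimate, which is the effect quantified in the study above.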

  2. Dipole estimation errors due to not incorporating anisotropic conductivities in realistic head models for EEG source analysis

    International Nuclear Information System (INIS)

    Hallez, Hans; Staelens, Steven; Lemahieu, Ignace

    2009-01-01

    EEG source analysis is a valuable tool for brain functionality research and for diagnosing neurological disorders, such as epilepsy. It requires a geometrical representation of the human head or a head model, which is often modeled as an isotropic conductor. However, it is known that some brain tissues, such as the skull or white matter, have an anisotropic conductivity. Many studies reported that the anisotropic conductivities have an influence on the calculated electrode potentials. However, few studies have assessed the influence of anisotropic conductivities on the dipole estimations. In this study, we want to determine the dipole estimation errors due to not taking into account the anisotropic conductivities of the skull and/or brain tissues. Therefore, head models are constructed with the same geometry, but with an anisotropically conducting skull and/or brain tissue compartment. These head models are used in simulation studies where the dipole location and orientation error is calculated due to neglecting anisotropic conductivities of the skull and brain tissue. Results show that not taking into account the anisotropic conductivities of the skull yields a dipole location error between 2 and 25 mm, with an average of 10 mm. When the anisotropic conductivities of the brain tissues are neglected, the dipole location error ranges between 0 and 5 mm. In this case, the average dipole location error was 2.3 mm. In all simulations, the dipole orientation error was smaller than 10 deg. We can conclude that the anisotropic conductivities of the skull have to be incorporated to improve the accuracy of EEG source analysis. The results of the simulation, as presented here, also suggest that incorporation of the anisotropic conductivities of brain tissues is not necessary. However, more studies are needed to confirm these suggestions.

  3. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  4. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights into how robust biological responses are with respect to changes in biological parameters, and into which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
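    The local sensitivity analysis described in this record can be sketched as a finite-difference calculation on a toy ODE model. The logistic-growth model, the parameter values and the step size below are illustrative assumptions, not taken from the review:

```python
# Sketch: normalized local sensitivity of a toy systems biology ODE model
# (logistic growth; all parameter choices are illustrative).
import numpy as np
from scipy.integrate import solve_ivp

def logistic(t, x, r, K):
    return [r * x[0] * (1 - x[0] / K)]

def output(params, x0=0.1, t_end=10.0):
    """Model output: population size at t_end."""
    sol = solve_ivp(logistic, (0, t_end), [x0], args=tuple(params),
                    rtol=1e-10, atol=1e-12)
    return sol.y[0, -1]

def local_sensitivity(params, rel_step=1e-3):
    """Normalized sensitivity S_i = (p_i / y) * dy/dp_i via central differences."""
    base = output(params)
    sens = []
    for i, p in enumerate(params):
        h = rel_step * p
        up, down = list(params), list(params)
        up[i] += h
        down[i] -= h
        dy_dp = (output(up) - output(down)) / (2 * h)
        sens.append(p / base * dy_dp)
    return sens

S = local_sensitivity([0.5, 1.0])  # params: growth rate r, carrying capacity K
print(S)
```

Here the carrying capacity dominates the late-time output, which is exactly the kind of ranking of key model inputs the review discusses.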

  5. The realistic consideration of human factors in model based simulation tools for the air traffic control domain.

    Science.gov (United States)

    Duca, Gabriella; Attaianese, Erminia

    2012-01-01

    Advanced Air Traffic Management (ATM) concepts related to automation, airspace organization and operational procedures are driven by the overall goal of increasing ATM system performance. Independently of the nature and/or impact of the envisaged changes (e.g. from a short-term procedure adjustment to a very long-term operational concept or the completion of aid tools), the preliminary assessment of possible gains in airspace/airport capacity, safety and cost-effectiveness is done by running Model Based Simulations (MBSs, also known as Fast Time Simulations - FTS). Since MBS is not a human-in-the-loop technique, the reliability of its results depends on the accuracy and significance of the modeled human factors. Despite that, in practice modeling tools commonly assume a generalized standardization of human behaviors and tasks and consider only a narrow range of work-environment factors that, in reality, affect actual human-system performance. The present paper aims to open a discussion about the possibility of keeping task descriptions and their related weights at a high, general level, suitable for efficient use of MBSs, while at the same time increasing simulation reliability by adopting adjustments derived from further variables related to the human aspects of controller workload.

  6. Development of realistic thermal-hydraulic system analysis codes ; development of thermal hydraulic test requirements for multidimensional flow modeling

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Kune Yull; Yoon, Sang Hyuk; Noh, Sang Woo; Lee, Il Suk [Seoul National University, Seoul (Korea)

    2002-03-01

    This study is concerned with developing a multidimensional flow model required for the system analysis code MARS to more mechanistically simulate a variety of thermal hydraulic phenomena in the nuclear steam supply system. The capability of the MARS code as a thermal hydraulic analysis tool for optimized system design can be expanded by improving the current calculational methods and adding new models. In this study the relevant literature was surveyed on the multidimensional flow models that may potentially be applied to the multidimensional analysis code. Research items were critically reviewed and suggested to better predict the multidimensional thermal hydraulic behavior and to identify test requirements. A small-scale preliminary test was performed in the downcomer formed by two vertical plates to analyze multidimensional flow patterns in a simple geometry. The experimental result may be applied to the code for analysis of fluid impingement on the reactor downcomer wall. Also, data were collected to find out the controlling parameters for the one-dimensional and multidimensional flow behavior. 22 refs., 40 figs., 7 tabs. (Author)

  7. Does preliminary optimisation of an anatomically correct skull-brain model using simple simulants produce clinically realistic ballistic injury fracture patterns?

    Science.gov (United States)

    Mahoney, P F; Carr, D J; Delaney, R J; Hunt, N; Harrison, S; Breeze, J; Gibb, I

    2017-07-01

    Ballistic head injury remains a significant threat to military personnel. Studying such injuries requires a model that can be used with a military helmet. This paper describes further work on a skull-brain model using skulls made from three different polyurethane plastics and a series of skull 'fills' to simulate brain (3, 5, 7 and 10% gelatine by mass and PermaGel™). The models were subjected to ballistic impact from 7.62 × 39 mm mild steel core bullets. The first part of the work compares the different polyurethanes (mean bullet muzzle velocity of 708 m/s), and the second part compares the different fills (mean bullet muzzle velocity of 680 m/s). The impact events were filmed using high speed cameras. The resulting fracture patterns in the skulls were reviewed and scored by five clinicians experienced in assessing penetrating head injury. In over half of the models, one or more assessors felt aspects of the fracture pattern were close to real injury. Limitations of the model include the skull being manufactured in two parts and the lack of a realistic skin layer. Further work is ongoing to address these.

  8. Fresh tar (from biomass gasification) destruction with downstream catalysts: comparison of their intrinsic activity with a realistic kinetic model

    Energy Technology Data Exchange (ETDEWEB)

    Corella, J.; Narvaez, I.; Orio, A. [Complutense Univ. of Madrid (Spain). Dept. of Chemical Engineering

    1996-12-31

    A model for fresh tar destruction over catalysts placed downstream of a biomass gasifier is presented. It includes the stoichiometry and the calculation of the kinetic constants for the tar destruction. Catalysts studied include commercial Ni steam reforming catalysts and calcined dolomites. Kinetic constants for tar destruction are calculated for several particle sizes, times-on-stream and temperatures of the catalyst and equivalence ratios in the gasifier. Such intrinsic kinetic constants allow a rigorous or scientific comparison of solids and conditions to be used in an advanced gasification process. (orig.) 4 refs.

  9. Fresh tar (from biomass gasification) destruction with downstream catalysts: comparison of their intrinsic activity with a realistic kinetic model

    Energy Technology Data Exchange (ETDEWEB)

    Corella, J; Narvaez, I; Orio, A [Complutense Univ. of Madrid (Spain). Dept. of Chemical Engineering

    1997-12-31

    A model for fresh tar destruction over catalysts placed downstream of a biomass gasifier is presented. It includes the stoichiometry and the calculation of the kinetic constants for the tar destruction. Catalysts studied include commercial Ni steam reforming catalysts and calcined dolomites. Kinetic constants for tar destruction are calculated for several particle sizes, times-on-stream and temperatures of the catalyst and equivalence ratios in the gasifier. Such intrinsic kinetic constants allow a rigorous or scientific comparison of solids and conditions to be used in an advanced gasification process. (orig.) 4 refs.

  10. Dose conversion coefficients for monoenergetic electrons incident on a realistic human eye model with different lens cell populations.

    Science.gov (United States)

    Nogueira, P; Zankl, M; Schlattl, H; Vaz, P

    2011-11-07

    The radiation-induced posterior subcapsular cataract has long been generally accepted to be a deterministic effect that does not occur at doses below a threshold of at least 2 Gy. Recent epidemiological studies indicate that the threshold for cataract induction may be much lower or that there may be no threshold at all. A thorough study of this subject requires more accurate dose estimates for the eye lens than those available in ICRP Publication 74. Eye lens absorbed dose per unit fluence conversion coefficients for electron irradiation were calculated using a geometrical model of the eye that takes into account different cell populations of the lens epithelium, together with the MCNPX Monte Carlo radiation transport code package. For the cell population most sensitive to ionizing radiation, the germinative cells, absorbed dose per unit fluence conversion coefficients were determined that are up to a factor of 4.8 higher than the mean eye lens absorbed dose conversion coefficients for electron energies below 2 MeV. Comparison of the results with previously published values for a slightly different eye model showed generally good agreement for all electron energies. Finally, the influence of individual anatomical variability was quantified by positioning the lens at various depths below the cornea. A depth difference of 2 mm between the shallowest and the deepest location of the germinative zone can lead to a difference between the resulting absorbed doses of up to nearly a factor of 5000 for an electron energy of 0.7 MeV.

  11. A realistic bi-hemispheric model of the cerebellum uncovers the purpose of the abundant granule cells during motor control.

    Science.gov (United States)

    Pinzon-Morales, Ruben-Dario; Hirata, Yutaka

    2015-01-01

    The cerebellar granule cells (GCs) have been proposed to perform lossless, adaptive spatio-temporal coding of incoming sensory/motor information required by downstream cerebellar circuits to support motor learning, motor coordination, and cognition. Here we use a physio-anatomically inspired bi-hemispheric cerebellar neuronal network (biCNN) to selectively enable/disable the output of GCs and evaluate the behavioral and neural consequences during three different control scenarios. The control scenarios are a simple direct current motor (1 degree of freedom: DOF), an unstable two-wheel balancing robot (2 DOFs), and a simulation model of a quadcopter (6 DOFs). Results showed that adequate control was maintained with a relatively small number of GCs (< 200) in all the control scenarios. However, the minimum number of GCs required to successfully govern each control plant increased with their complexity (i.e., DOFs). It was also shown that increasing the number of GCs resulted in higher robustness against changes in the initialization parameters of the biCNN model (i.e., synaptic connections and synaptic weights). Therefore, we suggest that the abundant GCs in the cerebellar cortex provide the computational power during the large repertoire of motor activities and motor plants the cerebellum is involved with, and bring robustness against changes in the cerebellar microcircuit (e.g., neuronal connections).

  12. A realistic bi-hemispheric model of the cerebellum uncovers the purpose of the abundant granule cells during motor control

    Directory of Open Access Journals (Sweden)

    Ruben Dario Pinzon Morales

    2015-05-01

    Full Text Available The cerebellar granule cells (GCs) have been proposed to perform lossless, adaptive spatio-temporal coding of incoming sensory/motor information required by downstream cerebellar circuits to support motor learning, motor coordination, and cognition. Here we use a physio-anatomically inspired bi-hemispheric cerebellar neuronal network (biCNN) to selectively enable/disable the output of GCs and evaluate the behavioral and neural consequences during three different control scenarios. The control scenarios are a simple direct current motor (1 degree of freedom: DOF), an unstable two-wheel balancing robot (2 DOFs), and a simulation model of a quadcopter (6 DOFs). Results showed that adequate control was maintained with a relatively small number of GCs (< 200) in all the control scenarios. However, the minimum number of GCs required to successfully govern each control plant increased with their complexity (i.e., DOFs). It was also shown that increasing the number of GCs resulted in higher robustness against changes in the initialization parameters of the biCNN model (i.e., synaptic connections and synaptic weights). Therefore, we suggest that the abundant GCs in the cerebellar cortex provide the computational power during the large repertoire of motor activities and motor plants the cerebellum is involved with, and bring robustness against changes in the cerebellar microcircuit (e.g., neuronal connections).

  13. Dose conversion coefficients for monoenergetic electrons incident on a realistic human eye model with different lens cell populations

    International Nuclear Information System (INIS)

    Nogueira, P; Vaz, P; Zankl, M; Schlattl, H

    2011-01-01

    The radiation-induced posterior subcapsular cataract has long been generally accepted to be a deterministic effect that does not occur at doses below a threshold of at least 2 Gy. Recent epidemiological studies indicate that the threshold for cataract induction may be much lower or that there may be no threshold at all. A thorough study of this subject requires more accurate dose estimates for the eye lens than those available in ICRP Publication 74. Eye lens absorbed dose per unit fluence conversion coefficients for electron irradiation were calculated using a geometrical model of the eye that takes into account different cell populations of the lens epithelium, together with the MCNPX Monte Carlo radiation transport code package. For the cell population most sensitive to ionizing radiation, the germinative cells, absorbed dose per unit fluence conversion coefficients were determined that are up to a factor of 4.8 higher than the mean eye lens absorbed dose conversion coefficients for electron energies below 2 MeV. Comparison of the results with previously published values for a slightly different eye model showed generally good agreement for all electron energies. Finally, the influence of individual anatomical variability was quantified by positioning the lens at various depths below the cornea. A depth difference of 2 mm between the shallowest and the deepest location of the germinative zone can lead to a difference between the resulting absorbed doses of up to nearly a factor of 5000 for an electron energy of 0.7 MeV.

  14. Getting realistic; Endstation Demut

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2004-01-28

    The fuel cell hype of the turn of the millennium has reached its end. The industry is getting realistic. If at all, fuel cell systems for private single-family and multiple dwellings will not be available until the next decade. With a Europe-wide field test, Vaillant intends to advance the PEM technology. [German original, translated] The fuel cell hype of the turn of the millennium has evaporated. The industry is practicing modesty. Market readiness of the systems for single- and multi-family homes will, if at all, probably not be reached until the next decade. Vaillant intends to drive the development of PEM technology forward through a Europe-wide field test. (orig.)

  15. Learning (from) the errors of a systems biology model.

    Science.gov (United States)

    Engelhardt, Benjamin; Fröhlich, Holger; Kschischo, Maik

    2016-02-11

    Mathematical modelling is a labour-intensive process involving several iterations of testing on real data and manual model modifications. In biology, the domain knowledge guiding model development is in many cases itself incomplete and uncertain. A major problem in this context is that biological systems are open. Missed or unknown external influences as well as erroneous interactions in the model could thus lead to severely misleading results. Here we introduce the dynamic elastic-net, a data-driven mathematical method which automatically detects such model errors in ordinary differential equation (ODE) models. We demonstrate for real and simulated data how the dynamic elastic-net approach can be used to automatically (i) reconstruct the error signal, (ii) identify the target variables of model error, and (iii) reconstruct the true system state even for incomplete or preliminary models. Our work provides a systematic computational method facilitating the modelling of open biological systems under uncertain knowledge.
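    The core idea behind error-signal reconstruction can be illustrated with a toy calculation. This is not the dynamic elastic-net algorithm itself, only the underlying principle: for an open system, the residual between the observed dynamics and the nominal model's right-hand side estimates the hidden input. The linear model and constant input below are invented for illustration:

```python
# Toy illustration: reconstructing a hidden external input w(t) in an ODE model
# from the mismatch between observed dynamics and a nominal (closed) model.
import numpy as np

a = 1.0          # nominal decay rate of the model dx/dt = -a*x
w_true = 0.5     # hidden constant input, unknown to the nominal model
t = np.linspace(0, 5, 501)

# "Observed" trajectory of the true open system dx/dt = -a*x + w_true, x(0) = 0
x_obs = (w_true / a) * (1 - np.exp(-a * t))

# Residual between numerically differentiated data and the nominal
# right-hand side recovers the error signal w(t)
dxdt = np.gradient(x_obs, t)
w_est = dxdt - (-a * x_obs)

print(np.mean(w_est))  # close to the true hidden input w = 0.5
```

In practice the dynamic elastic-net solves a regularized estimation problem rather than differentiating noisy data directly, but the quantity being recovered is this same residual signal.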

  16. A New Multi-Gaussian Auto-Correlation Function for the Modeling of Realistic Shot Peened Random Rough Surfaces

    International Nuclear Information System (INIS)

    Hassan, W.; Blodgett, M.

    2006-01-01

    Shot peening is the primary surface treatment used to create a uniform, consistent, and reliable sub-surface compressive residual stress layer in aero engine components. A by-product of the shot peening process is random surface roughness that can affect the measurements of the resulting residual stresses and therefore impede their NDE assessment. High frequency eddy current conductivity measurements have the potential to assess these residual stresses in Ni-based superalloys. However, the effect of random surface roughness is expected to become significant in the desired measurement frequency range of 10 to 100 MHz. In this paper, a new Multi-Gaussian (MG) auto-correlation function is proposed for modeling the resulting pseudo-random rough profiles. Its use in the calculation of the Apparent Eddy Current Conductivity (AECC) loss due to surface roughness is demonstrated. The numerical results presented need to be validated with experimental measurements.

  17. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    Directory of Open Access Journals (Sweden)

    Thomas E Gorochowski

    Full Text Available Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model allowing for the description of intricate micro-scale structures, enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI) recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.

  18. Automatic skull segmentation from MR images for realistic volume conductor models of the head: Assessment of the state-of-the-art.

    Science.gov (United States)

    Nielsen, Jesper D; Madsen, Kristoffer H; Puonti, Oula; Siebner, Hartwig R; Bauer, Christian; Madsen, Camilla Gøbel; Saturnino, Guilherme B; Thielscher, Axel

    2018-03-12

    Anatomically realistic volume conductor models of the human head are important for accurate forward modeling of the electric field during transcranial brain stimulation (TBS), electro- (EEG) and magnetoencephalography (MEG). In particular, the skull compartment exerts a strong influence on the field distribution due to its low conductivity, suggesting the need to represent its geometry accurately. However, automatic skull reconstruction from structural magnetic resonance (MR) images is difficult, as compact bone has a very low signal in magnetic resonance imaging (MRI). Here, we evaluate three methods for skull segmentation, namely FSL BET2, the unified segmentation routine of SPM12 with extended spatial tissue priors, and the skullfinder tool of BrainSuite. To our knowledge, this study is the first to rigorously assess the accuracy of these state-of-the-art tools by comparison with CT-based skull segmentations on a group of ten subjects. We demonstrate several key factors that improve the segmentation quality, including the use of multi-contrast MRI data, the optimization of the MR sequences and the adaptation of the parameters of the segmentation methods. We conclude that FSL and SPM12 achieve better skull segmentations than BrainSuite. The former methods obtain reasonable results for the upper part of the skull when a combination of T1- and T2-weighted images is used as input. The SPM12-based results can be improved slightly further by means of simple morphological operations to fix local defects. In contrast to FSL BET2, the SPM12-based segmentation with extended spatial tissue priors and the BrainSuite-based segmentation provide coarse reconstructions of the vertebrae, enabling the construction of volume conductor models that include the neck. We exemplarily demonstrate that the extended models enable a more accurate estimation of the electric field distribution during transcranial direct current stimulation (tDCS) for montages that involve extraencephalic

  19. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows us to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  20. Mathematical manipulative models: in defense of "beanbag biology".

    Science.gov (United States)

    Jungck, John R; Gaff, Holly; Weisstein, Anton E

    2010-01-01

    Mathematical manipulative models have had a long history of influence in biological research and in secondary school education, but they are frequently neglected in undergraduate biology education. By linking mathematical manipulative models in a four-step process (1. use of physical manipulatives, 2. interactive exploration of computer simulations, 3. derivation of mathematical relationships from core principles, and 4. analysis of real data sets), we demonstrate a process that we have shared in biological faculty development workshops led by staff from the BioQUEST Curriculum Consortium over the past 24 yr. We built this approach upon a broad survey of literature in mathematical education research that has convincingly demonstrated the utility of multiple models that link physical, kinesthetic learning to actual data and interactive simulations. Two projects that use this approach are introduced: The Biological Excel Simulations and Tools in Exploratory, Experiential Mathematics (ESTEEM) Project (http://bioquest.org/esteem) and Numerical Undergraduate Mathematical Biology Education (NUMB3R5 COUNT; http://bioquest.org/numberscount). Examples here emphasize genetics, ecology, population biology, photosynthesis, cancer, and epidemiology. Mathematical manipulative models help learners break through prior fears to develop an appreciation for how mathematical reasoning informs problem solving, inference, and precise communication in biology and enhance the diversity of quantitative biology education.
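    The "beanbag" genetics exercises this record defends translate naturally into a short simulation, bridging steps 1 and 2 of the four-step process: drawing allele copies from a bag becomes binomial sampling, and repeated draws show genetic drift. This Wright-Fisher sketch is a generic illustration; the population size, generation count and seed are arbitrary choices, not taken from the ESTEEM or NUMB3R5 COUNT materials:

```python
# "Beanbag genetics" as code: allele frequency under pure genetic drift.
# Each generation, every one of the 2N allele copies is drawn independently
# from the current bag, exactly as with physical beans.
import random

def drift(p0=0.5, pop_size=100, generations=50, seed=42):
    """Return the allele-frequency trajectory of a Wright-Fisher population."""
    random.seed(seed)
    p = p0
    freqs = [p]
    for _ in range(generations):
        copies = sum(1 for _ in range(2 * pop_size) if random.random() < p)
        p = copies / (2 * pop_size)
        freqs.append(p)
    return freqs

traj = drift()
print(traj[0], traj[-1])  # starting vs. drifted allele frequency
```

Re-running with different seeds (the classroom analogue of each student shaking their own bag) makes the stochastic spread across replicate populations visible.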

  1. Development of a Value Inquiry Model in Biology Education.

    Science.gov (United States)

    Jeong, Eun-Young; Kim, Young-Soo

    2000-01-01

    Points out the rapid advances in biology, increasing bioethical issues, and how students need to make rational decisions. Introduces a value inquiry model development that includes identifying and clarifying value problems; understanding biological knowledge related to conflict situations; considering, selecting, and evaluating each alternative;…

  2. SEEK: a systems biology data and model management platform.

    NARCIS (Netherlands)

    Wolstencroft, K.J.; Owen, S.; Krebs, O.; Nguyen, Q.; Stanford, N.J.; Golebiewski, M.; Weidemann, A.; Bittkowski, M.; An, L.; Shockley, D.; Snoep, J.L.; Mueller, W.; Goble, C.

    2015-01-01

    Background: Systems biology research typically involves the integration and analysis of heterogeneous data types in order to model and predict biological processes. Researchers therefore require tools and resources to facilitate the sharing and integration of data, and for linking of data to systems

  3. Unified Deep Learning Architecture for Modeling Biology Sequence.

    Science.gov (United States)

    Wu, Hongjie; Cao, Chengyuan; Xia, Xiaoyan; Lu, Qiang

    2017-10-09

    Prediction of the spatial structure or function of biological macromolecules based on their sequence remains an important challenge in bioinformatics. When modeling biological sequences using traditional sequence models, characteristics such as long-range interactions between basic units, the complicated and variable output of labeled structures, and the variable length of biological sequences usually lead to different solutions on a case-by-case basis. This study proposed the use of bidirectional recurrent neural networks based on long short-term memory or a gated recurrent unit to capture long-range interactions by designing the optional reshape operator to adapt to the diversity of the output labels and implementing a training algorithm to support the training of sequence models capable of processing variable-length sequences. Additionally, the merge and pooling operators enhanced the ability to capture short-range interactions between basic units of biological sequences. The proposed deep-learning model and its training algorithm might be capable of solving currently known biological sequence-modeling problems through the use of a unified framework. We validated our model on one of the most difficult biological sequence-modeling problems currently known, with our results indicating the ability of the model to obtain predictions of protein residue interactions that exceeded the accuracy of current popular approaches by 10% based on multiple benchmarks.
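    The gated recurrent unit (GRU) at the heart of such architectures can be made concrete with a single update step in plain NumPy. This is a generic GRU cell, not the authors' model; the weights are random placeholders and the dimensions are illustrative:

```python
# One GRU update step in NumPy, showing the gating that lets recurrent
# networks carry information across long biological sequences.
import numpy as np

def gru_step(x, h, W, U, b):
    """One GRU update. W, U, b each hold (update, reset, candidate) blocks."""
    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))
    Wz, Wr, Wh = W
    Uz, Ur, Uh = U
    bz, br, bh = b
    z = sigmoid(x @ Wz + h @ Uz + bz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh + bh)  # candidate state
    return (1 - z) * h + z * h_tilde               # blend old and new state

rng = np.random.default_rng(0)
d_in, d_h = 4, 3   # e.g. a one-hot residue code and a small hidden state
W = [rng.normal(size=(d_in, d_h)) for _ in range(3)]
U = [rng.normal(size=(d_h, d_h)) for _ in range(3)]
b = [np.zeros(d_h) for _ in range(3)]

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # process a length-5 "sequence"
    h = gru_step(x, h, W, U, b)
print(h.shape)
```

A bidirectional model simply runs a second pass over the reversed sequence and concatenates the two hidden states; variable-length inputs are handled because the same cell is applied however many steps the sequence requires.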

  4. Genome Scale Modeling in Systems Biology: Algorithms and Resources

    Science.gov (United States)

    Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali

    2014-01-01

    In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks in five sections. Also, we try to illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031
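    One of the constraint-based algorithms surveyed in genome-scale modeling, flux balance analysis (FBA), reduces to a linear program. The three-reaction toy network and flux bounds below are invented for illustration and are not from the review:

```python
# Toy flux balance analysis: maximize biomass flux subject to steady-state
# mass balance (S v = 0) and flux bounds, solved as a linear program.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B;
# columns: reactions R1: -> A, R2: A -> B, R3: B -> biomass)
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

c = [0.0, 0.0, -1.0]  # linprog minimizes, so negate the biomass objective
res = linprog(c, A_eq=S, b_eq=[0.0, 0.0], bounds=[(0.0, 10.0)] * 3)

print(res.x)  # optimal flux distribution; biomass flux hits its upper bound
```

At the optimum the whole pathway runs at the uptake limit, the hallmark FBA result that growth is capped by the tightest flux constraint.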

  5. The Comparison of Think Talk Write and Think Pair Share Model with Realistic Mathematics Education Approach Viewed from Mathematical-Logical Intelligence

    Directory of Open Access Journals (Sweden)

    Himmatul Afthina

    2017-12-01

    Full Text Available The aim of this research was to determine the effect of the Think Talk Write (TTW) and Think Pair Share (TPS) models with the Realistic Mathematics Education (RME) approach viewed from mathematical-logical intelligence. This research employed a quasi-experimental design. The population of the research was all eighth-grade students of junior high schools in Karanganyar Regency in the 2016/2017 academic year. The results of this research show that (1) TTW with the RME approach gave better mathematics achievement than TPS with the RME approach; (2) students with high mathematical-logical intelligence reached better mathematics achievement than those with average or low intelligence, and students with average mathematical-logical intelligence reached better achievement than those with low intelligence; (3) in the TTW model with the RME approach, students with high mathematical-logical intelligence reached better mathematics achievement than those with average or low intelligence, whereas students with average and low mathematical-logical intelligence showed the same achievement; in the TPS model with the RME approach, students with high mathematical-logical intelligence reached better mathematics achievement than those with average or low intelligence, and students with average intelligence reached better achievement than those with low intelligence; (4) in each category of mathematical-logical intelligence, TTW with the RME approach and TPS with the RME approach gave the same mathematics achievement.

  6. Theoretical Biology and Medical Modelling: ensuring continued growth and future leadership.

    Science.gov (United States)

    Nishiura, Hiroshi; Rietman, Edward A; Wu, Rongling

    2013-07-11

    Theoretical biology encompasses a broad range of biological disciplines ranging from mathematical biology and biomathematics to philosophy of biology. Adopting a broad definition of "biology", Theoretical Biology and Medical Modelling, an open access journal, considers original research studies that focus on theoretical ideas and models associated with developments in biology and medicine.

  7. Development of a kinetic model for biological sulphate reduction ...

    African Journals Online (AJOL)

    A two-phase (aqueous/gas) physical, biological and chemical processes ... Additionally, the background weak acid/base chemistry for water, carbonate, ... in the UCTADM1 model, and hence the physical gas exchange for sulphide is included.

  8. Mathematical models in biology bringing mathematics to life

    CERN Document Server

    Ferraro, Maria; Guarracino, Mario

    2015-01-01

    This book presents an exciting collection of contributions based on the workshop “Bringing Maths to Life”, held October 27-29, 2014 in Naples, Italy. State-of-the-art research in biology and the statistical and analytical challenges posed by huge masses of collected data are treated in this work. Specific topics explored in depth follow the sessions and special invited sessions of the workshop and include genetic variability via differential expression, molecular dynamics and modeling, complex biological systems viewed from quantitative models, and microscopy image processing, to name several. The book presents in-depth discussions of the mathematical analysis required to extract insights from complex bodies of biological datasets, and of novel algorithms, methods and software tools for genetic variability, molecular dynamics, and complex biological systems. Researchers and graduate students in biology, life science, and mathematics/statistics will find the content...

  9. Modeling dynamics of biological and chemical components of aquatic ecosystems

    International Nuclear Information System (INIS)

    Lassiter, R.R.

    1975-05-01

    To provide capability to model aquatic ecosystems or their subsystems as needed for particular research goals, a modeling strategy was developed. Submodels of several processes common to aquatic ecosystems were developed or adapted from previously existing ones. Included are submodels for photosynthesis as a function of light and depth, biological growth rates as a function of temperature, dynamic chemical equilibrium, feeding and growth, and various types of losses to biological populations. These submodels may be used as modules in the construction of models of subsystems or ecosystems. A preliminary model for the nitrogen cycle subsystem was developed using the modeling strategy and applicable submodels. (U.S.)
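    The abstract mentions a submodel for biological growth rates as a function of temperature. One common textbook formulation (not necessarily the one adopted in the report) is the Q10 rule, in which the rate scales by a factor q10 for every 10 °C rise above a reference temperature:

```python
# Q10 temperature-dependence submodel for biological growth rates.
# A generic formulation with illustrative defaults, not the report's
# actual parameterization.
def growth_rate(temp_c, rate_ref=1.0, temp_ref=20.0, q10=2.0):
    return rate_ref * q10 ** ((temp_c - temp_ref) / 10.0)
```

    With q10 = 2, the rate doubles from 20 °C to 30 °C and halves from 20 °C to 10 °C.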

  10. Systematic integration of experimental data and models in systems biology.

    Science.gov (United States)

    Li, Peter; Dada, Joseph O; Jameson, Daniel; Spasic, Irena; Swainston, Neil; Carroll, Kathleen; Dunn, Warwick; Khan, Farid; Malys, Naglis; Messiha, Hanan L; Simeonidis, Evangelos; Weichart, Dieter; Winder, Catherine; Wishart, Jill; Broomhead, David S; Goble, Carole A; Gaskell, Simon J; Kell, Douglas B; Westerhoff, Hans V; Mendes, Pedro; Paton, Norman W

    2010-11-29

    The behaviour of biological systems can be deduced from their mathematical models. However, multiple sources of data in diverse forms are required in the construction of a model in order to define its components, their biochemical reactions, and the corresponding parameters. Automating the assembly and use of systems biology models depends upon data integration processes involving the interoperation of data and analytical resources. Taverna workflows have been developed for the automated assembly of quantitative parameterised metabolic networks in the Systems Biology Markup Language (SBML). An SBML model is built in a systematic fashion by the workflows, which start with the construction of a qualitative network using data from a MIRIAM-compliant genome-scale model of yeast metabolism. This is followed by parameterisation of the SBML model with experimental data from two repositories: the SABIO-RK enzyme kinetics database and a database of quantitative experimental results. The models are then calibrated and simulated in workflows that call out to COPASIWS, the web service interface to the COPASI software application for analysing biochemical networks. These systems biology workflows were evaluated for their ability to construct a parameterised model of yeast glycolysis. Distributed information about metabolic reactions described to MIRIAM standards enables the automated assembly of quantitative systems biology models of metabolic networks based on user-defined criteria. Such data integration processes can be implemented as Taverna workflows to provide a rapid overview of the components and their relationships within a biochemical system.

  11. Some Issues of Biological Shape Modelling with Applications

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Hilger, Klaus Baggesen; Skoglund, Karl

    2003-01-01

    This paper illustrates current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations to, modifications to, and applications of the elements of constructing models of shape or appearance...

  12. Entropy stable modeling of non-isothermal multi-component diffuse-interface two-phase flows with realistic equations of state

    KAUST Repository

    Kou, Jisheng; Sun, Shuyu

    2018-01-01

    In this paper, we consider mathematical modeling and numerical simulation of non-isothermal compressible multi-component diffuse-interface two-phase flows with realistic equations of state. A general model with general reference velocity is derived rigorously through thermodynamical laws and Onsager's reciprocal principle, and it is capable of characterizing compressibility and partial miscibility between multiple fluids. We prove a novel relation among the pressure, temperature and chemical potentials, which results in a new formulation of the momentum conservation equation indicating that the gradients of chemical potentials and temperature become the primary driving force of the fluid motion except for the external forces. A key challenge in numerical simulation is to develop entropy stable numerical schemes preserving the laws of thermodynamics. Based on the convex-concave splitting of Helmholtz free energy density with respect to molar densities and temperature, we propose an entropy stable numerical method, which solves the total energy balance equation directly, and thus, naturally satisfies the first law of thermodynamics. Unconditional entropy stability (the second law of thermodynamics) of the proposed method is proved by estimating the variations of Helmholtz free energy and kinetic energy with time steps. Numerical results validate the proposed method.
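    The convex-concave splitting named in the abstract can be sketched in generic form; the notation here is assumed for illustration, not taken from the paper. The Helmholtz free energy density f is split into a convex part f_c and a concave part f_e of the molar density n, and the chemical potential is discretized semi-implicitly:

```latex
% Generic convex-concave splitting sketch; f, f_c, f_e, n, \mu are
% assumed notation, not the paper's.
f(n) = f_c(n) + f_e(n), \qquad
\mu^{k+1} := f_c'\!\left(n^{k+1}\right) + f_e'\!\left(n^{k}\right).
```

    Convexity of f_c and concavity of f_e then give f(n^{k+1}) - f(n^k) ≤ μ^{k+1}(n^{k+1} - n^k), the discrete free-energy decay estimate on which unconditional entropy stability of such schemes rests.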

  13. Entropy stable modeling of non-isothermal multi-component diffuse-interface two-phase flows with realistic equations of state

    KAUST Repository

    Kou, Jisheng

    2018-02-25

    In this paper, we consider mathematical modeling and numerical simulation of non-isothermal compressible multi-component diffuse-interface two-phase flows with realistic equations of state. A general model with general reference velocity is derived rigorously through thermodynamical laws and Onsager's reciprocal principle, and it is capable of characterizing compressibility and partial miscibility between multiple fluids. We prove a novel relation among the pressure, temperature and chemical potentials, which results in a new formulation of the momentum conservation equation indicating that the gradients of chemical potentials and temperature become the primary driving force of the fluid motion except for the external forces. A key challenge in numerical simulation is to develop entropy stable numerical schemes preserving the laws of thermodynamics. Based on the convex-concave splitting of Helmholtz free energy density with respect to molar densities and temperature, we propose an entropy stable numerical method, which solves the total energy balance equation directly, and thus, naturally satisfies the first law of thermodynamics. Unconditional entropy stability (the second law of thermodynamics) of the proposed method is proved by estimating the variations of Helmholtz free energy and kinetic energy with time steps. Numerical results validate the proposed method.

  14. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
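    A drastically reduced sketch of the discrete, grid-based agent-based approach that frameworks like Biocellion accelerate: cells occupy sites on a 2D lattice and take random walks into empty neighbouring sites. This is purely illustrative and does not use the Biocellion API.

```python
import random

# Minimal agent-based lattice sketch: each occupied site tries one
# random move per step; moves succeed only into empty sites (volume
# exclusion). Periodic boundaries. Illustrative only.
def step(grid, rng):
    n = len(grid)
    cells = [(i, j) for i in range(n) for j in range(n) if grid[i][j]]
    rng.shuffle(cells)
    for i, j in cells:
        di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        ni, nj = (i + di) % n, (j + dj) % n   # periodic wrap-around
        if not grid[ni][nj]:                  # move only into empty sites
            grid[ni][nj], grid[i][j] = 1, 0
    return grid
```

    Because moves only relocate cells, the total cell count is conserved; real frameworks add division, death and chemical fields on top of this kernel.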

  15. Realistic multisite lattice-gas modeling and KMC simulation of catalytic surface reactions: Kinetics and multiscale spatial behavior for CO-oxidation on metal (1 0 0) surfaces

    Science.gov (United States)

    Liu, Da-Jiang; Evans, James W.

    2013-12-01

    A realistic molecular-level description of catalytic reactions on single-crystal metal surfaces can be provided by stochastic multisite lattice-gas (msLG) models. This approach has general applicability, although in this report, we will focus on the example of CO-oxidation on the unreconstructed fcc metal (1 0 0) or M(1 0 0) surfaces of common catalyst metals M = Pd, Rh, Pt and Ir (i.e., avoiding regimes where Pt and Ir reconstruct). These models can capture the thermodynamics and kinetics of adsorbed layers for the individual reactants species, such as CO/M(1 0 0) and O/M(1 0 0), as well as the interaction and reaction between different reactant species in mixed adlayers, such as (CO + O)/M(1 0 0). The msLG models allow population of any of hollow, bridge, and top sites. This enables a more flexible and realistic description of adsorption and adlayer ordering, as well as of reaction configurations and configuration-dependent barriers. Adspecies adsorption and interaction energies, as well as barriers for various processes, constitute key model input. The choice of these energies is guided by experimental observations, as well as by extensive Density Functional Theory analysis. Model behavior is assessed via Kinetic Monte Carlo (KMC) simulation. We also address the simulation challenges and theoretical ramifications associated with very rapid diffusion and local equilibration of reactant adspecies such as CO. These msLG models are applied to describe adsorption, ordering, and temperature programmed desorption (TPD) for individual CO/M(1 0 0) and O/M(1 0 0) reactant adlayers. In addition, they are also applied to predict mixed (CO + O)/M(1 0 0) adlayer structure on the nanoscale, the complete bifurcation diagram for reactive steady-states under continuous flow conditions, temperature programmed reaction (TPR) spectra, and titration reactions for the CO-oxidation reaction. 
Extensive and reasonably successful comparison of model predictions is made with experimental
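    The msLG models above are assessed with kinetic Monte Carlo. A heavily simplified, single-species adsorption/desorption analogue of such a simulation (rates, lattice size and event counts invented for illustration) can be sketched as a rejection-free event selection:

```python
import random

# Minimal rejection-free KMC for a lattice gas with uniform
# adsorption/desorption rates per site. A stand-in sketch for the far
# richer multisite models in the paper; values are illustrative.
def kmc_coverage(n_sites=200, k_ads=1.0, k_des=1.0, n_events=20000, seed=1):
    rng = random.Random(seed)
    occ = [0] * n_sites
    n_occ = 0
    for _ in range(n_events):
        r_ads = k_ads * (n_sites - n_occ)   # total adsorption rate
        r_des = k_des * n_occ               # total desorption rate
        # pick an event class with probability proportional to its rate
        if rng.random() * (r_ads + r_des) < r_ads:
            site = rng.choice([s for s in range(n_sites) if not occ[s]])
            occ[site] = 1
            n_occ += 1
        else:
            site = rng.choice([s for s in range(n_sites) if occ[s]])
            occ[site] = 0
            n_occ -= 1
    return n_occ / n_sites
```

    With equal rates the coverage relaxes to the detailed-balance value k_ads/(k_ads + k_des) = 0.5, up to statistical fluctuations.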

  16. Multiscale modeling of emergent materials: biological and soft matter

    DEFF Research Database (Denmark)

    Murtola, Teemu; Bunker, Alex; Vattulainen, Ilpo

    2009-01-01

    In this review, we focus on four current related issues in multiscale modeling of soft and biological matter. First, we discuss how to use structural information from detailed models (or experiments) to construct coarse-grained ones in a hierarchical and systematic way. This is discussed in the c...

  17. Statistical Model Checking for Biological Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2014-01-01

    Statistical Model Checking (SMC) is a highly scalable simulation-based verification approach for testing and estimating the probability that a stochastic system satisfies a given linear temporal property. The technique has been applied to (discrete and continuous time) Markov chains, stochastic...

  18. Nematodes: Model Organisms in High School Biology

    Science.gov (United States)

    Bliss, TJ; Anderson, Margery; Dillman, Adler; Yourick, Debra; Jett, Marti; Adams, Byron J.; Russell, RevaBeth

    2007-01-01

    In a collaborative effort between university researchers and high school science teachers, an inquiry-based laboratory module was designed using two species of insecticidal nematodes to help students apply scientific inquiry and elements of thoughtful experimental design. The learning experience and model are described in this article. (Contains 4…

  19. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  20. Polynomial algebra of discrete models in systems biology.

    Science.gov (United States)

    Veliz-Cuba, Alan; Jarrah, Abdul Salam; Laubenbacher, Reinhard

    2010-07-01

    An increasing number of discrete mathematical models are being published in Systems Biology, ranging from Boolean network models to logical models and Petri nets. They are used to model a variety of biochemical networks, such as metabolic networks, gene regulatory networks and signal transduction networks. There is increasing evidence that such models can capture key dynamic features of biological networks and can be used successfully for hypothesis generation. This article provides a unified framework that can aid the mathematical analysis of Boolean network models, logical models and Petri nets. They can be represented as polynomial dynamical systems, which allows the use of a variety of mathematical tools from computer algebra for their analysis. Algorithms are presented for the translation into polynomial dynamical systems. Examples are given of how polynomial algebra can be used for the model analysis. alanavc@vt.edu Supplementary data are available at Bioinformatics online.
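    The translation the abstract describes, from Boolean networks to polynomial dynamical systems over GF(2), uses AND → x·y, OR → x + y + x·y and NOT → 1 + x, all mod 2. A toy 3-node network (made up here, not an example from the paper):

```python
# A 3-node Boolean network written as a polynomial dynamical system
# over GF(2). The update rules are an invented toy example.
def step(state):
    x1, x2, x3 = state
    f1 = (x2 * x3) % 2              # x1' = x2 AND x3
    f2 = (x1 + x3 + x1 * x3) % 2    # x2' = x1 OR x3
    f3 = x3                         # x3' = x3 (frozen input)
    return (f1, f2, f3)

def fixed_points():
    states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    return [s for s in states if step(s) == s]
```

    In this representation the steady states of the network are exactly the solutions of the polynomial system f_i(x) = x_i, which computer-algebra tools can solve symbolically.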

  1. SEEK: a systems biology data and model management platform.

    Science.gov (United States)

    Wolstencroft, Katherine; Owen, Stuart; Krebs, Olga; Nguyen, Quyen; Stanford, Natalie J; Golebiewski, Martin; Weidemann, Andreas; Bittkowski, Meik; An, Lihua; Shockley, David; Snoep, Jacky L; Mueller, Wolfgang; Goble, Carole

    2015-07-11

    Systems biology research typically involves the integration and analysis of heterogeneous data types in order to model and predict biological processes. Researchers therefore require tools and resources to facilitate the sharing and integration of data, and for linking of data to systems biology models. There are a large number of public repositories for storing biological data of a particular type, for example transcriptomics or proteomics, and there are several model repositories. However, this silo-type storage of data and models is not conducive to systems biology investigations. Interdependencies between multiple omics datasets and between datasets and models are essential. Researchers require an environment that will allow the management and sharing of heterogeneous data and models in the context of the experiments which created them. The SEEK is a suite of tools to support the management, sharing and exploration of data and models in systems biology. The SEEK platform provides an access-controlled, web-based environment for scientists to share and exchange data and models for day-to-day collaboration and for public dissemination. A plug-in architecture allows the linking of experiments, their protocols, data, models and results in a configurable system that is available 'off the shelf'. Tools to run model simulations, plot experimental data and assist with data annotation and standardisation combine to produce a collection of resources that support analysis as well as sharing. Underlying semantic web resources additionally extract and serve SEEK metadata in RDF (Resource Description Format). SEEK RDF enables rich semantic queries, both within SEEK and between related resources in the web of Linked Open Data. The SEEK platform has been adopted by many systems biology consortia across Europe. It is a data management environment that has a low barrier of uptake and provides rich resources for collaboration. This paper provides an update on the functions and

  2. Biological-Mathematical Modeling of Chronic Toxicity.

    Science.gov (United States)

    1981-07-22

    "Mathematical Model of Uptake and Distribution," in: Papper, E.M. and Kitz, R.J. (eds.), Uptake and Distribution of Anesthetic Agents, McGraw-Hill, New York, p. 72. Mapleson, W.W. (1963) Quantitative prediction of anesthetic concentrations, in: Papper, E.M. and Kitz, R.J. (eds.), Uptake and Distribution of Anesthetic Agents, McGraw-Hill, New York.

  3. Modelling effective dielectric properties of materials containing diverse types of biological cells

    International Nuclear Information System (INIS)

    Huclova, Sonja; Froehlich, Juerg; Erni, Daniel

    2010-01-01

    An efficient and versatile numerical method for the generation of different realistically shaped biological cells is developed. This framework is used to calculate the dielectric spectra of materials containing specific types of biological cells. For the generation of the numerical models of the cells, a flexible parametrization method based on the so-called superformula is applied, including the option of obtaining non-axisymmetric shapes such as box-shaped cells and even shapes corresponding to echinocytes. The dielectric spectra of effective media containing various cell morphologies are calculated, focusing on the dependence of the spectral features on the cell shape. The numerical method is validated by comparing a model of spherical inclusions at a low volume fraction with the analytical solution obtained from the Maxwell-Garnett mixing formula, resulting in good agreement. Our simulation data for different cell shapes suggest that around 1 MHz the effective dielectric properties of different cell shapes at different volume fractions deviate significantly from the spherical case. The most pronounced change is exhibited by ε_eff between 0.1 and 1 MHz, with a deviation of up to 35% for a box-shaped cell and 15% for an echinocyte compared with the sphere at a volume fraction of 0.4. This hampers the unique interpretation of changes in cellular features measured by dielectric spectroscopy when simplified material models are used.
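    The Maxwell-Garnett mixing formula the authors use as an analytical benchmark for dilute spherical inclusions can be written down directly; the permittivity values below are illustrative, not the paper's tissue data.

```python
# Maxwell-Garnett effective permittivity for spherical inclusions of
# permittivity eps_i at volume fraction f in a host of permittivity
# eps_m. Illustrative use only.
def maxwell_garnett(eps_i, eps_m, f):
    num = eps_i + 2 * eps_m + 2 * f * (eps_i - eps_m)
    den = eps_i + 2 * eps_m - f * (eps_i - eps_m)
    return eps_m * num / den
```

    Sanity checks: at f = 0 the formula returns the host permittivity, at f = 1 the inclusion permittivity, and in between it interpolates monotonically.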

  4. Modeling life the mathematics of biological systems

    CERN Document Server

    Garfinkel, Alan; Guo, Yina

    2017-01-01

    From predator-prey populations in an ecosystem, to hormone regulation within the body, the natural world abounds in dynamical systems that affect us profoundly. This book develops the mathematical tools essential for students in the life sciences to describe these interacting systems and to understand and predict their behavior. Complex feedback relations and counter-intuitive responses are common in dynamical systems in nature; this book develops the quantitative skills needed to explore these interactions. Differential equations are the natural mathematical tool for quantifying change, and are the driving force throughout this book. The use of Euler’s method makes nonlinear examples tractable and accessible to a broad spectrum of early-stage undergraduates, thus providing a practical alternative to the procedural approach of a traditional Calculus curriculum. Tools are developed within numerous, relevant examples, with an emphasis on the construction, evaluation, and interpretation of mathematical models ...
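    The book's strategy of making nonlinear systems tractable with Euler's method can be sketched with the classic predator-prey (Lotka-Volterra) example; all parameter values here are invented for illustration.

```python
# Forward-Euler integration of the Lotka-Volterra predator-prey
# system: prey x, predators y. Parameters are illustrative.
def lotka_volterra(x0=10.0, y0=5.0, a=1.0, b=0.1, c=1.5, d=0.075,
                   dt=0.001, steps=20000):
    x, y = x0, y0
    for _ in range(steps):
        dx = a * x - b * x * y     # prey growth minus predation
        dy = d * x * y - c * y     # predator growth minus death
        x, y = x + dx * dt, y + dy * dt
    return x, y
```

    The populations oscillate around the coexistence equilibrium (x* = c/d, y* = a/b); with a sufficiently small step the numerical orbit stays positive and bounded over the simulated interval.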

  5. Profiling the biological activity of oxide nanomaterials with mechanistic models

    NARCIS (Netherlands)

    Burello, E.

    2013-01-01

    In this study we present three mechanistic models for profiling the potential biological and toxicological effects of oxide nanomaterials. The models attempt to describe the reactivity, protein adsorption and membrane adhesion processes of a large range of oxide materials and are based on properties

  6. Building executable biological pathway models automatically from BioPAX

    NARCIS (Netherlands)

    Willemsen, Timo; Feenstra, Anton; Groth, Paul

    2013-01-01

    The amount of biological data exposed in semantic formats is steadily increasing. In particular, pathway information (a model of how molecules interact within a cell) from databases such as KEGG and WikiPathways are available in a standard RDF-based format BioPAX. However, these models are

  7. Multi-level and hybrid modelling approaches for systems biology.

    Science.gov (United States)

    Bardini, R; Politano, G; Benso, A; Di Carlo, S

    2017-01-01

    During the last decades, high-throughput techniques have allowed for the extraction of a huge amount of data from biological systems, unveiling more of their underlying complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms that make an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, and structural phenotype and architectural complexities emerge from a single cell through local interactions. The information collected from biological systems is naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each having its own way of representing the phenomena under study; that is, the different parts of the system to be modelled may be described with different formalisms. For a model to have improved accuracy and to serve as a good knowledge base, it should comprise different system levels, suitably handling the corresponding formalisms. Models that are both multi-level and hybrid satisfy both these requirements, making them a very useful tool in computational systems biology. This paper reviews some of the main contributions in this field.

  8. Group clinics for young adults with diabetes in an ethnically diverse, socioeconomically deprived setting (TOGETHER study): protocol for a realist review, co-design and mixed methods, participatory evaluation of a new care model.

    Science.gov (United States)

    Papoutsi, Chrysanthi; Hargreaves, Dougal; Colligan, Grainne; Hagell, Ann; Patel, Anita; Campbell-Richards, Desirée; Viner, Russell M; Vijayaraghavan, Shanti; Marshall, Martin; Greenhalgh, Trisha; Finer, Sarah

    2017-06-21

    Young adults with diabetes often report dissatisfaction with care and have poor diabetes-related health outcomes. As diabetes prevalence continues to rise, group-based care could provide a sustainable alternative to traditional one-to-one consultations, by engaging young people through life stage-, context- and culturally-sensitive approaches. In this study, we will co-design and evaluate a group-based care model for young adults with diabetes and complex health and social needs in socioeconomically deprived areas. This participatory study will include three phases. In phase 1, we will carry out a realist review to synthesise the literature on group-based care for young adults with diabetes. This theory-driven understanding will provide the basis for phase 2, where we will draw on experience-based co-design methodologies to develop a new, group-based care model for young adults. In phase 3, we will use a researcher-in-residence approach to implement and evaluate the co-designed group clinic model and compare it with traditional care. We will employ qualitative (observations in clinics, patient and staff interviews and document analysis) and quantitative methods (eg, biological markers, patient enablement instrument and diabetes distress scale), including a cost analysis. National Health Service ethics approval has been granted (reference 17/NI/0019). The project will directly inform service redesign to better meet the needs of young adults with diabetes in socioeconomically deprived areas and may guide a possible cluster-randomised trial, powered for clinical and cost-effectiveness outcomes. Findings from this study may be transferable to other long-term conditions and/or age groups. Project outputs will include briefing statements, summaries and academic papers, tailored for different audiences, including people living with diabetes, clinicians, policy makers and strategic decision makers. PROSPERO (CRD42017058726).

  9. Modelling of radio frequency sheath and fast wave coupling on the realistic ion cyclotron resonant antenna surroundings and the outer wall

    Science.gov (United States)

    Lu, L.; Colas, L.; Jacquot, J.; Després, B.; Heuraux, S.; Faudot, E.; Van Eester, D.; Crombé, K.; Křivská, A.; Noterdaeme, J.-M.; Helou, W.; Hillairet, J.

    2018-03-01

    In order to model sheath rectification in a realistic geometry over the size of ion cyclotron resonant heating (ICRH) antennas, the self-consistent sheaths and waves for ICH (SSWICH) code couples the RF wave propagation and the DC SOL biasing self-consistently via nonlinear RF and DC sheath boundary conditions applied at plasma/wall interfaces. A first version of SSWICH had a 2D (toroidal and radial) geometry, rectangular walls either normal or parallel to the confinement magnetic field B0, and only included the evanescent slow wave (SW) excited parasitically by the ICRH antenna. The main wave for plasma heating, the fast wave (FW), plays no role in the sheath excitation in this version. A new version of the code, 2D SSWICH-full wave, was developed based on the COMSOL software to accommodate the full RF field polarization and shaped walls tilted with respect to B0. SSWICH-full wave simulations have shown mode conversion of the FW into the SW occurring at sharp corners where the boundary shape varies rapidly. They have also evidenced 'far-field' sheath oscillations appearing at shaped walls with a relatively long magnetic connection length to the antenna, which are accessible only to the propagating FW. A joint simulation, conducted with SSWICH-full wave in a multi-2D approach with excitation from the 3D wave-coupling code RAPLICASOL, has recovered the double-hump poloidal structure measured in the experimental temperature and potential maps when only the SW is modelled. The FW contribution to the poloidal structure of the potential seems to be affected by 3D effects, which were ignored at the current stage. Finally, SSWICH-full wave simulations revealed the left-right asymmetry that has been observed extensively in unbalanced strap feeding experiments, suggesting that the spatial proximity effect in RF sheath excitation, previously studied for the SW only, remains important in the vicinity of the wave launcher under full wave polarizations.

  10. Development of realistic high-resolution whole-body voxel models of Japanese adult males and females of average height and weight, and application of models to radio-frequency electromagnetic-field dosimetry

    International Nuclear Information System (INIS)

    Nagaoka, Tomoaki; Watanabe, Soichi; Sakurai, Kiyoko; Kunieda, Etsuo; Watanabe, Satoshi; Taki, Masao; Yamanaka, Yukio

    2004-01-01

    With advances in computer performance, the use of high-resolution voxel models of the entire human body has become more frequent in numerical dosimetries of electromagnetic waves. Using magnetic resonance imaging, we have developed realistic high-resolution whole-body voxel models for Japanese adult males and females of average height and weight. The developed models consist of cubic voxels of 2 mm on each side; the models are segmented into 51 anatomic regions. The adult female model is the first of its kind in the world and both are the first Asian voxel models (representing average Japanese) that enable numerical evaluation of electromagnetic dosimetry at high frequencies of up to 3 GHz. In this paper, we will also describe the basic SAR characteristics of the developed models for the VHF/UHF bands, calculated using the finite-difference time-domain method
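    The SAR characteristics mentioned above are computed per voxel from the local fields. The standard point-SAR relation, SAR = σ|E|²/(2ρ) for a peak field amplitude E, is sketched below with illustrative values rather than the dielectric data used for the Japanese models.

```python
# Point SAR in a single voxel: conductivity sigma (S/m), peak electric
# field amplitude e_peak (V/m), mass density rho (kg/m^3). The factor
# 1/2 converts peak amplitude to RMS. Values used here are illustrative.
def point_sar(sigma, e_peak, rho):
    return sigma * e_peak ** 2 / (2.0 * rho)
```

    Whole-body or 10 g averaged SAR, as used in exposure guidelines, is then obtained by mass-weighted averaging of such point values over the voxel model.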

  11. Predictive modelling of complex agronomic and biological systems.

    Science.gov (United States)

    Keurentjes, Joost J B; Molenaar, Jaap; Zwaan, Bas J

    2013-09-01

    Biological systems are tremendously complex in their functioning and regulation. Studying the multifaceted behaviour and describing the performance of such complexity has challenged the scientific community for years. The reduction of real-world intricacy into simple descriptive models has therefore convinced many researchers of the usefulness of introducing mathematics into the biological sciences. Predictive modelling takes such an approach a step further in that it exploits existing knowledge to project the performance of a system under alternative scenarios. The ever-growing amounts of data generated by assessing biological systems in increasingly fine detail provide unique opportunities for future modelling and experiment design. Here we aim to provide an overview of the progress made in modelling over time and the currently prevalent approaches for iterative modelling cycles in modern biology. We further argue for the importance of versatility in modelling approaches, including parameter estimation, model reduction and network reconstruction. Finally, we discuss the difficulties in overcoming the mathematical interpretation of in vivo complexity and address some of the future challenges lying ahead. © 2013 John Wiley & Sons Ltd.

  12. An anatomically realistic whole-body pregnant-woman model and specific absorption rates for pregnant-woman exposure to electromagnetic plane waves from 10 MHz to 2 GHz

    International Nuclear Information System (INIS)

    Nagaoka, Tomoaki; Togashi, Toshihiro; Saito, Kazuyuki; Takahashi, Masaharu; Ito, Koichi; Watanabe, Soichi

    2007-01-01

    The numerical dosimetry of pregnant women is an important issue in electromagnetic-field safety. However, an anatomically realistic whole-body pregnant-woman model for electromagnetic dosimetry had not previously been developed. We have therefore developed a high-resolution whole-body model of a pregnant woman. A new fetus model, including the tissues inherent to pregnancy, was constructed on the basis of abdominal magnetic resonance imaging data of a 26-week-pregnant woman. The whole-body pregnant-woman model was developed by combining the fetus model with a previously developed nonpregnant-woman model. The developed model consists of about 7 million cubic voxels of 2 mm size and is segmented into 56 tissues and organs. This pregnant-woman model is the first completely anatomically realistic voxel model that includes a realistic fetus model and enables numerical simulation of electromagnetic dosimetry up to the gigahertz band. In this paper, we also present the basic specific absorption rate characteristics of the pregnant-woman model exposed to vertically and horizontally polarized electromagnetic waves from 10 MHz to 2 GHz.

  13. A distributed approach for parameters estimation in System Biology models

    International Nuclear Information System (INIS)

    Mosca, E.; Merelli, I.; Alfieri, R.; Milanesi, L.

    2009-01-01

    Due to the lack of experimental measurements, biological variability and experimental errors, the values of many parameters of systems biology mathematical models are still unknown or uncertain. A possible computational solution is parameter estimation, that is, the identification of the parameter values that give the best model fit with respect to the experimental data. We have developed an environment that distributes each run of the parameter estimation algorithm to a different computational resource. The key feature of the implementation is a relational database that allows the user to swap candidate solutions among the working nodes during the computation. A comparison of the distributed implementation with the parallel one showed that the presented approach enables a faster and better parameter estimation of systems biology models.
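
The serial core of such a parameter search can be sketched as follows. This is a toy, assuming a hypothetical one-parameter decay model rather than the authors' models; the point is that every candidate is scored independently of the others, which is exactly what makes farming evaluations out to separate compute nodes straightforward.

```python
import math
import random

def model(t, k):
    """Hypothetical one-parameter decay model x(t) = exp(-k t), standing in
    for a systems biology ODE model (not one of the authors' models)."""
    return math.exp(-k * t)

def sse(k, data):
    """Sum of squared errors between model predictions and measurements."""
    return sum((model(t, k) - x) ** 2 for t, x in data)

# synthetic "experimental" data generated with a true rate k = 0.7
data = [(t, math.exp(-0.7 * t)) for t in (0.0, 0.5, 1.0, 2.0, 4.0)]

# pure random search: each candidate evaluation is independent, so batches
# of candidates could be dispatched to different nodes and their scores
# collected centrally (the paper uses a relational database for this role)
random.seed(0)
candidates = [random.uniform(0.0, 2.0) for _ in range(2000)]
best_k = min(candidates, key=lambda k: sse(k, data))
```

With noiseless synthetic data the best candidate lands very close to the true rate of 0.7.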

  14. Preservice Biology Teachers' Conceptions About the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-02-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n tentative = 11; n certain = 5) and 18 conceptions for models (n tentative = 10; n certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.

  15. Yeast as a Model System to Study Tau Biology

    Directory of Open Access Journals (Sweden)

    Ann De Vos

    2011-01-01

    Hyperphosphorylated and aggregated human protein tau constitutes a hallmark of a multitude of neurodegenerative diseases called tauopathies, exemplified by Alzheimer's disease. In spite of an enormous amount of research performed on tau biology, several crucial questions concerning the mechanisms of tau toxicity remain unanswered. In this paper we will highlight some of the processes involved in tau biology and pathology, focusing on tau phosphorylation and the interplay with oxidative stress. In addition, we will introduce the development of a human tau-expressing yeast model, and discuss some crucial results obtained in this model, highlighting its potential in the elucidation of cellular processes leading to tau toxicity.

  16. BayesMD: flexible biological modeling for motif discovery

    DEFF Research Database (Denmark)

    Tang, Man-Hung Eric; Krogh, Anders; Winther, Ole

    2008-01-01

    We present BayesMD, a Bayesian Motif Discovery model with several new features. Three different types of biological a priori knowledge are built into the framework in a modular fashion. A mixture of Dirichlets is used as prior over nucleotide probabilities in binding sites. It is trained on trans...
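
A "mixture of Dirichlets" prior over nucleotide probabilities can be sketched with the standard gamma-sampling construction. The two components below (one sparse, one concentrated) are invented for illustration and are not BayesMD's trained prior.

```python
import random

def sample_dirichlet(alpha):
    """Draw one probability vector from Dirichlet(alpha) via gamma sampling."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [x / total for x in g]

def sample_mixture_of_dirichlets(weights, alphas):
    """Draw a nucleotide distribution (A, C, G, T) from a mixture of
    Dirichlets: pick a component by its mixture weight, then sample it."""
    comp = random.choices(range(len(weights)), weights=weights)[0]
    return sample_dirichlet(alphas[comp])

# two invented components: a sparse one favouring peaked (informative) motif
# columns and a concentrated one favouring near-uniform background columns
alphas = [[0.2, 0.2, 0.2, 0.2], [5.0, 5.0, 5.0, 5.0]]
column_prior_sample = sample_mixture_of_dirichlets([0.5, 0.5], alphas)
```

In a full motif-discovery model, one such draw would describe the nucleotide distribution of a single binding-site column.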

  17. Biology learning evaluation model in Senior High Schools

    Directory of Open Access Journals (Sweden)

    Sri Utari

    2017-06-01

    The study aimed to develop a Biology learning evaluation model for senior high schools, referring to the research and development model of Borg & Gall and the logic model. The evaluation model included the components of input, activities, output and outcomes. The development procedure involved a preliminary study in the form of observation and a theoretical review of Biology learning evaluation in senior high schools. Product development was carried out by designing the evaluation model, designing an instrument, trialling the instrument and implementing it. The instrument trial involved teachers and Grade XII students from senior high schools in the city of Yogyakarta. For data gathering, the researchers used an observation sheet, a questionnaire and a test. The questionnaire was applied to obtain information on teacher performance, learning performance, classroom atmosphere and scientific attitude, while the test was applied to obtain information on Biology concept mastery. For the analysis of the instrument construct, the researchers performed confirmatory factor analysis using the Lisrel 0.80 software; the results showed that the evaluation instrument was valid and reliable. The construct validity was between 0.43-0.79, while the reliability of the measurement model was between 0.88-0.94. Finally, the model feasibility test showed that the theoretical model was supported by the empirical data.

  18. Modeling of biological intelligence for SCM system optimization.

    Science.gov (United States)

    Chen, Shengyong; Zheng, Yujun; Cattani, Carlo; Wang, Wanliang

    2012-01-01

    This article summarizes some methods from biological intelligence for the modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune systems, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open, and self-organizing, and is maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, covering the most widely used genetic algorithms and other evolutionary algorithms.

  19. Modeling of Biological Intelligence for SCM System Optimization

    Directory of Open Access Journals (Sweden)

    Shengyong Chen

    2012-01-01

    This article summarizes some methods from biological intelligence for the modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune systems, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open, and self-organizing, and is maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, covering the most widely used genetic algorithms and other evolutionary algorithms.

  20. Modeling of Biological Intelligence for SCM System Optimization

    Science.gov (United States)

    Chen, Shengyong; Zheng, Yujun; Cattani, Carlo; Wang, Wanliang

    2012-01-01

    This article summarizes some methods from biological intelligence for the modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune systems, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open, and self-organizing, and is maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, covering the most widely used genetic algorithms and other evolutionary algorithms. PMID:22162724
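
The genetic algorithms these records survey can be sketched on a toy SCM problem. The facility-location objective and all parameter values below are invented for illustration; the selection/crossover/mutation loop is the generic scheme, not a method from the paper.

```python
import random

def total_cost(genome, open_cost, serve_cost):
    """Toy SCM objective: pay to open each selected warehouse, then serve
    every customer from its cheapest open warehouse."""
    open_idx = [i for i, g in enumerate(genome) if g]
    if not open_idx:
        return float("inf")                   # no open warehouse: infeasible
    cost = sum(open_cost[i] for i in open_idx)
    for row in serve_cost:                    # one row of serving costs per customer
        cost += min(row[i] for i in open_idx)
    return cost

def genetic_algorithm(open_cost, serve_cost, pop=30, gens=60):
    """Plain genetic algorithm: truncation selection, one-point crossover,
    bit-flip mutation, with the better half carried over (elitism)."""
    n = len(open_cost)
    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda g: total_cost(g, open_cost, serve_cost))
        parents = population[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, n)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.1:          # bit-flip mutation
                j = random.randrange(n)
                child[j] = 1 - child[j]
            children.append(child)
        population = parents + children
    return min(population, key=lambda g: total_cost(g, open_cost, serve_cost))
```

On a small instance the algorithm quickly settles on a plan no worse than opening every warehouse.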

  1. The role of crown architecture for light harvesting and carbon gain in extreme light environments assessed with a structurally realistic 3-D model

    Directory of Open Access Journals (Sweden)

    Valladares, Fernando

    2000-06-01

    Full Text Available Main results from different studies of crown architecture adaptation to extreme light environments are presented. Light capture and carbon gain by plants from low (forest understory and high (open Mediterranean-type ecosystems light environments were simulated with a 3-D model (YPLANT, which was developed specifically to analyse the structural features that determine light interception and photosynthesis at the whole plant level. Distantly related taxa with contrasting architectures exhibited similar efficiencies of light interception (functional convergence. Between habitats large differences in architecture existed depending on whether light capture must be maximised or whether excess photon flux density must be avoided. These differences are realised both at the species level and within a species because of plastic adjustments of crown architecture to the external light environment. Realistic, 3-D architectural models are indispensable tools in this kind of comparative studies due to the intrinsic complexity of plant architecture. Their efficient development requires a fluid exchange of ideas between botanists, ecologists and plant modellers.Se presentan los resultados principales de varios estudios sobre las adaptaciones del follaje a ambientes lumínicos extremos. Plantas de ambientes oscuros (sotobosques de bosques templados y tropicales y de ambientes muy luminosos (ecosistemas abiertos de tipo Mediterráneo han sido estudiadas mediante un modelo (YPLANT que permite la reconstrucción tridimensional de la parte aérea de las plantas e identificar los rasgos estructurales que determinan la interceptación de luz y la fotosíntesis y transpiraci6n potencial a nivel de toda la copa. Taxones no relacionados y con arquitecturas muy diferentes mostraron una eficiencia en la interceptaci6n de luz similar (convergencia funcional. La comparación entre hábitat revelo grandes diferencias arquitecturales dependiendo de si la absorción de luz deb

  2. Boolean Models of Biological Processes Explain Cascade-Like Behavior.

    Science.gov (United States)

    Chen, Hao; Wang, Guanyu; Simha, Rahul; Du, Chenghang; Zeng, Chen

    2016-01-29

    Biological networks play a key role in determining biological function, and an understanding of their structure and dynamics is therefore of central interest in systems biology. In Boolean models of such networks, the status of each molecule is either "on" or "off", and as the molecules interact with each other, their individual status changes from "on" to "off" or vice versa, so that the system of molecules in the network collectively goes through a sequence of changes in state. This sequence of changes is termed a biological process. In this paper, we examine the common perception that events in biomolecular networks occur sequentially, in a cascade-like manner, and ask whether this is likely to be an inherent property. In further investigations of the budding and fission yeast cell-cycle, we identify two generic dynamical rules. A Boolean system that complies with these rules will automatically have a certain robustness. By considering the biological requirements of robustness and designability, we show that such Boolean dynamical systems, compared to an arbitrary dynamical system, statistically present the characteristics of cascadeness and sequentiality observed in the budding and fission yeast cell-cycle. These results suggest that cascade-like behavior might be an intrinsic property of biological processes.
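
The synchronous Boolean update described above can be sketched in a few lines. The three-node cascade is a hypothetical toy, not the yeast cell-cycle network itself; it just shows how an "on" signal propagates one node per time step.

```python
def step(state, rules):
    """One synchronous Boolean update: every node recomputes its value
    from the current state simultaneously."""
    return {node: rule(state) for node, rule in rules.items()}

# A drives B, B drives C: a minimal cascade (hypothetical network)
rules = {
    "A": lambda s: s["A"],   # A holds its value
    "B": lambda s: s["A"],   # B follows A with a one-step delay
    "C": lambda s: s["B"],   # C follows B with a one-step delay
}

state = {"A": True, "B": False, "C": False}
trajectory = [dict(state)]
for _ in range(3):
    state = step(state, rules)
    trajectory.append(dict(state))
# the "on" signal travels down the cascade one node per step
```

Iterating until the state repeats would locate the attractor that a real cell-cycle analysis looks for.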

  3. A phenomenological biological dose model for proton therapy based on linear energy transfer spectra.

    Science.gov (United States)

    Rørvik, Eivind; Thörnqvist, Sara; Stokkevåg, Camilla H; Dahle, Tordis J; Fjaera, Lars Fredrik; Ytre-Hauge, Kristian S

    2017-06-01

    The relative biological effectiveness (RBE) of protons varies with the radiation quality, quantified by the linear energy transfer (LET). Most phenomenological models employ a linear dependency on the dose-averaged LET (LETd) to calculate the biological dose. However, several experiments have indicated a possible non-linear trend. Our aim was to investigate whether biological dose models including non-linear LET dependencies should be considered, by introducing a LET-spectrum-based dose model. The RBE-LET relationship was investigated by fitting polynomials of 1st to 5th degree to a database of 85 data points from aerobic in vitro experiments. We included both unweighted and weighted regression, the latter taking experimental uncertainties into account. Statistical testing was performed to decide whether higher-degree polynomials provided better fits to the data than lower degrees. The newly developed models were compared to three published LETd-based models for a simulated spread-out Bragg peak (SOBP) scenario. The statistical analysis of the weighted regression favored a non-linear RBE-LET relationship, with a quartic polynomial found to best represent the experimental data (P = 0.010). The results of the unweighted regression analysis were on the borderline of statistical significance for non-linear functions (P = 0.053), and with the current database a linear dependency could not be rejected. For the SOBP scenario, the weighted non-linear model estimated a similar mean RBE value (1.14) compared to the three established models (1.13-1.17). The unweighted model calculated a considerably higher RBE value (1.22). The analysis indicated that non-linear models could give a better representation of the RBE-LET relationship. However, this is not decisive, as inclusion of the experimental uncertainties in the regression analysis had a significant impact on the determination and ranking of the models. As differences between the models were
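
The weighted regression the authors describe downweights uncertain data points by 1/sigma^2. A minimal first-degree sketch (the paper fits 1st- to 5th-degree polynomials; the data points below are invented to show the effect of the weights):

```python
def weighted_linear_fit(x, y, sigma):
    """Weighted least-squares line y = a + b*x with weights w_i = 1/sigma_i^2,
    so points with smaller experimental uncertainty pull harder on the fit."""
    w = [1.0 / s ** 2 for s in sigma]
    sw = sum(w)
    xm = sum(wi * xi for wi, xi in zip(w, x)) / sw   # weighted mean of x
    ym = sum(wi * yi for wi, yi in zip(w, y)) / sw   # weighted mean of y
    b = (sum(wi * (xi - xm) * (yi - ym) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xm) ** 2 for wi, xi in zip(w, x)))
    a = ym - b * xm
    return a, b

# three precise points lying on RBE = 1 + 0.04*LET plus one noisy outlier;
# the outlier's large sigma all but removes it from the weighted fit
let = [1.0, 2.0, 3.0, 4.0]
rbe = [1.04, 1.08, 1.12, 2.00]
sigma = [0.01, 0.01, 0.01, 1.00]
a, b = weighted_linear_fit(let, rbe, sigma)
```

An unweighted fit of the same four points would be dragged well away from the underlying line, which is the sensitivity to uncertainties that the abstract highlights.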

  4. Realistic Simulation of Rice Plant

    Directory of Open Access Journals (Sweden)

    Wei-long DING

    2011-09-01

    Existing research results on the virtual modeling of the rice plant, however, are far from perfect compared with those for other crops, owing to its complex structure and growth process. Techniques to visually simulate the architecture of the rice plant and its growth process are presented, based on an analysis of the morphological characteristics at different stages. Firstly, simulations of the geometrical shape, the bending status and the structural distortion of rice leaves are conducted. Then, by using an improved model for bending deformation, the curved patterns of the panicle axis and various types of panicle branches are generated, and the spatial shape of the rice panicle is thereby created. A parametric L-system is employed to generate its topological structure, and a finite-state automaton is adopted to describe the development of the geometrical structures. Finally, computer visualization of the three-dimensional morphology of the rice plant at both the organ and individual levels is achieved. The experimental results showed that the proposed methods of modeling the three-dimensional shapes of organs and simulating the growth of the rice plant are feasible and effective, and that the generated three-dimensional images are realistic.
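
The L-system rewriting at the heart of such topology generation is simple to sketch. This is a bare-bones, non-parametric cousin of the parametric L-system used for rice; the branching rule below is a generic textbook grammar, not the paper's.

```python
def lsystem(axiom, rules, iterations):
    """Deterministic L-system: rewrite every symbol in parallel each pass."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# F = grow a segment, [ ] = push/pop a branch, + / - = turn; a turtle-graphics
# renderer would later interpret the string to draw the plant geometry
rules = {"F": "F[+F]F[-F]F"}
derivation = lsystem("F", rules, 2)
```

Each pass replaces every `F` in parallel, so the string (and the branching structure it encodes) grows geometrically with the iteration count.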

  5. Quantitative relationship between SAR and temperature rise inside eyeball in a realistic human head model for 1.5 GHz-microwave exposure; 1.5GHz maikuroha wo abita tobu real model ni okeru gankyunai no hikyushuritsu to josho ondo tono teiryo kankei

    Energy Technology Data Exchange (ETDEWEB)

    Takai, K.; Fujiwara, O. [Nagoya Institute of Technology, Nagoya (Japan)

    1997-12-20

    For investigating the biological effects of a localized SAR (specific absorption rate) deposited in a human body by electromagnetic wave exposure, it is indispensable to grasp the temperature rise inside the human brain, which includes the control center for body temperature. This paper numerically analyzes the temperature rise inside an eyeball of our previously developed realistic head model for 1.5 GHz microwave exposure, using the FD-TD (finite-difference time-domain) method. The computed results are validated by comparison with the data obtained by Taflove and his colleague. In order to examine the quantitative relationship between the localized SAR and the temperature rise, we also determined the amount of tissue over which the localized SAR should be averaged so as to best reflect the temperature-rise distribution inside the eyeball. 15 refs., 9 figs., 3 tabs.

  6. A Radiosity Approach to Realistic Image Synthesis

    Science.gov (United States)

    1992-12-01

    AD-A259 082. AFIT/GCE/ENG/92D-09. A Radiosity Approach to Realistic Image Synthesis. Thesis, Richard L. Remington, Captain, USAF. Approved for public release; distribution unlimited. ... assistance in creating the input geometry file for the AWACS aircraft interior. Without his assistance, a good model for the diffuse radiosity implementation

  7. Generative models versus underlying symmetries to explain biological pattern.

    Science.gov (United States)

    Frank, S A

    2014-06-01

    Mathematical models play an increasingly important role in the interpretation of biological experiments. Studies often present a model that generates the observations, connecting hypothesized process to an observed pattern. Such generative models confirm the plausibility of an explanation and make testable hypotheses for further experiments. However, studies rarely consider the broad family of alternative models that match the same observed pattern. The symmetries that define the broad class of matching models are in fact the only aspects of information truly revealed by observed pattern. Commonly observed patterns derive from simple underlying symmetries. This article illustrates the problem by showing the symmetry associated with the observed rate of increase in fitness in a constant environment. That underlying symmetry reveals how each particular generative model defines a single example within the broad class of matching models. Further progress on the relation between pattern and process requires deeper consideration of the underlying symmetries. © 2014 The Author. Journal of Evolutionary Biology © 2014 European Society For Evolutionary Biology.

  8. A framework to establish credibility of computational models in biology.

    Science.gov (United States)

    Patterson, Eann A; Whelan, Maurice P

    2017-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed and the creation of 'digital twins' proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. A methodology to annotate systems biology markup language models with the synthetic biology open language.

    Science.gov (United States)

    Roehner, Nicholas; Myers, Chris J

    2014-02-21

    Recently, we have begun to witness the potential of synthetic biology, noted here in the form of bacteria and yeast that have been genetically engineered to produce biofuels, manufacture drug precursors, and even invade tumor cells. The success of these projects, however, has often failed in translation and application to new projects, a problem exacerbated by a lack of engineering standards that combine descriptions of the structure and function of DNA. To address this need, this paper describes a methodology to connect the systems biology markup language (SBML) to the synthetic biology open language (SBOL), existing standards that describe biochemical models and DNA components, respectively. Our methodology involves first annotating SBML model elements such as species and reactions with SBOL DNA components. A graph is then constructed from the model, with vertices corresponding to elements within the model and edges corresponding to the cause-and-effect relationships between these elements. Lastly, the graph is traversed to assemble the annotating DNA components into a composite DNA component, which is used to annotate the model itself and can be referenced by other composite models and DNA components. In this way, our methodology can be used to build up a hierarchical library of models annotated with DNA components. Such a library is a useful input to any future genetic technology mapping algorithm that would automate the process of composing DNA components to satisfy a behavioral specification. Our methodology for SBML-to-SBOL annotation is implemented in the latest version of our genetic design automation (GDA) software tool, iBioSim.
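
The graph construction and traversal step of the methodology can be sketched abstractly. All element ids and component names below are invented for illustration; real SBML and SBOL objects (as handled by iBioSim) are not used, only plain dictionaries.

```python
def assemble_composite(components, edges):
    """Collect the DNA components annotating each model element into one
    composite, visiting elements in cause-and-effect order (Kahn's algorithm
    for topological ordering of a directed acyclic graph).

    'components' maps a model-element id to its annotating DNA component
    name; 'edges' are (cause, effect) pairs between element ids."""
    incoming = {e: 0 for e in components}
    for _, dst in edges:
        incoming[dst] += 1
    ready = [e for e, n in incoming.items() if n == 0]  # no unprocessed causes
    composite = []
    while ready:
        e = ready.pop(0)
        composite.append(components[e])
        for src, dst in edges:
            if src == e:
                incoming[dst] -= 1
                if incoming[dst] == 0:
                    ready.append(dst)
    return composite

# hypothetical annotated model: a promoter species regulating a CDS species
components = {"promoter_species": "pTet_part", "cds_species": "gfp_part"}
edges = [("promoter_species", "cds_species")]
composite = assemble_composite(components, edges)
```

The resulting ordered list plays the role of the composite DNA component that annotates the model as a whole and can be referenced by other composites.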

  10. Natural crayfish clone as emerging model for various biological ...

    Indian Academy of Sciences (India)

    Marmorkrebs: Natural crayfish clone as emerging model for various biological disciplines. Günter Vogt. Mini-review, Journal of Biosciences, Volume 36, Issue 2, June 2011, pp 377-382.

  11. Learning through Creating Robotic Models of Biological Systems

    Science.gov (United States)

    Cuperman, Dan; Verner, Igor M.

    2013-01-01

    This paper considers an approach to studying issues in technology and science, which integrates design and inquiry activities towards creating and exploring technological models of scientific phenomena. We implemented this approach in a context where the learner inquires into a biological phenomenon and develops its representation in the form of a…

  12. Model calculations of nuclear data for biologically-important elements

    International Nuclear Information System (INIS)

    Chadwick, M.B.; Blann, M.; Reffo, G.; Young, P.G.

    1994-05-01

    We describe calculations of neutron-induced reactions on carbon and oxygen for incident energies up to 70 MeV, the relevant clinical energy in radiation neutron therapy. Our calculations using the FKK-GNASH, GNASH, and ALICE codes are compared with experimental measurements, and their usefulness for modeling reactions on biologically-important elements is assessed

  13. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Ihekwaba, Adoha

    2007-01-01

    A. Ihekwaba, R. Mardare. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems. Case study: NFkB system. In Proc. of International Conference of Computational Methods in Sciences and Engineering (ICCMSE), American Institute of Physics, AIP Proceedings, N 2...

  14. Part 6: Modelling of simultaneous chemical-biological P removal ...

    African Journals Online (AJOL)

    drinie

    approaches taken in modelling the chemical P removal processes. In the literature ... to 2 mgP/l) for an iron dose of ~1 to 10 mg/l as Fe - refer to dashed line in Fig. 1). ... systems exhibiting biological enhanced phosphate removal. Part 3:

  15. Universally sloppy parameter sensitivities in systems biology models.

    Directory of Open Access Journals (Sweden)

    Ryan N Gutenkunst

    2007-10-01

    Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.
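
A sloppy sensitivity spectrum can be demonstrated on the textbook two-exponential example, where nearly redundant rate constants produce one stiff and one very soft parameter direction. This model is illustrative, not one of the 17 literature models the paper analyzes.

```python
import math

def sloppiness_spectrum(k1, k2, times):
    """Eigenvalues of J^T J for the two-exponential model
    y(t) = exp(-k1*t) + exp(-k2*t).

    J[i][j] = dy(t_i)/dk_j; for a symmetric 2x2 matrix the eigenvalues
    have a closed form, so no linear-algebra library is needed."""
    J = [(-t * math.exp(-k1 * t), -t * math.exp(-k2 * t)) for t in times]
    a = sum(j1 * j1 for j1, _ in J)        # (J^T J)[0][0]
    b = sum(j1 * j2 for j1, j2 in J)       # off-diagonal element
    c = sum(j2 * j2 for _, j2 in J)        # (J^T J)[1][1]
    disc = math.sqrt(((a - c) / 2.0) ** 2 + b * b)
    return (a + c) / 2.0 + disc, (a + c) / 2.0 - disc

# similar rates make the two Jacobian columns nearly parallel, so the
# eigenvalues separate by orders of magnitude: a "sloppy" spectrum
stiff, soft = sloppiness_spectrum(1.0, 1.2, [0.5 * i for i in range(1, 21)])
```

The stiff eigenvalue corresponds to shifting both rates together (well constrained by data), while the soft one corresponds to trading them off against each other (poorly constrained), the signature the abstract describes.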

  16. Universally sloppy parameter sensitivities in systems biology models.

    Science.gov (United States)

    Gutenkunst, Ryan N; Waterfall, Joshua J; Casey, Fergal P; Brown, Kevin S; Myers, Christopher R; Sethna, James P

    2007-10-01

    Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.

  17. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  18. Computer modeling in developmental biology: growing today, essential tomorrow.

    Science.gov (United States)

    Sharpe, James

    2017-12-01

    D'Arcy Thompson was a true pioneer, applying mathematical concepts and analyses to the question of morphogenesis over 100 years ago. The centenary of his famous book, On Growth and Form, is therefore a great occasion on which to review the types of computer modeling now being pursued to understand the development of organs and organisms. Here, I present some of the latest modeling projects in the field, covering a wide range of developmental biology concepts, from molecular patterning to tissue morphogenesis. Rather than classifying them according to scientific question, or scale of problem, I focus instead on the different ways that modeling contributes to the scientific process and discuss the likely future of modeling in developmental biology. © 2017. Published by The Company of Biologists Ltd.

  19. Evaluation of radiobiological effects in 3 distinct biological models

    International Nuclear Information System (INIS)

    Lemos, J.; Costa, P.; Cunha, L.; Metello, L.F.; Carvalho, A.P.; Vasconcelos, V.; Genesio, P.; Ponte, F.; Costa, P.S.; Crespo, P.

    2015-01-01

    Full text of publication follows. The present work aims to share the process of developing advanced biological models for the study of radiobiological effects. Recognizing several known limitations and difficulties of current monolayer cellular models, as well as the increasing difficulty of using advanced biological models, our group has been developing alternative advanced models, namely three-dimensional cell cultures and a less explored animal model, the zebrafish (Danio rerio), which gives access to inter-generational data while showing great genetic homology to humans. These 3 models (monolayer cellular model, three-dimensional cell cultures and zebrafish) were externally irradiated with 100 mGy, 500 mGy or 1 Gy, and the consequences of that irradiation were studied using cellular and molecular tests. Our previous experimental studies with 100 mGy external gamma irradiation of HepG2 monolayer cells showed a slight increase in the proliferation rate at 24 h, 48 h and 72 h post irradiation. These results also pointed to the presence of certain bystander effects 72 h post irradiation, and constituted the starting point for the more detailed analysis carried out in this work. At this stage, we remain focused on the acute biological effects. Results obtained, namely from MTT and clonogenic assays evaluating cellular metabolic activity and proliferation in the in vitro models, as well as from proteomics evaluating in vivo effects, will be presented, discussed and explained. Several hypotheses will be presented and defended based on the facts previously demonstrated.
This work aims at sharing the actual state and the results already available from this medium-term project, building the proof of the added value of applying these advanced models, while demonstrating the strongest and weakest points of all of them (thus allowing comparison between them and informing the subsequent choice for research groups starting

  20. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin Bhattacharya

    2012-12-01

    Full Text Available We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of ‘toxicity pathways’ is described in the context of the 2007 US National Academies of Science report, Toxicity Testing in the 21st Century: A Vision and A Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity), a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understanding perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, enabling quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  1. Biologically based modelling and simulation of carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Ouchi, Noriyuki B.

    2003-01-01

    The process of carcinogenesis is studied by computer simulation. In general, a large number of experimental samples is needed to detect mutations at low doses, but in practice it is difficult to obtain that much data. To address the low-dose situation, it is preferable to study the process of carcinogenesis using a biologically based mathematical model. We have mainly studied it using the so-called 'multi-stage model'; however, this model becomes complicated as recent findings from molecular biology experiments are incorporated. Moreover, because the basic idea of the multi-stage model rests on the epidemiologic observation of log-log variation of cancer incidence with age, it is difficult to compare with experimental data from irradiated cell culture systems, which have been accumulating in recent years. Taking the above into consideration, we concluded that a new model should be built with the following features: 1) the unit of the target system is a cell, 2) new information from molecular biology can be easily introduced, 3) spatial coordinates are included for tracking colony formation or tumorigenesis. In this presentation, we show the details of the model and some simulation results on carcinogenesis. (author)
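The classical multi-stage picture that this abstract builds on (the Armitage-Doll model, not the author's new cell-based model) predicts that the hazard grows like t^(k-1) for k mutational stages, which is exactly the log-log age relationship mentioned. A minimal sketch, with illustrative rate and stage values:

```python
import math

# Armitage-Doll sketch: a cell must pass through k mutational stages, each at
# rate mu. For mu*t << 1 the stage occupancy is Poisson,
#   p_j(t) = (mu*t)^j * exp(-mu*t) / j!,
# and the incidence rate h(t) = mu * p_{k-1}(t) grows like t^(k-1):
# a straight line of slope k-1 on log-log axes.
def hazard(t, mu=1e-3, k=5):
    j = k - 1
    return mu * (mu * t) ** j * math.exp(-mu * t) / math.factorial(j)

t1, t2 = 10.0, 50.0
slope = (math.log(hazard(t2)) - math.log(hazard(t1))) / (math.log(t2) - math.log(t1))
print(slope)   # close to k - 1 = 4
```

The small deviation from exactly k - 1 comes from the exp(-mu*t) factor, which only matters once mu*t is no longer negligible.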

  2. Methods and models in mathematical biology deterministic and stochastic approaches

    CERN Document Server

    Müller, Johannes

    2015-01-01

    This book developed from classes in mathematical biology taught by the authors over several years at the Technische Universität München. The main themes are modeling principles, mathematical principles for the analysis of these models, and model-based analysis of data. The key topics of modern biomathematics are covered: ecology, epidemiology, biochemistry, regulatory networks, neuronal networks, and population genetics. A variety of mathematical methods are introduced, ranging from ordinary and partial differential equations to stochastic graph theory and branching processes. A special emphasis is placed on the interplay between stochastic and deterministic models.

  3. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    OpenAIRE

    Matthew P. Adams; Catherine J. Collier; Sven Uthicke; Yan X. Ow; Lucas Langlois; Katherine R. O’Brien

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluat...

  4. Realistic neurons can compute the operations needed by quantum probability theory and other vector symbolic architectures.

    Science.gov (United States)

    Stewart, Terrence C; Eliasmith, Chris

    2013-06-01

    Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).
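The algebraic operations shared by QP models and other VSAs can be illustrated with holographic reduced representations, a standard VSA scheme (our choice of example, not specified in the abstract): circular convolution binds two vectors, and correlating with an approximate inverse unbinds them.

```python
import math, random

# HRR-style binding: circular convolution combines a role and a filler into a
# single trace vector; convolving the trace with the role's involution
# (approximate inverse) recovers a noisy version of the filler.
random.seed(0)
n = 512

def unit_vec():
    v = [random.gauss(0.0, 1.0 / math.sqrt(n)) for _ in range(n)]
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

def cconv(x, y):                       # circular convolution (binding)
    return [sum(x[i] * y[(k - i) % n] for i in range(n)) for k in range(n)]

def involution(y):                     # approximate inverse for unbinding
    return [y[(-k) % n] for k in range(n)]

def cosine(x, y):
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return dot / (nx * ny)

role, filler, other = unit_vec(), unit_vec(), unit_vec()
trace = cconv(role, filler)                 # bind role and filler
recovered = cconv(trace, involution(role))  # unbind with the role's inverse

print(cosine(recovered, filler))   # clearly similar to the stored filler
print(cosine(recovered, other))    # near zero for an unrelated vector
```

A cleanup memory (nearest stored vector) would then snap the noisy `recovered` back to the exact `filler`, which is the step Stewart and Eliasmith implement with spiking neurons.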

  5. Green Algae as Model Organisms for Biological Fluid Dynamics

    Science.gov (United States)

    Goldstein, Raymond E.

    2015-01-01

    In the past decade, the volvocine green algae, spanning from the unicellular Chlamydomonas to multicellular Volvox, have emerged as model organisms for a number of problems in biological fluid dynamics. These include flagellar propulsion, nutrient uptake by swimming organisms, hydrodynamic interactions mediated by walls, collective dynamics and transport within suspensions of microswimmers, the mechanism of phototaxis, and the stochastic dynamics of flagellar synchronization. Green algae are well suited to the study of such problems because of their range of sizes (from 10 μm to several millimeters), their geometric regularity, the ease with which they can be cultured, and the availability of many mutants that allow for connections between molecular details and organism-level behavior. This review summarizes these recent developments and highlights promising future directions in the study of biological fluid dynamics, especially in the context of evolutionary biology, that can take advantage of these remarkable organisms.

  6. Dynamics of mathematical models in biology bringing mathematics to life

    CERN Document Server

    Zazzu, Valeria; Guarracino, Mario

    2016-01-01

    This volume focuses on contributions from both the mathematics and life science communities surrounding the concepts of time and the dynamicity of nature, two significant elements which are often overlooked in the modeling process to avoid exponential computations. The book is divided into three distinct parts: dynamics of genomes and genetic variation, dynamics of motifs, and dynamics of biological networks. Chapters included in dynamics of genomes and genetic variation analyze the molecular mechanisms and evolutionary processes that shape the structure and function of genomes and those that govern genome dynamics. The dynamics of motifs portion of the volume provides an overview of current methods for motif searching in DNA, RNA and proteins, a key process to discover emergent properties of cells, tissues, and organisms. The part devoted to the dynamics of biological networks discusses networks in complex biological functions and activities that interpret processes in cells. Moreover, chapters i...

  7. Biological parameters for lung cancer in mathematical models of carcinogenesis

    International Nuclear Information System (INIS)

    Jacob, P.; Jacob, V.

    2003-01-01

    Applications of the two-step model of carcinogenesis with clonal expansion (TSCE) to lung cancer data are reviewed, including those on atomic bomb survivors from Hiroshima and Nagasaki, British doctors, Colorado Plateau miners, and Chinese tin miners. Different sets of identifiable model parameters are used in the literature. The parameter set that could be determined with the lowest uncertainty consists of the net proliferation rate gamma of intermediate cells, the hazard h_55 at an intermediate age, and the hazard H∞ at an asymptotically large age. Also, the values of these three parameters obtained in the various studies are more consistent than those of other identifiable combinations of the biological parameters. Based on representative results for these three parameters, implications for the biological parameters in the TSCE model are derived. (author)

  8. Multiway modeling and analysis in stem cell systems biology

    Directory of Open Access Journals (Sweden)

    Vandenberg Scott L

    2008-07-01

    Full Text Available Abstract Background Systems biology refers to multidisciplinary approaches designed to uncover emergent properties of biological systems. Stem cells are an attractive target for this analysis, due to their broad therapeutic potential. A central theme of systems biology is the use of computational modeling to reconstruct complex systems from a wealth of reductionist, molecular data (e.g., gene/protein expression, signal transduction activity, metabolic activity, etc.). A number of deterministic, probabilistic, and statistical learning models are used to understand sophisticated cellular behaviors such as protein expression during cellular differentiation and the activity of signaling networks. However, many of these models are bimodal, i.e., they only consider row-column relationships. In contrast, multiway modeling techniques (also known as tensor models) can analyze multimodal data, which capture much more information about complex behaviors such as cell differentiation. In particular, tensors can be very powerful tools for modeling the dynamic activity of biological networks over time. Here, we review the application of systems biology to stem cells and illustrate application of tensor analysis to model collagen-induced osteogenic differentiation of human mesenchymal stem cells. Results We applied Tucker1, Tucker3, and Parallel Factor Analysis (PARAFAC) models to identify protein/gene expression patterns during extracellular matrix-induced osteogenic differentiation of human mesenchymal stem cells. In one case, we organized our data into a tensor of type protein/gene locus link × gene ontology category × osteogenic stimulant, and found that our cells expressed two distinct, stimulus-dependent sets of functionally related genes as they underwent osteogenic differentiation. In a second case, we organized DNA microarray data in a three-way tensor of gene IDs × osteogenic stimulus × replicates, and found that application of tensile strain to a
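The Tucker1 analysis mentioned above amounts to ordinary matrix factorization applied to one "unfolding" of the tensor. A minimal sketch of mode-1 unfolding, with hypothetical dimensions echoing the genes × stimuli × replicates layout of the second case study:

```python
# Mode-1 unfolding (matricization): the step that lets a Tucker1-style
# analysis treat a genes x stimuli x replicates tensor as an ordinary matrix,
# with rows = genes and columns = every (stimulus, replicate) combination.
genes, stimuli, reps = 3, 2, 2

# Hypothetical expression tensor X[g][s][r], filled with distinct values so
# the unfolding pattern is visible.
X = [[[100 * g + 10 * s + r for r in range(reps)]
      for s in range(stimuli)] for g in range(genes)]

def unfold_mode1(T):
    return [[T[g][s][r] for s in range(stimuli) for r in range(reps)]
            for g in range(genes)]

M = unfold_mode1(X)   # 3 x 4 matrix; an SVD/PCA of M is the Tucker1 step
print(M[1])           # row for gene 1: [100, 101, 110, 111]
```

Tucker3 and PARAFAC go further by factorizing along all three modes at once, which preserves the stimulus and replicate structure that a flat unfolding mixes together.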

  9. Programming biological models in Python using PySB.

    Science.gov (United States)

    Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K

    2013-01-01

    Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
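The "models are programs" idea can be sketched without PySB itself (PySB supplies real macros such as binding and catalysis; the names, rates, and tiny Euler integrator below are our own illustrative stand-ins):

```python
# Spirit of PySB's macro approach in plain Python: a reusable bind() macro
# appends mass-action reactions to a model, and the reaction list is then
# "compiled" into rate equations and integrated. All values are illustrative.
reactions = []

def bind(a, b, complex_name, kf, kr):
    """Macro: reversible binding a + b <-> complex."""
    reactions.append(((a, b), (complex_name,), kf))   # forward
    reactions.append(((complex_name,), (a, b), kr))   # reverse

bind("L", "R", "LR", kf=1.0, kr=0.1)                  # ligand-receptor binding

conc = {"L": 1.0, "R": 1.0, "LR": 0.0}
dt = 0.001
for _ in range(20000):                                # Euler integration to t = 20
    rates = {s: 0.0 for s in conc}
    for reactants, products, k in reactions:
        flux = k
        for r in reactants:
            flux *= conc[r]
        for r in reactants:
            rates[r] -= flux
        for p in products:
            rates[p] += flux
    for s in conc:
        conc[s] += dt * rates[s]

print(conc["LR"])   # most of L is bound at equilibrium (~0.73 here)
```

Because the model is a program, macros like `bind` can be composed, parameterized, and version-controlled exactly as PySB advocates, rather than maintained as hand-written equation lists.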

  10. Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.

    Science.gov (United States)

    Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth

    2017-03-01

    Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated is interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is: is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering, and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model that gives the rates of biochemical reactions, to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models, built using the results of reverse engineering, opens up the possibility of harnessing the wealth of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house. C++ code templates available in Supplementary material. liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  11. Generating realistic images using Kray

    Science.gov (United States)

    Tanski, Grzegorz

    2004-07-01

    Kray is an application for creating realistic images. It is written in the C++ programming language, has a text-based interface, and solves the global illumination problem using techniques such as radiosity, path tracing and photon mapping.

  12. Large Eddy Simulation and Reynolds-Averaged Navier-Stokes modeling of flow in a realistic pharyngeal airway model: an investigation of obstructive sleep apnea.

    Science.gov (United States)

    Mihaescu, Mihai; Murugappan, Shanmugam; Kalra, Maninder; Khosla, Sid; Gutmark, Ephraim

    2008-07-19

    Computational fluid dynamics techniques employing primarily steady Reynolds-Averaged Navier-Stokes (RANS) methodology have been recently used to characterize the transitional/turbulent flow field in human airways. The use of RANS implies that flow phenomena are averaged over time, the flow dynamics not being captured. Further, RANS uses two-equation turbulence models that are not adequate for predicting anisotropic flows, flows with high streamline curvature, or flows where separation occurs. A more accurate approach for such flow situations that occur in the human airway is Large Eddy Simulation (LES). The paper considers flow modeling in a pharyngeal airway model reconstructed from cross-sectional magnetic resonance scans of a patient with obstructive sleep apnea. The airway model is characterized by a maximum narrowing at the site of retropalatal pharynx. Two flow-modeling strategies are employed: steady RANS and the LES approach. In the RANS modeling framework both k-epsilon and k-omega turbulence models are used. The paper discusses the differences between the airflow characteristics obtained from the RANS and LES calculations. The largest discrepancies were found in the axial velocity distributions downstream of the minimum cross-sectional area. This region is characterized by flow separation and large radial velocity gradients across the developed shear layers. The largest difference in static pressure distributions on the airway walls was found between the LES and the k-epsilon data at the site of maximum narrowing in the retropalatal pharynx.

  13. Recurrent Convolutional Neural Networks: A Better Model of Biological Object Recognition.

    Science.gov (United States)

    Spoerer, Courtney J; McClure, Patrick; Kriegeskorte, Nikolaus

    2017-01-01

    Feedforward neural networks provide the dominant model of how the brain performs visual object recognition. However, these networks lack the lateral and feedback connections, and the resulting recurrent neuronal dynamics, of the ventral visual pathway in the human and non-human primate brain. Here we investigate recurrent convolutional neural networks with bottom-up (B), lateral (L), and top-down (T) connections. Combining these types of connections yields four architectures (B, BT, BL, and BLT), which we systematically test and compare. We hypothesized that recurrent dynamics might improve recognition performance in the challenging scenario of partial occlusion. We introduce two novel occluded object recognition tasks to test the efficacy of the models, digit clutter (where multiple target digits occlude one another) and digit debris (where target digits are occluded by digit fragments). We find that recurrent neural networks outperform feedforward control models (approximately matched in parametric complexity) at recognizing objects, both in the absence of occlusion and in all occlusion conditions. Recurrent networks were also found to be more robust to the inclusion of additive Gaussian noise. Recurrent neural networks are better in two respects: (1) they are more neurobiologically realistic than their feedforward counterparts; (2) they are better in terms of their ability to recognize objects, especially under challenging conditions. This work shows that computer vision can benefit from using recurrent convolutional architectures and suggests that the ubiquitous recurrent connections in biological brains are essential for task performance.
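The functional difference between the B and BL architectures can be caricatured at the level of a single unit (a toy scalar sketch with made-up weights, not the paper's convolutional networks): the feedforward response is one pass, while a lateral connection makes the unit iterate on its own output and settle over time.

```python
import math

# B vs BL in miniature: a feedforward (B) unit responds in one pass; adding a
# lateral (L) connection turns the response into recurrent dynamics that
# settle into a fixed point over timesteps. Weights are illustrative.
def b_pass(x, w_b=1.0):
    return math.tanh(w_b * x)

def bl_pass(x, w_b=1.0, w_l=0.5, steps=50):
    h = 0.0
    trajectory = []
    for _ in range(steps):
        h = math.tanh(w_b * x + w_l * h)   # bottom-up drive + lateral recurrence
        trajectory.append(h)
    return trajectory

traj = bl_pass(x=0.3)
print(b_pass(0.3), traj[-1])   # the recurrent unit settles above the one-pass value
```

With |w_l| < 1 the iteration is a contraction and converges; in the full networks, analogous recurrent dynamics let partial evidence from unoccluded regions accumulate and sharpen the representation of an occluded object.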

  14. Echinococcus as a model system: biology and epidemiology.

    Science.gov (United States)

    Thompson, R C A; Jenkins, D J

    2014-10-15

    The introduction of Echinococcus to Australia over 200 years ago and its establishment in sheep rearing areas of the country inflicted a serious medical and economic burden on the country. This resulted in an investment in both basic and applied research aimed at learning more about the biology and life cycle of Echinococcus. This research served to illustrate the uniqueness of the parasite in terms of developmental biology and ecology, and the value of Echinococcus as a model system in a broad range of research, from fundamental biology to theoretical control systems. These studies formed the foundation for an international, diverse and ongoing research effort on the hydatid organisms encompassing stem cell biology, gene regulation, strain variation, wildlife diseases and models of transmission dynamics. We describe the development, nature and diversity of this research, and how it was initiated in Australia but subsequently has stimulated much international and collaborative research on Echinococcus. Copyright © 2014 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

  15. Evolving cell models for systems and synthetic biology.

    Science.gov (United States)

    Cao, Hongqing; Romero-Campero, Francisco J; Heeb, Stephan; Cámara, Miguel; Krasnogor, Natalio

    2010-03-01

    This paper proposes a new methodology for the automated design of cell models for systems and synthetic biology. Our modelling framework is based on P systems, a discrete, stochastic and modular formal modelling language. The automated design of biological models comprising the optimization of the model structure and its stochastic kinetic constants is performed using an evolutionary algorithm. The evolutionary algorithm evolves model structures by combining different modules taken from a predefined module library and then it fine-tunes the associated stochastic kinetic constants. We investigate four alternative objective functions for the fitness calculation within the evolutionary algorithm: (1) equally weighted sum method, (2) normalization method, (3) randomly weighted sum method, and (4) equally weighted product method. The effectiveness of the methodology is tested on four case studies of increasing complexity including negative and positive autoregulation as well as two gene networks implementing a pulse generator and a bandwidth detector. We provide a systematic analysis of the evolutionary algorithm's results as well as of the resulting evolved cell models.
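The four fitness-aggregation schemes can be sketched as follows; the paper's exact formulas are not given in the abstract, so these are the usual readings of each name, with illustrative objective scores.

```python
import random

# Four common ways to collapse several objective scores into one fitness
# value, matching the four schemes named above (usual definitions assumed).
def weighted_sum(objs):                    # (1) equally weighted sum
    return sum(objs) / len(objs)

def normalized_sum(objs, ref):             # (2) normalize by a reference scale first
    return sum(o / r for o, r in zip(objs, ref)) / len(objs)

def random_weighted_sum(objs, rng):        # (3) weights redrawn each evaluation
    w = [rng.random() for _ in objs]
    total = sum(w)
    return sum(wi * o for wi, o in zip(w, objs)) / total

def weighted_product(objs):                # (4) equally weighted product
    p = 1.0
    for o in objs:
        p *= o
    return p ** (1.0 / len(objs))          # geometric-mean form

objs = [0.9, 0.1, 0.5]                     # scores of one candidate cell model
print(weighted_sum(objs))                  # 0.5
print(weighted_product(objs))              # lower: the weak objective dominates
```

The comparison at the end shows why the choice matters for the evolutionary search: the product form punishes a model that fails any single objective, while the sum form lets strong objectives compensate for weak ones.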

  16. Bifurcations of a class of singular biological economic models

    International Nuclear Information System (INIS)

    Zhang Xue; Zhang Qingling; Zhang Yue

    2009-01-01

    This paper systematically studies a prey-predator singular biological economic model with time delay. It shows that this model exhibits two bifurcation phenomena when the economic profit is zero. One is a transcritical bifurcation, which changes the stability of the system, and the other is a singularity-induced bifurcation, which indicates that zero economic profit brings an impulse, i.e., in biological terms, a rapid expansion of the population. On the other hand, if the economic profit is positive, then at a critical value of the bifurcation parameter the system undergoes a Hopf bifurcation, i.e., the increase of delay destabilizes the system and bifurcates into a small-amplitude periodic solution. Finally, numerical simulations using Matlab illustrate the effectiveness of the results obtained here. In addition, we show numerically that the system undergoes a saddle-node bifurcation when the bifurcation parameter passes through a critical value of positive economic profit.

  17. Enterococcus infection biology: lessons from invertebrate host models.

    Science.gov (United States)

    Yuen, Grace J; Ausubel, Frederick M

    2014-03-01

    The enterococci are commensals of the gastrointestinal tract of many metazoans, from insects to humans. While they normally do not cause disease in the intestine, they can become pathogenic when they infect sites outside of the gut. Recently, the enterococci have become important nosocomial pathogens, with the majority of human enterococcal infections caused by two species, Enterococcus faecalis and Enterococcus faecium. Studies using invertebrate infection models have revealed insights into the biology of enterococcal infections, as well as general principles underlying host innate immune defense. This review highlights recent findings on Enterococcus infection biology from two invertebrate infection models, the greater wax moth Galleria mellonella and the free-living bacteriovorous nematode Caenorhabditis elegans.

  18. Any realistic theory must be computationally realistic: a response to N. Gisin's definition of a Realistic Physics Theory

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with the requirement that any realistic theory must be computationally realistic as well.

  19. Dynamic models in research and management of biological invasions.

    Science.gov (United States)

    Buchadas, Ana; Vaz, Ana Sofia; Honrado, João P; Alagador, Diogo; Bastos, Rita; Cabral, João A; Santos, Mário; Vicente, Joana R

    2017-07-01

    Invasive species are increasing in number, extent and impact worldwide. Effective invasion management has thus become a core socio-ecological challenge. To tackle this challenge, integrating the spatial-temporal dynamics of invasion processes with modelling approaches is a promising strategy. The inclusion of dynamic processes in such modelling frameworks (i.e. dynamic or hybrid models, here defined as models that integrate both dynamic and static approaches) adds an explicit temporal dimension to the study and management of invasions, enabling the prediction of invasions and optimisation of multi-scale management and governance. However, the extent to which dynamic approaches have been used for that purpose is under-investigated. Based on a literature review, we examined the extent to which dynamic modelling has been used to address invasions worldwide. We then evaluated how the use of dynamic modelling has evolved through time in the scope of invasive species management. The results suggest that modelling, in particular dynamic modelling, has been increasingly applied to biological invasions, especially to support management decisions at local scales. Also, the combination of dynamic and static modelling approaches (hybrid models with a spatially explicit output) can be especially effective, not only to support management at early invasion stages (from prevention to early detection), but also to improve the monitoring of invasion processes and impact assessment. Further development and testing of such hybrid models may well be regarded as a priority for future research aiming to improve the management of invasions across scales. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. The University – a Rational-Biologic Model

    Directory of Open Access Journals (Sweden)

    Ion Gh. Rosca

    2008-05-01

    Full Text Available The article advances the extension of the rational-biologic model to organizations, which reproduce and live in a turbulent environment. The current "tree"-type organizations are not able to satisfy the requirements of the socio-economic environment, nor are they able to ensure organizational perpetuation and development. Thus, an innovative performance model for both the upper and lower management areas is presented, with the following recommendations: dividing the organization into departments using neuronal connections, focusing on the forming processes rather than on the activities, and rethinking the system around a new organizational culture.

  1. Biological profiling and dose-response modeling tools ...

    Science.gov (United States)

    Through its ToxCast project, the U.S. EPA has developed a battery of in vitro high throughput screening (HTS) assays designed to assess the potential toxicity of environmental chemicals. At present, over 1800 chemicals have been tested in up to 600 assays, yielding a large number of concentration-response data sets. Standard processing of these data sets involves finding a best fitting mathematical model and the set of model parameters that specify this model. The model parameters include quantities such as the half-maximal activity concentration (or “AC50”) that have biological significance and can be used to inform the efficacy or potency of a given chemical with respect to a given assay. All of this data is processed and stored in an online-accessible database and website: http://actor.epa.gov/dashboard2. Results from these in vitro assays are used in a multitude of ways. New pathways and targets can be identified and incorporated into new or existing adverse outcome pathways (AOPs). Pharmacokinetic models such as those implemented in EPA’s HTTK R package can be used to translate an in vitro concentration into an in vivo dose; i.e., one can predict the oral equivalent dose that might be expected to activate a specific biological pathway. Such predicted values can then be compared with estimated actual human exposures to prioritize chemicals for further testing. Any quantitative examination should be accompanied by estimation of uncertainty. We are developing met
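The concentration-response processing described above can be sketched in miniature: generate responses from a Hill curve with a known AC50, then recover the AC50 by least squares over a log-spaced grid. (Illustrative only; the actual ToxCast pipeline fits several model families and selects the best by an information criterion.)

```python
import math

# Hill model: response rises from 0 toward `top`, reaching half-maximum at
# the AC50 concentration; `n` is the Hill slope.
def hill(c, top=100.0, ac50=1.0, n=1.5):
    return top / (1.0 + (ac50 / c) ** n)

concs = [0.01 * 3 ** i for i in range(10)]        # ~0.01 to ~200 uM, log-spaced
data = [hill(c, ac50=1.0) for c in concs]         # noiseless synthetic responses

def sse(ac50):                                    # sum of squared errors
    return sum((hill(c, ac50=ac50) - y) ** 2 for c, y in zip(concs, data))

grid = [10 ** (e / 20.0) for e in range(-40, 41)] # candidate AC50s, 0.01 .. 100
best = min(grid, key=sse)
print(best)   # recovers the true AC50 of 1.0
```

With real, noisy assay data the fit would also report parameter uncertainty, which is exactly the direction the truncated final sentence of the abstract points toward.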

  2. Mouse models for gastric cancer: Matching models to biological questions

    Science.gov (United States)

    Poh, Ashleigh R; O'Donoghue, Robert J J

    2016-01-01

    Abstract Gastric cancer is the third leading cause of cancer‐related mortality worldwide. This is in part due to the asymptomatic nature of the disease, which often results in late‐stage diagnosis, at which point there are limited treatment options. Even when treated successfully, gastric cancer patients have a high risk of tumor recurrence and acquired drug resistance. It is vital to gain a better understanding of the molecular mechanisms underlying gastric cancer pathogenesis to facilitate the design of new‐targeted therapies that may improve patient survival. A number of chemically and genetically engineered mouse models of gastric cancer have provided significant insight into the contribution of genetic and environmental factors to disease onset and progression. This review outlines the strengths and limitations of current mouse models of gastric cancer and their relevance to the pre‐clinical development of new therapeutics. PMID:26809278

  3. Caenorhabditis elegans, a Biological Model for Research in Toxicology.

    Science.gov (United States)

    Tejeda-Benitez, Lesly; Olivero-Verbel, Jesus

    2016-01-01

    Caenorhabditis elegans is a nematode of microscopic size which, due to its biological characteristics, has been used since the 1970s as a model for research in molecular biology, medicine, pharmacology, and toxicology. It was the first animal whose genome was completely sequenced and has played a key role in the understanding of apoptosis and RNA interference. The transparency of its body, short lifespan, ability to self-fertilize and ease of culture are advantages that make it ideal as a model in toxicology. Due to the fact that some of its biochemical pathways are similar to those of humans, it has been employed in research in several fields. C. elegans' use as a biological model in environmental toxicological assessments allows the determination of multiple endpoints. Some of these utilize the effects on the biological functions of the nematode and others use molecular markers. Endpoints such as lethality, growth, reproduction, and locomotion are the most studied, and usually employ the wild type Bristol N2 strain. Other endpoints use reporter genes, such as green fluorescence protein, driven by regulatory sequences from other genes related to different mechanisms of toxicity, such as heat shock, oxidative stress, CYP system, and metallothioneins among others, allowing the study of gene expression in a manner both rapid and easy. These transgenic strains of C. elegans represent a powerful tool to assess toxicity pathways for mixtures and environmental samples, and their numbers are growing in diversity and selectivity. However, other molecular biology techniques, including DNA microarrays and MicroRNAs have been explored to assess the effects of different toxicants and samples. C. elegans has allowed the assessment of neurotoxic effects for heavy metals and pesticides, among those more frequently studied, as the nematode has a very well defined nervous system. 
More recently, nanoparticles are emergent pollutants whose toxicity can be explored using this nematode

  4. Wave basin model tests of technical-biological bank protection

    Science.gov (United States)

    Eisenmann, J.

    2012-04-01

    Sloped embankments of inland waterways are usually protected from erosion and other negative impacts of ship-induced hydraulic loads by technical revetments consisting of riprap. Concerning the dimensioning of such bank protection there are several design rules available, e.g. the "Principles for the Design of Bank and Bottom Protection for Inland Waterways" or the Code of Practice "Use of Standard Construction Methods for Bank and Bottom Protection on Waterways" issued by the BAW (Federal Waterways Engineering and Research Institute). Since the European Water Framework Directive has been put into action, special emphasis has been placed on natural banks. Therefore the application of technical-biological bank protection is favoured. Currently, design principles for technical-biological bank protection on inland waterways are missing. The existing experience mainly refers to flowing waters with no or low ship-induced hydraulic loads on the banks. Since 2004 the Federal Waterways Engineering and Research Institute has been pursuing the research and development project "Alternative Technical-Biological Bank Protection on Inland Waterways" together with the Federal Institute of Hydrology. The investigation to date includes the examination of waterway sections where technical-biological bank protection is applied locally. For the development of design rules for technical-biological bank protection, investigations shall be carried out in a next step, considering the mechanics and resilience of technical-biological bank protection with special attention to ship-induced hydraulic loads. The presentation gives a short introduction to hydraulic loads at inland waterways and their bank protection. In more detail, model tests of a willow brush mattress as a technical-biological bank protection in a wave basin are explained. Within the scope of these tests the brush mattresses were exposed to wave impacts to determine their resilience towards hydraulic loads. Since the

  5. Modeling Cancer Metastasis using Global, Quantitative and Integrative Network Biology

    DEFF Research Database (Denmark)

    Schoof, Erwin; Erler, Janine

    …understanding of molecular processes which are fundamental to tumorigenesis. In Article 1, we propose a novel framework for how cancer mutations can be studied by taking into account their effect at the protein network level. In Article 2, we demonstrate how global, quantitative data on phosphorylation dynamics… can be generated using MS, and how this can be modeled using a computational framework for deciphering kinase-substrate dynamics. This framework is described in depth in Article 3, and covers the design of KinomeXplorer, which allows the prediction of kinases responsible for modulating observed… phosphorylation dynamics in a given biological sample. In Chapter III, we move into Integrative Network Biology, where, by combining two fundamental technologies (MS & NGS), we can obtain more in-depth insights into the links between cellular phenotype and genotype. Article 4 describes the proof…

  6. Agent-Based Modeling in Molecular Systems Biology.

    Science.gov (United States)

    Soheilypour, Mohammad; Mofrad, Mohammad R K

    2018-06-08

    Molecular systems orchestrating the biology of the cell typically involve a complex web of interactions among various components and span a vast range of spatial and temporal scales. Computational methods have advanced our understanding of the behavior of molecular systems by enabling us to test assumptions and hypotheses, explore the effect of different parameters on the outcome, and eventually guide experiments. While several different mathematical and computational methods are developed to study molecular systems at different spatiotemporal scales, there is still a need for methods that bridge the gap between spatially-detailed and computationally-efficient approaches. In this review, we summarize the capabilities of agent-based modeling (ABM) as an emerging molecular systems biology technique that provides researchers with a new tool in exploring the dynamics of molecular systems/pathways in health and disease. © 2018 WILEY Periodicals, Inc.
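    As an illustration of the agent-based style this review surveys, the toy sketch below tracks individual A and B molecules hopping on a 1-D lattice and binding on contact. All species, rates, and lattice details are invented for the example; real molecular ABM frameworks add geometry, diffusion constants, and reaction kinetics.

    ```python
    import random

    def abm_binding(n_a, n_b, p_bind, steps, length=50, seed=3):
        """Minimal agent-based sketch: A and B agents random-walk on a
        periodic 1-D lattice; a co-located A/B pair binds with
        probability p_bind per step and is removed as a complex."""
        rng = random.Random(seed)
        a = [rng.randrange(length) for _ in range(n_a)]
        b = [rng.randrange(length) for _ in range(n_b)]
        complexes = 0
        for _ in range(steps):
            # Each agent hops one site left or right (periodic boundary).
            a = [(x + rng.choice((-1, 1))) % length for x in a]
            b = [(x + rng.choice((-1, 1))) % length for x in b]
            occupied_b = set(b)
            survivors = []
            for x in a:
                if x in occupied_b and rng.random() < p_bind:
                    complexes += 1
                    occupied_b.discard(x)  # that B is consumed
                    b.remove(x)
                else:
                    survivors.append(x)
            a = survivors
        return complexes

    formed = abm_binding(n_a=30, n_b=30, p_bind=0.5, steps=200)
    ```

    The emergent quantity of interest (here, the number of complexes formed) is not prescribed anywhere in the rules, which is precisely the appeal of the ABM approach for spatially detailed pathways.
    
    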

  7. Estimating confidence intervals in predicted responses for oscillatory biological models.

    Science.gov (United States)

    St John, Peter C; Doyle, Francis J

    2013-07-29

    The dynamics of gene regulation play a crucial role in cellular control, allowing the cell to express the right proteins to meet changing needs. Some needs, such as correctly anticipating the day-night cycle, require complicated oscillatory features. In the analysis of gene regulatory networks, mathematical models are frequently used to understand how a network's structure enables it to respond appropriately to external inputs. These models typically consist of a set of ordinary differential equations, describing a network of biochemical reactions, and unknown kinetic parameters, chosen such that the model best captures experimental data. However, since a model's parameter values are uncertain, and since dynamic responses to inputs are highly parameter-dependent, it is difficult to assess the confidence associated with these in silico predictions. In particular, models with complex dynamics - such as oscillations - must be fit with computationally expensive global optimization routines, and cannot take advantage of existing measures of identifiability. Despite being difficult to model mathematically, limit cycle oscillations play a key role in many biological processes, including cell cycling, metabolism, neuron firing, and circadian rhythms. In this study, we employ an efficient parameter estimation technique to enable a bootstrap uncertainty analysis for limit cycle models. Since the primary role of systems biology models is the insight they provide on responses to rate perturbations, we extend our uncertainty analysis to include first order sensitivity coefficients. Using a literature model of circadian rhythms, we show how predictive precision is degraded with decreasing sample points and increasing relative error. Additionally, we show how this method can be used for model discrimination by comparing the output identifiability of two candidate model structures to published literature data. 
Our method permits modellers of oscillatory systems to confidently
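    A residual-bootstrap confidence interval of the kind this record describes can be illustrated on a toy oscillatory signal. The sine model, noise level, and seed below are invented for the sketch and are unrelated to the circadian model the authors use.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Noisy oscillatory "data": known period, unknown amplitude
    t = np.linspace(0.0, 10.0, 200)
    basis = np.sin(2 * np.pi * t / 5.0)
    y = 2.0 * basis + rng.normal(0.0, 0.2, t.size)

    def fit_amp(obs):
        """Least-squares amplitude for a fixed-period sinusoid."""
        return float(np.dot(basis, obs) / np.dot(basis, basis))

    amp_hat = fit_amp(y)
    resid = y - amp_hat * basis

    # Residual bootstrap: resample residuals, refit, take percentile CI
    boot = [fit_amp(amp_hat * basis +
                    rng.choice(resid, resid.size, replace=True))
            for _ in range(1000)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    ```

    For a genuine limit-cycle model the refit step would be a full (expensive) parameter estimation per bootstrap sample, which is why the efficiency of the estimation technique matters.
    
    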

  8. CaliBayes and BASIS: integrated tools for the calibration, simulation and storage of biological simulation models.

    Science.gov (United States)

    Chen, Yuhui; Lawless, Conor; Gillespie, Colin S; Wu, Jake; Boys, Richard J; Wilkinson, Darren J

    2010-05-01

    Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing sub-cellular molecular interactions, especially when critical molecular species are thought to be present at low copy-numbers. For example, these stochastic effects play an important role in models of human ageing, where ageing results from the long-term accumulation of random damage at various biological scales. Unfortunately, realistic stochastic simulation of discrete biological processes is highly computationally intensive, requiring specialist hardware, and can benefit greatly from parallel and distributed approaches to computation and analysis. For these reasons, we have developed the BASIS system for the simulation and storage of stochastic SBML models together with associated simulation results. This system is exposed as a set of web services to allow users to incorporate its simulation tools into their workflows. Parameter inference for stochastic models is also difficult and computationally expensive. The CaliBayes system provides a set of web services (together with an R package for consuming these and formatting data) which addresses this problem for SBML models. It uses a sequential Bayesian MCMC method, which is powerful and flexible, providing very rich information. However this approach is exceptionally computationally intensive and requires the use of a carefully designed architecture. Again, these tools are exposed as web services to allow users to take advantage of this system. In this article, we describe these two systems and demonstrate their integrated use with an example workflow to estimate the parameters of a simple model of Saccharomyces cerevisiae growth on agar plates.
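    The "realistic stochastic simulation of discrete biological processes" referred to above is typically done with Gillespie's direct method. A minimal sketch for an assumed birth-death toy model follows; it is not a BASIS/SBML workflow, just the core algorithm.

    ```python
    import random

    def gillespie_birth_death(k_birth, k_death, x0, t_end, seed=1):
        """Direct-method SSA for a birth-death process:
        0 -> X at rate k_birth; X -> 0 at rate k_death * x."""
        rng = random.Random(seed)
        t, x = 0.0, x0
        traj = [(t, x)]
        while t < t_end:
            a_birth = k_birth
            a_death = k_death * x
            a_total = a_birth + a_death
            if a_total == 0:
                break
            t += rng.expovariate(a_total)          # waiting time to next event
            if rng.random() * a_total < a_birth:   # pick which reaction fires
                x += 1
            else:
                x -= 1
            traj.append((t, x))
        return traj

    traj = gillespie_birth_death(k_birth=10.0, k_death=1.0, x0=0, t_end=50.0)
    ```

    After an initial transient, the copy number fluctuates around the deterministic steady state k_birth / k_death; at low copy numbers these fluctuations are exactly the stochastic effects the abstract highlights.
    
    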

  9. Realistic Approach for Phasor Measurement Unit Placement

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    This paper presents a realistic cost-effective model for optimal placement of phasor measurement units (PMUs) for complete observability of a power system considering practical cost implications. The proposed model considers hidden or otherwise unaccounted practical costs involved in PMU… installation. Consideration of these hidden but significant and integral parts of total PMU installation costs was inspired from practical experience on a real-life project. The proposed model focuses on the minimization of total realistic costs instead of a widely used theoretical concept of a minimal number… of PMUs. The proposed model has been applied to IEEE 14-bus, IEEE 24-bus, IEEE 30-bus, New England 39-bus, and large power system of 300 buses and real life Danish grid. A comparison of the presented results with those reported by traditional methods has also been shown to justify the effectiveness…
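    Observability-driven PMU placement is usually posed as an integer program; purely as an illustration of the underlying set-cover structure, the sketch below runs a greedy heuristic on an invented 7-bus network, assuming a PMU observes its own bus and all adjacent buses. It is not the cost-aware model of the paper.

    ```python
    def greedy_pmu_placement(adjacency):
        """Greedy set-cover heuristic: repeatedly place a PMU at the bus
        that makes the most currently unobserved buses observable."""
        buses = set(adjacency)
        covered, placed = set(), []
        while covered != buses:
            def gain(b):
                return len(({b} | adjacency[b]) - covered)
            best = max(sorted(buses - set(placed)), key=gain)
            placed.append(best)
            covered |= {best} | adjacency[best]
        return placed

    # Hypothetical 7-bus network: a hub (bus 1) plus a three-bus tail
    net = {1: {2, 3, 4, 5}, 2: {1}, 3: {1}, 4: {1},
           5: {1, 6}, 6: {5, 7}, 7: {6}}
    pmus = greedy_pmu_placement(net)
    ```

    The realistic-cost formulation in the paper would replace the uniform "one PMU per placement" objective with per-site installation costs, turning the greedy gain into a benefit-per-cost ratio or an explicit ILP.
    
    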

  10. RenderGAN: Generating Realistic Labeled Data

    Directory of Open Access Journals (Sweden)

    Leon Sixt

    2018-06-01

    Full Text Available Deep Convolutional Neuronal Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.

  11. Analysis and logical modeling of biological signaling transduction networks

    Science.gov (United States)

    Sun, Zhongyao

    The study of network theory and its applications spans a multitude of seemingly disparate fields of science and technology: computer science, biology, social science, linguistics, etc. It is the intrinsic similarities embedded in the entities and the way they interact with one another in these systems that link them together. In this dissertation, I present, from both the aspect of theoretical analysis and the aspect of application, three projects which primarily focus on signal transduction networks in biology. In these projects, I assembled a network model through extensive perusal of the literature, performed model-based simulations and validation, analyzed network topology, and proposed a novel network measure. The application of network modeling to the system of stomatal opening in plants revealed a fundamental question about the process that had been left unanswered for decades. The novel measure of the redundancy of signal transduction networks with Boolean dynamics, computed from the maximum node-independent elementary signaling mode set, accurately predicts the effect of single-node knockout in such signaling processes. The three projects as an organic whole advance the understanding of a real system as well as the behavior of such network models, giving me an opportunity to take a glimpse at the dazzling facets of the immense world of network science.
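    The Boolean-dynamics framework mentioned above can be sketched with a toy redundant signalling network and a node-knockout test. The network, update rules, and knockout routine below are hypothetical illustrations, not the dissertation's stomatal-opening model or its elementary-signaling-mode measure.

    ```python
    def step(state, rules):
        """Synchronous update: every node applies its Boolean rule at once."""
        return {n: rule(state) for n, rule in rules.items()}

    def fixed_point(state, rules, max_steps=50):
        """Iterate until the state stops changing (or give up on a cycle)."""
        for _ in range(max_steps):
            nxt = step(state, rules)
            if nxt == state:
                return state
            state = nxt
        return None

    # Toy network: signal S activates A and B in parallel;
    # output O fires if either path is intact (redundancy).
    rules = {
        "S": lambda s: s["S"],            # external input, held constant
        "A": lambda s: s["S"],
        "B": lambda s: s["S"],
        "O": lambda s: s["A"] or s["B"],
    }

    def knockout(node):
        """Clamp one node OFF and report the steady-state output."""
        ko_rules = dict(rules)
        ko_rules[node] = lambda s: False
        init = {"S": True, "A": False, "B": False, "O": False}
        init[node] = False
        return fixed_point(init, ko_rules)["O"]
    ```

    Here knocking out A alone leaves the output on (B compensates), while knocking out S silences it; counting such node-independent routes is the intuition behind redundancy measures for signal transduction.
    
    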

  12. Revision history aware repositories of computational models of biological systems.

    Science.gov (United States)

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. 
We have extended the Physiome Model Repository software to be fully revision history aware

  13. Revision history aware repositories of computational models of biological systems

    Directory of Open Access Journals (Sweden)

    Nickerson David P

    2011-01-01

    Full Text Available Abstract Background Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. Results We have extended the Physiome Model

  14. In silico biology of bone modelling and remodelling: adaptation.

    Science.gov (United States)

    Gerhard, Friederike A; Webster, Duncan J; van Lenthe, G Harry; Müller, Ralph

    2009-05-28

    Modelling and remodelling are the processes by which bone adapts its shape and internal structure to external influences. However, the cellular mechanisms triggering osteoclastic resorption and osteoblastic formation are still unknown. In order to investigate current biological theories, in silico models can be applied. In the past, most of these models were based on the continuum assumption, but some questions related to bone adaptation can be addressed better by models incorporating the trabecular microstructure. In this paper, existing simulation models are reviewed and one of the microstructural models is extended to test the hypothesis that bone adaptation can be simulated without particular knowledge of the local strain distribution in the bone. Validation using an experimental murine loading model showed that this is possible. Furthermore, the experimental model revealed that bone formation cannot be attributed only to an increase in trabecular thickness but also to structural reorganization including the growth of new trabeculae. How these new trabeculae arise is still an unresolved issue and might be better addressed by incorporating other levels of hierarchy, especially the cellular level. The cellular level sheds light on the activity and interplay between the different cell types, leading to the effective change in the whole bone. For this reason, hierarchical multi-scale simulations might help in the future to better understand the biomathematical laws behind bone adaptation.

  15. Human pluripotent stem cells: an emerging model in developmental biology.

    Science.gov (United States)

    Zhu, Zengrong; Huangfu, Danwei

    2013-02-01

    Developmental biology has long benefited from studies of classic model organisms. Recently, human pluripotent stem cells (hPSCs), including human embryonic stem cells and human induced pluripotent stem cells, have emerged as a new model system that offers unique advantages for developmental studies. Here, we discuss how studies of hPSCs can complement classic approaches using model organisms, and how hPSCs can be used to recapitulate aspects of human embryonic development 'in a dish'. We also summarize some of the recently developed genetic tools that greatly facilitate the interrogation of gene function during hPSC differentiation. With the development of high-throughput screening technologies, hPSCs have the potential to revolutionize gene discovery in mammalian development.

  16. Experimental, statistical, and biological models of radon carcinogenesis

    International Nuclear Information System (INIS)

    Cross, F.T.

    1991-09-01

    Risk models developed for underground miners have not been consistently validated in studies of populations exposed to indoor radon. Imprecision in risk estimates results principally from differences between exposures in mines as compared to domestic environments and from uncertainties about the interaction between cigarette-smoking and exposure to radon decay products. Uncertainties in extrapolating miner data to domestic exposures can be reduced by means of a broad-based health effects research program that addresses the interrelated issues of exposure, respiratory tract dose, carcinogenesis (molecular/cellular and animal studies, plus developing biological and statistical models), and the relationship of radon to smoking and other copollutant exposures. This article reviews experimental animal data on radon carcinogenesis observed primarily in rats at Pacific Northwest Laboratory. Recent experimental and mechanistic carcinogenesis models of exposures to radon, uranium ore dust, and cigarette smoke are presented with statistical analyses of animal data. 20 refs., 1 fig

  17. Biologically based neural circuit modelling for the study of fear learning and extinction

    Science.gov (United States)

    Nair, Satish S.; Paré, Denis; Vicentic, Aleksandra

    2016-11-01

    The neuronal systems that promote protective defensive behaviours have been studied extensively using Pavlovian conditioning. In this paradigm, an initially neutral-conditioned stimulus is paired with an aversive unconditioned stimulus leading the subjects to display behavioural signs of fear. Decades of research into the neural bases of this simple behavioural paradigm uncovered that the amygdala, a complex structure comprised of several interconnected nuclei, is an essential part of the neural circuits required for the acquisition, consolidation and expression of fear memory. However, emerging evidence from the confluence of electrophysiological, tract tracing, imaging, molecular, optogenetic and chemogenetic methodologies, reveals that fear learning is mediated by multiple connections between several amygdala nuclei and their distributed targets, dynamical changes in plasticity in local circuit elements as well as neuromodulatory mechanisms that promote synaptic plasticity. To uncover these complex relations and analyse multi-modal data sets acquired from these studies, we argue that biologically realistic computational modelling, in conjunction with experiments, offers an opportunity to advance our understanding of the neural circuit mechanisms of fear learning and to address how their dysfunction may lead to maladaptive fear responses in mental disorders.

  18. Progress in realistic LOCA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M Y; Bajorek, S M; Ohkawa, K [Westinghouse Electric Corporation, Pittsburgh, PA (United States)

    1994-12-31

    While LOCA is a complex transient to simulate, the state of the art in thermal hydraulics has advanced sufficiently to allow its realistic prediction and the application of advanced methods to actual reactor design, as demonstrated by the methodology described in this paper. 6 refs, 5 figs, 3 tabs

  19. Time management: a realistic approach.

    Science.gov (United States)

    Jackson, Valerie P

    2009-06-01

    Realistic time management and organization plans can improve productivity and the quality of life. However, these skills can be difficult to develop and maintain. The key elements of time management are goals, organization, delegation, and relaxation. The author addresses each of these components and provides suggestions for successful time management.

  20. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    …an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific…

  1. Bridging Mechanistic and Phenomenological Models of Complex Biological Systems.

    Science.gov (United States)

    Transtrum, Mark K; Qiu, Peng

    2016-05-01

    The inherent complexity of biological systems gives rise to complicated mechanistic models with a large number of parameters. On the other hand, the collective behavior of these systems can often be characterized by a relatively small number of phenomenological parameters. We use the Manifold Boundary Approximation Method (MBAM) as a tool for deriving simple phenomenological models from complicated mechanistic models. The resulting models are not black boxes, but remain expressed in terms of the microscopic parameters. In this way, we explicitly connect the macroscopic and microscopic descriptions, characterize the equivalence class of distinct systems exhibiting the same range of collective behavior, and identify the combinations of components that function as tunable control knobs for the behavior. We demonstrate the procedure for adaptation behavior exhibited by the EGFR pathway. From a 48 parameter mechanistic model, the system can be effectively described by a single adaptation parameter τ characterizing the ratio of time scales for the initial response and recovery time of the system which can in turn be expressed as a combination of microscopic reaction rates, Michaelis-Menten constants, and biochemical concentrations. The situation is not unlike modeling in physics in which microscopically complex processes can often be renormalized into simple phenomenological models with only a few effective parameters. The proposed method additionally provides a mechanistic explanation for non-universal features of the behavior.

  2. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  3. Neural network models for biological waste-gas treatment systems.

    Science.gov (United States)

    Rene, Eldon R; Estefanía López, M; Veiga, María C; Kennes, Christian

    2011-12-15

    This paper outlines the procedure for developing artificial neural network (ANN) based models for three bioreactor configurations used for waste-gas treatment. The three bioreactor configurations chosen for this modelling work were: biofilter (BF), continuous stirred tank bioreactor (CSTB) and monolith bioreactor (MB). Using styrene as the model pollutant, this paper also serves as a general database of information pertaining to bioreactor operation and important factors affecting gas-phase styrene removal in these biological systems. Biological waste-gas treatment systems are considered to be both advantageous and economically effective in treating a stream of polluted air containing low to moderate concentrations of the target contaminant, over a rather wide range of gas-flow rates. The bioreactors were inoculated with the fungus Sporothrix variecibatus, and their performances were evaluated at different empty bed residence times (EBRT) and at different inlet styrene concentrations (C(i)). The experimental data from these bioreactors were modelled to predict the bioreactors' performance in terms of their removal efficiency (RE, %), by adequate training and testing of a three-layered back propagation neural network (input layer-hidden layer-output layer). Two models (BIOF1 and BIOF2) were developed for the BF with different combinations of easily measurable BF parameters as the inputs, that is, concentration (gm(-3)), unit flow (h(-1)) and pressure drop (cm of H(2)O). The model developed for the CSTB used two inputs (concentration and unit flow), while the model for the MB had three inputs (concentration, G/L (gas/liquid) ratio, and pressure drop). Sensitivity analysis in the form of absolute average sensitivity (AAS) was performed for all the developed ANN models to ascertain the importance of the different input parameters, and to assess their direct effect on the bioreactors' performance. The performance of the models was estimated by the regression
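    A three-layered back-propagation network of the kind described can be sketched in a few lines of NumPy. The synthetic "removal efficiency" data, hidden-layer size, and learning rate below are assumptions for illustration only, not the published bioreactor data or the BIOF1/BIOF2 models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic stand-in for bioreactor data: a smooth removal efficiency
    # (0-1) that falls as inlet concentration and unit flow rise.
    X = rng.uniform(0.0, 1.0, (200, 2))        # [concentration, unit flow]
    y = 1.0 / (1.0 + np.exp(3.0 * (X.sum(axis=1) - 1.0)))
    y = y.reshape(-1, 1)

    # One hidden layer (tanh), linear output, plain full-batch gradient descent
    W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
    W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

    def forward(X):
        H = np.tanh(X @ W1 + b1)
        return H, H @ W2 + b2

    lr = 0.05
    for _ in range(2000):
        H, pred = forward(X)
        err = pred - y                       # dLoss/dpred for 0.5 * MSE
        gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1 - H ** 2)     # backprop through tanh
        gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1

    mse = float(np.mean((forward(X)[1] - y) ** 2))
    ```

    After training, the fit error should be well below the variance of the target; the sensitivity analysis mentioned in the abstract would then perturb each input column in turn and measure the change in the network's output.
    
    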

  4. Continuous time Boolean modeling for biological signaling: application of Gillespie algorithm.

    OpenAIRE

    Stoll, Gautier; Viara, Eric; Barillot, Emmanuel; Calzone, Laurence

    2012-01-01

    Abstract Mathematical modeling is used as a Systems Biology tool to answer biological questions, and more precisely, to validate a network that describes biological observations and predict the effect of perturbations. This article presents an algorithm for modeling biological networks in a discrete framework with continuous time. Background There exist two major types of mathematical modeling approaches: (1) quantitative modeling, representing various chemical species concentrations by real...

  5. Fluctuating Nonlinear Spring Model of Mechanical Deformation of Biological Particles.

    Directory of Open Access Journals (Sweden)

    Olga Kononova

    2016-01-01

    Full Text Available The mechanical properties of virus capsids correlate with local conformational dynamics in the capsid structure. They also reflect the required stability needed to withstand high internal pressures generated upon genome loading and contribute to the success of important events in viral infectivity, such as capsid maturation, genome uncoating and receptor binding. The mechanical properties of biological nanoparticles are often determined by monitoring their dynamic deformations in Atomic Force Microscopy nanoindentation experiments, but a comprehensive theory describing the full range of observed deformation behaviors has not previously been described. We present a new theory for modeling dynamic deformations of biological nanoparticles, which considers the non-linear Hertzian deformation resulting from an indenter-particle physical contact and the bending of curved elements (beams) modeling the particle structure. The beams' deformation beyond the critical point triggers a dynamic transition of the particle to the collapsed state. This extreme event is accompanied by a catastrophic force drop, as observed in the experimental or simulated force (F)-deformation (X) spectra. The theory interprets fine features of the spectra, including the nonlinear components of the FX-curves, in terms of the Young's moduli for Hertzian and bending deformations, and the structural damage dependent beams' survival probability, in terms of the maximum strength and the cooperativity parameter. The theory is exemplified by successfully describing the deformation dynamics of natural nanoparticles through comparing theoretical curves with experimental force-deformation spectra for several virus particles. This approach provides a comprehensive description of the dynamic structural transitions in biological and artificial nanoparticles, which is essential for their optimal use in nanotechnology and nanomedicine applications.

  6. Precise generation of systems biology models from KEGG pathways.

    Science.gov (United States)

    Wrzodek, Clemens; Büchel, Finja; Ruff, Manuel; Dräger, Andreas; Zell, Andreas

    2013-02-21

    The KEGG PATHWAY database provides a plethora of pathways for a diversity of organisms. All pathway components are directly linked to other KEGG databases, such as KEGG COMPOUND or KEGG REACTION. Therefore, the pathways can be extended with an enormous amount of information and provide a foundation for initial structural modeling approaches. As a drawback, KGML-formatted KEGG pathways are primarily designed for visualization purposes and often omit important details for the sake of a clear arrangement of their entries. Thus, a direct conversion into systems biology models would produce incomplete and erroneous models. Here, we present a precise method for processing and converting KEGG pathways into initial metabolic and signaling models encoded in the standardized community pathway formats SBML (Levels 2 and 3) and BioPAX (Levels 2 and 3). This method involves correcting invalid or incomplete KGML content, creating complete and valid stoichiometric reactions, translating relations to signaling models and augmenting the pathway content with various information, such as cross-references to Entrez Gene, OMIM, UniProt, ChEBI, and many more. Finally, we compare several existing conversion tools for KEGG pathways and show that the conversion from KEGG to BioPAX does not involve a loss of information, whilst lossless translations to SBML can only be performed using SBML Level 3, including its recently proposed qualitative models and groups extension packages. Building correct BioPAX and SBML signaling models from the KEGG database is a unique characteristic of the proposed method. Further, there is no other approach that is able to appropriately construct metabolic models from KEGG pathways, including correct reactions with stoichiometry. The resulting initial models, which contain valid and comprehensive SBML or BioPAX code and a multitude of cross-references, lay the foundation to facilitate further modeling steps.

  7. Automated parameter estimation for biological models using Bayesian statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Langmead, Christopher J; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram; Jha, Sumit K

    2015-01-01

    Probabilistic models have gained widespread acceptance in the systems biology community as a useful way to represent complex biological systems. Such models are developed using existing knowledge of the structure and dynamics of the system, experimental observations, and inferences drawn from statistical analysis of empirical data. A key bottleneck in building such models is that some system variables cannot be measured experimentally. These variables are incorporated into the model as numerical parameters. Determining values of these parameters that justify existing experiments and provide reliable predictions when model simulations are performed is a key research problem. Using an agent-based model of the dynamics of acute inflammation, we demonstrate a novel parameter estimation algorithm by discovering the amount and schedule of doses of bacterial lipopolysaccharide that guarantee a set of observed clinical outcomes with high probability. We synthesized values of twenty-eight unknown parameters such that the parameterized model instantiated with these parameter values satisfies four specifications describing the dynamic behavior of the model. We have developed a new algorithmic technique for discovering parameters in complex stochastic models of biological systems given behavioral specifications written in a formal mathematical logic. Our algorithm uses Bayesian model checking, sequential hypothesis testing, and stochastic optimization to automatically synthesize parameters of probabilistic biological models.
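    As a rough illustration of the idea of searching parameter space for values under which a stochastic model satisfies a behavioral specification with high probability, here is a toy sketch. The dose-response model, the plain Monte Carlo check (standing in for Bayesian model checking), and the exhaustive sweep (standing in for stochastic optimization over twenty-eight parameters) are all simplified assumptions, not the paper's method:

    ```python
    import random

    # Hypothetical stochastic model: each simulation "responds" with a
    # probability that increases with the dose parameter.
    def simulate(dose, rng):
        p_response = dose / (dose + 2.0)     # saturating dose-response curve
        return rng.random() < p_response

    # Specification: "the response occurs with probability >= 0.9",
    # checked here by a crude Monte Carlo estimate with a fixed seed.
    def satisfies_spec(dose, p_target=0.9, n_runs=2000, seed=1):
        rng = random.Random(seed)
        hits = sum(simulate(dose, rng) for _ in range(n_runs))
        return hits / n_runs >= p_target

    # Parameter synthesis: find the smallest candidate dose that satisfies
    # the specification.
    def smallest_passing_dose(candidates):
        for d in candidates:
            if satisfies_spec(d):
                return d
        return None

    dose = smallest_passing_dose([d / 2 for d in range(1, 101)])
    ```

    The analytic threshold for this toy curve is dose = 18 (where dose/(dose+2) = 0.9), so the sweep settles near that value, up to Monte Carlo error.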

  8. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    Science.gov (United States)

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by the current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW). ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.

  9. Chimeric animal models in human stem cell biology.

    Science.gov (United States)

    Glover, Joel C; Boulland, Jean-Luc; Halasi, Gabor; Kasumacic, Nedim

    2009-01-01

    The clinical use of stem cells for regenerative medicine is critically dependent on preclinical studies in animal models. In this review we examine some of the key issues and challenges in the use of animal models to study human stem cell biology-experimental standardization, body size, immunological barriers, cell survival factors, fusion of host and donor cells, and in vivo imaging and tracking. We focus particular attention on the various imaging modalities that can be used to track cells in living animals, comparing their strengths and weaknesses and describing technical developments that are likely to lead to new opportunities for the dynamic assessment of stem cell behavior in vivo. We then provide an overview of some of the most commonly used animal models, their advantages and disadvantages, and examples of their use for xenotypic transplantation of human stem cells, with separate reviews of models involving rodents, ungulates, nonhuman primates, and the chicken embryo. As the use of human somatic, embryonic, and induced pluripotent stem cells increases, so too will the range of applications for these animal models. It is likely that increasingly sophisticated uses of human/animal chimeric models will be developed through advances in genetic manipulation, cell delivery, and in vivo imaging.

  10. Naumovozyma castellii: an alternative model for budding yeast molecular biology.

    Science.gov (United States)

    Karademir Andersson, Ahu; Cohn, Marita

    2017-03-01

    Naumovozyma castellii (Saccharomyces castellii) is a member of the budding yeast family Saccharomycetaceae. It has been extensively used as a model organism for telomere biology research and has gained increasing interest as a budding yeast model for functional analyses owing to its amenability to genetic modifications. Owing to the suitable phylogenetic distance to S. cerevisiae, the whole genome sequence of N. castellii has provided unique data for comparative genomic studies, and it played a key role in the establishment of the timing of the whole genome duplication and the evolutionary events that took place in the subsequent genomic evolution of the Saccharomyces lineage. Here we summarize the historical background of its establishment as a laboratory yeast species, and the development of genetic and molecular tools and strains. We review the research performed on N. castellii, focusing on areas where it has significantly contributed to the discovery of new features of molecular biology and to the advancement of our understanding of molecular evolution. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Mass balances for a biological life support system simulation model

    Science.gov (United States)

    Volk, Tyler; Rummel, John D.

    1987-01-01

    Design decisions to aid the development of future space based biological life support systems (BLSS) can be made with simulation models. The biochemistry stoichiometry was developed for: (1) protein, carbohydrate, fat, fiber, and lignin production in the edible and inedible parts of plants; (2) food consumption and production of organic solids in urine, feces, and wash water by the humans; and (3) operation of the waste processor. Flux values for all components are derived for a steady state system with wheat as the sole food source. The large-scale dynamics of a materially closed BLSS computer model are described in a companion paper. An extension of this methodology can explore multifood systems and more complex biochemical dynamics while maintaining whole system closure as a focus.

  12. Introduction to mathematical biology modeling, analysis, and simulations

    CERN Document Server

    Chou, Ching Shan

    2016-01-01

    This book is based on a one semester course that the authors have been teaching for several years, and includes two sets of case studies. The first includes chemostat models, predator-prey interaction, competition among species, the spread of infectious diseases, and oscillations arising from bifurcations. In developing these topics, readers will also be introduced to the basic theory of ordinary differential equations, and how to work with MATLAB without having any prior programming experience. The second set of case studies was adapted from recent and current research papers to the level of the students. Topics have been selected based on public health interest. This includes the risk of atherosclerosis associated with high cholesterol levels, cancer and immune interactions, cancer therapy, and tuberculosis. Readers will experience how mathematical models and their numerical simulations can provide explanations that guide biological and biomedical research. Considered to be the undergraduate companion to t...

  13. Thermodynamic modeling of transcription: sensitivity analysis differentiates biological mechanism from mathematical model-induced effects.

    Science.gov (United States)

    Dresch, Jacqueline M; Liu, Xiaozhou; Arnosti, David N; Ay, Ahmet

    2010-10-24

    Quantitative models of gene expression generate parameter values that can shed light on biological features such as transcription factor activity, cooperativity, and local effects of repressors. An important element in such investigations is sensitivity analysis, which determines how strongly a model's output reacts to variations in parameter values. Parameters of low sensitivity may not be accurately estimated, leading to unwarranted conclusions. Low sensitivity may reflect the nature of the biological data, or it may be a result of the model structure. Here, we focus on the analysis of thermodynamic models, which have been used extensively to analyze gene transcription. Extracted parameter values have been interpreted biologically, but until now little attention has been given to parameter sensitivity in this context. We apply local and global sensitivity analyses to two recent transcriptional models to determine the sensitivity of individual parameters. We show that in one case, values for repressor efficiencies are very sensitive, while values for protein cooperativities are not, and provide insights on why these differential sensitivities stem from both biological effects and the structure of the applied models. In a second case, we demonstrate that parameters that were thought to prove the system's dependence on activator-activator cooperativity are relatively insensitive. We show that there are numerous parameter sets that do not satisfy the relationships proffered as the optimal solutions, indicating that structural differences between the two types of transcriptional enhancers analyzed may not be as simple as altered activator cooperativity. Our results emphasize the need for sensitivity analysis to examine model construction and forms of biological data used for modeling transcriptional processes, in order to determine the significance of estimated parameter values for thermodynamic models. Knowledge of parameter sensitivities can provide the necessary

  14. A Color-Opponency Based Biological Model for Color Constancy

    Directory of Open Access Journals (Sweden)

    Yongjie Li

    2011-05-01

    Full Text Available Color constancy is the ability of the human visual system to adaptively correct color-biased scenes under different illuminants. Most existing color constancy models are not physiologically plausible. Among the limited biological models, the great majority is Retinex and its variations, and only two or three models directly simulate the feature of color-opponency, but only of the very earliest stages of the visual pathway, i.e., the single-opponent mechanisms involved at the levels of retinal ganglion cells and lateral geniculate nucleus (LGN) neurons. Considering the extensive physiological evidence supporting that both the single-opponent cells in retina and LGN and the double-opponent neurons in primary visual cortex (V1) are the building blocks for color constancy, in this study we construct a color-opponency based color constancy model by simulating the opponent fashions of both the single-opponent and double-opponent cells in a forward manner. As for the spatial structure of the receptive fields (RF), both the classical RF (CRF) center and the nonclassical RF (nCRF) surround are taken into account for all the cells. The proposed model was tested on several typical image databases commonly used for performance evaluation of color constancy methods, and exciting results were achieved.

  15. Models for integrated pest control and their biological implications.

    Science.gov (United States)

    Tang, Sanyi; Cheke, Robert A

    2008-09-01

    Successful integrated pest management (IPM) control programmes depend on many factors which include host-parasitoid ratios, starting densities, timings of parasitoid releases, dosages and timings of insecticide applications and levels of host-feeding and parasitism. Mathematical models can help us to clarify and predict the effects of such factors on the stability of host-parasitoid systems, which we illustrate here by extending the classical continuous and discrete host-parasitoid models to include an IPM control programme. The results indicate that one of three control methods can maintain the host level below the economic threshold (ET) in relation to different ET levels, initial densities of host and parasitoid populations and host-parasitoid ratios. The effects of host intrinsic growth rate and parasitoid searching efficiency on host mean outbreak period can be calculated numerically from the models presented. The instantaneous pest killing rate of an insecticide application is also estimated from the models. The results imply that the modelling methods described can help in the design of appropriate control strategies and assist management decision-making. The results also indicate that a high initial density of parasitoids (such as in inundative releases) and high parasitoid inter-generational survival rates will lead to more frequent host outbreaks and, therefore, greater economic damage. The biological implications of this counterintuitive result are discussed.
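    A minimal sketch of the kind of model described above: a discrete host-parasitoid system (a Nicholson-Bailey model with Ricker self-limitation, chosen here for illustration) with an IPM rule that sprays insecticide, killing a fixed fraction of hosts, whenever hosts exceed the economic threshold (ET). All equations and parameter values are assumptions, not the paper's:

    ```python
    import math

    def simulate(r=1.5, K=20.0, a=0.1, c=1.0, ET=None, kill=0.6, steps=60):
        """Host density H and parasitoid density P over discrete generations."""
        H, P = 5.0, 2.0
        hosts = []
        for _ in range(steps):
            if ET is not None and H > ET:
                H *= (1.0 - kill)            # insecticide pulse (IPM action)
            hosts.append(H)
            # Nicholson-Bailey update with Ricker self-limitation on the host.
            H_next = H * math.exp(r * (1 - H / K)) * math.exp(-a * P)
            P_next = c * H * (1 - math.exp(-a * P))
            H, P = H_next, P_next
        return hosts

    free = simulate(ET=None)    # uncontrolled dynamics
    ipm = simulate(ET=8.0)      # spray whenever hosts exceed ET = 8
    ```

    Comparing the two trajectories shows the threshold rule clipping the host outbreaks that the uncontrolled model produces; varying `kill`, `ET`, or the release density of parasitoids mimics the scenario analyses in the paper.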

  16. Computational brain models: Advances from system biology and future challenges

    Directory of Open Access Journals (Sweden)

    George E. Barreto

    2015-02-01

    Full Text Available Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large source of experimental data provided by the -omics techniques and the advance/application of computational and data-management tools have been fundamental, for instance, in the understanding of the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1) and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, but most of them are controversial and are still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in the neuron-astrocyte communication processes has been compiled from recent experimental reports and reviews. Furthermore, target problems, limitations and main conclusions have been identified from metabolic models of the brain reported from 2010. Finally, key aspects to take into account in the development of a computational model of the brain and topics that could be approached from a systems biology perspective in future research are highlighted.

  17. High school and college biology: A multi-level model of the effects of high school biology courses on student academic performance in introductory college biology courses

    Science.gov (United States)

    Loehr, John Francis

    The issue of student preparation for college study in science has been an ongoing concern for both college-bound students and educators of various levels. This study uses a national sample of college students enrolled in introductory biology courses to address the relationship between high school biology preparation and subsequent introductory college biology performance. Multi-Level Modeling was used to investigate the relationship between students' high school science and mathematics experiences and college biology performance. This analysis controls for student demographic and educational background factors along with factors associated with the college or university attended. The results indicated that high school course-taking and science instructional experiences have the largest impact on student achievement in the first introductory college biology course. In particular, enrollment in courses, such as high school Calculus and Advanced Placement (AP) Biology, along with biology course content that focuses on developing a deep understanding of the topics is found to be positively associated with student achievement in introductory college biology. On the other hand, experiencing high numbers of laboratory activities, demonstrations, and independent projects along with higher levels of laboratory freedom are associated with negative achievement. These findings are relevant to high school biology teachers, college students, their parents, and educators looking beyond the goal of high school graduation.

  18. Respectful Modeling: Addressing Uncertainty in Dynamic System Models for Molecular Biology.

    Science.gov (United States)

    Tsigkinopoulou, Areti; Baker, Syed Murtuza; Breitling, Rainer

    2017-06-01

    Although there is still some skepticism in the biological community regarding the value and significance of quantitative computational modeling, important steps are continually being taken to enhance its accessibility and predictive power. We view these developments as essential components of an emerging 'respectful modeling' framework which has two key aims: (i) respecting the models themselves and facilitating the reproduction and update of modeling results by other scientists, and (ii) respecting the predictions of the models and rigorously quantifying the confidence associated with the modeling results. This respectful attitude will guide the design of higher-quality models and facilitate the use of models in modern applications such as engineering and manipulating microbial metabolism by synthetic biology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Realistic rhetoric and legal decision

    Directory of Open Access Journals (Sweden)

    João Maurício Adeodato

    2017-06-01

    Full Text Available The text aims to lay the foundations of a realistic rhetoric, from the descriptive perspective of how the legal decision actually takes place, without normative considerations. Aristotle's rhetorical idealism and its later prestige reduced rhetoric to the art of persuasion, eliminating important elements of sophistry, especially with regard to legal decision. It concludes with a rhetorical perspective of judicial activism in complex societies.

  20. Realist cinema as world cinema

    OpenAIRE

    Nagib, Lucia

    2017-01-01

    The idea that “realism” is the common denominator across the vast range of productions normally labelled as “world cinema” is widespread and seemly uncontroversial. Leaving aside oppositional binaries that define world cinema as the other of Hollywood or of classical cinema, this chapter will test the realist premise by locating it in the mode of production. It will define this mode as an ethics that engages filmmakers, at cinema’s creative peaks, with the physical and historical environment,...

  1. Modeling biological tissue growth: discrete to continuum representations.

    Science.gov (United States)

    Hywood, Jack D; Hackett-Jones, Emily J; Landman, Kerry A

    2013-09-01

    There is much interest in building deterministic continuum models from discrete agent-based models governed by local stochastic rules where an agent represents a biological cell. In developmental biology, cells are able to move and undergo cell division on and within growing tissues. A growing tissue is itself made up of cells which undergo cell division, thereby providing a significant transport mechanism for other cells within it. We develop a discrete agent-based model where domain agents represent tissue cells. Each agent has the ability to undergo a proliferation event whereby an additional domain agent is incorporated into the lattice. If a probability distribution describes the waiting times between proliferation events for an individual agent, then the total length of the domain is a random variable. The average behavior of these stochastically proliferating agents defining the growing lattice is determined in terms of a Fokker-Planck equation, with an advection and diffusion term. The diffusion term differs from the one obtained by Landman and Binder [J. Theor. Biol. 259, 541 (2009)] when the rate of growth of the domain is specified, but the choice of agents is random. This discrepancy is reconciled by determining a discrete-time master equation for this process and an associated asymmetric nonexclusion random walk, together with consideration of synchronous and asynchronous updating schemes. All theoretical results are confirmed with numerical simulations. This study furthers our understanding of the relationship between agent-based rules, their implementation, and their associated partial differential equations. Since tissue growth is a significant cellular transport mechanism during embryonic growth, it is important to use the correct partial differential equation description when combining with other cellular functions.
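    With exponential waiting times between proliferation events, the total domain length is a pure-birth (Yule) process whose mean grows as E[L(t)] = L0·exp(lam·t), which a short simulation can verify. The parameter values below are illustrative, not taken from the paper:

    ```python
    import math
    import random

    def grow(L0, lam, t_max, rng):
        """One realization of the growing lattice: each of the L agents waits an
        Exp(lam) time before proliferating, so the next event anywhere on the
        domain occurs at rate lam * L."""
        L, t = L0, 0.0
        while True:
            t += rng.expovariate(lam * L)
            if t > t_max:
                return L
            L += 1          # one new domain agent is incorporated into the lattice

    rng = random.Random(42)
    L0, lam, t_max, reps = 10, 1.0, 1.0, 2000
    mean_L = sum(grow(L0, lam, t_max, rng) for _ in range(reps)) / reps
    expected = L0 * math.exp(lam * t_max)   # analytic mean, 10 * e
    ```

    Averaging over many realizations, the simulated mean length matches the exponential prediction; tracking a marked agent's position in the same simulation is what leads to the advection-diffusion (Fokker-Planck) description discussed in the abstract.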

  2. Logic-statistic modeling and analysis of biological sequence data

    DEFF Research Database (Denmark)

    Christiansen, Henning

    2007-01-01

    We describe here the intentions and plans of a newly started, funded research project in order to further the dialogue with the international research in the field. The purpose is to obtain experiences for realistic applications of flexible and powerful modeling tools that integrate logic and sta...

  3. Models to Study NK Cell Biology and Possible Clinical Application.

    Science.gov (United States)

    Zamora, Anthony E; Grossenbacher, Steven K; Aguilar, Ethan G; Murphy, William J

    2015-08-03

    Natural killer (NK) cells are large granular lymphocytes of the innate immune system, responsible for direct targeting and killing of both virally infected and transformed cells. NK cells rapidly recognize and respond to abnormal cells in the absence of prior sensitization due to their wide array of germline-encoded inhibitory and activating receptors, which differs from the receptor diversity found in B and T lymphocytes that is due to the use of recombination-activation gene (RAG) enzymes. Although NK cells have traditionally been described as natural killers that provide a first line of defense prior to the induction of adaptive immunity, a more complex view of NK cells is beginning to emerge, indicating they may also function in various immunoregulatory roles and have the capacity to shape adaptive immune responses. With the growing appreciation for the diverse functions of NK cells, and recent technological advancements that allow for a more in-depth understanding of NK cell biology, we can now begin to explore new ways to manipulate NK cells to increase their clinical utility. In this overview unit, we introduce the reader to various aspects of NK cell biology by reviewing topics ranging from NK cell diversity and function, mouse models, and the roles of NK cells in health and disease, to potential clinical applications. Copyright © 2015 John Wiley & Sons, Inc.

  4. Micrasterias as a model system in plant cell biology

    Directory of Open Access Journals (Sweden)

    Ursula Luetz-Meindl

    2016-07-01

    Full Text Available The unicellular freshwater alga Micrasterias denticulata is an exceptional organism due to its extraordinary star-shaped, highly symmetric morphology and has thus attracted the interest of researchers for many decades. As a member of the Streptophyta, Micrasterias is not only genetically closely related to higher land plants but shares common features with them in many physiological and cell biological aspects. These facts, together with its considerable cell size of about 200 µm, its modest cultivation requirements and its easy accessibility to virtually any microscopic technique, make Micrasterias a very well suited cell biological plant model system. The review focuses particularly on cell wall formation and composition, dictyosomal structure and function, cytoskeleton control of growth and morphogenesis as well as on ionic regulation and signal transduction. It has also been shown in recent years that Micrasterias is a highly sensitive indicator for environmental stress impact such as heavy metals, high salinity, oxidative stress or starvation. Stress-induced organelle degradation, autophagy, adaption and detoxification mechanisms have moved into the center of interest and have been investigated with modern microscopic techniques such as 3-D- and analytical electron microscopy as well as with biochemical, physiological and molecular approaches. This review is intended to summarize and discuss the most important results obtained in Micrasterias in the last 20 years and to compare the results to similar processes in higher plant cells.

  5. Mesoscale modeling: solving complex flows in biology and biotechnology.

    Science.gov (United States)

    Mills, Zachary Grant; Mao, Wenbin; Alexeev, Alexander

    2013-07-01

    Fluids are involved in practically all physiological activities of living organisms. However, biological and biorelated flows are hard to analyze due to the inherent combination of interdependent effects and processes that occur on a multitude of spatial and temporal scales. Recent advances in mesoscale simulations enable researchers to tackle problems that are central for the understanding of such flows. Furthermore, computational modeling effectively facilitates the development of novel therapeutic approaches. Among other methods, dissipative particle dynamics and the lattice Boltzmann method have become increasingly popular during recent years due to their ability to solve a large variety of problems. In this review, we discuss recent applications of these mesoscale methods to several fluid-related problems in medicine, bioengineering, and biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Theories and models on the biology of cells in space

    Science.gov (United States)

    Todd, P.; Klaus, D. M.

    1996-01-01

    A wide variety of observations on cells in space, admittedly made under constraining and unnatural conditions in many cases, have led to experimental results that were surprising or unexpected. Reproducibility, freedom from artifacts, and plausibility must be considered in all cases, even when results are not surprising. The papers in the symposium on 'Theories and Models on the Biology of Cells in Space' are dedicated to the subject of the plausibility of cellular responses to gravity -- inertial accelerations between 0 and 9.8 m/sq s and higher. The mechanical phenomena inside the cell, the gravitactic locomotion of single eukaryotic and prokaryotic cells, and the effects of inertial unloading on cellular physiology are addressed in theoretical and experimental studies.

  7. Progress in realistic LOCA analysis

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Ohkawa, K.

    2004-01-01

    In 1988 the USNRC revised the ECCS rule contained in Appendix K and Section 50.46 of 10 CFR Part 50, which governs the analysis of the Loss Of Coolant Accident (LOCA). The revised regulation allows the use of realistic computer models to analyze the loss of coolant accident. In addition, the new regulation allows the use of high probability estimates of peak cladding temperature (PCT), rather than upper bound estimates. Prior to this modification, the regulations were a prescriptive set of rules which defined what assumptions must be made about the plant initial conditions and how various physical processes should be modeled. The resulting analyses were highly conservative in their prediction of the performance of the ECCS, and placed tight constraints on core power distributions, ECCS set points and functional requirements, and surveillance and testing. These restrictions, if relaxed, will allow for additional economy, flexibility, and in some cases, improved reliability and safety as well. For example, additional economy and operating flexibility can be achieved by implementing several available core and fuel rod designs to increase fuel discharge burnup and reduce neutron flux on the reactor vessel. The benefits of application of best estimate methods to LOCA analyses have typically been associated with reductions in fuel costs, resulting from optimized fuel designs, or increased revenue from power upratings. Fuel cost savings are relatively easy to quantify, and have been estimated at several million dollars per cycle for an individual plant. Best estimate methods are also likely to contribute significantly to reductions in O and M costs, although these reductions are more difficult to quantify. Examples of O and M cost reductions are: 1) Delaying equipment replacement. With best estimate methods, LOCA is no longer a factor in limiting power levels for plants with high tube plugging levels or degraded safety injection systems. If other requirements for

  8. Modelling biological invasions: Individual to population scales at interfaces

    KAUST Repository

    Belmonte-Beitia, J.

    2013-10-01

    Extracting the population level behaviour of biological systems from that of the individual is critical in understanding dynamics across multiple scales and thus has been the subject of numerous investigations. Here, the influence of spatial heterogeneity in such contexts is explored for interfaces with a separation of the length scales characterising the individual and the interface, a situation that can arise in applications involving cellular modelling. As an illustrative example, we consider cell movement between white and grey matter in the brain which may be relevant in considering the invasive dynamics of glioma. We show that while one can safely neglect intrinsic noise, at least when considering glioma cell invasion, profound differences in population behaviours emerge in the presence of interfaces with only subtle alterations in the dynamics at the individual level. Transport driven by local cell sensing generates predictions of cell accumulations along interfaces where cell motility changes. This behaviour is not predicted with the commonly used Fickian diffusion transport model, but can be extracted from preliminary observations of specific cell lines in recent, novel, cryo-imaging. Consequently, these findings suggest a need to consider the impact of individual behaviour, spatial heterogeneity and especially interfaces in experimental and modelling frameworks of cellular dynamics, for instance in the characterisation of glioma cell motility. © 2013 Elsevier L