WorldWideScience

Sample records for biologically realistic model

  1. A Novel CPU/GPU Simulation Environment for Large-Scale Biologically-Realistic Neural Modeling

    Directory of Open Access Journals (Sweden)

    Roger V Hoang

    2013-10-01

Full Text Available Computational Neuroscience is an emerging field that provides unique opportunities to study complex brain structures through realistic neural simulations. However, as biological details are added to models, the execution time for the simulation becomes longer. Graphics Processing Units (GPUs) are now being utilized to accelerate simulations due to their ability to perform computations in parallel. As such, they have shown significant improvement in execution time compared to Central Processing Units (CPUs). Most neural simulators utilize either multiple CPUs or a single GPU for better performance, but still show limitations in execution time when biological details are not sacrificed. Therefore, we present a novel CPU/GPU simulation environment for large-scale biological networks, the NeoCortical Simulator version 6 (NCS6). NCS6 is a free, open-source, parallelizable, and scalable simulator, designed to run on clusters of multiple machines, potentially with high performance computing devices in each of them. It has built-in leaky integrate-and-fire (LIF) and Izhikevich (IZH) neuron models, but users also have the capability to design their own plug-in interface for different neuron types as desired. NCS6 is currently able to simulate one million cells and 100 million synapses in quasi real time by distributing data across these heterogeneous clusters of CPUs and GPUs.
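The leaky integrate-and-fire dynamics that simulators like NCS6 build in can be sketched in a few lines. The following is a minimal Euler-integration example, not NCS6's actual API; all parameter values are illustrative:

```python
def simulate_lif(i_ext=1.5, v_rest=0.0, v_thresh=1.0, v_reset=0.0,
                 tau_m=10.0, dt=0.1, t_max=100.0):
    """Forward-Euler integration of a leaky integrate-and-fire neuron.

    Membrane equation: dV/dt = (-(V - v_rest) + i_ext) / tau_m.
    A spike is recorded whenever V crosses v_thresh, after which V is
    reset to v_reset. Units and values are illustrative only.
    """
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        v += dt * (-(v - v_rest) + i_ext) / tau_m
        if v >= v_thresh:
            spikes.append(step * dt)  # record spike time
            v = v_reset
    return spikes

spikes = simulate_lif()
```

With a suprathreshold constant input the neuron fires periodically; a subthreshold input (here, any `i_ext` below `v_thresh - v_rest`) never produces a spike, since the membrane potential saturates below threshold.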

  2. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    International Nuclear Information System (INIS)

    Aristovich, K Y; Khan, S H

    2010-01-01

Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data, and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data and the material properties from Diffusion Tensor MRI (DTMRI). The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used with a wide range of analysis methods, such as the finite element method (FEM), the boundary element method (BEM), Monte Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.
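The simplest building block of the MEG forward problem is the magnetic field of a single current dipole. In the idealized case of an unbounded homogeneous conductor it reduces to a Biot-Savart-like expression; the FE models described above add realistic head geometry and conductivities on top of this. A minimal sketch, with illustrative values:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (T*m/A)

def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dipole_field(q, r_q, r):
    """Magnetic field of a current dipole q (A*m) located at r_q,
    evaluated at point r, for an unbounded homogeneous conductor:
    B = mu0/(4*pi) * q x (r - r_q) / |r - r_q|**3.
    This is only the simplest forward model, not the paper's FE solution."""
    d = tuple(ri - qi for ri, qi in zip(r, r_q))
    dist = math.sqrt(sum(c * c for c in d))
    scale = MU0 / (4 * math.pi * dist ** 3)
    return tuple(scale * c for c in cross(q, d))
```

For example, a unit dipole along x at the origin produces a purely z-directed field at a point on the y axis, illustrating the right-hand-rule geometry that MEG sensor arrays sample.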

  3. Larval dispersal modeling of pearl oyster Pinctada margaritifera following realistic environmental and biological forcing in Ahe atoll lagoon.

    Directory of Open Access Journals (Sweden)

    Yoann Thomas

Full Text Available Studying the larval dispersal of bottom-dwelling species is necessary to understand their population dynamics and optimize their management. The black-lip pearl oyster (Pinctada margaritifera) is cultured extensively to produce black pearls, especially in French Polynesia's atoll lagoons. This aquaculture relies on spat collection, a process that can be optimized by understanding which factors influence larval dispersal. Here, we investigate the sensitivity of the P. margaritifera larval dispersal kernel to both physical and biological factors in the lagoon of Ahe atoll. Specifically, using a validated 3D larval dispersal model, the variability of lagoon-scale connectivity is investigated against wind forcing, depth and location of larval release, destination location, vertical swimming behavior and pelagic larval duration (PLD) factors. The potential connectivity was spatially weighted according to both the natural and cultivated broodstock densities to provide a realistic view of connectivity. We found that the mean pattern of potential connectivity was driven by the southwest and northeast main barotropic circulation structures, with high retention levels in both. Destination locations, spawning sites and PLD were the main drivers of potential connectivity, explaining respectively 26%, 59% and 5% of the variance. Differences between potential and realistic connectivity showed the significant contribution of the pearl oyster broodstock location to its own dynamics. Realistic connectivity showed larger larval supply in the western destination locations, which are preferentially used by farmers for spat collection. In addition, larval supply in the same sectors was enhanced during summer wind conditions. These results provide new cues to understanding the dynamics of bottom-dwelling populations in atoll lagoons, and show how to take advantage of numerical models for pearl oyster management.

  4. Development of a realistic human airway model.

    Science.gov (United States)

    Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav

    2012-03-01

Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper airway geometry, lacking only an oral cavity, was created and proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries: two optically transparent models, one with and one without the oral cavity, for flow velocity measurements; two realistic segmented models, one with and one without the oral cavity, for particle deposition measurements; and a semi-realistic model with glass cylindrical airways for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model, and good agreement was obtained.

  5. Realistic Material Appearance Modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Filip, Jiří; Hatka, Martin

    2010-01-01

Roč. 2010, č. 81 (2010), s. 13-14 ISSN 0926-4981 R&D Projects: GA ČR GA102/08/0593 Institutional research plan: CEZ:AV0Z10750506 Keywords: bidirectional texture function * texture modelling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2010/RO/haindl-realistic material appearance modelling.pdf

  6. 'Semi-realistic' F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms, and stabilization of moduli. We review the particle physics models, their requirements, and tools and methods for building inflation models.

  7. Comparing Realistic Subthalamic Nucleus Neuron Models

    Science.gov (United States)

    Njap, Felix; Claussen, Jens C.; Moser, Andreas; Hofmann, Ulrich G.

    2011-06-01

The mechanism of action of clinically effective electrical high-frequency stimulation is still under debate. However, recent evidence points at the specific activation of GABA-ergic ion channels. Using a computational approach, we analyze temporal properties of the spike trains emitted by biologically realistic neurons of the subthalamic nucleus (STN) as a function of GABA-ergic synaptic input conductances. Our contribution is based on a model proposed by Rubin and Terman and exhibits a wide variety of firing patterns: silent, low-spiking, moderate-spiking, and intense-spiking activity. We observed that most of the cells in our network turn to silent mode when we increase the GABAA input conductance above a threshold of 3.75 mS/cm². On the other hand, insignificant changes in firing activity are observed when the input conductance is low or close to zero. We thus reproduce Rubin's model with vanishing synaptic conductances. To quantitatively compare spike trains from the original model with the modified model at different conductance levels, we apply four different (dis)similarity measures between them. We observe that the Mahalanobis distance, the Victor-Purpura metric, and the interspike interval distribution are sensitive to different firing regimes, whereas mutual information seems undiscriminative for these functional changes.
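Of the (dis)similarity measures mentioned, the Victor-Purpura metric is the easiest to state compactly: it is an edit distance over spike times, where inserting or deleting a spike costs 1 and shifting a spike by dt costs q*|dt|. The dynamic-programming sketch below is an illustrative implementation, not the authors' code:

```python
def victor_purpura(t1, t2, q):
    """Victor-Purpura spike-train distance.

    Minimal total cost of transforming spike train t1 into t2, with
    insert/delete cost 1 and shift cost q per unit time. Computed by
    dynamic programming over spike indices (both trains assumed sorted).
    """
    n, m = len(t1), len(t2)
    # d[i][j]: distance between the first i spikes of t1 and first j of t2
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = float(i)   # delete all i spikes
    for j in range(1, m + 1):
        d[0][j] = float(j)   # insert all j spikes
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + 1,                      # delete
                          d[i][j - 1] + 1,                      # insert
                          d[i - 1][j - 1]
                          + q * abs(t1[i - 1] - t2[j - 1]))     # shift
    return d[n][m]
```

The cost parameter q sets the timescale of the comparison: for small q the metric counts spikes, while for large q it behaves like a coincidence detector, since deleting and re-inserting a spike (cost 2) becomes cheaper than shifting it.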

  8. A Model of Biological Attacks on a Realistic Population

    Science.gov (United States)

    Carley, Kathleen M.; Fridsma, Douglas; Casman, Elizabeth; Altman, Neal; Chen, Li-Chiou; Kaminsky, Boris; Nave, Demian; Yahja, Alex

The capability to assess the impacts of large-scale biological attacks and the efficacy of containment policies is critical and requires knowledge-intensive reasoning about social response and disease transmission within a complex social system. There is a close linkage among social networks, transportation networks, disease spread, and early detection. Spatial dimensions related to public gathering places such as hospitals, nursing homes, and restaurants, can play a major role in epidemics [Klovdahl et al. 2001]. Like natural epidemics, bioterrorist attacks unfold within spatially defined, complex social systems, and the societal and networked response can have profound effects on their outcome. This paper focuses on bioterrorist attacks, but the model has been applied to emergent and familiar diseases as well.

  9. Cellular potts models multiscale extensions and biological applications

    CERN Document Server

    Scianna, Marco

    2013-01-01

    A flexible, cell-level, and lattice-based technique, the cellular Potts model accurately describes the phenomenological mechanisms involved in many biological processes. Cellular Potts Models: Multiscale Extensions and Biological Applications gives an interdisciplinary, accessible treatment of these models, from the original methodologies to the latest developments. The book first explains the biophysical bases, main merits, and limitations of the cellular Potts model. It then proposes several innovative extensions, focusing on ways to integrate and interface the basic cellular Potts model at the mesoscopic scale with approaches that accurately model microscopic dynamics. These extensions are designed to create a nested and hybrid environment, where the evolution of a biological system is realistically driven by the constant interplay and flux of information between the different levels of description. Through several biological examples, the authors demonstrate a qualitative and quantitative agreement with t...
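The core of the cellular Potts model is a Metropolis update on a lattice of cell identifiers: a site attempts to copy a neighbor's id, and the change is accepted based on the resulting change in a Hamiltonian. The sketch below keeps only the adhesion term (the book's extensions add area, perimeter, and chemotaxis terms); everything here is a simplified illustration:

```python
import math
import random

def adhesion_energy(lattice, n):
    """Total boundary energy of an n x n lattice of cell ids: +1 for every
    4-neighborhood pair of sites carrying different ids (non-periodic)."""
    e = 0
    for x in range(n):
        for y in range(n):
            if x + 1 < n and lattice[x][y] != lattice[x + 1][y]:
                e += 1
            if y + 1 < n and lattice[x][y] != lattice[x][y + 1]:
                e += 1
    return e

def cpm_sweep(lattice, n, temperature, rng):
    """One Monte Carlo sweep of a minimal cellular Potts model:
    n*n attempts to copy a random neighbor's id into a random site,
    accepted with the Metropolis rule exp(-dE/T) for uphill moves."""
    for _ in range(n * n):
        x, y = rng.randrange(n), rng.randrange(n)
        nbrs = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= x + dx < n and 0 <= y + dy < n]
        sx, sy = rng.choice(nbrs)
        if lattice[sx][sy] == lattice[x][y]:
            continue  # copying an identical id changes nothing
        old = lattice[x][y]
        e0 = adhesion_energy(lattice, n)
        lattice[x][y] = lattice[sx][sy]
        de = adhesion_energy(lattice, n) - e0
        if de > 0 and rng.random() >= math.exp(-de / temperature):
            lattice[x][y] = old  # reject the uphill move
```

Recomputing the full energy per attempt is O(n^2) and only acceptable for a toy grid; production CPM codes compute the local energy difference instead.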

  10. Realistic molecular model of kerogen's nanostructure.

    Science.gov (United States)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit

    2016-05-01

Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp²/sp³ hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  11. Neutron star models with realistic high-density equations of state

    International Nuclear Information System (INIS)

    Malone, R.C.; Johnson, M.B.; Bethe, H.A.

    1975-01-01

We calculate neutron star models using four realistic high-density models of the equation of state. We conclude that the maximum mass of a neutron star is unlikely to exceed 2 M☉. All of the realistic models are consistent with current estimates of the moment of inertia of the Crab pulsar.
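The structure calculation behind such models integrates the Tolman-Oppenheimer-Volkoff (TOV) equations outward from a chosen central density until the pressure vanishes. The sketch below uses a simple polytropic equation of state as a stand-in for the realistic ones in the paper; parameters and units (geometric, G = c = 1) are illustrative:

```python
import math

def tov_star(rho_c, k=100.0, gamma=2.0, dr=1e-3):
    """Integrate the TOV equations for a polytrope P = k * rho**gamma.

    Forward-Euler from a small starting radius outward until the pressure
    drops to a tiny fraction of its central value. Returns (radius, mass)
    in geometric units. A crude sketch: real codes use higher-order
    integrators and tabulated equations of state.
    """
    r = dr
    p = k * rho_c ** gamma
    p_surface = 1e-12 * p
    m = (4.0 / 3.0) * math.pi * r ** 3 * rho_c
    while p > p_surface:
        rho = (p / k) ** (1.0 / gamma)
        # TOV equation: relativistic correction factors multiply the
        # Newtonian hydrostatic balance.
        dpdr = -(rho + p) * (m + 4.0 * math.pi * r ** 3 * p) / (r * (r - 2.0 * m))
        p += dr * dpdr
        m += dr * 4.0 * math.pi * r ** 2 * rho
        r += dr
    return r, m
```

For k = 100, gamma = 2 and a central density near 1.28e-3 (a common test case in these units), the integration yields a star of roughly 1.4 in mass units with radius near 9.6, comfortably outside its Schwarzschild radius.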

  12. Interferometric data modelling: issues in realistic data generation

    International Nuclear Information System (INIS)

    Mukherjee, Soma

    2004-01-01

This study describes algorithms developed for modelling interferometric noise in a realistic manner, i.e. incorporating the non-stationarity that can be seen in the data from the present generation of interferometers. The noise model is based on individual component models (ICMs) using auto-regressive moving average (ARMA) models. The data obtained from the model are validated by standard statistical tests, e.g. the Kolmogorov-Smirnov test and the Akaike minimum criterion, and the results indicate a very good fit. The advantage of using ARMA for ICMs is that the model parameters can be controlled, and hence injection and efficiency studies can be conducted in a more controlled environment. This realistic non-stationary noise generator is intended to be integrated within the data monitoring tool framework.
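An ARMA-based component model can be written directly from its defining recursion. The generator below is an illustrative stand-in for the per-component (ICM) models described above, not the authors' implementation:

```python
import random

def arma_noise(n, ar, ma, sigma=1.0, seed=0):
    """Generate n samples of an ARMA(p, q) process:

        x[t] = sum_i ar[i] * x[t-1-i] + e[t] + sum_j ma[j] * e[t-1-j],

    with e[t] ~ N(0, sigma^2). Seeding makes the stream reproducible,
    which is what allows controlled injection/efficiency studies.
    """
    rng = random.Random(seed)
    p, q = len(ar), len(ma)
    x, e = [], []
    for t in range(n):
        et = rng.gauss(0.0, sigma)
        xt = et
        xt += sum(ar[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        xt += sum(ma[j] * e[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        x.append(xt)
        e.append(et)
    return x
```

For example, `arma_noise(1000, ar=[0.7], ma=[0.2])` produces a strongly positively autocorrelated stream; non-stationarity can then be introduced by varying the coefficients over time.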

  13. A Fibrocontractive Mechanochemical Model of Dermal Wound Closure Incorporating Realistic Growth Factor Kinetics

    KAUST Repository

Murphy, Kelly E.; Hall, Cameron L.; Maini, Philip K.; McCue, Scott W.; McElwain, D. L. Sean

    2012-01-13

    Fibroblasts and their activated phenotype, myofibroblasts, are the primary cell types involved in the contraction associated with dermal wound healing. Recent experimental evidence indicates that the transformation from fibroblasts to myofibroblasts involves two distinct processes: The cells are stimulated to change phenotype by the combined actions of transforming growth factor β (TGFβ) and mechanical tension. This observation indicates a need for a detailed exploration of the effect of the strong interactions between the mechanical changes and growth factors in dermal wound healing. We review the experimental findings in detail and develop a model of dermal wound healing that incorporates these phenomena. Our model includes the interactions between TGFβ and collagenase, providing a more biologically realistic form for the growth factor kinetics than those included in previous mechanochemical descriptions. A comparison is made between the model predictions and experimental data on human dermal wound healing and all the essential features are well matched. © 2012 Society for Mathematical Biology.

  15. On the Realistic Stochastic Model of GPS Observables: Implementation and Performance

    Science.gov (United States)

    Zangeneh-Nejad, F.; Amiri-Simkooei, A. R.; Sharifi, M. A.; Asgari, J.

    2015-12-01

High-precision GPS positioning requires a realistic stochastic model of the observables. A realistic GPS stochastic model should take into account different variances for different observation types, correlations among different observables, the satellite-elevation dependence of the observables' precision, and the temporal correlation of the observables. Least-squares variance component estimation (LS-VCE) is applied to GPS observables using the geometry-based observation model (GBOM). To model the satellite-elevation dependence of the GPS observables' precision, an exponential model depending on the elevation angles of the satellites is also employed. Temporal correlation of the GPS observables is modelled using a first-order autoregressive noise model. An important step in high-precision GPS positioning is double-difference integer ambiguity resolution (IAR). The fraction of successful fixes among a number of integer ambiguity resolutions is called the success rate. A realistic estimate of the GNSS observables' covariance matrix plays an important role in IAR. We consider the ambiguity resolution success rate for two cases, namely a nominal and a realistic stochastic model of the GPS observables, using two GPS data sets collected by a Trimble R8 receiver. The results confirm that applying a more realistic stochastic model can significantly improve the IAR success rate on individual frequencies, either L1 or L2; an improvement of 20% in the empirical success rate was achieved. The results also indicate that introducing the realistic stochastic model leads to a larger standard deviation for the baseline components, by a factor of about 2.6, on the data sets considered.
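The elevation-dependent part of such a stochastic model is often written as sigma(theta) = a + b * exp(-theta/theta0), with the coefficients estimated by LS-VCE. The sketch below shows how that model translates into a diagonal weight matrix; the coefficient values are illustrative, not the paper's estimates, and cross- and temporal correlations are ignored:

```python
import math

def elevation_variance(elev_deg, a=0.003, b=0.003, theta0=20.0):
    """Variance of an observable at elevation angle elev_deg (degrees),
    from the exponential precision model sigma = a + b*exp(-theta/theta0).
    Coefficients are illustrative; LS-VCE would estimate them from data."""
    sigma = a + b * math.exp(-elev_deg / theta0)
    return sigma ** 2

def weight_matrix(elevations):
    """Diagonal weight matrix (inverse variances) for one epoch of
    observations to the given satellite elevations. A realistic model
    would add off-diagonal terms for cross-observable correlation."""
    k = len(elevations)
    return [[(1.0 / elevation_variance(elevations[i]) if i == j else 0.0)
             for j in range(k)]
            for i in range(k)]
```

Low-elevation satellites receive smaller weights, which is precisely the effect that makes the realistic model differ from a nominal identity-weighted one in the ambiguity resolution step.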

  16. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore there is ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools for generating TMS models, owing to the difficulty of realistically modeling the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head, with isotropic and anisotropic electrical conductivities in five different tissues of the head, and of the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  17. Building Realistic Mobility Models for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Adrian Pullin

    2018-04-01

Full Text Available A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node can act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs, with the aim of producing more realistic simulation models and, thereby, more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied.
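The random waypoint model that the paper argues against is simple enough to state in a few lines: each node repeatedly picks a uniform destination and speed, moves there in a straight line, and repeats. The sketch below generates a trace for one node; world size and speed range are illustrative, and the optional pause time is omitted for brevity:

```python
import random

def random_waypoint(n_steps, world=(1000.0, 1000.0), speed=(1.0, 10.0),
                    seed=0):
    """Random waypoint (RWP) mobility trace for a single node.

    Each leg: pick a uniform destination in the world rectangle and a
    uniform speed, then advance one speed-unit per step toward it.
    Returns a list of (x, y) positions of length n_steps + 1.
    """
    rng = random.Random(seed)
    x, y = rng.uniform(0.0, world[0]), rng.uniform(0.0, world[1])
    dest = (rng.uniform(0.0, world[0]), rng.uniform(0.0, world[1]))
    v = rng.uniform(*speed)
    trace = [(x, y)]
    for _ in range(n_steps):
        dx, dy = dest[0] - x, dest[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= v:
            # Waypoint reached: snap to it and draw a new leg.
            x, y = dest
            dest = (rng.uniform(0.0, world[0]), rng.uniform(0.0, world[1]))
            v = rng.uniform(*speed)
        else:
            x, y = x + v * dx / dist, y + v * dy / dist
        trace.append((x, y))
    return trace
```

A well-known artifact of RWP, and one reason it can misrepresent protocol performance, is that node density concentrates toward the center of the world over time, unlike trace-driven or tactically motivated movement.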

  18. Realistic biological approaches for improving thermoradiotherapy

    DEFF Research Database (Denmark)

    Horsman, Michael R

    2016-01-01

There is now definitive clinical evidence that hyperthermia can successfully improve the response of certain human tumour types to radiation therapy, but there is still a need for improvement. From a biological standpoint this can be achieved by either targeting the cellular or vascular...... or radiation in preclinical models and clear benefits in tumour response observed. But few of these methods have actually been combined with thermoradiotherapy. Furthermore, very few combinations have been tested in relevant normal tissue studies, despite the fact that it is the normal tissue response...... that controls the maximal heat or radiation treatment that can be applied. Here we review the most clinically relevant biological approaches that have been shown to enhance thermoradiotherapy, or have the potential to be applied in this context, and suggest how these should be moved forward into the clinic....

  19. An inexpensive yet realistic model for teaching vasectomy

    Directory of Open Access Journals (Sweden)

    Taylor M. Coe

    2015-04-01

Full Text Available Purpose Teaching the no-scalpel vasectomy is important, since vasectomy is a safe, simple, and cost-effective method of contraception. This minimally invasive vasectomy technique involves delivering the vas through the skin with specialized tools, and is associated with fewer complications than the traditional incisional vasectomy (1). One of the most challenging steps is the delivery of the vas through a small puncture in the scrotal skin, and there is a need for a realistic and inexpensive scrotal model for beginning learners to practice this step. Materials and Methods After careful observation using several scrotal models while teaching residents and senior trainees, we developed a simplified scrotal model that uses only three components: a bicycle inner tube, latex tubing, and a Penrose drain. Results This model is remarkably realistic and allows learners to practice a challenging step in the no-scalpel vasectomy. The low cost and simple construction of the model allow wide dissemination of training in this important technique. Conclusions We propose a simple, inexpensive model that will enable learners to master the hand movements involved in delivering the vas through the skin while mitigating the risks of learning on patients.

  20. Toward the M(F)--Theory Embedding of Realistic Free-Fermion Models

    CERN Document Server

Berglund, Per; Ellis, John; Faraggi, Alon E.; Nanopoulos, Dimitri V.; Qiu, Zongan

    1998-01-01

We construct a Landau-Ginzburg model with the same data and symmetries as a $Z_2\times Z_2$ orbifold that corresponds to a class of realistic free-fermion models. Within the class of interest, we show that this orbifolding connects different $Z_2\times Z_2$ orbifold models and commutes with mirror symmetry. Our work suggests that duality symmetries previously discussed in the context of specific $M$- and $F$-theory compactifications may be extended to the special $Z_2\times Z_2$ orbifold that characterizes realistic free-fermion models.

  1. Realistic edge field model code REFC for designing and study of isochronous cyclotron

    International Nuclear Information System (INIS)

    Ismail, M.

    1989-01-01

The focussing properties and the requirements for isochronism in a cyclotron magnet configuration are well known in the hard-edge field model. The fact that they quite often change considerably in a realistic field can be attributed mainly to the influence of the edge field. A solution to this problem requires a field model which allows a simple construction of the equilibrium orbit and yields simple formulae. This can be achieved by using a fitted realistic edge field (Hudson et al 1975) in the region of the pole edge; such a field model is therefore called a realistic edge field model. A code, REFC, based on the realistic edge field model has been developed to design cyclotron sectors, and the code FIELDER has been used to study the beam properties. In this report the REFC code is described, along with some relevant explanation of the FIELDER code. (author). 11 refs., 6 figs

  2. Gauge coupling unification in realistic free-fermionic string models

    International Nuclear Information System (INIS)

    Dienes, K.R.; Faraggi, A.E.

    1995-01-01

    We discuss the unification of gauge couplings within the framework of a wide class of realistic free-fermionic string models which have appeared in the literature, including the flipped SU(5), SO(6)xSO(4), and various SU(3)xSU(2)xU(1) models. If the matter spectrum below the string scale is that of the Minimal Supersymmetric Standard Model (MSSM), then string unification is in disagreement with experiment. We therefore examine several effects that may modify the minimal string predictions. First, we develop a systematic procedure for evaluating the one-loop heavy string threshold corrections in free-fermionic string models, and we explicitly evaluate these corrections for each of the realistic models. We find that these string threshold corrections are small, and we provide general arguments explaining why such threshold corrections are suppressed in string theory. Thus heavy thresholds cannot resolve the disagreement with experiment. We also study the effect of non-standard hypercharge normalizations, light SUSY thresholds, and intermediate-scale gauge structure, and similarly conclude that these effects cannot resolve the disagreement with low-energy data. Finally, we examine the effects of additional color triplets and electroweak doublets beyond the MSSM. Although not required in ordinary grand unification scenarios, such states generically appear within the context of certain realistic free-fermionic string models. We show that if these states exist at the appropriate thresholds, then the gauge couplings will indeed unify at the string scale. Thus, within these string models, string unification can be in agreement with low-energy data. (orig.)
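The one-loop running that underlies such unification analyses is a single logarithm per gauge group: 1/alpha_i(mu) = 1/alpha_i(M_Z) - b_i/(2 pi) * ln(mu/M_Z). The sketch below uses the standard MSSM beta coefficients and rough input couplings at M_Z (values illustrative; thresholds and two-loop terms, which are the paper's actual subject, are ignored):

```python
import math

# One-loop MSSM beta-function coefficients for U(1)_Y (GUT-normalized),
# SU(2)_L, and SU(3)_c, plus rough inverse couplings at M_Z.
B = (33.0 / 5.0, 1.0, -3.0)
ALPHA_INV_MZ = (59.0, 29.6, 8.5)   # illustrative inputs, not a fit
M_Z = 91.19                         # GeV

def alpha_inv(i, mu):
    """One-loop running inverse coupling of group i at scale mu (GeV)."""
    return ALPHA_INV_MZ[i] - B[i] / (2.0 * math.pi) * math.log(mu / M_Z)

def meeting_scale(i, j):
    """Scale at which alpha_i = alpha_j at one loop (closed form)."""
    t = 2.0 * math.pi * (ALPHA_INV_MZ[i] - ALPHA_INV_MZ[j]) / (B[i] - B[j])
    return M_Z * math.exp(t)
```

With MSSM field content the three couplings approximately meet near 2e16 GeV; the tension discussed in the abstract is that the string scale sits noticeably higher, which is what threshold corrections and extra matter are invoked to repair.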

  3. Simulation of microarray data with realistic characteristics

    Directory of Open Access Journals (Sweden)

    Lehmussola Antti

    2006-07-01

Full Text Available Abstract Background Microarray technologies have become common tools in biological research. As a result, a need for effective computational methods for data analysis has emerged. Numerous different algorithms have been proposed for analyzing the data. However, an objective evaluation of the proposed algorithms is not possible due to the lack of biological ground truth information. To overcome this fundamental problem, the use of simulated microarray data for algorithm validation has been proposed. Results We present a microarray simulation model which can be used to validate different kinds of data analysis algorithms. The proposed model is unique in the sense that it includes all the steps that affect the quality of real microarray data. These steps include the simulation of biological ground truth data, applying biological and measurement-technology-specific error models, and finally simulating the microarray slide manufacturing and hybridization. After all these steps are taken into account, the simulated data has realistic biological and statistical characteristics. The applicability of the proposed model is demonstrated by several examples. Conclusion The proposed microarray simulation model is modular and can be used in different kinds of applications. It includes several error models that have been proposed earlier and it can be used with different types of input data. The model can be used to simulate both spotted two-channel and oligonucleotide-based single-channel microarrays. All this makes the model a valuable tool, for example, in the validation of data analysis algorithms.
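The essential idea, ground truth first, then layered error models, can be sketched in miniature. The toy generator below draws ground-truth log2 intensities, marks a fraction of genes as differentially expressed, and adds per-channel measurement noise; its error structure is far simpler than the published model, and all parameters are illustrative:

```python
import random

def simulate_microarray(n_genes=200, frac_de=0.1, effect=2.0,
                        noise_sd=0.3, seed=0):
    """Toy two-channel microarray simulation.

    Returns (truth, red, green): truth[g] is True when gene g was
    simulated as differentially expressed (its red-channel log2 intensity
    is shifted by `effect`), and red/green are noisy log2 intensities.
    Because the ground truth is known, any detection algorithm run on
    (red, green) can be scored objectively.
    """
    rng = random.Random(seed)
    truth, red, green = [], [], []
    for _ in range(n_genes):
        base = rng.gauss(8.0, 2.0)            # ground-truth log2 intensity
        de = rng.random() < frac_de            # differentially expressed?
        shift = effect if de else 0.0
        truth.append(de)
        green.append(base + rng.gauss(0.0, noise_sd))
        red.append(base + shift + rng.gauss(0.0, noise_sd))
    return truth, red, green
```

A validation workflow then computes per-gene log ratios (red minus green), runs the algorithm under test, and compares its calls against `truth`, exactly the kind of objective evaluation the abstract says real data cannot provide.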

  4. Realistic Gamow shell model for resonance and continuum in atomic nuclei

    Science.gov (United States)

    Xu, F. R.; Sun, Z. H.; Wu, Q.; Hu, B. S.; Dai, S. J.

    2018-02-01

The Gamow shell model can describe resonance and continuum for atomic nuclei. The model is established in the complex-momentum (complex-k) plane of the Berggren coordinates, in which bound, resonant, and continuum states are treated on an equal footing self-consistently. In the present work, the realistic nuclear force CD-Bonn has been used. We have developed the full \hat{Q}-box folded-diagram method to derive the realistic effective interaction in a model space which is nondegenerate and contains resonance and continuum channels. The CD-Bonn potential is renormalized using the V low-k method. Choosing 16O as the inert core, we have applied the Gamow shell model to the oxygen isotopes.

  5. Modeling of ultrasonic wave propagation in composite laminates with realistic discontinuity representation.

    Science.gov (United States)

    Zelenyak, Andreea-Manuela; Schorer, Nora; Sause, Markus G R

    2018-02-01

    This paper presents a method for embedding realistic defect geometries of a fiber-reinforced material in a finite element modeling environment in order to simulate active ultrasonic inspection. When ultrasonic inspection is used experimentally to investigate the presence of defects in composite materials, the microscopic defect geometry may cause signal characteristics that are difficult to interpret. Hence, modeling this interaction is key to improving our understanding and interpretation of the acquired ultrasonic signals. To model the true interaction of the ultrasonic wave field with defect structures such as pores, cracks or delaminations, a realistic three-dimensional geometry reconstruction is required. We present a 3D-image-based reconstruction process which converts computed tomography data into surface representations ready to be embedded for processing with finite element methods. Subsequent modeling using these geometries follows a multi-scale and multi-physics simulation approach which yields quantitative A-scan ultrasonic signals that can be directly compared with experimental signals. Therefore, besides the properties of the composite material, a full transducer implementation, piezoelectric conversion and simultaneous modeling of the attached circuit are applied. Comparison between simulated and experimental signals shows very good agreement in electrical voltage amplitude and signal arrival time, and thus validates the proposed modeling approach. Simulating ultrasound wave propagation in a medium with a realistic geometry clearly shows a difference in how the waves are disturbed and ultimately allows more realistic modeling of A-scans. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Full Text Available Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations in which the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal-hydraulic code is the engine of the methodology, but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large-break LOCA analysis for a PWR are provided.

  7. Converting differential-equation models of biological systems to membrane computing.

    Science.gov (United States)

    Muniyandi, Ravie Chandren; Zin, Abdullah Mohd; Sanders, J W

    2013-12-01

    This paper presents a method to convert the deterministic, continuous representation of a biological system by ordinary differential equations into a non-deterministic, discrete membrane computation. The dynamics of the membrane computation are governed by rewrite rules operating at certain rates. This has the advantage of applying accurately to small systems, and of expressing rates of change that are determined locally, by region, but not necessarily globally. Such spatial information augments the standard differentiable approach to provide a more realistic model. A biological case study of the ligand-receptor network of the protein TGF-β is used to validate the effectiveness of the conversion method. It demonstrates the sense in which the behaviours and properties of the system are better preserved in the membrane computing model, suggesting that the proposed conversion method may prove useful for biological systems in particular. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
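The general idea of replacing an ODE with discrete, rate-governed rules can be illustrated (this is a generic sketch using Gillespie's direct method, not the authors' membrane-computing formalism) with a reversible binding reaction L + R ⇌ LR, whose deterministic form is d[LR]/dt = k_on·[L]·[R] − k_off·[LR]. All rate constants and molecule counts below are made-up values.

```python
import random

def gillespie(k_on, k_off, n_L, n_R, n_LR, t_end, seed=0):
    """Stochastic simulation of L + R <-> LR via Gillespie's direct method.
    Discrete rule firings, each at a propensity-dependent rate, stand in
    for the continuous ODE d[LR]/dt = k_on*[L]*[R] - k_off*[LR]."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        a_bind = k_on * n_L * n_R      # propensity of the binding rule
        a_unbind = k_off * n_LR        # propensity of the unbinding rule
        a_total = a_bind + a_unbind
        if a_total == 0.0:
            break                      # no rule can fire
        t += rng.expovariate(a_total)  # waiting time to the next firing
        if rng.random() * a_total < a_bind:
            n_L, n_R, n_LR = n_L - 1, n_R - 1, n_LR + 1
        else:
            n_L, n_R, n_LR = n_L + 1, n_R + 1, n_LR - 1
    return n_L, n_R, n_LR

n_L, n_R, n_LR = gillespie(k_on=0.01, k_off=0.1, n_L=100, n_R=100, n_LR=0, t_end=50.0)
print(n_L, n_R, n_LR)
```

Note that mass conservation (n_L + n_LR and n_R + n_LR are invariant) holds exactly at every step, which is the kind of discrete structural property a rule-based model preserves by construction.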

  8. A realistic neural mass model of the cortex with laminar-specific connections and synaptic plasticity - evaluation with auditory habituation.

    Directory of Open Access Journals (Sweden)

    Peng Wang

    Full Text Available In this work we propose a biologically realistic local cortical circuit model (LCCM), based on neural masses, that incorporates two important aspects of the functional organization of the brain not covered by previous models: (1) activity-dependent plasticity of excitatory synaptic couplings via depletion and recycling of neurotransmitters, and (2) realistic inter-laminar dynamics via laminar-specific distribution of, and connections between, neural populations. The potential of the LCCM was demonstrated by accounting for the process of auditory habituation. The model parameters were specified using Bayesian inference. It was found that: (1) besides the major serial excitatory information pathway (layer 4 to layer 2/3 to layer 5/6), there exists a parallel "short-cut" pathway (layer 4 to layer 5/6); (2) the excitatory signal flow from the pyramidal cells to the inhibitory interneurons appears to be mainly intra-laminar while, in contrast, the inhibitory signal flow from the inhibitory interneurons to the pyramidal cells appears to be both intra- and inter-laminar; and (3) the habituation rates of the connections are asymmetric: forward connections (from layer 4 to layer 2/3) are more strongly habituated than backward connections (from layer 5/6 to layer 4). Our evaluation demonstrates that the novel features of the LCCM are of crucial importance for mechanistic explanations of brain function. The incorporation of these features into a mass model makes it applicable to modeling based on macroscopic data (such as EEG or MEG), which are usually available in human experiments. Our LCCM is therefore a valuable building block for future realistic models of human cognitive function.
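Feature (1), synaptic depression through transmitter depletion and recycling, can be sketched with a Tsodyks-Markram style resource model. The parameter values below are illustrative choices, not the fitted LCCM parameters.

```python
import math

def habituating_responses(n_stims=10, U=0.4, tau_rec=0.8, isi=0.2):
    """Each stimulus releases a fraction U of the available synaptic
    resources x; between stimuli (inter-stimulus interval isi), x
    recovers toward 1 with time constant tau_rec. The response
    amplitude is proportional to U*x, so repeated stimulation
    habituates the response toward a steady state."""
    x = 1.0
    amplitudes = []
    for _ in range(n_stims):
        amplitudes.append(U * x)                         # response to this stimulus
        x -= U * x                                       # resources consumed
        x = 1.0 - (1.0 - x) * math.exp(-isi / tau_rec)   # partial recovery
    return amplitudes

amps = habituating_responses()
print(amps[0], amps[-1])
```

With the inter-stimulus interval shorter than the recovery time constant, the amplitudes decrease monotonically from stimulus to stimulus, which is the qualitative signature of habituation the abstract refers to.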

  9. Bio-heat transfer model of electroconvulsive therapy: Effect of biological properties on induced temperature variation.

    Science.gov (United States)

    de Oliveira, Marilia M; Wen, Paul; Ahfock, Tony

    2016-08-01

    A realistic human head model consisting of six tissue layers was constructed to investigate the temperature profile and peak temperature induced by electroconvulsive therapy stimulation under different biological properties. The thermo-electrical model combines the bio-heat transfer equation with the Laplace equation. Three different electrode montages were analyzed, as well as the influence of blood perfusion, metabolic heat, and the electrical and thermal conductivity of the scalp. The effect of including a fat layer was also investigated. The results showed that the temperature increase is inversely proportional to increases in electrical and thermal conductivity. Furthermore, the inclusion of blood perfusion slightly lowers the peak temperature. Finally, including fat is highly recommended in order to obtain more realistic results from thermo-electrical models.
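In its simplest steady, one-dimensional form, the bio-heat transfer equation reduces to the Pennes equation, k·T'' + w·ρ_b·c_b·(T_a − T) + q_m = 0. A minimal finite-difference sketch follows; it uses assumed, generic single-tissue parameters (not the paper's six-layer values) and omits the Joule-heating source that the stimulation current would add.

```python
def pennes_1d(n=51, L=0.05, k=0.5, w=1e-3, rho_b=1050.0, c_b=3600.0,
              q_m=400.0, T_a=37.0, T_core=37.0, T_surf=34.0,
              tol=1e-10, max_iter=200000):
    """Steady 1-D Pennes bio-heat equation solved by Gauss-Seidel
    relaxation:  k*T'' + w*rho_b*c_b*(T_a - T) + q_m = 0,
    with Dirichlet boundaries: body core at x=0, skin surface at x=L.
    All tissue parameters are illustrative, not fitted values."""
    dx = L / (n - 1)
    perf = w * rho_b * c_b             # perfusion coefficient [W/(m^3 K)]
    T = [T_core] * n
    T[-1] = T_surf
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, n - 1):
            # Discretized balance at node i, solved for T[i]
            new = (k * (T[i - 1] + T[i + 1]) / dx**2 + perf * T_a + q_m) \
                  / (2 * k / dx**2 + perf)
            max_change = max(max_change, abs(new - T[i]))
            T[i] = new
        if max_change < tol:
            break
    return T

T = pennes_1d()
print(min(T), max(T))
```

Because perfusion pulls the tissue toward arterial temperature and the metabolic source is positive, the interior profile stays between the two boundary temperatures (plus a small source bump), which is a quick sanity check on any bio-heat solver.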

  10. Plasticity-modulated seizure dynamics for seizure termination in realistic neuronal models

    NARCIS (Netherlands)

    Koppert, M.M.J.; Kalitzin, S.; Lopes da Silva, F.H.; Viergever, M.A.

    2011-01-01

    In previous studies we showed that autonomous absence seizure generation and termination can be explained by realistic neuronal models eliciting bi-stable dynamics. In these models epileptic seizures are triggered either by external stimuli (reflex epilepsies) or by internal fluctuations. This

  11. Development of vortex model with realistic axial velocity distribution

    International Nuclear Information System (INIS)

    Ito, Kei; Ezure, Toshiki; Ohshima, Hiroyuki

    2014-01-01

    A vortex is considered one of the significant phenomena that may cause gas entrainment (GE) and/or vortex cavitation in sodium-cooled fast reactors. In our past studies, the vortex was assumed to be approximated by the well-known Burgers vortex model. However, the Burgers vortex model makes the simple but unrealistic assumption that the axial velocity component is horizontally constant, while in reality the free-surface vortex has an axial velocity distribution with a large radial gradient near the vortex center. In this study, a new vortex model with a realistic axial velocity distribution is proposed. Like the Burgers vortex model, this model is derived from the steady axisymmetric Navier-Stokes equation, but it considers a realistic radial distribution of axial velocity, defined to be zero at the vortex center and to approach zero asymptotically at infinity. As verification, the new vortex model is applied to the evaluation of a simple vortex experiment and shows good agreement with the experimental data in terms of the circumferential velocity distribution and the free-surface shape. In addition, it is confirmed that the Burgers vortex model fails to calculate an accurate velocity distribution under the assumption of uniform axial velocity. However, the calculation accuracy of the Burgers vortex model can be brought close to that of the new vortex model by considering the effective axial velocity, calculated as the average value only in the vicinity of the vortex center. (author)
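The contrast between the two axial-velocity assumptions can be sketched numerically. The circumferential profile below is the standard Burgers form (solid-body rotation in the core, potential-vortex decay outside); the r-dependent axial profile is a purely illustrative function chosen only to satisfy the boundary behaviour stated in the abstract (zero at the centre, asymptotically zero at infinity). All numerical values are assumptions, not the paper's.

```python
import math

GAMMA = 1.0e-2   # circulation [m^2/s] (illustrative value)
R_C = 5.0e-3     # core radius [m] (illustrative value)

def v_theta(r):
    """Circumferential velocity of a Burgers-type vortex."""
    if r == 0.0:
        return 0.0
    return GAMMA / (2.0 * math.pi * r) * (1.0 - math.exp(-(r / R_C) ** 2))

def w_burgers(a, z):
    """Burgers model: axial velocity 2*a*z, uniform in r
    (the assumption identified as unrealistic)."""
    return 2.0 * a * z

def w_realistic(r, w_max):
    """Illustrative r-dependent axial profile: zero at the vortex
    centre, steep radial gradient near the centre, decaying to zero
    at infinity (qualitative sketch only, not the paper's function)."""
    s = (r / R_C) ** 2
    return w_max * s * math.exp(1.0 - s)   # peaks at r = R_C with value w_max

radii = [i * 1.0e-4 for i in range(1, 200)]
r_peak = max(radii, key=v_theta)
print(r_peak)  # circumferential velocity peaks near 1.12 * R_C
```

The circumferential peak location near 1.12·R_C is a standard property of the Burgers profile, which is why fitting the measured tangential velocity alone cannot distinguish the two models; only the axial distribution does.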

  12. Realistic shell-model calculations for Sn isotopes

    International Nuclear Information System (INIS)

    Covello, A.; Andreozzi, F.; Coraggio, L.; Gargano, A.; Porrino, A.

    1997-01-01

    We report on a shell-model study of the Sn isotopes in which a realistic effective interaction derived from the Paris free nucleon-nucleon potential is employed. The calculations are performed within the framework of the seniority scheme by making use of the chain-calculation method. This provides practically exact solutions while cutting down the amount of computational work required by a standard seniority-truncated calculation. The behavior of the energy of several low-lying states in the isotopes with A ranging from 122 to 130 is presented and compared with the experimental one. (orig.)

  13. Flow visualization through particle image velocimetry in realistic model of rhesus monkey's upper airway.

    Science.gov (United States)

    Kim, Ji-Woong; Phuong, Nguyen Lu; Aramaki, Shin-Ichiro; Ito, Kazuhide

    2018-05-01

    Studies concerning inhalation toxicology and respiratory drug-delivery systems require biological testing involving experiments performed on animals. Particle image velocimetry (PIV) is an effective in vitro technique that reveals detailed inhalation flow patterns, thereby assisting analyses of inhalation exposure to various substances. A realistic model of a rhesus-monkey upper airway was developed to investigate flow patterns in its oral and nasal cavities through PIV experiments performed under steady-state constant inhalation conditions at various flow rates: 4, 10, and 20 L/min. The flow rate of the fluid passing through the inlet into the trachea was measured to obtain characteristic flow mechanisms, and flow phenomena in the model were confirmed via the characterized flow fields. It was observed that an increase in flow rate leads to constant velocity profiles in the upper and lower trachea regions. It is expected that the results of this study will contribute to future validation of studies aimed at developing in silico models, especially those involving computational fluid dynamics (CFD) analysis. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Simplified realistic human head model for simulating Tumor Treating Fields (TTFields).

    Science.gov (United States)

    Wenger, Cornelia; Bomzon, Ze'ev; Salvador, Ricardo; Basser, Peter J; Miranda, Pedro C

    2016-08-01

    Tumor Treating Fields (TTFields) are low-intensity (1-3 V/cm) alternating electric fields in the intermediate frequency range (100-300 kHz). TTFields are an anti-mitotic treatment against solid tumors, approved for Glioblastoma Multiforme (GBM) patients. These electric fields are induced non-invasively by transducer arrays placed directly on the patient's scalp. Cell culture experiments showed that treatment efficacy depends on the induced field intensity. In clinical practice, a software package called NovoTal™ uses head measurements to estimate the optimal array placement that maximizes the electric field delivered to the tumor. Computational studies predict an increase in the tumor's electric field strength when the transducer arrays are adapted to its location. Ideally, a personalized head model could be created for each patient to calculate the electric field distribution for that specific situation; the optimal transducer layout could then be inferred from field calculations rather than distance measurements. Nonetheless, creating realistic head models of patients is time-consuming and often requires user interaction, because automated image segmentation is prone to failure. This study presents a first approach to creating simplified head models consisting of convex hulls of the tissue layers. The model accounts for anisotropic conductivity in the cortical tissues by using a tensor representation estimated from Diffusion Tensor Imaging. The induced electric field distribution is compared between the simplified and realistic head models. The average field intensities in the brain and tumor are generally slightly higher in the realistic head model, with a maximal ratio of 114% for a simplified model with reasonable layer thicknesses. Thus, the present pipeline offers a fast and efficient route toward personalized head models, with less complexity involved in characterizing tissue interfaces while still enabling accurate predictions of the electric field distribution.

  15. Realistic modeling of chamber transport for heavy-ion fusion

    International Nuclear Information System (INIS)

    Sharp, W.M.; Grote, D.P.; Callahan, D.A.; Tabak, M.; Henestroza, E.; Yu, S.S.; Peterson, P.F.; Welch, D.R.; Rose, D.V.

    2003-01-01

    Transport of intense heavy-ion beams to an inertial-fusion target after final focus is simulated here using a realistic computer model. It is found that passing the beam through a rarefied plasma layer before it enters the fusion chamber can largely neutralize the beam space charge and lead to a usable focal spot for a range of ion species and input conditions

  16. Electron percolation in realistic models of carbon nanotube networks

    International Nuclear Information System (INIS)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-01-01

    The influence of penetrable and curved carbon nanotubes (CNTs) on charge percolation in three-dimensional disordered CNT networks has been studied with Monte Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower-aspect-ratio CNTs are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNTs to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNTs in the random networks leads to an increasing percolation threshold and to a decreasing electrical conductivity at saturation. The waviness of CNTs decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulations in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.
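The Monte Carlo approach can be conveyed with a two-dimensional stick-percolation analogue. The study itself is three-dimensional with hard-core/soft-shell contact criteria; this sketch reduces contact to simple segment crossing, and the stick counts, lengths, and seeds are made-up values.

```python
import math
import random

def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 properly crosses segment p3-p4
    (orientation test; collinear touching is ignored)."""
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def percolates(n_sticks, length, seed=0):
    """Drop n randomly placed, randomly oriented 'nanotubes' in the
    unit square; report whether a connected cluster bridges the left
    and right edges (union-find over crossing sticks)."""
    rng = random.Random(seed)
    sticks = []
    for _ in range(n_sticks):
        x, y = rng.random(), rng.random()
        ang = rng.random() * math.pi
        dx, dy = 0.5*length*math.cos(ang), 0.5*length*math.sin(ang)
        sticks.append(((x-dx, y-dy), (x+dx, y+dy)))
    parent = list(range(n_sticks + 2))   # two extra nodes: left and right edges
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    LEFT, RIGHT = n_sticks, n_sticks + 1
    for i, (a, b) in enumerate(sticks):
        if min(a[0], b[0]) <= 0.0:
            union(i, LEFT)
        if max(a[0], b[0]) >= 1.0:
            union(i, RIGHT)
        for j in range(i):
            if segments_intersect(a, b, *sticks[j]):
                union(i, j)
    return find(LEFT) == find(RIGHT)

print(percolates(400, 0.15, seed=1))
```

Sweeping `n_sticks` at fixed `length` over many seeds and recording the fraction of percolating realizations is the standard way to locate the percolation threshold that the abstract discusses; curvature and penetrability shift that threshold.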


  18. Adding a little reality to building ontologies for biology.

    Directory of Open Access Journals (Sweden)

    Phillip Lord

    Full Text Available BACKGROUND: Many areas of biology are open to mathematical and computational modelling. The application of discrete, logical formalisms defines the field of biomedical ontologies. Ontologies have been put to many uses in bioinformatics. The most widespread is the description of entities about which data have been collected, allowing integration and analysis across multiple resources. There are now over 60 ontologies in active use, increasingly developed as large, international collaborations. There are, however, many opinions on how ontologies should be authored; that is, on what is appropriate to represent. Recently, a common opinion has been the "realist" approach, which places restrictions upon the style of modelling considered appropriate. METHODOLOGY/PRINCIPAL FINDINGS: Here, we use a number of case studies describing the results of biological experiments. We investigate the ways in which these could be represented using both realist and non-realist approaches, and consider the limitations and advantages of each of these models. CONCLUSIONS/SIGNIFICANCE: From our analysis, we conclude that while realist principles may enable straightforward modelling for some topics, there are crucial aspects of science and the phenomena it studies that do not fit this approach; realism appears to be over-simplistic which, perversely, results in overly complex ontological models. We suggest that it is impossible to avoid compromise in ontology modelling; a clearer understanding of these compromises will better enable appropriate modelling, fulfilling the many needs for discrete mathematical models within computational biology.

  19. A scan for models with realistic fermion mass patterns

    International Nuclear Information System (INIS)

    Bijnens, J.; Wetterich, C.

    1986-03-01

    We consider models which have no small Yukawa couplings unrelated to symmetry. This situation is generic in higher dimensional unification where Yukawa couplings are predicted to have strength similar to the gauge couplings. Generations have then to be differentiated by symmetry properties and the structure of fermion mass matrices is given in terms of quantum numbers alone. We scan possible symmetries leading to realistic mass matrices. (orig.)

  20. Toward developing more realistic groundwater models using big data

    Science.gov (United States)

    Vahdat Aboueshagh, H.; Tsai, F. T. C.; Bhatta, D.; Paudel, K.

    2017-12-01

    Rich geological data are the backbone of developing realistic groundwater models for groundwater resources management. However, constructing realistic groundwater models can be challenging due to inconsistency between different sources of geological, hydrogeological and geophysical data and the difficulty of processing big data to characterize the subsurface environment. This study develops a framework to utilize a big geological dataset to create a groundwater model for the Chicot Aquifer in southwestern Louisiana, which borders the Gulf of Mexico to the south. The Chicot Aquifer is the principal source of fresh water in southwest Louisiana, underlying an area of about 9,000 square miles. Agriculture is the largest groundwater consumer in this region, and overpumping has caused significant groundwater head decline and saltwater intrusion from the Gulf and deep formations. A hydrostratigraphy model was constructed using around 29,000 electrical logs and drillers' logs, as well as screen lengths of pumping wells, through a natural-neighbor interpolation method. These sources of information carry different weights in terms of accuracy and trustworthiness. A data prioritization procedure was developed to filter untrustworthy log information, eliminate redundant data, and establish consensus among various lithological information. The constructed hydrostratigraphy model shows 40% sand facies, which is consistent with the well-log data. The hydrostratigraphy model confirms outcrop areas of the Chicot Aquifer in the north of the study region. The aquifer sand formation thins eastward to merge into the Atchafalaya River alluvial aquifer and coalesces with the underlying Evangeline aquifer. A grid generator was used to convert the hydrostratigraphy model into a MODFLOW grid with 57 layers. A Chicot groundwater model was constructed using the available hydrologic and hydrogeological data for 2004-2015. Pumping rates for irrigation wells were estimated using the crop type and acreage

  1. Modeling and Analysis of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.

    2015-01-01

    An accidental fire inside a spacecraft is an unlikely but very real emergency that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments and detailed numerical modeling to provide the data necessary for vehicle and safety-system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large-eddy-simulation form of the Navier-Stokes equations. FDS provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where the detailed treatment of mass and energy transport is unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict maximum survivable fires aboard spacecraft. This one-dimensional model simplifies the heat and mass transfer as well as the toxic-species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (having already been used to help in the hatch design of the Multi-Purpose Crew Vehicle, MPCV).

  2. Water versus DNA A new deal for proton transport modeling in biological matter

    International Nuclear Information System (INIS)

    Champion, C; Quinto, M A; Monti, J M; Galassi, M E; Fojón, O A; Hanssen, J; Rivarola, R D; Week, P F

    2015-01-01

    Water vapor is a common surrogate of DNA for modeling the proton-induced ionizing processes in living tissue exposed to radiations. The present study aims at scrutinizing the validity of this approximation and then revealing new insights into proton-induced energy transfers by a comparative analysis between water and realistic biological medium. In this context, self-consistent quantum mechanical modeling of the ionization and electron capture processes is reported within the continuum distorted wave-eikonal initial state framework for both isolated water molecules and DNA components impacted by proton beams. (paper)

  3. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2014-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating

  4. Realistic modeling of seismic input for megacities and large urban areas

    International Nuclear Information System (INIS)

    Panza, Giuliano F.; Alvarez, Leonardo; Aoudia, Abdelkrim

    2002-06-01

    The project addressed the problem of pre-disaster orientation: hazard prediction, risk assessment, and hazard mapping, in connection with seismic activity and man-induced vibrations. The definition of realistic seismic input has been obtained from the computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different source and structural models. The innovative modeling technique that constitutes the common tool of the entire project takes into account source, propagation and local site effects. This is done using first principles of physics about wave generation and propagation in complex media, and does not require resorting to convolutive approaches, which have proven quite unreliable, mainly when dealing with the complex geological structures that are the most interesting from the practical point of view. In fact, several techniques that have been proposed to empirically estimate site effects, using observations convolved with theoretically computed signals corresponding to simplified models, supply reliable information about the site response to non-interfering seismic phases. They are not adequate in most real cases, when the seismic signal is formed by several interfering waves. The availability of realistic numerical simulations enables us to reliably estimate the amplification effects even in complex geological structures, exploiting the available geotechnical, lithological and geophysical parameters, the topography of the medium, tectonic, historical and palaeoseismological data, and seismotectonic models. The realistic modeling of the ground motion is a very important base of knowledge for the preparation of ground-shaking scenarios, which represent a valid and economical tool for seismic microzonation.
This knowledge can be very fruitfully used by civil engineers in the design of new seismo-resistant constructions and in the reinforcement of the existing built environment, and, therefore

  5. Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention

    National Research Council Canada - National Science Library

    Itti, L; Dhavale, N; Pighin, F

    2003-01-01

    We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained...

  6. Finite Time Blowup in a Realistic Food-Chain Model

    KAUST Repository

    Parshad, Rana; Ait Abderrahmane, Hamid; Upadhyay, Ranjit Kumar; Kumari, Nitu

    2013-01-01

    We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other known models in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in any parameter regime. We also provide estimates of its fractal dimension as well as numerical simulations to visualise the spatiotemporal chaos.
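The blowup mechanism can be seen in the top-predator equation alone: in modified Leslie-Gower models of this type, with the prey level v held fixed, the top predator obeys dw/dt = (c − w3/(v + d3))·w² = ε·w², whose exact solution w(t) = w0/(1 − ε·w0·t) diverges at t* = 1/(ε·w0) whenever ε > 0. A numerical sketch with made-up coefficients (not the paper's parameter values) follows.

```python
def top_predator_growth(w, eps):
    """dw/dt = eps * w**2: the top-predator equation of a modified
    Leslie-Gower chain with the prey level frozen, where
    eps = c - w3/(v + d3). For eps > 0 the exact solution
    w(t) = w0/(1 - eps*w0*t) blows up at t* = 1/(eps*w0)."""
    return eps * w * w

def integrate_until_blowup(w0, eps, dt=1e-4, cap=1e6):
    """March dw/dt = eps*w^2 forward with RK4 until w exceeds cap;
    the time reached is a numerical estimate of the blowup time."""
    w, t = w0, 0.0
    while w < cap:
        k1 = top_predator_growth(w, eps)
        k2 = top_predator_growth(w + dt/2*k1, eps)
        k3 = top_predator_growth(w + dt/2*k2, eps)
        k4 = top_predator_growth(w + dt*k3, eps)
        w += dt/6*(k1 + 2*k2 + 2*k3 + k4)
        t += dt
    return t

# Illustrative numbers (assumptions, not from the paper):
# c = 0.05, w3 = 1, v = 30, d3 = 20  =>  eps = 0.03 > 0, blowup regime.
eps = 0.05 - 1.0/(30.0 + 20.0)
t_blow = integrate_until_blowup(w0=2.0, eps=eps)
print(t_blow)  # exact blowup time is 1/(eps*w0) = 16.67 for these numbers
```

Flipping the sign of ε (e.g. lowering c or v) makes w decay instead, which mirrors the paper's point that a modification of the same term restores global existence.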


  8. Development of Realistic Head Models for Electromagnetic Source Imaging of the Human Brain

    National Research Council Canada - National Science Library

    Akalin, Z

    2001-01-01

    In this work, a methodology is developed to solve the forward problem of electromagnetic source imaging using realistic head models. For this purpose, first segmentation of the 3 dimensional MR head...

  9. Realistic modelling of the seismic input: Site effects and parametric studies

    International Nuclear Information System (INIS)

    Romanelli, F.; Vaccari, F.; Panza, G.F.

    2002-11-01

    We illustrate the work done in the framework of a large international cooperation, showing the very recent numerical experiments carried out within the EC project 'Advanced methods for assessing the seismic vulnerability of existing motorway bridges' (VAB) to assess the importance of non-synchronous seismic excitation of long structures. The definition of the seismic input at the Warth bridge site, i.e. the determination of the seismic ground motion due to an earthquake with a given magnitude and epicentral distance from the site, has been done following a theoretical approach. In order to perform an accurate and realistic estimate of site effects and of differential motion, it is necessary to carry out a parametric study that takes into account the complex combination of source and propagation parameters in realistic geological structures. The computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different sources and structural models, allows the construction of damage scenarios that are beyond the reach of stochastic models, at a very low cost/benefit ratio. (author)

  10. Integrative computational models of cardiac arrhythmias -- simulating the structurally realistic heart

    Science.gov (United States)

    Trayanova, Natalia A; Tice, Brock M

    2009-01-01

    Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585

  11. IBM parameters derived from realistic shell-model Hamiltonian via Hn-cooling method

    International Nuclear Information System (INIS)

    Nakada, Hitoshi

    1997-01-01

    There is a certain influence of non-collective degrees-of-freedom even in lowest-lying states of medium-heavy nuclei. This influence seems to be significant for some of the IBM parameters. In order to take it into account, several renormalization approaches have been applied. It has been shown in the previous studies that the influence of the G-pairs is important, but does not fully account for the fitted values. The influence of the non-collective components may be more serious when we take a realistic effective nucleonic interaction. To incorporate this influence into the IBM parameters, we employ the recently developed H n -cooling method. This method is applied to renormalize the wave functions of the states consisting of the SD-pairs, for the Cr-Fe nuclei. On this ground, the IBM Hamiltonian and transition operators are derived from corresponding realistic shell-model operators, for the Cr-Fe nuclei. Together with some features of the realistic interaction, the effects of the non-SD degrees-of-freedom are presented. (author)

  12. Realistic Visualization of Virtual Views

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    ...that can be impractical and sometimes impossible. In addition, the artificial nature of data often makes visualized virtual scenarios not realistic enough, in the sense that a synthetic scene is easy to discriminate visually from a natural scene. A new field of research has consequently developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high-fidelity representation of virtual scenarios while easing modeling and physical phenomena simulation. In particular, realism is achieved by the transfer to the novel view of all the physical phenomena captured in the reference photographs (i.e. the transfer of photographic realism). An overview of the most prominent approaches in realistic virtual view synthesis will be presented and briefly discussed. Applications of proposed methods to visual survey, virtual cinematography, as well as mobile...

  13. Correcting electrode modelling errors in EIT on realistic 3D head models.

    Science.gov (United States)

    Jehl, Markus; Avery, James; Malone, Emma; Holder, David; Betcke, Timo

    2015-12-01

    Electrical impedance tomography (EIT) is a promising medical imaging technique which could aid differentiation of haemorrhagic from ischaemic stroke in an ambulance. One challenge in EIT is the ill-posed nature of the image reconstruction, i.e., that small measurement or modelling errors can result in large image artefacts. It is therefore important that reconstruction algorithms are improved with regard to stability to modelling errors. We identify that wrongly modelled electrode positions constitute one of the biggest sources of image artefacts in head EIT. Therefore, the use of the Fréchet derivative on the electrode boundaries in a realistic three-dimensional head model is investigated, in order to reconstruct electrode movements simultaneously to conductivity changes. We show a fast implementation and analyse the performance of electrode position reconstructions in time-difference and absolute imaging for simulated and experimental voltages. Reconstructing the electrode positions and conductivities simultaneously increased the image quality significantly in the presence of electrode movement.

  14. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After having outlined the AREVA realistic methodology, this paper will be focused on the GALILEO code benchmarking process, on its extended experimental database and on the GALILEO model uncertainties assessment. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This GALILEO model uncertainties processing is of the utmost importance for accurate fuel design margin evaluation, as illustrated on some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)

  15. Modeling human risk: Cell & molecular biology in context

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    It is anticipated that early in the next century manned missions into outer space will occur, with a mission to Mars scheduled between 2015 and 2020. However, before such missions can be undertaken, a realistic estimation of the potential risks to the flight crews is required. One of the uncertainties remaining in this risk estimation is that posed by the effects of exposure to the radiation environment of outer space. Although the composition of this environment is fairly well understood, the biological effects arising from exposure to it are not. The reasons for this are three-fold: (1) A small but highly significant component of the radiation spectrum in outer space consists of highly charged, high energy (HZE) particles which are not routinely experienced on earth, and for which there are insufficient data on biological effects; (2) Most studies on the biological effects of radiation to date have been high-dose, high dose-rate, whereas in space, with the exception of solar particle events, radiation exposures will be low-dose, low dose-rate; (3) Although it has been established that the virtual absence of gravity in space has a profound effect on human physiology, it is not clear whether these effects will act synergistically with those of radiation exposure. A select panel will evaluate the use of experiments and models to accurately predict the risks associated with exposure to HZE particles. Topics of research include cellular and tissue response, health effects associated with radiation damage, model animal systems, and critical markers of radiation response.

  16. Radiation Damage to Nervous System: Designing Optimal Models for Realistic Neuron Morphology in Hippocampus

    Science.gov (United States)

    Batmunkh, Munkhbaatar; Bugay, Alexander; Bayarchimeg, Lkhagvaa; Lkhagva, Oidov

    2018-02-01

    The present study is focused on the development of optimal models of neuron morphology for Monte Carlo microdosimetry simulations of initial radiation-induced events of heavy charged particles in the specific types of cells of the hippocampus, which is the most radiation-sensitive structure of the central nervous system. The neuron geometry and particle track structures were simulated by the Geant4/Geant4-DNA Monte Carlo toolkits. The calculations were made for beams of protons and heavy ions with different energies and doses corresponding to real fluxes of galactic cosmic rays. A simple compartmental model and a complex model with realistic morphology extracted from experimental data were constructed and compared. We estimated the distribution of the energy deposition events and the production of reactive chemical species within the developed models of CA3/CA1 pyramidal neurons and DG granule cells of the rat hippocampus under exposure to different particles with the same dose. Similar distributions of the energy deposition events and concentration of some oxidative radical species were obtained in both the simplified and realistic neuron models.

  17. Any realistic theory must be computationally realistic: a response to N. Gisin's definition of a Realistic Physics Theory

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with the requirement that any realistic theory must be computationally realistic as well.

  18. Simulation of size-dependent aerosol deposition in a realistic model of the upper human airways

    NARCIS (Netherlands)

    Frederix, E.M.A.; Kuczaj, Arkadiusz K.; Nordlund, Markus; Belka, M.; Lizal, F.; Elcner, J.; Jicha, M.; Geurts, Bernardus J.

    An Eulerian internally mixed aerosol model is used for predictions of deposition inside a realistic cast of the human upper airways. The model, formulated in the multi-species and compressible framework, is solved using the sectional discretization of the droplet size distribution function to

  19. Realistic neurons can compute the operations needed by quantum probability theory and other vector symbolic architectures.

    Science.gov (United States)

    Stewart, Terrence C; Eliasmith, Chris

    2013-06-01

    Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).

  20. Bell Operator Method to Classify Local Realistic Theories

    International Nuclear Information System (INIS)

    Nagata, Koji

    2010-01-01

    We review the history of multipartite Bell inequalities with an arbitrary number of settings. An explicit local realistic model for the values of a correlation function, given in a two-setting Bell experiment (two-setting model), works only for the specific set of settings in the given experiment, but cannot be used to construct a local realistic model for the values of a correlation function given in a continuous-infinite-settings Bell experiment (infinite-setting model), even though there exist two-setting models for all directions in space. Hence, the two-setting model does not have the property that the infinite-setting model has. Here, we show that an explicit two-setting model cannot be used to construct a local realistic model for the values of a correlation function given in an M-setting Bell experiment (M-setting model), even though there exist two-setting models for the M measurement directions chosen in the given M-setting experiment. Hence, the two-setting model does not have the property that the M-setting model has. (general)

  1. Modelling Analysis of Echo Signature and Target Strength of a Realistically Modelled Ship Wake for a Generic Forward Looking Active Sonar

    NARCIS (Netherlands)

    Schippers, P.

    2009-01-01

    The acoustic modelling in TNO’s ALMOST (=Acoustic Loss Model for Operational Studies and Tasks) uses a bubble migration model as realistic input for wake modelling. The modelled bubble cloud represents the actual ship wake. Ship hull, propeller and bow wave are the main generators of bubbles in the

  2. Statistical multi-path exposure method for assessing the whole-body SAR in a heterogeneous human body model in a realistic environment.

    Science.gov (United States)

    Vermeeren, Günter; Joseph, Wout; Martens, Luc

    2013-04-01

    Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied for the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.

  3. Fault-Tolerant Robot Programming through Simulation with Realistic Sensor Models

    Directory of Open Access Journals (Sweden)

    Axel Waggershauser

    2008-11-01

    Full Text Available We introduce a simulation system for mobile robots that allows a realistic interaction of multiple robots in a common environment. The simulated robots are closely modeled after robots from the EyeBot family and have an identical application programmer interface. The simulation supports driving commands at two levels of abstraction as well as numerous sensors such as shaft encoders, infrared distance sensors, and compass. Simulation of on-board digital cameras via synthetic images allows the use of image processing routines for robot control within the simulation. Specific error models for actuators, distance sensors, camera sensor, and wireless communication have been implemented. Progressively increasing error levels for an application program allows for testing and improving its robustness and fault-tolerance.
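Sensor error models of the kind described above can be as simple as additive noise plus occasional dropout. The sketch below is a hypothetical infrared distance-sensor error model, not the simulator's actual code; the noise level, dropout rate, and range are assumed values.

```python
import random

def noisy_distance(true_dist, rng, sigma=0.02, dropout=0.05, max_range=4.0):
    """Toy infrared distance-sensor error model: additive Gaussian noise
    plus occasional dropout, where the sensor reports its maximum range.
    All parameters are illustrative assumptions."""
    if rng.random() < dropout:
        return max_range                       # dropout: no echo detected
    reading = true_dist + rng.gauss(0.0, sigma)
    return min(max(reading, 0.0), max_range)   # clamp to the sensor's range

rng = random.Random(7)
samples = [noisy_distance(1.0, rng) for _ in range(100)]
```

Raising `sigma` or `dropout` for a given run is one way to progressively stress-test an application program's fault tolerance, as the abstract suggests.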

  4. Modeling human risk: Cell & molecular biology in context

    International Nuclear Information System (INIS)

    1997-06-01

    It is anticipated that early in the next century manned missions into outer space will occur, with a mission to Mars scheduled between 2015 and 2020. However, before such missions can be undertaken, a realistic estimation of the potential risks to the flight crews is required. One of the uncertainties remaining in this risk estimation is that posed by the effects of exposure to the radiation environment of outer space. Although the composition of this environment is fairly well understood, the biological effects arising from exposure to it are not. The reasons for this are three-fold: (1) A small but highly significant component of the radiation spectrum in outer space consists of highly charged, high energy (HZE) particles which are not routinely experienced on earth, and for which there are insufficient data on biological effects; (2) Most studies on the biological effects of radiation to date have been high-dose, high dose-rate, whereas in space, with the exception of solar particle events, radiation exposures will be low-dose, low dose-rate; (3) Although it has been established that the virtual absence of gravity in space has a profound effect on human physiology, it is not clear whether these effects will act synergistically with those of radiation exposure. A select panel will evaluate the use of experiments and models to accurately predict the risks associated with exposure to HZE particles. Topics of research include cellular and tissue response, health effects associated with radiation damage, model animal systems, and critical markers of radiation response.

  5. The effect of a realistic thermal diffusivity on numerical model of a subducting slab

    Science.gov (United States)

    Maierova, P.; Steinle-Neumann, G.; Cadek, O.

    2010-12-01

    A number of numerical studies of subducting slabs assume simplified (constant or only depth-dependent) models of thermal conductivity. The available mineral physics data indicate, however, that thermal diffusivity is strongly temperature- and pressure-dependent and may also vary among different mantle materials. In the present study, we examine the influence of realistic thermal properties of mantle materials on the thermal state of the upper mantle and the dynamics of subducting slabs. On the basis of the data published in the mineral physics literature we compile analytical relationships that approximate the pressure and temperature dependence of thermal diffusivity for major mineral phases of the mantle (olivine, wadsleyite, ringwoodite, garnet, clinopyroxenes, stishovite and perovskite). We propose a simplified composition of mineral assemblages predominating in the subducting slab and the surrounding mantle (pyrolite, mid-ocean ridge basalt, harzburgite) and we estimate their thermal diffusivity using the Hashin-Shtrikman bounds. The resulting complex formula for the diffusivity of each aggregate is then approximated by a simpler analytical relationship that is used in our numerical model as an input parameter. For the numerical modeling we use the Elmer software (open source finite element software for multiphysical problems, see http://www.csc.fi/english/pages/elmer). We set up a 2D Cartesian thermo-mechanical steady-state model of a subducting slab. The model is partly kinematic as the flow is driven by a boundary condition on velocity that is prescribed on the top of the subducting lithospheric plate. Rheology of the material is non-linear and is coupled with the thermal equation. Using the realistic relationship for the thermal diffusivity of mantle materials, we compute the thermal and flow fields for different input velocities and ages of the subducting plate and we compare the results against models assuming a constant thermal diffusivity.
The importance of the
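The Hashin-Shtrikman averaging step mentioned in the abstract can be sketched for a two-phase aggregate. The function below implements the standard two-phase HS bounds for an isotropic scalar property such as conductivity or diffusivity; the phase values and volume fractions in the example are illustrative, not the study's values.

```python
def hs_bounds(k_a, f_a, k_b, f_b):
    """Hashin-Shtrikman bounds on the effective conductivity/diffusivity of
    an isotropic two-phase aggregate (volume fractions must sum to 1)."""
    assert abs(f_a + f_b - 1.0) < 1e-9
    if k_a == k_b:                     # identical phases: bounds coincide
        return k_a, k_a

    def hs(k_m, f_m, k_i, f_i):
        # Phase m treated as the matrix enclosing inclusions of phase i.
        return k_m + f_i / (1.0 / (k_i - k_m) + f_m / (3.0 * k_m))

    (k_lo, f_lo), (k_hi, f_hi) = sorted([(k_a, f_a), (k_b, f_b)])
    # Softer phase as matrix gives the lower bound, stiffer the upper.
    return hs(k_lo, f_lo, k_hi, f_hi), hs(k_hi, f_hi, k_lo, f_lo)

lower, upper = hs_bounds(3.0, 0.6, 5.0, 0.4)  # illustrative phase values
```

The resulting bounds always lie inside the Reuss (harmonic) and Voigt (arithmetic) averages, which is why they are a common choice for estimating aggregate properties.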

  6. More Realistic Face Model Surface Improves Relevance of Pediatric In-Vitro Aerosol Studies.

    Science.gov (United States)

    Amirav, Israel; Halamish, Asaf; Gorenberg, Miguel; Omar, Hamza; Newhouse, Michael T

    2015-01-01

    Various hard face models are commonly used to evaluate the efficiency of aerosol face masks. Softer, more realistic "face" surface materials, like skin, deform upon mask application and should provide more relevant in-vitro tests. Studies that simultaneously take into consideration many of the factors characteristic of the in vivo face are lacking. These include airways, various application forces, comparison of various devices, comparison with a hard-surface model and use of a more representative model face based on large numbers of actual faces. To compare mask-to-"face" seal and aerosol delivery of two pediatric masks using a soft vs. a hard, appropriately representative, pediatric face model under various applied forces. Two identical face models and upper airway replicas were constructed, the only difference being the suppleness and compressibility of the surface layer of the "face." Integrity of the seal and aerosol delivery of two different masks [AeroChamber (AC) and SootherMask (SM)] were compared using a breath simulator, filter collection and realistic applied forces. The soft "face" significantly increased the delivery efficiency and the sealing characteristics of both masks. Aerosol delivery with the soft "face" was significantly greater for the SM compared to the AC, whereas no significant difference between the masks was observed with the hard "face." The material and pliability of the model "face" surface have a significant influence on both the seal and delivery efficiency of face masks. This finding should be taken into account during in-vitro aerosol studies.

  7. Characterization of photomultiplier tubes with a realistic model through GPU-boosted simulation

    Science.gov (United States)

    Anthony, M.; Aprile, E.; Grandi, L.; Lin, Q.; Saldanha, R.

    2018-02-01

    The accurate characterization of a photomultiplier tube (PMT) is crucial in a wide variety of applications. However, current methods do not give fully accurate representations of the response of a PMT, especially at very low light levels. In this work, we present a new and more realistic model of the response of a PMT, called the cascade model, and use it to characterize two different PMTs at various voltages and light levels. The cascade model is shown to outperform the more common Gaussian model in almost all circumstances and to agree well with a newly introduced model independent approach. The technical and computational challenges of this model are also presented along with the employed solution of developing a robust GPU-based analysis framework for this and other non-analytical models.

  8. Neural network models: from biology to many - body phenomenology

    International Nuclear Information System (INIS)

    Clark, J.W.

    1993-01-01

    Theoretical work in neural networks has a strange feel for most physicists. In some cases the aspect of design becomes paramount. More comfortable ground, at least for many-body theorists, may be found in realistic biological simulation, although the complexity of most problems is so awesome that incisive results will be hard won. It has also been shown that the impressive capabilities of artificial networks in pattern recognition and classification may be exploited to solve management problems in experimental physics and for the discovery of radically new theoretical descriptions of physical systems. This advance represents an important step towards the ultimate goal of the neurobiological paradigm. (A.B.)

  9. Convective aggregation in idealised models and realistic equatorial cases

    Science.gov (United States)

    Holloway, Chris

    2015-04-01

    Idealised explicit convection simulations of the Met Office Unified Model are shown to exhibit spontaneous self-aggregation in radiative-convective equilibrium, as seen previously in other models in several recent studies. This self-aggregation is linked to feedbacks between radiation, surface fluxes, and convection, and the organization is intimately related to the evolution of the column water vapour (CWV) field. To investigate the relevance of this behaviour to the real world, these idealized simulations are compared with five 15-day cases of real organized convection in the tropics, including multiple simulations of each case testing sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. Despite similar large-scale forcing via lateral boundary conditions, systematic differences in mean CWV, CWV distribution shape, and the length scale of CWV features are found between the different sensitivity runs, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations.

  10. The Realistic Versus the Spherical Head Model in EEG Dipole Source Analysis in the Presence of Noise

    National Research Council Canada - National Science Library

    Vanrumste, Bart

    2001-01-01

    .... For 27 electrodes, an EEG epoch of one time sample and spatially white Gaussian noise we found that the importance of the realistic head model over the spherical head model reduces by increasing the noise level.

  11. Problem Posing with Realistic Mathematics Education Approach in Geometry Learning

    Science.gov (United States)

    Mahendra, R.; Slamet, I.; Budiyono

    2017-09-01

    One of the difficulties students face in learning geometry is the subject of the plane, which requires students to understand abstract matter. The aim of this research is to determine the effect of the Problem Posing learning model with a Realistic Mathematics Education Approach on geometry learning. This quasi-experimental research was conducted in one of the junior high schools in Karanganyar, Indonesia. The sample was taken using a stratified cluster random sampling technique. The results of this research indicate that the Problem Posing learning model with a Realistic Mathematics Education Approach can significantly improve students' conceptual understanding in geometry learning, especially on plane topics. This is because students taught with Problem Posing and the Realistic Mathematics Education Approach become active in constructing their knowledge, posing problems, and solving problems in realistic contexts, so it is easier for them to understand concepts and solve problems. Therefore, the Problem Posing learning model with a Realistic Mathematics Education Approach is appropriate for mathematics learning, especially for geometry material. Furthermore, it can improve student achievement.

  12. A realistic extension of gauge-mediated SUSY-breaking model with superconformal hidden sector

    International Nuclear Information System (INIS)

    Asano, Masaki; Hisano, Junji; Okada, Takashi; Sugiyama, Shohei

    2009-01-01

    The sequestering of supersymmetry (SUSY) breaking parameters, which is induced by a superconformal hidden sector, is one of the solutions for the μ/Bμ problem in the gauge-mediated SUSY-breaking scenario. However, it is found that the minimal messenger model does not yield the correct electroweak symmetry breaking. In this Letter we present a model which has the coupling of the messengers with the SO(10) GUT-symmetry-breaking Higgs fields. The model is one of the realistic extensions of the gauge mediation model with a superconformal hidden sector. It is shown that the extension is applicable for a broad range of conformality-breaking scales.

  13. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    Directory of Open Access Journals (Sweden)

    Thomas E Gorochowski

    Full Text Available Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model allowing for the description of intricate micro-scale structures, enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI)-recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.

  14. Comparative study of non-premixed and partially-premixed combustion simulations in a realistic Tay model combustor

    OpenAIRE

    Zhang, K.; Ghobadian, A.; Nouri, J. M.

    2017-01-01

    A comparative study of two combustion models based on non-premixed assumption and partially premixed assumptions using the overall models of Zimont Turbulent Flame Speed Closure Method (ZTFSC) and Extended Coherent Flamelet Method (ECFM) are conducted through Reynolds stress turbulence modelling of Tay model gas turbine combustor for the first time. The Tay model combustor retains all essential features of a realistic gas turbine combustor. It is seen that the non-premixed combustion model fa...

  15. Spike Neural Models Part II: Abstract Neural Models

    Directory of Open Access Journals (Sweden)

    Johnson, Melissa G.

    2018-02-01

    Full Text Available Neurons are complex cells that require a lot of time and resources to model completely. In spiking neural networks (SNNs), though, not all that complexity is required. Therefore simple, abstract models are often used. These models save time, use fewer computer resources, and are easier to understand. This tutorial presents two such models: Izhikevich's model, which is biologically realistic in the resulting spike trains but not in the parameters, and the Leaky Integrate and Fire (LIF) model, which is not biologically realistic but does quickly and easily integrate input to produce spikes. Izhikevich's model is based on Hodgkin-Huxley's model but simplified such that it uses only two differential equations and four parameters to produce various realistic spike patterns. LIF is based on a standard electrical circuit and contains one equation. Either of these two models, or any of the many other models in the literature, can be used in an SNN. Choosing a neural model is an important task that depends on the goal of the research and the resources available. Once a model is chosen, network decisions such as connectivity, delay, and sparseness need to be made. Understanding neural models and how they are incorporated into the network is the first step in creating an SNN.
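The LIF model described above reduces to a single update equation: the membrane potential leaks toward rest, is driven by input current, and resets after crossing threshold. A minimal sketch, with assumed parameter values rather than the tutorial's:

```python
def simulate_lif(current, dt=1e-4, tau=0.02, v_rest=-65e-3,
                 v_reset=-65e-3, v_thresh=-50e-3, r_m=1e7):
    """Euler integration of the leaky integrate-and-fire equation
    dV/dt = (-(V - V_rest) + R*I) / tau. Returns spike times in seconds.
    Parameter values are illustrative assumptions."""
    v = v_rest
    spikes = []
    for step, i_inj in enumerate(current):
        v += dt * (-(v - v_rest) + r_m * i_inj) / tau
        if v >= v_thresh:          # threshold crossing: emit spike, reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

spikes = simulate_lif([2e-9] * 10000)  # 1 s of constant 2 nA input
```

With these assumed values the steady drive (R·I = 20 mV) exceeds the 15 mV gap between rest and threshold, so the neuron fires regularly, illustrating how a one-equation model still produces spike trains.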

  16. CaliBayes and BASIS: integrated tools for the calibration, simulation and storage of biological simulation models.

    Science.gov (United States)

    Chen, Yuhui; Lawless, Conor; Gillespie, Colin S; Wu, Jake; Boys, Richard J; Wilkinson, Darren J

    2010-05-01

    Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing sub-cellular molecular interactions, especially when critical molecular species are thought to be present at low copy-numbers. For example, these stochastic effects play an important role in models of human ageing, where ageing results from the long-term accumulation of random damage at various biological scales. Unfortunately, realistic stochastic simulation of discrete biological processes is highly computationally intensive, requiring specialist hardware, and can benefit greatly from parallel and distributed approaches to computation and analysis. For these reasons, we have developed the BASIS system for the simulation and storage of stochastic SBML models together with associated simulation results. This system is exposed as a set of web services to allow users to incorporate its simulation tools into their workflows. Parameter inference for stochastic models is also difficult and computationally expensive. The CaliBayes system provides a set of web services (together with an R package for consuming these and formatting data) which addresses this problem for SBML models. It uses a sequential Bayesian MCMC method, which is powerful and flexible, providing very rich information. However this approach is exceptionally computationally intensive and requires the use of a carefully designed architecture. Again, these tools are exposed as web services to allow users to take advantage of this system. In this article, we describe these two systems and demonstrate their integrated use with an example workflow to estimate the parameters of a simple model of Saccharomyces cerevisiae growth on agar plates.
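The discrete stochastic simulation such systems perform can be illustrated with the simplest possible Gillespie direct-method example, a single first-order degradation reaction (illustrative only; the actual services consume full SBML models):

```python
import random

def gillespie_decay(n0, k, t_end, seed=42):
    """Direct-method stochastic simulation (Gillespie SSA) of the single
    reaction X -> 0 with rate constant k per molecule.
    Returns the sampled (time, copy-number) trajectory."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    traj = [(t, n)]
    while n > 0:
        t += rng.expovariate(k * n)   # exponential waiting time, propensity k*n
        if t >= t_end:
            break
        n -= 1                        # the only reaction: one molecule degrades
        traj.append((t, n))
    return traj

trajectory = gillespie_decay(n0=100, k=0.5, t_end=20.0)
```

At low copy numbers each trajectory differs markedly from the deterministic exponential decay, which is exactly the regime where the discrete stochastic treatment described in the abstract matters.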

  17. Evolutionary approaches for the reverse-engineering of gene regulatory networks: A study on a biologically realistic dataset

    Directory of Open Access Journals (Sweden)

    Gidrol Xavier

    2008-02-01

Full Text Available Abstract Background Inferring gene regulatory networks from data requires the development of algorithms devoted to structure extraction. When only static data are available, gene interactions may be modelled by a Bayesian Network (BN) that represents the presence of direct interactions from regulators to regulees by conditional probability distributions. We used enhanced evolutionary algorithms to stochastically evolve a set of candidate BN structures and found the model that best fits the data without prior knowledge. Results We proposed various evolutionary strategies suitable for the task and tested our choices using simulated data drawn from a given bio-realistic network of 35 nodes, the so-called insulin network, which has been used in the literature for benchmarking. We assessed the inferred models against this reference to obtain statistical performance results. We then compared the performance of evolutionary algorithms using two kinds of recombination operators that operate at different scales in the graphs. We introduced a niching strategy that reinforces diversity throughout the population and avoids trapping the algorithm in one local minimum in the early steps of learning. We show the limited effect of the mutation operator when niching is applied. Finally, we compared our best evolutionary approach with various well-known learning algorithms (MCMC, K2, greedy search, TPDA, MMHC) devoted to BN structure learning. Conclusion We studied the behaviour of an evolutionary approach enhanced by niching for the learning of gene regulatory networks with BN. We show that this approach outperforms classical structure learning methods in elucidating the original model. These results were obtained for the learning of a bio-realistic network and, more importantly, on various small datasets. This is a suitable approach for learning transcriptional regulatory networks from real datasets without prior knowledge.
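The evolutionary search with niching can be sketched in miniature. The snippet below is a generic illustration, not the authors' algorithm: structures are encoded as edge bit-vectors under a fixed node ordering (which guarantees acyclicity), recombination is one-point, and niching is implemented as fitness sharing over Hamming distance; all names and parameter values are invented for the example.

```python
import random

def evolve_structures(score, n_edges, pop_size=40, gens=100, sigma=3, seed=1):
    """Evolutionary search over edge bit-vectors with fitness sharing (niching).

    With a fixed topological ordering of the nodes, any subset of the n_edges
    candidate edges encodes a valid DAG, so structures can be bit-vectors.
    score(edges) -> float is the raw structure fitness (higher is better).
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_edges)] for _ in range(pop_size)]

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def shared_fitness(ind):
        # Fitness sharing: divide raw fitness by how crowded the niche is,
        # so clusters of near-identical structures stop dominating selection.
        niche = sum(max(0.0, 1.0 - hamming(ind, other) / sigma) for other in pop)
        return score(ind) / niche

    for _ in range(gens):
        ranked = sorted(pop, key=shared_fitness, reverse=True)
        parents = ranked[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, n_edges)      # one-point recombination
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(n_edges)] ^= 1   # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

# Toy usage: recover a known 12-edge target structure.
target = [1] * 12
best = evolve_structures(lambda ind: float(sum(x == y for x, y in zip(ind, target))),
                         n_edges=12)
```

With a toy score such as similarity to a known target structure, the sharing term keeps several niches alive instead of letting one local optimum take over the population early, which is the effect the niching strategy is meant to achieve.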

  18. Modeling the Earth's magnetospheric magnetic field confined within a realistic magnetopause

    Science.gov (United States)

    Tsyganenko, N. A.

    1995-01-01

Empirical data-based models of the magnetospheric magnetic field have been widely used during recent years. However, the existing models (Tsyganenko, 1987, 1989a) have three serious deficiencies: (1) an unstable de facto magnetopause, (2) a crude parametrization by the K(sub p) index, and (3) inaccuracies in the equatorial magnetotail B(sub z) values. This paper describes a new approach to the problem; the essential new features are (1) a realistic shape and size of the magnetopause, based on fits to a large number of observed crossings (allowing a parametrization by the solar wind pressure), (2) fully controlled shielding of the magnetic field produced by all magnetospheric current systems, (3) new flexible representations for the tail and ring currents, and (4) a new directional criterion for fitting the model field to spacecraft data, providing improved accuracy for field line mapping. Results are presented from initial efforts to create models assembled from these modules and calibrated against spacecraft data sets.

  19. Novel high-fidelity realistic explosion damage simulation for urban environments

    Science.gov (United States)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially in the diverse panoply of military and civil applications where these systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and the surrounding entities. However, none of the existing building damage simulation systems achieves the degree of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity, and runtime-efficient explosion simulation system that realistically simulates destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions under static and/or dynamic detonation conditions. The system also accounts for rubble pile formation, applies a generic and scalable multi-component object representation to describe scene entities, and uses a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system can realistically simulate rubble generation, rubble flyout, and their primary and secondary impacts on surrounding objects, including buildings, constructions, vehicles, and pedestrians, in clusters of sequential and parallel damage events.

  20. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, George S.; Brown, William Michael

    2007-09-01

Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  1. 3D realistic head model simulation based on transcranial magnetic stimulation.

    Science.gov (United States)

    Yang, Shuo; Xu, Guizhi; Wang, Lei; Chen, Yong; Wu, Huanli; Li, Ying; Yang, Qingxin

    2006-01-01

Transcranial magnetic stimulation (TMS) is a powerful non-invasive tool for investigating functions in the brain. The target inside the head is stimulated with eddy currents induced in the tissue by the time-varying magnetic field. Precise spatial localization of stimulation sites is the key to efficient functional magnetic stimulation. Much previous work has been devoted to magnetic field analysis in empty free space. In this paper, a realistic head model for use with the finite element method has been developed, and the magnetic field induced in the head by TMS has been analysed. This three-dimensional simulation is useful for spatial localization of stimulation.

  2. Evaluation of photovoltaic panel temperature in realistic scenarios

    International Nuclear Information System (INIS)

    Du, Yanping; Fell, Christopher J.; Duck, Benjamin; Chen, Dong; Liffman, Kurt; Zhang, Yinan; Gu, Min; Zhu, Yonggang

    2016-01-01

Highlights: • The developed realistic model captures more reasonably the thermal response and hysteresis effects. • The predicted panel temperature is as high as 60 °C under a solar irradiance of 1000 W/m² in no-wind weather. • In realistic scenarios, the thermal response normally takes 50–250 s. • The actual heating effect may cause a photoelectric efficiency drop of 2.9–9.0%. - Abstract: Photovoltaic (PV) panel temperature was evaluated by developing theoretical models that can be applied in realistic scenarios. The effects of solar irradiance, wind speed and ambient temperature on the PV panel temperature were studied. The parametric study shows a significant influence of solar irradiance and wind speed on the PV panel temperature. With an increase of ambient temperature, the temperature rise of solar cells is reduced. The characteristics of panel temperature in realistic scenarios were analyzed. In steady weather conditions, the thermal response time of a solar cell with a Si thickness of 100–500 μm is around 50–250 s. In realistic scenarios, however, the panel temperature variation over a day differs from that in steady weather conditions due to the effect of thermal hysteresis. The heating effect on photovoltaic efficiency was assessed based on real-time temperature measurement of solar cells in realistic weather conditions. For solar cells with a temperature coefficient in the range of −0.21% to −0.50%, the field tests indicated an approximate efficiency loss between 2.9% and 9.0%.
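The steady-state part of a panel-temperature model of this kind is essentially an energy balance between absorbed solar power and convective plus radiative losses. The sketch below is a generic illustration, not the paper's model: the absorptivity, emissivity, and the flat-plate convection correlation h = 5.7 + 3.8·v are assumed textbook values.

```python
def panel_temperature(irradiance, t_ambient, wind_speed,
                      absorptivity=0.9, emissivity=0.85):
    """Steady-state panel temperature (K) from an energy balance, by bisection:
    absorbed solar power = convective loss + radiative loss.
    h = 5.7 + 3.8*v (W/m^2/K) is a common flat-plate convection correlation;
    all coefficients here are assumed values, not the paper's."""
    sigma = 5.67e-8                      # Stefan-Boltzmann constant, W/m^2/K^4
    h = 5.7 + 3.8 * wind_speed           # convective heat transfer coefficient

    def residual(t):
        return (absorptivity * irradiance
                - h * (t - t_ambient)
                - emissivity * sigma * (t ** 4 - t_ambient ** 4))

    lo, hi = t_ambient, t_ambient + 100.0  # bracket: losses grow with t
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# 1000 W/m^2, 25 degC ambient: the panel settles well above ambient with no
# wind, while wind raises h and pulls the temperature back toward ambient.
t_no_wind = panel_temperature(1000.0, 298.15, 0.0)
t_windy = panel_temperature(1000.0, 298.15, 5.0)
```

The strong sensitivity of the solution to irradiance and wind speed, and the reduced cell temperature rise at higher ambient temperature, are the same qualitative trends the parametric study reports.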

  3. Simple and Realistic Data Generation

    DEFF Research Database (Denmark)

    Pedersen, Kenneth Houkjær; Torp, Kristian; Wind, Rico

    2006-01-01

This paper presents a generic, DBMS-independent, and highly extensible relational data generation tool. The tool can efficiently generate realistic test data for OLTP, OLAP, and data streaming applications. The tool uses a graph model to direct the data generation. This model makes it very simple to generate data even for large database schemas with complex inter- and intra-table relationships. The model also makes it possible to generate data with very accurate characteristics.
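A graph-directed generator of this kind can be sketched compactly: tables are visited in dependency order, and foreign-key columns sample from already-generated parent rows. The snippet is a minimal illustration, not the paper's tool; the schema format and all names are invented for the example.

```python
import random

def generate_tables(schema, counts, seed=0):
    """Generate relational test data table by table, directed by a dependency
    graph: `schema` maps table name -> list of (column, spec), where spec is
    either a callable (rng, row_index) -> value or ("fk", parent_table) for a
    foreign key drawn from already-generated parent rows. Tables must appear
    in a topological (parents-first) order.
    """
    rng = random.Random(seed)
    data = {}
    for table, columns in schema.items():
        rows = []
        for i in range(counts[table]):
            row = {}
            for col, spec in columns:
                if isinstance(spec, tuple) and spec[0] == "fk":
                    # Sample an existing parent key: referential integrity
                    # holds by construction.
                    row[col] = rng.choice(data[spec[1]])["id"]
                else:
                    row[col] = spec(rng, i)
            rows.append(row)
        data[table] = rows
    return data

# Toy schema: orders reference customers via a foreign key.
schema = {
    "customer": [("id", lambda rng, i: i),
                 ("name", lambda rng, i: "cust-%d" % i)],
    "order":    [("id", lambda rng, i: i),
                 ("customer_id", ("fk", "customer")),
                 ("amount", lambda rng, i: round(rng.uniform(1, 500), 2))],
}
tables = generate_tables(schema, {"customer": 5, "order": 20})
```

Because child tables only ever draw keys from generated parent rows, inter-table relationships stay consistent regardless of schema size, which is the main point of directing generation with a graph.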

  4. From biology to mathematical models and back: teaching modeling to biology students, and biology to math and engineering students.

    Science.gov (United States)

    Chiel, Hillel J; McManus, Jeffrey M; Shaw, Kendrick M

    2010-01-01

    We describe the development of a course to teach modeling and mathematical analysis skills to students of biology and to teach biology to students with strong backgrounds in mathematics, physics, or engineering. The two groups of students have different ways of learning material and often have strong negative feelings toward the area of knowledge that they find difficult. To give students a sense of mastery in each area, several complementary approaches are used in the course: 1) a "live" textbook that allows students to explore models and mathematical processes interactively; 2) benchmark problems providing key skills on which students make continuous progress; 3) assignment of students to teams of two throughout the semester; 4) regular one-on-one interactions with instructors throughout the semester; and 5) a term project in which students reconstruct, analyze, extend, and then write in detail about a recently published biological model. Based on student evaluations and comments, an attitude survey, and the quality of the students' term papers, the course has significantly increased the ability and willingness of biology students to use mathematical concepts and modeling tools to understand biological systems, and it has significantly enhanced engineering students' appreciation of biology.

  5. From Biology to Mathematical Models and Back: Teaching Modeling to Biology Students, and Biology to Math and Engineering Students

    Science.gov (United States)

    McManus, Jeffrey M.; Shaw, Kendrick M.

    2010-01-01

    We describe the development of a course to teach modeling and mathematical analysis skills to students of biology and to teach biology to students with strong backgrounds in mathematics, physics, or engineering. The two groups of students have different ways of learning material and often have strong negative feelings toward the area of knowledge that they find difficult. To give students a sense of mastery in each area, several complementary approaches are used in the course: 1) a “live” textbook that allows students to explore models and mathematical processes interactively; 2) benchmark problems providing key skills on which students make continuous progress; 3) assignment of students to teams of two throughout the semester; 4) regular one-on-one interactions with instructors throughout the semester; and 5) a term project in which students reconstruct, analyze, extend, and then write in detail about a recently published biological model. Based on student evaluations and comments, an attitude survey, and the quality of the students' term papers, the course has significantly increased the ability and willingness of biology students to use mathematical concepts and modeling tools to understand biological systems, and it has significantly enhanced engineering students' appreciation of biology. PMID:20810957

  6. Use of realistic anthropomorphic models for calculation of radiation dose in nuclear medicine

    International Nuclear Information System (INIS)

    Stabin, Michael G.; Emmons, Mary A.; Fernald, Michael J.; Brill, A.B.; Segars, W.Paul

    2008-01-01

Anthropomorphic phantoms based on simple geometric structures have been used in radiation dose calculations for many years. We have now developed a series of anatomically realistic phantoms representing adults and children using body models based on non-uniform rational B-splines (NURBS), with organ and body masses based on the reference values given in ICRP Publication 89. Age-dependent models were scaled and shaped to represent the reference individuals described in ICRP 89 (male and female adults, newborns, 1-, 5-, 10- and 15-year-olds), using a software tool developed in Visual C++. Voxel-based versions of these models were used with GEANT4 radiation transport codes for calculation of specific absorbed fractions (SAFs) for internal sources of photons and electrons, using standard starting energy values. Organ masses in the models were within a few percent of ICRP reference masses, and physicians reviewed the models for anatomical realism. Development of individual phantoms was much faster than manual segmentation of medical images, and resulted in a very uniform standardized phantom series. SAFs were calculated on the Vanderbilt multi-node computing network (ACCRE). Photon and electron SAFs were calculated for all organs in all models and were compared to values from similar phantoms developed by others. Agreement was very good in most cases; some differences were seen, due to differences in organ mass and geometry. This realistic phantom series represents a possible replacement for the Cristy/Eckerman series of the 1980s. Both phantom sets will be included in the next release of the OLINDA/EXM personal computer code, and the new phantoms will be made generally available to the research community for other uses. Calculated radiation doses for diagnostic and therapeutic radiopharmaceuticals will be compared with previous values. (author)

  7. Realistic Noise Assessment and Strain Analysis of Iranian Permanent GPS Stations

    Science.gov (United States)

    Razeghi, S. M.; Amiri Simkooei, A. A.; Sharifi, M. A.

    2012-04-01

To assess the noise characteristics of the Iranian Permanent GPS Stations (IPGS), the northwestern part of this network, namely the Azerbaijan Continuous GPS Stations (ACGS), was selected. For a realistic noise assessment it is required to model all deterministic signals of the GPS time series by means of least squares harmonic estimation (LS-HE) and derive all periodic behavior of the series. After taking all deterministic signals into account, least squares variance component estimation (LS-VCE) is used to obtain a realistic noise model (white noise plus flicker noise) of the ACGS. For this purpose, one needs simultaneous GPS time series, to which a multivariate noise assessment is applied. Having determined the realistic noise model, a realistic strain analysis of the network is obtained, relying on finite element methods. The finite element model is now considered to be a new functional model, and the new stochastic model is given based on the multivariate noise assessment using LS-VCE. The deformation rates of the components, along with their full covariance matrices, are input to the strain analysis. For comparison, the results are also provided using a pure white noise model. The normalized strains for these two models show that the strain parameters derived from a realistic noise model are less significant than those derived from the white noise model. This could be due either to the short time span of the time series used or to the intrinsic behavior of the strain parameters in the ACGS. Longer time series are required to further elaborate this issue.
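LS-HE amounts to least-squares estimation of an offset, a rate, and candidate harmonics from the time series. A minimal single-frequency version, an illustration only (not the authors' implementation, and with stdlib-only linear algebra), looks like:

```python
import math

def fit_harmonic(times, values, period=1.0):
    """Least-squares fit of offset, linear rate, and one harmonic of the given
    period (a single-frequency harmonic estimation step) via the normal
    equations, solved with plain Gaussian elimination."""
    w = 2.0 * math.pi / period

    def basis(t):
        return [1.0, t, math.cos(w * t), math.sin(w * t)]

    n = 4
    A = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for t, y in zip(times, values):
        phi = basis(t)
        for i in range(n):
            b[i] += phi[i] * y
            for j in range(n):
                A[i][j] += phi[i] * phi[j]
    # Gaussian elimination with partial pivoting on the 4x4 normal equations.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x  # [offset, rate, cos amplitude, sin amplitude]

# Toy weekly series over ten years with an annual signal (synthetic, noise-free).
times = [i / 52.0 for i in range(520)]
values = [2.0 + 0.5 * t + 1.2 * math.cos(2 * math.pi * t)
          - 0.7 * math.sin(2 * math.pi * t) for t in times]
est = fit_harmonic(times, values)
```

In the full procedure, candidate frequencies are scanned and tested one at a time, and the residuals that remain after removing all detected deterministic signals are what feed the LS-VCE noise assessment.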

  8. Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.

    Science.gov (United States)

    Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth

    2017-03-01

Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated are interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is this: is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model that gives the rates of biochemical reactions to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models built using the results of reverse engineering opens up the possibility of harnessing the wealth of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house; C++ code templates are available in the Supplementary material. Supplementary data are available at Bioinformatics online.

  9. Modelling effective dielectric properties of materials containing diverse types of biological cells

    International Nuclear Information System (INIS)

    Huclova, Sonja; Froehlich, Juerg; Erni, Daniel

    2010-01-01

An efficient and versatile numerical method for the generation of different realistically shaped biological cells is developed. This framework is used to calculate the dielectric spectra of materials containing specific types of biological cells. For the generation of the numerical models of the cells, a flexible parametrization method based on the so-called superformula is applied, including the option of obtaining non-axisymmetric shapes such as box-shaped cells and even shapes corresponding to echinocytes. The dielectric spectra of effective media containing various cell morphologies are calculated, focusing on the dependence of the spectral features on the cell shape. The numerical method is validated by comparing a model of spherical inclusions at a low volume fraction with the analytical solution obtained by the Maxwell-Garnett mixing formula, resulting in good agreement. Our simulation data for different cell shapes suggest that around 1 MHz the effective dielectric properties of different cell shapes at different volume fractions significantly deviate from the spherical case. The most pronounced change is exhibited by ε_eff between 0.1 and 1 MHz, with a deviation of up to 35% for a box-shaped cell and 15% for an echinocyte compared with the sphere at a volume fraction of 0.4. This hampers the unique interpretation of changes in cellular features measured by dielectric spectroscopy when simplified material models are used.
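The Maxwell-Garnett mixing formula used for validation has a closed form for spherical inclusions in a host medium. A small sketch (the permittivity values in the usage are illustrative, not taken from the paper):

```python
def maxwell_garnett(eps_inclusion, eps_host, volume_fraction):
    """Maxwell-Garnett effective permittivity for spherical inclusions in a
    host medium; works for real or complex permittivities."""
    beta = (eps_inclusion - eps_host) / (eps_inclusion + 2.0 * eps_host)
    fb = volume_fraction * beta
    return eps_host * (1.0 + 3.0 * fb / (1.0 - fb))

# Illustrative (made-up) permittivities: inclusions of 50 in a host of 70.
eps_eff = maxwell_garnett(50.0, 70.0, 0.4)
```

At a volume fraction of 0 the formula returns the host permittivity and at 1 the inclusion permittivity; since it is derived for dilute spherical inclusions, it serves as an analytical benchmark only at low volume fractions, which is exactly how it is used in the validation.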

  10. Robust mode space approach for atomistic modeling of realistically large nanowire transistors

    Science.gov (United States)

    Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard

    2018-01-01

    Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight binding non-equilibrium Green's function simulation of nanowire metal-oxide-semiconductor field-effect transistor (MOSFET) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.

  11. A Radiosity Approach to Realistic Image Synthesis

    Science.gov (United States)

    1992-12-01

AFIT/GCE/ENG/92D-09: A Radiosity Approach to Realistic Image Synthesis. Thesis, Richard L. Remington, Captain, USAF. Approved for public release; distribution unlimited. (Fragment from the acknowledgements:) "…assistance in creating the input geometry file for the AWACS aircraft interior. Without his assistance, a good model for the diffuse radiosity implementation…"

  12. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input, (2) the iModel Tool, a platform for users to upload their own models to compose, and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. The SBML Test Suite models are a good starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well.

  13. Electron distribution in polar heterojunctions within a realistic model

    Energy Technology Data Exchange (ETDEWEB)

    Tien, Nguyen Thanh, E-mail: thanhtienctu@gmail.com [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Thao, Dinh Nhu [Center for Theoretical and Computational Physics, College of Education, Hue University, 34 Le Loi Street, Hue City (Viet Nam); Thao, Pham Thi Bich [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Quang, Doan Nhat [Institute of Physics, Vietnamese Academy of Science and Technology, 10 Dao Tan Street, Hanoi (Viet Nam)

    2015-12-15

We present a theoretical study of the electron distribution, i.e., the two-dimensional electron gas (2DEG), in polar heterojunctions (HJs) within a realistic model. The 2DEG is confined along the growth direction by a triangular quantum well with a finite potential barrier and a bent band shaped by all confinement sources. Therein, interface polarization charges take on a double role: they induce a confining potential and, furthermore, they can modify other confinements, e.g., the Hartree potential from ionized impurities and the 2DEG. Confinement by positive interface polarization charges is necessary for the ground state of the 2DEG to exist at a high sheet density. The 2DEG bulk density is found to be increased in the barrier, so that the scattering occurring in this layer (from interface polarization charges and alloy disorder) becomes paramount in a polar modulation-doped HJ.

  14. Intelligible design a realistic approach to the philosophy and history of science

    CERN Document Server

    Gonzalo, Julio A

    2014-01-01

    This book provides realistic answers to hotly debated scientific topics: Science is about quantitative aspects of natural realities (physical, chemical, biological) but it is the result of human intellectual inquiry and therefore not "per se" materialistic. This book, with contributions from experts in physics, cosmology, mathematics, engineering, biology and genetics, covers timely and relevant topics such as the origin of the universe, the origin of life on Earth, the origin of man (intelligent life) and the origin of science.

  15. Magnetic reconnection in the low solar chromosphere with a more realistic radiative cooling model

    Science.gov (United States)

    Ni, Lei; Lukin, Vyacheslav S.; Murphy, Nicholas A.; Lin, Jun

    2018-04-01

Magnetic reconnection is the most likely mechanism responsible for the high temperature events that are observed in strongly magnetized locations around the temperature minimum in the low solar chromosphere. This work improves upon our previous work [Ni et al., Astrophys. J. 852, 95 (2018)] by using a more realistic radiative cooling model computed from the OPACITY project and the CHIANTI database. We find that the rate of ionization of the neutral component of the plasma is still faster than recombination within the current sheet region. For low β plasmas, the ionized and neutral fluid flows are well-coupled throughout the reconnection region, resembling the single-fluid Sweet-Parker model dynamics. Decoupling of the ion and neutral inflows appears in the higher β case with β₀ = 1.46, which leads to a reconnection rate about three times faster than the rate predicted by the Sweet-Parker model. In all cases, the plasma temperature increases with time inside the current sheet, and the maximum value is above 2 × 10⁴ K when the reconnection magnetic field strength is greater than 500 G. While the more realistic radiative cooling model does not result in qualitative changes of the characteristics of magnetic reconnection, it is necessary for studying the variations of the plasma temperature and ionization fraction inside current sheets in strongly magnetized regions of the low solar atmosphere. It is also important for studying energy conversion during the magnetic reconnection process when the hydrogen-dominated plasma approaches full ionization.

  16. Hemodynamic Changes Caused by Flow Diverters in Rabbit Aneurysm Models: Comparison of Virtual and Realistic FD Deployments Based on Micro-CT Reconstruction

    Science.gov (United States)

    Fang, Yibin; Yu, Ying; Cheng, Jiyong; Wang, Shengzhang; Wang, Kuizhong; Liu, Jian-Min; Huang, Qinghai

    2013-01-01

Adjusting hemodynamics via flow diverter (FD) implantation is emerging as a novel method of treating cerebral aneurysms. However, most previous FD-related hemodynamic studies were based on virtual FD deployment, which may produce different hemodynamic outcomes than realistic (in vivo) FD deployment. We compared hemodynamics between virtual and realistic FD deployments in rabbit aneurysm models using computational fluid dynamics (CFD) simulations. FDs were implanted for aneurysms in 14 rabbits. Vascular models based on rabbit-specific angiograms were reconstructed for CFD studies. Real FD configurations were reconstructed based on micro-CT scans after sacrifice, while virtual FD configurations were constructed with SolidWorks software. Hemodynamic parameters before and after FD deployment were analyzed. According to the metal coverage (MC) of the implanted FDs calculated from the micro-CT reconstructions, the 14 rabbits were divided into two groups (A, MC > 35%; B, MC < 35%). The normalized mean WSS in Group A after realistic FD implantation was significantly lower than that of Group B. All parameters in Group B exhibited no significant difference between realistic and virtual FDs (P > 0.05). This study confirmed MC-correlated differences in hemodynamic parameters between realistic and virtual FD deployment. PMID:23823503

  17. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2008-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards in the interior of the object. In this abstract, we describe a simple algorithm for triangulating k-guardable polygons. Our algorithm, which is easily implementable, takes

  18. Integrating systems biology models and biomedical ontologies.

    Science.gov (United States)

    Hoehndorf, Robert; Dumontier, Michel; Gennari, John H; Wimalaratne, Sarala; de Bono, Bernard; Cook, Daniel L; Gkoutos, Georgios V

    2011-08-11

    Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms.

  19. Workshop Introduction: Systems Biology and Biological Models

    Science.gov (United States)

    As we consider the future of toxicity testing, the importance of applying biological models to this problem is clear. Modeling efforts exist along a continuum with respect to the level of organization (e.g. cell, tissue, organism) linked to the resolution of the model. Generally,...

  20. Realistic full wave modeling of focal plane array pixels.

    Energy Technology Data Exchange (ETDEWEB)

    Campione, Salvatore [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Warne, Larry K. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Jorgenson, Roy E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Davids, Paul [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.; Peters, David W. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.

    2017-11-01

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.

  1. Toward synthesizing executable models in biology.

    Science.gov (United States)

    Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav

    2014-01-01

    Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
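    The synthesis idea, searching for programs consistent with data, can be illustrated at its smallest scale: enumerating Boolean update functions for one gene that reproduce observed transitions. The toy dataset below is invented for illustration; real synthesis methods use SAT/SMT-style solvers over whole networks rather than brute-force enumeration.

```python
from itertools import product

# Observed transitions: (state of inputs A, B at time t) -> value of gene C at t+1.
observations = [
    ((0, 0), 0),
    ((0, 1), 0),
    ((1, 0), 0),
    ((1, 1), 1),
]

def consistent_update_functions(obs):
    """Enumerate all 2-input Boolean truth tables that reproduce the data."""
    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    models = []
    for table in product([0, 1], repeat=4):  # 16 candidate functions
        f = dict(zip(inputs, table))
        if all(f[state] == out for state, out in obs):
            models.append(f)
    return models

models = consistent_update_functions(observations)
print(len(models))  # all four input states observed, so exactly one function fits
```

    When fewer states are observed, several functions remain consistent; computing an experiment that distinguishes them is exactly the "disambiguating experiment" computation mentioned above.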

  2. Improved transcranial magnetic stimulation coil design with realistic head modeling

    Science.gov (United States)

    Crowther, Lawrence; Hadimani, Ravi; Jiles, David

    2013-03-01

    We are investigating transcranial magnetic stimulation (TMS), a noninvasive technique that stimulates neurons in the brain through electromagnetic induction. TMS can be used as a pain-free alternative to conventional electroconvulsive therapy (ECT), which is still widely implemented for treatment of major depression. Development of improved TMS coils capable of stimulating subcortical regions could also allow TMS to replace invasive deep brain stimulation (DBS), which requires surgical implantation of electrodes in the brain. Our new designs could establish the technique for a variety of diagnostic and therapeutic applications in psychiatric disorders and neurological diseases. Calculation of the fields generated inside the head is vital for the use of this method in treatment. In prior work we implemented a realistic head model, incorporating inhomogeneous tissue structures and electrical conductivities, that allows the site of neuronal activation to be accurately calculated. We will show how we utilize this model in the development of novel TMS coil designs to improve the depth of penetration and localization of stimulation produced by stimulator coils.

  3. Automatic skull segmentation from MR images for realistic volume conductor models of the head: Assessment of the state-of-the-art

    DEFF Research Database (Denmark)

    Nielsen, Jesper Duemose; Madsen, Kristoffer Hougaard; Puonti, Oula

    2018-01-01

    Anatomically realistic volume conductor models of the human head are important for accurate forward modeling of the electric field during transcranial brain stimulation (TBS), electro- (EEG) and magnetoencephalography (MEG). In particular, the skull compartment exerts a strong influence on the field… local defects. In contrast to FSL BET2, the SPM12-based segmentation with extended spatial tissue priors and the BrainSuite-based segmentation provide coarse reconstructions of the vertebrae, enabling the construction of volume conductor models that include the neck. We exemplarily demonstrate…

  4. RenderGAN: Generating Realistic Labeled Data

    Directory of Open Access Journals (Sweden)

    Leon Sixt

    2018-06-01

    Full Text Available Deep Convolutional Neuronal Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.

  5. A Local-Realistic Model of Quantum Mechanics Based on a Discrete Spacetime

    Science.gov (United States)

    Sciarretta, Antonio

    2018-01-01

    This paper presents a realistic, stochastic, and local model that reproduces nonrelativistic quantum mechanics (QM) results without using its mathematical formulation. The proposed model only uses integer-valued quantities and operations on probabilities, in particular assuming a discrete spacetime under the form of a Euclidean lattice. Individual (spinless) particle trajectories are described as random walks. Transition probabilities are simple functions of a few quantities that are either randomly associated with the particles during their preparation, or stored in the lattice nodes they visit during the walk. QM predictions are retrieved as probability distributions of similarly-prepared ensembles of particles. The scenarios considered to assess the model include the free particle, a constant external force, the harmonic oscillator, a particle in a box, the Delta potential, a particle on a ring, and a particle on a sphere, and cover quantization of energy levels and angular momentum as well as momentum entanglement.
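    The random-walk picture can be sketched in a few lines. The toy below is an unbiased 1-D lattice walk, not the paper's full transition-probability machinery, and the particle and step counts are arbitrary; it only shows how ensemble statistics of similarly-prepared particles emerge from integer-valued hops.

```python
import random
import statistics

random.seed(1)

def ensemble_positions(n_particles=5000, n_steps=200):
    """Integer random walks on a 1-D Euclidean lattice, one +1/-1 hop per tick."""
    return [sum(random.choice((-1, 1)) for _ in range(n_steps))
            for _ in range(n_particles)]

positions = ensemble_positions()
mean = statistics.mean(positions)
var = statistics.pvariance(positions)
# An unbiased walk gives ensemble mean ~0 and variance ~n_steps (diffusive spreading).
print(mean, var)
```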

  6. EIT forward problem parallel simulation environment with anisotropic tissue and realistic electrode models.

    Science.gov (United States)

    De Marco, Tommaso; Ries, Florian; Guermandi, Marco; Guerrieri, Roberto

    2012-05-01

    Electrical impedance tomography (EIT) is an imaging technology based on impedance measurements. To retrieve meaningful insights from these measurements, EIT relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of current flows therein. The nonhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which at present severely limits the capabilities of EIT. This work presents a complete algorithmic flow for an accurate EIT modeling environment featuring high anatomical fidelity with a spatial resolution equal to that provided by an MRI and a novel realistic complete electrode model implementation. At the same time, we demonstrate that current graphics processing unit (GPU)-based platforms provide enough computational power that a domain discretized with five million voxels can be numerically modeled in about 30 s.
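    As a toy stand-in for the voxel-based forward solve, the sketch below relaxes Laplace's equation on a small uniform-conductivity grid with two fixed-potential "electrodes". This is purely illustrative: a real EIT solver handles inhomogeneous, anisotropic conductivities with a complete electrode model over millions of voxels on a GPU.

```python
def solve_potential(n=20, iters=2000):
    """Gauss-Seidel relaxation of Laplace's equation on an n x n grid.
    Left and right edges act as electrodes fixed at +1 V and -1 V;
    top and bottom boundary rows are held at 0 V."""
    u = [[0.0] * n for _ in range(n)]
    for i in range(n):
        u[i][0], u[i][n - 1] = 1.0, -1.0
    for _ in range(iters):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                u[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j]
                                  + u[i][j - 1] + u[i][j + 1])
    return u

u = solve_potential()
print(u[10][1], u[10][18])  # near the +1 V and -1 V electrodes respectively
```

    By symmetry of the boundary conditions the converged solution is antisymmetric about the center column, a cheap sanity check on the relaxation.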

  7. Issues in Biological Shape Modelling

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen

    This talk reflects parts of the current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations, modifications, and applications of the elements of constructing models of shape or appearance…

  8. Biology task group

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    The accomplishments of the task group studies over the past year are reviewed. The purposes of biological investigations, in the context of subseabed disposal, are: an evaluation of the dose to man; an estimation of effects on the ecosystem; and an estimation of the influence of organisms on and as barriers to radionuclide migration. To accomplish these ends, the task group adopted the following research goals: (1) acquire more data on biological accumulation of specific radionuclides, such as those of Tc, Np, Ra, and Sr; (2) acquire more data on transfer coefficients from sediment to organism; (3) calculate mass transfer rates, construct simple models using them, and estimate collective dose commitment; (4) identify specific pathways or transfer routes, determine the rates of transfer, and make dose limit calculations with simple models; (5) calculate dose rates to and estimate irradiation effects on the biota as a result of waste emplacement, by reference to background irradiation calculations; (6) examine the effect of the biota on altering sediment/water radionuclide exchange; (7) consider the biological data required to address different accident scenarios; (8) continue to provide the basic biological information for all of the above, and ensure that the system analysis model is based on the most realistic and up-to-date concepts of marine biologists; and (9) ensure by way of free exchange of information that the data used in any model are the best currently available.

  9. Computation of Surface Laplacian for tri-polar ring electrodes on high-density realistic geometry head model.

    Science.gov (United States)

    Junwei Ma; Han Yuan; Sunderam, Sridhar; Besio, Walter; Lei Ding

    2017-07-01

    Neural activity inside the human brain generates electrical signals that can be detected on the scalp. Electroencephalography (EEG) is one of the most widely utilized techniques helping physicians and researchers to diagnose and understand various brain diseases. By their nature, EEG signals have very high temporal resolution but poor spatial resolution. To achieve higher spatial resolution, a novel tri-polar concentric ring electrode (TCRE) has been developed to directly measure the Surface Laplacian (SL). The objective of the present study is to accurately calculate the SL for the TCRE based on a realistic geometry head model. A locally dense mesh was proposed to represent the head surface, with the dense regions matching the small structural components of the TCRE; the remaining areas were left coarse to reduce the computational load. We conducted computer simulations to evaluate the performance of the proposed mesh and assessed possible numerical errors as compared with a low-density model. Finally, with the achieved accuracy, we present the computed forward lead field of the SL for the TCRE for the first time in a realistic geometry head model and demonstrate that it has better spatial resolution than the SL computed from classic EEG recordings.
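    The surface Laplacian itself is a second spatial derivative of the scalp potential. A minimal five-point finite-difference estimate on a regular grid (not the locally dense mesh used in the study, and with an invented test function) looks like:

```python
def discrete_laplacian(v, i, j, h=1.0):
    """Five-point estimate of the Laplacian of potentials v at grid node (i, j)."""
    return (v[i - 1][j] + v[i + 1][j] + v[i][j - 1] + v[i][j + 1]
            - 4.0 * v[i][j]) / (h * h)

# Potentials sampled from f(x, y) = x^2 + y^2, whose Laplacian is 4 everywhere.
grid = [[float(x * x + y * y) for y in range(5)] for x in range(5)]
print(discrete_laplacian(grid, 2, 2))  # -> 4.0
```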

  10. Towards Synthesizing Executable Models in Biology

    Directory of Open Access Journals (Sweden)

    Jasmin eFisher

    2014-12-01

    Full Text Available Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modelling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.

  11. Explicit all-atom modeling of realistically sized ligand-capped nanocrystals

    KAUST Repository

    Kaushik, Ananth P.

    2012-01-01

    We present a study of an explicit all-atom representation of nanocrystals of experimentally relevant sizes (up to 6 nm), capped with alkyl chain ligands, in vacuum. We employ all-atom molecular dynamics simulation methods in concert with a well-tested intermolecular potential model, MM3 (molecular mechanics 3), for the studies presented here. These studies include determining the preferred conformation of an isolated single nanocrystal (NC), pairs of isolated NCs, and (presaging studies of superlattice arrays) unit cells of NC superlattices. We observe that very small NCs (3 nm) behave differently in a superlattice as compared to larger NCs (6 nm and above) due to the conformations adopted by the capping ligands on the NC surface. Short ligands adopt a uniform distribution of orientational preferences, including some that lie against the face of the nanocrystal. In contrast, longer ligands prefer to interdigitate. We also study the effect of changing ligand length and ligand coverage on the NCs on the preferred ligand configurations. Since explicit all-atom modeling constrains the maximum system size that can be studied, we discuss issues related to coarse-graining the representation of the ligands, including a comparison of two commonly used coarse-grained models. We find that care has to be exercised in the choice of coarse-grained model. The data provided by these realistically sized ligand-capped NCs, determined using explicit all-atom models, should serve as a reference standard for future models of coarse-graining ligands using united atom models, especially for self-assembly processes. © 2012 American Institute of Physics.

  12. A Madden-Julian oscillation event realistically simulated by a global cloud-resolving model.

    Science.gov (United States)

    Miura, Hiroaki; Satoh, Masaki; Nasuno, Tomoe; Noda, Akira T; Oouchi, Kazuyoshi

    2007-12-14

    A Madden-Julian Oscillation (MJO) is a massive weather event consisting of deep convection coupled with atmospheric circulation, moving slowly eastward over the Indian and Pacific Oceans. Despite its enormous influence on many weather and climate systems worldwide, it has proven very difficult to simulate an MJO because of assumptions about cumulus clouds in global meteorological models. Using a model that allows direct coupling of the atmospheric circulation and clouds, we successfully simulated the slow eastward migration of an MJO event. Topography, the zonal sea surface temperature gradient, and interplay between eastward- and westward-propagating signals controlled the timing of the eastward transition of the convective center. Our results demonstrate the potential for making month-long MJO predictions when global cloud-resolving models with realistic initial conditions are used.

  13. Models for synthetic biology.

    Science.gov (United States)

    Kaznessis, Yiannis N

    2007-11-06

    Synthetic biological engineering is emerging from biology as a distinct discipline based on quantification. The technologies propelling synthetic biology are not new, nor is the concept of designing novel biological molecules. What is new is the emphasis on system behavior. The objective is the design and construction of new biological devices and systems to deliver useful applications. Numerous synthetic gene circuits have been created in the past decade, including bistable switches, oscillators, and logic gates, and possible applications abound, including biofuels, detectors for biochemical and chemical weapons, disease diagnosis, and gene therapies. More than fifty years after the discovery of the molecular structure of DNA, molecular biology is mature enough for real quantification that is useful for biological engineering applications, similar to the revolution in modeling in chemistry in the 1950s. With the excitement that synthetic biology is generating, the engineering and biological science communities appear remarkably willing to cross disciplinary boundaries toward a common goal.

  14. MORE: mixed optimization for reverse engineering--an application to modeling biological networks response via sparse systems of nonlinear differential equations.

    Science.gov (United States)

    Sambo, Francesco; de Oca, Marco A Montes; Di Camillo, Barbara; Toffolo, Gianna; Stützle, Thomas

    2012-01-01

    Reverse engineering is the problem of inferring the structure of a network of interactions between biological variables from a set of observations. In this paper, we propose an optimization algorithm, called MORE, for the reverse engineering of biological networks from time series data. The model inferred by MORE is a sparse system of nonlinear differential equations, complex enough to realistically describe the dynamics of a biological system. MORE tackles separately the discrete component of the problem, the determination of the biological network topology, and the continuous component of the problem, the strength of the interactions. This approach allows us both to enforce system sparsity, by globally constraining the number of edges, and to integrate a priori information about the structure of the underlying interaction network. Experimental results on simulated and real-world networks show that the mixed discrete/continuous optimization approach of MORE significantly outperforms standard continuous optimization and that MORE is competitive with the state of the art in terms of accuracy of the inferred networks.

  15. Regional 3-D Modeling of Ground Geoelectric Field for the Northeast United States due to Realistic Geomagnetic Disturbances

    Science.gov (United States)

    Ivannikova, E.; Kruglyakov, M.; Kuvshinov, A. V.; Rastaetter, L.; Pulkkinen, A. A.; Ngwira, C. M.

    2017-12-01

    During extreme space weather events electric currents in the Earth's magnetosphere and ionosphere experience large variations, which leads to dramatic intensification of the fluctuating magnetic field at the surface of the Earth. According to Faraday's law of induction, the fluctuating geomagnetic field in turn induces electric field that generates harmful currents (so-called "geomagnetically induced currents"; GICs) in grounded technological systems. Understanding (via modeling) of the spatio-temporal evolution of the geoelectric field during enhanced geomagnetic activity is a key consideration in estimating the hazard to technological systems from space weather. We present the results of ground geoelectric field modeling for the Northeast United States, which is performed with the use of our novel numerical tool based on integral equation approach. The tool exploits realistic regional three-dimensional (3-D) models of the Earth's electrical conductivity and realistic global models of the spatio-temporal evolution of the magnetospheric and ionospheric current systems responsible for geomagnetic disturbances. We also explore in detail the manifestation of the coastal effect (anomalous intensification of the geoelectric field near the coasts) in this region.

  16. A phenomenological biological dose model for proton therapy based on linear energy transfer spectra.

    Science.gov (United States)

    Rørvik, Eivind; Thörnqvist, Sara; Stokkevåg, Camilla H; Dahle, Tordis J; Fjaera, Lars Fredrik; Ytre-Hauge, Kristian S

    2017-06-01

    The relative biological effectiveness (RBE) of protons varies with the radiation quality, quantified by the linear energy transfer (LET). Most phenomenological models employ a linear dependency of the dose-averaged LET (LETd) to calculate the biological dose. However, several experiments have indicated a possible non-linear trend. Our aim was to investigate if biological dose models including non-linear LET dependencies should be considered, by introducing a LET spectrum based dose model. The RBE-LET relationship was investigated by fitting of polynomials from 1st to 5th degree to a database of 85 data points from aerobic in vitro experiments. We included both unweighted and weighted regression, the latter taking into account experimental uncertainties. Statistical testing was performed to decide whether higher degree polynomials provided better fits to the data as compared to lower degrees. The newly developed models were compared to three published LETd-based models for a simulated spread out Bragg peak (SOBP) scenario. The statistical analysis of the weighted regression analysis favored a non-linear RBE-LET relationship, with the quartic polynomial found to best represent the experimental data (P = 0.010). The results of the unweighted regression analysis were on the borderline of statistical significance for non-linear functions (P = 0.053), and with the current database a linear dependency could not be rejected. For the SOBP scenario, the weighted non-linear model estimated a similar mean RBE value (1.14) compared to the three established models (1.13-1.17). The unweighted model calculated a considerably higher RBE value (1.22). The analysis indicated that non-linear models could give a better representation of the RBE-LET relationship. However, this is not decisive, as inclusion of the experimental uncertainties in the regression analysis had a significant impact on the determination and ranking of the models. As differences between the models were
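    The weighted polynomial regression at the heart of such an analysis can be sketched directly from the normal equations. The data points and weights below are illustrative stand-ins (an exact quadratic with invented per-point weights), not the study's 85-point experimental database.

```python
def polyfit_weighted(x, y, w, degree):
    """Weighted least-squares polynomial fit via the normal equations."""
    n = degree + 1
    # Build X^T W X and X^T W y for the Vandermonde matrix X.
    A = [[sum(wi * xi ** (i + j) for xi, wi in zip(x, w)) for j in range(n)]
         for i in range(n)]
    b = [sum(wi * yi * xi ** i for xi, yi, wi in zip(x, y, w)) for i in range(n)]
    # Solve A c = b by Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, n))) / A[r][r]
    return coeffs  # coeffs[i] multiplies x**i

# Synthetic "RBE vs LET" data following a known quadratic; weights stand in
# for inverse experimental variances (illustrative values only).
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.0 + 0.05 * x + 0.01 * x * x for x in xs]
ws = [1.0, 2.0, 1.0, 2.0, 1.0, 2.0]
coeffs = polyfit_weighted(xs, ys, ws, 2)
print(coeffs)  # recovers ~[1.0, 0.05, 0.01]
```

    With noisy data, refitting at increasing degree and comparing weighted residuals is how one tests whether a quartic genuinely outperforms a linear dependency.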

  17. A realistic approach to modeling an in-duct desulfurization process based on an experimental pilot plant study

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, F.J.G.; Ollero, P. [University of Seville, Seville (Spain)

    2008-07-15

    This paper provides a realistic approach to modeling an in-duct desulfurization process, motivated by the disagreement between the results predicted by published kinetic models of the reaction between hydrated lime and SO2 at low temperature and the experimental results obtained in pilot plants where this process takes place. Results were obtained from an experimental program carried out in a 3-MWe pilot plant. Additionally, five kinetic models from the literature for the sulfation of Ca(OH)2 at low temperatures were assessed by simulation; the desulfurization efficiencies they predict are clearly lower than those obtained experimentally in our own pilot plant as well as others. Next, a general model was fitted by minimizing the difference between the calculated and the experimental results from the pilot plant, using Matlab. The parameters were reduced as much as possible, to only two. Finally, after implementing this model in a simulation tool of the in-duct sorbent injection process, it was validated and shown to yield a realistic approach useful both for analyzing results and for aiding in the design of an in-duct desulfurization process.

  18. Semantic modeling for theory clarification: The realist vs liberal international relations perspective

    Energy Technology Data Exchange (ETDEWEB)

    Bray, O.H. [Sandia National Labs., Albuquerque, NM (United States)]|[Univ. of New Mexico, Albuquerque, NM (United States). Political Science Dept.

    1994-04-01

    This paper describes a natural language based, semantic information modeling methodology and explores its use and value in clarifying and comparing political science theories and frameworks. As an example, the paper uses this methodology to clarify and compare some of the basic concepts and relationships in the realist (e.g. Waltz) and the liberal (e.g. Rosenau) paradigms for international relations. The methodology can provide three types of benefits: (1) it can clarify and make explicit exactly what is meant by a concept; (2) it can often identify unanticipated implications and consequences of concepts and relationships; and (3) it can help in identifying and operationalizing testable hypotheses.

  19. Autumn Algorithm-Computation of Hybridization Networks for Realistic Phylogenetic Trees.

    Science.gov (United States)

    Huson, Daniel H; Linz, Simone

    2018-01-01

    A minimum hybridization network is a rooted phylogenetic network that displays two given rooted phylogenetic trees using a minimum number of reticulations. Previous mathematical work on their calculation has usually assumed the input trees to be bifurcating, correctly rooted, or that they both contain the same taxa. These assumptions do not hold in biological studies and "realistic" trees have multifurcations, are difficult to root, and rarely contain the same taxa. We present a new algorithm for computing minimum hybridization networks for a given pair of "realistic" rooted phylogenetic trees. We also describe how the algorithm might be used to improve the rooting of the input trees. We introduce the concept of "autumn trees", a nice framework for the formulation of algorithms based on the mathematics of "maximum acyclic agreement forests". While the main computational problem is hard, the run-time depends mainly on how different the given input trees are. In biological studies, where the trees are reasonably similar, our parallel implementation performs well in practice. The algorithm is available in our open source program Dendroscope 3, providing a platform for biologists to explore rooted phylogenetic networks. We demonstrate the utility of the algorithm using several previously studied data sets.

  20. Survey of Approaches to Generate Realistic Synthetic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Seung-Hwan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Sangkeun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Sarah S [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shankar, Mallikarjun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Imam, Neena [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
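    The fit-then-generate loop the survey describes can be illustrated with the simplest possible graph model. The sketch below fits the single Erdős-Rényi edge probability to a toy "observed" graph and samples a synthetic one; real generators fit far richer models (degree distributions, community structure), and the observed edge list here is invented.

```python
import random

random.seed(7)

def fit_er_probability(n_nodes, edges):
    """Fit the single parameter p of an Erdos-Renyi model to an observed graph."""
    possible = n_nodes * (n_nodes - 1) // 2
    return len(edges) / possible

def generate_er_graph(n_nodes, p):
    """Sample a synthetic graph with the fitted edge probability."""
    return [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)
            if random.random() < p]

# A small "real-world" graph stands in for sensitive data we cannot share.
observed_edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
p = fit_er_probability(5, observed_edges)
synthetic = generate_er_graph(5, p)
print(p, len(synthetic))
```

    The synthetic graph matches the original's expected density but none of its finer structure, which is exactly the gap richer generative models aim to close.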

  1. Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.

    Science.gov (United States)

    Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J

    2015-08-21

    In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).

  2. Realistic Approach for Phasor Measurement Unit Placement

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    This paper presents a realistic cost-effective model for optimal placement of phasor measurement units (PMUs) for complete observability of a power system, considering practical cost implications. The proposed model considers hidden or otherwise unaccounted practical costs involved in PMU installation. Consideration of these hidden but significant and integral parts of the total PMU installation cost was inspired by practical experience on a real-life project. The proposed model focuses on the minimization of total realistic costs instead of the widely used theoretical objective of a minimal number of PMUs. The proposed model has been applied to the IEEE 14-bus, IEEE 24-bus, IEEE 30-bus, and New England 39-bus systems, a large 300-bus power system, and the real-life Danish grid. A comparison of the presented results with those reported by traditional methods is also shown to justify the effectiveness...

  3. Mesoscopic models of biological membranes

    DEFF Research Database (Denmark)

    Venturoli, M.; Sperotto, Maria Maddalena; Kranenburg, M.

    2006-01-01

    Phospholipids are the main components of biological membranes; dissolved in water, these molecules self-assemble into closed structures, of which bilayers are the most relevant from a biological point of view. Lipid bilayers are often used, both in experimental and theoretical investigations... to coarse grain a biological membrane. The conclusion of this comparison is that there can be many different valid strategies, but that the results obtained by the various mesoscopic models are surprisingly consistent. A second objective of this review is to illustrate how mesoscopic models can be used...

  4. Mathematics Instructional Model Based on Realistic Mathematics Education to Promote Problem Solving Ability at Junior High School Padang

    OpenAIRE

    Edwin Musdi

    2016-01-01

    This research aims to develop a mathematics instructional model based on realistic mathematics education (RME) to promote students' problem-solving abilities. The design research used the Plomp model, which consists of a preliminary phase, a development or prototyping phase, and an assessment phase. In this study, only the first two phases were conducted. The first phase, a preliminary investigation, was carried out with a literature study to examine the theory-based RME instructional model, characterist...

  5. A coupled physical-biological model of the Northern Gulf of Mexico shelf: model description, validation and analysis of phytoplankton variability

    Directory of Open Access Journals (Sweden)

    K. Fennel

    2011-07-01

    The Texas-Louisiana shelf in the Northern Gulf of Mexico receives large inputs of nutrients and freshwater from the Mississippi/Atchafalaya River system. The nutrients stimulate high rates of primary production in the river plume, which contributes to the development of a large and recurring hypoxic area in summer, but the mechanistic links between hypoxia and river discharge of freshwater and nutrients are complex, as the accumulation and vertical export of organic matter, the establishment and maintenance of vertical stratification, and the microbial degradation of organic matter are controlled by a non-linear interplay of factors. Unraveling these interactions will have to rely on a combination of observations and models. Here we present results from a realistic, 3-dimensional, physical-biological model, with a focus on quantifying nutrient-stimulated phytoplankton growth, its variability, and the fate of this organic matter. We demonstrate that the model realistically reproduces many features of observed nitrate and phytoplankton dynamics, including observed property distributions and rates. We then contrast the environmental factors and phytoplankton source and sink terms characteristic of three model subregions that represent an ecological gradient from eutrophic to oligotrophic conditions. We analyze specifically the reasons behind the counterintuitive observation that primary production in the light-limited plume region near the Mississippi River delta is positively correlated with river nutrient input, and find that, while primary production and phytoplankton biomass are positively correlated with nutrient load, phytoplankton growth rate is not. This suggests that accumulation of biomass in this region is not primarily controlled bottom-up by nutrient stimulation, but top-down by systematic differences in the loss processes.
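The decoupling of biomass from growth rate can be reproduced even in a zero-dimensional toy model. The nutrient-phytoplankton sketch below is a hypothetical illustration with made-up parameters, not the paper's 3-D coupled model: at steady state the specific growth rate pins to the loss rate regardless of nutrient load, while biomass scales with the load.

```python
def np_model(load, p0=0.1, mu_max=1.0, k_n=0.5, loss=0.1, dt=0.01, steps=5000):
    """Minimal nutrient-phytoplankton (NP) box model, forward-Euler.
    Growth follows Michaelis-Menten nutrient uptake; losses are
    remineralised back to the nutrient pool (closed system)."""
    n, p = load, p0
    for _ in range(steps):
        growth = mu_max * n / (k_n + n) * p
        dn = -growth + loss * p
        dp = growth - loss * p
        n += dt * dn
        p += dt * dp
    # return final nutrient, biomass, and specific growth rate
    return n, p, mu_max * n / (k_n + n)

for load in (1.0, 2.0, 4.0):
    n, p, mu = np_model(load)
    print(f"load={load}: biomass={p:.2f}, growth rate={mu:.3f}")
```

Biomass roughly doubles with a doubled load, while the growth rate converges to the loss rate (0.1) in every case, mirroring the correlation pattern reported in the abstract.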

  6. Cardiac autonomic functions and the emergence of violence in a highly realistic model of social conflict in humans.

    Directory of Open Access Journals (Sweden)

    Jozsef Haller

    2014-10-01

    Among the multitude of factors that can transform human social interactions into violent conflicts, biological features have received much attention in recent years as correlates of decision making and aggressiveness, especially in critical situations. We present here a highly realistic new model of human aggression and violence, in which genuine acts of aggression are readily performed and which at the same time allows the parallel recording of biological concomitants. In particular, we studied police officers trained at the International Training Centre (Budapest, Hungary), who are prepared to perform operations under extreme conditions of stress. We found that aggressive arousal can transform a basically peaceful social encounter into a violent conflict. Autonomic recordings show that this change is accompanied by increased heart rates, which have been associated with reduced cognitive complexity of perceptions (attentional myopia) and promote a bias towards hostile attributions and aggression. We also observed reduced heart rate variability in violent subjects, which is believed to signal poor functioning of prefrontal-subcortical inhibitory circuits and reduced self-control. Importantly, these autonomic particularities were observed already at the beginning of the social encounters, i.e., before aggressive acts were initiated, suggesting that individual characteristics of the stress response define the way in which social pressure affects social behavior, particularly the way in which this develops into violence. Taken together, these findings suggest that cardiac autonomic functions are valuable external symptoms of internal motivational states and decision-making processes, and raise the possibility that behavior under social pressure can be predicted from the individual characteristics of stress responsiveness.

  7. Toward University Modeling Instruction—Biology: Adapting Curricular Frameworks from Physics to Biology

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence. PMID:23737628

  8. Toward university modeling instruction--biology: adapting curricular frameworks from physics to biology.

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-06-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence.

  9. Biochemical transport modeling, estimation, and detection in realistic environments

    Science.gov (United States)

    Ortner, Mathias; Nehorai, Arye

    2006-05-01

    Early detection and estimation of the spread of a biochemical contaminant are major issues for homeland security applications. We present an integrated approach combining the measurements given by an array of biochemical sensors with a physical model of the dispersion and statistical analysis to solve these problems and provide system performance measures. We approximate the dispersion model of the contaminant in a realistic environment through numerical simulations of reflected stochastic diffusions describing the microscopic transport phenomena due to wind and chemical diffusion, using the Feynman-Kac formula. We consider arbitrary complex geometries and account for wind turbulence. Localizing the dispersive sources is useful for decontamination purposes and for estimating the cloud evolution. To solve the associated inverse problem, we propose a Bayesian framework based on a random field that is particularly powerful for localizing multiple sources with small numbers of measurements. We also develop a sequential detector using the numerical transport model we propose. Sequential detection allows on-line analysis and detection of whether a change has occurred. We first focus on the formulation of a suitable sequential detector that overcomes the presence of unknown parameters (e.g., release time, intensity, and location). We compute a bound on the expected delay before false detection in order to set the threshold of the test. For a fixed false-alarm rate, we obtain the detection probability of a substance release as a function of its location and initial concentration. Numerical examples are presented for two real-world scenarios: an urban area and an indoor ventilation duct.

  10. Laboratory of Biological Modeling

    Data.gov (United States)

    Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to a...

  11. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...
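A minimal example of the kind of process the book treats (deterministic flow punctuated by random jumps) is a telegraph-type gene model: protein level decays or is produced deterministically between exponentially distributed gene on/off switches. All parameter values below are illustrative assumptions.

```python
import math
import random

def simulate_pdmp(t_end, k_on=1.0, k_off=2.0, prod=5.0, decay=1.0, seed=1):
    """Piecewise deterministic Markov process for a telegraph gene model:
    between jumps, protein level x follows dx/dt = prod*state - decay*x
    (solved exactly); the gene state toggles at exponential random times."""
    rng = random.Random(seed)
    t, x, state = 0.0, 0.0, 1            # start with the gene 'on'
    while t < t_end:
        rate = k_off if state else k_on
        tau = min(rng.expovariate(rate), t_end - t)
        x_inf = prod * state / decay     # flow target for this segment
        x = x_inf + (x - x_inf) * math.exp(-decay * tau)
        t += tau
        if t < t_end:
            state = 1 - state            # jump: toggle the gene state
    return x

print(simulate_pdmp(50.0))
```

The trajectory always stays between 0 and prod/decay, since each deterministic segment relaxes exponentially toward one of those two values.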

  12. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo; Artina, Marco; Foransier, Massimo; Markowich, Peter A.

    2015-01-01

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation

  13. Implications of introducing realistic fire response traits in a Dynamic Global Vegetation Model

    Science.gov (United States)

    Kelley, D.; Harrison, S. P.; Prentice, I. C.

    2013-12-01

    Bark thickness is a key trait protecting woody plants against fire damage, while the ability to resprout is a trait that confers competitive advantage over non-resprouting individuals in fire-prone landscapes. Neither trait is well represented in fire-enabled dynamic global vegetation models (DGVMs). Here we describe a version of the Land Processes and eXchanges (LPX-Mv1) DGVM that incorporates both of these traits in a realistic way. From a synthesis of a large number of field studies, we show there is considerable innate variability in bark thickness between species within a plant-functional type (PFT). Furthermore, bark thickness is an adaptive trait at ecosystem level, increasing with fire frequency. We use the data to specify the range of bark thicknesses characteristic of each model PFT. We allow this distribution to change dynamically: thinner-barked trees are killed preferentially by fire, shifting the distribution of bark thicknesses represented in a model grid cell. We use the PFT-specific bark-thickness probability range for saplings during re-establishment. Since it is rare to destroy all trees in a grid cell, this treatment results in average bark thickness increasing with fire frequency and intensity. Resprouting is a prominent adaptation of temperate and tropical trees in fire-prone areas. The ability to resprout from above-ground tissue (apical or epicormic resprouting) results in the fastest recovery of total biomass after disturbance; resprouting from basal or below-ground meristems results in slower recovery, while non-resprouting species must regenerate from seed and therefore take the longest time to recover. Our analyses show that resprouting species have thicker bark than non-resprouting species. Investment in resprouting is accompanied by reduced efficacy of regeneration from seed. 
We introduce resprouting PFTs in LPX-Mv1 by specifying an appropriate range of bark thickness, allowing resprouters to survive fire and regenerate vegetatively in

  14. An anatomically realistic whole-body pregnant-woman model and specific absorption rates for pregnant-woman exposure to electromagnetic plane waves from 10 MHz to 2 GHz

    International Nuclear Information System (INIS)

    Nagaoka, Tomoaki; Togashi, Toshihiro; Saito, Kazuyuki; Takahashi, Masaharu; Ito, Koichi; Watanabe, Soichi

    2007-01-01

    The numerical dosimetry of pregnant women is an important issue in electromagnetic-field safety. However, an anatomically realistic whole-body pregnant-woman model for electromagnetic dosimetry has not been developed. Therefore, we have developed a high-resolution whole-body model of pregnant women. A new fetus model including inherent tissues of pregnant women was constructed on the basis of abdominal magnetic resonance imaging data of a 26-week-pregnant woman. The whole-body pregnant-woman model was developed by combining the fetus model and a nonpregnant-woman model that was developed previously. The developed model consists of about 7 million cubical voxels of 2 mm size and is segmented into 56 tissues and organs. This pregnant-woman model is the first completely anatomically realistic voxel model that includes a realistic fetus model and enables a numerical simulation of electromagnetic dosimetry up to the gigahertz band. In this paper, we also present the basic specific absorption rate characteristics of the pregnant-woman model exposed to vertically and horizontally polarized electromagnetic waves from 10 MHz to 2 GHz

  15. Mathematical models in biological discovery

    CERN Document Server

    Walter, Charles

    1977-01-01

    When I was asked to help organize an American Association for the Advancement of Science symposium about how mathematical models have contributed to biology, I agreed immediately. The subject is of immense importance and wide-spread interest. However, too often it is discussed in biologically sterile environments by "mutual admiration society" groups of "theoreticians", many of whom have never seen, and most of whom have never done, an original scientific experiment with the biological materials they attempt to describe in abstract (and often prejudiced) terms. The opportunity to address the topic during an annual meeting of the AAAS was irresistible. In order to try to maintain the integrity of the original intent of the symposium, it was entitled, "Contributions of Mathematical Models to Biological Discovery". This symposium was organized by Daniel Solomon and myself, held during the 141st annual meeting of the AAAS in New York during January, 1975, sponsored by sections G and N (Biological and Medic...

  16. Realistic camera noise modeling with application to improved HDR synthesis

    Science.gov (United States)

    Goossens, Bart; Luong, Hiêp; Aelterman, Jan; Pižurica, Aleksandra; Philips, Wilfried

    2012-12-01

    Due to the ongoing miniaturization of digital camera sensors and the steady increase in the number of megapixels, individual sensor elements of the camera become more sensitive to noise, deteriorating the final image quality. To get around this problem, sophisticated processing algorithms in the devices can help to fully exploit knowledge of the sensor characteristics (e.g., in terms of noise) and offer a better image reconstruction. Although a lot of research focuses on rather simplistic noise models, such as stationary additive white Gaussian noise, only limited attention has gone to more realistic digital camera noise models. In this article, we first present a digital camera noise model that takes several processing steps in the camera into account, such as sensor signal amplification, clipping, and post-processing. We then apply this noise model to the reconstruction problem of high dynamic range (HDR) images from a small set of low dynamic range (LDR) exposures of a static scene. In the literature, HDR reconstruction is mostly performed by computing a weighted average, in which the weights are directly related to the observed pixel intensities of the LDR image. In this work, we derive a Bayesian probabilistic formulation of a weighting function that is near-optimal in the MSE sense (or SNR sense) of the reconstructed HDR image, by assuming exponentially distributed irradiance values. We define the weighting function as the probability that the observed pixel intensity is approximately unbiased. The weighting function can be computed directly from the noise model parameters, which gives rise to different symmetric and asymmetric shapes depending on whether electronic noise or photon noise is dominant. We also explain how to deal with the case where some of the noise model parameters are unknown and how the camera response function can be estimated using the presented noise model. Finally, experimental results are provided to support our findings.
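The conventional weighted-average HDR merge that this work builds on can be sketched in a few lines. The hat-shaped weight and the `merge_hdr` helper are illustrative stand-ins, not the paper's Bayesian near-optimal weighting function.

```python
def merge_hdr(exposures, times, w=lambda z: z * (1.0 - z)):
    """Merge LDR exposures (pixel values in [0, 1]) captured with the
    given exposure times into a per-pixel radiance estimate via a
    weighted average. The hat-shaped default weight down-weights
    under- and over-exposed (clipped) pixels."""
    n_px = len(exposures[0])
    hdr = []
    for i in range(n_px):
        num = den = 0.0
        for z_img, t in zip(exposures, times):
            z = z_img[i]
            num += w(z) * (z / t)        # irradiance estimate z / t
            den += w(z)
        hdr.append(num / den if den > 0 else 0.0)
    return hdr

# Two exposures of the same static scene, one twice as long.
short = [0.10, 0.40, 0.90]
long_ = [0.20, 0.80, 1.00]               # last pixel clipped at 1.0
print(merge_hdr([short, long_], times=[1.0, 2.0]))
```

Note how the clipped pixel (value 1.0) receives zero weight, so the radiance there comes entirely from the shorter exposure.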

  17. Functional model of biological neural networks.

    Science.gov (United States)

    Lo, James Ting-Ho

    2010-12-01

    A functional model of biological neural networks, called temporal hierarchical probabilistic associative memory (THPAM), is proposed in this paper. THPAM comprises functional models of dendritic trees for encoding inputs to neurons, a first type of neuron for generating spike trains, a second type of neuron for generating graded signals to modulate neurons of the first type, supervised and unsupervised Hebbian learning mechanisms for easy learning and retrieval, an arrangement of dendritic trees for maximizing generalization, hardwiring for rotation-translation-scaling invariance, and feedback connections with different delay durations that allow neurons to make full use of present and past information generated by neurons in the same and higher layers. These functional models and their processing operations have many functions of biological neural networks that have not been achieved by other models in the open literature, and provide logically coherent answers to many long-standing neuroscientific questions. However, biological justification of these functional models and their processing operations is required for THPAM to qualify as a macroscopic model (or low-order approximation) of biological neural networks.

  18. Satisfaction and sustainability: a realist review of decentralized models of perinatal surgery for rural women.

    Science.gov (United States)

    Kornelsen, Jude; McCartney, Kevin; Williams, Kim

    2016-01-01

    This article was developed as part of a larger realist review investigating the viability and efficacy of decentralized models of perinatal surgical services for rural women, in the context of the recent and ongoing service centralization witnessed in many developed nations. The larger realist review was commissioned by the British Columbia Ministry of Health and Perinatal Services of British Columbia, Canada. Findings from that review are addressed in this article specific to the sustainability of rural perinatal surgical sites and the provider satisfaction that underpins recruitment to and retention at such sites. A realist method was used in the selection and analysis of literature, with the intention of iteratively developing a sophisticated understanding of how perinatal surgical services can best meet the needs of women who live in rural and remote environments. The goal of a realist review is to examine what works, for whom, under what circumstances, and why. The high-sensitivity search used language (English) and year (since 1990) limiters, in keeping with both the realist and rapid review traditions of using reasoned contextual boundaries. No exclusions were made based on methodology or methodological approach, in keeping with a realist review. Databases searched in December 2013 included MEDLINE, PubMed, EBSCO, CINAHL, EBM Reviews, NHS Economic Evaluation Database and PAIS International. Database searching produced 103 included academic articles. A further 59 resources were added through pearling, and 13 grey-literature reports were added on recommendation from the commissioner. A total of 42 of these 175 articles were included in this article as specific to provider satisfaction and service sustainability. Operative perinatal practice was found to be a linchpin of sustainable primary and surgical services in rural communities.
Rural shortages of providers, including challenges with recruitment and retention, were found to be a complex issue, with

  19. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
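Cell Collective models are logical (Boolean) networks whose nodes update according to regulatory rules. A minimal synchronous simulator of a hypothetical three-node toy circuit (not an actual Cell Collective model) can convey the "build and break" idea:

```python
def simulate(rules, state, steps):
    """Synchronously update a Boolean network for `steps` time steps.
    `rules` maps each node name to a function of the full state dict."""
    history = [dict(state)]
    for _ in range(steps):
        state = {node: rule(state) for node, rule in rules.items()}
        history.append(dict(state))
    return history

# Toy 3-node regulatory loop (hypothetical, for illustration only):
# A activates B, B activates C, C inhibits A.
rules = {
    "A": lambda s: not s["C"],
    "B": lambda s: s["A"],
    "C": lambda s: s["B"],
}
hist = simulate(rules, {"A": True, "B": False, "C": False}, steps=6)
for t, s in enumerate(hist):
    print(t, s)
```

This negative-feedback loop oscillates with period 6, and students can "break" it by editing a single rule, e.g., removing the inhibition of A.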

  20. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  1. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in describing biological processes within a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axons, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and the biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary and partial differential equations; a background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use the codes to test biological hypotheses.

  2. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights into how robust the biological responses are with respect to changes in biological parameters, and which model inputs are the key factors affecting the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis are the two types of sensitivity analysis commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations in the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models, and caveats in the interpretation of sensitivity analysis results.
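The "small perturbation" idea behind local sensitivity analysis can be sketched with central finite differences. The helper name `local_sensitivity` and the Michaelis-Menten toy model are illustrative assumptions, not tools from the review.

```python
def local_sensitivity(model, params, output, h=1e-6):
    """Normalised local sensitivity coefficients S_i = (p_i / y) * dy/dp_i,
    estimated by central finite differences with relative step h."""
    y0 = output(model(params))
    sens = {}
    for name, p in params.items():
        up = dict(params, **{name: p + h * p})
        dn = dict(params, **{name: p - h * p})
        dy = (output(model(up)) - output(model(dn))) / (2 * h * p)
        sens[name] = p * dy / y0
    return sens

# Toy 'model': Michaelis-Menten rate v = Vmax * s / (Km + s).
model = lambda p: p["Vmax"] * p["s"] / (p["Km"] + p["s"])
s = local_sensitivity(model, {"Vmax": 2.0, "Km": 0.5, "s": 1.0},
                      output=lambda v: v)
print(s)   # Vmax sensitivity is exactly 1; Km sensitivity is negative
```

For this model the coefficients have closed forms (S_Vmax = 1, S_Km = -Km/(Km+s), S_s = Km/(Km+s)), which makes the finite-difference estimates easy to check.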

  3. Improvement of Modeling Scheme of the Safety Injection Tank with Fluidic Device for Realistic LBLOCA Calculation

    International Nuclear Information System (INIS)

    Bang, Young Seok; Cheong, Aeju; Woo, Sweng Woong

    2014-01-01

    Confirmation of the performance of the SIT with FD should be based on a thermal-hydraulic analysis of the LBLOCA, and an adequate physical model of the SIT/FD should be used in the LBLOCA calculation. To develop such a model, simulation of the major phenomena, including the flow distribution between the standpipe and the FD, should be justified by full-scale experiments and/or plant preoperational testing. The authors' previous study indicated that an approximation of the SIT/FD phenomena could be obtained with a typical system transient code, MARS-KS, using the 'accumulator' component model, but that additional improvement of the modeling of the FD and standpipe flow paths was needed for a reasonable prediction. One problem was the depressurization behavior after switchover to the low-flow injection phase. Another concern was the potential release of nitrogen gas from the SIT to the downstream pipe, and then to the reactor core, through the FD and standpipe flow paths. The intrusion of noncondensable gas may affect the LBLOCA thermal response. Therefore, a more reliable SIT/FD model is needed for a more accurate prediction and greater confidence in the LBLOCA evaluation. The present paper discusses an improvement of the modeling scheme over the previous study, and the effect of the improved scheme on the LBLOCA cladding thermal response. The present study discussed the modeling scheme of the SIT with FD for a realistic simulation of the APR1400 LBLOCA. Currently, the SIT blowdown test is best simulated by a modeling scheme using a 'pipe' component with dynamic area reduction. The LBLOCA analysis adopting this scheme showed a PCT increase of 23 K compared to the case of the 'accumulator' component model, due to the flow-rate decrease during the transition to the low-flow injection phase and the intrusion of nitrogen gas into the core. Accordingly, the effect of SIT/FD modeling

  4. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

The study of biological systems demands computational support. When targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities for searching for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
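The relevance ranking described here can be sketched, in spirit, with a toy TF-IDF scorer over model annotation text; the model identifiers and annotation strings below are hypothetical, and the actual BioModels ranking is more elaborate:

```python
import math

# Hypothetical annotation "documents" for three models (not real BioModels entries).
models = {
    "M1": "glycolysis kinetics yeast pathway model",
    "M2": "calcium signaling oscillation model",
    "M3": "glycolysis oscillation kinetics model pathway",
}

def tf_idf_ranking(query, docs):
    """Rank documents by the summed TF-IDF weight of the query terms."""
    tokenized = {mid: text.split() for mid, text in docs.items()}
    n = len(tokenized)
    scores = {}
    for mid, words in tokenized.items():
        score = 0.0
        for term in query.split():
            tf = words.count(term) / len(words)
            df = sum(1 for w in tokenized.values() if term in w)
            idf = math.log((1 + n) / (1 + df)) + 1  # smoothed IDF
            score += tf * idf
        scores[mid] = score
    return sorted(scores, key=scores.get, reverse=True)

ranking = tf_idf_ranking("glycolysis oscillation", models)
```

Each model's annotation text acts as the retrieval "document"; richer meta-information (MIRIAM annotations, publication metadata) would simply contribute additional scored fields.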

  5. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides a virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic generation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is their limited coverage of detail, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. Image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive with users and also creates an immersive experience. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in the form of photos; topographic and terrain attributes, such as shapes and heights, are omitted.
This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measurable
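As a sketch of how geometric relationships are resolved from a stereo pair, the standard rectified-stereo relation Z = f·B/d recovers depth from image disparity; the camera numbers below are illustrative, not taken from the paper:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A point imaged with 20 px disparity by a camera of 1000 px focal
# length and a 0.5 m stereo baseline lies 25 m away.
z = depth_from_disparity(1000.0, 0.5, 20.0)
```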

  6. Realistic and efficient 2D crack simulation

    Science.gov (United States)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive and runtime-efficient 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level-of-detail. The generated binary decomposition tree also provides an efficient neighbor retrieval mechanism used for mesh element splitting and merging, with minimal memory requirements, essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are taken into account to produce the criteria of crack initiation, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values of the variables/parameters from prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted for various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shard generation) with diverse potential in military and civil M&S applications such as training and mission planning.
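The Peano-Cesaro scheme refines a mesh by recursively bisecting right triangles at the hypotenuse midpoint; a minimal newest-vertex-bisection sketch of that idea (a simplified stand-in for the paper's full meshing system, with no decomposition tree or crack criteria):

```python
def bisect(tri, depth):
    """Recursively bisect a right triangle, given as (p, q, apex) with
    hypotenuse p-q, at the hypotenuse midpoint (Peano-Cesaro style)."""
    if depth == 0:
        return [tri]
    p, q, apex = tri
    m = ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)  # new vertex on hypotenuse
    # Each child has the old apex on its hypotenuse and m as its new apex.
    return bisect((p, apex, m), depth - 1) + bisect((apex, q, m), depth - 1)

def area(tri):
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

# Refine the unit right triangle to level 4: 2**4 = 16 congruent elements.
leaves = bisect(((0.0, 0.0), (1.0, 1.0), (0.0, 1.0)), 4)
```

Because each split produces exactly two children, uneven refinement depths map directly onto the binary decomposition tree that the paper uses for neighbor retrieval.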

  7. Using remotely sensed data and stochastic models to simulate realistic flood hazard footprints across the continental US

    Science.gov (United States)

    Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.

    2017-12-01

Remotely sensed data have transformed the field of large-scale hydraulic modelling. New digital elevation, hydrography and river width data have allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental-scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1-in-100-year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models, because over relatively short river reaches (say 10-60 km) one can assume that the return period of an event does not vary. At regional to national scales, and across multiple river catchments, this assumption breaks down, and for a given flood event the return period will differ between gauging stations, a pattern known as the event `footprint'. Despite this, many national-scale risk analyses still use `constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge-to-gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in
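The footprint idea, that the same event has different return periods at different gauges, can be sketched with a two-gauge Gaussian-copula simulator; the correlation value is hypothetical, and the paper's conditional multivariate-extremes method is considerably more sophisticated:

```python
import math
import random

def correlated_event(rho, rng):
    """Draw one synthetic event: correlated standard-normal scores at two
    gauges (Gaussian copula), mapped to non-exceedance probabilities."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    to_p = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF
    return to_p(z1), to_p(z2)

rng = random.Random(42)
events = [correlated_event(0.8, rng) for _ in range(20000)]

# With rho = 0.8, a 1-in-100 event at gauge 1 is usually, but not always,
# extreme at gauge 2 as well: the return period varies across the footprint.
big_at_1 = [p2 for p1, p2 in events if p1 > 0.99]
frac_big_at_2 = sum(p2 > 0.9 for p2 in big_at_1) / len(big_at_1)
```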

  8. From Biology to Mathematical Models and Back: Teaching Modeling to Biology Students, and Biology to Math and Engineering Students

    Science.gov (United States)

    Chiel, Hillel J.; McManus, Jeffrey M.; Shaw, Kendrick M.

    2010-01-01

    We describe the development of a course to teach modeling and mathematical analysis skills to students of biology and to teach biology to students with strong backgrounds in mathematics, physics, or engineering. The two groups of students have different ways of learning material and often have strong negative feelings toward the area of knowledge…

  9. Realistic modeling of seismic input for megacities and large urban areas

    Science.gov (United States)

    Panza, G. F.; Unesco/Iugs/Igcp Project 414 Team

    2003-04-01

The project addressed the problem of pre-disaster orientation: hazard prediction, risk assessment, and hazard mapping, in connection with seismic activity and man-induced vibrations. The definition of realistic seismic input has been obtained from the computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different source and structural models. The innovative modeling technique, which constitutes the common tool of the entire project, takes into account source, propagation and local site effects. This is done using first principles of physics about wave generation and propagation in complex media, and does not require resorting to convolutive approaches, which have proven to be quite unreliable, mainly when dealing with complex geological structures, the most interesting from the practical point of view. In fact, several techniques that have been proposed to empirically estimate site effects, using observations convolved with theoretically computed signals corresponding to simplified models, supply reliable information about the site response to non-interfering seismic phases. They are not adequate in most real cases, when the seismic signal is formed by several interfering waves. The availability of realistic numerical simulations enables us to reliably estimate the amplification effects even in complex geological structures, exploiting the available geotechnical, lithological and geophysical parameters, topography of the medium, tectonic, historical and palaeoseismological data, and seismotectonic models. The realistic modeling of the ground motion is a very important base of knowledge for the preparation of ground-shaking scenarios, which represent a valid and economic tool for seismic microzonation.
This knowledge can be very fruitfully used by civil engineers in the design of new seismo-resistant constructions and in the reinforcement of the existing built environment, and, therefore

  10. Effective electric fields along realistic DTI-based neural trajectories for modelling the stimulation mechanisms of TMS

    International Nuclear Information System (INIS)

    De Geeter, N; Crevecoeur, G; Dupré, L; Leemans, A

    2015-01-01

In transcranial magnetic stimulation (TMS), an applied alternating magnetic field induces an electric field in the brain that can interact with the neural system. It is generally assumed that this induced electric field is the crucial effect exciting a certain region of the brain. More specifically, it is the component of this field parallel to the neuron's local orientation, the so-called effective electric field, that can initiate neuronal stimulation. Deeper insights into the stimulation mechanisms can be acquired through extensive TMS modelling. Most models study simple representations of neurons with assumed geometries, whereas we embed realistic neural trajectories computed using tractography based on diffusion tensor images. This way of modelling ensures a more accurate spatial distribution of the effective electric field that is, in addition, patient- and case-specific. The case study of this paper focuses on single-pulse stimulation of the left primary motor cortex with a standard figure-of-eight coil. Including realistic neural geometry in the model demonstrates the strong and localized variations of the effective electric field between the tracts themselves and along them, due to the interplay of factors such as the tract's position and orientation in relation to the TMS coil, the neural trajectory, and its course along the white and grey matter interface. Furthermore, the influence of changes in coil orientation is studied. Investigating the impact of tissue anisotropy confirms that its contribution is not negligible. Moreover, assuming isotropic tissues leads to errors of the same size as rotating or tilting the coil by 10 degrees. In contrast, the model proves to be less sensitive to the poorly known tissue conductivity values. (paper)
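The effective electric field described above is simply the projection of the induced field onto the local fibre direction, E·t; a minimal sketch with a hypothetical uniform field and a toy polyline tract (illustrative numbers, not the paper's DTI data):

```python
import math

def effective_field(E, tract):
    """Component of the induced field E (V/m) along each segment of a
    neural trajectory given as a polyline of 3D points."""
    out = []
    for (x0, y0, z0), (x1, y1, z1) in zip(tract, tract[1:]):
        d = (x1 - x0, y1 - y0, z1 - z0)
        norm = math.sqrt(sum(c * c for c in d))
        t = tuple(c / norm for c in d)                 # unit tangent
        out.append(sum(e * c for e, c in zip(E, t)))   # E . t
    return out

# Uniform 100 V/m field along x; a tract that bends from x into y sees
# the full field on its first segment and none on its second.
eff = effective_field((100.0, 0.0, 0.0), [(0, 0, 0), (1, 0, 0), (1, 1, 0)])
```

This is why the effective field varies so strongly along a realistic tract: the tangent direction, not just the field magnitude, changes from segment to segment.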

  11. Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgindy, Tarek [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rossol, Michael N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Vergara, Claudio [MIT; Domingo, Carlos Mateo [IIT Comillas; Postigo, Fernando [IIT Comillas; de Cuadra, Fernando [IIT Comillas; Gomez, Tomas [IIT Comillas; Duenas, Pablo [MIT; Luke, Max [MIT; Li, Vivian [MIT; Vinoth, Mohan [GE Grid Solutions; Kadankodu, Sree [GE Grid Solutions

    2017-08-09

    The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and co-ordination with GRID DATA repository teams to house these datasets for public access will also be discussed.

  12. Spatial Modeling Tools for Cell Biology

    National Research Council Canada - National Science Library

    Przekwas, Andrzej; Friend, Tom; Teixeira, Rodrigo; Chen, Z. J; Wilkerson, Patrick

    2006-01-01

    .... Scientific potentials and military relevance of computational biology and bioinformatics have inspired DARPA/IPTO's visionary BioSPICE project to develop computational framework and modeling tools for cell biology...

  13. Genome-scale biological models for industrial microbial systems.

    Science.gov (United States)

    Xu, Nan; Ye, Chao; Liu, Liming

    2018-04-01

The primary aims and challenges associated with microbial fermentation include achieving faster cell growth, higher productivity, and more robust production processes. Genome-scale biological models, which predict the interactions among genetic material, enzymes, and metabolites, constitute a systematic and comprehensive platform to analyze and optimize microbial growth and the production of biological products. Genome-scale biological models can help optimize microbial growth-associated traits by simulating biomass formation, predicting growth rates, and identifying the requirements for cell growth. With regard to microbial product biosynthesis, genome-scale biological models can be used to design product biosynthetic pathways, accelerate production efficiency, and reduce metabolic side effects, leading to improved production performance. The present review discusses the development of microbial genome-scale biological models since their emergence and emphasizes their application in improving the industrial microbial fermentation of biological products.
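Growth-rate prediction with genome-scale models is commonly done by flux balance analysis: maximise a biomass flux subject to steady-state stoichiometry and capacity bounds. A deliberately tiny brute-force sketch of that optimisation on a hypothetical three-reaction network (a real model would use a linear-programming solver over thousands of reactions):

```python
# Toy network: uptake (v1) -> intermediate (v2) -> biomass (v3).
# Steady state for the two internal metabolites forces v1 = v2 = v3,
# so the optimal biomass flux equals the uptake capacity.
def max_biomass(uptake_bound, step=0.1):
    best = 0.0
    v = 0.0
    while v <= uptake_bound + 1e-9:
        v1 = v2 = v3 = v                 # mass balance: S . v = 0
        if v1 <= uptake_bound:           # capacity constraint on uptake
            best = max(best, v3)         # objective: biomass flux
        v += step
    return round(best, 6)

growth = max_biomass(10.0)
```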

  14. On the impacts of coarse-scale models of realistic roughness on a forward-facing step turbulent flow

    International Nuclear Information System (INIS)

    Wu, Yanhua; Ren, Huiying

    2013-01-01

Highlights: ► Discrete wavelet transform was used to produce coarse-scale models of roughness. ► PIV was performed in a forward-facing step flow with roughness of different scales. ► Impacts of roughness scales on various turbulence statistics were studied. -- Abstract: The present work explores the impacts of coarse-scale models of realistic roughness on the turbulent boundary layers over forward-facing steps. Surface topographies of different scale resolutions were obtained from a novel multi-resolution analysis using the discrete wavelet transform. PIV measurements were performed in the streamwise-wall-normal (x-y) planes at two different spanwise positions in turbulent boundary layers at Re_h = 3450 and δ/h = 8, where h is the mean step height and δ is the incoming boundary layer thickness. It was observed that large-scale but low-amplitude roughness scales had small effects on the forward-facing step turbulent flow. For the higher-resolution model of the roughness, the turbulence characteristics within 2h downstream of the step are observed to be distinct from those over the original realistic rough step at a measurement position where the roughness profile possesses a positive slope immediately after the step's front. On the other hand, much smaller differences exist in the flow characteristics at the other measurement position, whose roughness profile possesses a negative slope following the step's front
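A coarse-scale roughness model of the kind described above can be sketched with a one-level Haar wavelet decomposition that keeps only the approximation coefficients of a surface profile; the profile data below are illustrative, not the paper's topography:

```python
import math

def haar_coarse(profile):
    """One-level Haar DWT: drop the detail coefficients and reconstruct,
    leaving a coarse-scale model of the roughness profile."""
    assert len(profile) % 2 == 0
    approx = [(a + b) / math.sqrt(2.0)
              for a, b in zip(profile[::2], profile[1::2])]
    # Inverse transform with details zeroed: each pair becomes its mean.
    coarse = []
    for c in approx:
        coarse.extend([c / math.sqrt(2.0)] * 2)
    return coarse

rough = [0.0, 2.0, 1.0, 3.0, 2.0, 0.0, 1.0, 1.0]
smooth = haar_coarse(rough)
```

Repeating the decomposition on the approximation coefficients yields the progressively coarser-scale models compared in the paper.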

  15. Unified Deep Learning Architecture for Modeling Biology Sequence.

    Science.gov (United States)

    Wu, Hongjie; Cao, Chengyuan; Xia, Xiaoyan; Lu, Qiang

    2017-10-09

Prediction of the spatial structure or function of biological macromolecules based on their sequence remains an important challenge in bioinformatics. When modeling biological sequences using traditional sequence models, characteristics such as long-range interactions between basic units, the complicated and variable output of labeled structures, and the variable length of biological sequences usually lead to different solutions on a case-by-case basis. This study proposed the use of bidirectional recurrent neural networks based on long short-term memory or a gated recurrent unit to capture long-range interactions, designing an optional reshape operator to adapt to the diversity of the output labels and implementing a training algorithm to support the training of sequence models capable of processing variable-length sequences. Additionally, the merge and pooling operators enhanced the ability to capture short-range interactions between basic units of biological sequences. The proposed deep-learning model and its training algorithm might be capable of solving currently known biological sequence-modeling problems through the use of a unified framework. We validated our model on one of the most difficult biological sequence-modeling problems currently known, with our results indicating the ability of the model to obtain predictions of protein residue interactions that exceeded the accuracy of current popular approaches by 10% on multiple benchmarks.

  16. Agent-based modelling in synthetic biology.

    Science.gov (United States)

    Gorochowski, Thomas E

    2016-11-30

    Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).
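A quorum-sensing toy illustrates the agent-based approach the review describes: each cell secretes autoinducer into a shared (well-mixed) environment, and the colony switches state collectively once the concentration crosses a threshold. All parameter values here are hypothetical:

```python
def simulate(n_cells, secretion=1.0, decay=0.5, threshold=30.0, steps=50):
    """Well-mixed quorum-sensing ABM: cells add signal each step, the
    signal decays, and all cells turn 'on' once it exceeds the threshold."""
    signal = 0.0
    on = [False] * n_cells
    for _ in range(steps):
        signal = signal * (1.0 - decay) + n_cells * secretion
        if signal > threshold:
            on = [True] * n_cells   # collective switch
    return signal, all(on)

# Below the quorum the colony stays dark (steady state 10/0.5 = 20 < 30);
# above it, every agent lights up (20/0.5 = 40 > 30).
small = simulate(10)
large = simulate(20)
```

The emergent point is that no single agent's rule mentions colony size, yet the collective behaviour depends on it, which is exactly the kind of design question agent-based models are used to explore.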

  17. Model checking biological systems described using ambient calculus

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Priami, Corrado; Qualia, Paola

    2005-01-01

Model checking biological systems described using ambient calculus. In Proc. of the second International Workshop on Computational Methods in Systems Biology (CMSB04), Lecture Notes in Bioinformatics 3082:85-103, Springer, 2005.

  18. The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention

    Science.gov (United States)

    Elangovan, Tavasuria; Ismail, Zurida

    2014-01-01

    A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…

  19. Simulating the value of electric-vehicle-grid integration using a behaviourally realistic model

    Science.gov (United States)

    Wolinetz, Michael; Axsen, Jonn; Peters, Jotham; Crawford, Curran

    2018-02-01

Vehicle-grid integration (VGI) uses the interaction between electric vehicles and the electrical grid to provide benefits that may include reducing the cost of using intermittent renewable electricity or providing a financial incentive for electric vehicle ownership. However, studies that estimate the value of VGI benefits have largely ignored how consumer behaviour will affect the magnitude of the impact. Here, we simulate the long-term impact of VGI using behaviourally realistic and empirically derived models of vehicle adoption and charging combined with an electricity system model. We focus on the case where a central entity manages the charging rate and timing for participating electric vehicles. VGI is found not to increase the adoption of electric vehicles, but does have a small beneficial impact on electricity prices. By 2050, VGI reduces wholesale electricity prices by 0.6-0.7% (0.7 MWh⁻¹, 2010 CAD) relative to an equivalent scenario without VGI. Excluding consumer behaviour from the analysis inflates the value of VGI.

  20. The ultimate intrinsic signal-to-noise ratio of loop- and dipole-like current patterns in a realistic human head model.

    Science.gov (United States)

    Pfrommer, Andreas; Henning, Anke

    2018-03-13

    The ultimate intrinsic signal-to-noise ratio (UISNR) represents an upper bound for the achievable SNR of any receive coil. To reach this threshold a complete basis set of equivalent surface currents is required. This study systematically investigated to what extent either loop- or dipole-like current patterns are able to reach the UISNR threshold in a realistic human head model between 1.5 T and 11.7 T. Based on this analysis, we derived guidelines for coil designers to choose the best array element at a given field strength. Moreover, we present ideal current patterns yielding the UISNR in a realistic body model. We distributed generic current patterns on a cylindrical and helmet-shaped surface around a realistic human head model. We excited electromagnetic fields in the human head by using eigenfunctions of the spherical and cylindrical Helmholtz operator. The electromagnetic field problem was solved by a fast volume integral equation solver. At 7 T and above, adding curl-free current patterns to divergence-free current patterns substantially increased the SNR in the human head (locally >20%). This was true for the helmet-shaped and the cylindrical surface. On the cylindrical surface, dipole-like current patterns had high SNR performance in central regions at ultra-high field strength. The UISNR increased superlinearly with B0 in most parts of the cerebrum but only sublinearly in the periphery of the human head. The combination of loop and dipole elements could enhance the SNR performance in the human head at ultra-high field strength. © 2018 International Society for Magnetic Resonance in Medicine.

  1. Details of regional particle deposition and airflow structures in a realistic model of human tracheobronchial airways: two-phase flow simulation.

    Science.gov (United States)

    Rahimi-Gorji, Mohammad; Gorji, Tahereh B; Gorji-Bandpy, Mofid

    2016-07-01

In the present investigation, detailed two-phase flow modeling of airflow and of the transport and deposition of micro-particles (1-10 µm) in a realistic tracheobronchial airway geometry based on CT scan images was considered under various breathing conditions (10-60 l/min). Lagrangian particle tracking has been used to investigate the particle deposition patterns in a model comprising the mouth up to generation G6 of the tracheobronchial airways. The results demonstrated that during all breathing patterns, the maximum velocity change occurred in the narrow throat region (larynx). Because a realistic geometry was used for the simulations, many irregularities and bending deflections exist in the airway model; at higher inhalation rates, these areas are prone to vortical effects which tend to entrap the inhaled particles. According to the results, the deposition fraction has a direct relationship with particle aerodynamic diameter (for dp = 1-10 µm). Increasing the inhalation flow rate and particle size largely increases the inertial force and, consequently, more particle deposition is evident, suggesting that inertial impaction is the dominant deposition mechanism in the tracheobronchial airways. Copyright © 2016 Elsevier Ltd. All rights reserved.
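Lagrangian tracking of particles in the 1-10 µm range is typically driven by Stokes drag; a minimal one-dimensional sketch of a particle's velocity relaxing toward the local air velocity (illustrative properties, not the paper's CFD coupling, which tracks full 3D trajectories in the resolved flow field):

```python
def track(u_fluid, dp, rho_p=1000.0, mu=1.8e-5, dt=1e-5, steps=2000):
    """Integrate du_p/dt = (u_f - u_p) / tau by explicit Euler, with
    Stokes relaxation time tau = rho_p * dp**2 / (18 * mu)."""
    tau = rho_p * dp ** 2 / (18.0 * mu)
    u_p = 0.0
    for _ in range(steps):
        u_p += (u_fluid - u_p) / tau * dt
    return u_p, tau

# A 5 micron water-density particle relaxes to a 1 m/s airflow with a
# time constant of order 1e-4 s.
u_p, tau = track(u_fluid=1.0, dp=5e-6)
```

The quadratic growth of tau with dp is the inertia effect in the abstract: larger particles lag the flow longer, fail to follow the bends and vortices, and deposit by inertial impaction.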

  2. Construction of hexahedral elements mesh capturing realistic geometries of Bayou Choctaw SPR site

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung Yoon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Barry L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

A three-dimensional finite element mesh capturing realistic geometries of the Bayou Choctaw site has been constructed using sonar and seismic survey data obtained from the field. The mesh consists of hexahedral elements because the salt constitutive model is coded using hexahedral elements. Various ideas and techniques for constructing finite element meshes capturing artificially and naturally formed geometries are provided. Techniques to reduce the number of elements as much as possible, saving computer run time while maintaining computational accuracy, are also introduced. The steps and methodologies could be applied to construct the meshes of the Big Hill, Bryan Mound, and West Hackberry strategic petroleum reserve sites. The methodology could also be applied to complicated-shaped masses, not only for various civil and geological structures but also for biological applications such as artificial limbs.

  3. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  4. Spectroscopy of light nuclei with realistic NN interaction JISP

    International Nuclear Information System (INIS)

    Shirokov, A. M.; Vary, J. P.; Mazur, A. I.; Weber, T. A.

    2008-01-01

Recent results of our systematic ab initio studies of the spectroscopy of s- and p-shell nuclei in fully microscopic large-scale (up to a few hundred million basis functions) no-core shell-model calculations are presented. A new high-quality realistic nonlocal NN interaction, JISP, is used. This interaction is obtained in the J-matrix inverse-scattering approach (JISP stands for J-matrix inverse-scattering potential) and takes the form of a small-rank matrix in the oscillator basis in each of the NN partial waves, providing very fast convergence in shell-model studies. The current purely two-body JISP model of the nucleon-nucleon interaction, JISP16, provides not only an excellent description of two-nucleon data (deuteron properties and np scattering) with χ²/datum = 1.05, but also a better description of a wide range of observables (binding energies, spectra, rms radii, quadrupole moments, electromagnetic-transition probabilities, etc.) in all s- and p-shell nuclei than the best modern interaction models combining realistic nucleon-nucleon and three-nucleon interactions.

  5. A biological compression model and its applications.

    Science.gov (United States)

    Cao, Minh Duc; Dix, Trevor I; Allison, Lloyd

    2011-01-01

    A biological compression model, expert model, is presented which is superior to existing compression algorithms in both compression performance and speed. The model is able to compress whole eukaryotic genomes. Most importantly, the model provides a framework for knowledge discovery from biological data. It can be used for repeat element discovery, sequence alignment and phylogenetic analysis. We demonstrate that the model can handle statistically biased sequences and distantly related sequences where conventional knowledge discovery tools often fail.
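An expert-model compressor scores each symbol with context models; a minimal order-2 adaptive Markov sketch estimating the code length of a DNA string (Laplace-smoothed counts, purely illustrative of the principle, not the paper's expert model):

```python
import math

def code_length_bits(seq, order=2):
    """Adaptive order-k Markov coder: each symbol costs -log2 of the
    probability its context assigned to it before that symbol is counted."""
    alphabet = sorted(set(seq))
    counts = {}
    bits = 0.0
    for i in range(len(seq)):
        ctx = seq[max(0, i - order):i]
        table = counts.setdefault(ctx, {a: 1 for a in alphabet})  # Laplace prior
        total = sum(table.values())
        bits += -math.log2(table[seq[i]] / total)
        table[seq[i]] += 1
    return bits

repetitive = "ACGT" * 50                       # perfectly regular, 200 bases
random_ish = "AACGTGCATTACGGATCCGT" * 10       # same length, less regular
```

Because the coder assigns shorter codes where the context model predicts well, comparing code lengths doubles as a similarity and repeat-discovery tool, which is the knowledge-discovery use the abstract describes.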

  6. Mathematical manipulative models: in defense of "beanbag biology".

    Science.gov (United States)

    Jungck, John R; Gaff, Holly; Weisstein, Anton E

    2010-01-01

Mathematical manipulative models have had a long history of influence in biological research and in secondary school education, but they are frequently neglected in undergraduate biology education. By linking mathematical manipulative models in a four-step process: 1) use of physical manipulatives; 2) interactive exploration of computer simulations; 3) derivation of mathematical relationships from core principles; and 4) analysis of real data sets, we demonstrate a process that we have shared in biology faculty development workshops led by staff from the BioQUEST Curriculum Consortium over the past 24 yr. We built this approach upon a broad survey of the mathematical education research literature, which has convincingly demonstrated the utility of multiple models that link physical, kinesthetic learning to actual data and interactive simulations. Two projects that use this approach are introduced: the Biological Excel Simulations and Tools in Exploratory, Experiential Mathematics (ESTEEM) Project (http://bioquest.org/esteem) and Numerical Undergraduate Mathematical Biology Education (NUMB3R5 COUNT; http://bioquest.org/numberscount). Examples here emphasize genetics, ecology, population biology, photosynthesis, cancer, and epidemiology. Mathematical manipulative models help learners break through prior fears to develop an appreciation for how mathematical reasoning informs problem solving, inference, and precise communication in biology, and enhance the diversity of quantitative biology education.
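
    As one concrete instance of the epidemiology examples mentioned above, the following is a minimal SIR (susceptible-infected-recovered) simulation of the kind such manipulative modelling exercises typically build. The parameter values are arbitrary illustrations, not taken from the ESTEEM or NUMB3R5 COUNT materials.

    ```python
    def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=160, dt=0.1):
        """Forward-Euler integration of the classic SIR model, with the
        compartments expressed as fractions of the total population."""
        s, i, r = s0, i0, 0.0
        traj = [(s, i, r)]
        for _ in range(int(days / dt)):
            ds = -beta * s * i          # new infections leave S
            di = beta * s * i - gamma * i
            dr = gamma * i              # recoveries enter R
            s, i, r = s + ds * dt, i + di * dt, r + dr * dt
            traj.append((s, i, r))
        return traj
    ```

    With beta/gamma = 3 the infected fraction rises to a single peak and then declines, while s + i + r stays constant, the kind of qualitative behaviour students can first discover with physical manipulatives.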

  7. Unified data model for biological data

    International Nuclear Information System (INIS)

    Idrees, M.

    2014-01-01

A data model empowers us to store, retrieve and manipulate data in a unified way. We consider biological data to consist of DNA (deoxyribonucleic acid), RNA (ribonucleic acid) and protein structures. In our Bioinformatics Lab (Bioinformatics Lab, Alkhawarizmi Institute of Computer Science, University of Engineering and Technology, Lahore, Pakistan), we have already proposed two data models, one for DNA and one for protein structures. In this paper, we propose a unified data model by using the data models of TOS (Temporal Object Oriented System), after making some necessary modifications to that data model, together with the two data models we have already proposed. This proposed unified data model can be used for modeling and maintaining biological data (i.e. DNA, RNA and protein structures) in a single, unified way. (author)

  8. Automated parameter estimation for biological models using Bayesian statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Langmead, Christopher J; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram; Jha, Sumit K

    2015-01-01

    Probabilistic models have gained widespread acceptance in the systems biology community as a useful way to represent complex biological systems. Such models are developed using existing knowledge of the structure and dynamics of the system, experimental observations, and inferences drawn from statistical analysis of empirical data. A key bottleneck in building such models is that some system variables cannot be measured experimentally. These variables are incorporated into the model as numerical parameters. Determining values of these parameters that justify existing experiments and provide reliable predictions when model simulations are performed is a key research problem. Using an agent-based model of the dynamics of acute inflammation, we demonstrate a novel parameter estimation algorithm by discovering the amount and schedule of doses of bacterial lipopolysaccharide that guarantee a set of observed clinical outcomes with high probability. We synthesized values of twenty-eight unknown parameters such that the parameterized model instantiated with these parameter values satisfies four specifications describing the dynamic behavior of the model. We have developed a new algorithmic technique for discovering parameters in complex stochastic models of biological systems given behavioral specifications written in a formal mathematical logic. Our algorithm uses Bayesian model checking, sequential hypothesis testing, and stochastic optimization to automatically synthesize parameters of probabilistic biological models.
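
    To make the flavour of such a parameter-synthesis loop concrete, here is a deliberately small sketch, not the authors' algorithm (which uses Bayesian model checking and sequential hypothesis testing): a Monte Carlo estimate of the probability that a toy stochastic model satisfies a behavioural specification, wrapped in a crude random search for a satisfying parameter. All names, the model, and the numbers are invented.

    ```python
    import random

    def satisfies(param, n_trials=500):
        """Monte Carlo estimate that a toy stochastic model meets a spec.
        Toy model: each of 10 agents independently recovers with probability
        `param`; the spec requires at least 8 recoveries."""
        rng = random.Random(0)  # fixed seed so estimates are reproducible
        hits = 0
        for _ in range(n_trials):
            recovered = sum(rng.random() < param for _ in range(10))
            hits += recovered >= 8
        return hits / n_trials

    def search_parameter(target=0.9, steps=60):
        """Random search for the smallest parameter whose estimated
        spec-probability reaches the target (a stand-in for the paper's
        statistical-model-checking-guided optimization)."""
        rng = random.Random(1)
        best = None
        for _ in range(steps):
            p = rng.uniform(0.5, 1.0)
            if satisfies(p) >= target and (best is None or p < best):
                best = p
        return best
    ```

    In the real method the inner check is a formal-logic specification evaluated by Bayesian model checking, and the outer loop is a proper stochastic optimizer rather than blind sampling.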

  9. Structured population models in biology and epidemiology

    CERN Document Server

    Ruan, Shigui

    2008-01-01

This book consists of six chapters written by leading researchers in mathematical biology. These chapters present recent and important developments in the study of structured population models in biology and epidemiology. Topics include population models structured by age, size, and spatial position; size-structured models for metapopulations, macroparasitic diseases, and prion proliferation; models for transmission of microparasites between host populations living on non-coincident spatial domains; spatiotemporal patterns of disease spread; the method of aggregation of variables in population dynamics; and biofilm models. It is suitable as a textbook for a mathematical biology course or a summer school at the advanced undergraduate and graduate level. It can also serve as a reference book for researchers looking for either interesting and specific problems to work on or useful techniques and discussions of some particular problems.
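
    A minimal example of the age-structured models the book treats is a Leslie matrix projection, in which a population vector of age classes is advanced by age-specific fecundity and survival. The numbers below are arbitrary, chosen only for illustration.

    ```python
    def leslie_step(ages, fecundity, survival):
        """One projection step of an age-structured (Leslie) population model:
        all age classes reproduce into class 0; survivors move up one class."""
        newborns = sum(f * n for f, n in zip(fecundity, ages))
        survivors = [s * n for s, n in zip(survival, ages[:-1])]
        return [newborns] + survivors

    def project(ages, fecundity, survival, steps):
        """Iterate the Leslie projection for the given number of steps."""
        for _ in range(steps):
            ages = leslie_step(ages, fecundity, survival)
        return ages
    ```

    After the initial transient dies out, the total population grows (or shrinks) by a fixed factor per step, the dominant eigenvalue of the Leslie matrix, and the age distribution converges to the corresponding eigenvector.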

  10. Realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2013-11-01

Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. According to this definition, some roofs may have faces isolated from the boundary of P or even local minima, which are undesirable for several practical reasons. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show that the straight skeleton induces a realistic roof with maximum height and volume. We also show that the maximum possible number of distinct realistic roofs over P is ((n-4)/2 choose ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n⁴) preprocessing time. We also present an O(n⁵)-time algorithm for computing a realistic roof with minimum height or volume. © 2013 Elsevier B.V.

  11. PIV-measured versus CFD-predicted flow dynamics in anatomically realistic cerebral aneurysm models.

    Science.gov (United States)

    Ford, Matthew D; Nikolov, Hristo N; Milner, Jaques S; Lownie, Stephen P; Demont, Edwin M; Kalata, Wojciech; Loth, Francis; Holdsworth, David W; Steinman, David A

    2008-04-01

    Computational fluid dynamics (CFD) modeling of nominally patient-specific cerebral aneurysms is increasingly being used as a research tool to further understand the development, prognosis, and treatment of brain aneurysms. We have previously developed virtual angiography to indirectly validate CFD-predicted gross flow dynamics against the routinely acquired digital subtraction angiograms. Toward a more direct validation, here we compare detailed, CFD-predicted velocity fields against those measured using particle imaging velocimetry (PIV). Two anatomically realistic flow-through phantoms, one a giant internal carotid artery (ICA) aneurysm and the other a basilar artery (BA) tip aneurysm, were constructed of a clear silicone elastomer. The phantoms were placed within a computer-controlled flow loop, programed with representative flow rate waveforms. PIV images were collected on several anterior-posterior (AP) and lateral (LAT) planes. CFD simulations were then carried out using a well-validated, in-house solver, based on micro-CT reconstructions of the geometries of the flow-through phantoms and inlet/outlet boundary conditions derived from flow rates measured during the PIV experiments. PIV and CFD results from the central AP plane of the ICA aneurysm showed a large stable vortex throughout the cardiac cycle. Complex vortex dynamics, captured by PIV and CFD, persisted throughout the cardiac cycle on the central LAT plane. Velocity vector fields showed good overall agreement. For the BA, aneurysm agreement was more compelling, with both PIV and CFD similarly resolving the dynamics of counter-rotating vortices on both AP and LAT planes. Despite the imposition of periodic flow boundary conditions for the CFD simulations, cycle-to-cycle fluctuations were evident in the BA aneurysm simulations, which agreed well, in terms of both amplitudes and spatial distributions, with cycle-to-cycle fluctuations measured by PIV in the same geometry. The overall good agreement
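
    The "good overall agreement" between measured and computed velocity vector fields can be summarized quantitatively in many ways. As one simple illustration (not the comparison metric used in the study), the sketch below averages the cosine similarity of matched PIV and CFD vectors on a plane; 1 means perfectly aligned directions, -1 means opposed.

    ```python
    import math

    def cosine(u, v):
        """Cosine of the angle between two velocity vectors."""
        nu = math.sqrt(sum(x * x for x in u))
        nv = math.sqrt(sum(x * x for x in v))
        return sum(a * b for a, b in zip(u, v)) / (nu * nv)

    def field_agreement(piv, cfd):
        """Mean directional agreement over matched measurement points."""
        return sum(cosine(u, v) for u, v in zip(piv, cfd)) / len(piv)
    ```

    A full validation would also compare vector magnitudes and their spatial distributions, as the paper does qualitatively plane by plane.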

  12. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  13. Oscillation and stability of delay models in biology

    CERN Document Server

    Agarwal, Ravi P; Saker, Samir H

    2014-01-01

    Environmental variation plays an important role in many biological and ecological dynamical systems. This monograph focuses on the study of oscillation and the stability of delay models occurring in biology. The book presents recent research results on the qualitative behavior of mathematical models under different physical and environmental conditions, covering dynamics including the distribution and consumption of food. Researchers in the fields of mathematical modeling, mathematical biology, and population dynamics will be particularly interested in this material.
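
    A standard textbook example of the delay models discussed here is Hutchinson's delayed logistic equation x'(t) = r x(t) (1 - x(t - τ)), whose equilibrium x = 1 is stable for rτ < π/2 and oscillates for larger delays. The Euler scheme below is an illustrative sketch with arbitrary parameter values, not taken from the monograph.

    ```python
    def delayed_logistic(r=2.0, tau=1.0, x0=0.5, T=50.0, dt=0.01):
        """Euler integration of the delayed logistic equation, keeping the
        whole history so the lagged value x(t - tau) can be looked up."""
        lag = int(tau / dt)
        xs = [x0] * (lag + 1)        # constant initial history on [-tau, 0]
        for _ in range(int(T / dt)):
            x = xs[-1]
            x_delayed = xs[-1 - lag]
            xs.append(x + dt * r * x * (1.0 - x_delayed))
        return xs
    ```

    With rτ = 2 (> π/2) the solution settles into sustained oscillations around 1, while with rτ = 1 it damps back to the equilibrium, exactly the qualitative effect of lags the notes analyse.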

  14. Logic-statistic modeling and analysis of biological sequence data

    DEFF Research Database (Denmark)

    Christiansen, Henning

    2007-01-01

    We describe here the intentions and plans of a newly started, funded research project in order to further the dialogue with the international research in the field. The purpose is to obtain experiences for realistic applications of flexible and powerful modeling tools that integrate logic and sta...

  15. Rapidly re-computable EEG (electroencephalography) forward models for realistic head shapes

    International Nuclear Information System (INIS)

    Ermer, J.J.; Mosher, J.C.; Baillet, S.; Leahy, R.M.

    2001-01-01

Solution of the EEG source localization (inverse) problem utilizing model-based methods typically requires a significant number of forward model evaluations. For subspace-based inverse methods like MUSIC (6), the total number of forward model evaluations can often approach the order of 10³ or 10⁴. Techniques based on least-squares minimization may require significantly more evaluations. The observed set of measurements over an M-sensor array is often expressed as a linear forward spatio-temporal model of the form: F = GQ + N (1), where the observed forward field F (M sensors × N time samples) can be expressed in terms of the forward model G, a set of dipole moment(s) Q (3 × P dipoles × N time samples) and additive noise N. Because of their simplicity, ease of computation, and relatively good accuracy, multi-layer spherical models (7) (or fast approximations described in (1), (7)) have traditionally been the 'forward model of choice' for approximating the human head. However, approximation of the human head via a spherical model does have several key drawbacks. By its very shape, the use of a spherical model distorts the true distribution of passive currents in the skull cavity. Spherical models also require that the sensor positions be projected onto the fitted sphere (Fig. 1), resulting in a distortion of the true sensor-dipole spatial geometry (and ultimately the computed surface potential). The use of a single 'best-fitted' sphere has the added drawback of incomplete coverage of the inner skull region, often ignoring areas such as the frontal cortex. In practice, this problem is typically countered by fitting additional sphere(s) to those region(s) not covered by the primary sphere. The use of these additional spheres results in added complication to the forward model. Using high-resolution spatial information obtained via X-ray CT or MR imaging, a realistic head model can be formed by tessellating the head into a set of contiguous regions (typically the scalp
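
    The linear forward model F = GQ + N invites a tiny worked example: with a known gain matrix G and noiseless measurements, the source amplitudes Q are recovered exactly by least squares. The sketch below uses invented numbers and just two scalar sources, so the normal equations reduce to a 2×2 solve; real forward models are far larger and are solved with proper linear-algebra libraries.

    ```python
    def forward(G, Q):
        """F = G Q: field at each of M sensors from P source amplitudes
        (noise N taken as zero for this illustration)."""
        return [sum(g * q for g, q in zip(row, Q)) for row in G]

    def least_squares_2(G, F):
        """Solve min ||F - G Q|| for two sources via the 2x2 normal
        equations (G^T G) Q = G^T F, inverted in closed form."""
        a11 = sum(row[0] * row[0] for row in G)
        a12 = sum(row[0] * row[1] for row in G)
        a22 = sum(row[1] * row[1] for row in G)
        b1 = sum(row[0] * f for row, f in zip(G, F))
        b2 = sum(row[1] * f for row, f in zip(G, F))
        det = a11 * a22 - a12 * a12
        return [(a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det]
    ```

    The cost driver the abstract describes is that G itself must be re-evaluated for every candidate dipole location, which is why fast forward models matter so much.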

  16. Recurrent Convolutional Neural Networks: A Better Model of Biological Object Recognition.

    Science.gov (United States)

    Spoerer, Courtney J; McClure, Patrick; Kriegeskorte, Nikolaus

    2017-01-01

    Feedforward neural networks provide the dominant model of how the brain performs visual object recognition. However, these networks lack the lateral and feedback connections, and the resulting recurrent neuronal dynamics, of the ventral visual pathway in the human and non-human primate brain. Here we investigate recurrent convolutional neural networks with bottom-up (B), lateral (L), and top-down (T) connections. Combining these types of connections yields four architectures (B, BT, BL, and BLT), which we systematically test and compare. We hypothesized that recurrent dynamics might improve recognition performance in the challenging scenario of partial occlusion. We introduce two novel occluded object recognition tasks to test the efficacy of the models, digit clutter (where multiple target digits occlude one another) and digit debris (where target digits are occluded by digit fragments). We find that recurrent neural networks outperform feedforward control models (approximately matched in parametric complexity) at recognizing objects, both in the absence of occlusion and in all occlusion conditions. Recurrent networks were also found to be more robust to the inclusion of additive Gaussian noise. Recurrent neural networks are better in two respects: (1) they are more neurobiologically realistic than their feedforward counterparts; (2) they are better in terms of their ability to recognize objects, especially under challenging conditions. This work shows that computer vision can benefit from using recurrent convolutional architectures and suggests that the ubiquitous recurrent connections in biological brains are essential for task performance.

  17. Calculation of electrical potentials on the surface of a realistic head model by finite differences

    International Nuclear Information System (INIS)

    Lemieux, L.; McBride, A.; Hand, J.W.

    1996-01-01

    We present a method for the calculation of electrical potentials at the surface of realistic head models from a point dipole generator based on a 3D finite-difference algorithm. The model was validated by comparing calculated values with those obtained algebraically for a three-shell spherical model. For a 1.25 mm cubic grid size, the mean error was 4.9% for a superficial dipole (3.75 mm from the inner surface of the skull) pointing in the radial direction. The effect of generator discretization and node spacing on the accuracy of the model was studied. Three values of the node spacing were considered: 1, 1.25 and 1.5 mm. The mean relative errors were 4.2, 6.3 and 9.3%, respectively. The quality of the approximation of a point dipole by an array of nodes in a spherical neighbourhood did not depend significantly on the number of nodes used. The application of the method to a conduction model derived from MRI data is demonstrated. (author)
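
    The core of such a finite-difference solver can be illustrated in two dimensions: away from sources, the potential satisfies Laplace's equation, and Jacobi relaxation repeatedly replaces each interior node by the average of its four neighbours. The grid size, boundary values and iteration count below are arbitrary illustration values, not those of the paper's 3D head model.

    ```python
    def solve_laplace(n=20, iters=2000):
        """Jacobi relaxation for the 2D Laplace equation on an n x n grid,
        with the top edge held at 1 (V) and the other edges at 0 (V)."""
        v = [[0.0] * n for _ in range(n)]
        for j in range(n):
            v[0][j] = 1.0                     # Dirichlet boundary: top edge
        for _ in range(iters):
            nv = [row[:] for row in v]
            for i in range(1, n - 1):
                for j in range(1, n - 1):
                    nv[i][j] = 0.25 * (v[i-1][j] + v[i+1][j]
                                       + v[i][j-1] + v[i][j+1])
            v = nv
        return v
    ```

    The paper's method extends this idea to 3D with tissue-dependent conductivities and a discretized dipole source, where, as reported, the node spacing directly controls the accuracy.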

  18. Realistic modelling of observed seismic motion in complex sedimentary basins

    International Nuclear Information System (INIS)

    Faeh, D.; Panza, G.F.

    1994-03-01

Three applications of a numerical technique are illustrated to model realistically the seismic ground motion for complex two-dimensional structures. First we consider a sedimentary basin in the Friuli region, and we model strong motion records from an aftershock of the 1976 earthquake. Then we simulate the ground motion caused in Rome by the 1915 Fucino (Italy) earthquake, and we compare our modelling with the damage distribution observed in the town. Finally we deal with the interpretation of ground motion recorded in Mexico City as a consequence of earthquakes in the Mexican subduction zone. The synthetic signals explain the major characteristics (relative amplitudes, spectral amplification, frequency content) of the considered seismograms, and the space distribution of the available macroseismic data. For the sedimentary basin in the Friuli area, parametric studies demonstrate the relevant sensitivity of the computed ground motion to small changes in the subsurface topography of the sedimentary basin, and in the velocity and quality factor of the sediments. The total energy of ground motion, determined from our numerical simulation in Rome, is in very good agreement with the distribution of damage observed during the Fucino earthquake. For epicentral distances in the range 50 km to 100 km, the source location, and not only the local soil conditions, controls the local effects. For Mexico City, the observed ground motion can be explained as resonance effects and as excitation of local surface waves, and the theoretical and observed maximum spectral amplifications are very similar. In general, our numerical simulations permit the estimation of the maximum and average spectral amplification for specific sites, i.e. they are a very powerful tool for accurate micro-zonation. (author). 38 refs, 19 figs, 1 tab

  19. A Realistic Human Exposure Assessment of Indoor Radon released from Groundwater

    International Nuclear Information System (INIS)

    Yu, Dong Han; Han, Moon Hee

    2002-01-01

The work presents a realistic human exposure assessment of indoor radon released from groundwater in a house. First, a two-compartment model is developed to describe the generation and transfer of radon into indoor air from groundwater. The model is used to estimate indoor-air radon concentration profiles in a house arising from showering, washing clothes, and flushing toilets. The study then performs an uncertainty analysis of model input parameters to quantify the uncertainty in the radon concentration profile. In order to estimate the daily internal dose to a specific tissue group in an adult through inhalation of such indoor radon, a PBPK (physiologically based pharmacokinetic) model is developed. The indoor radon profile and the PBPK model are then combined into a realistic human assessment of such exposure. The results obtained from this study can be used to evaluate the human inhalation risk associated with indoor radon released from groundwater
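
    To make the compartment idea concrete, here is a deliberately simplified one-compartment sketch (the paper itself uses two compartments): water-use events inject radon pulses into well-mixed indoor air, which is then cleared by ventilation plus radioactive decay. The pulse size, ventilation rate and event times are invented for illustration.

    ```python
    import math

    def indoor_radon(source_times, pulse_bq_m3=20.0, vent_per_h=0.5,
                     decay_per_h=math.log(2) / 91.8, hours=24.0, dt=0.01):
        """Indoor-air radon concentration (Bq/m^3) over one day.
        Each event in source_times (hours) adds an instantaneous pulse,
        e.g. a shower; between pulses the concentration decays with the
        combined ventilation + radioactive-decay rate (Rn-222 half-life
        ~91.8 h)."""
        removal = vent_per_h + decay_per_h
        c, traj, k = 0.0, [], 0
        events = sorted(source_times)
        for n in range(int(hours / dt)):
            t = n * dt
            while k < len(events) and events[k] <= t:
                c += pulse_bq_m3
                k += 1
            c -= removal * c * dt          # forward-Euler clearance step
            traj.append((t, c))
        return traj
    ```

    The resulting profile shows sharp peaks at each water-use event followed by roughly exponential decay, the kind of time series that then feeds the PBPK dose model.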

  20. Predictive modelling of complex agronomic and biological systems.

    Science.gov (United States)

    Keurentjes, Joost J B; Molenaar, Jaap; Zwaan, Bas J

    2013-09-01

Biological systems are tremendously complex in their functioning and regulation. Studying the multifaceted behaviour and describing the performance of such complexity has challenged the scientific community for years. The reduction of real-world intricacy into simple descriptive models has therefore convinced many researchers of the usefulness of introducing mathematics into biological sciences. Predictive modelling takes such an approach another step further in that it takes advantage of existing knowledge to project the performance of a system in alternative scenarios. The ever-growing amounts of available data generated by assessing biological systems at increasingly higher detail provide unique opportunities for future modelling and experiment design. Here we aim to provide an overview of the progress made in modelling over time and the currently prevalent approaches for iterative modelling cycles in modern biology. We will further argue for the importance of versatility in modelling approaches, including parameter estimation, model reduction and network reconstruction. Finally, we will discuss the difficulties in overcoming the mathematical interpretation of in vivo complexity and address some of the future challenges lying ahead. © 2013 John Wiley & Sons Ltd.

  1. Time lags in biological models

    CERN Document Server

    MacDonald, Norman

    1978-01-01

    In many biological models it is necessary to allow the rates of change of the variables to depend on the past history, rather than only the current values, of the variables. The models may require discrete lags, with the use of delay-differential equations, or distributed lags, with the use of integro-differential equations. In these lecture notes I discuss the reasons for including lags, especially distributed lags, in biological models. These reasons may be inherent in the system studied, or may be the result of simplifying assumptions made in the model used. I examine some of the techniques available for studying the solution of the equations. A large proportion of the material presented relates to a special method that can be applied to a particular class of distributed lags. This method uses an extended set of ordinary differential equations. I examine the local stability of equilibrium points, and the existence and frequency of periodic solutions. I discuss the qualitative effects of lags, and how these...

  2. Experimental Section: On the magnetic field distribution generated by a dipolar current source situated in a realistically shaped compartment model of the head

    NARCIS (Netherlands)

    Meijs, J.W.H.; Bosch, F.G.C.; Peters, M.J.; Lopes da silva, F.H.

    1987-01-01

    The magnetic field distribution around the head is simulated using a realistically shaped compartment model of the head. The model is based on magnetic resonance images. The 3 compartments describe the brain, the skull and the scalp. The source is represented by a current dipole situated in the

  3. Design and validation of realistic breast models for use in multiple alternative forced choice virtual clinical trials.

    Science.gov (United States)

    Elangovan, Premkumar; Mackenzie, Alistair; Dance, David R; Young, Kenneth C; Cooke, Victoria; Wilkinson, Louise; Given-Wilson, Rosalind M; Wallis, Matthew G; Wells, Kevin

    2017-04-07

A novel method has been developed for generating quasi-realistic voxel phantoms which simulate the compressed breast in mammography and digital breast tomosynthesis (DBT). The models are suitable for use in virtual clinical trials requiring realistic anatomy which use the multiple alternative forced choice (AFC) paradigm and patches from the complete breast image. The breast models are produced by extracting features of breast tissue components from DBT clinical images including skin, adipose and fibro-glandular tissue, blood vessels and Cooper's ligaments. A range of different breast models can then be generated by combining these components. Visual realism was validated using a receiver operating characteristic (ROC) study of patches from simulated images calculated using the breast models and from real patient images. Quantitative analysis was undertaken using fractal dimension and power spectrum analysis. The average areas under the ROC curves for 2D and DBT images were 0.51 ± 0.06 and 0.54 ± 0.09, demonstrating that simulated and real images were statistically indistinguishable by expert breast readers (7 observers); errors represented as one standard error of the mean. The average fractal dimensions (2D, DBT) for real and simulated images were (2.72 ± 0.01, 2.75 ± 0.01) and (2.77 ± 0.03, 2.82 ± 0.04) respectively; errors represented as one standard error of the mean. Excellent agreement was found between power spectrum curves of real and simulated images, with average β values (2D, DBT) of (3.10 ± 0.17, 3.21 ± 0.11) and (3.01 ± 0.32, 3.19 ± 0.07) respectively; errors represented as one standard error of the mean. These results demonstrate that radiological images of these breast models realistically represent the complexity of real breast structures and can be used to simulate patches from mammograms and DBT images that are indistinguishable from
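
    The ROC areas of roughly 0.5 quoted above correspond to observers performing at chance. The Mann-Whitney form of the AUC used in such analyses can be computed directly from paired realism ratings; the ratings in the sketch below are invented, and this is a generic AUC computation rather than the study's analysis code.

    ```python
    def auc(scores_real, scores_sim):
        """Mann-Whitney form of the area under the ROC curve: the probability
        that a randomly chosen 'real' patch outscores a randomly chosen
        simulated one, with ties counted as half. AUC near 0.5 means the two
        sets of ratings are indistinguishable."""
        wins = ties = 0
        for a in scores_real:
            for b in scores_sim:
                if a > b:
                    wins += 1
                elif a == b:
                    ties += 1
        return (wins + 0.5 * ties) / (len(scores_real) * len(scores_sim))
    ```

    Identical rating distributions give AUC = 0.5 (chance); perfectly separated ratings give 1.0 or 0.0.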

  4. Electromagnetic forward modelling for realistic Earth models using unstructured tetrahedral meshes and a meshfree approach

    Science.gov (United States)

    Farquharson, C.; Long, J.; Lu, X.; Lelievre, P. G.

    2017-12-01

    makes the process of building a geophysical Earth model from a geological model much simpler. In this presentation we will explore the issues that arise when working with realistic Earth models and when synthesizing geophysical electromagnetic data for them. We briefly consider meshfree methods as a possible means of alleviating some of these issues.

  5. Creating a Structurally Realistic Finite Element Geometric Model of a Cardiomyocyte to Study the Role of Cellular Architecture in Cardiomyocyte Systems Biology.

    Science.gov (United States)

    Rajagopal, Vijay; Bass, Gregory; Ghosh, Shouryadipta; Hunt, Hilary; Walker, Cameron; Hanssen, Eric; Crampin, Edmund; Soeller, Christian

    2018-04-18

With the advent of three-dimensional (3D) imaging technologies such as electron tomography, serial-block-face scanning electron microscopy and confocal microscopy, the scientific community has unprecedented access to large datasets at sub-micrometer resolution that characterize the architectural remodeling that accompanies changes in cardiomyocyte function in health and disease. However, these datasets have been under-utilized for investigating the role of cellular architecture remodeling in cardiomyocyte function. The purpose of this protocol is to outline how to create an accurate finite element model of a cardiomyocyte using high resolution electron microscopy and confocal microscopy images. A detailed and accurate model of cellular architecture has significant potential to provide new insights into cardiomyocyte biology, more than experiments alone can garner. The power of this method lies in its ability to computationally fuse information from two disparate imaging modalities of cardiomyocyte ultrastructure to develop one unified and detailed model of the cardiomyocyte. This protocol outlines steps to integrate electron tomography and confocal microscopy images of adult male Wistar rat (a strain of albino rat) cardiomyocytes to develop a half-sarcomere finite element model of the cardiomyocyte. The procedure generates a 3D finite element model that contains an accurate, high-resolution depiction (on the order of ~35 nm) of the distribution of mitochondria, myofibrils and ryanodine receptor clusters that release the necessary calcium for cardiomyocyte contraction from the sarcoplasmic reticular network (SR) into the myofibril and cytosolic compartment. The model generated here as an illustration does not incorporate details of the transverse-tubule architecture or the sarcoplasmic reticular network and is therefore a minimal model of the cardiomyocyte. Nevertheless, the model can already be applied in simulation-based investigations into the

  6. Laser interaction with biological material mathematical modeling

    CERN Document Server

    Kulikov, Kirill

    2014-01-01

    This book covers the principles of laser interaction with biological cells and tissues of varying degrees of organization. The problems of biomedical diagnostics are considered. Scattering of laser irradiation of blood cells is modeled for biological structures (dermis, epidermis, vascular plexus). An analytic theory is provided which is based on solving the wave equation for the electromagnetic field. It allows the accurate analysis of interference effects arising from the partial superposition of scattered waves. Treated topics of mathematical modeling are: optical characterization of biological tissue with large-scale and small-scale inhomogeneities in the layers, heating blood vessel under laser irradiation incident on the outer surface of the skin and thermo-chemical denaturation of biological structures at the example of human skin.

  7. Blend Shape Interpolation and FACS for Realistic Avatar

    Science.gov (United States)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila

    2015-03-01

The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus towards the rapid advancement of complex virtual human facial models. Face-to-face communication being the most natural way of human interaction, facial animation systems have become more attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors is still a challenging issue. The proposed facial expressions carry the signature of happiness, sadness, anger, cheerfulness, etc. The mood of a particular person in the midst of a large group can immediately be identified via very subtle changes in facial expressions. Facial expressions, being a very complex as well as important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach by integrating blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. BSI is used to generate the natural face while FACS is employed to reflect the exact facial muscle movements for four basic natural emotional expressions, namely anger, happiness, sadness and fear, with high fidelity. The results in perceiving realistic facial expressions for virtual human emotions based on facial skin color and texture may contribute towards the development of virtual reality and game environments in computer-aided graphics animation systems.
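
    Linear blend-shape interpolation, as referenced above, amounts to adding weighted offsets of target shapes to a neutral mesh. The vertex data and shape names below are made up for illustration; real systems blend thousands of vertices and drive the weights from FACS action units.

    ```python
    def blend(neutral, targets, weights):
        """Linear blend-shape interpolation:
        vertex = neutral + sum_k w_k * (target_k - neutral).
        Each shape is a flat list of vertex coordinates of equal length."""
        out = list(neutral)
        for w, tgt in zip(weights, targets):
            for i, (t, n) in enumerate(zip(tgt, neutral)):
                out[i] += w * (t - n)   # add this target's weighted offset
        return out
    ```

    Setting one weight to 1 and the rest to 0 reproduces that target exactly; fractional weights mix expressions, which is how intermediate emotional states are posed.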

  8. Development of Realistic Head Models for Electromagnetic Source Imaging of the Human Brain

    National Research Council Canada - National Science Library

    Akalin, Z

    2001-01-01

... images is performed. Then triangular, quadratic meshes are formed for the interfaces of the tissues. Thus, realistic meshes, representing scalp, skull, CSF, brain and eye tissues, are formed. At least...

  9. International Management: Creating a More Realistic Global Planning Environment.

    Science.gov (United States)

    Waldron, Darryl G.

    2000-01-01

    Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…

  10. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate a greater safety margin than a classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in a BELOCA analysis, two kinds of uncertainties generally need to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best-estimate LOCA code such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Overall, the DRHM can generate about 80-100 K of margin on PCT compared to an Appendix K bounding-state LOCA analysis.

  11. Successful N2 leptogenesis with flavour coupling effects in realistic unified models

    International Nuclear Information System (INIS)

    Bari, Pasquale Di; King, Stephen F.

    2015-01-01

    In realistic unified models involving so-called SO(10)-inspired patterns of Dirac and heavy right-handed (RH) neutrino masses, the lightest right-handed neutrino N₁ is too light to yield successful thermal leptogenesis, barring highly fine-tuned solutions, while the second-heaviest right-handed neutrino N₂ is typically in the correct mass range. We show that flavour coupling effects in the Boltzmann equations may be crucial to the success of such N₂-dominated leptogenesis, by helping to ensure that the flavour asymmetries produced at the N₂ scale survive N₁ washout. To illustrate these effects we focus on N₂-dominated leptogenesis in an existing model, the A to Z of flavour with Pati-Salam, where the neutrino Dirac mass matrix may be equal to an up-type quark mass matrix and has a particular constrained structure. The numerical results, supported by analytical insight, show that achieving successful N₂ leptogenesis consistent with neutrino phenomenology requires a "flavour swap scenario" together with a less hierarchical pattern of RH neutrino masses than naively expected, at the expense of some mild fine-tuning. In the considered A to Z model, neutrino masses are predicted to be normal ordered, with an atmospheric neutrino mixing angle well into the second octant and the Dirac phase δ ≃ 20°, a set of predictions that will be tested in the coming years in neutrino oscillation experiments. Flavour coupling effects may be relevant for other SO(10)-inspired unified models where N₂ leptogenesis is necessary.

  12. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB

    KAUST Repository

    Klingbeil, G.; Erban, R.; Giles, M.; Maini, P. K.

    2011-01-01

    Motivation: The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new

  13. SEEK: a systems biology data and model management platform.

    Science.gov (United States)

    Wolstencroft, Katherine; Owen, Stuart; Krebs, Olga; Nguyen, Quyen; Stanford, Natalie J; Golebiewski, Martin; Weidemann, Andreas; Bittkowski, Meik; An, Lihua; Shockley, David; Snoep, Jacky L; Mueller, Wolfgang; Goble, Carole

    2015-07-11

    Systems biology research typically involves the integration and analysis of heterogeneous data types in order to model and predict biological processes. Researchers therefore require tools and resources to facilitate the sharing and integration of data, and for linking data to systems biology models. There are a large number of public repositories for storing biological data of a particular type, for example transcriptomics or proteomics, and there are several model repositories. However, this silo-type storage of data and models is not conducive to systems biology investigations, where interdependencies between multiple omics datasets, and between datasets and models, are essential. Researchers require an environment that allows the management and sharing of heterogeneous data and models in the context of the experiments which created them. SEEK is a suite of tools to support the management, sharing and exploration of data and models in systems biology. The SEEK platform provides an access-controlled, web-based environment for scientists to share and exchange data and models for day-to-day collaboration and for public dissemination. A plug-in architecture allows the linking of experiments, their protocols, data, models and results in a configurable system that is available 'off the shelf'. Tools to run model simulations, plot experimental data and assist with data annotation and standardisation combine to produce a collection of resources that support analysis as well as sharing. Underlying semantic web resources additionally extract and serve SEEK metadata in RDF (Resource Description Framework). SEEK RDF enables rich semantic queries, both within SEEK and between related resources in the web of Linked Open Data. The SEEK platform has been adopted by many systems biology consortia across Europe. It is a data management environment that has a low barrier of uptake and provides rich resources for collaboration. This paper provides an update on the functions and...

  14. Continuum Modeling of Biological Network Formation

    KAUST Repository

    Albi, Giacomo; Burger, Martin; Haskovec, Jan; Markowich, Peter A.; Schlottbom, Matthias

    2017-01-01

    We present an overview of recent analytical and numerical results for the elliptic–parabolic system of partial differential equations proposed by Hu and Cai, which models the formation of biological transportation networks. The model describes

  15. Theoretical Biology and Medical Modelling: ensuring continued growth and future leadership.

    Science.gov (United States)

    Nishiura, Hiroshi; Rietman, Edward A; Wu, Rongling

    2013-07-11

    Theoretical biology encompasses a broad range of biological disciplines ranging from mathematical biology and biomathematics to philosophy of biology. Adopting a broad definition of "biology", Theoretical Biology and Medical Modelling, an open access journal, considers original research studies that focus on theoretical ideas and models associated with developments in biology and medicine.

  16. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel

    2011-05-01

    In this work, we describe an approach for modeling and simulating group behaviors for pursuit-evasion that uses a graph-based representation of the environment and integrates multi-agent simulation with roadmap-based path planning. Our approach can be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility computations that allow evading agents to hide in crowds or behind hills. We demonstrate the utility of this approach on mobile robots and in simulation for a variety of scenarios including pursuit-evasion and tag on terrains, in multi-level buildings, and in crowds. © 2011 IEEE.

  17. On the limitations of standard statistical modeling in biological systems: a full Bayesian approach for biology.

    Science.gov (United States)

    Gomez-Ramirez, Jaime; Sanz, Ricardo

    2013-09-01

    One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Quantitative relationship between SAR and temperature rise inside eyeball in a realistic human heat model for 1.5 GHz-microwave exposure; 1.5GHz maikuroha wo abita tobu real model ni okeru gankyunai no hikyushuritsu to josho ondo tono teiryo kankei

    Energy Technology Data Exchange (ETDEWEB)

    Takai, K.; Fujiwara, O. [Nagoya Institute of Technology, Nagoya (Japan)

    1997-12-20

    For investigating the biological effects of a localized SAR (specific absorption rate) deposited in a human body by electromagnetic wave exposure, it is indispensable to grasp the temperature rise inside the human brain, which includes the control center for body temperature. This paper numerically analyzes the temperature rise inside an eyeball of our realistic head model for 1.5 GHz microwave exposure, using the FD-TD (finite-difference time-domain) method. The computed results are validated by comparison with the data obtained by Taflove and his colleague. In order to examine a quantitative relationship between the localized SAR and the temperature rise, we also obtained the tissue amount over which the localized SAR should be averaged so as to reflect well the temperature-rise distribution inside the eyeball. 15 refs., 9 figs., 3 tabs.

  19. Stochastic biomathematical models with applications to neuronal modeling

    CERN Document Server

    Batzel, Jerry; Ditlevsen, Susanne

    2013-01-01

    Stochastic biomathematical models are becoming increasingly important as new light is shed on the role of noise in living systems. In certain biological systems, stochastic effects may even enhance a signal, thus providing a biological motivation for the noise observed in living systems. Recent advances in stochastic analysis and increasing computing power facilitate the analysis of more biophysically realistic models, and this book provides researchers in computational neuroscience and stochastic systems with an overview of recent developments. Key concepts are developed in chapters written by experts in their respective fields. Topics include: one-dimensional homogeneous diffusions and their boundary behavior, large deviation theory and its application in stochastic neurobiological models, a review of mathematical methods for stochastic neuronal integrate-and-fire models, stochastic partial differential equation models in neurobiology, and stochastic modeling of spreading cortical depression.
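    As a toy illustration of the stochastic integrate-and-fire models surveyed in the book, a leaky integrate-and-fire neuron with additive noise can be simulated with the Euler-Maruyama scheme. The sketch below and all its parameter values are illustrative assumptions, not taken from the book:

```python
import math, random

def simulate_lif(T=1.0, dt=1e-4, tau=0.02, v_rest=0.0, v_th=1.0,
                 v_reset=0.0, current=60.0, sigma=0.5, seed=1):
    """Euler-Maruyama integration of dV = (-(V - v_rest)/tau + I) dt + sigma dW.

    Returns the list of spike times (threshold crossings with reset).
    """
    rng = random.Random(seed)
    v, spikes = v_rest, []
    for step in range(int(T / dt)):
        dw = rng.gauss(0.0, math.sqrt(dt))            # Wiener increment
        v += (-(v - v_rest) / tau + current) * dt + sigma * dw
        if v >= v_th:                                 # spike and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

spikes = simulate_lif()
```

    With these toy values the drift alone already drives the membrane over threshold, so the noise modulates spike timing rather than creating firing from scratch.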

  20. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D; Kim, K. D. [and others

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, GUI (Graphic User Interface) features were developed to enhance the user's convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and other applications.


  2. Biology meets Physics: Reductionism and Multi-scale Modeling of Morphogenesis

    DEFF Research Database (Denmark)

    Green, Sara; Batterman, Robert

    2017-01-01

    A common reductionist assumption is that macro-scale behaviors can be described "bottom-up" if only sufficient details about lower-scale processes are available. The view that an "ideal" or "fundamental" physics would be sufficient to explain all macro-scale phenomena has been met with criticism from philosophers of biology. Specifically, scholars have pointed to the impossibility of deducing biological explanations from physical ones, and to the irreducible nature of distinctively biological processes such as gene regulation and evolution. This paper takes a step back in asking whether bottom-up modeling is feasible even when modeling simple physical systems across scales. By comparing examples of multi-scale modeling in physics and biology, we argue that the "tyranny of scales" problem presents a challenge to reductive explanations in both physics and biology. The problem refers to the scale... modeling in developmental biology. In such contexts, the relation between models at different scales and from different disciplines is neither reductive nor completely autonomous, but interdependent.

  3. Respectful Modeling: Addressing Uncertainty in Dynamic System Models for Molecular Biology.

    Science.gov (United States)

    Tsigkinopoulou, Areti; Baker, Syed Murtuza; Breitling, Rainer

    2017-06-01

    Although there is still some skepticism in the biological community regarding the value and significance of quantitative computational modeling, important steps are continually being taken to enhance its accessibility and predictive power. We view these developments as essential components of an emerging 'respectful modeling' framework which has two key aims: (i) respecting the models themselves and facilitating the reproduction and update of modeling results by other scientists, and (ii) respecting the predictions of the models and rigorously quantifying the confidence associated with the modeling results. This respectful attitude will guide the design of higher-quality models and facilitate the use of models in modern applications such as engineering and manipulating microbial metabolism by synthetic biology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Photo Realistic 3d Modeling with Uav: GEDİK Ahmet Pasha Mosque in AFYONKARAHİSAR

    Science.gov (United States)

    Uysal, M.; Toprak, A. S.; Polat, N.

    2013-07-01

    Many of the cultural heritage sites in the world have been totally or partly destroyed by natural events and human activities such as earthquakes, floods and fires. Cultural heritage is not only a legacy for us but is also held in trust for the next generation, and to deliver it to future generations it has to be protected and recorded. There are different methods for such recording, but photogrammetry is the most accurate and rapid one: it enables us to record cultural heritage and to generate 3D photo-realistic models. Nowadays, 3D models are used in various fields such as education and tourism. In recording complex and tall structures by photogrammetry, there are problems in data acquisition and processing; photographing tall structures in particular requires additional equipment such as balloons and lifters. In recent years, unmanned aerial vehicles (UAVs) have come into common use in different fields for different goals. In photogrammetry, UAVs are used particularly for data acquisition, since it is not always easy to capture data given the situation of historical places and their neighbourhoods. The use of UAVs for documentation of cultural heritage can therefore make an important contribution. The main goals of this study are to survey cultural heritage by photogrammetry and to investigate the potential of UAVs in 3D modelling. For this purpose, we surveyed the Gedik Ahmet Pasha Mosque photogrammetrically by UAV and produced a photo-realistic 3D model. Gedik Ahmet Pasha, the Grand Vizier of Fatih Sultan Mehmet, was in Afyonkarahisar during the campaign to Karaman in 1472-1473. Out of admiration for the city, he had the architect Ayaz Agha build a complex of bathhouse, mosque and madrasah there; the Gedik Ahmet Pasha Mosque is at the centre of this complex.
    The Gedik Ahmet Pasha Mosque is popularly known as the Imaret Mosque among the people of Afyon.


  6. Continuous time Boolean modeling for biological signaling: application of Gillespie algorithm.

    OpenAIRE

    Stoll, Gautier; Viara, Eric; Barillot, Emmanuel; Calzone, Laurence

    2012-01-01

    Abstract: Mathematical modeling is used as a systems biology tool to answer biological questions and, more precisely, to validate a network that describes biological observations and to predict the effect of perturbations. This article presents an algorithm for modeling biological networks in a discrete framework with continuous time. Background: There exist two major types of mathematical modeling approaches: (1) quantitative modeling, representing various chemical species concentrations by real...
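    The continuous-time flavour of such discrete modeling can be sketched with a Gillespie-style simulation of a toy asynchronous Boolean network, where each node whose state disagrees with its logical rule flips after an exponentially distributed waiting time. The network, rates, and function names below are illustrative assumptions, not the article's algorithm:

```python
import random

def gillespie_boolean(state, rules, rates, t_max, seed=0):
    """Continuous-time asynchronous update of a Boolean network.

    state: list of booleans; rules[i](state) gives node i's target value;
    rates[i] is node i's flip rate when it disagrees with its rule.
    Returns the trajectory as (time, state-tuple) pairs.
    """
    rng = random.Random(seed)
    t, trajectory = 0.0, [(0.0, tuple(state))]
    while t < t_max:
        unstable = [i for i in range(len(state))
                    if rules[i](state) != state[i]]
        if not unstable:
            break                           # fixed point reached
        total = sum(rates[i] for i in unstable)
        t += rng.expovariate(total)         # exponential waiting time
        r, acc = rng.uniform(0, total), 0.0 # pick node by its rate
        for i in unstable:
            acc += rates[i]
            if r <= acc:
                state[i] = not state[i]
                break
        trajectory.append((t, tuple(state)))
    return trajectory

# Toy mutual-repression pair: each node is the negation of the other.
rules = [lambda s: not s[1], lambda s: not s[0]]
traj = gillespie_boolean([True, True], rules, [1.0, 1.0], t_max=10.0)
```

    Starting from (True, True), both nodes are unstable; a single stochastic flip lands the toy network in one of its two stable states.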

  7. Tuukka Kaidesoja on Critical Realist Transcendental Realism

    Directory of Open Access Journals (Sweden)

    Groff Ruth

    2015-09-01

    I argue that critical realists think pretty much what Tuukka Kaidesoja says that he himself thinks, but also that Kaidesoja's objections to the views that he attributes to critical realists are not persuasive.

  8. Dose related risk and effect assessment model (DREAM) -- A more realistic approach to risk assessment of offshore discharges

    International Nuclear Information System (INIS)

    Johnsen, S.; Furuholt, E.

    1995-01-01

    Risk assessment of discharges from offshore oil and gas production to the marine environment features determination of potential environmental concentration (PEC) levels and no observed effect concentration (NOEC) levels. The PEC values are normally based on dilution of chemical components from the actual discharge source in the recipient, while the NOEC values are determined by applying a safety factor to acute toxic effects from laboratory tests. The DREAM concept focuses on realistic exposure doses as a function of contact time and dilution, rather than fixed exposure concentrations of chemicals in long-time exposure regimes. In its present state, the DREAM model is based on a number of assumptions with respect to the link between real-life exposure doses and effects observed in laboratory tests. A research project has recently been initiated to develop the concept further, with special focus on chronic effects of different chemical compounds on the marine ecosystem. One of the questions that will be addressed is the link between exposure time, dose, concentration and effect. Validation of the safety factors applied for transforming acute toxicity data into NOEC values will also be included. The DREAM model has been used by Statoil for risk assessment of discharges from new and existing offshore oil and gas production fields, and has been found to give much more realistic results than conventional risk assessment tools. The presentation outlines the background for the DREAM approach, describes the model in its present state, discusses further developments and applications, and shows a number of examples of the performance of DREAM.

  9. Genome Scale Modeling in Systems Biology: Algorithms and Resources

    Science.gov (United States)

    Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali

    2014-01-01

    In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, a network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks, in five sections, and we try to illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031

  10. Investigations of sensitivity and resolution of ECG and MCG in a realistically shaped thorax model

    International Nuclear Information System (INIS)

    Mäntynen, Ville; Konttila, Teijo; Stenroos, Matti

    2014-01-01

    Solving the inverse problem of electrocardiography (ECG) and magnetocardiography (MCG) is often referred to as cardiac source imaging. Spatial properties of ECG and MCG as imaging systems are, however, not well known. In this modelling study, we investigate the sensitivity and point-spread function (PSF) of ECG, MCG, and combined ECG+MCG as a function of source position and orientation, globally around the ventricles: signal topographies are modelled using a realistically-shaped volume conductor model, and the inverse problem is solved using a distributed source model and linear source estimation with minimal use of prior information. The results show that the sensitivity depends not only on the modality but also on the location and orientation of the source and that the sensitivity distribution is clearly reflected in the PSF. MCG can better characterize tangential anterior sources (with respect to the heart surface), while ECG excels with normally-oriented and posterior sources. Compared to either modality used alone, the sensitivity of combined ECG+MCG is less dependent on source orientation per source location, leading to better source estimates. Thus, for maximal sensitivity and optimal source estimation, the electric and magnetic measurements should be combined. (paper)

  11. Introduction to stochastic models in biology

    DEFF Research Database (Denmark)

    Ditlevsen, Susanne; Samson, Adeline

    2013-01-01

    This chapter is concerned with continuous time processes, which are often modeled as a system of ordinary differential equations (ODEs). These models assume that the observed dynamics are driven exclusively by internal, deterministic mechanisms. However, real biological systems will always be exp...

  12. Realistic Paleobathymetry of the Cenomanian–Turonian (94 Ma) Boundary Global Ocean

    Directory of Open Access Journals (Sweden)

    Arghya Goswami

    2018-01-01

    At present, global paleoclimate simulations are prepared with bathtub-like, flat, featureless and steep-walled ocean bathymetry, which is neither realistic nor suitable. In this article, we present the first enhanced version of a reconstructed paleobathymetry for Cenomanian–Turonian (94 Ma) time at 0.1° × 0.1° resolution, which is both realistic and suitable for use in paleoclimate studies. This reconstruction is an extrapolation of a parameterized modern ocean bathymetry that combines simple geophysical models (the standard plate cooling model for the oceanic lithosphere based on ocean crustal age), global modern oceanic sediment thicknesses, and generalized shelf-slope-rise structures calibrated from a published global relief model of the modern world (ETOPO1) at active and passive continental margins. The base version of this Cenomanian–Turonian paleobathymetry reconstruction is then updated with known submarine large igneous provinces, plateaus, and seamounts to minimize the difference between the reconstructed paleobathymetry and the real bathymetry that once existed.
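    The age-depth parameterization underlying such reconstructions can be illustrated with the classic Parsons-Sclater form of the plate cooling model; the coefficients below are the common textbook values, not necessarily those used in the paper:

```python
import math

def seafloor_depth_m(age_myr):
    """Predicted seafloor depth (m) from oceanic crustal age (Myr).

    Young crust subsides as sqrt(age) (half-space cooling); older
    lithosphere flattens exponentially toward a limiting depth, as in
    the classic Parsons-Sclater parameterization.
    """
    if age_myr <= 70.0:
        return 2500.0 + 350.0 * math.sqrt(age_myr)
    return 6400.0 - 3200.0 * math.exp(-age_myr / 62.8)

ridge = seafloor_depth_m(0.0)    # ridge-crest depth, ~2500 m
old = seafloor_depth_m(64.0)     # mid-Cretaceous-aged crust
```

    A gridded reconstruction applies a relation of this kind cell by cell to a crustal-age grid, then adjusts for sediment load and margin structure.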

  13. Realistic terrain visualization based on 3D virtual world technology

    Science.gov (United States)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    The rapid advances in information technologies, e.g., networking, graphics processing, and virtual worlds, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments that support geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographic visualization, computer simulation, and virtual geographic environment applications. This paper introduces the concepts and technologies of virtual worlds and virtual geographic environments, and explores the integration of realistic terrain with other geographic objects and phenomena of the natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation for constructing a mirror world or a sandbox model of the Earth's landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed on the foundation of realistic terrain visualization in virtual environments.

  14. A literature survey of the biological effects and mechanics of electromagnetic radiation

    International Nuclear Information System (INIS)

    Zeh, K.A.

    1985-01-01

    The following report discusses the very controversial subject of electromagnetic interaction with the human body. The project was undertaken in the form of a literature survey investigating the biological mechanisms responsible for the interaction, the theoretical models and associated mathematical techniques required to model the human body, the resulting energy deposition in the human, and the factors which affect this. It was established that at present the most realistic model of man can be obtained using a block model and moment-method technique, with improved methods such as conjugate gradients or band approximation for the necessary matrix inversion. The impedance method of modelling could be very promising for future research. From the literature studied on biological effects, no scientific evidence was found which definitely proves or disproves that hazardous effects exist at low field intensities ( -2 ). The testes and the lens of the eye can be harmed, however, if the intensity is sufficient to cause a temperature rise of 1 degree Celsius in these organs.

  15. Realistic Simulations of Coronagraphic Observations with WFIRST

    Science.gov (United States)

    Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)

    2018-01-01

    We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.

  16. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
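    One of the model aspects named above, references to biological entities, admits a very simple similarity measure: the Jaccard index over the sets of entity identifiers two models annotate. The sketch below is illustrative only; the GO/ChEBI-style identifiers are hypothetical annotation sets, not taken from any real model.

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard index |A ∩ B| / |A ∪ B|; defined as 1.0 for two empty sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical annotation sets: entity identifiers referenced by two models.
model_a = {"GO:0006096", "CHEBI:15903", "CHEBI:4167"}
model_b = {"GO:0006096", "CHEBI:4167", "GO:0006007"}

print(round(jaccard(model_a, model_b), 2))  # 2 shared / 4 total -> 0.5
```

    A full similarity measure in the spirit of the abstract would combine several such aspect scores (network structure, equations, behaviour) with problem-specific weights.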

  18. Morphogenesis and pattern formation in biological systems experiments and models

    CERN Document Server

    Noji, Sumihare; Ueno, Naoto; Maini, Philip

    2003-01-01

    A central goal of current biology is to decode the mechanisms that underlie the processes of morphogenesis and pattern formation. Concerned with the analysis of those phenomena, this book covers a broad range of research fields, including developmental biology, molecular biology, plant morphogenesis, ecology, epidemiology, medicine, paleontology, evolutionary biology, mathematical biology, and computational biology. In Morphogenesis and Pattern Formation in Biological Systems: Experiments and Models, experimental and theoretical aspects of biology are integrated for the construction and investigation of models of complex processes. This collection of articles on the latest advances by leading researchers not only brings together work from a wide spectrum of disciplines, but also provides a stepping-stone to the creation of new areas of discovery.

  19. The use of biologically based cancer risk models in radiation epidemiology

    International Nuclear Information System (INIS)

    Krewski, D.; Zielinski, J.M.; Hazelton, W.D.; Garner, M.J.; Moolgavkar, S.H.

    2003-01-01

    Biologically based risk projection models for radiation carcinogenesis seek to describe the fundamental biological processes involved in neoplastic transformation of somatic cells into malignant cancer cells. A validated biologically based model, whose parameters have a direct biological interpretation, can also be used to extrapolate cancer risks to different exposure conditions with some confidence. In this article, biologically based models for radiation carcinogenesis, including the two-stage clonal expansion (TSCE) model and its extensions, are reviewed. The biological and mathematical bases for such models are described, and the implications of key model parameters for cancer risk assessment examined. Specific applications of versions of the TSCE model to important epidemiologic datasets are discussed, including the Colorado uranium miners' cohort; a cohort of Chinese tin miners; the lifespan cohort of atomic bomb survivors in Hiroshima and Nagasaki; and a cohort of over 200,000 workers included in the National Dose Registry (NDR) of Canada. (author)

  20. Mathematical models in biology bringing mathematics to life

    CERN Document Server

    Ferraro, Maria; Guarracino, Mario

    2015-01-01

    This book presents an exciting collection of contributions based on the workshop “Bringing Maths to Life”, held October 27-29, 2014 in Naples, Italy. The state-of-the-art research in biology and the statistical and analytical challenges posed by huge masses of data collection are treated in this work. Specific topics explored in depth surround the sessions and special invited sessions of the workshop and include genetic variability via differential expression, molecular dynamics and modeling, complex biological systems viewed from quantitative models, and microscopy image processing, to name several. In-depth discussions of the mathematical analysis required to extract insights from complex bodies of biological datasets, and of novel algorithms, methods and software tools for genetic variability, molecular dynamics, and complex biological systems, are presented in this book. Researchers and graduate students in biology, life science, and mathematics/statistics will find the content...

  1. Investigation of realistic PET simulations incorporating tumor patient's specificity using anthropomorphic models: Creation of an oncology database

    International Nuclear Information System (INIS)

    Papadimitroulas, Panagiotis; Efthimiou, Nikos; Nikiforidis, George C.; Kagadis, George C.; Loudos, George; Le Maitre, Amandine; Hatt, Mathieu; Tixier, Florent; Visvikis, Dimitris

    2013-01-01

    Purpose: The GATE Monte Carlo simulation toolkit is used for the implementation of realistic PET simulations incorporating tumor heterogeneous activity distributions. The reconstructed patient images include noise from the acquisition process, imaging system's performance restrictions and have limited spatial resolution. For those reasons, the measured intensity cannot be simply introduced in GATE simulations, to reproduce clinical data. Investigation of the heterogeneity distribution within tumors applying partial volume correction (PVC) algorithms was assessed. The purpose of the present study was to create a simulated oncology database based on clinical data with realistic intratumor uptake heterogeneity properties.Methods: PET/CT data of seven oncology patients were used in order to create a realistic tumor database investigating the heterogeneity activity distribution of the simulated tumors. The anthropomorphic models (NURBS based cardiac torso and Zubal phantoms) were adapted to the CT data of each patient, and the activity distribution was extracted from the respective PET data. The patient-specific models were simulated with the Monte Carlo Geant4 application for tomography emission (GATE) in three different levels for each case: (a) using homogeneous activity within the tumor, (b) using heterogeneous activity distribution in every voxel within the tumor as it was extracted from the PET image, and (c) using heterogeneous activity distribution corresponding to the clinical image following PVC. The three different types of simulated data in each case were reconstructed with two iterations and filtered with a 3D Gaussian postfilter, in order to simulate the intratumor heterogeneous uptake. Heterogeneity in all generated images was quantified using textural feature derived parameters in 3D according to the ground truth of the simulation, and compared to clinical measurements. Finally, profiles were plotted in central slices of the tumors, across lines with

  2. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs, and show a connection with the straight skeleton of P. We show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient C((n-4)/2, ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2011 Springer-Verlag.

  3. Learning (from) the errors of a systems biology model.

    Science.gov (United States)

    Engelhardt, Benjamin; Frőhlich, Holger; Kschischo, Maik

    2016-02-11

    Mathematical modelling is a labour-intensive process involving several iterations of testing on real data and manual model modifications. In biology, the domain knowledge guiding model development is in many cases itself incomplete and uncertain. A major problem in this context is that biological systems are open. Missed or unknown external influences as well as erroneous interactions in the model could thus lead to severely misleading results. Here we introduce the dynamic elastic-net, a data-driven mathematical method which automatically detects such model errors in ordinary differential equation (ODE) models. We demonstrate for real and simulated data how the dynamic elastic-net approach can be used to automatically (i) reconstruct the error signal, (ii) identify the target variables of model error, and (iii) reconstruct the true system state even for incomplete or preliminary models. Our work provides a systematic computational method facilitating the modelling of open biological systems under uncertain knowledge.
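    The error-signal reconstruction in (i) can be illustrated with a deliberately simplified toy: simulate a "true" system containing an input the nominal model misses, then recover that input from the observed trajectory. This is only a stand-in for the dynamic elastic-net, which works from partial observations and additionally regularises the estimate; here the full state is assumed observed and all values are illustrative.

```python
import math

# Nominal (incomplete) model: dx/dt = -x. The true system has an unknown
# extra input u(t) = 0.5*sin(t). With the full state observed, the residual
# (x[k+1] - x[k])/dt - f(x[k]) estimates u at each step.

def f(x):          # nominal model right-hand side
    return -x

def u(t):          # hidden error signal, unknown to the modeller
    return 0.5 * math.sin(t)

dt, n = 1e-3, 5000
xs, ts = [1.0], [0.0]
for k in range(n):                       # forward-Euler simulation of the *true* system
    xs.append(xs[-1] + dt * (f(xs[-1]) + u(ts[-1])))
    ts.append(ts[-1] + dt)

# Reconstruct the error signal from the observed trajectory.
w = [(xs[k + 1] - xs[k]) / dt - f(xs[k]) for k in range(n)]

print(abs(w[2500] - u(ts[2500])) < 1e-6)   # recovered input matches u(t) -> True
```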

  4. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Computational investigation of nonlinear microwave tomography on anatomically realistic breast phantoms

    DEFF Research Database (Denmark)

    Jensen, P. D.; Rubæk, Tonny; Mohr, J. J.

    2013-01-01

    The performance of a nonlinear microwave tomography algorithm is tested using simulated data from anatomically realistic breast phantoms. These tests include several different anatomically correct breast models from the University of Wisconsin-Madison repository with and without tumors inserted....

  6. In Silico Nanodosimetry: New Insights into Nontargeted Biological Responses to Radiation

    Directory of Open Access Journals (Sweden)

    Zdenka Kuncic

    2012-01-01

    nontargeted responses cannot be understood in the framework of DNA-centric radiobiological models; what is needed are new physically motivated models that address the damage-sensing signalling pathways triggered by the production of reactive free radicals. To this end, we have conducted a series of in silico experiments aimed at elucidating the underlying physical processes responsible for nontargeted biological responses to radiation. Our simulation studies implement new results on very low-energy electromagnetic interactions in liquid water (applicable down to nanoscales) and we also consider a realistic simulation of extranuclear microbeam irradiation of a cell. Our results support the idea that organelles with important functional roles, such as mitochondria and lysosomes, as well as membranes, are viable targets for ionizations and excitations, and their chemical composition and density are critical to determining the free radical yield and ensuing biological responses.

  7. Detailed performance analysis of realistic solar photovoltaic systems at extensive climatic conditions

    International Nuclear Information System (INIS)

    Gupta, Ankit; Chauhan, Yogesh K.

    2016-01-01

    In recent years, solar energy has been considered one of the principal renewable energy sources for electric power generation. In this paper, a single-diode photovoltaic (PV) system and a double/bypass-diode PV system are designed in the MATLAB/Simulink environment based on their mathematical modeling and are validated against a commercially available solar panel. The novelty of the paper is to include the effect of climatic conditions, i.e. variable irradiation level, wind speed, temperature, humidity level and dust accumulation, in the modeling of both PV systems to represent a realistic PV system. Comprehensive investigations are made on both modeled PV systems. The obtained results show satisfactory performance for the realistic models of the PV system. Furthermore, an in-depth comparative analysis is carried out for both PV systems. - Highlights: • Modeling of single-diode and double-diode PV systems in MATLAB/Simulink software. • Validation of designed PV systems with a commercially available PV panel. • Acquisition and employment of key climatic factors in modeling of the PV systems. • Evaluation of main model parameters of both the PV systems. • Detailed comparative assessment of both the modeled PV system parameters.
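    The single-diode model referred to above is the standard implicit equation I = Iph − I0·(exp((V + I·Rs)/(n·Vt)) − 1) − (V + I·Rs)/Rsh, which must be solved numerically for the current. A minimal Newton-iteration sketch follows; the parameter values are illustrative assumptions, not taken from the paper or any specific panel.

```python
import math

# Single-diode PV model: solve the implicit current equation at a given
# terminal voltage V with Newton's method. All constants are illustrative.
I_PH = 5.0      # photocurrent (A)
I_0 = 1e-9      # diode saturation current (A)
N_VT = 0.7      # n * Vt: ideality factor times thermal voltage (V), module level
R_S = 0.02      # series resistance (ohm)
R_SH = 100.0    # shunt resistance (ohm)

def residual(i, v):
    return I_PH - I_0 * (math.exp((v + i * R_S) / N_VT) - 1) - (v + i * R_S) / R_SH - i

def d_residual(i, v):   # derivative of the residual with respect to i
    return -I_0 * (R_S / N_VT) * math.exp((v + i * R_S) / N_VT) - R_S / R_SH - 1

def current(v, i0=0.0, tol=1e-10, max_iter=50):
    i = i0
    for _ in range(max_iter):
        step = residual(i, v) / d_residual(i, v)
        i -= step
        if abs(step) < tol:
            break
    return i

print(round(current(0.0), 3))   # short-circuit current (A) -> 4.999, close to I_PH
```

    Sweeping `current(v)` over a voltage range yields the familiar I-V and P-V curves whose shape the climatic factors in the paper (irradiance, temperature, dust) modulate through Iph, I0 and the resistances.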

  8. Bayesian inversion using a geologically realistic and discrete model space

    Science.gov (United States)

    Jaeggli, C.; Julien, S.; Renard, P.

    2017-12-01

    Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, there are still several major issues to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts of geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior of the underground. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specially trained to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to solve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient data handling of ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool. Nonetheless, it is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametrizations are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.
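    The accept/reject step at the heart of McMC-style inversion can be sketched in a few lines. The one-dimensional "model", the toy forward operator and the noise level below are illustrative stand-ins for a geostatistical realization and a groundwater forward simulation.

```python
import math
import random

# Minimal Metropolis sampler: propose a perturbed model, accept it with
# probability min(1, L(proposed)/L(current)) under a symmetric proposal.
random.seed(1)

OBS = 2.0                      # a single observed datum

def log_likelihood(m, sigma=0.1):
    pred = m * m               # toy forward model g(m) = m^2
    return -0.5 * ((pred - OBS) / sigma) ** 2

def metropolis(n_steps=20000, step=0.2):
    m, logl = 0.5, log_likelihood(0.5)
    samples = []
    for _ in range(n_steps):
        m_new = m + random.gauss(0.0, step)        # symmetric random-walk proposal
        logl_new = log_likelihood(m_new)
        if math.log(random.random()) < logl_new - logl:
            m, logl = m_new, logl_new              # accept the move
        samples.append(m)                          # rejected moves repeat m
    return samples

samples = metropolis()
post_mean = sum(samples[5000:]) / len(samples[5000:])   # discard burn-in
print(round(post_mean, 2))     # concentrates near sqrt(2), a root of g(m) = OBS
```

    In the paper's setting the proposal step is replaced by resimulating part of an MPS realization, which is what makes the chain independent of any particular parametrization.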

  9. Realistic electricity market simulator for energy and economic studies

    International Nuclear Information System (INIS)

    Bernal-Agustin, Jose L.; Contreras, Javier; Conejo, Antonio J.; Martin-Flores, Raul

    2007-01-01

    Electricity market simulators have become a useful tool to train engineers in the power industry. With the maturing of electricity markets throughout the world, there is a need for sophisticated software tools that can replicate the actual behavior of power markets. In most of these markets, power producers/consumers submit production/demand bids and the Market Operator clears the market producing a single price per hour. What makes markets different from each other are the bidding rules and the clearing algorithms to balance the market. This paper presents a realistic simulator of the day-ahead electricity market of mainland Spain. All the rules that govern this market are modeled. This simulator can be used either to train employees by power companies or to teach electricity markets courses in universities. To illustrate the tool, several realistic case studies are presented and discussed. (author)

  10. Kuhn: Realist or Antirealist?

    Directory of Open Access Journals (Sweden)

    Michel Ghins

    1998-06-01

    Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn, and what is its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.

  11. Empirical Evidence for Niss' "Implemented Anticipation" in Mathematising Realistic Situations

    Science.gov (United States)

    Stillman, Gloria; Brown, Jill P.

    2012-01-01

    Mathematisation of realistic situations is an on-going focus of research. Classroom data from a Year 9 class participating in a program of structured modelling of real situations was analysed for evidence of Niss's theoretical construct, implemented anticipation, during mathematisation. Evidence was found for two of three proposed aspects. In…

  12. Function of dynamic models in systems biology: linking structure to behaviour.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens

    2013-10-08

    Dynamic models in Systems Biology are used in computational simulation experiments for addressing biological questions. The complexity of the modelled biological systems and the growing number and size of the models call for computer support for modelling and simulation in Systems Biology. This computer support has to be based on formal representations of relevant knowledge fragments. In this paper we describe different functional aspects of dynamic models. This description is conceptually embedded in our "meaning facets" framework, which systematises the interpretation of dynamic models in structural, functional and behavioural facets. Here we focus on how function links the structure and the behaviour of a model. Models play a specific role (teleological function) in the scientific process of finding explanations for dynamic phenomena. In order to fulfil this role, a model has to be used in simulation experiments (pragmatical function). A simulation experiment always refers to a specific situation and a state of the model and the modelled system (conditional function). We claim that the function of dynamic models refers to both the simulation experiment executed by software (intrinsic function) and the biological experiment which produces the phenomena under investigation (extrinsic function). We use the presented conceptual framework for the function of dynamic models to review formal accounts for functional aspects of models in Systems Biology, such as checklists, ontologies, and formal languages. Furthermore, we identify missing formal accounts for some of the functional aspects. In order to fill one of these gaps we propose an ontology for the teleological function of models. We have thoroughly analysed the role and use of models in Systems Biology. The resulting conceptual framework for the function of models is an important first step towards a comprehensive formal representation of the functional knowledge involved in the modelling and simulation process

  13. Computational Biomechanics: Theoretical Background and Biological/Biomedical Problems

    CERN Document Server

    Tanaka, Masao; Nakamura, Masanori

    2012-01-01

    Rapid developments have taken place in biological/biomedical measurement and imaging technologies as well as in computer analysis and information technologies. The increase in data obtained with such technologies invites the reader into a virtual world that represents realistic biological tissue or organ structures in digital form and allows for simulation and what is called “in silico medicine.” This volume is the third in a textbook series and covers both the basics of continuum mechanics of biosolids and biofluids and the theoretical core of computational methods for continuum mechanics analyses. Several biomechanics problems are provided for better understanding of computational modeling and analysis. Topics include the mechanics of solid and fluid bodies, fundamental characteristics of biosolids and biofluids, computational methods in biomechanics analysis/simulation, practical problems in orthopedic biomechanics, dental biomechanics, ophthalmic biomechanics, cardiovascular biomechanics, hemodynamics...

  14. Realistic microscopic level densities for spherical nuclei

    International Nuclear Information System (INIS)

    Cerf, N.

    1994-01-01

    Nuclear level densities play an important role in nuclear reactions such as the formation of the compound nucleus. We develop a microscopic calculation of the level density based on a combinatorial evaluation from a realistic single-particle level scheme. This calculation makes use of a fast Monte Carlo algorithm allowing us to consider large shell model spaces which could not be treated previously in combinatorial approaches. Since our model relies on a microscopic basis, it can be applied to exotic nuclei with more confidence than the commonly used semiphenomenological formulas. An exhaustive comparison of our predicted neutron s-wave resonance spacings with experimental data for a wide range of nuclei is presented

  15. A Local Realistic Reconciliation of the EPR Paradox

    Science.gov (United States)

    Sanctuary, Bryan

    2014-03-01

    The exact violation of Bell's Inequalities is obtained with a local realistic model for spin. The model treats one particle that comprises a quantum ensemble and simulates the EPR data one coincidence at a time as a product state. Such a spin is represented by operators σx, iσy, σz in its body frame rather than the usual set σX, σY, σZ in the laboratory frame. This model, assumed valid in the absence of a measuring probe, contains both quantum polarizations and coherences. Each carries half the EPR correlation, but only half can be measured using coincidence techniques. The model further predicts the filter angles that maximize the spin correlation in EPR experiments.

  16. Quantitative computational models of molecular self-assembly in systems biology.

    Science.gov (United States)

    Thomas, Marcus; Schwartz, Russell

    2017-05-23

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.
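    A minimal instance of the quantitative self-assembly modelling discussed above is a Gillespie-style stochastic simulation of reversible dimerisation, A + A ⇌ A2. The rate constants and copy numbers below are illustrative assumptions, not drawn from the review.

```python
import random

# Gillespie stochastic simulation of reversible dimerisation A + A <-> A2,
# the simplest self-assembly reaction. Rates and counts are illustrative.
random.seed(0)

K_ON, K_OFF = 0.001, 0.1    # association / dissociation rate constants

def gillespie(n_a=200, n_a2=0, t_end=50.0):
    t = 0.0
    while t < t_end:
        a_on = K_ON * n_a * (n_a - 1) / 2.0   # propensity of A + A -> A2
        a_off = K_OFF * n_a2                  # propensity of A2 -> A + A
        a_tot = a_on + a_off
        if a_tot == 0.0:
            break
        t += random.expovariate(a_tot)        # exponential waiting time
        if random.random() < a_on / a_tot:    # pick the next reaction
            n_a, n_a2 = n_a - 2, n_a2 + 1
        else:
            n_a, n_a2 = n_a + 2, n_a2 - 1
    return n_a, n_a2

n_a, n_a2 = gillespie()
print(n_a + 2 * n_a2)   # mass conservation -> 200 monomer equivalents
```

    The combinatorial blow-up the review describes appears as soon as assemblies of many sizes coexist: each species-size pair then needs its own propensity, which is what makes naive enumeration intractable for realistic assembly pathways.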

  17. Setting Parameters for Biological Models With ANIMO

    NARCIS (Netherlands)

    Schivo, Stefano; Scholma, Jetse; Karperien, Hermanus Bernardus Johannes; Post, Janine Nicole; van de Pol, Jan Cornelis; Langerak, Romanus; André, Étienne; Frehse, Goran

    2014-01-01

    ANIMO (Analysis of Networks with Interactive MOdeling) is a software tool for modeling biological networks, such as signaling, metabolic or gene networks. An ANIMO model is essentially the sum of a network topology and a number of interaction parameters. The topology describes the interactions

  18. Stroke type differentiation using spectrally constrained multifrequency EIT: evaluation of feasibility in a realistic head model

    International Nuclear Information System (INIS)

    Malone, Emma; Jehl, Markus; Arridge, Simon; Betcke, Timo; Holder, David

    2014-01-01

    We investigate the application of multifrequency electrical impedance tomography (MFEIT) to imaging the brain in stroke patients. The use of MFEIT could enable early diagnosis and thrombolysis of ischaemic stroke, and therefore improve the outcome of treatment. Recent advances in the imaging methodology suggest that the use of spectral constraints could allow for the reconstruction of a one-shot image. We performed a simulation study to investigate the feasibility of imaging stroke in a head model with realistic conductivities. We introduced increasing levels of modelling errors to test the robustness of the method to the most common sources of artefact. We considered the case of errors in the electrode placement, spectral constraints, and contact impedance. The results indicate that errors in the position and shape of the electrodes can affect image quality, although our imaging method was successful in identifying tissues with sufficiently distinct spectra. (paper)

  19. Toward computational cumulative biology by combining models of biological datasets.

    Science.gov (United States)

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven, to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations; for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database.

  20. Biologically based neural circuit modelling for the study of fear learning and extinction

    Science.gov (United States)

    Nair, Satish S.; Paré, Denis; Vicentic, Aleksandra

    2016-11-01

    The neuronal systems that promote protective defensive behaviours have been studied extensively using Pavlovian conditioning. In this paradigm, an initially neutral conditioned stimulus is paired with an aversive unconditioned stimulus, leading the subjects to display behavioural signs of fear. Decades of research into the neural bases of this simple behavioural paradigm uncovered that the amygdala, a complex structure comprised of several interconnected nuclei, is an essential part of the neural circuits required for the acquisition, consolidation and expression of fear memory. However, emerging evidence from the confluence of electrophysiological, tract tracing, imaging, molecular, optogenetic and chemogenetic methodologies reveals that fear learning is mediated by multiple connections between several amygdala nuclei and their distributed targets, dynamical changes in plasticity in local circuit elements, as well as neuromodulatory mechanisms that promote synaptic plasticity. To uncover these complex relations and analyse multi-modal data sets acquired from these studies, we argue that biologically realistic computational modelling, in conjunction with experiments, offers an opportunity to advance our understanding of the neural circuit mechanisms of fear learning and to address how their dysfunction may lead to maladaptive fear responses in mental disorders.
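    Biologically realistic circuit models of the kind argued for above are typically built from spiking units such as the leaky integrate-and-fire (LIF) neuron (also one of the built-in models of the NCS6 simulator described earlier in this collection). A minimal single-neuron sketch, with illustrative membrane constants:

```python
# Minimal leaky integrate-and-fire (LIF) neuron. Constants are illustrative
# textbook-style values, not taken from any model in the abstracts above.
V_REST, V_THRESH, V_RESET = -65.0, -50.0, -65.0   # membrane potentials (mV)
TAU_M = 10.0                                       # membrane time constant (ms)
R_M = 10.0                                         # membrane resistance (Mohm)
DT = 0.1                                           # integration step (ms)

def simulate(i_ext, t_total=100.0):
    """Return spike times (ms) for a constant input current i_ext (nA)."""
    v, t, spikes = V_REST, 0.0, []
    while t < t_total:
        v += (-(v - V_REST) + R_M * i_ext) * (DT / TAU_M)  # leaky integration
        if v >= V_THRESH:
            spikes.append(t)      # threshold crossing: record spike and reset
            v = V_RESET
        t += DT
    return spikes

print(len(simulate(2.0)) > 0)    # suprathreshold drive (asymptote -45 mV) -> True
print(len(simulate(1.0)))        # subthreshold drive (asymptote -55 mV) -> 0
```

    Circuit-level fear models wire thousands of such units into amygdala nuclei, with the plasticity and neuromodulation mechanisms mentioned in the abstract acting on the synaptic weights between them.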

  1. Rational versus Emotional Reasoning in a Realistic Multi-Objective Environment

    OpenAIRE

    Mayboudi, Seyed Mohammad Hossein

    2011-01-01

    ABSTRACT: Emotional intelligence and its associated models have recently become an active area of study in the field of artificial intelligence. Several works have been performed on the modelling of emotional behaviours such as love, hate, happiness and sadness. This study presents a comparative evaluation of rational and emotional behaviours and the effects of emotions on the decision-making process of agents in a realistic multi-objective environment. NetLogo simulation environment is u...

  2. Thermodynamic modeling of transcription: sensitivity analysis differentiates biological mechanism from mathematical model-induced effects.

    Science.gov (United States)

    Dresch, Jacqueline M; Liu, Xiaozhou; Arnosti, David N; Ay, Ahmet

    2010-10-24

    Quantitative models of gene expression generate parameter values that can shed light on biological features such as transcription factor activity, cooperativity, and local effects of repressors. An important element in such investigations is sensitivity analysis, which determines how strongly a model's output reacts to variations in parameter values. Parameters of low sensitivity may not be accurately estimated, leading to unwarranted conclusions. Low sensitivity may reflect the nature of the biological data, or it may be a result of the model structure. Here, we focus on the analysis of thermodynamic models, which have been used extensively to analyze gene transcription. Extracted parameter values have been interpreted biologically, but until now little attention has been given to parameter sensitivity in this context. We apply local and global sensitivity analyses to two recent transcriptional models to determine the sensitivity of individual parameters. We show that in one case, values for repressor efficiencies are very sensitive, while values for protein cooperativities are not, and provide insights into why these differential sensitivities stem from both biological effects and the structure of the applied models. In a second case, we demonstrate that parameters that were thought to prove the system's dependence on activator-activator cooperativity are relatively insensitive. We show that there are numerous parameter sets that do not satisfy the relationships proffered as the optimal solutions, indicating that structural differences between the two types of transcriptional enhancers analyzed may not be as simple as altered activator cooperativity. Our results emphasize the need for sensitivity analysis to examine model construction and forms of biological data used for modeling transcriptional processes, in order to determine the significance of estimated parameter values for thermodynamic models. Knowledge of parameter sensitivities can provide the necessary
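A local sensitivity analysis of the kind described can be sketched with normalized finite differences on a toy thermodynamic-style expression model. The model form, parameter names and values below are invented for illustration and are not taken from the paper.

```python
# Toy thermodynamic-style expression model with an activator (binding
# strength and cooperativity) and a repressor efficiency; the functional
# form and all values are hypothetical.
def expression(a_bind, coop, r_eff, A=1.0, R=1.0):
    act = (a_bind * A) ** coop
    return act / (1.0 + act + r_eff * R)

# Local sensitivity: relative output change per relative parameter change
# (a normalized finite difference around a base point).
base = {"a_bind": 2.0, "coop": 1.5, "r_eff": 5.0}
y0 = expression(**base)
eps = 1e-6
sens = {}
for name, val in base.items():
    bumped = dict(base, **{name: val * (1 + eps)})
    sens[name] = ((expression(**bumped) - y0) / y0) / eps  # d log y / d log p
```

A parameter with a small `sens` value can be moved substantially without changing the model output much, which is exactly the situation in which its fitted value should not be over-interpreted biologically.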

  3. Computerised modelling for developmental biology : an exploration with case studies

    NARCIS (Netherlands)

    Bertens, Laura M.F.

    2012-01-01

    Many studies in developmental biology rely on the construction and analysis of models. This research presents a broad view of modelling approaches for developmental biology, with a focus on computational methods. An overview of modelling techniques is given, followed by several case studies. Using

  4. Notions of similarity for systems biology models.

    Science.gov (United States)

    Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knüpfer, Christian; Liebermeister, Wolfram; Waltemath, Dagmar

    2018-01-01

    Systems biology models are rapidly increasing in complexity, size and numbers. When building large models, researchers rely on software tools for the retrieval, comparison, combination and merging of models, as well as for version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of 'similarity' may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here we survey existing methods for the comparison of models, introduce quantitative measures for model similarity, and discuss potential applications of combined similarity measures. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on a combination of different model aspects. The six aspects that we define as potentially relevant for similarity are underlying encoding, references to biological entities, quantitative behaviour, qualitative behaviour, mathematical equations and parameters and network structure. We argue that future similarity measures will benefit from combining these model aspects in flexible, problem-specific ways to mimic users' intuition about model similarity, and to support complex model searches in databases. © The Author 2016. Published by Oxford University Press.
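A combined similarity measure over the six aspects listed in the abstract might be sketched as a weighted mean. The per-aspect scores and weights below are hypothetical; the paper only frames the theoretical approach, without fixing a formula.

```python
# The six model aspects named in the abstract.
ASPECTS = ["encoding", "bio_entities", "quant_behaviour",
           "qual_behaviour", "equations", "network"]

def combined_similarity(scores, weights):
    """Weighted mean of per-aspect similarities, each assumed to lie in [0, 1]."""
    total_w = sum(weights[a] for a in ASPECTS)
    return sum(weights[a] * scores[a] for a in ASPECTS) / total_w

# Two hypothetical models that share biology and network structure but
# differ in encoding and in quantitative details.
scores = {"encoding": 0.2, "bio_entities": 0.9, "quant_behaviour": 0.5,
          "qual_behaviour": 0.8, "equations": 0.4, "network": 0.9}

# A curation-oriented search might weight biological annotation highest.
weights = {"encoding": 0.5, "bio_entities": 2.0, "quant_behaviour": 1.0,
           "qual_behaviour": 1.0, "equations": 1.0, "network": 1.5}
sim = combined_similarity(scores, weights)
```

Different applications (version control, model merging, database search) would simply plug in different weight vectors, which is the "flexible, problem-specific" combination the authors argue for.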

  5. METABOLIC MODELLING IN THE DEVELOPMENT OF CELL FACTORIES BY SYNTHETIC BIOLOGY

    Directory of Open Access Journals (Sweden)

    Paula Jouhten

    2012-10-01

    Full Text Available Cell factories are commonly microbial organisms utilized for bioconversion of renewable resources to bulk or high value chemicals. Introduction of novel production pathways in chassis strains is the core of the development of cell factories by synthetic biology. Synthetic biology aims to create novel biological functions and systems not found in nature by combining biology with engineering. The workflow of the development of novel cell factories with synthetic biology is ideally linear, which will become attainable with the quantitative engineering approach, high-quality predictive models, and libraries of well-characterized parts. Different types of metabolic models, mathematical representations of metabolism and its components, enzymes and metabolites, are useful in particular phases of the synthetic biology workflow. In this minireview, the role of metabolic modelling in synthetic biology will be discussed with a review of current status of compatible methods and models for the in silico design and quantitative evaluation of a cell factory.

  6. Radical production in biological systems

    International Nuclear Information System (INIS)

    Johnson, J.R.; Akabani, G.

    1994-10-01

    This paper describes our effort to develop a metric for radiation exposure that is more fundamental than absorbed dose and upon which a metric for exposure to chemicals could be based. This metric is based on the production of radicals by the two agents. Radicals produced by radiation in biological systems are commonly assumed to be the same as those produced in water, despite the presence of a variety of complex molecules. This may explain why the extensive efforts to describe the relationship between energy deposition (track structure) and molecular damage to DNA, based on the spectrum of radicals produced, have not been successful in explaining simple biological effects such as cell killing. Current models assume that DNA and its basic elements are immersed in water-like media and only model the production and diffusion of water-based radicals and their interaction with DNA structures; these models lack the cross sections associated with each macro-component of DNA and only treat water-based radicals. It has been found that such models are not realistic because DNA is not immersed in pure water. A computer code capable of simulating electron tracks, low-energy electrons, energy deposition in small molecules, and radical production and diffusion in water-like media has been developed. This code is still at a primitive stage and development is continuing. It is being used to study radical production by radiation, and radical diffusion and interactions in simple molecular systems following their production. We are extending the code to radical production by chemicals to complement our PBPK modeling efforts. It therefore has been developed primarily for use with radionuclides that are in biological materials, and not for radiation fields

  7. Modeling of nonlinear biological phenomena modeled by S-systems.

    Science.gov (United States)

    Mansouri, Majdi M; Nounou, Hazem N; Nounou, Mohamed N; Datta, Aniruddha A

    2014-03-01

    A central challenge in computational modeling of biological systems is the determination of the model parameters. In such cases, estimating these variables or parameters from other easily obtained measurements can be extremely useful. For example, time-series dynamic genomic data can be used to develop models representing dynamic genetic regulatory networks, which can be used to design intervention strategies to cure major diseases and to better understand the behavior of biological systems. Unfortunately, biological measurements are usually highly corrupted by errors that hide the important characteristics in the data. Therefore, these noisy measurements need to be filtered to enhance their usefulness in practice. This paper addresses the problem of state and parameter estimation of biological phenomena modeled by S-systems using Bayesian approaches, where the nonlinear observed system is assumed to progress according to a probabilistic state space model. The performances of various conventional and state-of-the-art state estimation techniques are compared. These techniques include the extended Kalman filter (EKF), unscented Kalman filter (UKF), particle filter (PF), and the developed variational Bayesian filter (VBF). Specifically, two comparative studies are performed. In the first comparative study, the state variables (the enzyme CadA, the mRNA cadBA, the cadaverine Cadav and the lysine Lys for a model of the Cad System in Escherichia coli (CSEC)) are estimated from noisy measurements of these variables, and the various estimation techniques are compared by computing the estimation root mean square error (RMSE) with respect to the noise-free data. In the second comparative study, the state variables as well as the model parameters are simultaneously estimated. In this case, in addition to comparing the performances of the various state estimation techniques, the effect of the number of estimated model parameters on the accuracy and convergence of these
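Of the filters compared in this abstract, the bootstrap particle filter is the simplest to sketch. The toy one-variable S-system, constants and noise levels below are illustrative only; they are not the CSEC model from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-variable S-system: dx/dt = alpha*x^g - beta*x^h, Euler-discretized.
# All constants and noise levels are invented for illustration.
alpha, g, beta, h, dt = 2.0, 0.5, 1.0, 1.0, 0.1

def step(x):
    return x + dt * (alpha * x**g - beta * x**h)

# Simulate a "true" trajectory and noisy measurements of it.
T, meas_sd = 60, 0.2
x_true = np.empty(T)
x_true[0] = 0.5
for t in range(1, T):
    x_true[t] = step(x_true[t - 1])
y = x_true + meas_sd * rng.normal(size=T)

# Bootstrap particle filter: propagate particles through the dynamics,
# weight by measurement likelihood, estimate, then resample.
N = 500
particles = rng.uniform(0.1, 3.0, size=N)
est = np.empty(T)
for t in range(T):
    if t > 0:
        # Process noise; clip to keep x^0.5 well-defined.
        particles = np.clip(step(particles) + 0.05 * rng.normal(size=N),
                            1e-3, None)
    w = np.exp(-0.5 * ((y[t] - particles) / meas_sd) ** 2)
    w /= w.sum()
    est[t] = np.dot(w, particles)                    # posterior mean
    particles = rng.choice(particles, size=N, p=w)   # resample

rmse = np.sqrt(np.mean((est - x_true) ** 2))
```

The RMSE of the filtered estimate against the noise-free trajectory, computed exactly as in the paper's first comparative study, should come out well below the raw measurement noise level.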

  8. Markov Chain-Like Quantum Biological Modeling of Mutations, Aging, and Evolution

    Directory of Open Access Journals (Sweden)

    Ivan B. Djordjevic

    2015-08-01

    Full Text Available Recent evidence suggests that quantum mechanics is relevant in photosynthesis, magnetoreception, enzymatic catalytic reactions, olfactory reception, photoreception, genetics, electron-transfer in proteins, and evolution, to mention a few. In our recent paper published in Life, we derived the operator-sum representation of a biological channel based on codon basekets, and determined the quantum channel model suitable for study of the quantum biological channel capacity. However, this model is essentially memoryless and is not able to properly model the propagation of mutation errors in time, the process of aging, and the evolution of genetic information through generations. To solve these problems, we propose novel quantum mechanical models to accurately describe the creation of spontaneous, induced, and adaptive mutations and their propagation in time. The different biological channel models with memory proposed in this paper include: (i) a Markovian classical model, (ii) a Markovian-like quantum model, and (iii) a hybrid quantum-classical model. We then apply these models in a study of aging and evolution of quantum biological channel capacity through generations. We also discuss key differences of these models with respect to a multilevel symmetric channel-based Markovian model and a Kimura model-based Markovian process. These models are quite general and applicable to many open problems in biology, not only biological channel capacity, which is the main focus of the paper. We will show that the famous quantum Master equation approach, commonly used to describe different biological processes, is just the first-order approximation of the proposed quantum Markov chain-like model, when the observation interval tends to zero.
    One of the important implications of this model is that the aging phenotype becomes determined by different underlying transition probabilities in both programmed and random (damage) Markov chain-like models of aging, which
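The classical Markovian variant (model (i) in the abstract) can be sketched as a transition matrix propagated through generations. The states and transition probabilities below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Three illustrative genetic states with small per-generation mutation
# probabilities (a classical Markovian sketch, not the quantum model).
P = np.array([[0.98, 0.01, 0.01],
              [0.02, 0.97, 0.01],
              [0.01, 0.01, 0.98]])

p0 = np.array([1.0, 0.0, 0.0])     # start in the unmutated state

def after(n):
    """Distribution over states after n generations: p0 @ P^n."""
    return p0 @ np.linalg.matrix_power(P, n)

p10, p500 = after(10), after(500)

# Stationary distribution: the left eigenvector of P for eigenvalue 1,
# i.e. the long-run fate of the genetic information under mutation.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
```

The decay of `p_n` toward the stationary distribution `pi` is the memoryful propagation of mutation errors through generations that the memoryless channel model cannot capture.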

  9. Generative models versus underlying symmetries to explain biological pattern.

    Science.gov (United States)

    Frank, S A

    2014-06-01

    Mathematical models play an increasingly important role in the interpretation of biological experiments. Studies often present a model that generates the observations, connecting hypothesized process to an observed pattern. Such generative models confirm the plausibility of an explanation and make testable hypotheses for further experiments. However, studies rarely consider the broad family of alternative models that match the same observed pattern. The symmetries that define the broad class of matching models are in fact the only aspects of information truly revealed by observed pattern. Commonly observed patterns derive from simple underlying symmetries. This article illustrates the problem by showing the symmetry associated with the observed rate of increase in fitness in a constant environment. That underlying symmetry reveals how each particular generative model defines a single example within the broad class of matching models. Further progress on the relation between pattern and process requires deeper consideration of the underlying symmetries. © 2014 The Author. Journal of Evolutionary Biology © 2014 European Society For Evolutionary Biology.

  10. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan

    2012-01-01

    , wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided in a low- frequency component (describing long term variations...

  11. Biologic Constraints on Modelling Virus Assembly

    Directory of Open Access Journals (Sweden)

    Robert L. Garcea

    2008-01-01

    Full Text Available The mathematical modelling of icosahedral virus assembly has drawn increasing interest because of the symmetric geometry of the outer shell structures. Many models involve equilibrium expressions of subunit binding, with reversible subunit additions forming various intermediate structures. The underlying assumption is that a final lowest energy state drives the equilibrium toward assembly. In their simplest forms, these models have explained why high subunit protein concentrations and strong subunit association constants can result in kinetic traps forming off pathway partial and aberrant structures. However, the cell biology of virus assembly is exceedingly complex. The biochemistry and biology of polyoma and papillomavirus assembly described here illustrates many of these specific issues. Variables include the use of cellular ‘chaperone’ proteins as mediators of assembly fidelity, the coupling of assembly to encapsidation of a specific nucleic acid genome, the use of cellular structures as ‘workbenches’ upon which assembly occurs, and the underlying problem of making a capsid structure that is metastable and capable of rapid disassembly upon infection. Although formidable to model, incorporating these considerations could advance the relevance of mathematical models of virus assembly to the real world.

  12. Downscaling Ocean Conditions: Initial Results using a Quasigeostrophic and Realistic Ocean Model

    Science.gov (United States)

    Katavouta, Anna; Thompson, Keith

    2014-05-01

    Previous theoretical work (Henshaw et al, 2003) has shown that the small-scale modes of variability of solutions of the unforced, incompressible Navier-Stokes equation, and Burgers' equation, can be reconstructed with surprisingly high accuracy from the time history of a few of the large-scale modes. Motivated by this theoretical work we first describe a straightforward method for assimilating information on the large scales in order to recover the small scale oceanic variability. The method is based on nudging in specific wavebands and frequencies and is similar to the so-called spectral nudging method that has been used successfully for atmospheric downscaling with limited area models (e.g. von Storch et al., 2000). The validity of the method is tested using a quasigeostrophic model configured to simulate a double ocean gyre separated by an unstable mid-ocean jet. It is shown that important features of the ocean circulation including the position of the meandering mid-ocean jet and associated pinch-off eddies can indeed be recovered from the time history of a small number of large-scale modes. The benefit of assimilating additional time series of observations from a limited number of locations, that alone are too sparse to significantly improve the recovery of the small scales using traditional assimilation techniques, is also demonstrated using several twin experiments. The final part of the study outlines the application of the approach using a realistic high resolution (1/36 degree) model, based on the NEMO (Nucleus for European Modelling of the Ocean) modeling framework, configured for the Scotian Shelf off the east coast of Canada. The large scale conditions used in this application are obtained from the HYCOM (HYbrid Coordinate Ocean Model) + NCODA (Navy Coupled Ocean Data Assimilation) global 1/12 degree analysis product. Henshaw, W., Kreiss, H.-O., Ystrom, J., 2003.
Numerical experiments on the interaction between the larger- and the small-scale motion of
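The core idea of spectral nudging, constraining only the large-scale modes while leaving the small scales free, can be sketched in one dimension with an FFT. The fields, wavenumber cutoff and full-strength replacement below are illustrative; an operational scheme relaxes the modes gradually in selected wavebands and frequencies.

```python
import numpy as np

n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)

# "Truth": a large-scale pattern plus small-scale detail.
truth = np.sin(2 * x) + 0.3 * np.sin(15 * x)

# A drifted model state: wrong large-scale phase, its own small scales.
model = np.sin(2 * x + 1.0) + 0.3 * np.sin(17 * x)

# Spectral nudging: replace only the low-wavenumber modes (|k| <= k_cut)
# of the model state with the reference's large-scale modes.
k_cut = 4
fm, ft = np.fft.fft(model), np.fft.fft(truth)
k = np.fft.fftfreq(n, d=1.0 / n)   # integer wavenumbers
low = np.abs(k) <= k_cut
fm[low] = ft[low]
nudged = np.real(np.fft.ifft(fm))

# Large-scale error vanishes; the model keeps its own small scales.
err_before = np.abs(np.fft.fft(model)[2] - ft[2])
err_after = np.abs(np.fft.fft(nudged)[2] - ft[2])
```

The nudged state agrees with the reference at wavenumber 2 while the model's small-scale mode at wavenumber 17 is untouched, which is the downscaling behaviour the abstract describes.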

  13. Exophobic Quasi-Realistic Heterotic String Vacua

    CERN Document Server

    Assel, Benjamin; Faraggi, Alon E; Kounnas, Costas; Rizos, John

    2009-01-01

    We demonstrate the existence of heterotic-string vacua that are free of massless exotic fields. The need to break the non-Abelian GUT symmetries in k=1 heterotic-string models by Wilson lines, while preserving the GUT embedding of the weak-hypercharge and the GUT prediction sin^2\\theta_w(M(GUT))=3/8, necessarily implies that the models contain states with fractional electric charge. Such states are severely restricted by observations, and must be confined or sufficiently massive and diluted. We construct the first quasi-realistic heterotic-string models in which the exotic states do not appear in the massless spectrum, and only exist, as they must, in the massive spectrum. The SO(10) GUT symmetry is broken to the Pati-Salam subgroup. Our PS heterotic-string models contain adequate Higgs representations to break the GUT and electroweak symmetry, as well as colour Higgs triplets that can be used for the missing partner mechanism. By statistically sampling the space of Pati-Salam vacua we demonstrate the abundan...

  14. Creating photo-realistic works in a 3D scene using layers styles to create an animation

    Science.gov (United States)

    Avramescu, A. M.

    2015-11-01

    Creating realistic objects in a 3D scene is not easy work, and great care is needed to make the creation highly detailed. Even without prior experience in photo-realistic work, the right techniques and a good reference photo make it possible to achieve an amazing amount of detail and realism. This article presents several such detailed methods, from which the techniques necessary to make beautiful and realistic objects in a scene can be learned. More precisely, in this paper we present how to create a 3D animated scene, mainly using the Pen Tool and Blending Options. The work is based on teaching simple ways of using Layer Styles to create convincing shadows, lights and textures and a realistic sense of three dimensions. It also shows how certain uses of the illumination and rendering options can create a realistic effect in a scene, and how to create photo-realistic 3D models from a digital image. The present work further shows how to use Illustrator paths, texturing, basic lighting and rendering, how to apply textures, and how to parent the building and object components, and we propose using this approach to recreate smaller details or 3D objects from a 2D image. After a critical review of the state of the art, we present in this paper the architecture of a design method for creating an animation. The aim is a conceptual and methodological tutorial that addresses this issue both scientifically and in practice. This objective also includes proposing, on a strong scientific basis, a model that gives a better understanding of the techniques necessary to create a realistic animation.

  15. cellPACK: a virtual mesoscope to model and visualize structural systems biology.

    Science.gov (United States)

    Johnson, Graham T; Autin, Ludovic; Al-Alusi, Mostafa; Goodsell, David S; Sanner, Michel F; Olson, Arthur J

    2015-01-01

    cellPACK assembles computational models of the biological mesoscale, an intermediate scale (10-100 nm) between molecular and cellular biology scales. cellPACK's modular architecture unites existing and novel packing algorithms to generate, visualize and analyze comprehensive three-dimensional models of complex biological environments that integrate data from multiple experimental systems biology and structural biology sources. cellPACK is available as open-source code, with tools for validation of models and with 'recipes' and models for five biological systems: blood plasma, cytoplasm, synaptic vesicles, HIV and a mycoplasma cell. We have applied cellPACK to model distributions of HIV envelope protein to test several hypotheses for consistency with experimental observations. Biologists, educators and outreach specialists can interact with cellPACK models, develop new recipes and perform packing experiments through scripting and graphical user interfaces at http://cellPACK.org/.

  16. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    OpenAIRE

    Matthew P. Adams; Catherine J. Collier; Sven Uthicke; Yan X. Ow; Lucas Langlois; Katherine R. O’Brien

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluat...

  17. Modeling of biological intelligence for SCM system optimization.

    Science.gov (United States)

    Chen, Shengyong; Zheng, Yujun; Cattani, Carlo; Wang, Wanliang

    2012-01-01

    This article summarizes some methods from biological intelligence for the modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune systems, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open and self-organizing, maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, covering the most widely used genetic algorithms and other evolutionary algorithms.

  18. Modeling of Biological Intelligence for SCM System Optimization

    Directory of Open Access Journals (Sweden)

    Shengyong Chen

    2012-01-01

    Full Text Available This article summarizes some methods from biological intelligence for the modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune systems, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open and self-organizing, maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, covering the most widely used genetic algorithms and other evolutionary algorithms.

  19. Modeling of Biological Intelligence for SCM System Optimization

    Science.gov (United States)

    Chen, Shengyong; Zheng, Yujun; Cattani, Carlo; Wang, Wanliang

    2012-01-01

    This article summarizes some methods from biological intelligence for the modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune systems, and other biological intelligence related methods. An SCM system is adaptive, dynamic, open and self-organizing, maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, covering the most widely used genetic algorithms and other evolutionary algorithms. PMID:22162724

  20. Development of a Shipboard Remote Control and Telemetry Experimental System for Large-Scale Model's Motions and Loads Measurement in Realistic Sea Waves.

    Science.gov (United States)

    Jiao, Jialong; Ren, Huilong; Adenya, Christiaan Adika; Chen, Chaohe

    2017-10-29

    Wave-induced motion and load responses are important criteria for ship performance evaluation. Physical experiments have long been an indispensable tool in the prediction of a ship's navigation state, speed, motions, accelerations, sectional loads and wave impact pressure. Currently, the majority of such experiments are conducted in a laboratory tank environment, where the wave environments differ from realistic sea waves. In this paper, a laboratory tank testing system for ship motion and load measurement is reviewed and reported first. Then, a novel large-scale model measurement technique is developed on these laboratory testing foundations to obtain accurate motion and load responses of ships in realistic sea conditions. For this purpose, an advanced remote control and telemetry experimental system was developed in-house to allow for the implementation of large-scale model seakeeping measurements at sea. The experimental system includes a series of sensors, e.g., a Global Positioning System/Inertial Navigation System (GPS/INS) module, course top, optical fiber sensors, strain gauges, pressure sensors and accelerometers. The developed measurement system was tested in field experiments in coastal seas, which indicate that the proposed large-scale model testing scheme is feasible and effective. Meaningful data, including ocean environment parameters, ship navigation state, motions and loads, were obtained through the sea trial campaign.

  1. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species.

    Science.gov (United States)

    Adams, Matthew P; Collier, Catherine J; Uthicke, Sven; Ow, Yan X; Langlois, Lucas; O'Brien, Katherine R

    2017-01-04

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
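Fitting a thermal-optimum curve to obtain biologically meaningful parameters such as Topt and Pmax can be sketched with a Gaussian-shaped response and a coarse grid search. The functional form, data and parameter ranges below are hypothetical; the paper compares twelve published model forms, not this one.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical thermal-optimum response: P(T) = Pmax * exp(-((T - Topt)/w)^2).
def photosynthesis(T, Pmax, Topt, w):
    return Pmax * np.exp(-((T - Topt) / w) ** 2)

# Synthetic "measurements" around a 30 degC optimum (values invented).
T_obs = np.arange(15.0, 41.0, 1.0)
y_obs = photosynthesis(T_obs, 8.0, 30.0, 7.0) + 0.1 * rng.normal(size=T_obs.size)

# Coarse grid search for the biologically meaningful parameters.
best = None
for Pmax in np.linspace(5, 11, 31):
    for Topt in np.linspace(20, 38, 46):
        for w in np.linspace(3, 12, 31):
            sse = np.sum((y_obs - photosynthesis(T_obs, Pmax, Topt, w)) ** 2)
            if best is None or sse < best[0]:
                best = (sse, Pmax, Topt, w)

sse, Pmax_hat, Topt_hat, w_hat = best
```

Because Topt and Pmax are direct arguments of the curve, the fitted values are immediately interpretable as the thermal optimum and maximum photosynthetic rate, which is the kind of parameter stability and transferability the abstract argues for.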

  2. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    Science.gov (United States)

    Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.

  3. Multi-level and hybrid modelling approaches for systems biology.

    Science.gov (United States)

    Bardini, R; Politano, G; Benso, A; Di Carlo, S

    2017-01-01

    During the last decades, high-throughput techniques have allowed the extraction of huge amounts of data from biological systems, unveiling more of their underlying complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms that make an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, making structural phenotype and architectural complexities emerge from a single cell through local interactions. The information collected from biological systems is naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each one having its own way of representing the phenomena under study. That is, the different parts of the system to be modelled may be described with different formalisms. For a model to have improved accuracy and to serve as a good knowledge base, it should comprise the different system levels and suitably handle their respective formalisms. Models which are both multi-level and hybrid satisfy both these requirements, making them a very useful tool in computational systems biology. This paper reviews some of the main contributions in this field.

  4. Institute for Multiscale Modeling of Biological Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Paulaitis, Michael E; Garcia-Moreno, Bertrand; Lenhoff, Abraham

    2009-12-26

    The Institute for Multiscale Modeling of Biological Interactions (IMMBI) has two primary goals: to foster interdisciplinary collaborations among faculty and their research laboratories that will lead to novel applications of multiscale simulation and modeling methods in the biological sciences and engineering; and, building on the unique biophysical/biology-based engineering foundations of the participating faculty, to train scientists and engineers to apply computational methods that collectively span multiple time and length scales of biological organization. The success of IMMBI will be defined by the following: size and quality of the applicant pool for pre-doctoral and post-doctoral fellows; academic performance; quality of the pre-doctoral and post-doctoral research; impact of the research both broadly and on the DOE (ASCR program) mission; distinction of the next career step for pre-doctoral and post-doctoral fellows; and faculty collaborations that result from IMMBI activities. Specific details about accomplishments during the three years of DOE support for IMMBI have been documented in Annual Progress Reports (April 2005, June 2006, and March 2007) and a Report for a National Academy of Sciences Review (October 2005) that were submitted to DOE on the dates indicated. An overview of these accomplishments is provided.

  5. Realistic Simulation of Rice Plant

    Directory of Open Access Journals (Sweden)

    Wei-long DING

    2011-09-01

    Full Text Available Existing results in the virtual modeling of the rice plant, however, are far from perfect compared with those for other crops, owing to its complex structure and growth process. Techniques to visually simulate the architecture of the rice plant and its growth process are presented based on an analysis of its morphological characteristics at different stages. Firstly, simulations of the geometrical shape, the bending status and the structural distortion of rice leaves are conducted. Then, by using an improved model for bending deformation, the curved patterns of the panicle axis and various types of panicle branches are generated, and the spatial shape of the rice panicle is thereby created. A parametric L-system is employed to generate its topological structures, and a finite-state automaton is adopted to describe the development of the geometrical structures. Finally, computer visualization of the three-dimensional morphologies of the rice plant at both organ and individual levels is achieved. The experimental results showed that the proposed methods of modeling the three-dimensional shapes of organs and simulating the growth of the rice plant are feasible and effective, and that the generated three-dimensional images are realistic.
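
    The rewriting idea behind the L-system stage can be shown with a minimal, non-parametric sketch; the axiom and tillering rule below are invented for demonstration and are not taken from the rice model:

    ```python
    def expand(axiom, rules, iterations):
        # Apply string-rewriting rules to every symbol in parallel, once per step.
        s = axiom
        for _ in range(iterations):
            s = "".join(rules.get(ch, ch) for ch in s)
        return s

    # Toy rule: an apex A produces an internode I, a bracketed lateral branch
    # [A], and a continuing apex A. Brackets mark branch push/pop, as usual
    # in L-system notation.
    rules = {"A": "I[A]A"}
    print(expand("A", rules, 3))
    ```

    A geometric interpreter (turtle graphics in the simplest case) would then map I, [ and ] to internode segments and branch points, which is the role the finite-state automaton plays for geometry in the paper.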

  6. Multiobjective Bak-Sneppen model on a small-world network

    International Nuclear Information System (INIS)

    Elettreby, M.F.

    2005-01-01

    Small-world networks (SWN) are relevant to biological systems. We study the dynamics of the Bak-Sneppen (BS) model on a small-world network, incorporating the concepts of extremal dynamics, multiobjective optimization and coherent noise. We find that the small-world structure stabilizes the system. Also, it is more realistic to augment the Bak-Sneppen model with these concepts

  7. Multiobjective Bak-Sneppen model on a small-world network

    International Nuclear Information System (INIS)

    Elettreby, M.

    2004-09-01

    Small-world networks (SWN) are relevant to biological systems. We study the dynamics of the Bak-Sneppen (BS) model on a small-world network, incorporating the concepts of extremal dynamics, multiobjective optimization and coherent noise. We find that the small-world structure stabilizes the system. Also, it is more realistic to augment the Bak-Sneppen model with these concepts. (author)
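
    A minimal sketch of Bak-Sneppen extremal dynamics on a ring augmented with random shortcuts (a simple small-world construction); all parameters are illustrative, not taken from either record:

    ```python
    import random

    def small_world_neighbors(n, shortcuts, seed=1):
        # Ring lattice plus random long-range links: a crude small-world graph.
        rng = random.Random(seed)
        nbrs = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
        for _ in range(shortcuts):
            a, b = rng.randrange(n), rng.randrange(n)
            if a != b:
                nbrs[a].add(b)
                nbrs[b].add(a)
        return nbrs

    def bak_sneppen(n=64, steps=2000, shortcuts=16, seed=2):
        rng = random.Random(seed)
        fitness = [rng.random() for _ in range(n)]
        nbrs = small_world_neighbors(n, shortcuts)
        minima = []
        for _ in range(steps):
            worst = min(range(n), key=fitness.__getitem__)
            minima.append(fitness[worst])
            # Extremal update: the least-fit species and all of its network
            # neighbours receive new random fitnesses.
            for j in {worst} | nbrs[worst]:
                fitness[j] = rng.random()
        return fitness, minima

    fitness, minima = bak_sneppen()
    # After a transient, the selected minimum tends to stay below a
    # self-organized threshold.
    print(round(sum(minima[-500:]) / 500, 2))
    ```

    Replacing the ring-plus-shortcuts graph with a regular lattice recovers the original BS model, which makes the stabilizing effect of shortcuts easy to probe numerically.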

  8. Systematic integration of experimental data and models in systems biology.

    Science.gov (United States)

    Li, Peter; Dada, Joseph O; Jameson, Daniel; Spasic, Irena; Swainston, Neil; Carroll, Kathleen; Dunn, Warwick; Khan, Farid; Malys, Naglis; Messiha, Hanan L; Simeonidis, Evangelos; Weichart, Dieter; Winder, Catherine; Wishart, Jill; Broomhead, David S; Goble, Carole A; Gaskell, Simon J; Kell, Douglas B; Westerhoff, Hans V; Mendes, Pedro; Paton, Norman W

    2010-11-29

    The behaviour of biological systems can be deduced from their mathematical models. However, multiple sources of data in diverse forms are required to construct a model, in order to define its components, their biochemical reactions, and the corresponding parameters. Automating the assembly and use of systems biology models depends on data integration processes involving the interoperation of data and analytical resources. Taverna workflows have been developed for the automated assembly of quantitative parameterised metabolic networks in the Systems Biology Markup Language (SBML). An SBML model is built in a systematic fashion by the workflows, starting with the construction of a qualitative network using data from a MIRIAM-compliant genome-scale model of yeast metabolism. This is followed by parameterisation of the SBML model with experimental data from two repositories: the SABIO-RK enzyme kinetics database and a database of quantitative experimental results. The models are then calibrated and simulated in workflows that call out to COPASIWS, the web service interface to the COPASI software application for analysing biochemical networks. These systems biology workflows were evaluated for their ability to construct a parameterised model of yeast glycolysis. Distributed information about metabolic reactions that have been described to MIRIAM standards enables the automated assembly of quantitative systems biology models of metabolic networks based on user-defined criteria. Such data integration processes can be implemented as Taverna workflows to provide a rapid overview of the components and their relationships within a biochemical system.

  9. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB

    KAUST Repository

    Klingbeil, G.

    2011-02-25

    Motivation: The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. Results: The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. © The Author 2011. Published by Oxford University Press. All rights reserved.
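
    The Gillespie direct method that STOCHSIMGPU parallelizes can be sketched serially for a toy birth-death system (0 -> X at rate k1, X -> 0 at rate k2*x); this illustrates the algorithm only, not the tool's implementation:

    ```python
    import math
    import random

    def gillespie_birth_death(k1, k2, x0, t_end, seed=0):
        # Direct-method SSA: draw an exponential waiting time from the total
        # propensity, then pick a reaction proportionally to its propensity.
        rng = random.Random(seed)
        t, x = 0.0, x0
        while t < t_end:
            a1, a2 = k1, k2 * x          # propensities of birth and death
            a0 = a1 + a2
            if a0 == 0.0:
                break
            t += -math.log(1.0 - rng.random()) / a0
            x += 1 if rng.random() * a0 < a1 else -1
        return x

    # The stationary mean of this process is k1 / k2 (= 100 here).
    samples = [gillespie_birth_death(10.0, 0.1, 0, 200.0, seed=s)
               for s in range(50)]
    print(round(sum(samples) / len(samples)))
    ```

    The independence of the 50 runs is exactly what makes the method attractive for GPU parallelization: each thread can carry its own trajectory.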

  10. River, delta and coastal morphological response accounting for biological dynamics

    Science.gov (United States)

    Goldsmith, W.; Bernardi, D.; Schippa, L.

    2015-03-01

    Management and construction can increase resilience in the face of climate change, and benefits can be enhanced through the integration of biogenic materials, including shells and vegetation. Rivers and coastal landforms are dynamic systems that respond to intentional and unintended manipulation of critical factors, often with unforeseen and/or undesirable effects. River management strategies have impacts that extend to deltas and coastal areas, which are increasingly vulnerable to climate change, particularly sea level rise and storm intensity. Whereas conventional assessment and analysis of rivers and coasts has relied on modelling of hydrology, hydraulics and sediment transport, incorporating additional biological factors can offer more comprehensive, beneficial and realistic alternatives. Suitable modelling tools can provide improved decision support. The question has been whether current models can effectively address biological responses with suitable reliability and efficiency. Since morphodynamic evolution exhibits its effects on a large timescale, the choice of mathematical model is not trivial and depends upon the availability of data, as well as the spatial extent, timelines and computational effort desired. The ultimate goal of the work is to set up a conveniently simplified river morphodynamic model, coupled with a biological plant population dynamics model, able to predict the long-term evolution of large alluvial river systems managed through bioengineering. This paper presents the first step of the work, related to the application of the model under stationary vegetation conditions. Sensitivity analysis has been performed on the main hydraulic, sedimentological, and biological parameters. The model has been applied to significant river training works in Europe, Asia and North America, and comparative analysis has been used to validate analytical solutions. Data gaps and further areas for investigation are identified.

  11. A realistic pattern of fermion masses from a five-dimensional SO(10) model

    International Nuclear Information System (INIS)

    Feruglio, Ferruccio; Patel, Ketan M.; Vicino, Denise

    2015-01-01

    We provide a unified description of fermion masses and mixing angles in the framework of a supersymmetric grand unified SO(10) model with anarchic Yukawa couplings of order unity. The space-time is five dimensional and the extra flat spatial dimension is compactified on the orbifold S 1 /(Z 2 ×Z 2 ′ ), leading to Pati-Salam gauge symmetry on the boundary where Yukawa interactions are localised. The gauge symmetry breaking is completed by means of a rather economic scalar sector, avoiding the doublet-triplet splitting problem. The matter fields live in the bulk and their massless modes get exponential profiles, which naturally explain the mass hierarchy of the different fermion generations. Quarks and leptons properties are naturally reproduced by a mechanism, first proposed by Kitano and Li, that lifts the SO(10) degeneracy of bulk masses in terms of a single parameter. The model provides a realistic pattern of fermion masses and mixing angles for large values of tan β. It favours normally ordered neutrino mass spectrum with the lightest neutrino mass below 0.01 eV and no preference for leptonic CP violating phases. The right handed neutrino mass spectrum is very hierarchical and does not allow for thermal leptogenesis. We analyse several variants of the basic framework and find that the results concerning the fermion spectrum are remarkably stable.

  12. Atomic level insights into realistic molecular models of dendrimer-drug complexes through MD simulations

    Science.gov (United States)

    Jain, Vaibhav; Maiti, Prabal K.; Bharatam, Prasad V.

    2016-09-01

    Computational studies performed on dendrimer-drug complexes usually consider 1:1 stoichiometry, which is far from reality, since in experiments a larger number of drug molecules become encapsulated inside a dendrimer. In the present study, molecular dynamics (MD) simulations were implemented to characterize more realistic molecular models of dendrimer-drug complexes (1:n stoichiometry), in order to understand the effect of high drug loading on the structural properties and to unveil the atomistic-level details. For this purpose, possible inclusion complexes of the model drug Nateglinide (Ntg) (an antidiabetic, belonging to Biopharmaceutics Classification System class II) with amine- and acetyl-terminated G4 poly(amidoamine) (G4 PAMAM(NH2) and G4 PAMAM(Ac)) dendrimers at neutral and low pH conditions are explored in this work. MD simulation analysis of the dendrimer-drug complexes revealed that the drug encapsulation efficiency of G4 PAMAM(NH2) and G4 PAMAM(Ac) dendrimers at neutral pH was 6 and 5, respectively, while at low pH it was 12 and 13, respectively. Center-of-mass distance analysis showed that most of the drug molecules are located in the interior hydrophobic pockets of G4 PAMAM(NH2) at both pH values, while in the case of G4 PAMAM(Ac) most of them are distributed near the surface at neutral pH and in the interior hydrophobic pockets at low pH. Structural properties such as radius of gyration, shape, radial density distribution, and solvent accessible surface area of the dendrimer-drug complexes were also assessed and compared with those of the unloaded dendrimers. Further, binding energy calculations using the molecular mechanics Poisson-Boltzmann surface area approach revealed that the location of drug molecules in the dendrimer is not the decisive factor for the higher or lower binding affinity of the complex, but the charged state of dendrimer and drug, intermolecular interactions, pH-induced conformational changes, and surface groups of the dendrimer do play an
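
    Of the structural properties listed, the radius of gyration is simple to compute from coordinates and masses; the coordinates below are synthetic, not from the dendrimer trajectories:

    ```python
    import numpy as np

    def radius_of_gyration(coords, masses):
        # Mass-weighted RMS distance of atoms from the center of mass.
        com = np.average(coords, axis=0, weights=masses)
        sq_dist = np.sum((coords - com) ** 2, axis=1)
        return float(np.sqrt(np.average(sq_dist, weights=masses)))

    rng = np.random.default_rng(7)
    coords = rng.normal(0.0, 5.0, size=(500, 3))  # synthetic positions, angstrom
    masses = np.full(500, 12.0)                   # equal carbon-like masses

    rg = radius_of_gyration(coords, masses)
    print(f"Rg ~ {rg:.1f} angstrom")
    ```

    In an MD workflow the same function would be applied frame by frame to the trajectory to track swelling or compaction of the loaded dendrimer.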

  13. Universally sloppy parameter sensitivities in systems biology models.

    Directory of Open Access Journals (Sweden)

    Ryan N Gutenkunst

    2007-10-01

    Full Text Available Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.

  14. Universally sloppy parameter sensitivities in systems biology models.

    Science.gov (United States)

    Gutenkunst, Ryan N; Waterfall, Joshua J; Casey, Fergal P; Brown, Kevin S; Myers, Christopher R; Sethna, James P

    2007-10-01

    Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.
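
    The sloppy spectrum itself is easy to reproduce on the classic toy case of a sum of two exponentials; the model, data grid and log-parameter Jacobian below are illustrative, not taken from the paper's model collection:

    ```python
    import numpy as np

    def model(params, t):
        a1, a2, k1, k2 = params
        return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

    t = np.linspace(0.0, 5.0, 50)
    p0 = np.array([1.0, 1.0, 1.0, 0.5])

    # Central finite-difference Jacobian with respect to log-parameters
    # (perturbing by eps * p_i and dividing by 2*eps gives p_i * df/dp_i).
    eps = 1e-6
    J = np.empty((t.size, p0.size))
    for i in range(p0.size):
        dp = np.zeros_like(p0)
        dp[i] = eps * p0[i]
        J[:, i] = (model(p0 + dp, t) - model(p0 - dp, t)) / (2.0 * eps)

    # Eigenvalues of J^T J (a Fisher-information-like matrix), largest first;
    # their roughly even spread over many decades is the "sloppiness".
    eigvals = np.linalg.eigvalsh(J.T @ J)[::-1]
    print(f"decades spanned: {np.log10(eigvals[0] / eigvals[-1]):.1f}")
    ```

    Even this four-parameter model spans several decades of sensitivity, so fitting constrains only a few stiff parameter combinations while individual parameters stay loose.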

  15. Boolean modeling in systems biology: an overview of methodology and applications

    International Nuclear Information System (INIS)

    Wang, Rui-Sheng; Albert, Réka; Saadatpour, Assieh

    2012-01-01

    Mathematical modeling of biological processes provides deep insights into complex cellular systems. While quantitative and continuous models such as differential equations have been widely used, their use is obstructed in systems wherein the knowledge of mechanistic details and kinetic parameters is scarce. On the other hand, a wealth of molecular level qualitative data on individual components and interactions can be obtained from the experimental literature and high-throughput technologies, making qualitative approaches such as Boolean network modeling extremely useful. In this paper, we build on our research to provide a methodology overview of Boolean modeling in systems biology, including Boolean dynamic modeling of cellular networks, attractor analysis of Boolean dynamic models, as well as inferring biological regulatory mechanisms from high-throughput data using Boolean models. We finally demonstrate how Boolean models can be applied to perform the structural analysis of cellular networks. This overview aims to acquaint life science researchers with the basic steps of Boolean modeling and its applications in several areas of systems biology. (paper)
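
    The attractor-analysis step can be sketched with a toy three-node synchronous Boolean network; the network and its rules are invented for illustration:

    ```python
    from itertools import product

    def step(state):
        # Invented rules: A and B activate each other; C needs both.
        a, b, c = state
        return (b, a, a and b)

    def find_attractors():
        # Exhaustive search over all 2^3 states: iterate until a state repeats,
        # then record the recurring cycle in a rotation-invariant canonical form.
        attractors = set()
        for start in product((False, True), repeat=3):
            seen, s = [], start
            while s not in seen:
                seen.append(s)
                s = step(s)
            cycle = tuple(seen[seen.index(s):])
            attractors.add(min(cycle[i:] + cycle[:i] for i in range(len(cycle))))
        return attractors

    # This toy network has two fixed points (all-off, all-on) and one 2-cycle.
    for att in sorted(find_attractors()):
        print(att)
    ```

    For realistic network sizes exhaustive enumeration is infeasible, which is why the reduction and sampling techniques the overview describes matter.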

  16. Biochemical Space: A Framework for Systemic Annotation of Biological Models

    Czech Academy of Sciences Publication Activity Database

    Klement, M.; Děd, T.; Šafránek, D.; Červený, Jan; Müller, Stefan; Steuer, Ralf

    2014-01-01

    Roč. 306, JUL (2014), s. 31-44 ISSN 1571-0661 R&D Projects: GA MŠk(CZ) EE2.3.20.0256 Institutional support: RVO:67179843 Keywords : biological models * model annotation * systems biology * cyanobacteria Subject RIV: EH - Ecology, Behaviour

  17. Review of "Stochastic Modelling for Systems Biology" by Darren Wilkinson

    Directory of Open Access Journals (Sweden)

    Bullinger Eric

    2006-12-01

    Full Text Available Abstract "Stochastic Modelling for Systems Biology" by Darren Wilkinson introduces the peculiarities of stochastic modelling in biology. This book is particularly suited as a textbook or for self-study, especially for readers with a theoretical background.

  18. Echinococcus as a model system: biology and epidemiology.

    Science.gov (United States)

    Thompson, R C A; Jenkins, D J

    2014-10-15

    The introduction of Echinococcus to Australia over 200 years ago, and its establishment in sheep-rearing areas, inflicted a serious medical and economic burden on the country. This resulted in an investment in both basic and applied research aimed at learning more about the biology and life cycle of Echinococcus. This research served to illustrate the uniqueness of the parasite in terms of developmental biology and ecology, and the value of Echinococcus as a model system in a broad range of research, from fundamental biology to theoretical control systems. These studies formed the foundation for an international, diverse and ongoing research effort on the hydatid organisms encompassing stem cell biology, gene regulation, strain variation, wildlife diseases and models of transmission dynamics. We describe the development, nature and diversity of this research, and how it was initiated in Australia but subsequently stimulated much international and collaborative research on Echinococcus. Copyright © 2014 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

  19. Adapting realist synthesis methodology: The case of workplace harassment interventions.

    Science.gov (United States)

    Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Gerrard, Angie

    2017-12-01

    Realist synthesis techniques can be used to assess complex interventions by extracting and synthesizing configurations of contexts, mechanisms, and outcomes found in the literature. Our novel and multi-pronged approach to the realist synthesis of workplace harassment interventions describes our pursuit of theory to link macro and program level theories. After discovering the limitations of a dogmatic approach to realist synthesis, we adapted our search strategy and focused our analysis on a subset of data. We tailored our realist synthesis to understand how, why, and under what circumstances workplace harassment interventions are effective. The result was a conceptual framework to test our theory-based interventions and provide the basis for subsequent realist evaluation. Our experience documented in this article contributes to an understanding of how, under what circumstances, and with what consequences realist synthesis principles can be customized. Copyright © 2017 John Wiley & Sons, Ltd.

  20. Evaluation of radiobiological effects in 3 distinct biological models

    International Nuclear Information System (INIS)

    Lemos, J.; Costa, P.; Cunha, L.; Metello, L.F.; Carvalho, A.P.; Vasconcelos, V.; Genesio, P.; Ponte, F.; Costa, P.S.; Crespo, P.

    2015-01-01

    Full text of publication follows. The present work aims at sharing the process of developing advanced biological models to study radiobiological effects. Recognizing several known limitations and difficulties of the current monolayer cellular models, as well as the increasing difficulties of using advanced biological models, our group has been developing alternative advanced biological models, namely three-dimensional cell cultures and a less explored animal model (the zebrafish, Danio rerio, which gives access to inter-generational data while showing great genetic homology with humans). These three models (monolayer cell culture, three-dimensional cell culture and zebrafish) were externally irradiated with 100 mGy, 500 mGy or 1 Gy. The consequences of that irradiation were studied using cellular and molecular tests. Our previous experimental studies with 100 mGy external gamma irradiation of HepG2 monolayer cells showed a slight increase in the proliferation rate 24 h, 48 h and 72 h post irradiation. These results also pointed to the presence of certain bystander effects 72 h post irradiation, motivating the more accurate analysis carried out in this work. At this stage, we remain focused on the acute biological effects. Obtained results, namely MTT and clonogenic assays evaluating cellular metabolic activity and proliferation in the in vitro models, as well as proteomics for the evaluation of in vivo effects, will be presented, discussed and explained. Several hypotheses will be presented and defended based on the facts previously demonstrated. 
This work aims at sharing the actual state and the results already available from this medium-term project, building the proof of the added value on applying these advanced models, while demonstrating the strongest and weakest points from all of them (so allowing the comparison between them and to base the subsequent choice for research groups starting

  1. Polynomial algebra of discrete models in systems biology.

    Science.gov (United States)

    Veliz-Cuba, Alan; Jarrah, Abdul Salam; Laubenbacher, Reinhard

    2010-07-01

    An increasing number of discrete mathematical models are being published in Systems Biology, ranging from Boolean network models to logical models and Petri nets. They are used to model a variety of biochemical networks, such as metabolic networks, gene regulatory networks and signal transduction networks. There is increasing evidence that such models can capture key dynamic features of biological networks and can be used successfully for hypothesis generation. This article provides a unified framework that can aid the mathematical analysis of Boolean network models, logical models and Petri nets. They can be represented as polynomial dynamical systems, which allows the use of a variety of mathematical tools from computer algebra for their analysis. Algorithms are presented for the translation into polynomial dynamical systems. Examples are given of how polynomial algebra can be used for model analysis. Contact: alanavc@vt.edu. Supplementary data are available at Bioinformatics online.
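
    The translation the article describes rests on three identities over GF(2): NOT x = 1 + x, x AND y = xy, and x OR y = x + y + xy (all mod 2). A sketch for one invented update rule:

    ```python
    from itertools import product

    def f_boolean(x, y):
        # Invented update rule: f(x, y) = x OR (NOT y).
        return x or (not y)

    def f_polynomial(x, y):
        # Same rule as a polynomial over GF(2):
        # x OR (1 + y) = x + (1 + y) + x*(1 + y)  (mod 2).
        return (x + (1 + y) + x * (1 + y)) % 2

    # The two representations agree on every point of {0, 1}^2.
    for x, y in product((0, 1), repeat=2):
        assert int(f_boolean(x, y)) == f_polynomial(x, y)
    print("Boolean and polynomial forms agree on all inputs")
    ```

    Once every node's rule is in polynomial form, fixed points become solutions of a polynomial system, which is what opens the door to computer-algebra tools such as Gröbner bases.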

  2. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, render the majority of standard image analysis methods non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
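
    A common baseline for fully developed speckle, which tissue-specific models of the kind the phantom supports would generalize, is a per-pixel sum of many random-phase scatterer contributions, giving a Rayleigh-distributed amplitude; the sketch below is a generic illustration, not the paper's simulator:

    ```python
    import numpy as np

    def speckle_amplitude(shape, scatterers=50, seed=0):
        # Sum many unit phasors with uniform random phases per pixel and
        # normalize; the resulting amplitude is Rayleigh distributed.
        rng = np.random.default_rng(seed)
        phases = rng.uniform(0.0, 2.0 * np.pi, size=(scatterers,) + shape)
        field = np.exp(1j * phases).sum(axis=0) / np.sqrt(scatterers)
        return np.abs(field)

    amp = speckle_amplitude((256, 256))
    # For a unit-power complex field the Rayleigh mean is sqrt(pi)/2 ~ 0.886.
    print(round(float(amp.mean()), 2))
    ```

    Comparing the empirical amplitude histogram against the Rayleigh density is one of the statistical checks that a validation phantom can apply per tissue class.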

  3. Turbulent transport measurements in a cold model of GT-burner at realistic flow rates

    Directory of Open Access Journals (Sweden)

    Gobyzov Oleg

    2016-01-01

    Full Text Available In the present work, simultaneous velocity field and passive admixture concentration field measurements in a non-reacting flow at realistic flow-rate conditions in a model combustion chamber with an industrial mixing device are reported. In the experiments, for safety reasons, the real fuel (natural gas) was replaced with neon gas to simulate stratification in a strongly swirling flow. Measurements were performed by means of planar laser-induced fluorescence (PLIF) and particle image velocimetry (PIV) at a Reynolds number, based on the mean flow rate and nozzle diameter, of approximately 300 000. Details of the experimental technique, features of the experimental setup, image and data preprocessing procedures, and results of the measurements are given in the paper. In addition to the raw velocity and admixture concentration data, in-depth evaluation approaches aimed at estimating the turbulent kinetic energy (TKE) components, assessing the turbulent Schmidt number and analysing the gradient closure hypothesis from experimental data are presented.
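
    The TKE estimation mentioned above can be sketched for two-component planar data; treating the unmeasured out-of-plane fluctuation as equal to the transverse one is a simplifying assumption made here for illustration, not the paper's method, and the velocity samples are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    u = 10.0 + rng.normal(0.0, 1.5, 10000)  # streamwise velocity samples, m/s
    v = 0.0 + rng.normal(0.0, 1.0, 10000)   # transverse velocity samples, m/s

    # Fluctuations about the mean; TKE = 0.5 * (u'^2 + v'^2 + w'^2), with the
    # unmeasured w'^2 approximated by v'^2 (an isotropy-style assumption).
    u_fluct = u - u.mean()
    v_fluct = v - v.mean()
    tke = 0.5 * (np.mean(u_fluct**2) + 2.0 * np.mean(v_fluct**2))
    print(f"TKE ~ {tke:.2f} m^2/s^2")
    ```

    With real PIV data the same statistics would be computed per interrogation window over the ensemble of vector fields, yielding a spatial TKE map.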

  4. Mathematical modeling in biology: A critical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Buiatti, M. [Florence, Univ. (Italy). Dipt. di Biologia Animale e Genetica]

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of the life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, the inherent difficulties of the mathematisation of biological objects and processes, deriving from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at the use of the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning against the derived dangers (reification of monodisciplinary metaphors, lack of falsification, etc.) is given. Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples.

  5. Mathematical modeling in biology: A critical assessment

    International Nuclear Information System (INIS)

    Buiatti, M.

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of the life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, the inherent difficulties of the mathematisation of biological objects and processes, deriving from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at the use of the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning against the derived dangers (reification of monodisciplinary metaphors, lack of falsification, etc.) is given. Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples

  6. Multiscale modeling of emergent materials: biological and soft matter

    DEFF Research Database (Denmark)

    Murtola, Teemu; Bunker, Alex; Vattulainen, Ilpo

    2009-01-01

    In this review, we focus on four current related issues in multiscale modeling of soft and biological matter. First, we discuss how to use structural information from detailed models (or experiments) to construct coarse-grained ones in a hierarchical and systematic way. This is discussed...

  7. Prospective Tests on Biological Models of Acupuncture

    Directory of Open Access Journals (Sweden)

    Charles Shang

    2009-01-01

    Full Text Available The biological effects of acupuncture include the regulation of a variety of neurohumoral factors and growth control factors. In science, models or hypotheses with confirmed predictions are considered more convincing than models solely based on retrospective explanations. Literature review showed that two biological models of acupuncture have been prospectively tested with independently confirmed predictions: The neurophysiology model on the long-term effects of acupuncture emphasizes the trophic and anti-inflammatory effects of acupuncture. Its prediction on the peripheral effect of endorphin in acupuncture has been confirmed. The growth control model encompasses the neurophysiology model and suggests that a macroscopic growth control system originates from a network of organizers in embryogenesis. The activity of the growth control system is important in the formation, maintenance and regulation of all the physiological systems. Several phenomena of acupuncture such as the distribution of auricular acupuncture points, the long-term effects of acupuncture and the effect of multimodal non-specific stimulation at acupuncture points are consistent with the growth control model. The following predictions of the growth control model have been independently confirmed by research results in both acupuncture and conventional biomedical sciences: (i) Acupuncture has extensive growth control effects. (ii) Singular point and separatrix exist in morphogenesis. (iii) Organizers have high electric conductance, high current density and high density of gap junctions. (iv) A high density of gap junctions is distributed as separatrices or boundaries at the body surface after early embryogenesis. (v) Many acupuncture points are located at transition points or boundaries between different body domains or muscles, coinciding with the connective tissue planes. (vi) Some morphogens and organizers continue to function after embryogenesis. 
Current acupuncture research suggests a

  8. A Comprehensive Web-based Platform For Domain-Specific Biological Models

    Czech Academy of Sciences Publication Activity Database

    Klement, M.; Šafránek, D.; Děd, J.; Pejznoch, A.; Nedbal, Ladislav; Steuer, Ralf; Červený, Jan; Müller, Stefan

    2013-01-01

    Roč. 299, 25 Dec (2013), s. 61-67 ISSN 1571-0661 R&D Projects: GA MŠk(CZ) EE2.3.20.0256 Institutional support: RVO:67179843 Keywords : biological models * model annotation * systems biology * simulation * database Subject RIV: EH - Ecology, Behaviour

  9. Multiway modeling and analysis in stem cell systems biology

    Directory of Open Access Journals (Sweden)

    Vandenberg Scott L

    2008-07-01

Full Text Available Abstract Background Systems biology refers to multidisciplinary approaches designed to uncover emergent properties of biological systems. Stem cells are an attractive target for this analysis, due to their broad therapeutic potential. A central theme of systems biology is the use of computational modeling to reconstruct complex systems from a wealth of reductionist, molecular data (e.g., gene/protein expression, signal transduction activity, metabolic activity, etc.). A number of deterministic, probabilistic, and statistical learning models are used to understand sophisticated cellular behaviors such as protein expression during cellular differentiation and the activity of signaling networks. However, many of these models are bimodal, i.e., they only consider row-column relationships. In contrast, multiway modeling techniques (also known as tensor models) can analyze multimodal data, which capture much more information about complex behaviors such as cell differentiation. In particular, tensors can be very powerful tools for modeling the dynamic activity of biological networks over time. Here, we review the application of systems biology to stem cells and illustrate the application of tensor analysis to model collagen-induced osteogenic differentiation of human mesenchymal stem cells. Results We applied Tucker1, Tucker3, and Parallel Factor Analysis (PARAFAC) models to identify protein/gene expression patterns during extracellular matrix-induced osteogenic differentiation of human mesenchymal stem cells. In one case, we organized our data into a tensor of type protein/gene locus link × gene ontology category × osteogenic stimulant, and found that our cells expressed two distinct, stimulus-dependent sets of functionally related genes as they underwent osteogenic differentiation. In a second case, we organized DNA microarray data in a three-way tensor of gene IDs × osteogenic stimulus × replicates, and found that application of tensile strain to a
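The multiway idea in the record above can be sketched in a few lines. The following is a hedged, self-contained illustration (not the authors' code): a synthetic genes × stimuli × replicates tensor with rank-2 structure is fit by a CP/PARAFAC model using alternating least squares. All sizes, data, and the `cp_als`/`khatri_rao` helper names are made up for illustration.

```python
import numpy as np

def khatri_rao(U, V):
    # Column-wise Kronecker product; row (u, v) with v varying fastest.
    return np.einsum('ur,vr->uvr', U, V).reshape(-1, U.shape[1])

def cp_als(X, rank, iters=300, seed=1):
    # Alternating least squares for the CP/PARAFAC model
    # X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r].
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.random((n, rank)) for n in (I, J, K))
    X0 = X.reshape(I, J * K)                     # mode-0 unfolding
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)  # mode-1 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)  # mode-2 unfolding
    for _ in range(iters):
        A = X0 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = X1 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X2 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Synthetic rank-2 tensor: two latent "expression programs".
rng = np.random.default_rng(0)
a, b, c = rng.random((20, 2)), rng.random((4, 2)), rng.random((3, 2))
X = np.einsum('ir,jr,kr->ijk', a, b, c)  # genes x stimuli x replicates

A, B, C = cp_als(X, rank=2)
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

The recovered factor columns play the role of the "stimulus-dependent sets of functionally related genes" described in the abstract; production analyses would use a dedicated library rather than this minimal ALS loop.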

  10. Some Issues of Biological Shape Modelling with Applications

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Hilger, Klaus Baggesen; Skoglund, Karl

    2003-01-01

    This paper illustrates current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations to, modifications to, and applications of the elements of constructing models of shape or appearance...

  11. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that expresses scientific models.
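The distinction the record above draws between ordinary unit tests and "model validation tests" can be illustrated with a minimal sketch. This is not OpenWorm's actual suite; the function name and the physiological range are hypothetical stand-ins.

```python
# A "model validation test" asserts a scientific property of a model's
# output, complementing unit tests that check only code correctness.

def simulate_resting_potential():
    """Stand-in for a real neuron simulation; returns membrane potential (mV)."""
    return -65.0

def test_resting_potential_is_physiological():
    # Domain knowledge encoded as a test: resting potentials of typical
    # neurons fall roughly between -90 and -40 mV.
    v = simulate_resting_potential()
    assert -90.0 <= v <= -40.0

test_resting_potential_is_physiological()
```

The point is that such a test can fail even when the code is bug-free, signaling that the model itself has drifted out of scientific plausibility.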

  12. 3D Realistic Radiative Hydrodynamic Modeling of a Moderate-Mass Star: Effects of Rotation

    Science.gov (United States)

    Kitiashvili, Irina; Kosovichev, Alexander G.; Mansour, Nagi N.; Wray, Alan A.

    2018-01-01

    Recent progress in stellar observations opens new perspectives in understanding stellar evolution and structure. However, complex interactions in the turbulent radiating plasma together with effects of magnetic fields and rotation make inferences of stellar properties uncertain. The standard 1D mixing-length-based evolutionary models are not able to capture many physical processes of stellar interior dynamics, but they provide an initial approximation of the stellar structure that can be used to initialize 3D time-dependent radiative hydrodynamics simulations, based on first physical principles, that take into account the effects of turbulence, radiation, and others. In this presentation we will show simulation results from a 3D realistic modeling of an F-type main-sequence star with mass 1.47 Msun, in which the computational domain includes the upper layers of the radiation zone, the entire convection zone, and the photosphere. The simulation results provide new insight into the formation and properties of the convective overshoot region, the dynamics of the near-surface, highly turbulent layer, the structure and dynamics of granulation, and the excitation of acoustic and gravity oscillations. We will discuss the thermodynamic structure, oscillations, and effects of rotation on the dynamics of the star across these layers.

  13. Realist synthesis: illustrating the method for implementation research

    Directory of Open Access Journals (Sweden)

    Rycroft-Malone Jo

    2012-04-01

Full Text Available Abstract Background Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question ‘what interventions and strategies are effective in enabling evidence-informed healthcare?’ The strengths and challenges of conducting realist review are also considered. Methods The realist approach involves identifying underlying causal mechanisms and exploring how they work under what conditions. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses. Results Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed and the screening procedure resulted in inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of extracted data into evidence tables, theming and formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area. Conclusions Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as

  14. Biology learning evaluation model in Senior High Schools

    Directory of Open Access Journals (Sweden)

    Sri Utari

    2017-06-01

Full Text Available The aim of the study was to develop a Biology learning evaluation model in senior high schools, referring to the research and development model by Borg & Gall and the logic model. The evaluation model included the components of input, activities, output and outcomes. The development procedure involved a preliminary study in the form of observation and a theoretical review of Biology learning evaluation in senior high schools. The product development was carried out by designing an evaluation model, designing an instrument, trying out the instrument and performing implementation. The instrument try-out involved teachers and students from Grade XII in senior high schools located in the city of Yogyakarta. For the data-gathering technique and instruments, the researchers used an observation sheet, a questionnaire and a test. The questionnaire was applied in order to attain information regarding teacher performance, learning performance, classroom atmosphere and scientific attitude; the test, on the other hand, was applied in order to attain information regarding Biology concept mastery. For the analysis of the instrument construct, the researchers performed confirmatory factor analysis by means of the Lisrel 0.80 software, and the results of this analysis showed that the evaluation instrument was valid and reliable. The construct validity was between 0.43 and 0.79, while the reliability of the measurement model was between 0.88 and 0.94. Last but not least, the model feasibility test showed that the theoretical model was supported by the empirical data.

  15. Socialist realist Keskküla

    Index Scriptorium Estoniae

    1998-01-01

    The monograph "Socialist Realist Painting" by the English art critic Matthew Cullerne Bown, published in London in 1998, covers the Estonian artists Enn Põldroos, Nikolai Kormashov and Ando Keskküla, and includes reproductions of paintings by Kormashov and Keskküla

  16. Modeling and simulation of high dimensional stochastic multiscale PDE systems at the exascale

    Energy Technology Data Exchange (ETDEWEB)

    Zabaras, Nicolas J. [Cornell Univ., Ithaca, NY (United States)

    2016-11-08

    Predictive modeling of multiscale and multiphysics systems requires accurate data-driven characterization of the input uncertainties and an understanding of how they propagate across scales and alter the final solution. This project develops a rigorous mathematical framework and scalable uncertainty quantification algorithms to efficiently construct realistic low-dimensional input models, and surrogate low-complexity systems, for the analysis, design, and control of physical systems represented by multiscale stochastic PDEs. The work can be applied to many areas including physical and biological processes, from climate modeling to systems biology.

  17. Continuous time Boolean modeling for biological signaling: application of Gillespie algorithm.

    Science.gov (United States)

    Stoll, Gautier; Viara, Eric; Barillot, Emmanuel; Calzone, Laurence

    2012-08-29

    Mathematical modeling is used as a systems biology tool to answer biological questions, and more precisely, to validate a network that describes biological observations and to predict the effect of perturbations. This article presents an algorithm for modeling biological networks in a discrete framework with continuous time. There exist two major types of mathematical modeling approaches: (1) quantitative modeling, representing various chemical species concentrations by real numbers, mainly based on differential equations and chemical kinetics formalism; and (2) qualitative modeling, representing chemical species concentrations or activities by a finite set of discrete values. Both approaches answer particular (and often different) biological questions. The qualitative approach permits a simple and less detailed description of the biological system: it is efficient for identifying stable states but remains inconvenient for describing the transient kinetics leading to these states. In this context, time is represented by discrete steps. Quantitative modeling, on the other hand, can describe more accurately the dynamical behavior of biological processes as it follows the evolution of concentrations or activities of chemical species as a function of time, but it requires a large amount of parameter information that is difficult to find in the literature. Here, we propose a modeling framework based on a qualitative approach that is intrinsically continuous in time. The algorithm presented in this article fills the gap between qualitative and quantitative modeling. It is based on a continuous-time Markov process applied on a Boolean state space. In order to describe the temporal evolution of the biological process we wish to model, we explicitly specify the transition rates for each node. For that purpose, we built a language that can be seen as a generalization of Boolean equations.
Mathematically, this approach can be translated into a set of ordinary differential
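The scheme described in the record above, a continuous-time Markov process on a Boolean state space, can be sketched with Gillespie's algorithm. This is a hedged toy illustration, not the authors' implementation: the two-node network, its rules, and the per-node rates are hypothetical choices.

```python
import random

# Toy network: A is active when B is off; B follows A (a negative feedback loop).
rules = {"A": lambda s: not s["B"], "B": lambda s: s["A"]}
rates = {"A": 1.0, "B": 2.0}  # transition rate assigned to each node

def gillespie(state, t_end, seed=0):
    rng = random.Random(seed)
    t = 0.0
    trajectory = [(t, dict(state))]
    while True:
        # A node is eligible to flip when its current value disagrees with
        # its Boolean rule; its propensity is then its transition rate.
        props = {n: rates[n] for n in state if state[n] != rules[n](state)}
        total = sum(props.values())
        if total == 0.0:  # stable state: no eligible transitions
            break
        t += rng.expovariate(total)  # exponentially distributed waiting time
        if t > t_end:
            break
        # Pick the flipping node with probability proportional to its rate.
        r, acc = rng.random() * total, 0.0
        for node, p in props.items():
            acc += p
            if r <= acc:
                state[node] = not state[node]
                break
        trajectory.append((t, dict(state)))
    return trajectory

traj = gillespie({"A": True, "B": False}, t_end=10.0)
```

Because this toy network contains a negative feedback loop, the trajectory oscillates rather than reaching a fixed point; specifying a rate per node flip is the flavor of the "generalized Boolean equations" language the abstract describes.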

  18. Are there realistically interpretable local theories?

    International Nuclear Information System (INIS)

    d'Espagnat, B.

    1989-01-01

    Although it rests on strongly established proofs, the statement that no realistically interpretable local theory is compatible with some experimentally testable predictions of quantum mechanics seems at first sight to be incompatible with a few general ideas and clear-cut statements occurring in recent theoretical work by Griffiths, Omnes, and Ballentine and Jarrett. It is shown here that in fact none of the developments due to these authors can be considered as a realistically interpretable local theory, so that there is no valid reason for suspecting that the existing proofs of the statement in question are all flawed.

  19. Realistic Real-Time Outdoor Rendering in Augmented Reality

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480

  20. Realistic real-time outdoor rendering in augmented reality.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

    Full Text Available Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems.

  1. An overview of bioinformatics methods for modeling biological pathways in yeast.

    Science.gov (United States)

    Hou, Jie; Acharya, Lipi; Zhu, Dongxiao; Cheng, Jianlin

    2016-03-01

    The advent of high-throughput genomics techniques, along with the completion of genome sequencing projects, identification of protein-protein interactions and reconstruction of genome-scale pathways, has accelerated the development of systems biology research in the yeast organism Saccharomyces cerevisiae. In particular, discovery of biological pathways in yeast has become an important forefront in systems biology, which aims to understand the interactions among molecules within a cell leading to certain cellular processes in response to a specific environment. While the existing theoretical and experimental approaches enable the investigation of well-known pathways involved in metabolism, gene regulation and signal transduction, bioinformatics methods offer new insights into computational modeling of biological pathways. A wide range of computational approaches has been proposed in the past for reconstructing biological pathways from high-throughput datasets. Here we review selected bioinformatics approaches for modeling biological pathways in S. cerevisiae, including metabolic pathways, gene-regulatory pathways and signaling pathways. We start by reviewing the research on biological pathways, followed by a discussion of key biological databases. In addition, several representative computational approaches for modeling biological pathways in yeast are discussed. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  2. Generating realistic images using Kray

    Science.gov (United States)

    Tanski, Grzegorz

    2004-07-01

    Kray is an application for creating realistic images. It is written in the C++ programming language, has a text-based interface, and solves the global illumination problem using techniques such as radiosity, path tracing and photon mapping.

  3. Investigation of realistic PET simulations incorporating tumor patient's specificity using anthropomorphic models: Creation of an oncology database

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, Panagiotis; Efthimiou, Nikos; Nikiforidis, George C.; Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 265 04 (Greece); Loudos, George [Department of Biomedical Engineering, Technological Educational Institute of Athens, Ag. Spyridonos Street, Egaleo GR 122 10, Athens (Greece); Le Maitre, Amandine; Hatt, Mathieu; Tixier, Florent; Visvikis, Dimitris [Medical Information Processing Laboratory (LaTIM), National Institute of Health and Medical Research (INSERM), 29609 Brest (France)

    2013-11-15

    Purpose: The GATE Monte Carlo simulation toolkit is used for the implementation of realistic PET simulations incorporating heterogeneous tumor activity distributions. The reconstructed patient images include noise from the acquisition process and the imaging system's performance restrictions, and they have limited spatial resolution. For those reasons, the measured intensity cannot simply be introduced into GATE simulations to reproduce clinical data. Investigation of the heterogeneity distribution within tumors applying partial volume correction (PVC) algorithms was assessed. The purpose of the present study was to create a simulated oncology database based on clinical data with realistic intratumor uptake heterogeneity properties. Methods: PET/CT data of seven oncology patients were used in order to create a realistic tumor database investigating the heterogeneous activity distribution of the simulated tumors. The anthropomorphic models (NURBS-based cardiac torso and Zubal phantoms) were adapted to the CT data of each patient, and the activity distribution was extracted from the respective PET data. The patient-specific models were simulated with the Geant4 Application for Tomographic Emission (GATE) at three different levels for each case: (a) using homogeneous activity within the tumor, (b) using heterogeneous activity distribution in every voxel within the tumor as extracted from the PET image, and (c) using heterogeneous activity distribution corresponding to the clinical image following PVC. The three different types of simulated data in each case were reconstructed with two iterations and filtered with a 3D Gaussian postfilter, in order to simulate the intratumor heterogeneous uptake. Heterogeneity in all generated images was quantified using textural-feature-derived parameters in 3D according to the ground truth of the simulation, and compared to clinical measurements. Finally, profiles were plotted in central slices of the tumors, across lines

  4. River, delta and coastal morphological response accounting for biological dynamics

    Directory of Open Access Journals (Sweden)

    W. Goldsmith

    2015-03-01

    Full Text Available Management and construction can increase resilience in the face of climate change, and the benefits can be enhanced through the integration of biogenic materials, including shells and vegetation. Rivers and coastal landforms are dynamic systems that respond to intentional and unintended manipulation of critical factors, often with unforeseen and/or undesirable effects. River management strategies have impacts that extend to deltas and coastal areas, which are increasingly vulnerable to climate change with respect to sea level rise and storm intensity. Whereas conventional assessment and analysis of rivers and coasts has relied on modelling of hydrology, hydraulics and sediment transport, incorporating additional biological factors can offer more comprehensive, beneficial and realistic alternatives. Suitable modelling tools can provide improved decision support. The question has been whether current models can effectively address biological responses with suitable reliability and efficiency. Since morphodynamic evolution exhibits its effects over long timescales, the choice of mathematical model is not trivial and depends upon the availability of data, as well as the spatial extent, timelines and computational effort desired. The ultimate goal of the work is to set up a conveniently simplified river morphodynamic model, coupled with a plant population dynamics model, able to predict the long-term evolution of large alluvial river systems managed through bioengineering. This paper presents the first step of the work, related to the application of the model under stationary vegetation conditions. Sensitivity analysis has been performed on the main hydraulic, sedimentological, and biological parameters. The model has been applied to significant river training works in Europe, Asia and North America, and comparative analysis has been used to validate analytical solutions. Data gaps and further areas for investigation are identified.

  5. The Strategies of Modeling in Biology Education

    Science.gov (United States)

    Svoboda, Julia; Passmore, Cynthia

    2013-01-01

    Modeling, like inquiry more generally, is not a single method, but rather a complex suite of strategies. Philosophers of biology, citing the diverse aims, interests, and disciplinary cultures of biologists, argue that modeling is best understood in the context of its epistemic aims and cognitive payoffs. In the science education literature,…

  6. Ultrafast spectroscopy of model biological membranes

    NARCIS (Netherlands)

    Ghosh, Avishek

    2009-01-01

    In this PhD thesis, I have described the novel time-resolved sum-frequency generation (TR-SFG) spectroscopic technique that I developed during the course of my PhD research and used to study the ultrafast vibrational, structural and orientational dynamics of water molecules at model biological

  7. Interactive wood combustion for botanical tree models

    KAUST Repository

    Pirk, Sören

    2017-11-22

    We present a novel method for the combustion of botanical tree models. Tree models are represented as connected particles for the branching structure and a polygonal surface mesh for the combustion. Each particle stores biological and physical attributes that drive the kinetic behavior of a plant and the exothermic reaction of the combustion. Coupled with realistic physics for rods, the particles enable dynamic branch motions. We model material properties, such as moisture and charring behavior, and associate them with individual particles. The combustion is efficiently processed in the surface domain of the tree model on a polygonal mesh. A user can dynamically interact with the model by initiating fires and by inducing stress on branches. The flames realistically propagate through the tree model by consuming the available resources. Our method runs at interactive rates and supports multiple tree instances in parallel. We demonstrate the effectiveness of our approach through numerous examples and evaluate its plausibility against the combustion of real wood samples.

  8. SEEK: a systems biology data and model management platform.

    NARCIS (Netherlands)

    Wolstencroft, K.J.; Owen, S.; Krebs, O.; Nguyen, Q.; Stanford, N.J.; Golebiewski, M.; Weidemann, A.; Bittkowski, M.; An, L.; Shockley, D.; Snoep, J.L.; Mueller, W.; Goble, C.

    2015-01-01

    Background: Systems biology research typically involves the integration and analysis of heterogeneous data types in order to model and predict biological processes. Researchers therefore require tools and resources to facilitate the sharing and integration of data, and for linking of data to systems

  9. BayesMD: flexible biological modeling for motif discovery

    DEFF Research Database (Denmark)

    Tang, Man-Hung Eric; Krogh, Anders; Winther, Ole

    2008-01-01

    We present BayesMD, a Bayesian Motif Discovery model with several new features. Three different types of biological a priori knowledge are built into the framework in a modular fashion. A mixture of Dirichlets is used as prior over nucleotide probabilities in binding sites. It is trained on trans...

  10. Effects of classical and neo-classical cross-field transport of tungsten impurity in realistic tokamak geometry

    Energy Technology Data Exchange (ETDEWEB)

    Yamoto, S.; Inoue, H.; Sawada, Y.; Hatayama, A. [Faculty of Science and Technology, Keio University, Yokohama (Japan); Homma, Y.; Hoshino, K. [Japan Atomic Energy Agency, Rokkasho, Aomori (Japan); Bonnin, X. [ITER Organization, St. Paul Lez Durance (France); Coster, D. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Schneider, R. [Ernst-Moritz-Arndt University Greifswald (Germany)

    2016-08-15

    An initial simulation study of neoclassical perpendicular self-diffusion transport in the SOL/divertor regions for a realistic tokamak geometry has been performed with the IMPGYRO code. One of the most distinctive features of the IMPGYRO code is that it calculates the exact Larmor orbit of each test particle instead of assuming the guiding-center approximation. Therefore, the effects of magnetic drifts in realistic tokamaks are naturally taken into account in the IMPGYRO code. This feature makes it possible to calculate neoclassical transport processes, which can become large in the SOL/divertor plasma. Indeed, the neoclassical self-diffusion process, the resultant effect of the combination of magnetic drift and Coulomb collisions with background ions, has already been included in the IMPGYRO model. In the present paper, prior to implementing a detailed model of neoclassical transport in IMPGYRO, we have investigated the effect of neoclassical self-diffusion in a realistic tokamak geometry with a lower single-null X-point. We also use a model with the guiding-center approximation in order to compare with the IMPGYRO full-orbit model. The preliminary calculation results of each model show differences in the perpendicular average velocity of impurity ions at the top region of the SOL. The mechanism that leads to this difference is discussed. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  11. Preservice Biology Teachers' Conceptions about the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-01-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories…

  12. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel; Denny, Jory; Burgos, Juan; Mahadevan, Aditya; Manavi, Kasra; Murray, Luke; Kodochygov, Anton; Zourntos, Takis; Amato, Nancy M.

    2011-01-01

    be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility

  13. On Realistically Attacking Tor with Website Fingerprinting

    Directory of Open Access Journals (Sweden)

    Wang Tao

    2016-10-01

    Full Text Available Website fingerprinting allows a local, passive observer monitoring a web-browsing client’s encrypted channel to determine her web activity. Previous attacks have shown that website fingerprinting could be a threat to anonymity networks such as Tor under laboratory conditions. However, there are significant differences between laboratory conditions and realistic conditions. First, in laboratory tests we collect the training data set together with the testing data set, so the training data set is fresh, but an attacker may not be able to maintain a fresh data set. Second, laboratory packet sequences correspond to a single page each, but for realistic packet sequences the split between pages is not obvious. Third, packet sequences may include background noise from other types of web traffic. These differences adversely affect website fingerprinting under realistic conditions. In this paper, we tackle these three problems to bridge the gap between laboratory and realistic conditions for website fingerprinting. We show that we can maintain a fresh training set with minimal resources. We demonstrate several classification-based techniques that allow us to split full packet sequences effectively into sequences corresponding to a single page each. We describe several new algorithms for tackling background noise. With our techniques, we are able to build the first website fingerprinting system that can operate directly on packet sequences collected in the wild.

  14. Combining NMR ensembles and molecular dynamics simulations provides more realistic models of protein structures in solution and leads to better chemical shift prediction

    International Nuclear Information System (INIS)

    Lehtivarjo, Juuso; Tuppurainen, Kari; Hassinen, Tommi; Laatikainen, Reino; Peräkylä, Mikael

    2012-01-01

    While chemical shifts are invaluable for obtaining structural information from proteins, they also offer one of the rare ways to obtain information about protein dynamics. A necessary tool in transforming chemical shifts into structural and dynamic information is chemical shift prediction. In our previous work we developed a method for 4D prediction of protein ¹H chemical shifts in which molecular motions, the 4th dimension, were modeled using molecular dynamics (MD) simulations. Although the approach clearly improved the prediction, the X-ray structures and single NMR conformers used in the model cannot be considered fully realistic models of proteins in solution. In this work, NMR ensembles (NMRE) were used to expand the conformational space of proteins (e.g. side chains, flexible loops, termini), followed by MD simulations for each conformer to map the local fluctuations. Compared with the non-dynamic model, the NMRE+MD model gave 6–17% lower root-mean-square (RMS) errors for different backbone nuclei. The improved prediction indicates that NMR ensembles with MD simulations can be used to obtain a more realistic picture of protein structures in solution and moreover underlines the importance of short and long time-scale dynamics for the prediction. The RMS errors of the NMRE+MD model were 0.24, 0.43, 0.98, 1.03, 1.16 and 2.39 ppm for ¹Hα, ¹HN, ¹³Cα, ¹³Cβ, ¹³CO and backbone ¹⁵N chemical shifts, respectively. The model is implemented in the prediction program 4DSPOT, available at http://www.uef.fi/4dspot.

  15. Combining NMR ensembles and molecular dynamics simulations provides more realistic models of protein structures in solution and leads to better chemical shift prediction

    Energy Technology Data Exchange (ETDEWEB)

    Lehtivarjo, Juuso, E-mail: juuso.lehtivarjo@uef.fi; Tuppurainen, Kari; Hassinen, Tommi; Laatikainen, Reino [University of Eastern Finland, School of Pharmacy (Finland)]; Peräkylä, Mikael [University of Eastern Finland, Institute of Biomedicine (Finland)]

    2012-03-15

    While chemical shifts are invaluable for obtaining structural information from proteins, they also offer one of the rare ways to obtain information about protein dynamics. A necessary tool in transforming chemical shifts into structural and dynamic information is chemical shift prediction. In our previous work we developed a method for 4D prediction of protein ¹H chemical shifts in which molecular motions, the 4th dimension, were modeled using molecular dynamics (MD) simulations. Although the approach clearly improved the prediction, the X-ray structures and single NMR conformers used in the model cannot be considered fully realistic models of proteins in solution. In this work, NMR ensembles (NMRE) were used to expand the conformational space of proteins (e.g. side chains, flexible loops, termini), followed by MD simulations for each conformer to map the local fluctuations. Compared with the non-dynamic model, the NMRE+MD model gave 6-17% lower root-mean-square (RMS) errors for different backbone nuclei. The improved prediction indicates that NMR ensembles with MD simulations can be used to obtain a more realistic picture of protein structures in solution and moreover underlines the importance of short and long time-scale dynamics for the prediction. The RMS errors of the NMRE+MD model were 0.24, 0.43, 0.98, 1.03, 1.16 and 2.39 ppm for ¹Hα, ¹HN, ¹³Cα, ¹³Cβ, ¹³CO and backbone ¹⁵N chemical shifts, respectively. The model is implemented in the prediction program 4DSPOT, available at http://www.uef.fi/4dspot.
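The RMS error metric quoted in the two records above is straightforward to compute from paired predicted and observed shifts. A minimal sketch (the shift values below are illustrative, not output of 4DSPOT):

```python
import math

def rms_error(predicted, observed):
    """Root-mean-square error between predicted and observed chemical shifts (ppm)."""
    if len(predicted) != len(observed):
        raise ValueError("sequences must have equal length")
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted))

# Hypothetical 1H-alpha shifts (ppm) for a short stretch of residues
pred = [4.32, 4.55, 4.10, 4.71, 4.48]
obs = [4.40, 4.50, 4.05, 4.90, 4.30]
print(round(rms_error(pred, obs), 3))  # 0.126
```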

  16. Normal and Pathological NCAT Image and PhantomData Based onPhysiologically Realistic Left Ventricle Finite-Element Models

    Energy Technology Data Exchange (ETDEWEB)

    Veress, Alexander I.; Segars, W. Paul; Weiss, Jeffrey A.; Tsui,Benjamin M.W.; Gullberg, Grant T.

    2006-08-02

    The 4D NURBS-based Cardiac-Torso (NCAT) phantom, which provides a realistic model of the normal human anatomy and cardiac and respiratory motions, is used in medical imaging research to evaluate and improve imaging devices and techniques, especially dynamic cardiac applications. One limitation of the phantom is that it lacks the ability to accurately simulate altered functions of the heart that result from cardiac pathologies such as coronary artery disease (CAD). The goal of this work was to enhance the 4D NCAT phantom by incorporating a physiologically based, finite-element (FE) mechanical model of the left ventricle (LV) to simulate both normal and abnormal cardiac motions. The geometry of the FE mechanical model was based on gated high-resolution x-ray multi-slice computed tomography (MSCT) data of a healthy male subject. The myocardial wall was represented as transversely isotropic hyperelastic material, with the fiber angle varying from -90 degrees at the epicardial surface, through 0 degrees at the mid-wall, to 90 degrees at the endocardial surface. A time-varying elastance model was used to simulate fiber contraction, and physiological intraventricular systolic pressure-time curves were applied to simulate the cardiac motion over the entire cardiac cycle. To demonstrate the ability of the FE mechanical model to accurately simulate the normal cardiac motion as well as abnormal motions indicative of CAD, a normal case and two pathologic cases were simulated and analyzed. In the first pathologic model, a subendocardial anterior ischemic region was defined. A second model was created with a transmural ischemic region defined in the same location. The FE-based deformations were incorporated into the 4D NCAT cardiac model through the control points that define the cardiac structures in the phantom, which were set to move according to the predictions of the mechanical model. A simulation study was performed using the FE-NCAT combination to investigate how the differences in contractile function
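The transmural fiber angle described above (-90 degrees at the epicardium, 0 degrees at mid-wall, +90 degrees at the endocardium) can be sketched as an interpolation over normalized wall depth; the linear form below is an assumption for illustration, not necessarily the paper's exact rule:

```python
def fiber_angle(depth):
    """Fiber angle in degrees at normalized wall depth:
    0.0 = epicardial surface (-90 deg), 0.5 = mid-wall (0 deg),
    1.0 = endocardial surface (+90 deg). Assumes linear transmural variation."""
    if not 0.0 <= depth <= 1.0:
        raise ValueError("depth must be in [0, 1]")
    return -90.0 + 180.0 * depth

print([fiber_angle(d) for d in (0.0, 0.5, 1.0)])  # [-90.0, 0.0, 90.0]
```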

  17. Iterated interactions method. Realistic NN potential

    International Nuclear Information System (INIS)

    Gorbatov, A.M.; Skopich, V.L.; Kolganova, E.A.

    1991-01-01

    The method of iterated potential is tested in the case of realistic fermionic systems. As a base for comparison, calculations of the ¹⁶O system (using various versions of realistic NN potentials) by means of the angular potential-function method, as well as operators of pairing correlation, were used. The convergence of the genealogical series is studied for the central Malfliet-Tjon potential. In addition, the mathematical technique of microscopic calculations is improved: new equations for correlators in odd states are suggested, and the technique of leading terms is applied for the first time to calculations of heavy p-shell nuclei in the basis of angular potential functions

  18. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions, their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand, it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand, the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  19. A novel model to assess the efficacy of steam surface pasteurization of cooked surimi gels inoculated with realistic levels of Listeria innocua.

    Science.gov (United States)

    Skåra, Torstein; Valdramidis, Vasilis P; Rosnes, Jan Thomas; Noriega, Estefanía; Van Impe, Jan F M

    2014-12-01

    Steam surface pasteurization is a promising decontamination technology for reducing pathogenic bacteria in different stages of food production. The effect of the artificial inoculation type and initial microbial load, however, has not been thoroughly assessed in the context of inactivation studies. In order to optimize the efficacy of the technology, the aim of this study was to design and validate a model system for steam surface pasteurization, assessing different inoculation methods and realistic microbial levels. More specifically, the response of Listeria innocua, a surrogate organism of Listeria monocytogenes, on a model fish product, and the effect of different inoculation levels following treatments with a steam surface pasteurization system was investigated. The variation in the resulting inoculation level on the samples was too large (77%) for the contact inoculation procedure to be further considered. In contrast, the variation of a drop inoculation procedure was 17%. Inoculation with high levels showed a rapid 1-2 log decrease after 3-5 s, and then no further inactivation beyond 20 s. A low level inoculation study was performed by analysing the treated samples using a novel contact plating approach, which can be performed without sample homogenization and dilution. Using logistic regression, results from this method were used to model the binary responses of Listeria on surfaces with realistic inoculation levels. According to this model, a treatment time of 23 s will result in a 1 log reduction (for P = 0.1). Copyright © 2014 Elsevier Ltd. All rights reserved.
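The logistic model relating treatment time to the binary Listeria response can be inverted to find the time that yields a target survival probability. A sketch with hypothetical coefficients chosen so that P = 0.1 falls near the reported 23 s (these are not the study's fitted values):

```python
import math

def logit(p):
    """Log-odds transform."""
    return math.log(p / (1.0 - p))

def time_for_survival_prob(p, b0, b1):
    """Invert the logistic response P(t) = 1 / (1 + exp(-(b0 + b1*t)))
    for the treatment time t at which the survival probability equals p."""
    return (logit(p) - b0) / b1

# Hypothetical coefficients chosen so that P = 0.1 near t = 23 s
b0, b1 = 2.4, -0.2
t = time_for_survival_prob(0.1, b0, b1)
print(round(t, 1))  # 23.0
```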

  20. Invited review: liquid crystal models of biological materials and silk spinning.

    Science.gov (United States)

    Rey, Alejandro D; Herrera-Valencia, Edtson E

    2012-06-01

    A review of thermodynamic, materials science, and rheological liquid crystal models is presented and applied to a wide range of biological liquid crystals, including helicoidal plywoods, biopolymer solutions, and in vivo liquid crystals. The distinguishing characteristics of liquid crystals (self-assembly, packing, defects, functionalities, processability) are discussed in relation to biological materials and the strong correspondence between different synthetic and biological materials is established. Biological polymer processing based on liquid crystalline precursors includes viscoelastic flow to form and shape fibers. Viscoelastic models for nematic and chiral nematics are reviewed and discussed in terms of key parameters that facilitate understanding and quantitative information from optical textures and rheometers. It is shown that viscoelastic modeling the silk spinning process using liquid crystal theories sheds light on textural transitions in the duct of spiders and silk worms as well as on tactoidal drops and interfacial structures. The range and consistency of the predictions demonstrates that the use of mesoscopic liquid crystal models is another tool to develop the science and biomimetic applications of mesogenic biological soft matter. Copyright © 2011 Wiley Periodicals, Inc.

  1. Dynamics of leaf gas exchange, xylem and phloem transport, water potential and carbohydrate concentration in a realistic 3-D model tree crown.

    Science.gov (United States)

    Nikinmaa, Eero; Sievänen, Risto; Hölttä, Teemu

    2014-09-01

    Tree models simulate productivity using general gas exchange responses and structural relationships, but they rarely check whether leaf gas exchange and resulting water and assimilate transport and driving pressure gradients remain within acceptable physical boundaries. This study presents an implementation of the cohesion-tension theory of xylem transport and the Münch hypothesis of phloem transport in a realistic 3-D tree structure and assesses the gas exchange and transport dynamics. A mechanistic model of xylem and phloem transport was used, together with a tested leaf assimilation and transpiration model in a realistic tree architecture to simulate leaf gas exchange and water and carbohydrate transport within an 8-year-old Scots pine tree. The model solved the dynamics of the amounts of water and sucrose solute in the xylem, cambium and phloem using a fine-grained mesh with a system of coupled ordinary differential equations. The simulations predicted the observed patterns of pressure gradients and sugar concentration. Diurnal variation of environmental conditions influenced tree-level gradients in turgor pressure and sugar concentration, which are important drivers of carbon allocation. The results and between-shoot variation were sensitive to structural and functional parameters such as tree-level scaling of conduit size and phloem unloading. Linking whole-tree-level water and assimilate transport, gas exchange and sink activity opens a new avenue for plant studies, as features that are difficult to measure can be studied dynamically with the model. Tree-level responses to local and external conditions can be tested, thus making the approach described here a good test-bench for studies of whole-tree physiology.
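The fine-grained mesh of coupled ordinary differential equations can be illustrated with a toy one-dimensional chain of compartments exchanging solute down its gradient; the parameters and the explicit Euler scheme are illustrative, not those of the actual xylem/phloem model:

```python
def simulate_transport(amounts, k=0.1, dt=0.01, steps=1000):
    """Toy gradient-driven exchange along a chain of compartments:
    d a_i/dt = k*(a_{i-1} - a_i) + k*(a_{i+1} - a_i), with no-flux ends.
    Explicit Euler integration; the antisymmetric flux conserves the total."""
    a = list(amounts)
    n = len(a)
    for _ in range(steps):
        flux = [0.0] * n
        for i in range(n - 1):
            f = k * (a[i] - a[i + 1])  # flux from compartment i to i+1
            flux[i] -= f
            flux[i + 1] += f
        a = [x + dt * fx for x, fx in zip(a, flux)]
    return a

out = simulate_transport([10.0, 0.0, 0.0, 0.0])
print(round(sum(out), 6))  # 10.0 -- total amount is conserved
```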

  2. A realistic multimodal modeling approach for the evaluation of distributed source analysis: application to sLORETA

    Science.gov (United States)

    Cosandier-Rimélé, D.; Ramantani, G.; Zentner, J.; Schulze-Bonhage, A.; Dümpelmann, M.

    2017-10-01

    Objective. Electrical source localization (ESL) deriving from scalp EEG and, in recent years, from intracranial EEG (iEEG), is an established method in epilepsy surgery workup. We aimed to validate the distributed ESL derived from scalp EEG and iEEG, particularly regarding the spatial extent of the source, using a realistic epileptic spike activity simulator. Approach. ESL was applied to the averaged scalp EEG and iEEG spikes of two patients with drug-resistant structural epilepsy. The ESL results for both patients were used to outline the location and extent of epileptic cortical patches, which served as the basis for designing a spatiotemporal source model. EEG signals for both modalities were then generated for different anatomic locations and spatial extents. ESL was subsequently performed on simulated signals with sLORETA, a commonly used distributed algorithm. ESL accuracy was quantitatively assessed for iEEG and scalp EEG. Main results. The source volume was overestimated by sLORETA at both EEG scales, with the error increasing with source size, particularly for iEEG. For larger sources, ESL accuracy drastically decreased, and reconstruction volumes shifted to the center of the head for iEEG, while remaining stable for scalp EEG. Overall, the mislocalization of the reconstructed source was more pronounced for iEEG. Significance. We present a novel multiscale framework for the evaluation of distributed ESL, based on realistic multiscale EEG simulations. Our findings support that reconstruction results for scalp EEG are often more accurate than for iEEG, owing to the superior 3D coverage of the head. Particularly the iEEG-derived reconstruction results for larger, widespread generators should be treated with caution.

  3. A realistic multimodal modeling approach for the evaluation of distributed source analysis: application to sLORETA.

    Science.gov (United States)

    Cosandier-Rimélé, D; Ramantani, G; Zentner, J; Schulze-Bonhage, A; Dümpelmann, M

    2017-10-01

    Electrical source localization (ESL) deriving from scalp EEG and, in recent years, from intracranial EEG (iEEG), is an established method in epilepsy surgery workup. We aimed to validate the distributed ESL derived from scalp EEG and iEEG, particularly regarding the spatial extent of the source, using a realistic epileptic spike activity simulator. ESL was applied to the averaged scalp EEG and iEEG spikes of two patients with drug-resistant structural epilepsy. The ESL results for both patients were used to outline the location and extent of epileptic cortical patches, which served as the basis for designing a spatiotemporal source model. EEG signals for both modalities were then generated for different anatomic locations and spatial extents. ESL was subsequently performed on simulated signals with sLORETA, a commonly used distributed algorithm. ESL accuracy was quantitatively assessed for iEEG and scalp EEG. The source volume was overestimated by sLORETA at both EEG scales, with the error increasing with source size, particularly for iEEG. For larger sources, ESL accuracy drastically decreased, and reconstruction volumes shifted to the center of the head for iEEG, while remaining stable for scalp EEG. Overall, the mislocalization of the reconstructed source was more pronounced for iEEG. We present a novel multiscale framework for the evaluation of distributed ESL, based on realistic multiscale EEG simulations. Our findings support that reconstruction results for scalp EEG are often more accurate than for iEEG, owing to the superior 3D coverage of the head. Particularly the iEEG-derived reconstruction results for larger, widespread generators should be treated with caution.
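Distributed algorithms such as sLORETA build on a regularized minimum-norm inverse of the lead-field matrix. A generic Tikhonov minimum-norm sketch (omitting sLORETA's standardization step; the lead field here is random toy data, not a head model):

```python
import numpy as np

def minimum_norm_estimate(L, y, lam=1e-3):
    """Regularized minimum-norm inverse: the source vector j minimizing
    ||y - L j||^2 + lam*||j||^2, i.e. j = L^T (L L^T + lam*I)^{-1} y.
    L: (n_sensors, n_sources) lead field; y: (n_sensors,) measurements."""
    n = L.shape[0]
    return L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n), y)

rng = np.random.default_rng(0)
L = rng.standard_normal((8, 20))  # toy lead field: 8 sensors, 20 sources
j_true = np.zeros(20)
j_true[5] = 1.0                   # single active source
y = L @ j_true
j_hat = minimum_norm_estimate(L, y)
print(j_hat.shape)  # (20,)
```

Because the problem is underdetermined (8 sensors, 20 sources), the estimate spreads activity over many sources, which is the blur-and-mislocalization behaviour the study quantifies.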

  4. Development of a Value Inquiry Model in Biology Education.

    Science.gov (United States)

    Jeong, Eun-Young; Kim, Young-Soo

    2000-01-01

    Points out the rapid advances in biology, increasing bioethical issues, and how students need to make rational decisions. Introduces a value inquiry model development that includes identifying and clarifying value problems; understanding biological knowledge related to conflict situations; considering, selecting, and evaluating each alternative;…

  5. Evolving cell models for systems and synthetic biology.

    Science.gov (United States)

    Cao, Hongqing; Romero-Campero, Francisco J; Heeb, Stephan; Cámara, Miguel; Krasnogor, Natalio

    2010-03-01

    This paper proposes a new methodology for the automated design of cell models for systems and synthetic biology. Our modelling framework is based on P systems, a discrete, stochastic and modular formal modelling language. The automated design of biological models comprising the optimization of the model structure and its stochastic kinetic constants is performed using an evolutionary algorithm. The evolutionary algorithm evolves model structures by combining different modules taken from a predefined module library and then it fine-tunes the associated stochastic kinetic constants. We investigate four alternative objective functions for the fitness calculation within the evolutionary algorithm: (1) equally weighted sum method, (2) normalization method, (3) randomly weighted sum method, and (4) equally weighted product method. The effectiveness of the methodology is tested on four case studies of increasing complexity including negative and positive autoregulation as well as two gene networks implementing a pulse generator and a bandwidth detector. We provide a systematic analysis of the evolutionary algorithm's results as well as of the resulting evolved cell models.
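The four objective-function variants listed in the abstract can be sketched directly; the implementations below are straightforward readings of their names, not the paper's exact formulas:

```python
import random

def equally_weighted_sum(objectives):
    """(1) Average of the objective values, all weights equal."""
    return sum(objectives) / len(objectives)

def normalized_sum(objectives, ranges):
    """(2) Each objective is scaled into [0, 1] by its (min, max) range before averaging."""
    return sum((o - lo) / (hi - lo) for o, (lo, hi) in zip(objectives, ranges)) / len(objectives)

def randomly_weighted_sum(objectives, rng=random.Random(42)):
    """(3) Convex combination with weights drawn at random on each call."""
    w = [rng.random() for _ in objectives]
    s = sum(w)
    return sum(wi / s * o for wi, o in zip(w, objectives))

def equally_weighted_product(objectives):
    """(4) Product of the objective values."""
    p = 1.0
    for o in objectives:
        p *= o
    return p

objs = [2.0, 8.0]
print(equally_weighted_sum(objs), equally_weighted_product(objs))  # 5.0 16.0
```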

  6. Realistic modelling of external flooding scenarios - A multi-disciplinary approach

    International Nuclear Information System (INIS)

    Brinkman, J.L.

    2014-01-01

    against flooding and timing of the events into account as basis for the development and screening of flooding scenarios. Realistic modelling of external flooding scenarios in a PSA requires a multi-disciplinary approach. Next to being thoroughly familiar with the design features of the plant against flooding, like its critical elevations for safety (related) equipment and the strength of buildings, additional knowledge is necessary on design of flood protection measures as dikes and dunes, their failure behaviour and modelling. The approach does not change the basic flooding scenarios - the event tree structure - itself, but impacts the initiating event of the specific flooding scenarios. (authors)

  7. Programming biological models in Python using PySB.

    Science.gov (United States)

    Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K

    2013-01-01

    Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
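The macro idea behind PySB — functions that expand high-level biochemical actions into elementary reactions — can be illustrated in plain Python without the actual PySB API (the dictionaries and species names below are a hypothetical stand-in, not PySB objects):

```python
def bind(a, b, kf, kr):
    """Macro: expand a reversible binding action into elementary reactions."""
    bound = f"{a}:{b}"
    return [
        {"reactants": [a, b], "products": [bound], "rate": kf},
        {"reactants": [bound], "products": [a, b], "rate": kr},
    ]

def catalyze(enzyme, substrate, product, kf, kr, kc):
    """Macro: binding followed by conversion, composed from the bind macro."""
    es = f"{enzyme}:{substrate}"
    return bind(enzyme, substrate, kf, kr) + [
        {"reactants": [es], "products": [enzyme, product], "rate": kc},
    ]

# One high-level action (hypothetical apoptosis step) expands to three reactions
model = catalyze("C8", "Bid", "tBid", 1e-6, 1e-3, 1.0)
print(len(model))  # 3
```

Because the macros are ordinary functions, models composed from them can be versioned, tested, and reused like any other program, which is the point the abstract makes.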

  8. The effect of problem posing and problem solving with realistic mathematics education approach to the conceptual understanding and adaptive reasoning

    Science.gov (United States)

    Mahendra, Rengga; Slamet, Isnandar; Budiyono

    2017-12-01

    One of the difficulties students face in learning mathematics is the subject of geometry, which requires them to understand abstract concepts. The aim of this research is to determine the effect of the Problem Posing and Problem Solving learning models with a Realistic Mathematics Education Approach on conceptual understanding and students' adaptive reasoning in learning mathematics. This research uses a quasi-experimental design. The population of this research comprises all seventh-grade students of Junior High School 1 Jaten, Indonesia. The sample was taken using a stratified cluster random sampling technique. The research hypotheses were tested using t-tests. The results of this study indicate that the Problem Posing learning model with a Realistic Mathematics Education Approach significantly improves students' conceptual understanding in mathematics learning. In addition, the results also showed that the Problem Solving learning model with a Realistic Mathematics Education Approach significantly improves students' adaptive reasoning in learning mathematics. Therefore, the Problem Posing and Problem Solving learning models with a Realistic Mathematics Education Approach are appropriately applied in mathematics learning, especially on the subject of geometry, so as to improve conceptual understanding and students' adaptive reasoning and, in turn, student achievement.

  9. Stochastic processes, multiscale modeling, and numerical methods for computational cellular biology

    CERN Document Server

    2017-01-01

    This book focuses on the modeling and mathematical analysis of stochastic dynamical systems along with their simulations. The collected chapters will review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as mRNA and proteins that take part in biochemical reactions driving cellular processes. When trying to describe such biological processes, the traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depend upon whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of s...

  10. A distributed approach for parameters estimation in System Biology models

    International Nuclear Information System (INIS)

    Mosca, E.; Merelli, I.; Alfieri, R.; Milanesi, L.

    2009-01-01

    Due to the lack of experimental measurements, biological variability and experimental errors, the values of many parameters of systems biology mathematical models are still unknown or uncertain. A possible computational solution is parameter estimation, that is, the identification of the parameter values that give the best model fit with respect to experimental data. We have developed an environment that distributes each run of the parameter estimation algorithm to a different computational resource. The key feature of the implementation is a relational database that allows the user to swap the candidate solutions among the working nodes during the computations. The comparison of the distributed implementation with the parallel one showed that the presented approach enables a faster and better parameter estimation of systems biology models.
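The swap-based distributed scheme can be caricatured in a single process: several "nodes" run local random searches and periodically adopt the best candidate from a shared pool, which stands in for the relational database. All parameters and the test model below are illustrative:

```python
import random

def fit_error(params, data, model):
    """Sum of squared deviations between model predictions and data."""
    return sum((model(x, params) - y) ** 2 for x, y in data)

def distributed_search(data, model, n_nodes=4, rounds=20, local_steps=50, seed=1):
    """Toy version of the swap scheme: each node does a local random search;
    after every round the best candidate is written to a shared pool
    (standing in for the relational database) and adopted by all nodes.
    The proposal step size shrinks each round to refine the estimate."""
    rng = random.Random(seed)
    nodes = [[rng.uniform(-5.0, 5.0), rng.uniform(-5.0, 5.0)] for _ in range(n_nodes)]
    for r in range(rounds):
        sigma = 0.5 * 0.8 ** r
        for i, start in enumerate(nodes):
            best, best_err = start, fit_error(start, data, model)
            for _ in range(local_steps):
                cand = [v + rng.gauss(0.0, sigma) for v in best]
                err = fit_error(cand, data, model)
                if err < best_err:
                    best, best_err = cand, err
            nodes[i] = best
        pool_best = min(nodes, key=lambda p: fit_error(p, data, model))
        nodes = [list(pool_best) for _ in nodes]  # swap via the shared pool
    return pool_best

line = lambda x, p: p[0] * x + p[1]             # hypothetical model: y = a*x + b
data = [(x, 2.0 * x + 1.0) for x in range(5)]   # synthetic data with a=2, b=1
p = distributed_search(data, line)
print([round(v) for v in p])  # close to the true values [2, 1]
```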

  11. Modeling biological gradient formation: combining partial differential equations and Petri nets.

    Science.gov (United States)

    Bertens, Laura M F; Kleijn, Jetty; Hille, Sander C; Heiner, Monika; Koutny, Maciej; Verbeek, Fons J

    2016-01-01

    Both Petri nets and differential equations are important modeling tools for biological processes. In this paper we demonstrate how these two modeling techniques can be combined to describe biological gradient formation. Parameters derived from a partial differential equation describing the process of gradient formation are incorporated into an abstract Petri net model. The quantitative aspects of the resulting model are validated through a case study of gradient formation in the fruit fly.
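The link between the PDE and the Petri net can be illustrated by discretizing a reaction-diffusion equation into compartments: the hop rate D/dx² and decay rate k computed from the PDE are exactly the rates one would attach to the transitions of a stochastic Petri net, where places are compartments and tokens are molecules. A deterministic sketch with toy parameters:

```python
def simulate_gradient(n=20, D=1.0, dx=1.0, k=0.1, source=1.0, dt=0.05, steps=4000):
    """Morphogen gradient from a discretized reaction-diffusion PDE:
    du_i/dt = D*(u_{i-1} - 2*u_i + u_{i+1})/dx**2 - k*u_i + s_i,
    with production s only in the first compartment and no-flux ends.
    D/dx**2 (hop) and k (decay) are the Petri-net transition rates."""
    u = [0.0] * n
    a = D / dx ** 2
    for _ in range(steps):
        new = []
        for i in range(n):
            left = u[i - 1] if i > 0 else u[i]      # no-flux boundary
            right = u[i + 1] if i < n - 1 else u[i]  # no-flux boundary
            s = source if i == 0 else 0.0
            new.append(u[i] + dt * (a * (left - 2.0 * u[i] + right) - k * u[i] + s))
        u = new
    return u

profile = simulate_gradient()
print(profile[0] > profile[-1])  # True: a decreasing morphogen gradient
```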

  12. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  13. Structural Identifiability of Dynamic Systems Biology Models.

    Science.gov (United States)

    Villaverde, Alejandro F; Barreiro, Antonio; Papachristodoulou, Antonis

    2016-10-01

    A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas.
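STRIKE-GOLDD works symbolically with Lie derivatives; a cruder numerical proxy for the same question checks the rank of a finite-difference output-sensitivity matrix. In the toy model below, parameters a and b appear only through the product a*b, so the rank is one less than the number of parameters, flagging the unidentifiability:

```python
import numpy as np

def sensitivity_rank(output, p0, t_grid, eps=1e-6, tol=1e-6):
    """Numerical identifiability proxy: rank of the finite-difference
    sensitivity matrix d y(t) / d p over a grid of time points.
    Rank < len(p0) signals an unidentifiable parameter combination."""
    base = np.array([output(t, p0) for t in t_grid])
    cols = []
    for j in range(len(p0)):
        p = list(p0)
        p[j] += eps
        pert = np.array([output(t, p) for t in t_grid])
        cols.append((pert - base) / eps)
    return int(np.linalg.matrix_rank(np.column_stack(cols), tol=tol))

# y = a*b*exp(-c*t): only the product a*b is identifiable, so rank is 2, not 3
y = lambda t, p: p[0] * p[1] * np.exp(-p[2] * t)
t_grid = np.linspace(0.0, 5.0, 50)
print(sensitivity_rank(y, [2.0, 3.0, 0.5], t_grid))  # 2
```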

  14. A methodology to annotate systems biology markup language models with the synthetic biology open language.

    Science.gov (United States)

    Roehner, Nicholas; Myers, Chris J

    2014-02-21

    Recently, we have begun to witness the potential of synthetic biology, noted here in the form of bacteria and yeast that have been genetically engineered to produce biofuels, manufacture drug precursors, and even invade tumor cells. The success of these projects, however, has often failed in translation and application to new projects, a problem exacerbated by a lack of engineering standards that combine descriptions of the structure and function of DNA. To address this need, this paper describes a methodology to connect the systems biology markup language (SBML) to the synthetic biology open language (SBOL), existing standards that describe biochemical models and DNA components, respectively. Our methodology involves first annotating SBML model elements such as species and reactions with SBOL DNA components. A graph is then constructed from the model, with vertices corresponding to elements within the model and edges corresponding to the cause-and-effect relationships between these elements. Lastly, the graph is traversed to assemble the annotating DNA components into a composite DNA component, which is used to annotate the model itself and can be referenced by other composite models and DNA components. In this way, our methodology can be used to build up a hierarchical library of models annotated with DNA components. Such a library is a useful input to any future genetic technology mapping algorithm that would automate the process of composing DNA components to satisfy a behavioral specification. Our methodology for SBML-to-SBOL annotation is implemented in the latest version of our genetic design automation (GDA) software tool, iBioSim.
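The graph-traversal step of the methodology can be sketched as a topological walk over the cause-and-effect graph that collects annotating DNA components into a composite. The element and part names below are hypothetical, and this is not iBioSim's actual data model:

```python
from collections import defaultdict

def assemble_composite(elements, edges):
    """Toy version of the annotation workflow: 'elements' maps a model
    element (species/reaction id) to its annotating DNA component (or None);
    'edges' are cause-and-effect pairs (src, dst). A Kahn-style topological
    traversal in causal order collects the parts into a composite component."""
    graph = defaultdict(list)
    indegree = {e: 0 for e in elements}
    for src, dst in edges:
        graph[src].append(dst)
        indegree[dst] += 1
    queue = [e for e in elements if indegree[e] == 0]
    composite = []
    while queue:
        node = queue.pop(0)
        part = elements[node]
        if part is not None:
            composite.append(part)
        for nxt in graph[node]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)
    return composite

elements = {
    "P_tet": "pTet_promoter",  # species annotated with a promoter part
    "tx_gfp": None,            # reaction: no DNA annotation of its own
    "GFP": "gfp_cds",          # species annotated with a CDS part
}
edges = [("P_tet", "tx_gfp"), ("tx_gfp", "GFP")]
print(assemble_composite(elements, edges))  # ['pTet_promoter', 'gfp_cds']
```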

  15. Basic science through engineering? Synthetic modeling and the idea of biology-inspired engineering.

    Science.gov (United States)

    Knuuttila, Tarja; Loettgers, Andrea

    2013-06-01

    Synthetic biology is often understood in terms of the pursuit for well-characterized biological parts to create synthetic wholes. Accordingly, it has typically been conceived of as an engineering dominated and application oriented field. We argue that the relationship of synthetic biology to engineering is far more nuanced than that and involves a sophisticated epistemic dimension, as shown by the recent practice of synthetic modeling. Synthetic models are engineered genetic networks that are implanted in a natural cell environment. Their construction is typically combined with experiments on model organisms as well as mathematical modeling and simulation. What is especially interesting about this combinational modeling practice is that, apart from greater integration between these different epistemic activities, it has also led to the questioning of some central assumptions and notions on which synthetic biology is based. As a result synthetic biology is in the process of becoming more "biology inspired." Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Modelling effects of diquat under realistic exposure patterns in genetically differentiated populations of the gastropod Lymnaea stagnalis.

    Science.gov (United States)

    Ducrot, Virginie; Péry, Alexandre R R; Lagadic, Laurent

    2010-11-12

    Pesticide use leads to complex exposure and response patterns in non-target aquatic species, so that the analysis of data from standard toxicity tests may result in unrealistic risk forecasts. Developing models that are able to capture such complexity from toxicity test data is thus a crucial issue for pesticide risk assessment. In this study, freshwater snails from two genetically differentiated populations of Lymnaea stagnalis were exposed to repeated acute applications of environmentally realistic concentrations of the herbicide diquat, from the embryo to the adult stage. Hatching rate, embryonic development duration, juvenile mortality, feeding rate and age at first spawning were investigated during both exposure and recovery periods. Effects of diquat on mortality were analysed using a threshold hazard model accounting for time-varying herbicide concentrations. All endpoints were significantly impaired at diquat environmental concentrations in both populations. Snail evolutionary history had no significant impact on their sensitivity and responsiveness to diquat, whereas food acted as a modulating factor of toxicant-induced mortality. The time course of effects was adequately described by the model, which thus appears suitable to analyse long-term effects of complex exposure patterns based upon full life cycle experiment data. Obtained model outputs (e.g. no-effect concentrations) could be directly used for chemical risk assessment.
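
    The threshold hazard model with time-varying concentrations can be sketched compactly: the hazard rate is proportional to the amount by which the concentration exceeds a no-effect threshold, and survival is the exponential of the negative integrated hazard. A minimal illustration (the pulse pattern and all parameter values are assumptions for demonstration, not the paper's fitted values):

```python
import math

def concentration(t):
    # Repeated acute applications: a 1-day pulse of 2.0 units each week.
    return 2.0 if (t % 7.0) < 1.0 else 0.0

def survival(T, z=0.5, k=0.3, dt=0.01):
    # Integrate the hazard k * max(0, C(t) - z) over [0, T];
    # z is the no-effect threshold, k the killing rate constant.
    H = sum(k * max(0.0, concentration(i * dt) - z) * dt
            for i in range(int(round(T / dt))))
    return math.exp(-H)

print(survival(28.0))  # survival probability after four weekly pulses
```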

  17. Normal and Pathological NCAT Image and Phantom Data Based on Physiologically Realistic Left Ventricle Finite-Element Models

    International Nuclear Information System (INIS)

    Veress, Alexander I.; Segars, W. Paul; Weiss, Jeffrey A.; Tsui, Benjamin M.W.; Gullberg, Grant T.

    2006-01-01

    The 4D NURBS-based Cardiac-Torso (NCAT) phantom, which provides a realistic model of the normal human anatomy and cardiac and respiratory motions, is used in medical imaging research to evaluate and improve imaging devices and techniques, especially for dynamic cardiac applications. One limitation of the phantom is that it lacks the ability to accurately simulate altered functions of the heart that result from cardiac pathologies such as coronary artery disease (CAD). The goal of this work was to enhance the 4D NCAT phantom by incorporating a physiologically based, finite-element (FE) mechanical model of the left ventricle (LV) to simulate both normal and abnormal cardiac motions. The geometry of the FE mechanical model was based on gated high-resolution x-ray multi-slice computed tomography (MSCT) data of a healthy male subject. The myocardial wall was represented as a transversely isotropic hyperelastic material, with the fiber angle varying from -90 degrees at the epicardial surface, through 0 degrees at the mid-wall, to 90 degrees at the endocardial surface. A time-varying elastance model was used to simulate fiber contraction, and physiological intraventricular systolic pressure-time curves were applied to simulate the cardiac motion over the entire cardiac cycle. To demonstrate the ability of the FE mechanical model to accurately simulate the normal cardiac motion as well as abnormal motions indicative of CAD, a normal case and two pathologic cases were simulated and analyzed. In the first pathologic model, a subendocardial anterior ischemic region was defined. A second model was created with a transmural ischemic region defined in the same location. The FE-based deformations were incorporated into the 4D NCAT cardiac model through the control points that define the cardiac structures in the phantom, which were set to move according to the predictions of the mechanical model. A simulation study was performed using the FE-NCAT combination to investigate how the
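
    The time-varying elastance idea used to drive fiber contraction can be written down in a few lines: ventricular pressure is modeled as an elastance E(t), cycling between diastolic and systolic values, times the volume above an unloaded volume V0. A hedged sketch with illustrative numbers (not the phantom's actual parameters):

```python
import math

def elastance(t, T=0.8, Emin=0.06, Emax=2.0):
    # Half-sine activation over roughly the first 40% of the cycle
    # (systole); elastance in mmHg/mL, cycle length T in seconds.
    phase = (t % T) / T
    act = math.sin(math.pi * phase / 0.4) if phase < 0.4 else 0.0
    return Emin + (Emax - Emin) * act

def lv_pressure(t, V, V0=10.0):
    # Pressure (mmHg) generated at volume V (mL) above unloaded V0.
    return elastance(t) * (V - V0)

print(lv_pressure(0.16, 110.0))  # near peak systole
```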

  18. Dynamic aperture in damping rings with realistic wigglers

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Yunhai; /SLAC

    2005-05-04

    The International Linear Collider based on superconducting RF cavities requires the damping rings to have extremely small equilibrium emittance, huge circumference, fast damping time, and large acceptance. To achieve all of these requirements is a very challenging task. In this paper, we will present a systematic approach to designing the damping rings using simple cells and non-interlaced sextupoles. The designs of the damping rings with various circumferences and shapes, including dogbone, are presented. To model realistic wigglers, we have developed a new hybrid symplectic integrator for fast and accurate evaluation of the dynamic aperture of the lattices.

  19. ADAM: analysis of discrete models of biological systems using computer algebra.

    Science.gov (United States)

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web
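
    The core computation, finding steady states of a discrete model as solutions of polynomial equations over F2, can be illustrated on a tiny Boolean network. ADAM does this algebraically so that it scales to sparse biological networks; the brute-force version below is equivalent for a hypothetical 3-node example:

```python
from itertools import product

def update(state):
    # A toy 3-node Boolean network (illustrative, not a published model).
    a, b, c = state
    return (b and not c,  # a' = b AND NOT c
            a,            # b' = a
            a or c)       # c' = a OR c

# Fixed points (steady-state attractors): states with update(s) == s,
# i.e. solutions of the polynomial system f_i(x) = x_i over F2.
fixed_points = [s for s in product([False, True], repeat=3)
                if update(s) == s]
print(fixed_points)
```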

  20. Modeling dynamics of biological and chemical components of aquatic ecosystems

    International Nuclear Information System (INIS)

    Lassiter, R.R.

    1975-05-01

    To provide capability to model aquatic ecosystems or their subsystems as needed for particular research goals, a modeling strategy was developed. Submodels of several processes common to aquatic ecosystems were developed or adapted from previously existing ones. Included are submodels for photosynthesis as a function of light and depth, biological growth rates as a function of temperature, dynamic chemical equilibrium, feeding and growth, and various types of losses to biological populations. These submodels may be used as modules in the construction of models of subsystems or ecosystems. A preliminary model for the nitrogen cycle subsystem was developed using the modeling strategy and applicable submodels. (U.S.)
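
    One of the submodels mentioned, photosynthesis as a function of light and depth, can be sketched by attenuating surface irradiance exponentially with depth (Beer-Lambert) and applying a saturating light response. All parameter values below are illustrative assumptions:

```python
import math

def light(depth, I0=1000.0, k=0.3):
    # Irradiance declining exponentially with depth (attenuation k per meter).
    return I0 * math.exp(-k * depth)

def photosynthesis(depth, Pmax=5.0, Ik=200.0):
    # Saturating (Michaelis-Menten-like) response to available light.
    I = light(depth)
    return Pmax * I / (Ik + I)

print(photosynthesis(0.0), photosynthesis(10.0))  # surface vs. 10 m depth
```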

  1. A Low-cost System for Generating Near-realistic Virtual Actors

    Science.gov (United States)

    Afifi, Mahmoud; Hussain, Khaled F.; Ibrahim, Hosny M.; Omar, Nagwa M.

    2015-06-01

    Generating virtual actors is one of the most challenging fields in computer graphics. The reconstruction of realistic virtual actors has received attention from both academic researchers and the film industry, and many films have featured human-like virtual actors that audiences cannot distinguish from real ones. The synthesis of a realistic virtual actor is a complex process, and most existing techniques require expensive hardware. In this paper, a low-cost system that generates near-realistic virtual actors is presented. The facial features of the real actor are blended with a virtual head that is attached to the actor's body. Compared with other techniques for generating virtual actors, the proposed system is low-cost, requiring only a single camera to record the scene and no expensive hardware. The results show that the system generates convincing near-realistic virtual actors that can be used in many applications.

  2. A framework to establish credibility of computational models in biology.

    Science.gov (United States)

    Patterson, Eann A; Whelan, Maurice P

    2017-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed and the creation of 'digital twins' proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Mathematics Instructional Model Based on Realistic Mathematics Education to Promote Problem Solving Ability at Junior High School Padang

    Directory of Open Access Journals (Sweden)

    Edwin Musdi

    2016-02-01

    Full Text Available This research aims to develop a mathematics instructional model based on realistic mathematics education (RME) to promote students' problem-solving abilities. The design research used the Plomp model, which consists of a preliminary phase, a development or prototyping phase, and an assessment phase. In this study, only the first two phases were conducted. The first phase, a preliminary investigation, was carried out through a literature study examining the theory underlying RME-based instruction, the characteristics of learners, descriptions of learning management by junior high school mathematics teachers, and relevant research. The development phase produced a draft model (an early prototype) consisting of the syntax, the social system, the principle of reaction, support systems, and the instructional impact and effects. The early prototype comprised the draft model, lesson plans, worksheets, and assessments. Tessmer's formative evaluation model was used to revise the model; in this study, only the one-to-one evaluation stage was conducted. The preliminary phase produced a theory-based RME learning model, a description of the characteristics of grade VIII learners at Junior High School Padang, and a description of classroom teaching practice. The results showed that most students were still unable to solve non-routine problems, and that teachers did not optimally help students develop problem-solving skills. It was recommended that the model be applied in the classroom.

  4. Trait-Dependent Biogeography: (Re)Integrating Biology into Probabilistic Historical Biogeographical Models.

    Science.gov (United States)

    Sukumaran, Jeet; Knowles, L Lacey

    2018-04-20

    The development of process-based probabilistic models for historical biogeography has transformed the field by grounding it in modern statistical hypothesis testing. However, most of these models abstract away biological differences, reducing species to interchangeable lineages. We present here the case for reintegration of biology into probabilistic historical biogeographical models, allowing a broader range of questions about biogeographical processes beyond ancestral range estimation or simple correlation between a trait and a distribution pattern, as well as allowing us to assess how inferences about ancestral ranges themselves might be impacted by differential biological traits. We show how new approaches to inference might cope with the computational challenges resulting from the increased complexity of these trait-based historical biogeographical models. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Convective aggregation in realistic convective-scale simulations

    Science.gov (United States)

    Holloway, Christopher E.

    2017-06-01

    To investigate the real-world relevance of idealized-model convective self-aggregation, five 15 day cases of real organized convection in the tropics are simulated. These include multiple simulations of each case to test sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. These simulations are compared to self-aggregation seen in the same model configured to run in idealized radiative-convective equilibrium. Analysis of the budget of the spatial variance of column-integrated frozen moist static energy shows that control runs have significant positive contributions to organization from radiation and negative contributions from surface fluxes and transport, similar to idealized runs once they become aggregated. Despite identical lateral boundary conditions for all experiments in each case, systematic differences in mean column water vapor (CWV), CWV distribution shape, and CWV autocorrelation length scale are found between the different sensitivity runs, particularly for those without interactive radiation, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations (although the organization of precipitation shows less sensitivity to interactive radiation). The magnitudes and signs of these systematic differences are consistent with a rough equilibrium between (1) equalization due to advection from the lateral boundaries and (2) disaggregation due to the absence of interactive radiation, implying disaggregation rates comparable to those in idealized runs with aggregated initial conditions and noninteractive radiation. This points to a plausible similarity in the way that radiation feedbacks maintain aggregated convection in both idealized simulations and the real world. Plain Language Summary: Understanding the processes that lead to the organization of tropical rainstorms is an important challenge for weather

  6. Enterococcus infection biology: lessons from invertebrate host models.

    Science.gov (United States)

    Yuen, Grace J; Ausubel, Frederick M

    2014-03-01

    The enterococci are commensals of the gastrointestinal tract of many metazoans, from insects to humans. While they normally do not cause disease in the intestine, they can become pathogenic when they infect sites outside of the gut. Recently, the enterococci have become important nosocomial pathogens, with the majority of human enterococcal infections caused by two species, Enterococcus faecalis and Enterococcus faecium. Studies using invertebrate infection models have revealed insights into the biology of enterococcal infections, as well as general principles underlying host innate immune defense. This review highlights recent findings on Enterococcus infection biology from two invertebrate infection models, the greater wax moth Galleria mellonella and the free-living bacteriovorous nematode Caenorhabditis elegans.

  7. Progress in realistic LOCA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M Y; Bajorek, S M; Ohkawa, K [Westinghouse Electric Corporation, Pittsburgh, PA (United States)

    1994-12-31

    While LOCA is a complex transient to simulate, the state of the art in thermal hydraulics has advanced sufficiently to allow its realistic prediction and the application of advanced methods to actual reactor design, as demonstrated by the methodology described in this paper. 6 refs, 5 figs, 3 tabs

  8. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

  9. Topic modeling for cluster analysis of large biological and medical datasets.

    Science.gov (United States)

    Zhao, Weizhong; Zou, Wen; Chen, James J

    2014-01-01

    The big data moniker is nowhere better deserved than to describe the ever-increasing prodigiousness and complexity of biological and medical datasets. New methods are needed to generate and test hypotheses, foster biological interpretation, and build validated predictors. Although multivariate techniques such as cluster analysis may allow researchers to identify groups, or clusters, of related variables, the accuracies and effectiveness of traditional clustering methods diminish for large and hyperdimensional datasets. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. Its ability to reduce high dimensionality to a small number of latent variables makes it suitable as a means for clustering or overcoming clustering difficulties in large biological and medical datasets. In this study, three topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, are proposed and tested on the cluster analysis of three large datasets: a Salmonella pulsed-field gel electrophoresis (PFGE) dataset, a lung cancer dataset, and a breast cancer dataset, which represent various types of large biological or medical datasets. All three methods are shown to improve the effectiveness of clustering results on the three datasets in comparison to traditional methods. A preferable cluster analysis method emerged for each of the three datasets on the basis of replicating known biological truths. Topic modeling could be advantageously applied to the large datasets of biological or medical research. The three proposed topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, yield clustering improvements for the three different data types. Clusters more accurately represent true groupings and subgroupings in the data than traditional methods, suggesting
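
    The simplest of the three proposed methods, highest probable topic assignment, reduces to an argmax over the fitted document-topic distribution. A minimal sketch with made-up probabilities standing in for a fitted topic model:

```python
import numpy as np

# Rows: samples (e.g. PFGE profiles); columns: latent topics.
# These probabilities are illustrative, not from the study's datasets.
doc_topic = np.array([[0.7, 0.2, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.2, 0.1, 0.7],
                      [0.6, 0.3, 0.1]])

# Highest probable topic assignment: cluster = most probable topic.
clusters = doc_topic.argmax(axis=1)
print(clusters.tolist())  # [0, 1, 2, 0]
```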

  10. A graphical method for reducing and relating models in systems biology.

    Science.gov (United States)

    Gay, Steven; Soliman, Sylvain; Fages, François

    2010-09-15

    In Systems Biology, an increasing collection of models of various biological processes is currently developed and made available in publicly accessible repositories, such as biomodels.net for instance, through common exchange formats such as SBML. To date, however, there is no general method to relate different models to each other by abstraction or reduction relationships, and this task is left to the modeler for re-using and coupling models. In mathematical biology, model reduction techniques have been studied for a long time, mainly in the case where a model exhibits different time scales, or different spatial phases, which can be analyzed separately. These techniques are however far too restrictive to be applied on a large scale in systems biology, and do not take into account abstractions other than time or phase decompositions. Our purpose here is to propose a general computational method for relating models together, by considering primarily the structure of the interactions and abstracting from their dynamics in a first step. We present a graph-theoretic formalism with node merge and delete operations, in which model reductions can be studied as graph matching problems. From this setting, we derive an algorithm for deciding whether there exists a reduction from one model to another, and evaluate it on the computation of the reduction relations between all SBML models of the biomodels.net repository. In particular, in the case of the numerous models of MAPK signalling, and of the circadian clock, biologically meaningful mappings between models of each class are automatically inferred from the structure of the interactions. We conclude on the generality of our graphical method, on its limits with respect to the representation of the structure of the interactions in SBML, and on some perspectives for dealing with the dynamics. The algorithms described in this article are implemented in the open-source software modeling platform BIOCHAM available at http
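
    The node-merge operation at the heart of the formalism can be sketched directly: two nodes of an interaction graph are fused into one, edges are redirected to the merged node, and edges internal to the fused pair collapse. A toy illustration on a fragment of a MAPK-like cascade (names invented; this is not the BIOCHAM implementation):

```python
def merge(graph, u, v, new):
    """Merge nodes u and v of a successor-set graph into node `new`."""
    ren = lambda n: new if n in (u, v) else n
    merged = {}
    for node, succs in graph.items():
        # Redirect edges to renamed nodes; drop self-loops created by
        # collapsing edges that were internal to the merged pair.
        merged.setdefault(ren(node), set()).update(
            t for t in (ren(s) for s in succs) if t != ren(node))
    return merged

mapk = {"MEK": {"ERK_u"}, "ERK_u": {"ERK_p"}, "ERK_p": set()}
print(merge(mapk, "ERK_u", "ERK_p", "ERK"))
```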

  11. Present Day Biology seen in the Looking Glass of Physics of Complexity

    Science.gov (United States)

    Schuster, P.

    Darwin's theory of variation and selection in its simplest form is directly applicable to RNA evolution in vitro as well as to virus evolution, and it allows for quantitative predictions. Understanding evolution at the molecular level is ultimately related to the central paradigm of structural biology: sequence ⇒ structure ⇒ function. We elaborate on the state of the art in modeling and understanding evolution of RNA driven by reproduction and mutation. The focus is on the landscape concept—originally introduced by Sewall Wright—and its application to problems in biology. The relation between genotypes and phenotypes is the result of two consecutive mappings from a space of genotypes called sequence space onto a space of phenotypes or structures, and fitness is the result of a mapping from phenotype space into non-negative real numbers. Realistic landscapes as derived from folding of RNA sequences into structures are characterized by two properties: (i) they are rugged in the sense that sequences lying nearby in sequence space may have very different fitness values and (ii) they are characterized by an appreciable degree of neutrality implying that a certain fraction of genotypes and/or phenotypes cannot be distinguished in the selection process. Evolutionary dynamics on realistic landscapes will be studied as a function of the mutation rate, and the role of neutrality in the selection process will be discussed.

  12. The quest for a new modelling framework in mathematical biology. Comment on "On the interplay between mathematics and biology: Hallmarks towards a new systems biology" by N. Bellomo et al.

    Science.gov (United States)

    Eftimie, Raluca

    2015-03-01

    One of the main unsolved problems of modern physics is finding a "theory of everything" - a theory that can explain, with the help of mathematics, all physical aspects of the universe. While the laws of physics could explain some aspects of the biology of living systems (e.g., the phenomenological interpretation of movement of cells and animals), there are other aspects specific to biology that cannot be captured by physics models. For example, it is generally accepted that the evolution of a cell-based system is influenced by the activation state of cells (e.g., only activated and functional immune cells can fight diseases); on the other hand, the evolution of an animal-based system can be influenced by the psychological state (e.g., distress) of animals. Therefore, the last 10-20 years have seen also a quest for a "theory of everything"-approach extended to biology, with researchers trying to propose mathematical modelling frameworks that can explain various biological phenomena ranging from ecology to developmental biology and medicine [1,2,6]. The basic idea behind this approach can be found in a few reviews on ecology and cell biology [6,7,9-11], where researchers suggested that due to the parallel between the micro-scale dynamics and the emerging macro-scale phenomena in both cell biology and in ecology, many mathematical methods used for ecological processes could be adapted to cancer modelling [7,9] or to modelling in immunology [11]. However, this approach generally involved the use of different models to describe different biological aspects (e.g., models for cell and animal movement, models for competition between cells or animals, etc.).

  13. A Data-Driven Approach to Realistic Shape Morphing

    KAUST Repository

    Gao, Lin; Lai, Yu-Kun; Huang, Qi-Xing; Hu, Shi-Min

    2013-01-01

    Morphing between 3D objects is a fundamental technique in computer graphics. Traditional methods of shape morphing focus on establishing meaningful correspondences and finding smooth interpolation between shapes. Such methods, however, take only geometric information as input and thus cannot in general avoid producing unnatural interpolation, in particular for large-scale deformations. This paper proposes a novel data-driven approach for shape morphing. Given a database with various models belonging to the same category, we treat them as data samples in the plausible deformation space. These models are then clustered to form local shape spaces of plausible deformations. We use a simple metric to reasonably represent the closeness between pairs of models. Given source and target models, the morphing problem is cast as a global optimization problem of finding a minimal distance path within the local shape spaces connecting these models. Under the guidance of intermediate models in the path, an extended as-rigid-as-possible interpolation is used to produce the final morphing. By exploiting the knowledge of plausible models, our approach produces realistic morphing for challenging cases as demonstrated by various examples in the paper. © 2013 The Eurographics Association and Blackwell Publishing Ltd.
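
    The search step described above, finding a minimal-distance path through the clustered shape spaces, is a standard shortest-path problem. A sketch with a toy model graph and invented distances (the metric in the paper is a shape-space closeness measure, not these numbers):

```python
import heapq

def shortest_path(edges, src, dst):
    # Dijkstra over a dict: node -> list of (neighbor, distance).
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in edges.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # Reconstruct the path from dst back to src.
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Database models as nodes; edge weights = closeness between models.
edges = {"source": [("A", 1.0), ("B", 2.5)],
         "A": [("target", 2.0)],
         "B": [("target", 1.0)]}
print(shortest_path(edges, "source", "target"))  # ['source', 'A', 'target']
```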

  15. Scaling up complex interventions: insights from a realist synthesis.

    Science.gov (United States)

    Willis, Cameron D; Riley, Barbara L; Stockton, Lisa; Abramowicz, Aneta; Zummach, Dana; Wong, Geoff; Robinson, Kerry L; Best, Allan

    2016-12-19

    Preventing chronic diseases, such as cancer, cardiovascular disease and diabetes, requires complex interventions, involving multi-component and multi-level efforts that are tailored to the contexts in which they are delivered. Despite an increasing number of complex interventions in public health, many fail to be 'scaled up'. This study aimed to increase understanding of how and under what conditions complex public health interventions may be scaled up to benefit more people and populations. A realist synthesis was conducted and discussed at an in-person workshop involving practitioners responsible for scaling up activities. Realist approaches view causality through the linkages between changes in contexts (C) that activate mechanisms (M), leading to specific outcomes (O) (CMO configurations). To focus this review, three cases of complex interventions that had been successfully scaled up were included: Vibrant Communities, Youth Build USA and Pathways to Education. A search strategy of published and grey literature related to each case was developed, involving searches of relevant databases and nominations from experts. Data extracted from included documents were classified according to CMO configurations within strategic themes. Findings were compared and contrasted with guidance from diffusion theory, and interpreted with knowledge users to identify practical implications and potential directions for future research. Four core mechanisms were identified, namely awareness, commitment, confidence and trust. These mechanisms were activated within two broad scaling up strategies, those of renewing and regenerating, and documenting success. Within each strategy, specific actions to change contexts included building partnerships, conducting evaluations, engaging political support and adapting funding models. These modified contexts triggered the identified mechanisms, leading to a range of scaling up outcomes, such as commitment of new communities, changes in relevant

  16. Computer modeling in developmental biology: growing today, essential tomorrow.

    Science.gov (United States)

    Sharpe, James

    2017-12-01

    D'Arcy Thompson was a true pioneer, applying mathematical concepts and analyses to the question of morphogenesis over 100 years ago. The centenary of his famous book, On Growth and Form, is therefore a great occasion on which to review the types of computer modeling now being pursued to understand the development of organs and organisms. Here, I present some of the latest modeling projects in the field, covering a wide range of developmental biology concepts, from molecular patterning to tissue morphogenesis. Rather than classifying them according to scientific question, or scale of problem, I focus instead on the different ways that modeling contributes to the scientific process and discuss the likely future of modeling in developmental biology. © 2017. Published by The Company of Biologists Ltd.

  17. Modeling Dispersion of Chemical-Biological Agents in Three Dimensional Living Space

    International Nuclear Information System (INIS)

    William S. Winters

    2002-01-01

    This report documents a series of calculations designed to demonstrate Sandia's capability in modeling the dispersal of chemical and biological agents in complex three-dimensional spaces. The transport of particles representing biological agents is modeled in a single room and in several connected rooms. The influence of particle size, particle weight and injection method are studied.

  18. Time management: a realistic approach.

    Science.gov (United States)

    Jackson, Valerie P

    2009-06-01

    Realistic time management and organization plans can improve productivity and the quality of life. However, these skills can be difficult to develop and maintain. The key elements of time management are goals, organization, delegation, and relaxation. The author addresses each of these components and provides suggestions for successful time management.

  19. What works for whom in pharmacist-led smoking cessation support: realist review.

    Science.gov (United States)

    Greenhalgh, Trisha; Macfarlane, Fraser; Steed, Liz; Walton, Robert

    2016-12-16

    New models of primary care are needed to address funding and staffing pressures. We addressed the research question "what works for whom in what circumstances in relation to the role of community pharmacies in providing lifestyle interventions to support smoking cessation?" This is a realist review conducted according to RAMESES standards. We began with a sample of 103 papers included in a quantitative review of community pharmacy intervention trials identified through systematic searching of seven databases. We supplemented this with additional papers: studies that had been excluded from the quantitative review but which provided rigorous and relevant additional data for realist theorising; citation chaining (pursuing reference lists and Google Scholar forward tracking of key papers); the 'search similar citations' function on PubMed. After mapping what research questions had been addressed by these studies and how, we undertook a realist analysis to identify and refine candidate theories about context-mechanism-outcome configurations. Our final sample consisted of 66 papers describing 74 studies (12 systematic reviews, 6 narrative reviews, 18 RCTs, 1 process detail of a RCT, 1 cost-effectiveness study, 12 evaluations of training, 10 surveys, 8 qualitative studies, 2 case studies, 2 business models, 1 development of complex intervention). Most studies had been undertaken in the field of pharmacy practice (pharmacists studying what pharmacists do) and demonstrated the success of pharmacist training in improving confidence, knowledge and (in many but not all studies) patient outcomes. Whilst a few empirical studies had applied psychological theories to account for behaviour change in pharmacists or people attempting to quit, we found no studies that had either developed or tested specific theoretical models to explore how pharmacists' behaviour may be affected by organisational context. 
Because of the nature of the empirical data, only a provisional realist analysis

  20. Preservice Biology Teachers' Conceptions About the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-02-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n tentative = 11; n certain = 5) and 18 conceptions for models (n tentative = 10; n certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.

  1. How realistic are air quality hindcasts driven by forcings from climate model simulations?

    Science.gov (United States)

    Lacressonnière, G.; Peuch, V.-H.; Arteta, J.; Josse, B.; Joly, M.; Marécal, V.; Saint Martin, D.; Déqué, M.; Watson, L.

    2012-12-01

    Predicting how European air quality could evolve over the next decades in the context of changing climate requires the use of climate models to produce results that can be averaged in a climatologically and statistically sound manner. This is a very different approach from the one that is generally used for air quality hindcasts for the present period; analysed meteorological fields are used to represent specifically each date and hour. Differences arise both from the fact that a climate model run results in a pure model output, with no influence from observations (which are useful to correct for a range of errors), and that in a "climate" set-up, simulations on a given day, month or even season cannot be related to any specific period of time (but can just be interpreted in a climatological sense). Hence, although an air quality model can be thoroughly validated in a "realistic" set-up using analysed meteorological fields, the question remains of how far its outputs can be interpreted in a "climate" set-up. For this purpose, we focus on Europe and on the current decade using three 5-yr simulations performed with the multiscale chemistry-transport model MOCAGE and use meteorological forcings either from operational meteorological analyses or from climate simulations. We investigate how statistical skill indicators compare in the different simulations, discriminating also the effects of meteorology on atmospheric fields (winds, temperature, humidity, pressure, etc.) and on the dependent emissions and deposition processes (volatile organic compound emissions, deposition velocities, etc.). Our results show in particular how differing boundary layer heights and deposition velocities affect horizontal and vertical distributions of species. When the model is driven by operational analyses, the simulation accurately reproduces the observed values of O3, NOx, SO2 and, with some bias that can be explained by the set-up, PM10. 
We study how the simulations driven by climate
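The statistical skill indicators used to compare such simulations against observations are typically the mean bias, root-mean-square error and Pearson correlation between modelled and observed concentrations. A minimal sketch (the function name and structure are illustrative, not taken from the paper):

```python
import math

def skill_scores(model, obs):
    """Mean bias, RMSE and Pearson correlation of paired model/observation values."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mean_m, mean_o = sum(model) / n, sum(obs) / n
    cov = sum((m - mean_m) * (o - mean_o) for m, o in zip(model, obs)) / n
    sd_m = math.sqrt(sum((m - mean_m) ** 2 for m in model) / n)
    sd_o = math.sqrt(sum((o - mean_o) ** 2 for o in obs) / n)
    corr = cov / (sd_m * sd_o) if sd_m > 0 and sd_o > 0 else float("nan")
    return {"bias": bias, "rmse": rmse, "corr": corr}
```

In a "climate" set-up these scores are only meaningful when aggregated climatologically (e.g., over seasonal distributions), not matched date-by-date as in a hindcast driven by analyses.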

  2. Evaluation of biological models using Spacelab

    Science.gov (United States)

    Tollinger, D.; Williams, B. A.

    1980-01-01

    Biological models of hypogravity effects are described, including the cardiovascular-fluid shift, musculoskeletal, embryological and space sickness models. These models predict such effects as loss of extracellular fluid and electrolytes, decrease in red blood cell mass, and the loss of muscle and bone mass in weight-bearing portions of the body. Experimentation in Spacelab by the use of implanted electromagnetic flow probes, by fertilizing frog eggs in hypogravity and fixing the eggs at various stages of early development and by assessing the role of the vestibulo-ocular reflex arc in space sickness is suggested. It is concluded that the use of small animals eliminates the uncertainties caused by corrective or preventive measures employed with human subjects.

  3. A realistic intersecting D6-brane model after the first LHC run

    Science.gov (United States)

    Li, Tianjun; Nanopoulos, D. V.; Raza, Shabbar; Wang, Xiao-Chuan

    2014-08-01

    With the Higgs boson mass around 125 GeV and the LHC supersymmetry search constraints, we revisit a three-family Pati-Salam model from intersecting D6-branes in Type IIA string theory on the T6/(ℤ2 × ℤ2) orientifold which has a realistic phenomenology. We systematically scan the parameter space for μ < 0 and μ > 0, and find that the gravitino mass is generically heavier than about 2 TeV for both cases due to the Higgs mass lower bound of 123 GeV. In particular, we identify a region of parameter space with the electroweak fine-tuning as small as ΔEW ~ 24-32 (3-4%). In the viable parameter space which is consistent with all the current constraints, the mass ranges for the gluino, the first two-generation squarks and sleptons are respectively [3, 18] TeV, [3, 16] TeV, and [2, 7] TeV. For the third-generation sfermions, the light stop satisfying the 5σ WMAP bounds via neutralino-stop coannihilation has mass from 0.5 to 1.2 TeV, and the light stau can be as light as 800 GeV. We also show various coannihilation and resonance scenarios through which the observed dark matter relic density is achieved. Interestingly, certain portions of the parameter space have excellent t-b-τ and b-τ Yukawa coupling unification. Three regions of parameter space are highlighted as well where the dominant component of the lightest neutralino is a bino, wino or higgsino. We discuss various scenarios in which such solutions may evade recent astrophysical bounds if they saturate or exceed the observed relic density bounds. Prospects of finding a higgsino-like neutralino in direct and indirect searches are also studied. And we display six tables of benchmark points depicting various interesting features of our model. Note that the lightest neutralino can be as heavy as 2.8 TeV, and there exists a natural region of parameter space from the low-energy fine-tuning definition with heavy gluino and first two-generation squarks/sleptons; we point out that the 33 TeV and 100 TeV proton-proton colliders are indeed

  4. Modelling the performance of interferometric gravitational-wave detectors with realistically imperfect optics

    Science.gov (United States)

    Bochner, Brett

    1998-12-01

    The LIGO project is part of a world-wide effort to detect the influx of Gravitational Waves upon the earth from astrophysical sources, via their interaction with laser beams in interferometric detectors that are designed for extraordinarily high sensitivity. Central to the successful performance of LIGO detectors is the quality of their optical components, and the efficient optimization of interferometer configuration parameters. To predict LIGO performance with optics possessing realistic imperfections, we have developed a numerical simulation program to compute the steady-state electric fields of a complete, coupled-cavity LIGO interferometer. The program can model a wide variety of deformations, including laser beam mismatch and/or misalignment, finite mirror size, mirror tilts, curvature distortions, mirror surface roughness, and substrate inhomogeneities. Important interferometer parameters are automatically optimized during program execution to achieve the best possible sensitivity for each new set of perturbed mirrors. This thesis includes investigations of two interferometer designs: the initial LIGO system, and an advanced LIGO configuration called Dual Recycling. For Initial-LIGO simulations, the program models carrier and sideband frequency beams to compute the explicit shot-noise-limited gravitational wave sensitivity of the interferometer. It is demonstrated that optics of exceptional quality (root-mean-square deformations of less than ~1 nm in the central mirror regions) are necessary to meet Initial-LIGO performance requirements, but that they can be feasibly met. It is also shown that improvements in mirror quality can substantially increase LIGO's sensitivity to selected astrophysical sources. For Dual Recycling, the program models gravitational-wave-induced sidebands over a range of frequencies to demonstrate that the tuned and narrow-banded signal responses predicted for this configuration can be achieved with imperfect optics. Dual Recycling

  5. Development of realistic high-resolution whole-body voxel models of Japanese adult males and females of average height and weight, and application of models to radio-frequency electromagnetic-field dosimetry

    International Nuclear Information System (INIS)

    Nagaoka, Tomoaki; Watanabe, Soichi; Sakurai, Kiyoko; Kunieda, Etsuo; Watanabe, Satoshi; Taki, Masao; Yamanaka, Yukio

    2004-01-01

    With advances in computer performance, the use of high-resolution voxel models of the entire human body has become more frequent in numerical dosimetries of electromagnetic waves. Using magnetic resonance imaging, we have developed realistic high-resolution whole-body voxel models for Japanese adult males and females of average height and weight. The developed models consist of cubic voxels of 2 mm on each side; the models are segmented into 51 anatomic regions. The adult female model is the first of its kind in the world and both are the first Asian voxel models (representing average Japanese) that enable numerical evaluation of electromagnetic dosimetry at high frequencies of up to 3 GHz. In this paper, we will also describe the basic SAR characteristics of the developed models for the VHF/UHF bands, calculated using the finite-difference time-domain method

  6. A linear evolution for non-linear dynamics and correlations in realistic nuclei

    International Nuclear Information System (INIS)

    Levin, E.; Lublinsky, M.

    2004-01-01

    A new approach to high energy evolution based on a linear equation for QCD generating functional is developed. This approach opens a possibility for systematic study of correlations inside targets, and, in particular, inside realistic nuclei. Our results are presented as three new equations. The first one is a linear equation for QCD generating functional (and for scattering amplitude) that sums the 'fan' diagrams. For the amplitude this equation is equivalent to the non-linear Balitsky-Kovchegov equation. The second equation is a generalization of the Balitsky-Kovchegov non-linear equation to interactions with realistic nuclei. It includes a new correlation parameter which incorporates, in a model-dependent way, correlations inside the nuclei. The third equation is a non-linear equation for QCD generating functional (and for scattering amplitude) that in addition to the 'fan' diagrams sums the Glauber-Mueller multiple rescatterings

  7. Population of 224 realistic human subject-based computational breast phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, David W. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Wells, Jered R., E-mail: jered.wells@duke.edu [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Sturgeon, Gregory M. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 (United States); Samei, Ehsan [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Dobbins, James T. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Segars, W. Paul [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Electrical and Computer Engineering and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-01-15

    Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230+ dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range
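The segmentation pipeline above uses fuzzy C-means at two stages. Its core alternating updates (memberships, then centres) are easy to sketch; the following 1D toy version uses a deterministic initialisation and fixed iteration count of our choosing, and is an illustration of the algorithm rather than the paper's 3D implementation:

```python
def fuzzy_cmeans_1d(values, n_clusters=2, m=2.0, n_iter=50):
    """Minimal 1D fuzzy C-means: alternate membership and centre updates."""
    lo, hi = min(values), max(values)
    # Deterministic initialisation: centres spread evenly over the data range
    centers = [lo + (hi - lo) * (i + 0.5) / n_clusters for i in range(n_clusters)]
    power = 2.0 / (m - 1.0)
    for _ in range(n_iter):
        # Membership of each value in each cluster (each row sums to 1)
        u = []
        for x in values:
            d = [abs(x - c) or 1e-12 for c in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** power for j in range(n_clusters))
                      for i in range(n_clusters)])
        # Centres become membership-weighted means
        for i in range(n_clusters):
            w = [row[i] ** m for row in u]
            centers[i] = sum(wi * x for wi, x in zip(w, values)) / sum(w)
    return centers, u
```

On two well-separated groups of intensities the centres converge to the group means within a few iterations; the fuzziness exponent m controls how soft the cluster boundaries are.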

  8. Realistic nuclear shell theory and the doubly-magic 132Sn region

    International Nuclear Information System (INIS)

    Vary, J.P.

    1978-01-01

    After an introduction discussing the motivation and interest in results obtained with isotope separators, the fundamental problem in realistic nuclear shell theory is posed in the context of renormalization theory. Then some of the important developments that have occurred over the last fifteen years in the derivation of the effective Hamiltonian and application of realistic nuclear shell theory are briefly reviewed. Doubly magic regions of the periodic table and the unique advantages of the 132Sn region are described. Then results are shown for the ground-state properties of 132Sn as calculated from the density-dependent Hartree-Fock approach with the Skyrme Hamiltonian. A single theoretical Hamiltonian for all nuclei from doubly magic 132Sn to doubly magic 208Pb is presented; single-particle energies are graphed. Finally, predictions of shell-model level-density distributions obtained with spectral distribution methods are discussed; calculated level densities are shown for 136Xe. 10 figures

  9. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin Bhattacharya

    2012-12-01

    We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of ‘toxicity pathways’ is described in the context of the 2007 US National Academies of Science report, Toxicity Testing in the 21st Century: A Vision and A Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  10. Multi-agent systems in epidemiology: a first step for computational biology in the study of vector-borne disease transmission

    Directory of Open Access Journals (Sweden)

    Guégan Jean-François

    2008-10-01

    Background: Computational biology is often associated with genetic or genomic studies only. However, thanks to the increase of computational resources, computational models are appreciated as useful tools in many other scientific fields. Such modeling systems are particularly relevant for the study of complex systems, like the epidemiology of emerging infectious diseases. So far, mathematical models remain the main tool for the epidemiological and ecological analysis of infectious diseases, with SIR models seen as an implicit standard in epidemiology. Unfortunately, these models are based on differential equations and, therefore, can very rapidly become unmanageable due to the many parameters which need to be taken into consideration. For instance, in the case of zoonotic and vector-borne diseases in wildlife, many different potential host species could be involved in the life-cycle of disease transmission, and SIR models might not be the most suitable tool to truly capture the overall disease circulation within that environment. This limitation underlines the necessity to develop a standard spatial model that can cope with the transmission of disease in realistic ecosystems. Results: Computational biology may prove to be flexible enough to take into account the natural complexity observed in both natural and man-made ecosystems. In this paper, we propose a new computational model to study the transmission of infectious diseases in a spatially explicit context. We developed a multi-agent system model for vector-borne disease transmission in a realistic spatial environment. Conclusion: Here we describe in detail the general behavior of this model, which we hope will become a standard reference for the study of vector-borne disease transmission in wildlife. To conclude, we show how this simple model could be easily adapted and modified to be used as a common framework for further research developments in this field.
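The SIR model that the authors contrast with their multi-agent approach is a small system of ordinary differential equations. A minimal forward-Euler sketch (parameter values are illustrative, not from the paper):

```python
def simulate_sir(beta, gamma, s0, i0, r0=0.0, dt=0.1, steps=1000):
    """Forward-Euler integration of the classic SIR equations
    dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I,
    with S, I, R expressed as population fractions."""
    s, i, r = s0, i0, r0
    history = [(s, i, r)]
    for _ in range(steps):
        new_infections = beta * s * i * dt
        recoveries = gamma * i * dt
        s, i, r = s - new_infections, i + new_infections - recoveries, r + recoveries
        history.append((s, i, r))
    return history

# Illustrative parameters: basic reproduction number R0 = beta/gamma = 5
h = simulate_sir(beta=0.5, gamma=0.1, s0=0.99, i0=0.01)
```

The point made in the abstract is visible even here: every additional host or vector species multiplies the number of compartments and rate parameters, which is what motivates the spatially explicit multi-agent formulation.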

  11. A Biologically Realistic Cortical Model of Eye Movement Control in Reading

    Science.gov (United States)

    Heinzle, Jakob; Hepp, Klaus; Martin, Kevan A. C.

    2010-01-01

    Reading is a highly complex task involving a precise integration of vision, attention, saccadic eye movements, and high-level language processing. Although there is a long history of psychological research in reading, it is only recently that imaging studies have identified some neural correlates of reading. Thus, the underlying neural mechanisms…

  12. When one model is not enough: Combining epistemic tools in systems biology

    DEFF Research Database (Denmark)

    Green, Sara

    2013-01-01

    The conceptual repertoire of Rheinberger’s historical epistemology offers important insights for an analysis of the modelling practice. I illustrate this with a case study on network modeling in systems biology where engineering approaches are applied to the study of biological systems. I shall argue...

  13. Not just a theory--the utility of mathematical models in evolutionary biology.

    Directory of Open Access Journals (Sweden)

    Maria R Servedio

    2014-12-01

    Progress in science often begins with verbal hypotheses meant to explain why certain biological phenomena exist. An important purpose of mathematical models in evolutionary research, as in many other fields, is to act as “proof-of-concept” tests of the logic in verbal explanations, paralleling the way in which empirical data are used to test hypotheses. Because not all subfields of biology use mathematics for this purpose, misunderstandings of the function of proof-of-concept modeling are common. In the hope of facilitating communication, we discuss the role of proof-of-concept modeling in evolutionary biology.

  14. Consistent robustness analysis (CRA) identifies biologically relevant properties of regulatory network models.

    Science.gov (United States)

    Saithong, Treenut; Painter, Kevin J; Millar, Andrew J

    2010-12-16

    A number of studies have previously demonstrated that "goodness of fit" is insufficient in reliably classifying the credibility of a biological model. Robustness and/or sensitivity analysis is commonly employed as a secondary method for evaluating the suitability of a particular model. The results of such analyses invariably depend on the particular parameter set tested, yet many parameter values for biological models are uncertain. Here, we propose a novel robustness analysis that aims to determine the "common robustness" of the model with multiple, biologically plausible parameter sets, rather than the local robustness for a particular parameter set. Our method is applied to two published models of the Arabidopsis circadian clock (the one-loop [1] and two-loop [2] models). The results reinforce current findings suggesting the greater reliability of the two-loop model and pinpoint the crucial role of TOC1 in the circadian network. Consistent Robustness Analysis can indicate both the relative plausibility of different models and also the critical components and processes controlling each model.
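The idea of "common robustness" — averaging a perturbation-based robustness score over many biologically plausible parameter sets rather than evaluating it at a single set — can be sketched generically. This is an illustration of the concept, not the authors' exact metric:

```python
import random

def common_robustness(model, param_sets, rel_perturb=0.05, trials=20, seed=1):
    """Average relative output change under small random parameter perturbations,
    averaged over many plausible parameter sets (lower = more robust overall)."""
    rng = random.Random(seed)
    per_set_scores = []
    for params in param_sets:
        base = model(params)
        changes = []
        for _ in range(trials):
            perturbed = [p * (1.0 + rng.uniform(-rel_perturb, rel_perturb))
                         for p in params]
            changes.append(abs(model(perturbed) - base) / (abs(base) or 1.0))
        per_set_scores.append(sum(changes) / trials)
    return sum(per_set_scores) / len(per_set_scores)
```

Here `model` would map a parameter vector to a scalar output of interest (e.g., free-running clock period); a model scoring low across many plausible parameter sets is "commonly robust" in the sense described above.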

  15. Generalized Beer-Lambert model for near-infrared light propagation in thick biological tissues

    Science.gov (United States)

    Bhatt, Manish; Ayyalasomayajula, Kalyan R.; Yalavarthy, Phaneendra K.

    2016-07-01

    The attenuation of near-infrared (NIR) light intensity as it propagates in a turbid medium like biological tissue is described by the modified Beer-Lambert law (MBLL). The MBLL is generally used to quantify the changes in tissue chromophore concentrations for NIR spectroscopic data analysis. Even though the MBLL is effective in terms of providing qualitative comparison, it suffers from limited applicability across tissue types and tissue dimensions. In this work, we introduce Lambert-W function-based modeling for light propagation in biological tissues, which is a generalized version of the Beer-Lambert model. The proposed modeling provides parametrization of tissue properties, which includes two attenuation coefficients μ0 and η. We validated our model against the Monte Carlo simulation, which is the gold standard for modeling NIR light propagation in biological tissue. We included numerous human and animal tissues to validate the proposed empirical model, including an inhomogeneous adult human head model. The proposed model, which has a closed (analytical) form, is the first of its kind to provide accurate modeling of NIR light propagation in biological tissues.
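The MBLL that this paper generalises relates a change in detected intensity to a change in chromophore concentration through the differential pathlength factor (DPF). A minimal single-chromophore sketch (variable names are ours):

```python
import math

def mbll_delta_concentration(i_baseline, i_measured, epsilon, distance, dpf):
    """Modified Beer-Lambert law, single-chromophore form:
    delta_A = log10(I_baseline / I) = epsilon * delta_c * distance * DPF,
    so the concentration change is delta_A / (epsilon * distance * DPF).
    The unknown scattering offset G cancels when taking the difference."""
    delta_attenuation = math.log10(i_baseline / i_measured)
    return delta_attenuation / (epsilon * distance * dpf)
```

In practice this is applied per wavelength and the results combined to separate oxy- and deoxy-haemoglobin; the paper's Lambert-W formulation replaces the fixed-DPF assumption with its two attenuation parameters μ0 and η.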

  16. Biclustering with Flexible Plaid Models to Unravel Interactions between Biological Processes.

    Science.gov (United States)

    Henriques, Rui; Madeira, Sara C

    2015-01-01

    Genes can participate in multiple biological processes at a time and thus their expression can be seen as a composition of the contributions from the active processes. Biclustering under a plaid assumption allows the modeling of interactions between transcriptional modules or biclusters (subsets of genes with coherence across subsets of conditions) by assuming an additive composition of contributions in their overlapping areas. Despite the biological interest of plaid models, few biclustering algorithms consider plaid effects and, when they do, they place restrictions on the allowed types and structures of biclusters, and suffer from robustness problems by seizing exact additive matchings. We propose BiP (Biclustering using Plaid models), a biclustering algorithm with relaxations to allow expression levels to change in overlapping areas according to biologically meaningful assumptions (weighted and noise-tolerant composition of contributions). BiP can be used over existing biclustering solutions (seizing their benefits) as it is able to recover excluded areas due to unaccounted plaid effects and detect noisy areas non-explained by a plaid assumption, thus producing an explanatory model of overlapping transcriptional activity. Experiments on synthetic data support BiP's efficiency and effectiveness. The learned models from expression data unravel meaningful and non-trivial functional interactions between biological processes associated with putative regulatory modules.
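The additive plaid assumption — each matrix cell is the background plus the summed contributions of every bicluster covering it — can be illustrated with a small generator (a toy sketch, not part of BiP):

```python
def plaid_matrix(n_rows, n_cols, biclusters, background=0.0):
    """Expression matrix under an additive plaid assumption: every cell is the
    background plus the summed contributions of all biclusters covering it,
    so overlapping biclusters add up."""
    matrix = [[background] * n_cols for _ in range(n_rows)]
    for rows, cols, contribution in biclusters:
        for r in rows:
            for c in cols:
                matrix[r][c] += contribution
    return matrix

# Two overlapping biclusters: cell (1, 1) carries both contributions
mat = plaid_matrix(4, 4, [({0, 1}, {0, 1}, 2.0), ({1, 2}, {1, 2}, 3.0)])
```

BiP's relaxations amount to replacing this exact additive composition with weighted, noise-tolerant combinations so that overlapping areas need not match the sum exactly.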

  17. Realistic methods for calculating the releases and consequences of a large LOCA

    International Nuclear Information System (INIS)

    Stephenson, W.; Dutton, L.M.C.; Handy, B.J.; Smedley, C.

    1992-01-01

    This report describes a calculational route to predict realistic radiological consequences for a successfully terminated large loss-of-coolant accident (LOCA) at a pressurized-water reactor (PWR). All steps in the calculational route are considered. For each one, a brief comment is made on the significant differences between the methods of calculation that were identified in the benchmark studies, and recommendations are made for the methods and data for carrying out realistic calculations. These are based on the best supportable methods and data, and the technical basis for each recommendation is given. Where the lack of well-validated methods or data means that the most realistic method that can be justified is considered to be very conservative, the need for further research is identified. The behaviour of inorganic iodine and the removal of aerosols from the atmosphere of the reactor building are identified as areas of particular importance. Where the retention of radioactivity is sensitive to design features, these are identified and, for the most important features, the impact of different designs on the release of activity is indicated. The predictions of the proposed model are calculated for each stage and compared with the releases of activity predicted by the licensing methods that were used in the earlier benchmark studies. The conservative nature of the latter is confirmed. Methods and data are also presented for calculating the resulting doses to members of the public, drawing on models of the National Radiological Protection Board resulting from work carried out by several national bodies in the UK. Other, equally acceptable, models are used in other countries of the Community and some examples are given.

  18. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB.

    Science.gov (United States)

    Klingbeil, Guido; Erban, Radek; Giles, Mike; Maini, Philip K

    2011-04-15

    The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. The software is open source under the GPL v3 and available at http://www.maths.ox.ac.uk/cmb/STOCHSIMGPU. The web site also contains supplementary information. klingbeil@maths.ox.ac.uk Supplementary data are available at Bioinformatics online.
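
As a point of reference for the algorithms the tool parallelizes, a minimal serial version of Gillespie's direct method (the SSA variant named above) can be sketched as follows; the degradation example and rate constant are illustrative, not from the paper:

```python
import math
import random

def gillespie_direct(propensities, updates, x0, t_end, seed=0):
    """Gillespie's direct method (SSA) for a well-mixed reaction system.
    propensities(x) -> list of rates a_j(x); updates[j] -> state-change vector."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    traj = [(t, tuple(x))]
    while t < t_end:
        a = propensities(x)
        a0 = sum(a)
        if a0 == 0.0:                               # no reaction can fire any more
            break
        t += -math.log(1.0 - rng.random()) / a0     # exponential waiting time
        r, j, acc = rng.random() * a0, 0, a[0]
        while acc < r:                              # pick reaction j with prob a_j / a0
            j += 1
            acc += a[j]
        for k, dk in enumerate(updates[j]):
            x[k] += dk
        traj.append((t, tuple(x)))
    return traj

# Illustrative system: pure degradation A -> 0 with propensity 0.5 * A
traj = gillespie_direct(lambda x: [0.5 * x[0]], [(-1,)], [50], t_end=1000.0)
```

The GPU implementations described in the abstract run many independent realizations of exactly this kind of loop in parallel, which is where the reported ~85x speedup comes from.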

  19. Biologically based modelling and simulation of carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Ouchi, Noriyuki B.

    2003-01-01

    The process of carcinogenesis is studied by computer simulation. In general, a large number of experimental samples is needed to detect mutations at low doses, but in practice it is difficult to obtain that much data. To address the low-dose situation, it is useful to study the process of carcinogenesis using a biologically based mathematical model. We have mainly studied it using the so-called 'multi-stage model'; however, this model becomes complicated as recent findings from molecular biology experiments are incorporated. Moreover, because the basic idea of the multi-stage model rests on epidemiologic data showing a log-log variation of cancer incidence with age, it is difficult to compare with experimental data from irradiated cell culture systems, which have been accumulating in recent years. Taking the above into consideration, we concluded that a new model should have the following features: 1) the unit of the target system is a cell, 2) new information from molecular biology can be easily introduced, 3) spatial coordinates are included for checking colony formation or tumorigenesis. In this presentation, we show the details of the model and some simulation results on carcinogenesis. (author)

  20. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    Science.gov (United States)

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). 
Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour

  1. Realistic Scheduling Mechanism for Smart Homes

    Directory of Open Access Journals (Sweden)

    Danish Mahmood

    2016-03-01

    Full Text Available In this work, we propose a Realistic Scheduling Mechanism (RSM) to reduce user frustration and enhance appliance utility by effectively classifying appliances with their respective constraints and times of use. Algorithms are proposed for the functioning of home appliances. A 24 h time frame is divided into four logical sub-time slots, each composed of 360 min, or 6 h. In these sub-time slots, only desired appliances (with respect to the appliance classification) are scheduled to raise appliance utility, restricting power consumption by a dynamically modelled power usage limiter that takes not only the electricity consumer but also the electricity supplier into account. Once the appliance, time and power usage limiter modelling is done, we use a nature-inspired heuristic algorithm, Binary Particle Swarm Optimization (BPSO), to optimally form schedules under the given constraints for each sub-time slot. These schedules aim to achieve an equilibrium between appliance utility and cost effectiveness. For validation of the proposed RSM, we provide a comparative analysis among unscheduled electrical load usage, load scheduled directly by BPSO and load scheduled by RSM, reflecting user comfort, which is based upon cost effectiveness and appliance utility.
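
The BPSO step at the heart of RSM is not spelled out in the abstract; a generic binary PSO (the sigmoid-velocity variant of Kennedy and Eberhart) applied to a toy appliance-slot objective might look like the sketch below. All parameter values and the objective function are placeholders, not the paper's:

```python
import math
import random

def bpso(fitness, n_bits, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimise `fitness` over bit-vectors with binary PSO: real-valued
    velocities are squashed by a sigmoid into the probability of each bit being 1."""
    rng = random.Random(seed)
    X = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]                      # personal bests
    pf = [fitness(x) for x in X]
    g = min(range(n_particles), key=lambda i: pf[i])
    gbest, gf = pbest[g][:], pf[g]                 # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                prob = 1.0 / (1.0 + math.exp(-V[i][d]))   # sigmoid transfer
                X[i][d] = 1 if rng.random() < prob else 0
            f = fitness(X[i])
            if f < pf[i]:
                pbest[i], pf[i] = X[i][:], f
                if f < gf:
                    gbest, gf = X[i][:], f
    return gbest, gf

# Toy objective: pick appliance on/off bits for 8 slots to hit a target load of 3
best, cost = bpso(lambda bits: abs(sum(bits) - 3), 8)
```

In the paper's setting the fitness would instead combine the power usage limiter, appliance classes and cost, but the update loop is the same.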

  2. Yeast as a Model System to Study Tau Biology

    Directory of Open Access Journals (Sweden)

    Ann De Vos

    2011-01-01

    Full Text Available Hyperphosphorylated and aggregated human protein tau constitutes a hallmark of a multitude of neurodegenerative diseases called tauopathies, exemplified by Alzheimer's disease. In spite of an enormous amount of research performed on tau biology, several crucial questions concerning the mechanisms of tau toxicity remain unanswered. In this paper we will highlight some of the processes involved in tau biology and pathology, focusing on tau phosphorylation and the interplay with oxidative stress. In addition, we will introduce the development of a human tau-expressing yeast model, and discuss some crucial results obtained in this model, highlighting its potential in the elucidation of cellular processes leading to tau toxicity.

  3. Biological profiling and dose-response modeling tools ...

    Science.gov (United States)

    Through its ToxCast project, the U.S. EPA has developed a battery of in vitro high throughput screening (HTS) assays designed to assess the potential toxicity of environmental chemicals. At present, over 1800 chemicals have been tested in up to 600 assays, yielding a large number of concentration-response data sets. Standard processing of these data sets involves finding a best-fitting mathematical model and the set of parameters that specify it. The model parameters include quantities such as the half-maximal activity concentration (or “AC50”) that have biological significance and can be used to inform the efficacy or potency of a given chemical with respect to a given assay. All of this data is processed and stored in an online-accessible database and website: http://actor.epa.gov/dashboard2. Results from these in vitro assays are used in a multitude of ways. New pathways and targets can be identified and incorporated into new or existing adverse outcome pathways (AOPs). Pharmacokinetic models such as those implemented in EPA’s HTTK R package can be used to translate an in vitro concentration into an in vivo dose; i.e., one can predict the oral equivalent dose that might be expected to activate a specific biological pathway. Such predicted values can then be compared with estimated actual human exposures to prioritize chemicals for further testing. Any quantitative examination should be accompanied by estimation of uncertainty. We are developing met
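
The AC50 mentioned above is the concentration at which a fitted concentration-response curve reaches half its maximum. A minimal sketch with a Hill model and a crude log-grid least-squares fit illustrates the idea; the actual ToxCast pipeline uses its own model family and fitting procedure, so everything below is illustrative:

```python
def hill(conc, top, ac50, n=1.0):
    """Hill concentration-response curve; equals top/2 exactly at conc == ac50."""
    return top * conc ** n / (ac50 ** n + conc ** n)

# Synthetic concentration-response data generated with a known AC50 of 1.0
data = [(c, hill(c, top=100.0, ac50=1.0)) for c in (0.01, 0.1, 1.0, 10.0, 100.0)]

def sse(ac50):
    """Sum of squared errors of a candidate AC50 against the data."""
    return sum((hill(c, 100.0, ac50) - r) ** 2 for c, r in data)

# Grid search over log-spaced candidate AC50 values from ~0.001 to ~1000
grid = [10 ** (k / 10.0) for k in range(-30, 31)]
ac50_fit = min(grid, key=sse)
```

A real fit would also estimate the top and Hill coefficient, and report parameter uncertainty, which is exactly the point of the abstract's closing sentence.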

  4. A realistic closed-form radiobiological model of clinical tumor-control data incorporating intertumor heterogeneity

    International Nuclear Information System (INIS)

    Roberts, Stephen A.; Hendry, Jolyon H.

    1998-01-01

    Purpose: To investigate the role of intertumor heterogeneity in clinical tumor control datasets and the relationship to in vitro measurements of tumor biopsy samples. Specifically, to develop a modified linear-quadratic (LQ) model incorporating such heterogeneity that it is practical to fit to clinical tumor-control datasets. Methods and Materials: We developed a modified version of the linear-quadratic (LQ) model for tumor control, incorporating a (lagged) time factor to allow for tumor cell repopulation. We explicitly took into account the interpatient heterogeneity in clonogen number, radiosensitivity, and repopulation rate. Using this model, we could generate realistic TCP curves using parameter estimates consistent with those reported from in vitro studies, subject to the inclusion of a radiosensitivity (or dose)-modifying factor. We then demonstrated that the model was dominated by the heterogeneity in α (tumor radiosensitivity) and derived an approximate simplified model incorporating this heterogeneity. This simplified model is expressible in a compact closed form, which it is practical to fit to clinical datasets. Using two previously analysed datasets, we fit the model using direct maximum-likelihood techniques and obtained parameter estimates that were, again, consistent with the experimental data on the radiosensitivity of primary human tumor cells. This heterogeneity model includes the same number of adjustable parameters as the standard LQ model. Results: The modified model provides parameter estimates that can easily be reconciled with the in vitro measurements. The simplified (approximate) form of the heterogeneity model is a compact, closed-form probit function that can readily be fitted to clinical series by conventional maximum-likelihood methodology. This heterogeneity model provides a slightly better fit to the datasets than the conventional LQ model, with the same number of fitted parameters. The parameter estimates of the clinically
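
The heterogeneity effect the authors model can be sketched with a Poisson TCP averaged over a normal distribution of α across patients. The formula below is the generic single-fraction Poisson/LQ form without the β, fractionation and repopulation terms the paper includes, and all numbers are illustrative:

```python
import math

def tcp_poisson(n_clonogens, alpha, dose):
    """Poisson TCP for a uniform radiosensitivity: TCP = exp(-N * exp(-alpha * D))."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose))

def tcp_heterogeneous(n_clonogens, alpha_mean, alpha_sd, dose, n_quad=201):
    """Population TCP: average the Poisson TCP over a (truncated) normal
    distribution of alpha across patients, the dominant heterogeneity
    identified in the abstract. Simple midpoint quadrature."""
    lo = max(alpha_mean - 4 * alpha_sd, 0.0)
    hi = alpha_mean + 4 * alpha_sd
    step = (hi - lo) / n_quad
    total, weight = 0.0, 0.0
    for k in range(n_quad):
        a = lo + (k + 0.5) * step
        w = math.exp(-0.5 * ((a - alpha_mean) / alpha_sd) ** 2)
        total += w * tcp_poisson(n_clonogens, a, dose)
        weight += w
    return total / weight

# Illustrative: 1e7 clonogens, mean alpha 0.3 /Gy (SD 0.06), total dose 50 Gy
t_hom = tcp_poisson(1e7, 0.3, 50.0)
t_het = tcp_heterogeneous(1e7, 0.3, 0.06, 50.0)
```

Averaging over α flattens the dose-response curve, which is how heterogeneity reconciles steep in vitro survival curves with the shallow TCP curves seen clinically.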

  5. Dynamical behavior of a three species food chain model with Beddington-DeAngelis functional response

    International Nuclear Information System (INIS)

    Naji, Raid Kamel; Balasim, Alla Tariq

    2007-01-01

    A three species food chain model with Beddington-DeAngelis functional response is investigated. The local stability analysis is carried out and the global behavior is simulated numerically for a biologically feasible choice of parameters. The persistence conditions of the food chain model are established. Bifurcation diagrams are obtained for different parameters of the model after intensive numerical simulations. The simulation results show that the model can exhibit chaotic dynamics for realistic and biologically feasible parameter values. Finally, the effect of immigration within the prey species is investigated. It is observed that adding a small amount of constant immigration to the prey species stabilizes the system.
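
A minimal sketch of the kind of system studied here, assuming a logistic prey and Beddington-DeAngelis coupling between adjacent trophic levels; the parameter names and values are illustrative placeholders, not the paper's, and a crude forward-Euler step stands in for a proper ODE solver:

```python
def bd_response(u, v, a, b, c):
    """Beddington-DeAngelis functional response: a*u*v / (1 + b*u + c*v)."""
    return a * u * v / (1.0 + b * u + c * v)

def food_chain_step(state, dt, p):
    """One forward-Euler step for prey x, predator y and top predator z."""
    x, y, z = state
    fxy = bd_response(x, y, p["a1"], p["b1"], p["c1"])   # predation on prey
    fyz = bd_response(y, z, p["a2"], p["b2"], p["c2"])   # predation on predator
    dx = x * (1.0 - x) - fxy
    dy = p["e1"] * fxy - p["d1"] * y - fyz
    dz = p["e2"] * fyz - p["d2"] * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

p = dict(a1=5.0, b1=3.0, c1=0.1, e1=0.8, d1=0.4,
         a2=1.0, b2=2.0, c2=0.1, e2=0.7, d2=0.1)
s = (0.8, 0.2, 0.1)
for _ in range(10000):          # integrate to t = 10 with dt = 0.001
    s = food_chain_step(s, 0.001, p)
```

Scanning a parameter (e.g. a1) and recording long-run extrema of x is how the bifurcation diagrams in the abstract are produced; constant prey immigration would add a "+ i" term to dx.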

  6. Production of a faithful realistic phantom to human head and thermal neutron flux measurement on the brain surface. Cooperative research

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Kazuyoshi; Kumada, Hiroaki; Kishi, Toshiaki; Torii, Yoshiya; Uchiyama, Junzo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Endo, Kiyoshi; Yamamoto, Tetsuya; Matsumura, Akira; Nose, Tadao [Tsukuba Univ., Tsukuba, Ibaraki (Japan)

    2002-12-01

    Thermal neutron flux is determined using gold wires in current BNCT irradiation, so post-irradiation evaluation of arbitrary points is limited by the number of these detectors. To make up for this weakness, the dose to a patient is simulated by a computational dose calculation supporting system. Alternatively, without computer simulation, a medical irradiation condition can be replicated experimentally using a realistic phantom produced from CT images by a rapid prototyping technique. This phantom was irradiated with the same JRR-4 neutron beam as in the clinical irradiation condition of the patient, and the thermal neutron distribution on the brain surface was measured in detail. This experimental evaluation technique using a realistic phantom is applicable to in vitro cell irradiation experiments on radiation biological effects as well as to in-phantom experiments for dosimetry under conditions closely matching the medical irradiation of the patient. (author)

  7. Production of a faithful realistic phantom to human head and thermal neutron flux measurement on the brain surface. Cooperative research

    CERN Document Server

    Yamamoto, K; Kishi, T; Kumada, H; Matsumura, A; Nose, T; Torii, Y; Uchiyama, J; Yamamoto, T

    2002-01-01

    Thermal neutron flux is determined using gold wires in current BNCT irradiation, so post-irradiation evaluation of arbitrary points is limited by the number of these detectors. To make up for this weakness, the dose to a patient is simulated by a computational dose calculation supporting system. Alternatively, without computer simulation, a medical irradiation condition can be replicated experimentally using a realistic phantom produced from CT images by a rapid prototyping technique. This phantom was irradiated with the same JRR-4 neutron beam as in the clinical irradiation condition of the patient, and the thermal neutron distribution on the brain surface was measured in detail. This experimental evaluation technique using a realistic phantom is applicable to in vitro cell irradiation experiments on radiation biological effects as well as to in-phantom experiments for dosimetry under conditions closely matching the medical irradiation of the patient.

  8. Separable expansion for realistic multichannel scattering problems

    International Nuclear Information System (INIS)

    Canton, L.; Cattapan, G.; Pisent, G.

    1987-01-01

    A new approach to the multichannel scattering problem with realistic local or nonlocal interactions is developed. By employing the negative-energy solutions of uncoupled Sturmian eigenvalue problems referring to simple auxiliary potentials, the coupling interactions appearing in the original multichannel problem are approximated by finite-rank potentials. By resorting to integral-equation techniques, the coupled-channel equations are then reduced to linear algebraic equations which can be straightforwardly solved. Compact algebraic expressions for the relevant scattering matrix elements are thus obtained. The convergence of the method is tested in the single-channel case with realistic optical potentials. Excellent agreement is obtained with a few terms in the separable expansion for both real and absorptive interactions.

  9. Realistic nurse-led policy implementation, optimization and evaluation: novel methodological exemplar.

    Science.gov (United States)

    Noyes, Jane; Lewis, Mary; Bennett, Virginia; Widdas, David; Brombley, Karen

    2014-01-01

    To report the first large-scale realistic nurse-led implementation, optimization and evaluation of a complex children's continuing-care policy. Health policies are increasingly complex, involve multiple Government departments and frequently fail to translate into better patient outcomes. Realist methods have not yet been adapted for policy implementation. Research methodology - Evaluation using theory-based realist methods for policy implementation. An expert group developed the policy and supporting tools. Implementation and evaluation design integrated diffusion of innovation theory with multiple case study and adapted realist principles. Practitioners in 12 English sites worked with Consultant Nurse implementers to manipulate the programme theory and logic of new decision-support tools and care pathway to optimize local implementation. Methods included key-stakeholder interviews, developing practical diffusion of innovation processes using key-opinion leaders and active facilitation strategies and a mini-community of practice. New and existing processes and outcomes were compared for 137 children during 2007-2008. Realist principles were successfully adapted to a shorter policy implementation and evaluation time frame. Important new implementation success factors included facilitated implementation that enabled 'real-time' manipulation of programme logic and local context to best-fit evolving theories of what worked; using local experiential opinion to change supporting tools to more realistically align with local context and what worked; and having sufficient existing local infrastructure to support implementation. Ten mechanisms explained implementation success and differences in outcomes between new and existing processes. Realistic policy implementation methods have advantages over top-down approaches, especially where clinical expertise is low and unlikely to diffuse innovations 'naturally' without facilitated implementation and local optimization. 

  10. Does preliminary optimisation of an anatomically correct skull-brain model using simple simulants produce clinically realistic ballistic injury fracture patterns?

    Science.gov (United States)

    Mahoney, P F; Carr, D J; Delaney, R J; Hunt, N; Harrison, S; Breeze, J; Gibb, I

    2017-07-01

    Ballistic head injury remains a significant threat to military personnel. Studying such injuries requires a model that can be used with a military helmet. This paper describes further work on a skull-brain model using skulls made from three different polyurethane plastics and a series of skull 'fills' to simulate brain (3, 5, 7 and 10% gelatine by mass and PermaGel™). The models were subjected to ballistic impact from 7.62 × 39 mm mild steel core bullets. The first part of the work compares the different polyurethanes (mean bullet muzzle velocity of 708 m/s), and the second part compares the different fills (mean bullet muzzle velocity of 680 m/s). The impact events were filmed using high speed cameras. The resulting fracture patterns in the skulls were reviewed and scored by five clinicians experienced in assessing penetrating head injury. In over half of the models, one or more assessors felt aspects of the fracture pattern were close to real injury. Limitations of the model include the skull being manufactured in two parts and the lack of a realistic skin layer. Further work is ongoing to address these.

  11. From Minimal to Realistic Supersymmetric SU(5) Grand Unification

    CERN Document Server

    Altarelli, Guido; Masina, I; Altarelli, Guido; Feruglio, Ferruccio; Masina, Isabella

    2000-01-01

    We construct and discuss a "realistic" example of SUSY SU(5) GUT model, with an additional U(1) flavour symmetry, that is not plagued by the need of large fine tunings, like those associated with doublet-triplet splitting in the minimal model, and that leads to an acceptable phenomenology. This includes coupling unification with a value of alpha_s(m_Z) in much better agreement with the data than in the minimal version, an acceptable hierarchical pattern for fermion masses and mixing angles, also including neutrino masses and mixings, and a proton decay rate compatible with present limits (but the discovery of proton decay should be within reach of the next generation of experiments). In the neutrino sector the preferred solution is one with nearly maximal mixing both for atmospheric and solar neutrinos.

  12. Building executable biological pathway models automatically from BioPAX

    NARCIS (Netherlands)

    Willemsen, Timo; Feenstra, Anton; Groth, Paul

    2013-01-01

    The amount of biological data exposed in semantic formats is steadily increasing. In particular, pathway information (a model of how molecules interact within a cell) from databases such as KEGG and WikiPathways are available in a standard RDF-based format BioPAX. However, these models are

  13. The construction of 'realistic' four-dimensional strings through orbifolds

    International Nuclear Information System (INIS)

    Font, A.; Quevedo, F.; Sierra, A.

    1990-01-01

    We discuss the construction of 'realistic' lower rank 4-dimensional strings, through symmetric orbifolds with background fields. We present Z_3 three-generation SU(3)×SU(2)×U(1) models as well as models incorporating a left-right SU(2)_L×SU(2)_R×U(1)_B-L symmetry in which proton stability is automatically guaranteed. Conformal field theory selection rules are used to find the flat directions to all orders which lead to these low-rank models and to study the relevant Yukawa couplings. A hierarchical structure of quark-lepton masses appears naturally in some models. We also present a detailed study of the structure of the Z_3×Z_3 orbifold including the generalized GSO projection, the effect of discrete torsion and the conformal field theory Yukawa coupling selection rules. All these points are illustrated with a three-generation Z_3×Z_3 model. We have made an effort to write a self-contained presentation in order to make this material available to non-string experts interested in the phenomenological aspects of this theory. (orig.)

  14. The construction of ``realistic'' four-dimensional strings through orbifolds

    Science.gov (United States)

    Font, A.; Ibáñez, L. E.; Quevedo, F.; Sierra, A.

    1990-02-01

    We discuss the construction of "realistic" lower rank 4-dimensional strings, through symmetric orbifolds with background fields. We present Z_3 three-generation SU(3) × SU(2) × U(1) models as well as models incorporating a left-right SU(2)_L × SU(2)_R × U(1)_B-L symmetry in which proton stability is automatically guaranteed. Conformal field theory selection rules are used to find the flat directions to all orders which lead to these low-rank models and to study the relevant Yukawa couplings. A hierarchical structure of quark-lepton masses appears naturally in some models. We also present a detailed study of the structure of the Z_3 × Z_3 orbifold including the generalized GSO projection, the effect of discrete torsion and the conformal field theory Yukawa coupling selection rules. All these points are illustrated with a three-generation Z_3 × Z_3 model. We have made an effort to write a self-contained presentation in order to make this material available to non-string experts interested in the phenomenological aspects of this theory.

  15. Causal biological network database: a comprehensive platform of causal biological network models focused on the pulmonary and vascular systems.

    Science.gov (United States)

    Boué, Stéphanie; Talikka, Marja; Westra, Jurjen Willem; Hayes, William; Di Fabio, Anselmo; Park, Jennifer; Schlage, Walter K; Sewer, Alain; Fields, Brett; Ansari, Sam; Martin, Florian; Veljkovic, Emilija; Kenney, Renee; Peitsch, Manuel C; Hoeng, Julia

    2015-01-01

    With the wealth of publications and data available, powerful and transparent computational approaches are required to represent measured data and scientific knowledge in a computable and searchable format. We developed a set of biological network models, scripted in the Biological Expression Language, that reflect causal signaling pathways across a wide range of biological processes, including cell fate, cell stress, cell proliferation, inflammation, tissue repair and angiogenesis in the pulmonary and cardiovascular context. This comprehensive collection of networks is now freely available to the scientific community in a centralized web-based repository, the Causal Biological Network database, which is composed of over 120 manually curated and well annotated biological network models and can be accessed at http://causalbionet.com. The website accesses a MongoDB, which stores all versions of the networks as JSON objects and allows users to search for genes, proteins, biological processes, small molecules and keywords in the network descriptions to retrieve biological networks of interest. The content of the networks can be visualized and browsed. Nodes and edges can be filtered and all supporting evidence for the edges can be browsed and is linked to the original articles in PubMed. Moreover, networks may be downloaded for further visualization and evaluation. Database URL: http://causalbionet.com © The Author(s) 2015. Published by Oxford University Press.

  16. Achilles and the tortoise: Some caveats to mathematical modeling in biology.

    Science.gov (United States)

    Gilbert, Scott F

    2018-01-31

    Mathematical modeling has recently become a much-lauded enterprise, and many funding agencies seek to prioritize this endeavor. However, there are certain dangers associated with mathematical modeling, and knowledge of these pitfalls should also be part of a biologist's training in this set of techniques. (1) Mathematical models are limited by known science; (2) Mathematical models can tell what can happen, but not what did happen; (3) A model does not have to conform to reality, even if it is logically consistent; (4) Models abstract from reality, and sometimes what they eliminate is critically important; (5) Mathematics can present a Platonic ideal to which biologically organized matter strives, rather than a trial-and-error bumbling through evolutionary processes. This "Unity of Science" approach, which sees biology as the lowest physical science and mathematics as the highest science, is part of a Western belief system, often called the Great Chain of Being (or Scala Natura), that sees knowledge emerge as one passes from biology to chemistry to physics to mathematics, in an ascending progression of reason being purification from matter. This is also an informal model for the emergence of new life. There are now other informal models for integrating development and evolution, but each has its limitations. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Information-theoretic analysis of the dynamics of an executable biological model.

    Directory of Open Access Journals (Sweden)

    Avital Sadot

    Full Text Available To facilitate analysis and understanding of biological systems, large-scale data are often integrated into models using a variety of mathematical and computational approaches. Such models describe the dynamics of the biological system and can be used to study the changes in the state of the system over time. For many model classes, such as discrete or continuous dynamical systems, there exist appropriate frameworks and tools for analyzing system dynamics. However, the heterogeneous information that encodes and bridges molecular and cellular dynamics, inherent to fine-grained molecular simulation models, presents significant challenges to the study of system dynamics. In this paper, we present an algorithmic information theory based approach for the analysis and interpretation of the dynamics of such executable models of biological systems. We apply a normalized compression distance (NCD analysis to the state representations of a model that simulates the immune decision making and immune cell behavior. We show that this analysis successfully captures the essential information in the dynamics of the system, which results from a variety of events including proliferation, differentiation, or perturbations such as gene knock-outs. We demonstrate that this approach can be used for the analysis of executable models, regardless of the modeling framework, and for making experimentally quantifiable predictions.
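
The normalized compression distance used in this work has a simple closed form. A small stand-in using zlib as the compressor (the paper does not name its compressor, so that choice and the toy "state dump" strings are assumptions) shows the distance behaving as expected:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance with compressed lengths C(.):
    NCD(x, y) = (C(x + y) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy model-state representations: similar states should be closer than dissimilar ones
state_a = b"Tcell:active Bcell:resting cytokine:IL2 " * 50
state_b = b"Tcell:active Bcell:resting cytokine:IL4 " * 50
state_c = b"macrophage:apoptotic signal:TNF knockout " * 50
```

Because NCD needs only a serialized state, this works for any executable-model framework, which is the generality the abstract emphasizes.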

  18. How to Build a Course in Mathematical-Biological Modeling: Content and Processes for Knowledge and Skill

    Science.gov (United States)

    Hoskinson, Anne-Marie

    2010-01-01

    Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical-biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity…

  19. Quantifying introgression risk with realistic population genetics.

    Science.gov (United States)

    Ghosh, Atiyo; Meirmans, Patrick G; Haccou, Patsy

    2012-12-07

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, repeated invasions and stochasticity. In addition, the use of linkage as a risk mitigation strategy has not been studied properly yet with genetic introgression models. Current genetic introgression studies fail to take repeated invasions and demographic stochasticity into account properly, and use incorrect measures of introgression risk that can be manipulated by arbitrary choices. In this study, we present proper methods for risk quantification that overcome these difficulties. We generalize a probabilistic risk measure, the so-called hazard rate of introgression, for application to introgression models with complex genetics and small natural population sizes. We illustrate the method by studying the effects of linkage and recombination on transgene introgression risk at different population sizes.

  20. A DTI-based model for TMS using the independent impedance method with frequency-dependent tissue parameters

    Science.gov (United States)

    De Geeter, N.; Crevecoeur, G.; Dupré, L.; Van Hecke, W.; Leemans, A.

    2012-04-01

    Accurate simulations on detailed realistic head models are necessary to gain a better understanding of the response to transcranial magnetic stimulation (TMS). Hitherto, head models with simplified geometries and constant isotropic material properties are often used, whereas some biological tissues have anisotropic characteristics which vary naturally with frequency. Moreover, most computational methods do not take the tissue permittivity into account. Therefore, we calculate the electromagnetic behaviour due to TMS in a head model with realistic geometry and where realistic dispersive anisotropic tissue properties are incorporated, based on T1-weighted and diffusion-weighted magnetic resonance images. This paper studies the impact of tissue anisotropy, permittivity and frequency dependence, using the anisotropic independent impedance method. The results show that anisotropy yields differences up to 32% and 19% of the maximum induced currents and electric field, respectively. Neglecting the permittivity values leads to a decrease of about 72% and 24% of the maximum currents and field, respectively. Implementing the dispersive effects of biological tissues results in a difference of 6% of the maximum currents. The cerebral voxels show limited sensitivity of the induced electric field to changes in conductivity and permittivity, whereas the field varies approximately linearly with frequency. These findings illustrate the importance of including each of the above parameters in the model and confirm the need for accuracy in the applied patient-specific method, which can be used in computer-assisted TMS.

  1. Continuum Modeling of Biological Network Formation

    KAUST Repository

    Albi, Giacomo

    2017-04-10

    We present an overview of recent analytical and numerical results for the elliptic–parabolic system of partial differential equations proposed by Hu and Cai, which models the formation of biological transportation networks. The model describes the pressure field using a Darcy type equation and the dynamics of the conductance network under pressure force effects. Randomness in the material structure is represented by a linear diffusion term and conductance relaxation by an algebraic decay term. We first introduce micro- and mesoscopic models and show how they are connected to the macroscopic PDE system. Then, we provide an overview of analytical results for the PDE model, focusing mainly on the existence of weak and mild solutions and analysis of the steady states. The analytical part is complemented by extensive numerical simulations. We propose a discretization based on finite elements and study the qualitative properties of network structures for various parameter values.

  2. A Conceptual Model to Identify Intent to Use Chemical-Biological Weapons

    Directory of Open Access Journals (Sweden)

    Mary Zalesny

    2017-10-01

This paper describes a conceptual model to identify and interrelate indicators of intent of non-state actors to use chemical or biological weapons. The model expands on earlier efforts to understand intent to use weapons of mass destruction by building upon well-researched theories of intent and behavior and focusing on a sub-set of weapons of mass destruction (WMD) to account for the distinct challenges of employing different types of WMD in violent acts. The conceptual model is presented as a first, critical step in developing a computational model for assessing the potential for groups to use chemical or biological weapons.

  3. Defining and detecting structural sensitivity in biological models: developing a new framework.

    Science.gov (United States)

    Adamson, M W; Morozov, A Yu

    2014-12-01

When we construct mathematical models to represent biological systems, there is always uncertainty with regard to the model specification, whether with respect to the parameters or to the formulation of model functions. Sometimes choosing two different functions with close shapes in a model can result in substantially different model predictions: a phenomenon known in the literature as structural sensitivity, which is a significant obstacle to improving the predictive power of biological models. In this paper, we revisit the general definition of structural sensitivity, compare several more specific definitions and discuss their usefulness for the construction and analysis of biological models. Then we propose a general approach to reveal structural sensitivity with regard to certain system properties, which considers infinite-dimensional neighbourhoods of the model functions: a far more powerful technique than the conventional approach of varying parameters for a fixed functional form. In particular, we suggest a rigorous method to unearth sensitivity with respect to the local stability of systems' equilibrium points. We present a method for specifying the neighbourhood of a general unknown function with [Formula: see text] inflection points in terms of a finite number of local function properties, and provide a rigorous proof of its completeness. Using this powerful result, we implement our method to explore sensitivity in several well-known multicomponent ecological models and demonstrate the existence of structural sensitivity in these models. Finally, we argue that structural sensitivity is an important intrinsic property of biological models, and a direct consequence of the complexity of the underlying real systems.
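
The core mechanism can be illustrated without the full framework: local stability depends on the derivative of a model function at equilibrium, and two functions with close values can have markedly different derivatives. A small sketch, assuming an illustrative pair of functional responses (Holling type II vs. Ivlev) chosen for this example, not taken from the paper:

```python
import math

# Two "close" functional responses, e.g. for predator grazing:
holling = lambda x: x / (1.0 + x)            # Holling type II
ivlev = lambda x: 1.0 - math.exp(-0.7 * x)   # Ivlev, tuned to lie nearby

def derivative(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

grid = [0.1 * i for i in range(31)]          # x in [0, 3]
max_gap = max(abs(holling(x) - ivlev(x)) for x in grid)
slope_h = derivative(holling, 1.0)           # 0.25 analytically
slope_i = derivative(ivlev, 1.0)             # 0.7 * exp(-0.7), about 0.35

print(max_gap)            # the curves stay within ~0.13 of each other
print(slope_i / slope_h)  # yet their slopes at x = 1 differ by ~40%
```

If x = 1 were an equilibrium, a stability criterion involving this slope could change outcome between the two formulations even though the curves are nearly indistinguishable by eye, which is precisely the sensitivity the paper formalizes via function neighbourhoods.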

  4. Numerical study of water diffusion in biological tissues using an improved finite difference method

    International Nuclear Information System (INIS)

    Xu Junzhong; Does, Mark D; Gore, John C

    2007-01-01

    An improved finite difference (FD) method has been developed in order to calculate the behaviour of the nuclear magnetic resonance signal variations caused by water diffusion in biological tissues more accurately and efficiently. The algorithm converts the conventional image-based finite difference method into a convenient matrix-based approach and includes a revised periodic boundary condition which eliminates the edge effects caused by artificial boundaries in conventional FD methods. Simulated results for some modelled tissues are consistent with analytical solutions for commonly used diffusion-weighted pulse sequences, whereas the improved FD method shows improved efficiency and accuracy. A tightly coupled parallel computing approach was also developed to implement the FD methods to enable large-scale simulations of realistic biological tissues. The potential applications of the improved FD method for understanding diffusion in tissues are also discussed. (note)
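
The two ingredients the note highlights, a matrix-based (whole-array) update and a periodic boundary condition, can be sketched in a one-dimensional toy version; the reduction to 1D and all parameter values below are our assumptions, not the authors' implementation:

```python
import numpy as np

def diffuse_1d_periodic(u0, D, dx, dt, steps):
    """Explicit finite-difference update for du/dt = D * d2u/dx2 with a
    periodic boundary, applied as whole-array operations (via np.roll)
    rather than per-voxel loops."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for D*dt/dx^2 > 0.5"
    u = u0.astype(float).copy()
    for _ in range(steps):
        u = u + r * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))
    return u

u0 = np.zeros(100)
u0[50] = 1.0                                   # point source of magnetization
u = diffuse_1d_periodic(u0, D=1.0, dx=1.0, dt=0.4, steps=500)
print(u.sum(), u.max())                        # mass conserved; peak spreads
```

Because the boundary is periodic, the total signal is conserved to machine precision, which is how a revised periodic boundary condition avoids the artificial edge effects of a truncated domain.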

  5. Automatic skull segmentation from MR images for realistic volume conductor models of the head: Assessment of the state-of-the-art.

    Science.gov (United States)

    Nielsen, Jesper D; Madsen, Kristoffer H; Puonti, Oula; Siebner, Hartwig R; Bauer, Christian; Madsen, Camilla Gøbel; Saturnino, Guilherme B; Thielscher, Axel

    2018-03-12

    Anatomically realistic volume conductor models of the human head are important for accurate forward modeling of the electric field during transcranial brain stimulation (TBS), electro- (EEG) and magnetoencephalography (MEG). In particular, the skull compartment exerts a strong influence on the field distribution due to its low conductivity, suggesting the need to represent its geometry accurately. However, automatic skull reconstruction from structural magnetic resonance (MR) images is difficult, as compact bone has a very low signal in magnetic resonance imaging (MRI). Here, we evaluate three methods for skull segmentation, namely FSL BET2, the unified segmentation routine of SPM12 with extended spatial tissue priors, and the skullfinder tool of BrainSuite. To our knowledge, this study is the first to rigorously assess the accuracy of these state-of-the-art tools by comparison with CT-based skull segmentations on a group of ten subjects. We demonstrate several key factors that improve the segmentation quality, including the use of multi-contrast MRI data, the optimization of the MR sequences and the adaptation of the parameters of the segmentation methods. We conclude that FSL and SPM12 achieve better skull segmentations than BrainSuite. The former methods obtain reasonable results for the upper part of the skull when a combination of T1- and T2-weighted images is used as input. The SPM12-based results can be improved slightly further by means of simple morphological operations to fix local defects. In contrast to FSL BET2, the SPM12-based segmentation with extended spatial tissue priors and the BrainSuite-based segmentation provide coarse reconstructions of the vertebrae, enabling the construction of volume conductor models that include the neck. We exemplarily demonstrate that the extended models enable a more accurate estimation of the electric field distribution during transcranial direct current stimulation (tDCS) for montages that involve extraencephalic

  6. Track structure in biological models.

    Science.gov (United States)

    Curtis, S B

    1986-01-01

    High-energy heavy ions in the galactic cosmic radiation (HZE particles) may pose a special risk during long term manned space flights outside the sheltering confines of the earth's geomagnetic field. These particles are highly ionizing, and they and their nuclear secondaries can penetrate many centimeters of body tissue. The three dimensional patterns of ionizations they create as they lose energy are referred to as their track structure. Several models of biological action on mammalian cells attempt to treat track structure or related quantities in their formulation. The methods by which they do this are reviewed. The proximity function is introduced in connection with the theory of Dual Radiation Action (DRA). The ion-gamma kill (IGK) model introduces the radial energy-density distribution, which is a smooth function characterizing both the magnitude and extension of a charged particle track. The lethal, potentially lethal (LPL) model introduces lambda, the mean distance between relevant ion clusters or biochemical species along the track. Since very localized energy depositions (within approximately 10 nm) are emphasized, the proximity function as defined in the DRA model is not of utility in characterizing track structure in the LPL formulation.

  7. Lessons learned in using realist evaluation to assess maternal and newborn health programming in rural Bangladesh.

    Science.gov (United States)

    Adams, Alayne; Sedalia, Saroj; McNab, Shanon; Sarker, Malabika

    2016-03-01

    Realist evaluation furnishes valuable insight to public health practitioners and policy makers about how and why interventions work or don't work. Moving beyond binary measures of success or failure, it provides a systematic approach to understanding what goes on in the 'Black Box' and how implementation decisions in real life contexts can affect intervention effectiveness. This paper reflects on an experience in applying the tenets of realist evaluation to identify optimal implementation strategies for scale-up of Maternal and Newborn Health (MNH) programmes in rural Bangladesh. Supported by UNICEF, the three MNH programmes under consideration employed different implementation models to deliver similar services and meet similar MNH goals. Programme targets included adoption of recommended antenatal, post-natal and essential newborn care practices; health systems strengthening through improved referral, accountability and administrative systems, and increased community knowledge. Drawing on focused examples from this research, seven steps for operationalizing the realist evaluation approach are offered, while emphasizing the need to iterate and innovate in terms of methods and analysis strategies. The paper concludes by reflecting on lessons learned in applying realist evaluation, and the unique insights it yields regarding implementation strategies for successful MNH programming. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  8. Modelling of environmental impacts from biological treatment of organic municipal waste in EASEWASTE

    DEFF Research Database (Denmark)

    Boldrin, Alessio; Neidel, Trine Lund; Damgaard, Anders

    2011-01-01

The waste-LCA model EASEWASTE quantifies potential environmental effects from biological treatment of organic waste, based on mass and energy flows, emissions to air, water, soil and groundwater as well as effects from upstream and downstream processes. Default technologies for composting, anaerobic digestion and combinations hereof are available in the model, but the user can change all key parameters in the biological treatment module so that specific local plants and processes can be modelled. EASEWASTE is one of the newest waste-LCA models and the biological treatment module was built partly on features of earlier waste-LCA models, but offers additional facilities, more flexibility, transparency and user-friendliness. The paper presents the main features of the module and provides some examples illustrating the capability of the model in environmentally assessing and discriminating…

  9. Fatigue - determination of a more realistic usage factor

    International Nuclear Information System (INIS)

    Lang, H.

    2001-01-01

The ability to use a suitable counting method for determining the stress range spectrum in elastic and simplified elastic-plastic fatigue analyses is of crucial importance for determining a realistic usage factor. Determination of the elastic-plastic strain range using the K_e factor from fictitious elastically calculated loads is also important in the event of elastic behaviour being exceeded. This paper thus examines both points in detail. A fatigue module with additional options, which functions on this basis, is presented. The much more realistic determination of the usage factor presented here offers various economic benefits depending on the application.

  10. SBRML: a markup language for associating systems biology data with models.

    Science.gov (United States)

    Dada, Joseph O; Spasić, Irena; Paton, Norman W; Mendes, Pedro

    2010-04-01

    Research in systems biology is carried out through a combination of experiments and models. Several data standards have been adopted for representing models (Systems Biology Markup Language) and various types of relevant experimental data (such as FuGE and those of the Proteomics Standards Initiative). However, until now, there has been no standard way to associate a model and its entities to the corresponding datasets, or vice versa. Such a standard would provide a means to represent computational simulation results as well as to frame experimental data in the context of a particular model. Target applications include model-driven data analysis, parameter estimation, and sharing and archiving model simulations. We propose the Systems Biology Results Markup Language (SBRML), an XML-based language that associates a model with several datasets. Each dataset is represented as a series of values associated with model variables, and their corresponding parameter values. SBRML provides a flexible way of indexing the results to model parameter values, which supports both spreadsheet-like data and multidimensional data cubes. We present and discuss several examples of SBRML usage in applications such as enzyme kinetics, microarray gene expression and various types of simulation results. The XML Schema file for SBRML is available at http://www.comp-sys-bio.org/SBRML under the Academic Free License (AFL) v3.0.

  11. The effects of presenting multidigit mathematics problems in a realistic context on sixth graders' problem solving

    NARCIS (Netherlands)

    Hickendorff, M.

    2013-01-01

    Mathematics education and assessments increasingly involve arithmetic problems presented in context: a realistic situation that requires mathematical modeling. This study assessed the effects of such typical school mathematics contexts on two aspects of problem solving: performance and strategy use.

  12. Realistic rhetoric and legal decision

    Directory of Open Access Journals (Sweden)

    João Maurício Adeodato

    2017-06-01

The text aims to lay the foundations of a realistic rhetoric, from the descriptive perspective of how the legal decision actually takes place, without normative considerations. Aristotle's rhetorical idealism and its later prestige reduced rhetoric to the art of persuasion, eliminating important elements of sophistry, especially with regard to legal decision. It concludes with a rhetorical perspective of judicial activism in complex societies.

  13. Modeling of various contact theories for the manipulation of different biological micro/nanoparticles based on AFM

    Science.gov (United States)

    Korayem, M. H.; Taheri, M.

    2014-01-01

In this article, the modeling of various contact theories to be applied in the biomanipulation of different micro/nanoparticles based on the atomic force microscope has been studied, and the effect of adhesion force in different contact models on indentation depth and contact angle between tip and substrate has been explored for the target biological micro/nanoparticle. The contact models used in this research include the Hertz, JKR, DMT, BCP, COS, PT, and the SUN models. Also, the target particles comprise the biological micro/nanoparticles of DNA, yeast, platelet, and nanobacterium. Previous research works have investigated the contact models for the manipulation of non-biological gold micro/nanoparticles in the air environment. Since in a real biomanipulation situation the biological micro/nanoparticles are displaced in biological environments, in this article various contact theories for the biomanipulation of biological micro/nanoparticles in different biological environments have been modeled and compared for the first time. The results of modeling indicate that the use of the Hertz contact model in analyzing the biomanipulation of biological nanoparticles is not appropriate, because it does not take the adhesion force into consideration and thus produces a significant error. Also, all six contact models developed in this article show larger deformations for the studied bionanoparticles in comparison to the gold nanoparticles, which can be justified with regard to the mechanical properties of gold.
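
Why neglecting adhesion matters can be seen directly from the standard closed-form expressions for a sphere on a flat surface (Hertz, and DMT which simply adds the adhesion force 2*pi*R*w to the applied load). The parameter values below are illustrative orders of magnitude, not taken from the article:

```python
import math

def hertz_depth(F, R, E_star):
    """Hertz indentation depth: F = (4/3) * E* * sqrt(R) * delta**1.5,
    no adhesion considered."""
    return (3.0 * F / (4.0 * E_star * math.sqrt(R))) ** (2.0 / 3.0)

def dmt_depth(F, R, E_star, w):
    """DMT model: Hertz geometry with the adhesion force 2*pi*R*w
    added to the applied load."""
    return hertz_depth(F + 2.0 * math.pi * R * w, R, E_star)

F = 1e-9        # 1 nN applied by the AFM tip
R = 50e-9       # 50 nm particle radius
E_star = 1e6    # ~1 MPa reduced modulus, soft biomaterial order of magnitude
w = 0.05        # J/m^2 work of adhesion (assumed)

print(hertz_depth(F, R, E_star), dmt_depth(F, R, E_star, w))
print(2.0 * math.pi * R * w / F)   # adhesion force vs applied load
```

At this scale the adhesion term is more than an order of magnitude larger than the applied load, so a Hertz-based estimate of indentation depth is far too small, which is the article's argument against using Hertz for bionanoparticles.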

  14. Modeling the Biological Diversity of Pig Carcasses

    DEFF Research Database (Denmark)

    Erbou, Søren Gylling Hemmingsen

This thesis applies methods from medical image analysis for modeling the biological diversity of pig carcasses. The Danish meat industry is very focused on improving product quality and productivity by optimizing the use of the carcasses and increasing productivity in the abattoirs. In order… equipment is investigated, without the need for a calibration against a less accurate manual dissection. The rest of the contributions regard the construction and use of point distribution models (PDMs). PDMs are able to capture the shape variation of a population of shapes, in this case a 3D surface…

  15. A generic framework for individual-based modelling and physical-biological interaction

    DEFF Research Database (Denmark)

    Christensen, Asbjørn; Mariani, Patrizio; Payne, Mark R.

    2018-01-01

The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian… scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions…

  16. The SEEK: a platform for sharing data and models in systems biology.

    Science.gov (United States)

    Wolstencroft, Katy; Owen, Stuart; du Preez, Franco; Krebs, Olga; Mueller, Wolfgang; Goble, Carole; Snoep, Jacky L

    2011-01-01

    Systems biology research is typically performed by multidisciplinary groups of scientists, often in large consortia and in distributed locations. The data generated in these projects tend to be heterogeneous and often involves high-throughput "omics" analyses. Models are developed iteratively from data generated in the projects and from the literature. Consequently, there is a growing requirement for exchanging experimental data, mathematical models, and scientific protocols between consortium members and a necessity to record and share the outcomes of experiments and the links between data and models. The overall output of a research consortium is also a valuable commodity in its own right. The research and associated data and models should eventually be available to the whole community for reuse and future analysis. The SEEK is an open-source, Web-based platform designed for the management and exchange of systems biology data and models. The SEEK was originally developed for the SysMO (systems biology of microorganisms) consortia, but the principles and objectives are applicable to any systems biology project. The SEEK provides an index of consortium resources and acts as gateway to other tools and services commonly used in the community. For example, the model simulation tool, JWS Online, has been integrated into the SEEK, and a plug-in to PubMed allows publications to be linked to supporting data and author profiles in the SEEK. The SEEK is a pragmatic solution to data management which encourages, but does not force, researchers to share and disseminate their data to community standard formats. It provides tools to assist with management and annotation as well as incentives and added value for following these recommendations. Data exchange and reuse rely on sufficient annotation, consistent metadata descriptions, and the use of standard exchange formats for models, data, and the experiments they are derived from. In this chapter, we present the SEEK platform

  17. Biological nitrogen and phosphorus removal in membrane bioreactors: model development and parameter estimation.

    Science.gov (United States)

    Cosenza, Alida; Mannina, Giorgio; Neumann, Marc B; Viviani, Gaspare; Vanrolleghem, Peter A

    2013-04-01

    Membrane bioreactors (MBR) are being increasingly used for wastewater treatment. Mathematical modeling of MBR systems plays a key role in order to better explain their characteristics. Several MBR models have been presented in the literature focusing on different aspects: biological models, models which include soluble microbial products (SMP), physical models able to describe the membrane fouling and integrated models which couple the SMP models with the physical models. However, only a few integrated models have been developed which take into account the relationships between membrane fouling and biological processes. With respect to biological phosphorus removal in MBR systems, due to the complexity of the process, practical use of the models is still limited. There is a vast knowledge (and consequently vast amount of data) on nutrient removal for conventional-activated sludge systems but only limited information on phosphorus removal for MBRs. Calibration of these complex integrated models still remains the main bottleneck to their employment. The paper presents an integrated mathematical model able to simultaneously describe biological phosphorus removal, SMP formation/degradation and physical processes which also include the removal of organic matter. The model has been calibrated with data collected in a UCT-MBR pilot plant, located at the Palermo wastewater treatment plant, applying a modified version of a recently developed calibration protocol. The calibrated model provides acceptable correspondence with experimental data and can be considered a useful tool for MBR design and operation.

  18. Bending and Twisting the Embryonic Heart: A Computational Model for C-Looping Based on Realistic Geometry

    Directory of Open Access Journals (Sweden)

Yunfei Shi

    2014-08-01

The morphogenetic process of cardiac looping transforms the straight heart tube into a curved tube that resembles the shape of the future four-chambered heart. Although great progress has been made in identifying the molecular and genetic factors involved in looping, the physical mechanisms that drive this process have remained poorly understood. Recent work, however, has shed new light on this complicated problem. After briefly reviewing the current state of knowledge, we propose a relatively comprehensive hypothesis for the mechanics of the first phase of looping, termed c-looping, as the straight heart tube deforms into a c-shaped tube. According to this hypothesis, differential hypertrophic growth in the myocardium supplies the main forces that cause the heart tube to bend ventrally, while regional growth and contraction in the omphalomesenteric veins (primitive atria) and compressive loads exerted by the splanchnopleuric membrane drive rightward torsion. A computational model based on realistic embryonic heart geometry is used to test this hypothesis. The behavior of the model is in reasonable agreement with available experimental data from control and perturbed embryos, offering support for our hypothesis. The results also suggest, however, that several other mechanisms contribute secondarily to normal looping, and we speculate that these mechanisms play backup roles when looping is perturbed. Finally, some outstanding questions are discussed for future study.

  19. Determination of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Dietrich, Daniel L.; Ruff, Gary A.; Urban, David

    2013-01-01

    This paper expands on previous work that examined how large a fire a crew member could successfully survive and extinguish in the confines of a spacecraft. The hazards to the crew and equipment during an accidental fire include excessive pressure rise resulting in a catastrophic rupture of the vehicle skin, excessive temperatures that burn or incapacitate the crew (due to hyperthermia), carbon dioxide build-up or accumulation of other combustion products (e.g. carbon monoxide). The previous work introduced a simplified model that treated the fire primarily as a source of heat and combustion products and sink for oxygen prescribed (input to the model) based on terrestrial standards. The model further treated the spacecraft as a closed system with no capability to vent to the vacuum of space. The model in the present work extends this analysis to more realistically treat the pressure relief system(s) of the spacecraft, include more combustion products (e.g. HF) in the analysis and attempt to predict the fire spread and limiting fire size (based on knowledge of terrestrial fires and the known characteristics of microgravity fires) rather than prescribe them in the analysis. Including the characteristics of vehicle pressure relief systems has a dramatic mitigating effect by eliminating vehicle overpressure for all but very large fires and reducing average gas-phase temperatures.
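
The mitigating effect of a pressure relief system can be illustrated with a deliberately crude lumped-volume sketch (ideal gas, all fire power heating the cabin gas, instantaneous venting down to a set point); all numbers and simplifications here are our assumptions, not the model of the paper:

```python
R_GAS = 8.314                         # J/(mol K)
V, T0, P0 = 10.0, 293.0, 101_325.0    # 10 m^3 cabin at ~20 C and 1 atm
N0 = P0 * V / (R_GAS * T0)            # moles of cabin gas
CV = 20.8                             # J/(mol K), roughly diatomic air

def peak_pressure(q_fire, duration, dt=0.1, p_relief=None):
    """Peak cabin pressure for a fire of power q_fire [W]: all heat goes into
    the gas; an optional relief valve vents gas down to its set point."""
    moles, temp, p_max = N0, T0, P0
    t = 0.0
    while t < duration:
        temp += q_fire * dt / (moles * CV)         # heating raises temperature
        p = moles * R_GAS * temp / V               # ideal gas law
        if p_relief is not None and p > p_relief:
            moles = p_relief * V / (R_GAS * temp)  # vent gas to the set point
            p = p_relief
        p_max = max(p_max, p)
        t += dt
    return p_max

sealed = peak_pressure(5_000.0, 60.0)                     # 5 kW fire, 1 min
vented = peak_pressure(5_000.0, 60.0, p_relief=1.1 * P0)
print(sealed / P0, vented / P0)
```

Even this toy model reproduces the qualitative conclusion: the sealed cabin overshoots the relief set point, while the vented cabin's peak pressure is clamped at it.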

  20. Beware the tail that wags the dog: informal and formal models in biology.

    Science.gov (United States)

    Gunawardena, Jeremy

    2014-11-05

    Informal models have always been used in biology to guide thinking and devise experiments. In recent years, formal mathematical models have also been widely introduced. It is sometimes suggested that formal models are inherently superior to informal ones and that biology should develop along the lines of physics or economics by replacing the latter with the former. Here I suggest to the contrary that progress in biology requires a better integration of the formal with the informal. © 2014 Gunawardena. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  1. Dynamical compensation and structural identifiability of biological models: Analysis, implications, and reconciliation.

    Science.gov (United States)

    Villaverde, Alejandro F; Banga, Julio R

    2017-11-01

The concept of dynamical compensation has been recently introduced to describe the ability of a biological system to keep its output dynamics unchanged in the face of varying parameters. However, the original definition of dynamical compensation amounts to lack of structural identifiability. This is relevant if model parameters need to be estimated, as is often the case in biological modelling. Care should be taken when using an unidentifiable model to extract biological insight: the estimated values of structurally unidentifiable parameters are meaningless, and model predictions about unmeasured state variables can be wrong. Taking this into account, we explore alternative definitions of dynamical compensation that do not necessarily imply structural unidentifiability. Accordingly, we show different ways in which a model can be made identifiable while exhibiting dynamical compensation. Our analyses enable the use of the new concept of dynamical compensation in the context of parameter identification, and reconcile it with the desirable property of structural identifiability.
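
The pitfall warned about here is easy to reproduce in miniature: if only the product of two parameters enters the observed dynamics, distinct parameter pairs are indistinguishable from the output. A sketch with an assumed toy model (not one of the paper's case studies):

```python
import math

def output(a, b, x0=1.0, times=(0.0, 0.5, 1.0, 2.0)):
    """Toy model dx/dt = -a*b*x with observed output x(t). Only the product
    a*b is structurally identifiable: distinct (a, b) pairs with the same
    product produce exactly the same output trajectory."""
    return [x0 * math.exp(-a * b * t) for t in times]

y1 = output(2.0, 3.0)
y2 = output(3.0, 2.0)   # different parameters, same product
print(max(abs(u - v) for u, v in zip(y1, y2)))   # indistinguishable outputs
```

Any estimate of a or b alone from such data is meaningless, which is exactly the situation that arises when dynamical compensation is defined so broadly that it coincides with structural unidentifiability.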

  2. The choice of a biological model in assessing internal dose equivalent

    International Nuclear Information System (INIS)

    Parodo, A.; Erre, N.

    1977-01-01

    Many are the biological models related to kinetic behavior of radioactive materials within the organism, or in an organ. This is true particularly for the metabolic kinetics of bone-seekers radionuclides described differently by various authors: as a consequence, different forms of the retention function have been used in calculating internal dose equivalent. In our opinion, the retention functions expressed as linear combinations of exponential terms with negative exponents are preferable. In fact, they can be obtained by coherent compartmental analysis and allow a mathematical formalism fairly well definite and easily adaptable to computers. Moreover, it is possible to make use of graphs and monograms already published. The role of the biological model in internal dosimetry, referred to the reliability of the quantitative informations on the kinetic behavior of the radionuclides in the organism and, therefrom, to the accuracy of the doses calculated, is discussed. By comparing the results obtained with different biological models, one finds that the choice of a model is less important than the choice of the value of the appropriate parameters

  3. A realistic large-scale model of the cerebellum granular layer predicts circuit spatio-temporal filtering properties

    Directory of Open Access Journals (Sweden)

    Sergio Solinas

    2010-05-01

The way the cerebellar granular layer transforms incoming mossy fiber signals into new spike patterns to be relayed to Purkinje cells is not yet clear. Here, a realistic computational model of the granular layer was developed and used to address four main functional hypotheses: center-surround organization, time-windowing, high-pass filtering in responses to spike bursts and coherent oscillations in response to diffuse random activity. The model network was activated using patterns inspired by those recorded in vivo. Burst stimulation of a small mossy fiber bundle resulted in granule cell bursts delimited in time (time windowing) and space (center-surround) by network inhibition. This burst-burst transmission showed marked frequency-dependence configuring a high-pass filter with cut-off frequency around 100 Hz. The contrast between center and surround properties was regulated by the excitatory-inhibitory balance. The stronger excitation made the center more responsive to 10-50 Hz input frequencies and enhanced the granule cell output (with spikes occurring earlier and at higher frequency and number compared to the surround). Finally, over a certain level of mossy fiber background activity, the circuit generated coherent oscillations in the theta-frequency band. All these processes were fine-tuned by NMDA and GABA-A receptor activation and neurotransmitter vesicle cycling in the cerebellar glomeruli. This model shows that available knowledge on cellular mechanisms is sufficient to unify the main functional hypotheses on the cerebellum granular layer and suggests that this network can behave as an adaptable spatio-temporal filter coordinated by theta-frequency oscillations.

  4. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science, and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    International Nuclear Information System (INIS)

    Ehlert, Kurt; Loewe, Laurence

    2014-01-01

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub, until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method, and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use cases, with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise.
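
    As a rough sketch of the idea (not the authors' Sorting Direct Method implementation), a Gillespie-style direct method can cache the hub count used in hub-dependent propensities and refresh that cache only after the hub has drifted past a relative threshold. The toy species, rate constants, and 5% threshold below are all illustrative:

```python
import random

def lazy_gillespie(t_end=1.0, seed=1):
    """Direct-method SSA on a toy network with a hub species (ATP):
    R0: A + ATP -> B (hub-dependent), R1: B -> A (hub-independent)."""
    random.seed(seed)
    x = {"ATP": 10_000, "A": 100, "B": 0}
    k0, k1 = 1e-4, 1.0          # illustrative rate constants
    atp_cached = x["ATP"]       # hub value actually used in propensities
    threshold = 0.05            # relative hub drift that forces a refresh
    hub_updates = 0

    def prop0():                # hub-dependent propensity (uses cached hub)
        return k0 * x["A"] * atp_cached

    def prop1():                # ordinary propensity
        return k1 * x["B"]

    a0, a1 = prop0(), prop1()
    t = 0.0
    while t < t_end:
        a_total = a0 + a1
        if a_total <= 0.0:
            break
        t += random.expovariate(a_total)
        r = random.uniform(0.0, a_total)
        if r < a0:                      # fire R0: A + ATP -> B
            x["A"] -= 1; x["ATP"] -= 1; x["B"] += 1
        elif x["B"] > 0:                # fire R1: B -> A (guarded)
            x["B"] -= 1; x["A"] += 1
        a1 = prop1()                    # non-hub dependency: refresh now
        # Lazy Updating: refresh the cached hub count only on large drift
        if abs(x["ATP"] - atp_cached) > threshold * atp_cached:
            atp_cached = x["ATP"]
            hub_updates += 1
        a0 = prop0()    # A changed, so recompute (still with cached ATP)
    return x, hub_updates

x, hub_updates = lazy_gillespie()
```

    In a real SSA with many hub-dependent reactions, skipping those propensity refreshes until the threshold is crossed is where the speedup comes from; here the hub barely moves, so the cache is essentially never refreshed.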

  6. Protocol for an HTA report: Does therapeutic writing help people with long-term conditions? Systematic review, realist synthesis and economic modelling.

    Science.gov (United States)

    Meads, C; Nyssen, O P; Wong, G; Steed, L; Bourke, L; Ross, C A; Hayman, S; Field, V; Lord, J; Greenhalgh, T; Taylor, S J C

    2014-02-18

    Long-term medical conditions (LTCs) cause reduced health-related quality of life and considerable health service expenditure. Writing therapy has potential to improve physical and mental health in people with LTCs, but its effectiveness is not established. This project aims to establish the clinical and cost-effectiveness of therapeutic writing in LTCs by systematic review and economic evaluation, and to evaluate context and mechanisms by which it might work, through realist synthesis. Included are any comparative study of therapeutic writing compared with no writing, waiting list, attention control or placebo writing in patients with any diagnosed LTCs that report at least one of the following: relevant clinical outcomes; quality of life; health service use; psychological, behavioural or social functioning; adherence or adverse events. Searches will be conducted in the main medical databases including MEDLINE, EMBASE, PsycINFO, The Cochrane Library and Science Citation Index. For the realist review, further purposive and iterative searches through snowballing techniques will be undertaken. Inclusions, data extraction and quality assessment will be in duplicate with disagreements resolved through discussion. Quality assessment will include using Grading of Recommendations Assessment, Development and Evaluation (GRADE) criteria. Data synthesis will be narrative and tabular with meta-analysis where appropriate. De novo economic modelling will be attempted in one clinical area if sufficient evidence is available and performed according to the National Institute for Health and Care Excellence (NICE) reference case.

  7. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model.

    Science.gov (United States)

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit

    2017-02-01

    To develop an in-house software program able to calculate and generate the biological dose distribution and biological dose-volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose-volume histograms. The physical dose from each voxel in treatment planning was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by comparing the dose-volume histograms from CERR and from the treatment planning system. An equivalent dose in 2 Gy fractions (EQD2) was calculated using the biologically effective dose (BED) based on the LQL model. The software calculation and the manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose-volume histograms were displayed correctly by the Isobio software. Different physical doses were found between CERR and the treatment planning system (TPS) in Oncentra, with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant (0.00%, with p-values 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively). The Isobio software is a feasible tool to generate the biological dose distribution and biological dose-volume histogram for treatment plan evaluation in both EBRT and BT.
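
    For reference, the standard LQ-based EQD2 conversion (to which the LQL model reduces at moderate doses per fraction) is a one-liner. This sketch is not the Isobio implementation, and the alpha/beta values in the examples are illustrative:

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """EQD2 = D * (d + a/b) / (2 Gy + a/b), standard LQ model.

    D: total physical dose (Gy); d: dose per fraction (Gy);
    alpha_beta: tissue alpha/beta ratio (Gy). The LQL model used by
    Isobio adds a linear high-dose correction not included here.
    """
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

# 60 Gy delivered in 2 Gy fractions is already "in 2 Gy fractions":
print(eqd2(60.0, 2.0, 10.0))   # -> 60.0
# a hypofractionated 50 Gy in 5 Gy fractions, alpha/beta = 3 Gy:
print(eqd2(50.0, 5.0, 3.0))    # -> 80.0
```

    Applying this voxel-by-voxel to an extracted physical dose grid is exactly the kind of conversion the paper describes, before re-binning into a biological dose-volume histogram.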

  8. Photo-Realistic Image Synthesis and Virtual Cinematography

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Synthesis is a new field of research that has received increasing attention in recent years. It is closely related to the growing popularity of virtual reality and the spread of its applications, among them virtual photography and cinematography. The use of computer-generated … characters, "virtual actors", in motion picture production increases every day. While the best-known computer graphics techniques have largely been adopted successfully in today's fiction films, it still remains very challenging to implement virtual actors that visually resemble human beings. … Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state-of-the-art research approaches in their productions. An innovative concept is then gaining consensus …

  9. Generating Geospatially Realistic Driving Patterns Derived From Clustering Analysis Of Real EV Driving Data

    DEFF Research Database (Denmark)

    Pedersen, Anders Bro; Aabrandt, Andreas; Østergaard, Jacob

    2014-01-01

    In order to provide a vehicle fleet that realistically represents the predicted Electric Vehicle (EV) penetration for the future, a model is required that mimics people's driving behaviour rather than simply playing back collected data. When the focus is broadened from a traditional user … scales, which calls for a statistically correct, yet flexible model. This paper describes a method for modelling EVs, based on non-categorized data, which takes into account the plug-in locations of the vehicles. By using clustering analysis to extrapolate and classify the primary locations where …
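
    A minimal stand-in for the clustering step is plain k-means on 2-D plug-in coordinates. The data below are synthetic (two hypothetical "primary locations", home and work), not real EV traces, and the initialization is deliberately naive:

```python
import math, random

def kmeans(points, k, iters=50):
    """Plain k-means on 2-D points (e.g. EV plug-in coordinates)."""
    # naive deterministic init: k points spread across the input order
    centers = [points[i * (len(points) - 1) // max(k - 1, 1)] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                       # assign to nearest center
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[j].append(p)
        centers = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]   # recompute means
    return centers, clusters

# synthetic "primary locations": home near (0, 0), work near (10, 10)
rng = random.Random(1)
pts = ([(rng.gauss(0, 0.5), rng.gauss(0, 0.5)) for _ in range(50)] +
       [(rng.gauss(10, 0.5), rng.gauss(10, 0.5)) for _ in range(50)])
centers, clusters = kmeans(pts, 2)
```

    The recovered cluster centers then play the role of the model's primary plug-in locations, from which synthetic but geospatially plausible driving patterns can be generated.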

  10. Boolean Models of Biological Processes Explain Cascade-Like Behavior.

    Science.gov (United States)

    Chen, Hao; Wang, Guanyu; Simha, Rahul; Du, Chenghang; Zeng, Chen

    2016-01-29

    Biological networks play a key role in determining biological function, and therefore an understanding of their structure and dynamics is of central interest in systems biology. In Boolean models of such networks, the status of each molecule is either "on" or "off", and as the molecules interact with each other, their individual status changes from "on" to "off" or vice versa, so that the system of molecules in the network collectively goes through a sequence of changes in state. This sequence of changes is termed a biological process. In this paper, we examine the common perception that events in biomolecular networks occur sequentially, in a cascade-like manner, and ask whether this is likely to be an inherent property. In further investigations of the budding and fission yeast cell-cycle, we identify two generic dynamical rules. A Boolean system that complies with these rules will automatically have a certain robustness. By considering the biological requirements of robustness and designability, we show that those Boolean dynamical systems, compared to an arbitrary dynamical system, statistically present the characteristics of cascadeness and sequentiality, as observed in the budding and fission yeast cell-cycle. These results suggest that cascade-like behavior might be an intrinsic property of biological processes.
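
    The Boolean setup can be made concrete with a toy network (a hypothetical three-node chain, not the yeast networks studied in the paper). Under synchronous updating, a transient "on" signal travels one node per step, i.e. a cascade:

```python
# Hypothetical Boolean chain A -> B -> C with synchronous updating:
# A decays (transient signal), B copies A's previous state, C copies B's.

def step(state):
    a, b, c = state
    return (False, a, b)

trajectory = [(True, False, False)]     # A starts "on", B and C "off"
for _ in range(4):
    trajectory.append(step(trajectory[-1]))
```

    Printing `trajectory` shows (1,0,0) → (0,1,0) → (0,0,1) → (0,0,0): the single event propagates sequentially through the network, the qualitative cascade-like behavior the paper argues is statistically favored.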

  11. Modern Perspectives on Numerical Modeling of Cardiac Pacemaker Cell

    Science.gov (United States)

    Maltsev, Victor A.; Yaniv, Yael; Maltsev, Anna V.; Stern, Michael D.; Lakatta, Edward G.

    2015-01-01

    Cardiac pacemaking is a complex phenomenon that is still not completely understood. Together with experimental studies, numerical modeling has been traditionally used to acquire mechanistic insights in this research area. This review summarizes the present state of numerical modeling of the cardiac pacemaker, including approaches to resolve present paradoxes and controversies. Specifically we discuss the requirement for realistic modeling to consider symmetrical importance of both intracellular and cell membrane processes (within a recent “coupled-clock” theory). Promising future developments of the complex pacemaker system models include the introduction of local calcium control, mitochondria function, and biochemical regulation of protein phosphorylation and cAMP production. Modern numerical and theoretical methods such as multi-parameter sensitivity analyses within extended populations of models and bifurcation analyses are also important for the definition of the most realistic parameters that describe a robust, yet simultaneously flexible operation of the coupled-clock pacemaker cell system. The systems approach to exploring cardiac pacemaker function will guide development of new therapies, such as biological pacemakers for treating insufficient cardiac pacemaker function that becomes especially prevalent with advancing age. PMID:24748434

  12. Spiral-wave dynamics in ionically realistic mathematical models for human ventricular tissue: the effects of periodic deformation.

    Science.gov (United States)

    Nayak, Alok R; Pandit, Rahul

    2014-01-01

    We carry out an extensive numerical study of the dynamics of spiral waves of electrical activation, in the presence of periodic deformation (PD) in two-dimensional simulation domains, in the biophysically realistic mathematical models of human ventricular tissue due to (a) ten-Tusscher and Panfilov (the TP06 model) and (b) ten-Tusscher, Noble, Noble, and Panfilov (the TNNP04 model). We first consider simulations in cable-type domains, in which we calculate the conduction velocity θ and the wavelength λ of a plane wave; we show that PD leads to a periodic, spatial modulation of θ and a temporally periodic modulation of λ; both these modulations depend on the amplitude and frequency of the PD. We then examine three types of initial conditions for both TP06 and TNNP04 models and show that the imposition of PD leads to a rich variety of spatiotemporal patterns in the transmembrane potential including states with a single rotating spiral (RS) wave, a spiral-turbulence (ST) state with a single meandering spiral, an ST state with multiple broken spirals, and a state SA in which all spirals are absorbed at the boundaries of our simulation domain. We find, for both TP06 and TNNP04 models, that spiral-wave dynamics depends sensitively on the amplitude and frequency of PD and the initial condition. We examine how these different types of spiral-wave states can be eliminated in the presence of PD by the application of low-amplitude pulses by square- and rectangular-mesh suppression techniques. We suggest specific experiments that can test the results of our simulations.
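
    The TP06 and TNNP04 ionic models are far too detailed for a short example, but the paper's cable-type measurement of conduction velocity can be sketched with the much simpler FitzHugh-Nagumo kinetics (a deliberate substitute) on a 1-D cable. All parameters are illustrative:

```python
def simulate_cable(n=160, dx=0.5, dt=0.02, t_steps=10000,
                   D=1.0, a=0.1, eps=0.01, thresh=0.5):
    """Forward-Euler FitzHugh-Nagumo cable; returns the conduction
    velocity of the wave launched from the left end (arbitrary units)."""
    u = [0.0] * n                # excitation variable
    w = [0.0] * n                # recovery variable
    for i in range(5):           # stimulate the left end of the cable
        u[i] = 1.0
    arrival = {}                 # cell index -> first threshold crossing
    for step in range(t_steps):
        t = step * dt
        lap = []
        for i in range(n):       # discrete Laplacian, no-flux boundaries
            left = u[i - 1] if i > 0 else u[i]
            right = u[i + 1] if i < n - 1 else u[i]
            lap.append((left - 2.0 * u[i] + right) / (dx * dx))
        for i in range(n):
            du = u[i] * (1.0 - u[i]) * (u[i] - a) - w[i] + D * lap[i]
            dw = eps * (u[i] - 0.5 * w[i])
            u[i] += du * dt
            w[i] += dw * dt
            if u[i] > thresh and i not in arrival:
                arrival[i] = t
    i1, i2 = 40, 120             # two interior probe cells
    return (i2 - i1) * dx / (arrival[i2] - arrival[i1])

cv = simulate_cable()
```

    Timing the wavefront between two interior probes, as done here, mirrors how the conduction velocity is extracted in the paper's cable-type domains; with periodic deformation the cell spacing dx would itself become a function of time.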

  13. Spiral-Wave Dynamics in Ionically Realistic Mathematical Models for Human Ventricular Tissue: The Effects of Periodic Deformation

    Directory of Open Access Journals (Sweden)

    Alok Ranjan Nayak

    2014-06-01

    We carry out an extensive numerical study of the dynamics of spiral waves of electrical activation, in the presence of periodic deformation (PD) in two-dimensional simulation domains, in the biophysically realistic mathematical models of human ventricular tissue due to (a) ten-Tusscher and Panfilov (the TP06 model) and (b) ten-Tusscher, Noble, Noble, and Panfilov (the TNNP04 model). We first consider simulations in cable-type domains, in which we calculate the conduction velocity $CV$ and the wavelength $\lambda$ of a plane wave; we show that PD leads to a periodic, spatial modulation of $CV$ and a temporally periodic modulation of $\lambda$; both these modulations depend on the amplitude and frequency of the PD. We then examine three types of initial conditions for both TP06 and TNNP04 models and show that the imposition of PD leads to a rich variety of spatiotemporal patterns in the transmembrane potential, including states with a single rotating spiral (RS) wave, a spiral-turbulence (ST) state with a single meandering spiral, an ST state with multiple broken spirals, and a state SA in which all spirals are absorbed at the boundaries of our simulation domain. We find, for both TP06 and TNNP04 models, that spiral-wave dynamics depends sensitively on the amplitude and frequency of PD and the initial condition. We examine how these different types of spiral-wave states can be eliminated in the presence of PD by the application of low-amplitude pulses on square and rectangular control meshes. We suggest specific experiments that can test the results of our simulations.

  14. Metastable cosmic strings in realistic models

    International Nuclear Information System (INIS)

    Holman, R.

    1992-01-01

    The stability of the electroweak Z-string is investigated at high temperatures. The results show that, while finite-temperature corrections can improve the stability of the Z-string, their effect is not strong enough to stabilize the Z-string in the standard electroweak model. Consequently, the Z-string will be unstable even under the conditions present during the electroweak phase transition. Phenomenologically viable models based on the gauge group SU(2)_L x SU(2)_R x U(1)_(B-L) are then considered, and it is shown that metastable strings exist and are stable to small perturbations for a large region of the parameter space of these models. It is also shown that these strings are superconducting with bosonic charge carriers. The string superconductivity may be able to stabilize segments and loops against dynamical contraction. Possible implications of these strings for cosmology are discussed.

  15. I-Love relations for incompressible stars and realistic stars

    Science.gov (United States)

    Chan, T. K.; Chan, AtMa P. O.; Leung, P. T.

    2015-02-01

    In spite of the diversity in the equations of state of nuclear matter, the recently discovered I-Love-Q relations [Yagi and Yunes, Science 341, 365 (2013), 10.1126/science.1236462], which relate the moment of inertia, tidal Love number (deformability), and the spin-induced quadrupole moment of compact stars, hold for various kinds of realistic neutron stars and quark stars. While the physical origin of such universality is still a current issue, the observation that the I-Love-Q relations of incompressible stars can well approximate those of realistic compact stars hints at a new direction to approach the problem. In this paper, by establishing recursive post-Minkowskian expansion for the moment of inertia and the tidal deformability of incompressible stars, we analytically derive the I-Love relation for incompressible stars and show that the so-obtained formula can be used to accurately predict the behavior of realistic compact stars from the Newtonian limit to the maximum mass limit.

  16. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of possibilities offered by our project we apply the multi-parameter sensitivity analysis. To visualize the results of model analysis a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as accessibility and usability of remote services.
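
    To give a flavor of the deterministic (ODE) semantics such a workflow dispatches to a solver, here is a self-contained fixed-step RK4 integration of a toy enzyme-kinetics model. The species and rate constants are illustrative; Tav4SB itself delegates this job to SBML ODE Solver:

```python
# Toy kinetic model E + S <-> ES -> E + P integrated with fixed-step RK4.

def rhs(y, k_on=1.0, k_off=0.1, k_cat=0.5):
    e, s, es, p = y
    v_bind = k_on * e * s - k_off * es    # net binding flux
    v_cat = k_cat * es                    # catalytic flux
    return (-v_bind + v_cat,              # dE/dt
            -v_bind,                      # dS/dt
            v_bind - v_cat,               # dES/dt
            v_cat)                        # dP/dt

def rk4(y, dt, n_steps):
    for _ in range(n_steps):
        k1 = rhs(y)
        k2 = rhs(tuple(yi + 0.5 * dt * ki for yi, ki in zip(y, k1)))
        k3 = rhs(tuple(yi + 0.5 * dt * ki for yi, ki in zip(y, k2)))
        k4 = rhs(tuple(yi + dt * ki for yi, ki in zip(y, k3)))
        y = tuple(yi + dt / 6.0 * (a + 2 * b + 2 * c + d)
                  for yi, a, b, c, d in zip(y, k1, k2, k3, k4))
    return y

# E(0)=1, S(0)=10, ES(0)=0, P(0)=0, integrated to t = 50
y_end = rk4((1.0, 10.0, 0.0, 0.0), dt=0.01, n_steps=5000)
```

    The two conservation laws (total enzyme E+ES and total substrate S+ES+P) are preserved by the integrator, which is a quick sanity check of the kind a multi-parameter sensitivity workflow would also rely on.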

  17. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  18. Modelling intelligent behavior

    Science.gov (United States)

    Green, H. S.; Triffet, T.

    1993-01-01

    An introductory discussion of the related concepts of intelligence and consciousness suggests criteria to be met in the modeling of intelligence and the development of intelligent materials. Methods for the modeling of actual structure and activity of the animal cortex have been found, based on present knowledge of the ionic and cellular constitution of the nervous system. These have led to the development of a realistic neural network model, which has been used to study the formation of memory and the process of learning. An account is given of experiments with simple materials which exhibit almost all properties of biological synapses and suggest the possibility of a new type of computer architecture to implement an advanced type of artificial intelligence.

  19. A model of heavy ion detection in physical and biological systems

    International Nuclear Information System (INIS)

    Waligorski, M.P.R.

    1988-01-01

    Track structure theory (the Katz model) and its application to the detection of heavy ions in physical and biological systems are reviewed. Following the use of a new corrected formula describing the radial distribution of average dose around the path of a heavy ion, based on results of Monte Carlo calculations and of experimental measurements, better agreement is achieved between model calculations and experimentally measured relative effectiveness, for enzymatic and viral systems, for the Fricke dosemeter, and for alanine and thermoluminescent (TLD-700) dosemeters irradiated with beams of heavy charged particles. From experimentally measured RBE dependences for survival and frequency of neoplastic transformations in a mammalian cell culture irradiated with beams of energetic heavy ions, values of model parameters for these biological endpoints have been extracted, and a model extrapolation to the low-dose region performed. Results of model calculations are then compared with evaluations of the lung cancer hazard in populations exposed to radon and its progeny. The model can be applied to practical phenomenological analysis of radiation damage in solid-state systems and to dosimetry of charged-particle and fast-neutron beams using a variety of detectors. The model can also serve as a guide in building more basic models of the action of ionizing radiation on physical and biological systems, and can guide the development of radiation-risk models more relevant than those used presently. 185 refs., 31 figs., 3 tabs. (author)

  20. Hyper-realistic face masks: a new challenge in person identification.

    Science.gov (United States)

    Sanders, Jet Gabrielle; Ueda, Yoshiyuki; Minemoto, Kazusa; Noyes, Eilidh; Yoshikawa, Sakiko; Jenkins, Rob

    2017-01-01

    We often identify people using face images. This is true in occupational settings such as passport control as well as in everyday social environments. Mapping between images and identities assumes that facial appearance is stable within certain bounds. For example, a person's apparent age, gender and ethnicity change slowly, if at all. It also assumes that deliberate changes beyond these bounds (i.e., disguises) would be easy to spot. Hyper-realistic face masks overturn these assumptions by allowing the wearer to look like an entirely different person. If unnoticed, these masks break the link between facial appearance and personal identity, with clear implications for applied face recognition. However, to date, no one has assessed the realism of these masks, or specified conditions under which they may be accepted as real faces. Herein, we examined incidental detection of unexpected but attended hyper-realistic masks in both photographic and live presentations. Experiment 1 (UK; n = 60) revealed no evidence for overt detection of hyper-realistic masks among real face photos, and little evidence of covert detection. Experiment 2 (Japan; n = 60) extended these findings to different masks, mask-wearers and participant pools. In Experiment 3 (UK and Japan; n = 407), passers-by failed to notice that a live confederate was wearing a hyper-realistic mask and showed limited evidence of covert detection, even at close viewing distance (5 vs. 20 m). Across all of these studies, viewers accepted hyper-realistic masks as real faces. Specific countermeasures will be required if detection rates are to be improved.

  1. Entropy stable modeling of non-isothermal multi-component diffuse-interface two-phase flows with realistic equations of state

    KAUST Repository

    Kou, Jisheng

    2018-02-25

    In this paper, we consider mathematical modeling and numerical simulation of non-isothermal compressible multi-component diffuse-interface two-phase flows with realistic equations of state. A general model with general reference velocity is derived rigorously through thermodynamical laws and Onsager's reciprocal principle, and it is capable of characterizing compressibility and partial miscibility between multiple fluids. We prove a novel relation among the pressure, temperature and chemical potentials, which results in a new formulation of the momentum conservation equation indicating that the gradients of chemical potentials and temperature become the primary driving force of the fluid motion except for the external forces. A key challenge in numerical simulation is to develop entropy stable numerical schemes preserving the laws of thermodynamics. Based on the convex-concave splitting of Helmholtz free energy density with respect to molar densities and temperature, we propose an entropy stable numerical method, which solves the total energy balance equation directly, and thus, naturally satisfies the first law of thermodynamics. Unconditional entropy stability (the second law of thermodynamics) of the proposed method is proved by estimating the variations of Helmholtz free energy and kinetic energy with time steps. Numerical results validate the proposed method.

  2. Entropy stable modeling of non-isothermal multi-component diffuse-interface two-phase flows with realistic equations of state

    KAUST Repository

    Kou, Jisheng; Sun, Shuyu

    2018-01-01

    In this paper, we consider mathematical modeling and numerical simulation of non-isothermal compressible multi-component diffuse-interface two-phase flows with realistic equations of state. A general model with general reference velocity is derived rigorously through thermodynamical laws and Onsager's reciprocal principle, and it is capable of characterizing compressibility and partial miscibility between multiple fluids. We prove a novel relation among the pressure, temperature and chemical potentials, which results in a new formulation of the momentum conservation equation indicating that the gradients of chemical potentials and temperature become the primary driving force of the fluid motion except for the external forces. A key challenge in numerical simulation is to develop entropy stable numerical schemes preserving the laws of thermodynamics. Based on the convex-concave splitting of Helmholtz free energy density with respect to molar densities and temperature, we propose an entropy stable numerical method, which solves the total energy balance equation directly, and thus, naturally satisfies the first law of thermodynamics. Unconditional entropy stability (the second law of thermodynamics) of the proposed method is proved by estimating the variations of Helmholtz free energy and kinetic energy with time steps. Numerical results validate the proposed method.
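
    The two-phase flow model itself is far beyond a short example, but the stabilization idea, a convex-concave splitting of the free energy with the convex part treated implicitly and the concave part explicitly, can be illustrated on a scalar double-well gradient flow (an Eyre-type toy problem, not the paper's scheme). The energy then decreases at every step regardless of the time step size:

```python
# Gradient flow u' = -(u^3 - u) for F(u) = u^4/4 - u^2/2, split as
# convex (u^4/4, implicit) + concave (-u^2/2, explicit):
#     (u_new - u)/dt = -(u_new^3 - u)
# The implicit cubic is solved by Newton iteration each step.

def F(u):
    return 0.25 * u**4 - 0.5 * u**2

def step(u, dt, newton_iters=30):
    rhs = u + dt * u                 # explicit (concave) contribution
    x = u
    for _ in range(newton_iters):    # solve x + dt*x**3 = rhs
        f = x + dt * x**3 - rhs
        fp = 1.0 + 3.0 * dt * x**2
        x -= f / fp
    return x

u, dt = 0.1, 10.0                    # deliberately huge time step
energies = [F(u)]
for _ in range(20):
    u = step(u, dt)
    energies.append(F(u))
```

    Even with dt = 10, the discrete energy sequence is non-increasing and u relaxes into the well at u = 1; a fully explicit step of that size would blow up. This unconditional-dissipation property is the scalar analogue of the unconditional entropy stability proved in the paper.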

  3. Average intensity and spreading of partially coherent model beams propagating in a turbulent biological tissue

    International Nuclear Information System (INIS)

    Wu, Yuqian; Zhang, Yixin; Wang, Qiu; Hu, Zhengda

    2016-01-01

    For Gaussian beams with three different partially coherent models, namely Gaussian-Schell model (GSM), Laguerre-Gaussian Schell-model (LGSM) and Bessel-Gaussian Schell-model (BGSM) beams propagating through a turbulent biological tissue, the expression for the spatial coherence radius of a spherical wave propagating in a turbulent biological tissue, and the average intensity and beam spreading for GSM, LGSM and BGSM beams, are derived based on the fractal model of the power spectrum of refractive-index variations in biological tissue. Effects of the partially coherent model and of the parameters of biological turbulence on such beams are studied in numerical simulations. Our results reveal that the spreading of GSM beams is smaller than that of LGSM and BGSM beams under the same conditions, and that a beam with larger source coherence width has smaller beam spreading than one with smaller coherence width. The results are useful for applications involving light-beam propagation through tissues, especially cases where the average intensity and spreading properties of the light must be taken into account to evaluate system performance, and for investigations of the structure of biological tissue. - Highlights: • The spatial coherence radius of a spherical wave propagating in a turbulent biological tissue is developed. • Expressions for the average intensity and beam spreading of GSM, LGSM and BGSM beams in a turbulent biological tissue are derived. • The contrast between the three partially coherent model beams is shown in numerical simulations. • The results are useful for applications involving light-beam propagation through tissues.

  4. Application of the 5E Learning Cycle Model to understanding of general biology concepts and science-application skills of biology education students

    Directory of Open Access Journals (Sweden)

    Tuti Kurniati

    2015-02-01

    ABSTRACT The paradigm shift in education and teaching has moved university instruction toward constructivism, that is, student-centered instruction. The 5E Learning Cycle is a constructivist instructional model in which students build their own knowledge. This study aimed to determine the differences in concept understanding and science-application skills between a class taught with the 5E Learning Cycle Model and a control class in a General Biology course. The research method was a quasi-experiment with a quantitative approach using a paired-sample t-test. The results show a significant difference in concept understanding and science-application skills between the group taught with the 5E Learning Cycle Model and the control class: mean post-test scores for concept understanding and science-application skills were higher in the 5E Learning Cycle class than in the control class. Keywords: general biology, science-application skills, 5E Learning Cycle Model, concept understanding

  5. Putting a Realistic Theory of Mind into Agency Theory

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Stea, Diego

    2014-01-01

    Agency theory is one of the most important foundational theories in management research, but it rests on contestable cognitive assumptions. Specifically, the principal is assumed to hold a perfect (correct) theory regarding some of the content of the agent's mind, while he is entirely ignorant … concerning other such content. More realistically, individuals have some limited access to the minds of others. We explore the implications for classical agency theory of realistic assumptions regarding the human potential for interpersonal sensemaking. We discuss implications for the design and management …

  6. Realistic searches on stretched exponential networks

    Indian Academy of Sciences (India)

    We consider navigation or search schemes on networks which have a stretched exponential degree distribution. In addition, the linking probability is taken to be dependent on social distances and is governed by a tunable parameter. The searches are realistic in the sense that not all search chains can be completed.

  7. Biological parameters for lung cancer in mathematical models of carcinogenesis

    International Nuclear Information System (INIS)

    Jacob, P.; Jacob, V.

    2003-01-01

    Applications of the two-step model of carcinogenesis with clonal expansion (TSCE) to lung cancer data are reviewed, including those on atomic bomb survivors from Hiroshima and Nagasaki, British doctors, Colorado Plateau miners, and Chinese tin miners. Different sets of identifiable model parameters are used in the literature. The parameter set which could be determined with the lowest uncertainty consists of the net proliferation rate gamma of intermediate cells, the hazard h55 at an intermediate age, and the hazard H∞ at an asymptotically large age. Also, the values of these three parameters obtained in the various studies are more consistent than other identifiable combinations of the biological parameters. Based on representative results for these three parameters, implications for the biological parameters in the TSCE model are derived. (author)

  8. A model for the biological precipitation of Precambrian iron-formation

    Science.gov (United States)

    Laberge, G. L.

    1986-01-01

    A biological model for the precipitation of Precambrian iron formations is presented. Assuming an oxygen deficient atmosphere and water column to allow sufficient Fe solubility, it is proposed that local oxidizing environments, produced biologically, led to precipitation of iron formations. It is further suggested that spheroidal structures about 30 μm in diameter, which are widespread in low grade cherty iron formations, are relict forms of the organic-walled microfossil Eosphaera tylerii. The presence of these structures suggests that the organism may have had a siliceous test, which allowed sufficient rigidity for accumulation and preservation. The model involves precipitation of ferric hydrates by oxidation of iron in the photic zone by a variety of photosynthetic organisms. Silica may have formed in the frustules of silica secreting organisms, including Eosphaera tylerii. Iron formations formed, therefore, by a sediment rain of biologically produced ferric hydrates, silica, and other organic material. Siderite and hematite formed diagenetically on basin floors, and subsequent metamorphism produced magnetite and iron silicates.

  9. Results of recent calculations using realistic potentials

    International Nuclear Information System (INIS)

    Friar, J.L.

    1987-01-01

    Results of recent calculations for the triton using realistic potentials with strong tensor forces are reviewed, with an emphasis on progress made using the many different calculational schemes. Several test problems are suggested. 49 refs., 5 figs

  10. The University – a Rational-Biologic Model

    Directory of Open Access Journals (Sweden)

    Ion Gh. Rosca

    2008-05-01

    The article advances an extension of the rational-biologic model to organizations, which reproduce and live in a turbulent environment. The current "tree"-type organizations are not able to satisfy the requirements of the socio-economic environment and are not able to ensure organizational perpetuation and development. Thus, an innovative performing model for both the top and lower management areas is presented, with the following recommendations: dividing the organization into departments using neuronal-like connections, focusing on the formative processes rather than on the activities, and rethinking the system around a new organizational culture.

  11. On the Modelling of Biological Patterns with Mechanochemical Models: Insights from Analysis and Computation

    KAUST Repository

    Moreo, P.

    2009-11-14

    The diversity of biological form is generated by a relatively small number of underlying mechanisms. Consequently, mathematical and computational modelling can, and does, provide insight into how cellular level interactions ultimately give rise to higher level structure. Given that cells respond to mechanical stimuli, it is important to consider the effects of these responses within biological self-organisation models. Here, we consider the self-organisation properties of a mechanochemical model previously developed by three of the authors in Acta Biomater. 4, 613-621 (2008), which is capable of reproducing the behaviour of a population of cells cultured on an elastic substrate in response to a variety of stimuli. In particular, we examine the conditions under which stable spatial patterns can emerge with this model, focusing on the influence of mechanical stimuli and the interplay of non-local phenomena. To this end, we have performed a linear stability analysis and numerical simulations based on a mixed finite element formulation, which have allowed us to study the dynamical behaviour of the system in terms of the qualitative shape of the dispersion relation. We show that the consideration of mechanotaxis, namely changes in migration speeds and directions in response to mechanical stimuli, alters the conditions for pattern formation in a singular manner. Furthermore, without non-local effects, responses to mechanical stimuli are observed to result in dispersion relations with positive growth rates at arbitrarily large wavenumbers, in turn yielding heterogeneity at the cellular level in model predictions. This highlights the sensitivity and necessity of non-local effects in mechanically influenced biological pattern formation models and the ultimate failure of the continuum approximation in their absence. © 2009 Society for Mathematical Biology.
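    The linear stability analysis described above can be illustrated on a generic two-species reaction-diffusion system (not the authors' mechanochemical model; the Jacobian and diffusion values below are invented for illustration). A perturbation exp(λt + ikx) of the homogeneous steady state grows when the dispersion relation, the largest eigenvalue λ(k) of J − k²D, has a positive real part:

```python
import cmath

# Hypothetical linearized system: Jacobian J (stable without diffusion,
# i.e. tr < 0 and det > 0) and diffusion matrix D = diag(1, d) with d >> 1.
J = [[0.9, -1.0], [1.1, -1.0]]
d = 10.0

def growth_rate(k):
    """Largest Re(lambda) of J - k^2 D for a perturbation exp(lambda*t + i*k*x)."""
    a = J[0][0] - k * k          # entries of J - k^2 D
    b = J[0][1]
    c = J[1][0]
    e = J[1][1] - d * k * k
    tr, det = a + e, a * e - b * c
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    return max(((tr + disc) / 2).real, ((tr - disc) / 2).real)

print(growth_rate(0.0) < 0)   # homogeneous state stable without spatial structure
print(growth_rate(0.63) > 0)  # a finite band of wavenumbers grows (Turing instability)
print(growth_rate(5.0) < 0)   # diffusion stabilizes very large wavenumbers
```

The pathological case flagged in the abstract corresponds to a dispersion relation that stays positive as k grows without bound, rather than closing into a finite band as it does here.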

  12. Development of a kinetic model for biological sulphate reduction ...

    African Journals Online (AJOL)

    A two-phase (aqueous/gas) physical, biological and chemical processes ... Additionally, the background weak acid/base chemistry for water, carbonate, ... in the UCTADM1 model, and hence the physical gas exchange for sulphide is included.

  13. Virtual Reconstruction and Three-Dimensional Printing of Blood Cells as a Tool in Cell Biology Education.

    Science.gov (United States)

    Augusto, Ingrid; Monteiro, Douglas; Girard-Dias, Wendell; Dos Santos, Thaisa Oliveira; Rosa Belmonte, Simone Letícia; Pinto de Oliveira, Jairo; Mauad, Helder; da Silva Pacheco, Marcos; Lenz, Dominik; Stefanon Bittencourt, Athelson; Valentim Nogueira, Breno; Lopes Dos Santos, Jorge Roberto; Miranda, Kildare; Guimarães, Marco Cesar Cunegundes

    2016-01-01

    The cell biology discipline constitutes a highly dynamic field whose concepts take a long time to be incorporated into the educational system, especially in developing countries. Amongst the main obstacles to the introduction of new cell biology concepts to students is their general lack of identification with most teaching methods. The introduction of elaborated figures, movies and animations to textbooks has made a tremendous contribution to the learning process, and the search for novel teaching methods has been a central goal in cell biology education. Some specialized tools, however, are usually only available in advanced research centers or in institutions that are traditionally involved with the development of novel teaching/learning processes, and are far from becoming reality in the majority of life sciences schools. When combined with the known declining interest in science among young people, a critical scenario may result. This is especially important in the field of electron microscopy and associated techniques, methods that have greatly contributed to the current knowledge on the structure and function of different cell biology models but are rarely made accessible to most students. In this work, we propose a strategy to increase the engagement of students with the world of cell and structural biology by combining 3D electron microscopy techniques and 3D prototyping technology (3D printing) to generate 3D physical models that accurately and realistically reproduce a close-to-native structure of the cell and serve as a tool for students and teachers outside the main centers. We introduce three strategies for 3D imaging, modeling and prototyping of cells and propose the establishment of a virtual platform where different digital models can be deposited by EM groups and subsequently downloaded and printed in different schools, universities, research centers and museums, thereby modernizing teaching of cell biology and increasing the accessibility to

  14. Caenorhabditis elegans, a Biological Model for Research in Toxicology.

    Science.gov (United States)

    Tejeda-Benitez, Lesly; Olivero-Verbel, Jesus

    2016-01-01

    Caenorhabditis elegans is a nematode of microscopic size which, due to its biological characteristics, has been used since the 1970s as a model for research in molecular biology, medicine, pharmacology, and toxicology. It was the first animal whose genome was completely sequenced and has played a key role in the understanding of apoptosis and RNA interference. The transparency of its body, short lifespan, ability to self-fertilize and ease of culture are advantages that make it ideal as a model in toxicology. Due to the fact that some of its biochemical pathways are similar to those of humans, it has been employed in research in several fields. C. elegans' use as a biological model in environmental toxicological assessments allows the determination of multiple endpoints. Some of these utilize the effects on the biological functions of the nematode and others use molecular markers. Endpoints such as lethality, growth, reproduction, and locomotion are the most studied, and usually employ the wild type Bristol N2 strain. Other endpoints use reporter genes, such as green fluorescence protein, driven by regulatory sequences from other genes related to different mechanisms of toxicity, such as heat shock, oxidative stress, CYP system, and metallothioneins among others, allowing the study of gene expression in a manner both rapid and easy. These transgenic strains of C. elegans represent a powerful tool to assess toxicity pathways for mixtures and environmental samples, and their numbers are growing in diversity and selectivity. However, other molecular biology techniques, including DNA microarrays and MicroRNAs, have been explored to assess the effects of different toxicants and samples. C. elegans has allowed the assessment of neurotoxic effects for heavy metals and pesticides, among those more frequently studied, as the nematode has a very well-defined nervous system. More recently, nanoparticles are emergent pollutants whose toxicity can be explored using this nematode.

  15. A PROBLEM-BASED LEARNING MODEL IN BIOLOGY EDUCATION COURSES TO DEVELOP INQUIRY TEACHING COMPETENCY OF PRESERVICE TEACHERS

    Directory of Open Access Journals (Sweden)

    Diah Aryulina

    2016-02-01

    Abstract: The aims of this initial stage of a development study were: (1) to develop a problem-based learning (PBL) model for biology education courses, and (2) to obtain expert judgments on the appropriateness of the PBL model. The PBL model was developed using an instructional systems design approach, based on an analysis of the competency needs of biology teachers and a literature review of the characteristics and processes of problem-based learning. The model was evaluated by two biology education experts, and the evaluation data were analyzed descriptively. The structure of the PBL model developed for the Biology Teaching Strategies, PPL I, and PPL II courses consists of five stages: problem identification, planning the problem solution, implementing the solution, presenting the results, and reflecting on the problem-solving process. The five stages are repeated over several cycles during the semester. The expert judgments indicate that the PBL model is consistent with the characteristics of problem-based learning and is appropriate for developing the inquiry teaching competency of preservice teachers. Keywords: PBL model, biology education courses, preservice teachers, inquiry teaching competency

  16. Profiling the biological activity of oxide nanomaterials with mechanistic models

    NARCIS (Netherlands)

    Burello, E.

    2013-01-01

    In this study we present three mechanistic models for profiling the potential biological and toxicological effects of oxide nanomaterials. The models attempt to describe the reactivity, protein adsorption and membrane adhesion processes of a large range of oxide materials and are based on properties

  17. Uterus models for use in virtual reality hysteroscopy simulators.

    Science.gov (United States)

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  18. Getting realistic; Endstation Demut

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2004-01-28

    The fuel cell hype of the turn of the millennium has reached its end. The industry is getting realistic. Fuel cell systems for private single-family and multiple dwellings will not be available, if at all, until the next decade. With a Europe-wide field test, Vaillant intends to advance the PEM technology.

  19. Synthetic biology meets tissue engineering.

    Science.gov (United States)

    Davies, Jamie A; Cachat, Elise

    2016-06-15

    Classical tissue engineering is aimed mainly at producing anatomically and physiologically realistic replacements for normal human tissues. It is done either by encouraging cellular colonization of manufactured matrices or cellular recolonization of decellularized natural extracellular matrices from donor organs, or by allowing cells to self-organize into organs as they do during fetal life. For repair of normal bodies, this will be adequate but there are reasons for making unusual, non-evolved tissues (repair of unusual bodies, interface to electromechanical prostheses, incorporating living cells into life-support machines). Synthetic biology is aimed mainly at engineering cells so that they can perform custom functions: applying synthetic biological approaches to tissue engineering may be one way of engineering custom structures. In this article, we outline the 'embryological cycle' of patterning, differentiation and morphogenesis and review progress that has been made in constructing synthetic biological systems to reproduce these processes in new ways. The state-of-the-art remains a long way from making truly synthetic tissues, but there are now at least foundations for future work. © 2016 Authors; published by Portland Press Limited.

  20. HELIOSEISMOLOGY OF A REALISTIC MAGNETOCONVECTIVE SUNSPOT SIMULATION

    International Nuclear Information System (INIS)

    Braun, D. C.; Birch, A. C.; Rempel, M.; Duvall, T. L. Jr.

    2012-01-01

    We compare helioseismic travel-time shifts measured from a realistic magnetoconvective sunspot simulation using both helioseismic holography and time-distance helioseismology, and measured from real sunspots observed with the Helioseismic and Magnetic Imager instrument on board the Solar Dynamics Observatory and the Michelson Doppler Imager instrument on board the Solar and Heliospheric Observatory. We find remarkable similarities in the travel-time shifts measured between the methodologies applied and between the simulated and real sunspots. Forward modeling of the travel-time shifts using either Born or ray approximation kernels and the sound-speed perturbations present in the simulation indicates major disagreements with the measured travel-time shifts. These findings do not substantially change with the application of a correction for the reduction of wave amplitudes in the simulated and real sunspots. Overall, our findings demonstrate the need for new methods for inferring the subsurface structure of sunspots through helioseismic inversions.

  1. Mathematical biology

    CERN Document Server

    Murray, James D

    1993-01-01

    The book is a textbook (with many exercises) giving an in-depth account of the practical use of mathematical modelling in the biomedical sciences. The mathematical level required is generally not high and the emphasis is on what is required to solve the real biological problem. The subject matter is drawn, e.g. from population biology, reaction kinetics, biological oscillators and switches, Belousov-Zhabotinskii reaction, reaction-diffusion theory, biological wave phenomena, central pattern generators, neural models, spread of epidemics, mechanochemical theory of biological pattern formation and importance in evolution. Most of the models are based on real biological problems and the predictions and explanations offered as a direct result of mathematical analysis of the models are important aspects of the book. The aim is to provide a thorough training in practical mathematical biology and to show how exciting and novel mathematical challenges arise from a genuine interdisciplinary involvement with the biosci...

  2. Learning through Creating Robotic Models of Biological Systems

    Science.gov (United States)

    Cuperman, Dan; Verner, Igor M.

    2013-01-01

    This paper considers an approach to studying issues in technology and science, which integrates design and inquiry activities towards creating and exploring technological models of scientific phenomena. We implemented this approach in a context where the learner inquires into a biological phenomenon and develops its representation in the form of a…

  3. Biophysically realistic minimal model of dopamine neuron

    Science.gov (United States)

    Oprisan, Sorinel

    2008-03-01

    We proposed and studied a new biophysically relevant computational model of dopaminergic neurons. Midbrain dopamine neurons are involved in motivation and the control of movement, and have been implicated in various pathologies such as Parkinson's disease, schizophrenia, and drug abuse. The model we developed is a single-compartment Hodgkin-Huxley (HH)-type parallel conductance membrane model. The model captures the essential mechanisms underlying the slow oscillatory potentials and plateau potential oscillations. The main currents involved are: 1) a voltage-dependent fast calcium current, 2) a small-conductance potassium current that is modulated by the cytosolic concentration of calcium, and 3) a slow voltage-activated potassium current. We developed multidimensional bifurcation diagrams and extracted the effective domains of sustained oscillations. The model includes a calcium balance, given the fundamental importance of calcium influx as demonstrated by simultaneous electrophysiology and calcium imaging. Although there is significant evidence suggesting a partially electrogenic calcium pump, all previous models considered only non-electrogenic pumps. We investigated the effect of the electrogenic calcium pump on the bifurcation diagram of the model and compared our findings against the experimental results.
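    A single-compartment parallel-conductance model of the kind described reduces, in its simplest form, to C dV/dt = −ΣI_ion + I_inj. The sketch below keeps only a leak conductance (the calcium and potassium currents of the actual model are omitted, and all values are illustrative, not fitted parameters):

```python
# Minimal single-compartment parallel-conductance membrane (leak current only).
C = 1.0      # membrane capacitance (uF/cm^2)
g_L = 0.1    # leak conductance (mS/cm^2)
E_L = -60.0  # leak reversal potential (mV)
I = 1.0      # injected current (uA/cm^2)
dt = 0.01    # time step (ms)

V = E_L
for _ in range(100000):  # forward-Euler integration of C dV/dt = -g_L*(V - E_L) + I
    V += dt / C * (-g_L * (V - E_L) + I)

print(round(V, 1))  # settles at the fixed point E_L + I/g_L = -50.0 mV
```

Each additional current in the full model contributes another −g_x·m·h·(V − E_x) term to the same equation, with its own gating-variable ODEs; those voltage- and calcium-dependent gates are what produce the oscillations the abstract describes.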

  4. Identifying biological concepts from a protein-related corpus with a probabilistic topic model

    Directory of Open Access Journals (Sweden)

    Lu Xinghua

    2006-02-01

    Background: Biomedical literature, e.g., MEDLINE, contains a wealth of knowledge regarding functions of proteins. Major recurring biological concepts within such text corpora represent the domains of this body of knowledge. The goal of this research is to identify the major biological topics/concepts from a corpus of protein-related MEDLINE titles and abstracts by applying a probabilistic topic model. Results: The latent Dirichlet allocation (LDA) model was applied to the corpus. Based on Bayesian model selection, 300 major topics were extracted from the corpus. The majority of identified topics/concepts were found to be semantically coherent, and most represented biological objects or concepts. The identified topics/concepts were further mapped to the controlled vocabulary of Gene Ontology (GO) terms based on mutual information. Conclusion: The major and recurring biological concepts within a collection of MEDLINE documents can be extracted by the LDA model. The identified topics/concepts provide a parsimonious and semantically enriched representation of the texts in a semantic space with reduced dimensionality and can be used to index text.
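    The LDA inference applied in the record can be conveyed with a toy collapsed Gibbs sampler. This is a minimal sketch only: the study used a full-scale implementation with Bayesian model selection, and the four "documents" below are invented:

```python
import random

def lda_gibbs(docs, n_topics, vocab, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Tiny collapsed Gibbs sampler for LDA; returns the topic-word count table."""
    rng = random.Random(seed)
    V = len(vocab)
    word_id = {w: i for i, w in enumerate(vocab)}
    ndk = [[0] * n_topics for _ in docs]       # document-topic counts
    nkw = [[0] * V for _ in range(n_topics)]   # topic-word counts
    nk = [0] * n_topics                        # words per topic
    z = []                                     # topic assignment of each word token
    for d, doc in enumerate(docs):             # random initialization
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][word_id[w]] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(iters):                     # resample each token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d][k] -= 1; nkw[k][word_id[w]] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][word_id[w]] + beta)
                           / (nk[t] + V * beta) for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][word_id[w]] += 1; nk[k] += 1
    return nkw

docs = [["kinase", "phosphorylation", "signal"], ["ribosome", "translation", "rna"],
        ["kinase", "signal", "pathway"], ["rna", "ribosome", "translation"]]
vocab = sorted({w for doc in docs for w in doc})
nkw = lda_gibbs(docs, n_topics=2, vocab=vocab)
```

With enough iterations, words that co-occur (e.g., "kinase" and "signal") tend to concentrate their counts in the same topic, which is exactly the "semantically coherent topic" behaviour reported for the MEDLINE corpus.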

  5. Novel approaches to develop community-built biological network models for potential drug discovery.

    Science.gov (United States)

    Talikka, Marja; Bukharov, Natalia; Hayes, William S; Hofmann-Apitius, Martin; Alexopoulos, Leonidas; Peitsch, Manuel C; Hoeng, Julia

    2017-08-01

    Hundreds of thousands of data points are now routinely generated in clinical trials by molecular profiling and NGS technologies. A true translation of this data into knowledge is not possible without analysis and interpretation in a well-defined biology context. Currently, there are many public and commercial pathway tools and network models that can facilitate such analysis. At the same time, the insights and knowledge that can be gained are highly dependent on the underlying biological content of these resources. Crowdsourcing can be employed to guarantee the accuracy and transparency of the biological content underlying the tools used to interpret rich molecular data. Areas covered: In this review, the authors describe crowdsourcing in drug discovery. The focal point is the efforts that have successfully used the crowdsourcing approach to verify and augment pathway tools and biological network models. Technologies that enable the building of biological networks with the community are also described. Expert opinion: A crowd of experts can be leveraged for the entire development process of biological network models, from ontologies to the evaluation of their mechanistic completeness. The ultimate goal is to facilitate biomarker discovery and personalized medicine by mechanistically explaining patients' differences with respect to disease prevention, diagnosis, and therapy outcome.

  6. Agent-Based Modeling in Molecular Systems Biology.

    Science.gov (United States)

    Soheilypour, Mohammad; Mofrad, Mohammad R K

    2018-06-08

    Molecular systems orchestrating the biology of the cell typically involve a complex web of interactions among various components and span a vast range of spatial and temporal scales. Computational methods have advanced our understanding of the behavior of molecular systems by enabling us to test assumptions and hypotheses, explore the effect of different parameters on the outcome, and eventually guide experiments. While several different mathematical and computational methods are developed to study molecular systems at different spatiotemporal scales, there is still a need for methods that bridge the gap between spatially-detailed and computationally-efficient approaches. In this review, we summarize the capabilities of agent-based modeling (ABM) as an emerging molecular systems biology technique that provides researchers with a new tool in exploring the dynamics of molecular systems/pathways in health and disease. © 2018 WILEY Periodicals, Inc.
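    As a concrete illustration of the ABM approach the review surveys, the sketch below tracks individual molecules as agents that random-walk on a lattice and react on contact (a toy bimolecular reaction A + B → C). All parameters are invented for illustration, not taken from the review:

```python
import random

rng = random.Random(42)
L = 50  # periodic 1D lattice of L sites
agents = [{"kind": "A", "x": rng.randrange(L)} for _ in range(30)] + \
         [{"kind": "B", "x": rng.randrange(L)} for _ in range(30)]

for step in range(500):
    for a in agents:
        a["x"] = (a["x"] + rng.choice([-1, 1])) % L   # diffusion: one step left or right
    sites = {}
    for a in agents:
        sites.setdefault(a["x"], []).append(a)
    for group in sites.values():                      # reaction: co-located A and B pair up
        a_list = [g for g in group if g["kind"] == "A"]
        b_list = [g for g in group if g["kind"] == "B"]
        for pa, pb in zip(a_list, b_list):
            pa["kind"], pb["kind"] = "C", "gone"      # A becomes product C, B is consumed
    agents = [a for a in agents if a["kind"] != "gone"]

counts = {k: sum(1 for a in agents if a["kind"] == k) for k in ("A", "B", "C")}
print(counts)
```

Unlike a well-mixed ODE description, every agent here has an explicit position, so spatial effects such as local depletion of reactants emerge for free; this spatial detail at modest computational cost is the gap-bridging property the review emphasizes.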

  7. Biological variability in biomechanical engineering research: Significance and meta-analysis of current modeling practices.

    Science.gov (United States)

    Cook, Douglas; Julias, Margaret; Nauman, Eric

    2014-04-11

    Biological systems are characterized by high levels of variability, which can affect the results of biomechanical analyses. As a review of this topic, we first surveyed levels of variation in materials relevant to biomechanics, and compared these values to standard engineered materials. As expected, we found significantly higher levels of variation in biological materials. A meta-analysis was then performed based on thorough reviews of 60 research studies from the field of biomechanics to assess the methods and manner in which biological variation is currently handled in our field. The results of our meta-analysis revealed interesting trends in modeling practices, and suggest a need for more biomechanical studies that fully incorporate biological variation in biomechanical models and analyses. Finally, we provide some case-study examples of how biological variability may provide valuable insights or lead to surprising results. The purpose of this study is to promote the advancement of biomechanics research by encouraging broader treatment of biological variability in biomechanical modeling. Copyright © 2014 Elsevier Ltd. All rights reserved.
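    Variability comparisons of this kind are typically expressed through the coefficient of variation (CV). A minimal sketch with hypothetical stiffness data; the numbers are invented for illustration, not taken from the surveyed studies:

```python
import statistics

def coefficient_of_variation(samples):
    """CV = sample standard deviation / mean; a dimensionless measure that
    allows variability comparisons across materials with different scales."""
    return statistics.stdev(samples) / statistics.mean(samples)

# Hypothetical stiffness measurements (MPa): an engineered alloy vs. a soft tissue.
alloy = [200.1, 199.8, 200.3, 200.0, 199.9]
tissue = [12.0, 18.5, 9.2, 22.1, 15.3]

print(round(coefficient_of_variation(alloy), 4))   # the biological sample's CV is
print(round(coefficient_of_variation(tissue), 4))  # orders of magnitude larger
```

The survey's central point follows directly: a deterministic analysis using only the tissue mean would ignore a spread that is a substantial fraction of the mean itself.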

  8. A realistic validation study of a new nitrogen multiple-breath washout system.

    Directory of Open Access Journals (Sweden)

    Florian Singer

    BACKGROUND: For reliable assessment of ventilation inhomogeneity, multiple-breath washout (MBW) systems should be realistically validated. We describe a new lung model for in vitro validation under physiological conditions and the assessment of a new nitrogen (N2) MBW system. METHODS: The N2MBW setup indirectly measures the N2 fraction (FN2) from main-stream carbon dioxide (CO2) and side-stream oxygen (O2) signals: FN2 = 1 − FO2 − FCO2 − FArgon. For in vitro N2MBW, a double-chamber plastic lung model was filled with water, heated to 37°C, and ventilated at various lung volumes, respiratory rates, and FCO2. In vivo N2MBW was undertaken in triplicate on two occasions in 30 healthy adults. The primary N2MBW outcome was functional residual capacity (FRC). We assessed the in vitro error (√[difference]²) between measured and model FRC (100-4174 mL), and the error between tests of in vivo FRC, lung clearance index (LCI), and normalized phase III slope indices (Sacin and Scond). RESULTS: The model generated 145 FRCs under BTPS conditions and various breathing patterns. The mean (SD) error was 2.3 (1.7)%. Of the 500 to 4174 mL FRCs, 121 (98%) were within 5%. For the 100 to 400 mL FRCs, the error was better than 7%. The in vivo FRC error between tests was 10.1 (8.2)%. LCI was the most reproducible ventilation inhomogeneity index. CONCLUSION: The lung model generates lung volumes under the conditions encountered during clinical MBW testing and enables realistic validation of MBW systems. The new N2MBW system reliably measures lung volumes and delivers reproducible LCI values.
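    The indirect N2 computation and the FRC error metric described above can be sketched directly. The fixed argon fraction is an assumption for illustration (atmospheric argon is about 0.93%), and the example gas fractions are invented:

```python
def nitrogen_fraction(f_o2, f_co2, f_argon=0.0093):
    """Indirect N2 fraction as in the N2MBW setup: FN2 = 1 - FO2 - FCO2 - FArgon.
    Treating the argon fraction as a constant is an assumption of this sketch."""
    return 1.0 - f_o2 - f_co2 - f_argon

def frc_error_percent(measured_ml, model_ml):
    """Error metric sqrt((measured - model)^2), expressed as % of the model FRC."""
    return ((measured_ml - model_ml) ** 2) ** 0.5 / model_ml * 100.0

print(round(nitrogen_fraction(f_o2=0.17, f_co2=0.04), 4))      # hypothetical end-tidal gas
print(round(frc_error_percent(3050.0, 3000.0), 2))             # 50 mL off a 3000 mL model
```

Integrating such breath-by-breath FN2 values over the washout is what yields the FRC and LCI outcomes the study validates.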

  9. The Development and Validation of an In Vitro Airway Model to Assess Realistic Airway Deposition and Drug Permeation Behavior of Orally Inhaled Products Across Synthetic Membranes.

    Science.gov (United States)

    Huynh, Bao K; Traini, Daniela; Farkas, Dale R; Longest, P Worth; Hindle, Michael; Young, Paul M

    2018-04-01

    Current in vitro approaches to assess lung deposition, dissolution, and cellular transport behavior of orally inhaled products (OIPs) have relied on compendial impactors to collect drug particles that are likely to deposit in the airway; however, the main drawback with this approach is that these impactors do not reflect the airway and may not necessarily represent drug deposition behavior in vivo. The aim of this article is to describe the development and method validation of a novel hybrid in vitro approach to assess drug deposition and permeation behavior in a more representative airway model. The medium-sized Virginia Commonwealth University (VCU) mouth-throat (MT) and tracheal-bronchial (TB) realistic upper airway models were used in this study as representative models of the upper airway. The TB model was modified to accommodate two Snapwell® inserts above the first TB airway bifurcation region to collect deposited nebulized ciprofloxacin-hydrochloride (CIP-HCL) droplets as a model drug aerosol system. Permeation characteristics of deposited nebulized CIP-HCL droplets were assessed across different synthetic membranes using the Snapwell test system. The Snapwell test system demonstrated reproducible and discriminatory drug permeation profiles for already dissolved and nebulized CIP-HCL droplets through a range of synthetic permeable membranes under different test conditions. The rate and extent of drug permeation depended on the permeable membrane material used, presence of a stirrer in the receptor compartment, and, most importantly, the drug collection method. This novel hybrid in vitro approach, which incorporates a modified version of a realistic upper airway model, coupled with the Snapwell test system holds great potential to evaluate postairway deposition characteristics, such as drug permeation and particle dissolution behavior of OIPs. Future studies will expand this approach using a cell culture-based setup instead of synthetic membranes, within a

  10. Digital learning material for experimental design and model building in molecular biology

    NARCIS (Netherlands)

    Aegerter-Wilmsen, T.

    2005-01-01

    Designing experimental approaches is a major cognitive skill in molecular biology research, and building models, including quantitative ones, is a cognitive skill which is rapidly gaining importance. Since molecular biology education at university level is aimed at educating future researchers, we

  11. Genome-scale metabolic models as platforms for strain design and biological discovery.

    Science.gov (United States)

    Mienda, Bashir Sajo

    2017-07-01

    Genome-scale metabolic models (GEMs) have been developed and used in guiding systems metabolic engineering strategies for strain design and development. This strategy has been used in the fermentative production of bio-based industrial chemicals and fuels from alternative carbon sources. However, computer-aided hypothesis building using established algorithms and software platforms for biological discovery can be integrated into the strain design pipeline to create superior strains of microorganisms for targeted biosynthetic goals. Here, I describe an integrated workflow strategy using GEMs for strain design and biological discovery. Specific case studies of strain design and biological discovery using the Escherichia coli genome-scale model are presented and discussed. The integrated workflow presented herein, when applied carefully, would help guide future design strategies for high-performance strains of microorganisms with existing and forthcoming genome-scale metabolic models.

  12. Introductory biology students' conceptual models and explanations of the origin of variation.

    Science.gov (United States)

    Speth, Elena Bray; Shaw, Neil; Momsen, Jennifer; Reinagel, Adam; Le, Paul; Taqieddin, Ranya; Long, Tammy

    2014-01-01

    Mutation is the key molecular mechanism generating phenotypic variation, which is the basis for evolution. In an introductory biology course, we used a model-based pedagogy that enabled students to integrate their understanding of genetics and evolution within multiple case studies. We used student-generated conceptual models to assess understanding of the origin of variation. By midterm, only a small percentage of students articulated complete and accurate representations of the origin of variation in their models. Targeted feedback was offered through activities requiring students to critically evaluate peers' models. At semester's end, a substantial proportion of students significantly improved their representation of how variation arises (though one-third still did not include mutation in their models). Students' written explanations of the origin of variation were mostly consistent with their models, although less effective than models in conveying mechanistic reasoning. This study contributes evidence that articulating the genetic origin of variation is particularly challenging for learners and may require multiple cycles of instruction, assessment, and feedback. To support meaningful learning of the origin of variation, we advocate instruction that explicitly integrates multiple scales of biological organization, assessment that promotes and reveals mechanistic and causal reasoning, and practice with explanatory models with formative feedback. © 2014 E. Bray Speth et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  13. Two-Capacitor Problem: A More Realistic View.

    Science.gov (United States)

    Powell, R. A.

    1979-01-01

    Discusses the two-capacitor problem by considering the self-inductance of the circuit used and by determining how well the usual series RC circuit approximates the two-capacitor problem when realistic values of L, C, and R are chosen. (GA)

  14. Systems Modelling and the Development of Coherent Understanding of Cell Biology

    Science.gov (United States)

    Verhoeff, Roald P.; Waarlo, Arend Jan; Boersma, Kerst Th.

    2008-01-01

    This article reports on educational design research concerning a learning and teaching strategy for cell biology in upper-secondary education introducing "systems modelling" as a key competence. The strategy consists of four modelling phases in which students subsequently develop models of free-living cells, a general two-dimensional model of…

  15. Satellite Maps Deliver More Realistic Gaming

    Science.gov (United States)

    2013-01-01

    When Redwood City, California-based Electronic Arts (EA) decided to make SSX, its latest snowboarding video game, it faced challenges in creating realistic-looking mountains. The solution was NASA's ASTER Global Digital Elevation Map, made available by the Jet Propulsion Laboratory, which EA used to create 28 real-life mountains from 9 different ranges for its award-winning game.

  16. Radiation physics, biophysics, and radiation biology

    International Nuclear Information System (INIS)

    Hall, E.J.; Zaider, M.

    1993-05-01

    Research at the Center for Radiological Research is a multidisciplinary blend of physics, chemistry and biology aimed at understanding the mechanisms involved in the health problems resulting from human exposure to ionizing radiations. The focus is increasingly on biochemistry and the application of the techniques of molecular biology to the problems of radiation biology. Research highlights of the program from the past year are described. A mathematical model describing the production of single-strand and double-strand breaks in DNA as a function of radiation quality has been completed. For the first time Monte Carlo techniques have been used to obtain directly the spatial distribution of DNA moieties altered by radiation. This information was obtained by including in the transport codes a realistic description of the electronic structure of DNA. We have investigated structure-activity relationships for the potential oncogenicity of a new generation of bioreductive drugs that function as hypoxic cytotoxins. Experimental and theoretical investigation of the inverse dose-rate effect, whereby medium-LET radiations actually produce an enhanced effect when the dose is protracted, is now at a point where the basic mechanisms are reasonably understood and the complex interplay between dose, dose rate and radiation quality which is necessary for the effect to be present can now be predicted, at least in vitro. In terms of early radiobiological damage, a quantitative link has been established between basic energy deposition and locally multiply damaged sites (LMDS), the radiochemical precursor of DNA double-strand breaks; specifically, the spatial and energy deposition requirements necessary to form LMDS have been evaluated. For the first time, a mechanistically understood ''biological fingerprint'' of high-LET radiation has been established: specifically, measurement of the ratio of inter- to intra-chromosomal aberrations produces a unique signature from alpha-particles or neutrons.

  18. Predicting perceptual quality of images in realistic scenario using deep filter banks

    Science.gov (United States)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistic methods, which are based on an assumption that certain reliable statistical regularities hold on undistorted images and will be corrupted by introduced distortions. However, these models usually fail to accurately predict degradation severity of images in realistic scenarios since complex, multiple, and interactive authentic distortions usually appear on them. We propose a quality prediction model based on convolutional neural network. Quality-aware features extracted from filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation and finally a linear support vector regression model is trained to map image representation into images' subjective perceptual quality scores. The experimental results on benchmark databases present the effectiveness and generalizability of the proposed model.

  19. A data integration approach for cell cycle analysis oriented to model simulation in systems biology

    Directory of Open Access Journals (Sweden)

    Mosca Ettore

    2007-08-01

    Full Text Available Abstract Background The cell cycle is one of the biological processes most frequently investigated in systems biology studies and it involves the knowledge of a large number of genes and networks of protein interactions. A deep knowledge of the molecular aspect of this biological process can contribute to making cancer research more accurate and innovative. In this context the mathematical modelling of the cell cycle has a relevant role to quantify the behaviour of each component of the systems. The mathematical modelling of a biological process such as the cell cycle allows a systemic description that helps to highlight some features such as emergent properties which could be hidden when the analysis is performed only from a reductionism point of view. Moreover, in modelling complex systems, a complete annotation of all the components is equally important to understand the interaction mechanism inside the network: for this reason data integration of the model components has high relevance in systems biology studies. Description In this work, we present a resource, the Cell Cycle Database, intended to support systems biology analysis on the Cell Cycle process, based on two organisms, yeast and mammalian. The database integrates information about genes and proteins involved in the cell cycle process, stores complete models of the interaction networks and allows the mathematical simulation over time of the quantitative behaviour of each component. To accomplish this task, we developed a web interface for browsing information related to cell cycle genes, proteins and mathematical models. In this framework, we have implemented a pipeline which allows users to deal with the mathematical part of the models, in order to solve, using different variables, the ordinary differential equation systems that describe the biological process. 
Conclusion This integrated system is freely available in order to support systems biology research on the cell cycle and
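The simulation component described above solves ordinary differential equation systems for cell-cycle components over time. As a minimal sketch of that kind of computation (the two species, their interactions, and all rate constants below are hypothetical illustrations, not taken from the Cell Cycle Database), a forward-Euler integration of a simple activator-inhibitor pair:

```python
# Sketch: forward-Euler integration of a hypothetical two-component
# cell-cycle-like ODE system (illustrative species and rate constants).

def simulate(steps=20000, dt=0.001, k1=1.0, k2=0.5, k3=0.8, k4=0.3):
    cyclin, inhibitor = 0.1, 0.1               # initial concentrations
    for _ in range(steps):
        dcyclin = k1 - k2 * cyclin * inhibitor       # synthesis minus removal
        dinhibitor = k3 * cyclin - k4 * inhibitor    # activation minus decay
        cyclin += dt * dcyclin
        inhibitor += dt * dinhibitor
    return cyclin, inhibitor

cyclin, inhibitor = simulate()
# At steady state, k1 = k2*c*i and k3*c = k4*i, so c = sqrt(k1*k4/(k2*k3)).
```

A production tool would use an adaptive stiff solver rather than fixed-step Euler, but the structure (state vector, rate laws, time stepping) is the same.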

  20. A Model of How Different Biology Experts Explain Molecular and Cellular Mechanisms

    Science.gov (United States)

    Trujillo, Caleb M.; Anderson, Trevor R.; Pelaez, Nancy J.

    2015-01-01

    Constructing explanations is an essential skill for all science learners. The goal of this project was to model the key components of expert explanation of molecular and cellular mechanisms. As such, we asked: What is an appropriate model of the components of explanation used by biology experts to explain molecular and cellular mechanisms? Do explanations made by experts from different biology subdisciplines at a university support the validity of this model? Guided by the modeling framework of R. S. Justi and J. K. Gilbert, the validity of an initial model was tested by asking seven biologists to explain a molecular mechanism of their choice. Data were collected from interviews, artifacts, and drawings, and then subjected to thematic analysis. We found that biologists explained the specific activities and organization of entities of the mechanism. In addition, they contextualized explanations according to their biological and social significance; integrated explanations with methods, instruments, and measurements; and used analogies and narrated stories. The derived Methods, Analogies, Context, and How themes informed the development of our final MACH model of mechanistic explanations. Future research will test the potential of the MACH model as a guiding framework for instruction to enhance the quality of student explanations. PMID:25999313

  1. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing parallel CUDA-based implementation for parameter synthesis in this model.
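The simulated-annealing component of the parameter-discovery algorithm can be sketched on a toy problem. Here a single unknown rate of a hypothetical exponential-decay model is fitted to synthetic "observations"; a plain squared-error cost stands in for the paper's sequential hypothesis testing and statistical model checking, and all names and data are illustrative:

```python
import math
import random

random.seed(0)

# Synthetic "observed" data from an exponential decay with true rate 0.7.
TRUE_K = 0.7
times = [0.5 * i for i in range(10)]
observed = [math.exp(-TRUE_K * t) for t in times]

def cost(k):
    # Squared error between model predictions and observations.
    return sum((math.exp(-k * t) - y) ** 2 for t, y in zip(times, observed))

def anneal(k=2.0, temp=1.0, cooling=0.995, steps=5000):
    best_k, best_c = k, cost(k)
    current_c = best_c
    for _ in range(steps):
        candidate = k + random.gauss(0, 0.1)       # local random proposal
        c = cost(candidate)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability that shrinks as the temperature cools.
        if c < current_c or random.random() < math.exp((current_c - c) / temp):
            k, current_c = candidate, c
            if c < best_c:
                best_k, best_c = candidate, c
        temp *= cooling
    return best_k

k_est = anneal()
```

The geometric cooling schedule and Gaussian proposal width are tuning choices; the paper's CUDA implementation parallelizes the expensive cost evaluations, which here are trivially cheap.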

  2. On the relationship of steady states of continuous and discrete models arising from biology.

    Science.gov (United States)

    Veliz-Cuba, Alan; Arthur, Joseph; Hochstetler, Laura; Klomps, Victoria; Korpi, Erikka

    2012-12-01

    For many biological systems that have been modeled using continuous and discrete models, it has been shown that such models have similar dynamical properties. In this paper, we prove that this happens in more general cases. We show that under some conditions there is a bijection between the steady states of continuous and discrete models arising from biological systems. Our results also provide a novel method to analyze certain classes of nonlinear models using discrete mathematics.
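The correspondence between continuous and discrete steady states can be illustrated on a toy example (my own, not from the paper): a two-gene toggle switch in which each gene represses the other, modeled once as a Boolean network and once with steep Hill-function repression:

```python
import itertools

# Boolean toggle switch: each gene represses the other.
def bool_update(x, y):
    return (int(not y), int(not x))

bool_steady = [s for s in itertools.product((0, 1), repeat=2)
               if bool_update(*s) == s]          # -> [(0, 1), (1, 0)]

# Continuous counterpart: steady states satisfy x = h(y), y = h(x),
# where h is a steep Hill repression function.
def hill_repress(u, K=0.5, n=8):
    return K**n / (K**n + u**n)

def cont_steady(x, y, iters=200):
    # Fixed-point iteration of the steady-state equations.
    for _ in range(iters):
        x, y = hill_repress(y), hill_repress(x)
    return x, y

# Starting near each Boolean steady state, the continuous model settles
# at a steady state that rounds back to the same Boolean state.
matches = [tuple(round(v) for v in cont_steady(float(a), float(b)))
           for a, b in bool_steady]
```

The Boolean fixed points (0,1) and (1,0) each pair with one stable continuous steady state, a small instance of the kind of bijection the paper proves under more general conditions.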

  3. Protocol for an HTA report: Does therapeutic writing help people with long-term conditions? Systematic review, realist synthesis and economic modelling

    Science.gov (United States)

    Meads, C; Nyssen, O P; Wong, G; Steed, L; Bourke, L; Ross, C A; Hayman, S; Field, V; Lord, J; Greenhalgh, T; Taylor, S J C

    2014-01-01

    Introduction Long-term medical conditions (LTCs) cause reduced health-related quality of life and considerable health service expenditure. Writing therapy has potential to improve physical and mental health in people with LTCs, but its effectiveness is not established. This project aims to establish the clinical and cost-effectiveness of therapeutic writing in LTCs by systematic review and economic evaluation, and to evaluate context and mechanisms by which it might work, through realist synthesis. Methods Included are any comparative study of therapeutic writing compared with no writing, waiting list, attention control or placebo writing in patients with any diagnosed LTCs that report at least one of the following: relevant clinical outcomes; quality of life; health service use; psychological, behavioural or social functioning; adherence or adverse events. Searches will be conducted in the main medical databases including MEDLINE, EMBASE, PsycINFO, The Cochrane Library and Science Citation Index. For the realist review, further purposive and iterative searches through snowballing techniques will be undertaken. Inclusions, data extraction and quality assessment will be in duplicate with disagreements resolved through discussion. Quality assessment will include using Grading of Recommendations Assessment, Development and Evaluation (GRADE) criteria. Data synthesis will be narrative and tabular with meta-analysis where appropriate. De novo economic modelling will be attempted in one clinical area if sufficient evidence is available and performed according to the National Institute for Health and Care Excellence (NICE) reference case. PMID:24549165

  4. High school and college biology: A multi-level model of the effects of high school biology courses on student academic performance in introductory college biology courses

    Science.gov (United States)

    Loehr, John Francis

    The issue of student preparation for college study in science has been an ongoing concern for both college-bound students and educators of various levels. This study uses a national sample of college students enrolled in introductory biology courses to address the relationship between high school biology preparation and subsequent introductory college biology performance. Multi-Level Modeling was used to investigate the relationship between students' high school science and mathematics experiences and college biology performance. This analysis controls for student demographic and educational background factors along with factors associated with the college or university attended. The results indicated that high school course-taking and science instructional experiences have the largest impact on student achievement in the first introductory college biology course. In particular, enrollment in courses, such as high school Calculus and Advanced Placement (AP) Biology, along with biology course content that focuses on developing a deep understanding of the topics is found to be positively associated with student achievement in introductory college biology. On the other hand, experiencing high numbers of laboratory activities, demonstrations, and independent projects along with higher levels of laboratory freedom is negatively associated with achievement. These findings are relevant to high school biology teachers, college students, their parents, and educators looking beyond the goal of high school graduation.

  5. Development of a coupled physical-biological ecosystem model ECOSMO - Part I: Model description and validation for the North Sea

    DEFF Research Database (Denmark)

    Schrum, Corinna; Alekseeva, I.; St. John, Michael

    2006-01-01

    A 3-D coupled biophysical model ECOSMO (ECOSystem MOdel) has been developed. The biological module of ECOSMO is based on lower trophic level interactions between two phyto- and two zooplankton components. The dynamics of the different phytoplankton components are governed by the availability...... of the macronutrients nitrogen, phosphate and silicate as well as light. Zooplankton production is simulated based on the consumption of the different phytoplankton groups and detritus. The biological module is coupled to a nonlinear 3-D baroclinic model. The physical and biological modules are driven by surface...... showed that the model, based on consideration of limiting processes, is able to reproduce the observed spatial and seasonal variability of the North Sea ecosystem e.g. the spring bloom, summer sub-surface production and the fall bloom. Distinct differences in regional characteristics of diatoms...

  6. A computational systems biology software platform for multiscale modeling and simulation: Integrating whole-body physiology, disease biology, and molecular reaction networks

    Directory of Open Access Journals (Sweden)

    Thomas eEissing

    2011-02-01

    Full Text Available Today, in silico studies and trial simulations already complement experimental approaches in pharmaceutical R&D and have become indispensable tools for decision making and communication with regulatory agencies. While biology is multi-scale by nature, project work and software tools usually focus on isolated aspects of drug action, such as pharmacokinetics at the organism scale or pharmacodynamic interaction on the molecular level. We present a modeling and simulation software platform consisting of PK-Sim® and MoBi® capable of building and simulating models that integrate across biological scales. A prototypical multiscale model for the progression of a pancreatic tumor and its response to pharmacotherapy is constructed and virtual patients are treated with a prodrug activated by hepatic metabolization. Tumor growth is driven by signal transduction leading to cell cycle transition and proliferation. Free tumor concentrations of the active metabolite inhibit Raf kinase in the signaling cascade and thereby cell cycle progression. In a virtual clinical study, the individual therapeutic outcome of the chemotherapeutic intervention is simulated for a large population with heterogeneous genomic background. Thereby, the platform allows efficient model building and integration of biological knowledge and prior data from all biological scales. Experimental in vitro model systems can be linked with observations in animal experiments and clinical trials. The interplay between patients, diseases, and drugs and topics with high clinical relevance such as the role of pharmacogenomics, drug-drug or drug-metabolite interactions can be addressed using this mechanistic, insight driven multiscale modeling approach.

  7. A mathematical framework for agent based models of complex biological networks.

    Science.gov (United States)

    Hinkelmann, Franziska; Murrugarra, David; Jarrah, Abdul Salam; Laubenbacher, Reinhard

    2011-07-01

    Agent-based modeling and simulation is a useful method to study biological phenomena in a wide range of fields, from molecular biology to ecology. Since there is currently no agreed-upon standard way to specify such models, it is not always easy to use published models. Also, since model descriptions are not usually given in mathematical terms, it is difficult to bring mathematical analysis tools to bear, so that models are typically studied through simulation. In order to address this issue, Grimm et al. proposed a protocol for model specification, the so-called ODD protocol, which provides a standard way to describe models. This paper proposes an addition to the ODD protocol which allows the description of an agent-based model as a dynamical system, which provides access to computational and theoretical tools for its analysis. The mathematical framework is that of algebraic models, that is, time-discrete dynamical systems with algebraic structure. It is shown by way of several examples how this mathematical specification can help with model analysis. This mathematical framework can also accommodate other model types such as Boolean networks and the more general logical models, as well as Petri nets.
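The algebraic-model framework represents discrete update rules as polynomial functions over the finite field F2, which makes steady states computable by algebraic means. A minimal illustration on a toy 3-node Boolean network of my own (not one of the paper's examples), with AND, OR, and NOT written as polynomials mod 2:

```python
import itertools

# Toy 3-node Boolean network written as polynomials over F2:
#   f1 = x2 OR x3   -> x2 + x3 + x2*x3
#   f2 = x1         -> x1
#   f3 = x1 AND x2  -> x1*x2
# (all arithmetic mod 2; NOT x would be 1 + x)
def step(x1, x2, x3):
    return ((x2 + x3 + x2 * x3) % 2, x1 % 2, (x1 * x2) % 2)

# For a system this small the 2^3 states can simply be enumerated;
# the algebraic form also allows symbolic steady-state computation.
steady = [s for s in itertools.product((0, 1), repeat=3) if step(*s) == s]
```

The same polynomial encoding extends to multi-valued logical models over larger finite fields, which is what gives the framework access to tools from computational algebra.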

  8. A stress driven growth model for soft tissue considering biological availability

    International Nuclear Information System (INIS)

    Oller, S; Bellomo, F J; Nallim, L G; Armero, F

    2010-01-01

    Some of the key factors that regulate growth and remodeling of tissues are fundamentally mechanical. However, it is important to take into account the role of bioavailability together with the stresses and strains in the processes of normal or pathological growth. In this sense, the model presented in this work is oriented to describe the growth of soft biological tissue under 'stress driven growth', depending on the biological availability of the organism. The general theoretical framework is given by a kinematic formulation in large strain combined with the thermodynamic basis of open systems. The formulation uses a multiplicative decomposition of the deformation gradient, splitting it into a growth part and a visco-elastic part. The strains due to growth are incompatible and are controlled by unbalanced stresses relative to a homeostatic state. Growth implies a volume change with an increase of mass while the density remains constant. One of the most interesting features of the proposed model is the generation of new tissue taking into account the contribution of mass to the system, controlled through biological availability. Because soft biological tissues in general have a hierarchical structure with several components (usually a soft matrix reinforced with collagen fibers), the developed growth model is suitable for characterizing the growth of each component. This allows considering a different behavior for each of them in the context of a generalized theory of mixtures. Finally, we illustrate the response of the model in cases of growth and atrophy with an application example.
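The multiplicative kinematics invoked above are standard in finite-growth theory; as a sketch (the notation is the conventional one and may differ from the paper's):

```latex
% Multiplicative split of the deformation gradient into an elastic
% (visco-elastic) part and a growth part:
F = F_e \, F_g , \qquad J = \det F = J_e \, J_g .
% Growth adds mass at constant density, so the mass added per unit
% reference volume is set by the growth Jacobian J_g = \det F_g :
\Delta m = \rho_0 \left( J_g - 1 \right).
```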

  9. Wiring Together Synthetic Bacterial Consortia to Create a Biological Integrated Circuit.

    Science.gov (United States)

    Perry, Nicolas; Nelson, Edward M; Timp, Gregory

    2016-12-16

    The promise of adapting biology to information processing will not be realized until engineered gene circuits, operating in different cell populations, can be wired together to express a predictable function. Here, elementary biological integrated circuits (BICs), consisting of two sets of transmitter and receiver gene circuit modules with embedded memory placed in separate cell populations, were meticulously assembled using live cell lithography and wired together by the mass transport of quorum-sensing (QS) signal molecules to form two isolated communication links (comlinks). The comlink dynamics were tested by broadcasting "clock" pulses of inducers into the networks and measuring the responses of functionally linked fluorescent reporters, and then modeled through simulations that realistically captured the protein production and molecular transport. These results show that the comlinks were isolated and each mimicked aspects of the synchronous, sequential networks used in digital computing. The observations about the flow conditions, derived from numerical simulations, and the biofilm architectures that foster or silence cell-to-cell communications have implications for everything from decontamination of drinking water to bacterial virulence.

  10. Putting theory to the test: which regulatory mechanisms can drive realistic growth of a root?

    Science.gov (United States)

    De Vos, Dirk; Vissenberg, Kris; Broeckhove, Jan; Beemster, Gerrit T S

    2014-10-01

    In recent years there has been a strong development of computational approaches to mechanistically understand organ growth regulation in plants. In this study, simulation methods were used to explore which regulatory mechanisms can lead to realistic output at the cell and whole organ scale and which other possibilities must be discarded as they result in cellular patterns and kinematic characteristics that are not consistent with experimental observations for the Arabidopsis thaliana primary root. To aid in this analysis, a 'Uniform Longitudinal Strain Rule' (ULSR) was formulated as a necessary condition for stable, unidirectional, symplastic growth. Our simulations indicate that symplastic structures are robust to differences in longitudinal strain rates along the growth axis only if these differences are small and short-lived. Whereas simple cell-autonomous regulatory rules based on counters and timers can produce stable growth, it was found that steady developmental zones and smooth transitions in cell lengths are not feasible. By introducing spatial cues into growth regulation, those inadequacies could be avoided and experimental data could be faithfully reproduced. Nevertheless, a root growth model based on previous polar auxin-transport mechanisms violates the proposed ULSR due to the presence of lateral gradients. Models with layer-specific regulation or layer-driven growth offer potential solutions. Alternatively, a model representing the known cross-talk between auxin, as the cell proliferation promoting factor, and cytokinin, as the cell differentiation promoting factor, predicts the effect of hormone perturbations on meristem size. By down-regulating PIN-mediated transport through the transcription factor SHY2, cytokinin effectively flattens the lateral auxin gradient at the basal boundary of the division zone (thereby imposing the ULSR) to signal the exit of proliferation and start of elongation. This model exploration underlines the value of

  11. Simulation of photon transport in a realistic human body model

    International Nuclear Information System (INIS)

    Baccarne, V.; Turzo, A.; Bizais, Y.; Farine, M.

    1997-01-01

    A Monte-Carlo photon transport code to simulate scintigraphy is developed. Scintigraphy consists of injecting a patient with a radioactive tracer (Tc, a 140 keV photon emitter) attached to a biologically active molecule. The complicated physical phenomena (photon interactions) occurring between the emission by the radioactive source and the detection of the photon on the gamma-camera require an accurate description. All these phenomena are very sensitive to the characteristics of human tissues, so segmented computerized tomography slices had to be used. A preliminary theoretical study of the physical characteristics (rather poorly known) of the biological tissues resulted in a two-family classification: soft and bone tissues. Using a Monte-Carlo simulator, a systematic investigation was carried out concerning the relative weight of the different types of interaction taking place in the traversed tissue. The importance of bone tissues was evidenced in comparison with the soft tissues, as well as the variability of these phenomena as a function of patient morphology. This information is crucial in the elaboration and validation of correction techniques applied to the diagnostic images of clinical examinations
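At the core of such a transport code is sampling the photon's free path from exponential attenuation, with a different linear attenuation coefficient per tissue class. A minimal sketch for 140 keV photons crossing a uniform slab (the coefficient values are rough, illustrative approximations for soft tissue and bone, not the code's actual data):

```python
import math
import random

random.seed(1)

# Approximate linear attenuation coefficients at 140 keV (1/cm); illustrative.
MU = {"soft": 0.15, "bone": 0.28}

def sample_free_path(tissue):
    # Distance to next interaction: exponential with mean 1/mu.
    # (1 - random()) is strictly positive, so log() is always defined.
    return -math.log(1.0 - random.random()) / MU[tissue]

def transmitted_fraction(tissue, thickness_cm, n=100_000):
    # Fraction of photons crossing the slab without interacting;
    # should approach exp(-mu * thickness).
    hits = sum(1 for _ in range(n) if sample_free_path(tissue) > thickness_cm)
    return hits / n

f_soft = transmitted_fraction("soft", 10.0)   # close to exp(-1.5)
f_bone = transmitted_fraction("bone", 10.0)   # close to exp(-2.8)
```

A full code would additionally sample the interaction type (photoelectric, Compton, coherent) and scattering angles per tissue voxel; this sketch shows only the free-path step.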

  12. Introducing memory and association mechanism into a biologically inspired visual model.

    Science.gov (United States)

    Qiao, Hong; Li, Yinlin; Tang, Tang; Wang, Peng

    2014-09-01

    A famous biologically inspired hierarchical model (the HMAX model), which was proposed recently and corresponds to V1 to V4 of the ventral pathway in the primate visual cortex, has been successfully applied to multiple visual recognition tasks. The model achieves position- and scale-tolerant recognition, which is a central problem in pattern recognition. In this paper, based on other biological experimental evidence, we introduce a memory and association mechanism into the HMAX model. The main contributions of the work are: 1) mimicking the active memory and association mechanism and adding top-down adjustment to the HMAX model, which is the first attempt to add active adjustment to this famous model; and 2) from the perspective of information, algorithms based on the new model can reduce computation and storage and have good recognition performance. The new model is also applied to object recognition processes. The primary experimental results show that our method is efficient with a much lower memory requirement.

  13. Comparative systems biology between human and animal models based on next-generation sequencing methods.

    Science.gov (United States)

    Zhao, Yu-Qi; Li, Gong-Hua; Huang, Jing-Fei

    2013-04-01

    Animal models provide myriad benefits to both experimental and clinical research. Unfortunately, in many situations, they fall short of expected results or provide contradictory results. In part, this can be the result of traditional molecular biological approaches that are relatively inefficient in elucidating underlying molecular mechanism. To improve the efficacy of animal models, a technological breakthrough is required. The growing availability and application of the high-throughput methods make systematic comparisons between human and animal models easier to perform. In the present study, we introduce the concept of the comparative systems biology, which we define as "comparisons of biological systems in different states or species used to achieve an integrated understanding of life forms with all their characteristic complexity of interactions at multiple levels". Furthermore, we discuss the applications of RNA-seq and ChIP-seq technologies to comparative systems biology between human and animal models and assess the potential applications for this approach in the future studies.

  14. Modeling and simulation of viscoelastic biological particles' 3D manipulation using atomic force microscopy

    Science.gov (United States)

    Korayem, M. H.; Habibi Sooha, Y.; Rastegar, Z.

    2018-05-01

    Manipulation of biological particles by atomic force microscopy is used to transfer particles into the body's cells, to diagnose and destroy cancer cells, and to deliver drugs to damaged cells. Because this process cannot be observed directly while it takes place, modeling and simulation are essential. The contact between the tip and the biological particle is central to manipulation, so the first modeling step is choosing an appropriate contact model. Most studies of contact between an atomic force microscope tip and biological particles treat the particle as an elastic material. This is not an appropriate assumption, because biological cells are fundamentally soft and the assumption ignores loading history. In this paper, elastic and viscoelastic JKR theories were used to model and simulate 3D manipulation for three motion modes: tip-particle sliding, particle-substrate sliding, and particle-substrate rolling. Results showed that the critical force and time in the motion modes (sliding and rolling) are very close for the elastic and viscoelastic cases, but both are lower in the viscoelastic case. Three friction models, Coulomb, LuGre, and HK, were then applied to the tip-particle sliding mode in the first phase of manipulation to bring the results closer to reality. In both the Coulomb and LuGre models, the critical force and time are very close for the elastic and viscoelastic cases, but in general the HK model predicted a higher critical force and time than LuGre, which in turn predicted higher values than Coulomb.
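For reference, the elastic JKR theory mentioned here has closed-form expressions for the critical (pull-off) force and the contact radius under load; the viscoelastic variant used in the paper additionally depends on loading history and has no such simple closed form. A minimal sketch of the elastic case (symbols w, R, and K denote the work of adhesion, the effective tip-particle radius, and the reduced elastic modulus; the numeric values in the usage note are purely illustrative):

```python
import math

def jkr_pull_off_force(w, R):
    """Elastic JKR critical (pull-off) force: F_c = (3/2) * pi * w * R,
    with w the work of adhesion [J/m^2] and R the effective radius [m].
    Notably, F_c is independent of the elastic moduli."""
    return 1.5 * math.pi * w * R

def jkr_contact_radius(F, w, R, K):
    """Elastic JKR contact radius under external load F [N], where
    K = (4/3) * E_eff is the reduced elastic modulus [Pa]:
    a^3 = (R/K) * (F + 3*pi*w*R + sqrt(6*pi*w*R*F + (3*pi*w*R)^2))."""
    adh = 3.0 * math.pi * w * R
    disc = 6.0 * math.pi * w * R * F + adh ** 2
    if disc < 0.0:
        raise ValueError("load below the pull-off force: contact is lost")
    return ((R / K) * (F + adh + math.sqrt(disc))) ** (1.0 / 3.0)
```

For example, with w = 0.05 J/m^2, R = 50 nm, and K = 1 MPa the contact radius shrinks monotonically as the tensile load approaches -F_c, at which point the square root vanishes and the contact becomes unstable.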

  15. Generalized Fokker-Planck theory for electron and photon transport in biological tissues: application to radiotherapy.

    Science.gov (United States)

    Olbrant, Edgar; Frank, Martin

    2010-12-01

    In this paper, we study a deterministic method for particle transport in biological tissues. The method is specifically developed for dose calculations in cancer therapy and for radiological imaging. Generalized Fokker-Planck (GFP) theory [Leakeas and Larsen, Nucl. Sci. Eng. 137 (2001), pp. 236-250] has been developed to improve on the Fokker-Planck (FP) equation in cases where scattering is forward-peaked but a sufficient amount of large-angle scattering remains. We compare grid-based numerical solutions to FP and GFP in realistic medical applications. First, electron dose calculations in heterogeneous parts of the human body are performed. To this end, accurate electron scattering cross sections are included and their incorporation into our model is described in detail. Second, we solve GFP approximations of the radiative transport equation to investigate reflectance and transmittance of light in biological tissues. All results are compared with either Monte Carlo or discrete-ordinates transport solutions.
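The motivation for GFP over plain FP can be illustrated on the eigenvalues of the scattering operator. For a kernel with Legendre moments f_n, the exact Boltzmann scattering operator acts on degree-n spherical harmonics with eigenvalue sigma_s * (f_n - 1), while the FP operator gives -(sigma_tr / 2) * n * (n + 1); the two agree at n = 1 by construction but diverge at higher n whenever large-angle scattering is present. The sketch below uses a Henyey-Greenstein kernel (f_n = g^n) as an illustrative example, not the electron cross sections of the paper:

```python
import math

def exact_eigenvalue(n, g, sigma_s=1.0):
    """Exact scattering-operator eigenvalue on degree-n spherical
    harmonics for a Henyey-Greenstein kernel (Legendre moments
    f_n = g**n):  lambda_n = sigma_s * (g**n - 1)."""
    return sigma_s * (g ** n - 1.0)

def fp_eigenvalue(n, g, sigma_s=1.0):
    """Fokker-Planck eigenvalue: the FP operator (sigma_tr/2) times the
    Laplace-Beltrami operator acts on degree-n harmonics as
    -(sigma_tr/2) * n * (n + 1), with sigma_tr = sigma_s * (1 - g)."""
    return -0.5 * sigma_s * (1.0 - g) * n * (n + 1)

g = 0.98  # strongly forward-peaked, as in charged-particle transport
# FP reproduces the exact eigenvalue at n = 1 exactly, but overestimates
# the damping of higher harmonics when the kernel retains a large-angle
# tail -- the regime GFP theory is designed to repair.
```

Printing the two sequences for n = 1..8 shows the FP values pulling away from the exact ones, which is the quantitative sense in which plain FP fails for kernels with residual large-angle scattering.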

  16. Realist cinema as world cinema

    OpenAIRE

    Nagib, Lucia

    2017-01-01

    The idea that “realism” is the common denominator across the vast range of productions normally labelled as “world cinema” is widespread and seemingly uncontroversial. Leaving aside oppositional binaries that define world cinema as the other of Hollywood or of classical cinema, this chapter will test the realist premise by locating it in the mode of production. It will define this mode as an ethics that engages filmmakers, at cinema’s creative peaks, with the physical and historical environment,...

  17. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Directory of Open Access Journals (Sweden)

    Afnizanfaizal Abdullah

    Full Text Available The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.

  18. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Science.gov (United States)

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N V

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.
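The hybrid described above combines the Firefly Algorithm's attraction-based moves with a Differential Evolution style mutation and selection step. A minimal sketch of one way such a hybrid can be wired together (the parameter names and the exact update rule are illustrative, not the authors' algorithm):

```python
import math
import random

def hybrid_firefly_de(objective, bounds, n_fireflies=20, n_iter=100,
                      beta0=1.0, gamma=1.0, alpha=0.2, F=0.5, seed=0):
    """Minimize `objective` (e.g. a model-fitting error) over box
    `bounds` with firefly attraction moves plus a DE-style differential
    term and greedy DE-style selection. Illustrative sketch only."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_fireflies)]
    fit = [objective(x) for x in pop]
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fit[j] < fit[i]:  # firefly i is attracted to brighter j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                    a, b = rng.sample(range(n_fireflies), 2)  # DE donor pair
                    cand = [min(max(xi + beta * (xj - xi)            # attraction move
                                    + F * (pa - pb)                  # DE differential term
                                    + alpha * (rng.random() - 0.5),  # random walk
                                    lo), hi)
                            for xi, xj, pa, pb, (lo, hi)
                            in zip(pop[i], pop[j], pop[a], pop[b], bounds)]
                    cf = objective(cand)
                    if cf < fit[i]:  # greedy selection borrowed from DE
                        pop[i], fit[i] = cand, cf
        alpha *= 0.97  # gradually shrink the random step
    best = min(range(n_fireflies), key=fit.__getitem__)
    return pop[best], fit[best]
```

On a simple benchmark such as the 2D sphere function `lambda x: sum(v * v for v in x)` with bounds `[(-5, 5)] * 2`, this sketch typically drives the error well below 0.1 within the default budget; real biological models would supply a simulation-based fitting error as `objective` instead.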

  19. A finite element simulation of biological conversion processes in landfills

    International Nuclear Information System (INIS)

    Robeck, M.; Ricken, T.; Widmann, R.

    2011-01-01

    Landfills are the most common means of waste disposal worldwide. Biological processes convert the organic material into an environmentally harmful landfill gas, which contributes to the greenhouse effect. After waste deposition stops, the conversion processes continue and emissions last for several decades, even up to 100 years and longer. A good prediction of these processes is of high importance for landfill operators as well as for authorities, yet suitable models for a realistic description of landfill processes are scarce. In order to account for the strongly coupled conversion processes, a constitutive three-dimensional model based on the multiphase Theory of Porous Media (TPM) has been developed at the University of Duisburg-Essen. The theoretical formulations are implemented in the finite element code FEAP. With the presented calculation concept we are able to simulate the coupled processes that occur in an actual landfill. The model's theoretical background and the simulation results, together with a successfully completed simulation of a real landfill body, are presented in the following.
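The decades-long emission behaviour noted above is commonly illustrated with lumped first-order kinetics for the degradable organic fraction. The following sketch is purely illustrative of that time scale (it is not the coupled TPM/finite-element model of the paper, and all parameter values are hypothetical):

```python
import math

def landfill_gas_rate(t_years, m_org, k=0.05, L0=100.0):
    """Gas generation rate Q(t) = L0 * m_org * k * exp(-k * t) from
    first-order decay of the degradable organic mass m_org [Mg], with
    decay constant k [1/yr] and ultimate gas yield L0 [m^3/Mg].
    Parameter values are illustrative, not site data."""
    return L0 * m_org * k * math.exp(-k * t_years)
```

With k = 0.05/yr the generation half-life is ln(2)/k, roughly 14 years, and after 100 years the rate has decayed to exp(-5), about 0.7% of its initial value: consistent with emissions that persist for many decades after closure.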

  20. The Biological Big Bang model for the major transitions in evolution.

    Science.gov (United States)

    Koonin, Eugene V

    2007-08-20

    Major transitions in biological evolution show the same pattern of sudden emergence of diverse forms at a new level of complexity. The relationships between major groups within an emergent new class of biological entities are hard to decipher and do not seem to fit the tree pattern that, following Darwin's original proposal, remains the dominant description of biological evolution. The cases in point include the origin of complex RNA molecules and protein folds; major groups of viruses; archaea and bacteria, and the principal lineages within each of these prokaryotic domains; eukaryotic supergroups; and animal phyla. In each of these pivotal nexuses in life's history, the principal "types" seem to appear rapidly and fully equipped with the signature features of the respective new level of biological organization. No intermediate "grades" or intermediate forms between different types are detectable. Usually, this pattern is attributed to cladogenesis compressed in time, combined with the inevitable erosion of the phylogenetic signal. I propose that most or all major evolutionary transitions that show the "explosive" pattern of emergence of new types of biological entities correspond to a boundary between two qualitatively distinct evolutionary phases. The first, inflationary phase is characterized by extremely rapid evolution driven by various processes of genetic information exchange, such as horizontal gene transfer, recombination, fusion, fission, and spread of mobile elements. These processes give rise to a vast diversity of forms from which the main classes of entities at the new level of complexity emerge independently, through a sampling process. In the second phase, evolution dramatically slows down, the respective process of genetic information exchange tapers off, and multiple lineages of the new type of entities emerge, each of them evolving in a tree-like fashion from that point on. This biphasic model of evolution incorporates the previously developed