WorldWideScience

Sample records for modeling quantitative links

  1. QSAR DataBank repository: open and linked qualitative and quantitative structure-activity relationship models.

    Science.gov (United States)

    Ruusmann, V; Sild, S; Maran, U

    2015-01-01

    Structure-activity relationship models have been used to gain insight into chemical and physical processes in biomedicine, toxicology, biotechnology, etc. for almost a century. They have been recognized as valuable tools in decision support workflows for qualitative and quantitative predictions. The main obstacle preventing broader adoption of quantitative structure-activity relationships [(Q)SARs] is that published models are still relatively difficult to discover, retrieve and redeploy in a modern computer-oriented environment. This publication describes a digital repository that makes in silico (Q)SAR-type descriptive and predictive models archivable, citable and usable in a novel way for most common research and applied science purposes. The QSAR DataBank (QsarDB) repository aims to make the processes and outcomes of in silico modelling work transparent, reproducible and accessible. Briefly, the models are represented in the QsarDB data format and stored in a content-aware repository (a.k.a. smart repository). Content awareness has two dimensions. First, models are organized into collections and then into collection hierarchies based on their metadata. Second, the repository is not only an environment for browsing and downloading models (the QDB archive) but also offers integrated services, such as model analysis and visualization and prediction making. The QsarDB repository unlocks the potential of descriptive and predictive in silico (Q)SAR-type models by allowing new and different types of collaboration between model developers and model users. The key enabling factor is the representation of (Q)SAR models in the QsarDB data format, which makes it easy to preserve and share all relevant data, information and knowledge. Model developers can become more productive by effectively reusing prior art. Model users can make more confident decisions by relying on supporting information that is larger and more diverse than before. Furthermore, the smart repository

  2. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes

    NARCIS (Netherlands)

    Xu, L.F.; Henke, M.; Zhu, J.; Kurth, W.; Buck-Sorlin, G.H.

    2011-01-01

    Background and Aims Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a

  3. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    The study of cognitive development hinges largely on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework, applicable to any infant gaze experiment, and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.

  4. Response model parameter linking

    NARCIS (Netherlands)

    Barrett, M.L.D.

    2015-01-01

    With a few exceptions, the problem of linking item response model parameters from different item calibrations has been conceptualized as an instance of the problem of equating observed scores on different test forms. This thesis argues, however, that the use of item response models does not require

  5. Linking Item Response Model Parameters.

    Science.gov (United States)

    van der Linden, Wim J; Barrett, Michelle D

    2016-09-01

    With a few exceptions, the problem of linking item response model parameters from different item calibrations has been conceptualized as an instance of the problem of equating test scores on different test forms. This paper argues, however, that the use of item response models does not require any test score equating. Instead, it involves the necessity of parameter linking due to a fundamental problem inherent in the formal nature of these models: their general lack of identifiability. More specifically, item response model parameters need to be linked to adjust for the different effects of the identifiability restrictions used in separate item calibrations. Our main theorems characterize the formal nature of these linking functions for monotone, continuous response models, derive their specific shapes for different parameterizations of the 3PL model, and show how to identify them from the parameter values of the common items or persons in different linking designs.
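
    The paper's theorems characterize linking functions formally; as a much simpler concrete illustration of what parameter linking does in practice, below is a minimal sketch of the classical mean-sigma method for the 2PL/3PL case. All item-parameter values are invented, and mean-sigma is a standard textbook approximation, not the paper's derivation.

```python
import numpy as np

# Difficulty estimates of the same anchor items from two separate
# calibrations (Form X and Form Y); all values are hypothetical.
b_x = np.array([-1.2, -0.4, 0.3, 0.9, 1.6])
b_y = np.array([-0.9, -0.1, 0.7, 1.3, 2.1])
a_x = np.array([1.1, 0.8, 1.4, 0.9, 1.2])   # Form X discriminations

# Mean-sigma linking: assume the two theta scales differ by a linear
# transform theta_y = A * theta_x + B, which difficulties inherit.
A = b_y.std(ddof=1) / b_x.std(ddof=1)
B = b_y.mean() - A * b_x.mean()

# Place Form X parameters on the Form Y scale: difficulties transform
# linearly, discriminations scale inversely, and the 3PL guessing
# parameter is left untouched by a linear linking.
b_x_on_y = A * b_x + B
a_x_on_y = a_x / A

print(f"A = {A:.3f}, B = {B:.3f}")
print("linked difficulties:", np.round(b_x_on_y, 3))
```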

  6. Building a Database for a Quantitative Model

    Science.gov (United States)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors; less obviously, entering data this way does nothing to link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and the database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
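
    As a sketch of the kind of linkage the abstract describes, the snippet below builds a tiny two-row "database" in which every Basic Event carries a unique key, its data source, a stressing factor, and a conjugate Bayesian update from operating experience. Column names, source strings, and all numbers are invented for illustration; the Gamma-Poisson update stands in for the "Bayesian updating based on flight and testing experience".

```python
import pandas as pd

# Hypothetical mini-database: each Basic Event row records its data
# source and the manipulations applied to the raw failure rate.
db = pd.DataFrame([
    {"event_id": "BE-0001", "source": "Handbook A, p.112",
     "base_rate": 2.0e-6, "stress_factor": 1.5},
    {"event_id": "BE-0002", "source": "Handbook B, table 7",
     "base_rate": 5.0e-7, "stress_factor": 3.0},
])

# Apply the stressing manipulation recorded in the database.
db["used_rate"] = db["base_rate"] * db["stress_factor"]

def gamma_poisson_update(alpha, beta, failures, hours):
    """Posterior mean failure rate, assuming a Gamma(alpha, beta) prior
    on the rate and a Poisson count of failures over 'hours'."""
    return (alpha + failures) / (beta + hours)

# Example: a weak prior centered on BE-0001's used rate, updated with
# one observed failure in 2e5 hours of flight/testing experience.
alpha = 0.5
beta = alpha / db.loc[0, "used_rate"]
posterior = gamma_poisson_update(alpha, beta, failures=1, hours=2.0e5)
print(f"BE-0001 posterior rate: {posterior:.2e} per hour")
```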

  7. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  8. A quantitative description of the microwave properties of melt cast Bi₂Sr₂CaCu₂O₈ in terms of a weak-link model

    Energy Technology Data Exchange (ETDEWEB)

    Godel, G.; Gold, N.; Hasse, J.; Bock, J.; Halbritter, J. [Phys. Inst., Karlsruhe Univ. (Germany)]

    1994-10-01

    The granular structure dominates the RF properties of the material. Below Tc the surface resistance at 11.27 GHz of Bi₂Sr₂CaCu₂O₈ initially drops more slowly than BCS theory predicts. Below Tc/2 it shows a linear temperature dependence and a quadratic frequency and field dependence, with an RF critical magnetic field of <130 A m⁻¹ at 4.2 K. This behaviour is attributed to the existence of weak superconducting regions between crystallites, which provide a strikingly good description. The weak links, with a boundary resistance R_bn, have to be regarded as Josephson junctions with reduced superconducting properties and normal conducting leakage currents. We conclude that the weak-link model gives a consistent description of the DC and microwave properties, not only in the magnitude of the penetration depth and surface resistance but also in their temperature, field and frequency dependence. Conversely, it is possible to obtain from it quantitative information about weak links in the superconductor Bi₂Sr₂CaCu₂O₈. (author)

  9. A quantitative link between recycling and osmium isotopes.

    Science.gov (United States)

    Sobolev, Alexander V; Hofmann, Albrecht W; Brügmann, Gerhard; Batanova, Valentina G; Kuzmin, Dmitry V

    2008-07-25

    Recycled subducted ocean crust has been traced by elevated ¹⁸⁷Os/¹⁸⁸Os in some studies and by high nickel and low manganese contents in others. Here, we show that these tracers are linked for Quaternary lavas of Iceland, strengthening the recycling model. An estimate of the osmium isotopic composition of both the recycled crust and the mantle peridotite implies that Icelandic Quaternary lavas are derived in part from an ancient crustal component with model ages between 1.1 × 10⁹ and 1.8 × 10⁹ years and from a peridotitic end-member close to present-day oceanic mantle.
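
    The "model ages" quoted above come from Re-Os isotope systematics; for orientation, here is a minimal sketch of the standard Re-Os model-age equation with invented isotope ratios (the decay constant is the commonly used value; this is not the paper's actual calculation).

```python
import numpy as np

LAMBDA_RE187 = 1.666e-11  # 187Re decay constant, per year

def os_model_age(os_now, os_initial, re_os):
    """Solve 187Os/188Os(t) = initial + (187Re/188Os) * (exp(lt) - 1)
    for t, the model age in years."""
    return np.log1p((os_now - os_initial) / re_os) / LAMBDA_RE187

# Hypothetical ratios for an ancient recycled-crust component.
t = os_model_age(os_now=0.40, os_initial=0.12, re_os=10.0)
print(f"model age = {t / 1e9:.2f} Gyr")
```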

  10. Mapping protein structural changes by quantitative cross-linking

    Czech Academy of Sciences Publication Activity Database

    Kukačka, Zdeněk; Strohalm, Martin; Kavan, Daniel; Novák, Petr

    2015-01-01

    Vol. 89, NOV 2015 (2015), pp. 112-120 ISSN 1046-2023 R&D Projects: GA MŠk(CZ) EE2.3.20.0055; GA MŠk(CZ) EE2.3.30.0003; GA MŠk(CZ) ED1.1.00/02.0109 Grant - others: OPPC(XE) CZ.2.16/3.1.00/24023 Institutional support: RVO:61388971 Keywords: Chemical cross-linking * Proteolysis * Mass spectrometry Subject RIV: CE - Biochemistry Impact factor: 3.503, year: 2015

  11. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remain challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  12. Quantitative structure - mesothelioma potency model ...

    Science.gov (United States)

    Cancer potencies of mineral and synthetic elongated particle (EP) mixtures, including asbestos fibers, are influenced by changes in fiber dose composition, bioavailability, and biodurability in combination with relevant cytotoxic dose-response relationships. A unique and comprehensive rat intra-pleural (IP) dose characterization data set with a wide variety of EP size, shape, crystallographic, chemical, and bio-durability properties facilitated extensive statistical analyses of 50 rat IP exposure test results for evaluation of alternative dose pleural mesothelioma response models. Utilizing logistic regression, maximum likelihood evaluations of thousands of alternative dose metrics based on hundreds of individual EP dimensional variations within each test sample, four major findings emerged: (1) data for simulations of short-term EP dose changes in vivo (mild acid leaching) provide superior predictions of tumor incidence compared to non-acid-leached data; (2) the sum of the EP surface areas (ΣSA) from these mildly acid-leached samples provides the optimum holistic dose-response model; (3) progressive removal of dose associated with very short and/or thin EPs significantly degrades resultant ΣEP or ΣSA dose-based predictive model fits, as judged by Akaike's Information Criterion (AIC); and (4) alternative, biologically plausible model adjustments provide evidence for reduced potency of EPs with length/width (aspect) ratios ... 80 µm. Regar...

  13. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    on the existence of a quotient construction, allowing a property phi of a parallel system A | B to be transformed into a sufficient and necessary quotient-property phi/A to be satisfied by the component B. Given a model checking problem involving a network Pi_1 | ... | Pi_n and a property phi, the method gradually moves (by quotienting) components Pi_i from the network into the formula phi. Crucial to the success of the method is the ability to manage the size of the intermediate quotient-properties by a suitable collection of efficient minimization heuristics....

  14. On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry

    Science.gov (United States)

    Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri

    2018-02-01

    Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducible cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Thus our results demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.
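
    The replica CVs reported here are ordinary per-feature coefficients of variation; the snippet below shows that computation on invented MS1 peak areas (rows are cross-linked peptide pairs, columns are replica runs), merely to make the reported metric concrete.

```python
import numpy as np

# Hypothetical MS1 peak areas for three cross-linked peptide pairs
# across five replica LC-MS runs.
areas = np.array([
    [1.02e6, 0.95e6, 1.10e6, 0.99e6, 1.05e6],
    [3.40e5, 2.90e5, 3.80e5, 3.10e5, 3.30e5],
    [7.80e4, 9.10e4, 8.30e4, 7.20e4, 8.80e4],
])

# Percent coefficient of variation per cross-linked residue pair.
cv = areas.std(axis=1, ddof=1) / areas.mean(axis=1) * 100
print(np.round(cv, 1))
```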

  15. On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry

    Science.gov (United States)

    Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri

    2017-12-01

    Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducible cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Thus our results demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.

  16. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  17. Link mining models, algorithms, and applications

    CERN Document Server

    Yu, Philip S; Faloutsos, Christos

    2010-01-01

    This book presents in-depth surveys and systematic discussions on models, algorithms and applications for link mining. Link mining is an important field of data mining. Traditional data mining focuses on 'flat' data in which each data object is represented as a fixed-length attribute vector. However, many real-world data sets are much richer in structure, involving objects of multiple types that are related to each other. Hence, link mining has recently become an emerging field of data mining, which has a high impact in various important applications such as text mining and social network analysis...

  18. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made...

  19. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), “University of Defence” of Brno (Czech Republic), and “Pablo de Olavide” University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions; specifically, it aims to understand the driving forces behind social decisions. The second Section focuses on the social and public sphere and is oriented towards recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  20. Molecular determinants of ligand binding modes in the histamine H(4) receptor: linking ligand-based three-dimensional quantitative structure-activity relationship (3D-QSAR) models to in silico guided receptor mutagenesis studies.

    Science.gov (United States)

    Istyastono, Enade P; Nijmeijer, Saskia; Lim, Herman D; van de Stolpe, Andrea; Roumen, Luc; Kooistra, Albert J; Vischer, Henry F; de Esch, Iwan J P; Leurs, Rob; de Graaf, Chris

    2011-12-08

    The histamine H(4) receptor (H(4)R) is a G protein-coupled receptor (GPCR) that plays an important role in inflammation. Similar to the homologous histamine H(3) receptor (H(3)R), two acidic residues in the H(4)R binding pocket, D(3.32) and E(5.46), act as essential hydrogen bond acceptors of positively ionizable hydrogen bond donors in H(4)R ligands. Given the symmetric distribution of these complementary pharmacophore features in H(4)R and its ligands, different alternative ligand binding mode hypotheses have been proposed. The current study focuses on the elucidation of the molecular determinants of H(4)R-ligand binding modes by combining (3D) quantitative structure-activity relationship (QSAR), protein homology modeling, molecular dynamics simulations, and site-directed mutagenesis studies. We have designed and synthesized a series of clobenpropit (N-(4-chlorobenzyl)-S-[3-(4(5)-imidazolyl)propyl]isothiourea) derivatives to investigate H(4)R-ligand interactions and ligand binding orientations. Interestingly, our studies indicate that clobenpropit (2) itself can bind to H(4)R in two distinct binding modes, while the addition of a cyclohexyl group to the clobenpropit isothiourea moiety allows VUF5228 (5) to adopt only one specific binding mode in the H(4)R binding pocket. Our ligand-steered, experimentally supported protein modeling method gives new insights into ligand recognition by H(4)R and can be used as a general approach to elucidate the structure of protein-ligand complexes.

  1. Rational Models for Inflation-Linked Derivatives

    DEFF Research Database (Denmark)

    Dam, Henrik; Macrina, Andrea; Skovmand, David

    2018-01-01

    We construct models for the pricing and risk management of inflation-linked derivatives. The model is rational in the sense that affine payoffs written on the consumer price index have prices that are rational functions of the state variables. The nominal pricing kernel is constructed in a multiplicative manner that allows for closed-form pricing of vanilla inflation products such as zero-coupon swaps, caps and floors, year-on-year swaps, caps and floors, and the exotic limited price index swap. The model retains the attractive features of a nominal multi-curve interest rate model such as closed...

  2. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbance or human perturbation acts upon ecosystems by changing some dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypotheses for interventions. Models become valuable tools for studying and making predictions only when they capture the types of interactions and their magnitude. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this, ecological systems very often remain only partially specified, and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance, but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interactions. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitude explicit in a way that can be used in qualitative analysis is described in this paper and takes advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to the qualitative counterpart. Analyzed through the apparatus of loop analysis, this quantitative model yields predictions that are by no means ambiguous, solving in an elegant way the basic problem of qualitative analysis. The approach adopted in this work is still preliminary, and we must be careful in its application.
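
    The ambiguity the authors describe disappears once link magnitudes are known, because the sign and size of each species' response to a sustained (press) perturbation follow from the inverse of the community matrix. A minimal sketch with an invented three-species community matrix (not the paper's system):

```python
import numpy as np

# Hypothetical community matrix A: entry a_ij is the effect of species
# j on the growth of species i (resource -> consumer -> predator chain).
A = np.array([
    [-0.5, -0.8,  0.0],   # resource: self-damped, consumed
    [ 0.6, -0.1, -0.7],   # consumer: eats resource, is preyed upon
    [ 0.0,  0.5, -0.2],   # predator: eats consumer, self-damped
])

# Sustained (press) increase in the predator's growth rate.
press = np.array([0.0, 0.0, 1.0])

# Equilibrium response dx* = -inv(A) @ press. Qualitative loop analysis
# tries to predict only the signs of this vector; with quantified links
# (e.g. from flow networks) the magnitudes resolve any sign ambiguity.
response = -np.linalg.solve(A, press)
print(np.sign(response), np.round(response, 2))
```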

  3. Link Prediction via Sparse Gaussian Graphical Model

    Directory of Open Access Journals (Sweden)

    Liangliang Zhang

    2016-01-01

    Link prediction is an important task in complex network analysis. Traditional link prediction methods are limited by network topology and a lack of node property information, which makes predicting links challenging. In this study, we address link prediction using a sparse Gaussian graphical model and demonstrate its theoretical and practical effectiveness. In theory, link prediction is executed by estimating the inverse covariance matrix of samples to overcome information limits. The proposed method was evaluated with four small and four large real-world datasets. The experimental results show that the area under the curve (AUC) value obtained by the proposed method improved by an average of 3% and 12.5% on the small and large datasets, respectively, compared to 13 mainstream similarity methods. The method outperforms the baseline method, and its prediction accuracy is superior to mainstream methods when using only 80% of the training set. The method also provides significantly higher AUC values when using only 60% of the training set on the Dolphin and Taro datasets. Furthermore, the error rate of the proposed method demonstrates superior performance with all datasets compared to mainstream methods.
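
    The core computation, estimating a sparse precision (inverse covariance) matrix and reading its nonzero off-diagonal entries as predicted links, can be sketched with scikit-learn's graphical lasso. The data, regularization strength, and planted dependencies below are invented; the paper's feature construction and evaluation protocol are not reproduced here.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Hypothetical samples: rows are observations, columns are nodes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))
X[:, 1] += 0.8 * X[:, 0]   # plant two dependent pairs
X[:, 7] += 0.6 * X[:, 3]

# Sparse inverse covariance estimate (graphical lasso).
P = GraphicalLasso(alpha=0.1).fit(X).precision_

# Rank node pairs by partial-correlation magnitude as link scores.
d = np.sqrt(np.diag(P))
partial = -P / np.outer(d, d)
iu = np.triu_indices_from(partial, k=1)
scores = np.abs(partial[iu])
for k in np.argsort(-scores)[:3]:
    print(f"predicted link ({iu[0][k]}, {iu[1][k]}), score = {scores[k]:.2f}")
```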

  4. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief-type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief-type probability in risk decision making is discussed.
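
    As a concrete toy version of the log-normal machinery: each expert's judgement of an event frequency can be encoded as a median and an error factor, and a classical-style pool formed in log space. The weights, medians, and error factors below are invented, and this simple weighted pooling is only one of many aggregation schemes, not necessarily the report's.

```python
import numpy as np

# Hypothetical judgements of one initiator-event frequency (per year):
# each expert states a median and an error factor EF = 95th pct / median.
medians = np.array([1e-4, 3e-4, 5e-5])
efs     = np.array([3.0, 10.0, 5.0])
weights = np.array([0.5, 0.25, 0.25])   # e.g. calibration-based weights

# Log-normal parameters: mu = ln(median), sigma = ln(EF) / 1.645.
mu    = np.log(medians)
sigma = np.log(efs) / 1.645

# Weighted pooling in log space.
mu_pool    = np.sum(weights * mu)
sigma_pool = np.sqrt(np.sum(weights * sigma**2))

print(f"pooled median = {np.exp(mu_pool):.2e} per year, "
      f"pooled EF = {np.exp(1.645 * sigma_pool):.1f}")
```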

  5. Quantitative Visualization of ChIP-chip Data by Using Linked Views

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Min-Yu; Weber, Gunther; Li, Xiao-Yong; Biggin, Mark; Hamann, Bernd

    2010-11-05

    Most analyses of ChIP-chip in vivo DNA binding have focused on qualitative descriptions of whether genomic regions are bound or not. There is increasing evidence, however, that factors bind in a highly overlapping manner to the same genomic regions and that it is quantitative differences in occupancy on these commonly bound regions that are the critical determinants of the different biological specificities of factors. As a result, it is critical to have a tool that facilitates the quantitative visualization of differences between transcription factors and the genomic regions they bind, in order to understand each factor's unique role in the network. We have developed a framework which combines several visualizations via brushing-and-linking to allow the user to interactively analyze and explore in vivo DNA binding data of multiple transcription factors. We describe these visualization types and also provide a discussion of biological examples in this paper.

  6. Global Quantitative Modeling of Chromatin Factor Interactions

    Science.gov (United States)

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled making various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles: we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896
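
    A common lightweight stand-in for "maximum entropy modeling with regularization-based structure learning" is the pseudo-likelihood approximation: regress each binary factor on all the others with an L1 penalty and read the coefficients as pairwise couplings. The sketch below does exactly that on synthetic presence/absence data; it illustrates the idea only and is not the paper's estimator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic binary matrix: rows = genomic bins, columns = factors.
rng = np.random.default_rng(1)
X = (rng.random((5000, 8)) < 0.3).astype(int)
X[:, 2] = (X[:, 0] & X[:, 1]) | (rng.random(5000) < 0.05)  # dependency

# Pseudo-likelihood fit of a pairwise model: one L1-penalized logistic
# regression per factor; the penalty performs the structure learning.
n = X.shape[1]
J = np.zeros((n, n))
for i in range(n):
    others = np.delete(np.arange(n), i)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(X[:, others], X[:, i])
    J[i, others] = clf.coef_[0]

J = (J + J.T) / 2   # symmetrize the estimated couplings
print(np.round(J[:3, :3], 2))
```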

  7. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  8. Physiologically based quantitative modeling of unihemispheric sleep.

    Science.gov (United States)

    Kedziora, D J; Abeysuriya, R G; Phillips, A J K; Robinson, P A

    2012-12-07

    Unihemispheric sleep has been observed in numerous species, including birds and aquatic mammals. While knowledge of its functional role has been improved in recent years, the physiological mechanisms that generate this behavior remain poorly understood. Here, unihemispheric sleep is simulated using a physiologically based quantitative model of the mammalian ascending arousal system. The model includes mutual inhibition between wake-promoting monoaminergic nuclei (MA) and sleep-promoting ventrolateral preoptic nuclei (VLPO), driven by circadian and homeostatic drives as well as cholinergic and orexinergic input to MA. The model is extended here to incorporate two distinct hemispheres and their interconnections. It is postulated that inhibitory connections between VLPO nuclei in opposite hemispheres are responsible for unihemispheric sleep, and it is shown that contralateral inhibitory connections promote unihemispheric sleep while ipsilateral inhibitory connections promote bihemispheric sleep. The frequency of alternating unihemispheric sleep bouts is chiefly determined by sleep homeostasis and its corresponding time constant. It is shown that the model reproduces dolphin sleep, and that the sleep regimes of humans, cetaceans, and fur seals, the latter both terrestrially and in a marine environment, require only modest changes in contralateral connection strength and homeostatic time constant. It is further demonstrated that fur seals can potentially switch between their terrestrial bihemispheric and aquatic unihemispheric sleep patterns by varying just the contralateral connection strength. These results provide experimentally testable predictions regarding the differences between species that sleep bihemispherically and unihemispherically. Copyright © 2012 Elsevier Ltd. All rights reserved.
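
    To make the postulated circuit concrete, here is an illustrative firing-rate version of two hemispheres with MA-VLPO mutual inhibition and contralateral VLPO-VLPO inhibition. The equations, parameters, and crude sinusoidal drive are placeholders chosen only so the sketch runs; they are not the published model or its constants.

```python
import numpy as np
from scipy.integrate import solve_ivp

def S(x):
    """Saturating firing-rate nonlinearity."""
    return 1.0 / (1.0 + np.exp(-x))

def rhs(t, y, c_contra=1.5):
    m1, v1, m2, v2 = y      # MA and VLPO activity per hemisphere
    drive = 2.5 + 0.2 * np.sin(2 * np.pi * t / 24.0)  # crude sleep drive
    dm1 = -m1 + S(2.0 - 4.0 * v1)                     # MA inhibited by own VLPO
    dv1 = -v1 + S(drive - 3.0 * m1 - c_contra * v2)   # VLPO inhibited by own MA
    dm2 = -m2 + S(2.0 - 4.0 * v2)                     # and by contralateral VLPO
    dv2 = -v2 + S(drive - 3.0 * m2 - c_contra * v1)
    return [dm1, dv1, dm2, dv2]

sol = solve_ivp(rhs, (0.0, 48.0), [0.8, 0.1, 0.2, 0.7], max_step=0.05)
# Strongly asymmetric VLPO activity between the hemispheres at the end
# of the run corresponds to a unihemispheric-sleep-like state.
print(np.round(sol.y[[1, 3], -1], 2))
```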

  9. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  10. Quantitative affinity electrophoresis of RNA-small molecule interactions by cross-linking the ligand to acrylamide.

    Science.gov (United States)

    Boodram, Sherry N; McCann, Lucas C; Organ, Michael G; Johnson, Philip E

    2013-11-15

    We show that the affinity electrophoresis analysis of RNA-small molecule interactions can be made quantifiable by cross-linking the ligand to the gel matrix. Using an RNA-aminoglycoside model system to verify our method, we attached an acryloyl chloride molecule to the aminoglycosides paromomycin and neomycin B to synthesize an acrylamide-aminoglycoside monomer. This molecule was then used as a component in gel polymerization for affinity electrophoresis, covalently attaching an aminoglycoside molecule to the gel matrix. To test RNA binding to the cross-linked aminoglycosides, we used the aminoglycoside binding RNA molecule derived from thymidylate synthase messenger RNA (mRNA) that contains a C-C mismatch. Binding is indicated by the difference in RNA mobility between gels with cross-linked ligand, with ligand embedded during polymerization, and with no ligand present. Critically, the predicted straight line relationship between the reciprocal of the relative migration of the RNA and the ligand concentration is obtained when using cross-linked aminoglycosides, whereas a straight line is not obtained using embedded aminoglycosides. Average apparent dissociation constants are determined from the slope of the line from these plots. This method allows an easy quantitative comparison between different nucleic acid molecules for a small molecule ligand. Copyright © 2013 Elsevier Inc. All rights reserved.
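
    The diagnostic straight line mentioned above can be turned into an apparent dissociation constant with a plain linear fit. The sketch below assumes the common affinity-electrophoresis form 1/R = (1/R0)(1 + [L]/Kd), so that Kd = intercept/slope; the migration data are invented.

```python
import numpy as np

# Hypothetical gel series: cross-linked ligand concentration (uM) and
# the RNA's relative migration R at each concentration.
L = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
R = np.array([1.00, 0.78, 0.64, 0.47, 0.31])

# Assumed linear form: 1/R = 1/R0 + [L] / (R0 * Kd).
slope, intercept = np.polyfit(L, 1.0 / R, 1)
kd = intercept / slope
print(f"apparent Kd = {kd:.1f} uM")
```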

  11. Quantitative Profiling of N-linked Glycosylation Machinery in Yeast Saccharomyces cerevisiae.

    Science.gov (United States)

    Poljak, Kristina; Selevsek, Nathalie; Ngwa, Elsy; Grossmann, Jonas; Losfeld, Marie Estelle; Aebi, Markus

    2018-01-01

    Asparagine-linked glycosylation is a common posttranslational protein modification regulating the structure, stability and function of many proteins. The N-linked glycosylation machinery involves enzymes responsible for the assembly of the lipid-linked oligosaccharide (LLO), which is then transferred to asparagine residues on the polypeptides by the enzyme oligosaccharyltransferase (OST). A major goal in the study of protein glycosylation is to establish quantitative methods for the analysis of the site-specific extent of glycosylation. We developed a sensitive approach to examine glycosylation site occupancy in Saccharomyces cerevisiae by coupling a stable isotope labeling (SILAC) approach to parallel reaction monitoring (PRM) mass spectrometry (MS). We combined the method with genetic tools and validated the approach with the identification of novel glycosylation sites dependent on the Ost3p and Ost6p regulatory subunits of OST. Based on the observations that alterations in LLO substrate structure and OST subunit activity differentially alter the systemic output of OST, we conclude that sequon recognition is a direct property of the catalytic subunit Stt3p, while auxiliary subunits such as Ost3p and Ost6p extend the OST substrate range by modulating interfering pathways such as protein folding. In addition, our proteomics approach revealed a novel regulatory network that connects isoprenoid lipid biosynthesis and LLO substrate assembly. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.

  12. Linking descriptive geology and quantitative machine learning through an ontology of lithological concepts

    Science.gov (United States)

    Klump, J. F.; Huber, R.; Robertson, J.; Cox, S. J. D.; Woodcock, R.

    2014-12-01

    Despite the recent explosion of quantitative geological data, geology remains a fundamentally qualitative science. Numerical data constitute only a certain part of data collection in the geosciences. In many cases, geological observations are compiled as text into reports and annotations on drill cores, thin sections or drawings of outcrops. The observations are classified into concepts such as lithology, stratigraphy, geological structure, etc. These descriptions are semantically rich and are generally supported by more quantitative observations using geochemical analyses, XRD, hyperspectral scanning, etc., but the goal is geological semantics. In practice it has been difficult to bring the different observations together due to differing perception or granularity of classification in human observation, or the partial observation of only some characteristics using quantitative sensors. In past years, many geological classification schemas have been transferred into ontologies and vocabularies, formalized using RDF and OWL, and published through SPARQL endpoints. Several lithological ontologies were compiled by stratigraphy.net and published through a SPARQL endpoint. This work is complemented by the development of a Python API to integrate this vocabulary into Python-based text mining applications. The applications for the lithological vocabulary and Python API are automated semantic tagging of geochemical data and descriptions of drill cores, machine learning of geochemical compositions that are diagnostic for lithological classifications, and text mining for lithological concepts in reports and geological literature. This combination of applications can be used to identify anomalies in databases, where composition and lithological classification do not match. It can also be used to identify lithological concepts in the literature and infer quantitative values. The resulting semantic tagging opens new possibilities for linking these diverse sources of data.
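
    The query pattern for such endpoints is plain SPARQL; the sketch below uses SPARQLWrapper against a hypothetical endpoint URL (the real stratigraphy.net service name and graph layout may differ) to look up lithology concepts by label.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint; substitute the real lithology SPARQL service.
sparql = SPARQLWrapper("http://example.org/lithology/sparql")
sparql.setQuery("""
    PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
    SELECT ?concept ?label WHERE {
        ?concept skos:prefLabel ?label .
        FILTER(CONTAINS(LCASE(STR(?label)), "basalt"))
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["concept"]["value"], "-", row["label"]["value"])
```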

  13. Link model simulation and power penalty specification of the versatile link systems

    CERN Document Server

    Gong, D; Weidberg, A; Vasey, F; Zhu, L; Prosser, A; Liu, C; Xiang, A; Liu, T; Ye, J; Huffman, T

    2011-01-01

    This paper presents simulation and experimental studies of optical power penalties on the Versatile Link, a common R&D project on high-speed optical links for SLHC experiments. The 10 Gigabit Ethernet (10GbE) link model is examined and conservative link power penalties are predicted. We conduct parameter sensitivity analyses and find that the transmitter characteristics affect the link power penalties most. Power penalty differences of multi-mode and single-mode commercial transceiver modules over different fiber lengths are tested to be within the simulation limits. Optical power budgets are then proposed for the different Versatile Link variants.

  14. Toward quantitative modeling of silicon phononic thermocrystals

    Energy Technology Data Exchange (ETDEWEB)

    Lacatena, V. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France); IEMN UMR CNRS 8520, Institut d' Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d' Ascq (France); Haras, M.; Robillard, J.-F., E-mail: jean-francois.robillard@isen.iemn.univ-lille1.fr; Dubois, E. [IEMN UMR CNRS 8520, Institut d' Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d' Ascq (France); Monfray, S.; Skotnicki, T. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France)

    2015-03-16

    The wealth of technological patterning technologies of deca-nanometer resolution brings opportunities to artificially modulate thermal transport properties. A promising example is given by the recent concepts of 'thermocrystals' or 'nanophononic crystals' that introduce regular nano-scale inclusions using a pitch scale in between the thermal phonon mean free path and the electron mean free path. In such structures, the lattice thermal conductivity is reduced by up to two orders of magnitude with respect to its bulk value. Beyond the promise held by these materials to overcome the well-known “electron crystal-phonon glass” dilemma faced in thermoelectrics, the quantitative prediction of their thermal conductivity poses a challenge. This work paves the way toward understanding and designing silicon nanophononic membranes by means of molecular dynamics simulation. Several systems are studied in order to distinguish the shape contributions: bulk, ultra-thin membranes (8 to 15 nm), 2D phononic crystals, and finally 2D phononic membranes. After having discussed the equilibrium properties of these structures from 300 K to 400 K, the Green-Kubo methodology is used to quantify the thermal conductivity. The results account for several experimental trends and models. It is confirmed that the thin-film geometry as well as the phononic structure act towards a reduction of the thermal conductivity. The further decrease in the phononic engineered membrane clearly demonstrates that both phenomena are cumulative. Finally, limitations of the model and further perspectives are discussed.
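
    For reference, the Green-Kubo relation used here expresses the lattice thermal conductivity as the time integral of the equilibrium heat-current autocorrelation; in one common isotropic convention (prefactors vary with the heat-flux definition):

```latex
\kappa \;=\; \frac{1}{3\, V k_{B} T^{2}} \int_{0}^{\infty}
\left\langle \mathbf{J}(0) \cdot \mathbf{J}(t) \right\rangle \, dt
```

    where J is the total heat current of the simulation cell, V its volume, and T the temperature.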

  15. Modeling X-linked ancestral origins in multiparental populations.

    Science.gov (United States)

    Zheng, Chaozhi

    2015-03-04

    The models for the mosaic structure of an individual's genome from multiparental populations have been developed primarily for autosomes, whereas X chromosomes receive very little attention. In this paper, we extend our previous approach to model ancestral origin processes along two X chromosomes in a mapping population, which is necessary for developing hidden Markov models in the reconstruction of ancestry blocks for X-linked quantitative trait locus mapping. The model accounts for the joint recombination pattern, the asymmetry between maternally and paternally derived X chromosomes, and the finiteness of population size. The model can be applied to various mapping populations such as the advanced intercross lines (AIL), the Collaborative Cross (CC), the heterogeneous stock (HS), the Diversity Outcross (DO), and the Drosophila synthetic population resource (DSPR). We further derive the map expansion, the density (per Morgan) of recombination breakpoints, in advanced intercross populations with L inbred founders under the limit of an infinitely large population size. The analytic results show that for X chromosomes the genetic map expands linearly at a rate (per generation) of (2/3)(1 - 10/(9L)) for the AIL, and at a rate of (2/3)(1 - 1/L) for the DO and the HS, whereas for autosomes the map expands at a rate of 1 - 1/L for the AIL, the DO, and the HS. Copyright © 2015 Zheng.
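
    Written out, the quoted asymptotic map-expansion rates per generation are:

```latex
R_{X}^{\mathrm{AIL}} = \tfrac{2}{3}\Bigl(1 - \tfrac{10}{9L}\Bigr), \qquad
R_{X}^{\mathrm{DO/HS}} = \tfrac{2}{3}\Bigl(1 - \tfrac{1}{L}\Bigr), \qquad
R_{\mathrm{autosome}} = 1 - \tfrac{1}{L}.
```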

  16. A statistical model for telecommunication link design

    Science.gov (United States)

    Yuen, J. H.

    1975-01-01

    An evaluation is conducted of the current telecommunication link design technique and a description is presented of an alternative method, called the probability distribution method (PDM), which is free of the disadvantages of the current technique while retaining its advantages. The PDM preserves the simplicity of the design control table (DCT) format. The use of the DCT as a management design control tool is continued. The telecommunication link margin probability density function used presents the probability of achieving any particular value of link performance. It is, therefore, possible to assess the performance risk and other tradeoffs.

  17. A mathematical model for dynamics of occurrence probability of missing links in predicted missing link list

    Directory of Open Access Journals (Sweden)

    WenJun Zhang

    2016-12-01

    In most link prediction methods, all predicted missing links are ranked according to their scores. In the practical application of prediction results, starting from the first link, the one with the highest score in the ranking list, we verify each link one by one through experiments or other means. Nevertheless, how to find an occurrence pattern of true missing links in the ranking list has seldom been reported. In the present study, I proposed a mathematical model for the relationship between the cumulative number of predicted true missing links (y) and the cumulative number of predicted missing links (x): y = K(1 - e^(-rx/K)), where K is the expected total number of true missing links, and r is the intrinsic (maximum) occurrence probability of true missing links. It can be used to predict changes in the occurrence probability of true missing links, assess the effectiveness of a prediction method, and help find the mechanism of link missing in the network. The model was validated by six prediction methods using data on tumor pathways.
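
    Fitting the stated saturation model to a verified ranking list is a one-liner with SciPy; the verification counts below are invented purely to show the mechanics of recovering K and r.

```python
import numpy as np
from scipy.optimize import curve_fit

def saturating(x, K, r):
    """Cumulative true missing links y as a function of cumulative
    verified predictions x: y = K * (1 - exp(-r * x / K))."""
    return K * (1.0 - np.exp(-r * x / K))

# Hypothetical counts from verifying the ranking list top-down.
x = np.array([0, 20, 40, 60, 80, 100, 140, 180, 220], dtype=float)
y = np.array([0, 14, 25, 33, 40, 45, 52, 56, 58], dtype=float)

(K, r), _ = curve_fit(saturating, x, y, p0=[60.0, 1.0])
print(f"K = {K:.0f} expected true links, r = {r:.2f}")
```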

  18. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  19. Model reduction in mathematical pharmacology : Integration, reduction and linking of PBPK and systems biology models.

    Science.gov (United States)

    Snowden, Thomas J; van der Graaf, Piet H; Tindall, Marcus J

    2018-03-26

    In this paper we present a framework for the reduction and linking of physiologically based pharmacokinetic (PBPK) models with models of systems biology to describe the effects of drug administration across multiple scales. To address the issue of model complexity, we propose the reduction of each type of model separately prior to being linked. We highlight the use of balanced truncation in reducing the linear components of PBPK models, whilst proper lumping is shown to be efficient in reducing typically nonlinear systems biology type models. The overall methodology is demonstrated via two example systems: a model of bacterial chemotactic signalling in Escherichia coli and a model of extracellular signal-regulated kinase (ERK) activation mediated via the epidermal growth factor (EGF) and nerve growth factor (NGF) receptor pathways. Each system is tested under the simulated administration of three hypothetical compounds: a strong base, a weak base, and an acid, mirroring the parameterisation of pindolol, midazolam, and thiopental, respectively. Our method can produce up to an 80% decrease in simulation time, allowing substantial speed-up for computationally intensive applications including parameter fitting or agent-based modelling. The approach provides a straightforward means to construct simplified Quantitative Systems Pharmacology models that still provide significant insight into the mechanisms of drug action. Such a framework can potentially bridge pre-clinical and clinical modelling, providing an intermediate level of model granularity between classical, empirical approaches and mechanistic systems describing the molecular scale.
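
    Balanced truncation itself is compact enough to sketch end-to-end: solve the two Lyapunov equations for the Gramians, balance, and keep the states with the largest Hankel singular values. The system below is a random stable state-space model, not a real PBPK model (libraries such as python-control package the same step as a ready-made routine).

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

rng = np.random.default_rng(2)
n, k = 10, 3                      # full and reduced model orders
A = rng.normal(size=(n, n))
A -= (np.abs(np.linalg.eigvals(A).real).max() + 1.0) * np.eye(n)  # make stable
B = rng.normal(size=(n, 1))
C = rng.normal(size=(1, n))

# Controllability/observability Gramians: A P + P A^T + B B^T = 0, etc.
P = solve_continuous_lyapunov(A, -B @ B.T); P = (P + P.T) / 2
Q = solve_continuous_lyapunov(A.T, -C.T @ C); Q = (Q + Q.T) / 2

# Balancing transform from the SVD of the Gramian Cholesky factors.
Lp = cholesky(P, lower=True)
Lq = cholesky(Q, lower=True)
U, s, Vt = svd(Lq.T @ Lp)         # s = Hankel singular values
Sih = np.diag(s ** -0.5)
T = Lp @ Vt.T @ Sih               # balanced -> original coordinates
Tinv = Sih @ U.T @ Lq.T           # original -> balanced coordinates

# Truncate to the k most controllable-and-observable states.
Ar = (Tinv @ A @ T)[:k, :k]
Br = (Tinv @ B)[:k]
Cr = (C @ T)[:, :k]
print("Hankel singular values:", np.round(s, 3))
```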

  20. TraceLink: A model of amnesia and consolidation.

    NARCIS (Netherlands)

    Meeter, M.; Murre, J.M.J.

    2005-01-01

    A connectionist model is presented, the TraceLink model, that implements an autonomous "off-line" consolidation process. The model consists of three subsystems: (1) a trace system (neocortex), (2) a link system (hippocampus and adjacent regions), and (3) a modulatory system (basal forebrain and

  1. Developing Quantitative Models for Auditing Journal Entries

    OpenAIRE

    Argyrou, Argyris

    2013-01-01

    The thesis examines how the auditing of journal entries can detect and prevent financial statement fraud. Financial statement fraud occurs when an intentional act causes financial statements to be materially misstated. Although it is not a new phenomenon, financial statement fraud has attracted much publicity in the wake of numerous cases of financial malfeasance (e.g. ENRON, WorldCom). Existing literature has provided limited empirical evidence on the link between auditing journal entrie...

  2. A Quantitative Software Risk Assessment Model

    Science.gov (United States)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  3. Cross-link guided molecular modeling with ROSETTA.

    Directory of Open Access Journals (Sweden)

    Abdullah Kahraman

    Chemical cross-links identified by mass spectrometry generate distance restraints that reveal low-resolution structural information on proteins and protein complexes. The technology to reliably generate such data has become mature and robust enough to shift the focus to the question of how these distance restraints can be best integrated into molecular modeling calculations. Here, we introduce three workflows for incorporating distance restraints generated by chemical cross-linking and mass spectrometry into ROSETTA protocols for comparative and de novo modeling and protein-protein docking. We demonstrate that the cross-link validation and visualization software Xwalk facilitates successful cross-link data integration. Besides the protocols we introduce XLdb, a database of chemical cross-links from 14 different publications with 506 intra-protein and 62 inter-protein cross-links, where each cross-link can be mapped on an experimental structure from the Protein Data Bank. Finally, we demonstrate on a protein-protein docking reference data set the impact of virtual cross-links on protein docking calculations and show that an inter-protein cross-link can reduce on average the RMSD of a docking prediction by 5.0 Å. The methods and results presented here provide guidelines for the effective integration of chemical cross-link data in molecular modeling calculations and should advance the structural analysis of particularly large and transient protein complexes via hybrid structural biology methods.
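
    The distance-restraint check at the heart of such workflows reduces to measuring residue-residue distances on a structure and flagging violations of the linker's span. A minimal Biopython sketch follows; the file name, chains, residue numbers, and the 30 Å cutoff are hypothetical, and straight-line Calpha-Calpha distance is used as a crude stand-in for the solvent-accessible distance that Xwalk computes.

```python
from Bio.PDB import PDBParser

MAX_SPAN = 30.0   # rough Calpha-Calpha limit for a BS3-like linker (Å)

structure = PDBParser(QUIET=True).get_structure("cmplx", "complex.pdb")
model = structure[0]

# Hypothetical identified cross-links: (chain1, res1, chain2, res2).
crosslinks = [("A", 45, "A", 112), ("A", 45, "B", 17)]

for c1, r1, c2, r2 in crosslinks:
    # Subtracting two Bio.PDB atoms yields their Euclidean distance.
    dist = model[c1][r1]["CA"] - model[c2][r2]["CA"]
    status = "ok" if dist <= MAX_SPAN else "violated"
    print(f"{c1}{r1}-{c2}{r2}: {dist:.1f} A  {status}")
```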

  4. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments...

  5. Linking quantitative microbial risk assessment and epidemiological data: informing safe drinking water trials in developing countries.

    Science.gov (United States)

    Enger, Kyle S; Nelson, Kara L; Clasen, Thomas; Rose, Joan B; Eisenberg, Joseph N S

    2012-05-01

    Intervention trials are used extensively to assess household water treatment (HWT) device efficacy against diarrheal disease in developing countries. Using these data for policy, however, requires addressing issues of generalizability (relevance of one trial in other contexts) and systematic bias associated with design and conduct of a study. To illustrate how quantitative microbial risk assessment (QMRA) can address water safety and health issues, we analyzed a published randomized controlled trial (RCT) of the LifeStraw Family Filter in the Congo. The model accounted for bias due to (1) incomplete compliance with filtration, (2) unexpected antimicrobial activity by the placebo device, and (3) incomplete recall of diarrheal disease. Effectiveness was measured using the longitudinal prevalence ratio (LPR) of reported diarrhea. The Congo RCT observed an LPR of 0.84 (95% CI: 0.61, 1.14). Our model predicted LPRs, assuming a perfect placebo, ranging from 0.50 (2.5-97.5 percentile: 0.33, 0.77) to 0.86 (2.5-97.5 percentile: 0.68, 1.09) for high (but not perfect) and low (but not zero) compliance, respectively. The calibration step provided estimates of the concentrations of three pathogen types (modeled as diarrheagenic E. coli, Giardia, and rotavirus) in drinking water, consistent with the longitudinal prevalence of reported diarrhea measured in the trial, and constrained by epidemiological data from the trial. Use of a QMRA model demonstrated the importance of compliance in HWT efficacy, the need for pathogen data from source waters, the effect of quantifying biases associated with epidemiological data, and the usefulness of generalizing the effectiveness of HWT trials to other contexts. © 2012 American Chemical Society
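
    The compliance effect that dominates this analysis can be illustrated with a toy Monte Carlo: draw daily doses, filter them only on compliant days, and compare predicted illness risk between arms. The dose-response parameter, dose distribution, and filter log reduction below are invented, and the simplified per-day risk stands in for the paper's full QMRA model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_people, n_days = 1000, 180
r = 0.02                      # exponential dose-response parameter
LRV = 6.0                     # filter log10 reduction value
dose_raw = rng.lognormal(mean=1.0, sigma=1.0, size=(n_people, n_days))

def daily_risk(dose):
    """P(infection) per person-day under an exponential model."""
    return 1.0 - np.exp(-r * dose)

for compliance in (1.0, 0.8, 0.5):
    # On non-compliant days the 'treated' group drinks untreated water.
    filtered = np.where(rng.random((n_people, n_days)) < compliance,
                        dose_raw * 10.0 ** (-LRV), dose_raw)
    lpr = daily_risk(filtered).mean() / daily_risk(dose_raw).mean()
    print(f"compliance {compliance:.0%}: predicted risk ratio = {lpr:.2f}")
```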

  6. Quantitative N-linked Glycoproteomics of Myocardial Ischemia and Reperfusion Injury Reveals Early Remodeling in the Extracellular Environment

    DEFF Research Database (Denmark)

    Parker, Benjamin L; Palmisano, Giuseppe; Edwards, Alistair V G

    2011-01-01

    Extracellular and cell surface proteins are generally modified with N-linked glycans, and glycopeptide enrichment is an attractive tool to analyze these proteins. The role of N-linked glycoproteins in cardiovascular disease, particularly ischemia and reperfusion injury, is poorly understood. ... We used isobaric tags for relative and absolute quantitation (iTRAQ), with validation by dimethyl labeling, to analyze changes in glycoproteins from tissue following prolonged ischemia and reperfusion (40 mins ischemia and 20 mins reperfusion) indicative of myocardial infarction. The iTRAQ approach revealed 80 of 437 glycopeptides with altered abundance ... -associated proteins. The data suggest that cardiac remodeling is initiated earlier during reperfusion than previously hypothesized.

  7. A Quantitative Model of Expert Transcription Typing

    Science.gov (United States)

    1993-03-08

    ... (phenomena 1-3), how degradation of the text away from normal prose affects the rate of typing (phenomena 4-6), and patterns of interkey intervals (phenomena 7-11). A more detailed analysis of this phenomenon is based on the work of West and Sabban (1932), who used progressively degraded copy to test ...

  8. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    OpenAIRE

    Cobbs, Gary

    2012-01-01

    Background: Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most pote...

  9. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    The majority of modern software and hardware systems are reactive systems, where input provided by the user (possibly another system) and the output of the system is exchanged continuously throughout the (possibly) indefinite execution of the system. Natural examples include control systems, mobi......, energy consumption, latency, mean-time to failure, and cost. For systems integrated in mass-market products, the ability to quantify trade-offs between performance and robustness, under given technical and economic constraints, is of strategic importance...... by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for reactive systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation......, in terms of a new mathematical basis for systems modeling which can encompass behavioural properties as well as environmental constraints. They continue by pointing out that continuous performance and robustness measures are paramount when dealing with physical resource levels such as clock frequency...

  10. Quantitative trait loci linked to PRNP gene controlling health and production traits in INRA 401 sheep

    Directory of Open Access Journals (Sweden)

    Brunel Jean-Claude

    2007-07-01

    In this study, the potential association of PrP genotypes with health and productive traits was investigated. Data were recorded on animals of the INRA 401 breed from the Bourges-La Sapinière INRA experimental farm. The population consisted of 30 rams and 852 ewes, which produced 1310 lambs. The animals were categorized into three PrP genotype classes: ARR homozygous, ARR heterozygous, and animals without any ARR allele. Two analyses differing in the approach considered were carried out. Firstly, the potential association of the PrP genotype with disease (Salmonella resistance) and production (wool and carcass) traits was studied. The data used included 1042, 1043 and 1013 genotyped animals for the Salmonella resistance, wool and carcass traits, respectively. The different traits were analyzed using an animal model, where the PrP genotype effect was included as a fixed effect. Association analyses do not indicate any evidence of an effect of PrP genotypes on the traits studied in this breed. Secondly, a quantitative trait loci (QTL) detection approach using the PRNP gene as a marker was applied on ovine chromosome 13. Interval mapping was used. Evidence for one QTL affecting mean fiber diameter was found at 25 cM from the PRNP gene. However, a linkage between PRNP and this QTL does not imply unfavorable linkage disequilibrium for PRNP selection purposes.

  11. A general method for targeted quantitative cross-linking mass spectrometry

    Science.gov (United States)

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NM...

  12. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  13. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  14. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six ...
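 
    As a rough illustration of the prediction machinery (not the fitted model from the paper; the transition, emission and initial probabilities below are invented), a two-day-ahead HMM forecast can be sketched as:

        import numpy as np

        # Minimal sketch of the HMM forward step plus a two-day-ahead forecast.
        # Matrices are illustrative stand-ins, not the paper's parameters.
        A = np.array([[0.7, 0.3],      # state transitions (dry/snow)
                      [0.4, 0.6]])
        B = np.array([[0.8, 0.2],      # emissions: P(observation class | state)
                      [0.3, 0.7]])
        pi = np.array([0.6, 0.4])      # initial state distribution

        obs = [0, 1, 1]                # observed meteorological classes
        alpha = pi * B[:, obs[0]]      # forward variable at t = 0
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]

        # Two-day-ahead forecast: propagate through the transition matrix twice.
        state_t2 = (alpha / alpha.sum()) @ np.linalg.matrix_power(A, 2)
        print("P(snow state in two days) =", round(float(state_t2[1]), 3))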

  15. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  16. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six ...

  17. Modelling flexible link manipulators in the three-dimensional space

    Science.gov (United States)

    Kozel, David

    The force/torque relationship between the joints and the end effector of flexible link manipulators is affected not only by the motion of the joints but also by translations and rotations of the links due to their bending and torsion effects. These inaccuracies in the force/torque relationship for flexible manipulators in the three-dimensional space are addressed. This thesis presents the development of a systematic technique for determining the force/torque relationship between the joints and the end effector for flexible link manipulators. The proposed technique accounts for link bending and torsion in the kinematic equations of a flexible manipulator while considering rigid hubs and tools attached to the ends of the links. A procedure for modelling link bending and torsion to within a desired accuracy has also been developed. The resulting model is independent of manipulator configuration. In this technique, vector cross products are used instead of partial derivatives. This reduces inaccuracies which result from approximations made in the kinematic equations when modelling bending and torsion of the links. The magnitude and location of these inaccuracies are characterized. The validity of the proposed technique/procedure and the inaccuracies noted are demonstrated for link bending of a single-link planar manipulator by comparing simulations of the resulting force/torque relationships to results obtained experimentally; however, the experiments were not able to illustrate the torsion effect in the proposed model. Results indicate that the magnitude and location of the errors in the force/torque relationship are dependent upon the rotation due to link deformation, the 'length' of the rest of the manipulator, and the configuration of the manipulator. Although the proposed technique is able to account for the link bending and torsion effects in the kinematic equations of a flexible manipulator, it also suffers from several limitations. These include: translations

  18. Generalized PSF modeling for optimized quantitation in PET imaging

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-ud-Din, Hassan; Karakatsanis, Nicolas A.; Jha, Abhinav K.; Casey, Michael E.; Kadrmas, Dan J.; Rahmim, Arman

    2017-06-01

    Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modelled PSF kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF

  19. Linking spatial and dynamic models for traffic maneuvers

    DEFF Research Database (Denmark)

    Olderog, Ernst-Rüdiger; Ravn, Anders Peter; Wisniewski, Rafal

    2015-01-01

    For traffic maneuvers of multiple vehicles on highways we build an abstract spatial and a concrete dynamic model. In the spatial model we show the safety (collision freedom) of lane-change maneuvers. By linking the spatial and dynamic model via suitable refinements of the spatial atoms to distance...

  20. A novel multilayer model for missing link prediction and future link forecasting in dynamic complex networks

    Science.gov (United States)

    Yasami, Yasser; Safaei, Farshad

    2018-02-01

    The traditional complex network theory is particularly focused on network models in which all network constituents are dealt with equivalently, while failing to consider the supplementary information related to the dynamic properties of the network interactions. This is a main constraint leading to incorrect descriptions of some real-world phenomena or incompletely capturing the details of certain real-life problems. To cope with the problem, this paper addresses the multilayer aspects of dynamic complex networks by analyzing the properties of intrinsically multilayered co-authorship networks, DBLP and Astro Physics, and presenting a novel multilayer model of dynamic complex networks. The model examines the layers' evolution (layer birth/death processes and lifetimes) throughout the network evolution. Particularly, this paper models the evolution of each node's membership in different layers by an Infinite Factorial Hidden Markov Model considering feature cascade, and thereby formulates the link generation process for intra-layer and inter-layer links. Although adjacency matrices are useful to describe traditional single-layer networks, such a representation is not sufficient to describe and analyze multilayer dynamic networks. This paper also extends a generalized mathematical infrastructure to address the problems posed by multilayer complex networks. The model inference is performed using Markov Chain Monte Carlo sampling strategies, given synthetic and real complex network data. Experimental results indicate a tremendous improvement in the performance of the proposed multilayer model in terms of sensitivity, specificity, positive and negative predictive values, positive and negative likelihood ratios, F1-score, Matthews correlation coefficient, and accuracy for two important applications of missing link prediction and future link forecasting. The experimental results also indicate the strong predictive power of the proposed model for the application of
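 
    The evaluation metrics listed in this abstract are all derivable from a confusion matrix; a small Python sketch with made-up counts:

        import math

        # Minimal sketch: the metrics named above, computed from confusion
        # counts of a link-prediction experiment (counts are invented).
        tp, fp, tn, fn = 80, 20, 150, 10

        sensitivity = tp / (tp + fn)                  # true positive rate
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)                          # positive predictive value
        npv = tn / (tn + fn)
        lr_pos = sensitivity / (1 - specificity)      # positive likelihood ratio
        lr_neg = (1 - sensitivity) / specificity
        f1 = 2 * ppv * sensitivity / (ppv + sensitivity)
        mcc = (tp * tn - fp * fn) / math.sqrt(
            (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        accuracy = (tp + tn) / (tp + fp + tn + fn)
        print(f"F1={f1:.3f}  MCC={mcc:.3f}  accuracy={accuracy:.3f}")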

  1. Quantitation of the receptor for urokinase plasminogen activator by enzyme-linked immunosorbent assay

    DEFF Research Database (Denmark)

    Rønne, E; Behrendt, N; Ploug, M

    1994-01-01

    Binding of the urokinase plasminogen activator (uPA) to a specific cell surface receptor (uPAR) plays a crucial role in proteolysis during tissue remodelling and cancer invasion. An immunosorbent assay for the quantitation of uPAR has now been developed. This assay is based on two monoclonal anti...

  2. Modeling of Atmospheric Turbulence Effect on Terrestrial FSO Link

    Directory of Open Access Journals (Sweden)

    A. Prokes

    2009-04-01

    Atmospheric turbulence results in many effects causing fluctuation in the received optical power. Terrestrial laser beam communication is affected above all by scintillations. The paper deals with modeling the influence of scintillation on link performance, using the modified Rytov theory. The probability of correct signal detection in a direct-detection system, as a function of parameters such as link distance, power link margin, refractive-index structure parameter, etc., is discussed, and different approaches to the evaluation of the scintillation effect are compared. The simulations are performed for horizontal-path propagation of a Gaussian-beam wave.

  3. Virtual Models Linked with Physical Components in Construction

    DEFF Research Database (Denmark)

    Sørensen, Kristian Birch

    ...virtual models that thoroughly mirror the performance of the final facility and its construction process. However, the potential of the virtual models in construction has not yet been fully utilised. One way to take more advantage of the virtual models is by digitally linking them with the physical components in the construction process and thereby improving the information handling. The present PhD project has examined the potential of establishing such a digital link between virtual models and physical components in construction. This is done by integrating knowledge of civil engineering, software......) project progress management, and 3) in operation and maintenance. Experiments and implementations in real life projects showed that mobile technology and passive RFID technology delineate efficient and practically implementable ways to establish the digital links in construction and are ready for use......

  4. Quantitative modelling in design and operation of food supply systems

    NARCIS (Netherlands)

    Beek, van P.

    2004-01-01

    During the last two decades food supply systems not only got interest of food technologists but also from the field of Operations Research and Management Science. Operations Research (OR) is concerned with quantitative modelling and can be used to get insight into the optimal configuration and

  5. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  6. Link Community Detection Using Generative Model and Nonnegative Matrix Factorization

    Science.gov (United States)

    He, Dongxiao; Jin, Di; Baquero, Carlos; Liu, Dayou

    2014-01-01

    Discovery of communities in complex networks is a fundamental data analysis problem with applications in various domains. While most of the existing approaches have focused on discovering communities of nodes, recent studies have shown the advantages and uses of link community discovery in networks. Generative models provide a promising class of techniques for the identification of modular structures in networks, but most generative models mainly focus on the detection of node communities rather than link communities. In this work, we propose a generative model, which is based on the importance of each node when forming links in each community, to describe the structure of link communities. We proceed to fit the model parameters by taking it as an optimization problem, and solve it using nonnegative matrix factorization. Thereafter, in order to automatically determine the number of communities, we extend the above method by introducing a strategy of iterative bipartition. This extended method not only finds the number of communities all by itself, but also obtains high efficiency, and thus it is more suitable to deal with large and unexplored real networks. We test this approach on both synthetic benchmarks and real-world networks including an application on a large biological network, and compare it with two highly related methods. Results demonstrate the superior performance of our approach over competing methods for the detection of link communities. PMID:24489803
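 
    The optimization step the abstract describes, fitting model parameters via nonnegative matrix factorization, can be sketched generically (plain Lee-Seung NMF on a random symmetric adjacency matrix; this is not the authors' link-community objective):

        import numpy as np

        # Minimal sketch of NMF with multiplicative updates, factorizing an
        # adjacency matrix A ~ W @ H under the Frobenius-norm objective.
        rng = np.random.default_rng(0)
        A = rng.integers(0, 2, size=(30, 30)).astype(float)
        A = np.triu(A, 1); A = A + A.T                # symmetric 0/1 adjacency

        k = 4                                          # assumed community count
        W = rng.random((30, k)); H = rng.random((k, 30))
        eps = 1e-9
        for _ in range(200):                           # Lee-Seung updates
            H *= (W.T @ A) / (W.T @ W @ H + eps)
            W *= (A @ H.T) / (W @ H @ H.T + eps)

        community = W.argmax(axis=1)                   # crude node-level readout
        print(community)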

  7. Link Prediction in Weighted Networks: A Weighted Mutual Information Model.

    Directory of Open Access Journals (Sweden)

    Boyao Zhu

    The link-prediction problem is an open issue in data mining and knowledge discovery, which attracts researchers from disparate scientific communities. A wealth of methods have been proposed to deal with this problem. Among these approaches, most are applied in unweighted networks, with only a few taking the weights of links into consideration. In this paper, we present a weighted model for undirected and weighted networks based on the mutual information of local network structures, where link weights are applied to further enhance the distinguishable extent of candidate links. Empirical experiments are conducted on four weighted networks, and results show that the proposed method can provide more accurate predictions than not only traditional unweighted indices but also typical weighted indices. Furthermore, some in-depth discussions on the effects of weak ties in link prediction as well as the potential to predict link weights are also given. This work may shed light on the design of algorithms for link prediction in weighted networks.
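 
    As a sketch of what a weighted index looks like, here is a weighted common-neighbours score, shown only to illustrate the idea of using link weights; it is not the paper's mutual-information model, and the toy graph is invented:

        import networkx as nx

        # Minimal sketch: a shared neighbour z contributes the weights of both
        # of its links to the score of the candidate pair (x, y).
        G = nx.Graph()
        G.add_weighted_edges_from([("a", "b", 2.0), ("a", "c", 1.0),
                                   ("b", "c", 3.0), ("c", "d", 0.5)])

        def weighted_common_neighbours(G, x, y):
            score = 0.0
            for z in set(G[x]) & set(G[y]):
                score += G[x][z]["weight"] + G[y][z]["weight"]
            return score

        print(weighted_common_neighbours(G, "a", "d"))  # via shared neighbour c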

  8. Linking knowledge and action through mental models of sustainable agriculture.

    Science.gov (United States)

    Hoffman, Matthew; Lubell, Mark; Hillis, Vicken

    2014-09-09

    Linking knowledge to action requires understanding how decision-makers conceptualize sustainability. This paper empirically analyzes farmer "mental models" of sustainability from three winegrape-growing regions of California where local extension programs have focused on sustainable agriculture. The mental models are represented as networks where sustainability concepts are nodes, and links are established when a farmer mentions two concepts in their stated definition of sustainability. The results suggest that winegrape grower mental models of sustainability are hierarchically structured, relatively similar across regions, and strongly linked to participation in extension programs and adoption of sustainable farm practices. We discuss the implications of our findings for the debate over the meaning of sustainability, and the role of local extension programs in managing knowledge systems.

  9. Quantitative and logic modelling of gene and molecular networks

    Science.gov (United States)

    Le Novère, Nicolas

    2015-01-01

    Behaviours of complex biomolecular systems are often irreducible to the elementary properties of their individual components. Explanatory and predictive mathematical models are therefore useful for fully understanding and precisely engineering cellular functions. The development and analyses of these models require their adaptation to the problems that need to be solved and the type and amount of available genetic or molecular data. Quantitative and logic modelling are among the main methods currently used to model molecular and gene networks. Each approach comes with inherent advantages and weaknesses. Recent developments show that hybrid approaches will become essential for further progress in synthetic biology and in the development of virtual organisms. PMID:25645874

  10. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior research advocated integrating business rules into business process models to improve the effectiveness of important organizational activities, such as developing shared understanding, effective communication, and process improvement. However, whether such integrated modeling can improve the understanding of business processes has not been empirically evaluated. In this paper, we report on an experiment that investigates the effect of linked rules, a specific rule integration approach, on business process model understanding. Our results indicate that linked rules are associated with better time efficiency......

  11. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    Science.gov (United States)

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

    This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement, and provide quantitative guidance for the design of safer railway systems' speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.

  12. Sex chromosome linked genetic variance and the evolution of sexual dimorphism of quantitative traits.

    Science.gov (United States)

    Husby, Arild; Schielzeth, Holger; Forstmeier, Wolfgang; Gustafsson, Lars; Qvarnström, Anna

    2013-03-01

    Theory predicts that sex chromosome linkage should reduce intersexual genetic correlations, thereby allowing the evolution of sexual dimorphism. Empirical evidence for sex linkage has come largely from crosses, and few studies have examined how sexual dimorphism and sex linkage are related within outbred populations. Here, we use data on an array of different traits measured on over 10,000 individuals from two pedigreed populations of birds (collared flycatcher and zebra finch) to estimate the amount of sex-linked genetic variance (h(2)Z). Of 17 traits examined, eight showed a nonzero h(2)Z estimate but only four were significantly different from zero (wing patch size and tarsus length in collared flycatchers, wing length and beak color in zebra finches). We further tested how sexual dimorphism and the mode of selection operating on the trait relate to the proportion of sex-linked genetic variance. Sexually selected traits did not show higher h(2)Z than morphological traits and there was only a weak positive relationship between h(2)Z and sexual dimorphism. However, given the relative scarcity of empirical studies, it is premature to make conclusions about the role of sex chromosome linkage in the evolution of sexual dimorphism. © 2012 The Author(s). Evolution © 2012 The Society for the Study of Evolution.

  13. Analysis of sensory ratings data with cumulative link models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Brockhoff, Per B.

    2013-01-01

    Examples of categorical rating scales include discrete preference, liking and hedonic rating scales. Data obtained on these scales are often analyzed with normal linear regression methods or with omnibus Pearson chi2 tests. In this paper we propose to use cumulative link models that allow for reg...

  14. Quantitative comparison of canopy conductance models using a Bayesian approach

    Science.gov (United States)

    Samanta, S.; Clayton, M. K.; Mackay, D. S.; Kruger, E. L.; Ewers, B. E.

    2008-09-01

    A quantitative model comparison methodology based on deviance information criterion, a Bayesian measure of the trade-off between model complexity and goodness of fit, is developed and demonstrated by comparing semiempirical transpiration models. This methodology accounts for parameter and prediction uncertainties associated with such models and facilitates objective selection of the simplest model, out of available alternatives, which does not significantly compromise the ability to accurately model observations. We use this methodology to compare various Jarvis canopy conductance model configurations, embedded within a larger transpiration model, against canopy transpiration measured by sap flux. The results indicate that descriptions of the dependence of stomatal conductance on vapor pressure deficit, photosynthetic radiation, and temperature, as well as the gradual variation in canopy conductance through the season are essential in the transpiration model. Use of soil moisture was moderately significant, but only when used with a hyperbolic vapor pressure deficit relationship. Subtle differences in model quality could be clearly associated with small structural changes through the use of this methodology. The results also indicate that increments in model complexity are not always accompanied by improvements in model quality and that such improvements are conditional on model structure. Possible application of this methodology to compare complex semiempirical models of natural systems in general is also discussed.
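 
    The deviance information criterion at the heart of this methodology can be sketched as follows (a toy Gaussian likelihood and mock posterior draws stand in for the transpiration model; DIC = D(theta_hat) + 2 p_D, with p_D = D_bar - D(theta_hat)):

        import numpy as np

        # Minimal sketch of computing DIC from posterior samples.
        rng = np.random.default_rng(1)
        data = rng.normal(2.0, 1.0, size=50)
        draws = rng.normal(2.0, 0.1, size=1000)     # mock posterior draws of mu

        def deviance(mu):
            resid = data - mu
            return -2 * np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * resid**2)

        d_bar = np.mean([deviance(m) for m in draws])  # posterior mean deviance
        d_hat = deviance(draws.mean())                 # deviance at posterior mean
        p_d = d_bar - d_hat                            # effective parameter count
        print("DIC =", d_hat + 2 * p_d)                # lower is better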

  15. A Review on Quantitative Models for Sustainable Food Logistics Management

    Directory of Open Access Journals (Sweden)

    M. Soysal

    2012-12-01

    Over the last two decades, food logistics systems have seen the transition from a focus on traditional supply chain management to food supply chain management and, successively, to sustainable food supply chain management. The main aim of this study is to identify the key logistical aims in these three phases and analyse currently available quantitative models to point out modelling challenges in sustainable food logistics management (SFLM). A literature review on quantitative studies is conducted, and qualitative studies are also consulted to understand the key logistical aims more clearly and to identify relevant system scope issues. Results show that research on SFLM has been progressively developing according to the needs of the food industry. However, the intrinsic characteristics of food products and processes have not yet been handled properly in the identified studies. The majority of the works reviewed have not contemplated sustainability problems, apart from a few recent studies. Therefore, the study concludes that new and advanced quantitative models are needed that take specific SFLM requirements from practice into consideration to support business decisions and capture food supply chain dynamics.

  16. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Background: Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible but the collective contribution of all loci is usually significant. Genome selection that uses markers of the entire genome to predict the genomic values of individual plants or animals can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is contributed by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results: In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions: This study provided an excellent example for the application of genome selection to plant breeding.
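 
    A minimal sketch of the epistatic prediction idea: augment marker main effects with pairwise interaction terms in a penalized regression (simulated data, not the soybean lines; the ridge penalty and effect sizes are assumptions):

        import numpy as np
        from itertools import combinations
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        n, m = 126, 20                                 # lines x markers (toy size)
        X = rng.choice([-1.0, 1.0], size=(n, m))       # inbred-line marker codes
        pairs = list(combinations(range(m), 2))
        X_epi = np.column_stack([X] + [X[:, i] * X[:, j] for i, j in pairs])

        beta = rng.normal(0, 0.3, X_epi.shape[1])      # simulated genetic effects
        y = X_epi @ beta + rng.normal(0, 1.0, n)       # phenotype, e.g. embryo count

        for name, Z in [("additive only", X), ("additive + epistatic", X_epi)]:
            r2 = cross_val_score(Ridge(alpha=10.0), Z, y, cv=5, scoring="r2").mean()
            print(f"{name}: cross-validated R^2 = {r2:.2f}")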

  17. A Transformative Model for Undergraduate Quantitative Biology Education

    Science.gov (United States)

    Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions. PMID:20810949

  18. A transformative model for undergraduate quantitative biology education.

    Science.gov (United States)

    Usher, David C; Driscoll, Tobin A; Dhurjati, Prasad; Pelesko, John A; Rossi, Louis F; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions.

  19. Assessment of the link between quantitative biexponential diffusion-weighted imaging and contrast-enhanced MRI in the liver

    NARCIS (Netherlands)

    Dijkstra, Hildebrand; Oudkerk, Matthijs; Kappert, Peter; Sijens, Paul E.

    Purpose: To investigate if intravoxel incoherent motion (IVIM) modeled diffusion-weighted imaging (DWI) can be linked to contrast-enhanced (CE-)MRI in liver parenchyma and liver lesions. Methods: Twenty-five patients underwent IVIM-DWI followed by multiphase CE-MRI using Gd-EOB-DTPA (n = 20) or

  20. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods, it is still difficult for software and system architects to integrate these techniques into their everyday work. This is mainly due to the lack of methods that can be directly applied to architecture-level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  1. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s−1 air velocity. The maximum power is 3.4 W, the power conversion factor from kinetic to electric energy is cp = 0.15. The v3 power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively.
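 
    The reported conversion factor can be checked with a back-of-envelope calculation, cp = P / (0.5 ρ A v³), assuming a standard air density of about 1.2 kg m−3 (the density is an assumption; the other numbers come from the abstract):

        import math

        rho = 1.2                       # assumed air density, kg m^-3
        d = 0.12                        # rotor diameter, m
        v = 15.0                        # air velocity, m s^-1
        p_out = 3.4                     # measured electric power, W

        area = math.pi * (d / 2) ** 2
        p_kinetic = 0.5 * rho * area * v ** 3   # kinetic power through rotor disk
        print(f"cp = {p_out / p_kinetic:.2f}")  # ~0.15, matching the abstract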

  2. Towards Quantitative Systems Pharmacology Models of Chemotherapy-Induced Neutropenia.

    Science.gov (United States)

    Craig, M

    2017-05-01

    Neutropenia is a serious toxic complication of chemotherapeutic treatment. For years, mathematical models have been developed to better predict hematological outcomes during chemotherapy in both the traditional pharmaceutical sciences and mathematical biology disciplines. An increasing number of quantitative systems pharmacology (QSP) models that combine systems approaches, physiology, and pharmacokinetics/pharmacodynamics have been successfully developed. Here, I detail the shift towards QSP efforts, emphasizing the importance of incorporating systems-level physiological considerations in pharmacometrics. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  3. Quantitative analysis of a wind energy conversion model

    Science.gov (United States)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-03-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s-1 air velocity. The maximum power is 3.4 W, the power conversion factor from kinetic to electric energy is cp = 0.15. The v3 power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively.

  4. Frequency-Domain Response Analysis for Quantitative Systems Pharmacology Models.

    Science.gov (United States)

    Schulthess, Pascal; Post, Teun M; Yates, James; van der Graaf, Piet H

    2017-11-28

    Drug dosing regimen can significantly impact drug effect and, thus, the success of treatments. Nevertheless, trial and error is still the most commonly used method by conventional pharmacometric approaches to optimize dosing regimen. In this tutorial, we utilize four distinct classes of quantitative systems pharmacology models to introduce frequency-domain response analysis, a method widely used in electrical and control engineering that allows the analytical optimization of drug treatment regimen from the dynamics of the model. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
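 
    The core idea, inspecting the gain |G(jω)| of a model's transfer function across input frequencies, can be sketched on a one-compartment toy model dx/dt = -k·x + u(t), i.e. G(s) = 1/(s + k) (the rate constant is an assumption; the tutorial's QSP models are far richer):

        import numpy as np

        k = 0.1                                        # assumed elimination rate, 1/h
        omega = np.logspace(-3, 1, 200)                # dosing frequency, rad/h
        gain = 1.0 / np.abs(1j * omega + k)            # |G(j*omega)|

        # Slow dosing rhythms (omega << k) pass through; fast ones are attenuated.
        print(f"gain at omega={omega[0]:.3f} rad/h: {gain[0]:.2f}")
        print(f"gain at omega={omega[-1]:.1f} rad/h: {gain[-1]:.3f}")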

  5. Linking Adverse Outcome Pathways to Dynamic Energy Budgets: A Conceptual Model

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Cheryl [Michigan State University, East Lansing]; Nisbet, Roger [University of California, Santa Barbara]; Antczak, Philipp [University of Liverpool, UK]; Reyero, Natalia [Army Corps of Engineers, Vicksburg]; Gergs, Andre [Gaiac]; Lika, Dina [University of Crete]; Mathews, Teresa J. [ORNL]; Muller, Eric [University of California, Santa Barbara]; Nacci, Dianne [U.S. Environmental Protection Agency (EPA)]; Peace, Angela L. [ORNL]; Remien, Chris [University of Idaho]; Schulz, Irv [Pacific Northwest National Laboratory (PNNL)]; Watanabe, Karen [Arizona State University]

    2018-02-01

    Ecological risk assessment quantifies the likelihood of undesirable impacts of stressors, primarily at high levels of biological organization. Data used to inform ecological risk assessments come primarily from tests on individual organisms or from suborganismal studies, indicating a disconnect between primary data and protection goals. We know how to relate individual responses to population dynamics using individual-based models, and there are emerging ideas on how to make connections to ecosystem services. However, there is no established methodology to connect effects seen at higher levels of biological organization with suborganismal dynamics, despite progress made in identifying Adverse Outcome Pathways (AOPs) that link molecular initiating events to ecologically relevant key events. This chapter is a product of a working group at the National Institute for Mathematical and Biological Synthesis (NIMBioS) that assessed the feasibility of using dynamic energy budget (DEB) models of individual organisms as a “pivot” connecting suborganismal processes to higher level ecological processes. AOP models quantify explicit molecular, cellular or organ-level processes, but do not offer a route to linking sub-organismal damage to adverse effects on individual growth, reproduction, and survival, which can be propagated to the population level through individual-based models. DEB models describe these processes, but use abstract variables with undetermined connections to suborganismal biology. We propose linking DEB and quantitative AOP models by interpreting AOP key events as measures of damage-inducing processes in a DEB model. Here, we present a conceptual model for linking AOPs to DEB models and review existing modeling tools available for both AOP and DEB.

  6. Numerical linked-cluster approach to quantum lattice models.

    Science.gov (United States)

    Rigol, Marcos; Bryant, Tyler; Singh, Rajiv R P

    2006-11-03

    We present a novel algorithm that allows one to obtain temperature dependent properties of quantum lattice models in the thermodynamic limit from exact diagonalization of small clusters. Our numerical linked-cluster approach provides a systematic framework to assess finite-size effects and is valid for any quantum lattice model. Unlike high temperature expansions, which have a finite radius of convergence in inverse temperature, these calculations are accurate at all temperatures provided the range of correlations is finite. We illustrate the power of our approach studying spin models on kagomé, triangular, and square lattices.

  7. Quantitative insight into models of Hedgehog signal transduction.

    Science.gov (United States)

    Farzan, Shohreh F; Ogden, Stacey K; Robbins, David J

    2010-01-01

    The Hedgehog (Hh) signaling pathway is an essential regulator of embryonic development and a key factor in carcinogenesis.(1,2) Hh, a secreted morphogen, activates intracellular signaling events via downstream effector proteins, which translate the signal to regulate target gene transcription.(3,4) In a recent publication, we quantitatively compared two commonly accepted models of Hh signal transduction.(5) Each model requires a different ratio of signaling components to be feasible. Thus, we hypothesized that knowing the steady-state ratio of core signaling components might allow us to distinguish between models. We reported vast differences in the molar concentrations of endogenous effectors of Hh signaling, with Smo present in limiting concentrations.(5) This extra view summarizes the implications of this endogenous ratio in relation to current models of Hh signaling and places our results in the context of recent work describing the involvement of the guanine nucleotide binding protein Gαi and Cos2 motility.

  8. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    Science.gov (United States)

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Development of a quantitative sandwich enzyme-linked immunosorbent assay for detecting the MPT64 antigen of Mycobacterium tuberculosis.

    Science.gov (United States)

    Ji, Mijung; Cho, Byungki; Cho, Young Shik; Park, Song-Yong; Cho, Sang-Nae; Jeon, Bo-Young; Yoon, Byoung-Su

    2014-05-01

    Tuberculosis (TB) is a major infectious disease and is responsible for two million deaths annually. For the identification and quantitation of Mycobacterium tuberculosis (M. tuberculosis), a causative agent of TB, a sandwich enzyme-linked immunosorbent assay (ELISA) against the MPT64 protein of M. tuberculosis, an antigen marker of the M. tuberculosis complex, was developed. The MPT64 protein was expressed, and anti-MPT64 monoclonal antibodies were prepared. A sandwich ELISA was established using recombinant MPT64 protein and anti-MPT64 monoclonal antibodies. The sandwich MPT64 ELISA was evaluated using reference and clinical mycobacterial strains. The sandwich MPT64 ELISA detected MPT64 protein from 2.1 ng/mL to 250 ng/mL (equivalent to 1.7×10⁴ CFU/mL and 2.0×10⁶ CFU/mL). All 389 clinical M. tuberculosis isolates tested positive in the sandwich MPT64 ELISA (sensitivity, 100%), and the assay showed no cross reactivity to any tested nontuberculous mycobacterial strain (specificity, 100%). The sandwich MPT64 ELISA is a highly sensitive and quantitative test for MPT64 protein, which can identify M. tuberculosis.
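 
    Reading sample concentrations off an ELISA standard curve can be sketched as follows (the optical density values are hypothetical; real assays typically fit a four-parameter logistic rather than interpolating, and only the 2.1-250 ng/mL range is taken from the abstract):

        import numpy as np

        # Minimal sketch: log-linear interpolation of optical density (OD)
        # against known MPT64 standards within the assay's working range.
        std_conc = np.array([2.1, 7.8, 31.2, 62.5, 125.0, 250.0])   # ng/mL
        std_od   = np.array([0.08, 0.21, 0.65, 1.10, 1.70, 2.30])   # hypothetical

        def od_to_conc(od):
            # interpolate in log-concentration space
            return float(np.exp(np.interp(od, std_od, np.log(std_conc))))

        print(f"sample OD 0.9 -> {od_to_conc(0.9):.1f} ng/mL")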

  10. A cell-based model system links chromothripsis with hyperploidy

    DEFF Research Database (Denmark)

    Mardin, Balca R; Drainas, Alexandros P; Waszak, Sebastian M

    2015-01-01

    A remarkable observation emerging from recent cancer genome analyses is the identification of chromothripsis as a one-off genomic catastrophe, resulting in massive somatic DNA structural rearrangements (SRs). Largely due to lack of suitable model systems, the mechanistic basis of chromothripsis h...... in hyperploid cells. Analysis of primary medulloblastoma cancer genomes verified the link between hyperploidy and chromothripsis in vivo. CAST provides the foundation for mechanistic dissection of complex DNA rearrangement processes....

  11. Continuous Modeling of a Multi-Link Flexible Transmission

    Directory of Open Access Journals (Sweden)

    Irit Peled

    2008-01-01

    The problem of dynamic, infinite-dimension modeling of a transmission is considered. An accurate Laplace transfer function matrix of the system, which consists of flexible shafts connected by gears that are either rigid or flexible, is found. The first step is deriving a set of single-input, infinite-dimension transfer functions for a single uniform link. The building blocks of those transfer functions are time delays, representing the wave motion, and low-order rational expressions, representing the boundary phenomena. The next step is combining these individual transfer functions into an overall model of the transmission, by means of the link reaction approach, which makes use of the geometric relationships and reaction moments between neighboring links. The outcome is a generalized dynamic model with the moments in the gear pairs as the generalized state vector. The explicit and highly structured form of the transfer functions allows physical insight into the system, exact calculation of natural frequencies, and the construction of exact simulation schemes built from standard blocks that are available in multi-purpose simulation software.

  12. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment is an advance of modern biotechnology that has been used successfully in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment and must be administered continuously and without interruption. Enzyme replacement therapy with Cerezyme (Genzyme) was formally introduced in Bulgaria in 2001, but was subsequently interrupted for 1-2 months at a time, and the doses given to patients were not optimal. The aim of our work is to find a mathematical model for quantitative evaluation of ERT in Gaucher disease. The model was implemented in the software package "Statistika 6" using the individual data of 5-year-old children having Gaucher disease treated with Cerezyme. The output of the model allows quantitative evaluation of the individual trends in each child's disease course and their correlations. On the basis of these results, we might recommend suitable changes in ERT.

  13. Quantitative aspects and dynamic modelling of glucosinolate metabolism

    DEFF Research Database (Denmark)

    Vik, Daniel

    Advancements in 'omics technologies now allow acquisition of enormous amounts of quantitative information about biomolecules. This has led to the emergence of new scientific sub-disciplines, e.g. computational, systems and 'quantitative' biology. These disciplines examine complex biological behaviour through computational and mathematical approaches and have resulted in substantial insights and advances in molecular biology and physiology. Capitalizing on the accumulated knowledge and data, it is possible to construct dynamic models of complex biological systems, thereby initiating the so...... and ecologically important glucosinolate (GLS) compounds of cruciferous plants – including the model plant Arabidopsis thaliana – have been studied extensively with regards to their biosynthesis and degradation. However, efforts to construct a dynamic model unifying the regulatory aspects have not been made......

  14. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  15. Quantifying Zika: Advancing the Epidemiology of Zika With Quantitative Models.

    Science.gov (United States)

    Keegan, Lindsay T; Lessler, Justin; Johansson, Michael A

    2017-12-16

    When Zika virus (ZIKV) emerged in the Americas, little was known about its biology, pathogenesis, and transmission potential, and the scope of the epidemic was largely hidden, owing to generally mild infections and no established surveillance systems. Surges in congenital defects and Guillain-Barré syndrome alerted the world to the danger of ZIKV. In the context of limited data, quantitative models were critical in reducing uncertainties and guiding the global ZIKV response. Here, we review some of the models used to assess the risk of ZIKV-associated severe outcomes, the potential speed and size of ZIKV epidemics, and the geographic distribution of ZIKV risk. These models provide important insights and highlight significant unresolved questions related to ZIKV and other emerging pathogens. Published by Oxford University Press for the Infectious Diseases Society of America 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  16. Modeling online social networks based on preferential linking

    International Nuclear Information System (INIS)

    Hu Hai-Bo; Chen Jun; Guo Jin-Li

    2012-01-01

    We study the phenomena of preferential linking in a large-scale evolving online social network and find that the linear preference holds for preferential creation, preferential acceptance, and preferential attachment. Based on the linear preference, we propose an analyzable model, which illustrates the mechanism of network growth and reproduces the process of network evolution. Our simulations demonstrate that the degree distribution of the network produced by the model is in good agreement with that of the real network. This work provides a possible bridge between the micro-mechanisms of network growth and the macrostructures of online social networks
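 
    The linear-preference growth mechanism can be sketched with the classic edge-list trick, where sampling a uniform endpoint from the list of edge endpoints is equivalent to degree-proportional attachment (a single-mechanism version only, not the paper's distinct creation/acceptance/attachment processes):

        import random
        from collections import Counter

        # Minimal sketch: each new node links to an existing node with
        # probability proportional to its degree.
        random.seed(0)
        targets = [0, 1]                 # one edge between nodes 0 and 1
        degree = Counter({0: 1, 1: 1})

        for new in range(2, 10000):
            old = random.choice(targets)           # degree-proportional sampling
            degree[new] += 1; degree[old] += 1
            targets += [new, old]                  # keep list degree-weighted

        # A heavy-tailed (power-law-like) degree distribution emerges.
        print("max degree:", max(degree.values()))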

  17. The fuzziness of Jupiter's core: linking formation and evolution models

    Science.gov (United States)

    Helled, Ravit; Lozovsky, Michael; Vazan, Allona; Stevenson, David; Guillot, Tristan; Hubbard, William

    2017-04-01

    Juno data can be used to better constrain Jupiter's internal structure and origin. First, we present Jupiter's primordial internal structure based on formation models and show that Jupiter's core might not be distinct from the envelope, and that the deep interior can have a gradual heavy-element structure. Second, we explore how such a primordial (non-adiabatic) interior affects Jupiter's long-term evolution. Finally, we will discuss the link between these formation and evolution models and Jupiter's current-state internal structure.

  18. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction.

    Science.gov (United States)

    Cobbs, Gary

    2012-08-16

    Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate-constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give a better fit to observed qPCR data than other kinetic models present in the literature. They also give better estimates of
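    The stepwise idea, with each cycle's yield computed from the outcome of the annealing step, can be sketched as below. The saturation form eff = p/(p + K) is an illustrative stand-in for the paper's analytic equilibrium solutions, and all concentrations are invented; the two curves share p0 and K to mirror the shared-rate-constant constraint used in the simultaneous fit.

```python
import numpy as np

def qpcr_curve(n0, p0=5e-7, K=1e-7, cycles=40):
    """Stepwise qPCR: per-cycle efficiency set by primer availability."""
    n, p, out = n0, p0, []
    for _ in range(cycles):
        eff = p / (p + K)        # annealing-limited efficiency in [0, 1)
        new = eff * n            # strands synthesized this cycle
        p = max(p - new, 0.0)    # primers are consumed by synthesis
        n += new
        out.append(n)
    return np.array(out)

# Two reactions differing only in initial target amount share p0 and K.
a = qpcr_curve(n0=1e-15)
b = qpcr_curve(n0=1e-13)
print("Cq-like cycles:", int(np.argmax(a > 1e-9)), int(np.argmax(b > 1e-9)))
```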

  19. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    Directory of Open Access Journals (Sweden)

    Cobbs Gary

    2012-08-01

    Full Text Available Abstract Background Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Results Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate-constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give a better fit to observed qPCR data than other kinetic models present in the

  20. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  1. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    2000-07-01

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard
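    The skill metrics named here are simple to compute; the sketch below evaluates them for synthetic stand-ins for the observed and modelled F-region departures.

```python
import numpy as np

def skill(obs, model):
    """Standard deviation, root-mean-square error and correlation,
    the three quantities used above to quantify model value."""
    err = model - obs
    return {
        "std_obs": float(np.std(obs, ddof=1)),
        "rmse": float(np.sqrt(np.mean(err**2))),
        "corr": float(np.corrcoef(obs, model)[0, 1]),
    }

rng = np.random.default_rng(0)
observed = rng.normal(size=200)                      # stand-in F-region data
modelled = 0.7 * observed + rng.normal(scale=0.5, size=200)
print(skill(observed, modelled))
```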

  2. Structure and evolution of protein interaction networks: a statistical model for link dynamics and gene duplications

    Directory of Open Access Journals (Sweden)

    Wagner Andreas

    2004-11-01

    Full Text Available Abstract Background The structure of molecular networks derives from dynamical processes on evolutionary time scales. For protein interaction networks, global statistical features of their structure can now be inferred consistently from several large-throughput datasets. Understanding the underlying evolutionary dynamics is crucial for discerning random parts of the network from biologically important properties shaped by natural selection. Results We present a detailed statistical analysis of the protein interactions in Saccharomyces cerevisiae based on several large-throughput datasets. Protein pairs resulting from gene duplications are used as tracers into the evolutionary past of the network. From this analysis, we infer rate estimates for two key evolutionary processes shaping the network: (i) gene duplications and (ii) gain and loss of interactions through mutations in existing proteins, which are referred to as link dynamics. Importantly, the link dynamics is asymmetric, i.e., the evolutionary steps are mutations in just one of the binding partners. The link turnover is shown to be much faster than gene duplications. Both processes are assembled into an empirically grounded, quantitative model for the evolution of protein interaction networks. Conclusions According to this model, the link dynamics is the dominant evolutionary force shaping the statistical structure of the network, while the slower gene duplication dynamics mainly affects its size. Specifically, the model predicts (i) a broad distribution of the connectivities (i.e., the number of binding partners of a protein) and (ii) correlations between the connectivities of interacting proteins, a specific consequence of the asymmetry of the link dynamics. Both features have been observed in the protein interaction network of S. cerevisiae.
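    The two inferred processes, rare duplications and fast link turnover, can be caricatured in a few lines. The rates and seed network below are invented; the sketch only illustrates the separation of timescales, not the paper's fitted rate estimates.

```python
import random

def evolve(steps=20_000, dup_prob=0.001, gain_prob=0.1, seed=1):
    """Toy network evolution: rare gene duplications, fast link turnover."""
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}, 2: set()}                  # tiny seed proteome
    for _ in range(steps):
        if rng.random() < dup_prob:                   # slow: duplicate a gene
            src = rng.choice(list(adj))
            new = max(adj) + 1
            adj[new] = set(adj[src])                  # copy its interactions
            for partner in adj[src]:
                adj[partner].add(new)
        else:                                         # fast: asymmetric link dynamics
            a, b = rng.sample(list(adj), 2)
            if rng.random() < gain_prob:
                adj[a].add(b); adj[b].add(a)          # gain: mutation in one gene
            else:
                adj[a].discard(b); adj[b].discard(a)  # loss
    return adj

net = evolve()
degrees = sorted((len(v) for v in net.values()), reverse=True)
print(f"{len(net)} proteins, top connectivities: {degrees[:5]}")
```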

  3. Curing critical links in oscillator networks as power flow models

    International Nuclear Information System (INIS)

    Rohden, Martin; Meyer-Ortmanns, Hildegard; Witthaut, Dirk; Timme, Marc

    2017-01-01

    Modern societies crucially depend on the robust supply with electric energy so that blackouts of power grids can have far reaching consequences. Typically, large scale blackouts take place after a cascade of failures: the failure of a single infrastructure component, such as a critical transmission line, results in several subsequent failures that spread across large parts of the network. Improving the robustness of a network to prevent such secondary failures is thus key for assuring a reliable power supply. In this article we analyze the nonlocal rerouting of power flows after transmission line failures for a simplified AC power grid model and compare different strategies to improve network robustness. We identify critical links in the grid and compute alternative pathways to quantify the grid’s redundant capacity and to find bottlenecks along the pathways. Different strategies are developed and tested to increase transmission capacities to restore stability with respect to transmission line failures. We show that local and nonlocal strategies typically perform alike: one can equally well cure critical links by providing backup capacities locally or by extending the capacities of bottleneck links at remote locations. (paper)
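    A minimal version of the critical-link analysis can be run on a toy grid with a linearized (DC) power-flow model: remove each line in turn, recompute flows, and flag overloads. The network, injections and capacities below are made up for illustration.

```python
import numpy as np

def dc_flows(lines, P, n):
    """Linearised (DC) power flow: solve B*theta = P with node 0 as slack,
    then return the flow on every line."""
    B = np.zeros((n, n))
    for i, j, b, _cap in lines:
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b
    theta = np.zeros(n)
    theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])
    return [b * (theta[i] - theta[j]) for i, j, b, _cap in lines]

# Toy 4-node grid: a ring plus one chord; (i, j, susceptance, capacity).
n = 4
lines = [(0, 1, 1.0, 1.2), (1, 2, 1.0, 1.2), (2, 3, 1.0, 1.2),
         (3, 0, 1.0, 1.2), (0, 2, 1.0, 1.2)]
P = np.array([2.0, -1.0, 1.0, -2.0])    # injections: generators and loads

for k, line in enumerate(lines):
    rest = lines[:k] + lines[k + 1:]
    try:
        flows = dc_flows(rest, P, n)
        critical = any(abs(f) > cap for f, (_, _, _, cap) in zip(flows, rest))
    except np.linalg.LinAlgError:
        critical = True                  # removal splits the network
    print(f"removing line {line[:2]} overloads the grid: {critical}")
```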

  4. Modelling and Intelligent Control of an Elastic Link Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Malik Loudini

    2013-01-01

    Full Text Available In this paper, precise control of the end-point position of a planar single-link elastic manipulator robot is discussed. The Timoshenko beam theory (TBT) has been used to characterize the structural link elasticity, including important damping mechanisms. A suitable nonlinear model is derived based on the Lagrangian assumed modes method. Elastic link manipulators are classified as systems possessing highly complex dynamics. In addition, the environment in which they operate may have a lot of disturbances. These give rise to special problems that may be solved using intelligent control techniques. The application of two advanced control strategies based on fuzzy set theory is investigated. The first closed-loop control scheme to be applied is the standard Proportional-Derivative (PD)-type fuzzy logic controller (FLC), also known as the PD-type Mamdani FLC (MPDFLC). Then, a genetic algorithm (GA) is used to optimize the MPDFLC parameters with innovative tuning procedures. Both the MPDFLC and the GA-optimized FLC (GAOFLC) are implemented and tested to achieve precise control of the manipulator end-point. The performances of the adopted closed-loop intelligent control strategies are examined via simulation experiments.

  5. Skein Invariants of Links and Their State Sum Models

    Directory of Open Access Journals (Sweden)

    Louis H. Kauffman

    2017-10-01

    Full Text Available We present the new skein invariants of classical links, H[H], K[K] and D[D], based on the invariants of links H, K and D, denoting the regular isotopy version of the Homflypt polynomial, the Kauffman polynomial and the Dubrovnik polynomial. The invariants are obtained by abstracting the skein relation of the corresponding invariant and making a new skein algorithm comprising two computational levels: first producing unlinked knotted components, then evaluating the resulting knots. The invariants in this paper were revealed through the skein theoretic definition of the invariants Θd related to the Yokonuma–Hecke algebras and their 3-variable generalization Θ, which generalizes the Homflypt polynomial. H[H] is the regular isotopy counterpart of Θ. The invariants K[K] and D[D] are new generalizations of the Kauffman and the Dubrovnik polynomials. We sketch skein theoretic proofs of the well-definedness and topological properties of these invariants. The invariants of this paper are reformulated into summations of the generating invariants (H, K, D) on sublinks of the given link L, obtained by partitioning L into collections of sublinks. The first such reformulation was achieved by W.B.R. Lickorish for the invariant Θ and we generalize it to the Kauffman and Dubrovnik polynomial cases. State sum models are formulated for all the invariants. These state summation models are based on our skein template algorithm, which formalizes the skein theoretic process as an analogue of a statistical mechanics partition function. Relationships with statistical mechanics models are articulated. Finally, we discuss physical situations where a multi-leveled course of action is taken naturally.

  6. Towards Quantitative Spatial Models of Seabed Sediment Composition.

    Directory of Open Access Journals (Sweden)

    David Stephens

    Full Text Available There is a need for fit-for-purpose maps for accurately depicting the types of seabed substrate and habitat and the properties of the seabed for the benefits of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom's parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel) using the random forest algorithm. The compositional data is analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables are explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method.
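    The modelling chain described, compositional data on the additive log-ratio scale fed to a random forest, can be sketched as follows with synthetic data standing in for the grain-size samples and environmental predictors (scikit-learn is assumed to be available).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 4))                    # fake environmental predictors
comp = rng.dirichlet(alpha=(2, 5, 1), size=500)  # fake (mud, sand, gravel)

# Additive log-ratio transform with gravel as the common denominator.
eps = 1e-6
alr = np.log((comp[:, :2] + eps) / (comp[:, 2:3] + eps))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, alr)
pred = model.predict(X[:5])

# Inverse alr: exponentiate and renormalise so the fractions sum to one.
expd = np.hstack([np.exp(pred), np.ones((5, 1))])
fractions = expd / expd.sum(axis=1, keepdims=True)
print(fractions.round(3))
```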

  7. Quantitative Modeling of Human-Environment Interactions in Preindustrial Time

    Science.gov (United States)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-04-01

    Quantifying human-environment interactions and anthropogenic influences on the environment prior to the Industrial revolution is essential for understanding the current state of the earth system. This is particularly true for the terrestrial biosphere, but marine ecosystems and even climate were likely modified by human activities centuries to millennia ago. Direct observations are however very sparse in space and time, especially as one considers prehistory. Numerical models are therefore essential to produce a continuous picture of human-environment interactions in the past. Agent-based approaches, while widely applied to quantifying human influence on the environment in localized studies, are unsuitable for global spatial domains and Holocene timescales because of computational demands and large parameter uncertainty. Here we outline a new paradigm for the quantitative modeling of human-environment interactions in preindustrial time that is adapted to the global Holocene. Rather than attempting to simulate agency directly, the model is informed by a suite of characteristics describing those things about society that cannot be predicted on the basis of environment, e.g., diet, presence of agriculture, or range of animals exploited. These categorical data are combined with the properties of the physical environment in a coupled human-environment model. The model is, at its core, a dynamic global vegetation model with a module for simulating crop growth that is adapted for preindustrial agriculture. This allows us to simulate yield and calories for feeding both humans and their domesticated animals. We couple this basic caloric availability with a simple demographic model to calculate potential population, and, constrained by labor requirements and land limitations, we create scenarios of land use and land cover on a moderate-resolution grid. We further implement a feedback loop where anthropogenic activities lead to changes in the properties of the physical

  8. The Pseudomonas community in metal-contaminated sediments as revealed by quantitative PCR: a link with metal bioavailability.

    Science.gov (United States)

    Roosa, Stéphanie; Wauven, Corinne Vander; Billon, Gabriel; Matthijs, Sandra; Wattiez, Ruddy; Gillan, David C

    2014-10-01

    Pseudomonas bacteria are ubiquitous Gram-negative and aerobic microorganisms that are known to harbor metal resistance mechanisms such as efflux pumps and intracellular redox enzymes. Specific Pseudomonas bacteria have been quantified in some metal-contaminated environments, but the entire Pseudomonas population has been poorly investigated under these conditions, and the link with metal bioavailability was not previously examined. In the present study, quantitative PCR and cell cultivation were used to monitor and characterize the Pseudomonas population at 4 different sediment sites contaminated with various levels of metals. At the same time, total metals and metal bioavailability (as estimated using a 1 M HCl extraction) were measured. It was found that the total level of Pseudomonas, as determined by qPCR using two different genes (oprI and the 16S rRNA gene), was positively and significantly correlated with total and HCl-extractable Cu, Co, Ni, Pb and Zn, with high correlation coefficients (>0.8). Metal-contaminated sediments featured isolates of the Pseudomonas putida, Pseudomonas fluorescens, Pseudomonas lutea and Pseudomonas aeruginosa groups, with other bacterial genera such as Mycobacterium, Klebsiella and Methylobacterium. It is concluded that Pseudomonas bacteria do proliferate in metal-contaminated sediments, but are still part of a complex community. Copyright © 2014 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.

  9. Quantitative Modelling of Trace Elements in Hard Coal.

    Science.gov (United States)

    Smoliński, Adam; Howaniec, Natalia

    2016-01-01

    The significance of coal in the world economy has remained unquestionable for decades. It is also expected to be the dominant fossil fuel in the foreseeable future. The increased awareness of sustainable development reflected in the relevant regulations implies, however, the need for the development and implementation of clean coal technologies on the one hand, and adequate analytical tools on the other. The paper presents the application of the quantitative Partial Least Squares method in modeling the concentrations of trace elements (As, Ba, Cd, Co, Cr, Cu, Mn, Ni, Pb, Rb, Sr, V and Zn) in hard coal based on the physical and chemical parameters of coal, and coal ash components. The study was focused on trace elements potentially hazardous to the environment when emitted from coal processing systems. The studied data included 24 parameters determined for 132 coal samples provided by 17 coal mines of the Upper Silesian Coal Basin, Poland. Since the data set contained outliers, the construction of robust Partial Least Squares models for the contaminated data set and the correct identification of outlying objects based on the robust scales were required. These enabled the development of correct Partial Least Squares models, characterized by good fit and prediction abilities. The root mean square error was below 10% for all but one of the final Partial Least Squares models constructed, and the prediction error (root mean square error of cross-validation) exceeded 10% for only three of the models constructed. The study is of both cognitive and applicative importance. It presents a unique application of chemometric methods of data exploration in modeling the content of trace elements in coal. In this way it contributes to the development of useful tools of coal quality assessment.
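    A hedged sketch of the workflow, PLS regression with a cross-validated prediction error, is below. The synthetic X (132 samples x 24 parameters) only mimics the shape of the study's data, and the reported error figures are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
X = rng.normal(size=(132, 24))              # 132 samples x 24 coal parameters
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.3, size=132)

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()   # out-of-fold predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV = {rmsecv:.3f} (relative: {rmsecv / y.std():.1%})")
```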

  10. Melanoma screening: Informing public health policy with quantitative modelling.

    Directory of Open Access Journals (Sweden)

    Stephen Gilmore

    Full Text Available Australia and New Zealand share the highest incidence rates of melanoma worldwide. Despite the substantial increase in public and physician awareness of melanoma in Australia over the last 30 years, a result of the publicly funded mass media campaigns introduced in the early 1980s, mortality has steadily increased during this period. This increased mortality has led investigators to question the relative merits of primary versus secondary prevention; that is, sensible sun exposure practices versus early detection. Increased melanoma vigilance on the part of the public and among physicians has resulted in large increases in public health expenditure, primarily from screening costs and increased rates of office surgery. Has this attempt at secondary prevention been effective? Unfortunately, epidemiologic studies addressing the causal relationship between the level of secondary prevention and mortality are prohibitively difficult to implement; it is currently unknown whether increased melanoma surveillance reduces mortality, and if so, whether such an approach is cost-effective. Here I address the issue of secondary prevention of melanoma with respect to incidence and mortality (and cost per life saved) by developing a Markov model of melanoma epidemiology based on Australian incidence and mortality data. The advantages of developing a methodology that can determine constraint-based surveillance outcomes are twofold: first, it can address the issue of effectiveness; and second, it can quantify the trade-off between cost and utilisation of medical resources on one hand, and reduced morbidity and lives saved on the other. With respect to melanoma, implementing the model facilitates the quantitative determination of the relative effectiveness and trade-offs associated with different levels of secondary and tertiary prevention, both retrospectively and prospectively. For example, I show that the surveillance enhancement that began in
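    A toy Markov cohort model in this spirit is sketched below. The states, monthly transition probabilities and the crude "screening" perturbation are invented for illustration and carry no clinical meaning; the point is only how a surveillance level enters such a model as a shift in transition probabilities.

```python
import numpy as np

# States: healthy, in-situ melanoma, invasive melanoma, melanoma death.
P = np.array([
    [0.9992, 0.0008, 0.0000, 0.0000],
    [0.0000, 0.9900, 0.0100, 0.0000],
    [0.0000, 0.0000, 0.9900, 0.0100],
    [0.0000, 0.0000, 0.0000, 1.0000],
])

def with_screening(P, shift=0.005):
    """Crude surveillance effect: each cycle a small fraction of in-situ
    cases is detected and cured (returned to the healthy state)."""
    Q = P.copy()
    Q[1, 0] += shift
    Q[1, 1] -= shift
    return Q

for label, M in [("no screening", P), ("screening", with_screening(P))]:
    state = np.array([1.0, 0.0, 0.0, 0.0])     # whole cohort starts healthy
    for _ in range(12 * 30):                    # monthly cycles, 30 years
        state = state @ M
    print(f"{label}: deaths per 100,000 after 30 y = {state[3] * 1e5:,.0f}")
```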

  11. Model of Atmospheric Links on Optical Communications from High Altitude

    Science.gov (United States)

    Subich, Christopher

    2004-01-01

    Optical communication links have the potential to solve many of the problems of current radio and microwave links to satellites and high-altitude aircraft. The higher frequency involved in optical systems allows for significantly greater signal bandwidth, and thus information transfer rate, in excess of 10 Gbps, and the highly directional nature of laser-based signals eliminates the need for frequency-division multiplexing seen in radio and microwave links today. The atmosphere, however, distorts an optical signal differently than a microwave signal. While the ionosphere is one of the most significant sources of noise and distortion in a microwave or radio signal, the lower atmosphere affects an optical signal more significantly. Refractive index fluctuations, primarily caused by changes in atmospheric temperature and density, distort the incoming signal in both deterministic and nondeterministic ways. Additionally, suspended particles, such as those in haze or rain, further corrupt the transmitted signal. To model many of the atmospheric effects on the propagating beam, we use simulations based on the beam-propagation method. This method, developed both for simulation of signals in waveguides and propagation in atmospheric turbulence, separates the propagation into a diffraction and refraction problem. The diffraction step is an exact solution, within the limits of numerical precision, to the problem of propagation in free space, and the refraction step models the refractive index variances over a segment of the propagation path. By applying refraction for a segment of the propagation path, then diffracting over that same segment, this method forms a good approximation to true propagation through the atmospheric medium. Iterating over small segments of the total propagation path gives a good approximation to the problem of propagation over the entire path. Parameters in this model, such as initial beam profile and atmospheric constants, are easily modified in a
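    The split-step scheme described, an exact free-space diffraction step in Fourier space alternated with a refraction step, can be sketched in one dimension as below. The white-noise phase screen is a crude stand-in for correlated index fluctuations, and all beam and path parameters are placeholders.

```python
import numpy as np

# 1-D grid for the transverse beam profile.
N, L, wavelength = 1024, 0.1, 1.55e-6          # points, aperture (m), metres
dx = L / N
x = (np.arange(N) - N // 2) * dx
k0 = 2 * np.pi / wavelength
kx = 2 * np.pi * np.fft.fftfreq(N, d=dx)

field = np.exp(-(x / 5e-3) ** 2)               # Gaussian beam, 5 mm waist
dz, steps = 50.0, 20                           # 50 m segments, 1 km path
rng = np.random.default_rng(0)

for _ in range(steps):
    # Diffraction half of the step: exact paraxial free-space propagator.
    field = np.fft.ifft(np.fft.fft(field) * np.exp(-1j * kx**2 * dz / (2 * k0)))
    # Refraction half: a random phase screen stands in for the segment's
    # refractive-index fluctuations (real screens are spatially correlated).
    field *= np.exp(1j * rng.normal(scale=0.1, size=N))

print(f"on-axis intensity after 1 km: {abs(field[N // 2])**2:.3f}")
```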

  12. Linking an ecosystem model and a landscape model to study forest species response to climate warming

    Science.gov (United States)

    Hong S. He; David J. Mladenoff; Thomas R. Crow

    1999-01-01

    No single model can address forest change from single tree to regional scales. We discuss a framework linking an ecosystem process model (LINKAGES) with a spatial landscape model (LANDIS) to examine forest species responses to climate warming for a large, heterogeneous landscape in northern Wisconsin, USA. Individual species response at the ecosystem scale was...

  13. A Quantitative Trait Locus (LSq-1) on Mouse Chromosome 7 Is Linked to the Absence of Tissue Loss After Surgical Hindlimb Ischemia

    Science.gov (United States)

    Dokun, Ayotunde O.; Keum, Sehoon; Hazarika, Surovi; Li, Yongjun; Lamonte, Gregory M.; Wheeler, Ferrin; Marchuk, Douglas A.; Annex, Brian H.

    2010-01-01

    Background Peripheral arterial disease (PAD) caused by occlusive atherosclerosis of the lower extremity has 2 major clinical manifestations. Critical limb ischemia is characterized by rest pain and/or tissue loss and has a ≥40% risk of death and major amputation. Intermittent claudication causes pain on walking, has no tissue loss, and has amputation plus mortality rates of 2% to 4% per year. Progression from claudication to limb ischemia is infrequent. Risk factors in most PAD patients overlap. Thus, we hypothesized that genetic variations may be linked to presence or absence of tissue loss in PAD. Methods and Results Hindlimb ischemia (murine model of PAD) was induced in C57BL/6, BALB/c, C57BL/6×BALB/c (F1), F1×BALB/c (N2), A/J, and C57BL/6J-Chr7A/J/NaJ chromosome substitution strains. Mice were monitored for perfusion recovery and tissue necrosis. Genome-wide scanning with polymorphic markers across the 19 murine autosomes was performed on the N2 mice. Greater tissue loss and poorer perfusion recovery occurred in BALB/c than in the C57BL/6 strain. Analysis of 105 N2 progeny identified a single quantitative trait locus on chromosome 7 that exhibited significant linkage to both tissue necrosis and extent of perfusion recovery. Using the appropriate chromosome substitution strain, we demonstrate that C57BL/6-derived chromosome 7 is required for tissue preservation. Conclusions We have identified a quantitative trait locus on murine chromosome 7 (LSq-1) that is associated with the absence of tissue loss in a preclinical model of PAD and may be useful in identifying gene(s) that influence PAD in humans. PMID:18285563

  14. Quantitative comparisons of analogue models of brittle wedge dynamics

    Science.gov (United States)

    Schreurs, Guido

    2010-05-01

    Analogue model experiments are widely used to gain insights into the evolution of geological structures. In this study, we present a direct comparison of experimental results of 14 analogue modelling laboratories using prescribed set-ups. A quantitative analysis of the results will document the variability among models and will allow an appraisal of reproducibility and limits of interpretation. This has direct implications for comparisons between structures in analogue models and natural field examples. All laboratories used the same frictional analogue materials (quartz and corundum sand) and prescribed model-building techniques (sieving and levelling). Although each laboratory used its own experimental apparatus, the same type of self-adhesive foil was used to cover the base and all the walls of the experimental apparatus in order to guarantee identical boundary conditions (i.e. identical shear stresses at the base and walls). Three experimental set-ups using only brittle frictional materials were examined. In each of the three set-ups the model was shortened by a vertical wall, which moved with respect to the fixed base and the three remaining sidewalls. The minimum width of the model (dimension parallel to mobile wall) was also prescribed. In the first experimental set-up, a quartz sand wedge with a surface slope of ~20° was pushed by a mobile wall. All models conformed to the critical taper theory, maintained a stable surface slope and did not show internal deformation. In the next two experimental set-ups, a horizontal sand pack consisting of alternating quartz sand and corundum sand layers was shortened from one side by the mobile wall. In one of the set-ups a thin rigid sheet covered part of the model base and was attached to the mobile wall (i.e. a basal velocity discontinuity distant from the mobile wall). In the other set-up a basal rigid sheet was absent and the basal velocity discontinuity was located at the mobile wall. In both types of experiments

  15. Enzyme-linked immunosorbent assay to quantitate serum ferritin in black and white ruffed lemurs (Varecia variegata variegata).

    Science.gov (United States)

    Andrews, Gordon A; Chavey, Patricia Sue; Crawford, Graham

    2005-12-01

    Lemurs in captivity progressively accumulate iron deposits in a variety of organs (hemosiderosis), including duodenum, liver, and spleen, throughout their lives. When excessive, the toxic effects of intracellular iron on parenchymal cells, particularly the liver, can result in clinical disease and death. The pathogenesis of excessive iron storage in these species has been attributed to dietary factors related to diets commonly fed in captivity. Tissue iron stores can be directly estimated by tissue biopsy and histologic examination, or quantitated by chemical analysis of biopsy tissue. However, expense and risk associated with anesthesia and surgery prevent routine use of tissue biopsy to assess iron status. A noninvasive means of assessing total body iron stores is needed to monitor iron stores in lemurs, to determine whether dietary modification is preventing excessive iron deposition, and to monitor potential therapies such as phlebotomy or chelation. Serum ferritin concentration correlates with tissue iron stores in humans, horses, calves, dogs, cats, and pigs. Serum ferritin is considered the best serum analyte to predict total body iron stores in these species and is more reliable than serum iron or total iron binding capacity, both of which may be affected by disorders unrelated to iron adequacy or excess, including hypoproteinemia, chronic infection, hemolytic anemia, hypothyroidism, renal disease, and drug administration. We have developed an enzyme-linked immunosorbent assay to measure serum ferritin in lemurs. The assay uses polyclonal rabbit anti-human ferritin antibodies in a sandwich arrangement. Ferritin isolated from the liver and spleen of a black and white ruffed lemur (Varecia variegata variegata) was used as a standard. Ferritin standards were linear from 0 to 50 microg/L. Recovery of purified ferritin from lemur serum varied from 95% to 110%. The within-assay variability was 4.5%, and the assay-to-assay variability for three different samples ranged
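    Reading unknowns off a standard curve is the computational core of such an assay. Since the abstract reports linearity from 0 to 50 microg/L, the sketch below uses a straight-line fit (real sandwich ELISAs are often fit with a four-parameter logistic); the calibration optical densities are made up.

```python
import numpy as np

# Made-up calibration points spanning the reported linear range (0-50 ug/L).
std_conc = np.array([0.0, 5.0, 10.0, 20.0, 35.0, 50.0])   # ferritin, ug/L
std_od = np.array([0.02, 0.15, 0.29, 0.55, 0.95, 1.33])   # optical density

slope, intercept = np.polyfit(std_conc, std_od, 1)

def ferritin_conc(od):
    """Invert the linear standard curve: concentration from sample OD."""
    return (od - intercept) / slope

for od in (0.10, 0.48, 1.10):
    print(f"OD {od:.2f} -> {ferritin_conc(od):5.1f} ug/L")
```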

  16. The link between laboratory/field observations and models

    International Nuclear Information System (INIS)

    Cole, C.R.; Foley, M.G.

    1986-01-01

    The various linkages in system performance assessments that integrate disposal program elements must be understood. The linkage between model development and field/laboratory observations is described as the iterative program of site and system characterization for development of an observational-confirmatory data base. This data base is designed to develop, improve, and support conceptual models for site and system behavior. The program consists of data gathering and experiments to demonstrate understanding at various spatial and time scales and degrees of complexity. Understanding and accounting for the decreasing characterization certainty that arises with increasing space and time scales is an important aspect of the link between models and observations. The performance allocation process for setting performance goals and confidence levels, coupled with a performance assessment approach that provides these performance and confidence estimates, will determine when sufficient characterization has been achieved. At each iteration, performance allocation goals are reviewed and revised as necessary. The updated data base and appropriate performance assessment tools and approaches are utilized to identify and design additional tests and data needs necessary to meet current performance allocation goals

  17. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    Science.gov (United States)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, the ability to quantitatively communicate this information is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphology, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
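    The Zahn and Roskies tangent-angle shape function mentioned here is straightforward to compute for a closed outline: the cumulative turning angle against normalised arc length, minus the circle's contribution, which makes it invariant to translation, rotation and size. The outline below is a synthetic lobed circle, not digitised crater data.

```python
import numpy as np

# Closed outline: a slightly lobed circle standing in for a crater rim.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
r = 1.0 + 0.05 * np.cos(5 * t)
xy = np.c_[r * np.cos(t), r * np.sin(t)]

seg = np.diff(np.vstack([xy, xy[:1]]), axis=0)       # closing edge included
heading = np.arctan2(seg[:, 1], seg[:, 0])
turn = np.diff(np.concatenate([heading, heading[:1]]))
turn = (turn + np.pi) % (2 * np.pi) - np.pi          # wrap turning angles

arc = np.cumsum(np.hypot(seg[:, 0], seg[:, 1]))
arc /= arc[-1]                                       # normalised arc length

# Z-R shape function: net turning minus that of a circle at equal arc length.
zr = np.cumsum(turn) - 2 * np.pi * arc
print(f"Z-R amplitude: {np.ptp(zr):.3f}")            # 0 for a perfect circle
```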

  18. Quantitative Model for Supply Chain Visibility: Process Capability Perspective

    Directory of Open Access Journals (Sweden)

    Youngsu Lee

    2016-01-01

    Full Text Available Currently, the intensity of enterprise competition has increased as a result of a greater diversity of customer needs as well as the persistence of a long-term recession. The results of competition are becoming severe enough to determine the survival of a company. To survive global competition, each firm must focus on achieving innovation excellence and operational excellence as core competencies for sustainable competitive advantage. Supply chain management is now regarded as one of the most effective innovation initiatives to achieve operational excellence, and its importance has become ever more apparent. However, few companies effectively manage their supply chains, and the greatest difficulty is in achieving supply chain visibility. Many companies still suffer from a lack of visibility, and in spite of extensive research and the availability of modern technologies, the concepts and quantification methods to increase supply chain visibility are still ambiguous. Based on the extant research in supply chain visibility, this study proposes an extended visibility concept focusing on a process capability perspective and suggests a more quantitative model using the Z score in the Six Sigma methodology to evaluate and improve the level of supply chain visibility.
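    The Z-score idea can be made concrete: treat some visibility metric as process data against a specification limit and report its capability. The metric, distribution and limit below are invented placeholders.

```python
import numpy as np

# Made-up "visibility" metric: hours until an order-status update is shared.
rng = np.random.default_rng(5)
lag = rng.gamma(shape=4.0, scale=1.5, size=1000)   # observed process data
usl = 12.0                                         # upper spec limit (hours)

z = (usl - lag.mean()) / lag.std(ddof=1)           # process capability Z score
beyond_spec = (lag > usl).mean()
print(f"Z = {z:.2f}, share of updates beyond spec = {beyond_spec:.2%}")
```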

  19. Implementation of a vibrationally linked chemical reaction model for DSMC

    Science.gov (United States)

    Carlson, A. B.; Bird, Graeme A.

    1994-01-01

    A new procedure closely linking dissociation and exchange reactions in air to the vibrational levels of the diatomic molecules has been implemented in both one- and two-dimensional versions of Direct Simulation Monte Carlo (DSMC) programs. The previous modeling of chemical reactions with DSMC was based on the continuum reaction rates for the various possible reactions. The new method is more closely related to the actual physics of dissociation and is more appropriate to the particle nature of DSMC. Two cases are presented: the relaxation to equilibrium of undissociated air initially at 10,000 K, and the axisymmetric calculation of shuttle forebody heating during reentry at 92.35 km and 7500 m/s. Although reaction rates are not used in determining the dissociations or exchange reactions, the new method produces rates which agree astonishingly well with the published rates derived from experiment. The results for gas properties and surface properties also agree well with the results produced by earlier DSMC models, equilibrium air calculations, and experiment.

  20. A Transformative Model for Undergraduate Quantitative Biology Education

    OpenAIRE

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematic...

  1. A theoretical quantitative model for evolution of cancer chemotherapy resistance

    Directory of Open Access Journals (Sweden)

    Gatenby Robert A

    2010-04-01

    Full Text Available Abstract Background Disseminated cancer remains a nearly uniformly fatal disease. While a number of effective chemotherapies are available, tumors inevitably evolve resistance to these drugs, ultimately resulting in treatment failure and cancer progression. Causes for chemotherapy failure in cancer treatment reside at multiple levels: poor vascularization, hypoxia, intratumoral high interstitial fluid pressure, and phenotypic resistance to drug-induced toxicity through upregulated xenobiotic metabolism or DNA repair mechanisms and silencing of apoptotic pathways. We propose that in order to understand the evolutionary dynamics that allow tumors to develop chemoresistance, a comprehensive quantitative model must be used to describe the interactions of cell resistance mechanisms and tumor microenvironment during chemotherapy. Ultimately, the purpose of this model is to identify the best strategies to treat different types of tumor (tumor microenvironment, genetic/phenotypic tumor heterogeneity, tumor growth rate, etc.). We predict that the most promising strategies are those that are both cytotoxic and apply a selective pressure for a phenotype that is less fit than that of the original cancer population. This strategy, known as a double bind, is different from the selection process imposed by standard chemotherapy, which tends to produce a resistant population that simply upregulates xenobiotic metabolism. In order to achieve this goal we propose to simulate different tumor progression and therapy strategies (chemotherapy and glucose restriction) targeting stabilization of tumor size and minimization of chemoresistance. Results This work confirms the prediction of previous mathematical models and simulations that suggested that administration of chemotherapy with the goal of tumor stabilization instead of eradication would yield better results (longer subject survival) than the use of maximum tolerated doses. Our simulations also indicate that the
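    The stabilization-versus-eradication comparison can be caricatured with two competing clones under logistic growth. All rates below are invented and the dosing rules are deliberately crude, but the qualitative contrast (a continuous maximum dose selects for the resistant clone) matches the argument above.

```python
def simulate(dose_when, days=600, K=1e9, dt=0.1):
    """Sensitive (S) and resistant (R) clones share one carrying capacity;
    dose_when(total) decides whether the drug is applied this step."""
    S, R = 1e8, 1e5                       # resistant clone starts rare, less fit
    for _ in range(int(days / dt)):
        total = S + R
        kill = 0.5 if dose_when(total) else 0.0
        S += dt * S * (0.10 * (1 - total / K) - kill)   # drug kills S only
        R += dt * R * (0.07 * (1 - total / K))          # R grows more slowly
        S, R = max(S, 0.0), max(R, 0.0)
    return S + R, R / (S + R)

burden_mtd, res_mtd = simulate(lambda n: True)          # continuous max dose
burden_adp, res_adp = simulate(lambda n: n > 5e8)       # dose only above target
print(f"MTD:      burden={burden_mtd:.2e}, resistant fraction={res_mtd:.2f}")
print(f"Adaptive: burden={burden_adp:.2e}, resistant fraction={res_adp:.2f}")
```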

  2. Non-constant link tension coefficient in the tumbling-snake model subjected to simple shear

    Science.gov (United States)

    Stephanou, Pavlos S.; Kröger, Martin

    2017-11-01

    The authors of the present study have recently presented evidence that the tumbling-snake model for polymeric systems has the necessary capacity to predict the appearance of pronounced undershoots in the time-dependent shear viscosity as well as an absence of equally pronounced undershoots in the transient two normal stress coefficients. The undershoots were found to appear due to the tumbling behavior of the director u when a rotational Brownian diffusion term is considered within the equation of motion of polymer segments, and a theoretical basis concerning the use of a link tension coefficient given through the nematic order parameter had been provided. The current work elaborates on the quantitative predictions of the tumbling-snake model to demonstrate its capacity to predict undershoots in the time-dependent shear viscosity. These predictions are shown to compare favorably with experimental rheological data for both polymer melts and solutions, help us to clarify the microscopic origin of the observed phenomena, and demonstrate in detail why a constant link tension coefficient has to be abandoned.

  3. A communications model for an ISAS to NASA span link

    Science.gov (United States)

    Green, James L.; Mcguire, Robert E.; Lopez-Swafford, Brian

    1987-01-01

    The authors propose that an initial computer-to-computer communication link use the public packet switched networks (PPSN) Venus-P in Japan and TELENET in the U.S. When the traffic warrants it, this link would then be upgraded to a dedicated leased line that directly connects into the Space Physics Analysis Network (SPAN). The proposed system of hardware and software will easily support migration to such a dedicated link. It therefore provides a cost-effective approach to the network problem. Once a dedicated line becomes operational, it is suggested that the public-network link continue to coexist, providing a backup capability.

  4. Modelling and experimental validation of two-dimensional transverse vibrations in a flexible robot link

    DEFF Research Database (Denmark)

    Sørensen, Paul Haase; Baungaard, Jens Rane

    1996-01-01

    A general model for a rotating homogeneous flexible robot link is developed. The model describes two-dimensional transverse vibrations induced by the actuator due to misalignment of the actuator axis of rotation relative to the link symmetry axis and due to translational acceleration of the link...

  5. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLAN) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLAN semantics based on discrete-time Markov chains. The Maude implementation of PFLAN is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average...

  6. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    Directory of Open Access Journals (Sweden)

    Maurice H. ter Beek

    2015-04-01

    Full Text Available We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLan with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLan) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLan semantics based on discrete-time Markov chains. The Maude implementation of PFLan is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average cost of products.
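    The statistical-model-checking idea, estimating the probability of a temporal property by simulating many runs of a probabilistic model, can be sketched without any of the PFLan/Maude/MultiVeStA machinery; the chain below is an invented four-state product-lifecycle caricature.

```python
import random

# Invented discrete-time Markov chain over product-lifecycle states.
STATES = {"configuring": [("running", 0.9), ("configuring", 0.1)],
          "running":     [("running", 0.97), ("malfunction", 0.01), ("done", 0.02)],
          "malfunction": [("malfunction", 1.0)],
          "done":        [("done", 1.0)]}

def step(state, rng):
    r, acc = rng.random(), 0.0
    for nxt, p in STATES[state]:
        acc += p
        if r < acc:
            return nxt
    return state

def malfunctions(rng, horizon=50):
    """One simulated run: does the product malfunction within the horizon?"""
    s = "configuring"
    for _ in range(horizon):
        s = step(s, rng)
        if s == "malfunction":
            return True
    return False

rng = random.Random(11)
n = 20_000
estimate = sum(malfunctions(rng) for _ in range(n)) / n
print(f"P(malfunction within 50 steps) ~ {estimate:.3f}")
```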

  7. Mesoscale models for stacking faults, deformation twins and martensitic transformations: Linking atomistics to continuum

    Science.gov (United States)

    Kibey, Sandeep A.

    We present a hierarchical approach that spans multiple length scales to describe defect formation---in particular, formation of stacking faults (SFs) and deformation twins---in fcc crystals. We link the energy pathways (calculated here via ab initio density functional theory, DFT) associated with formation of stacking faults and twins to corresponding heterogeneous defect nucleation models (described through mesoscale dislocation mechanics). Through the generalized Peierls-Nabarro model, we first correlate the width of intrinsic SFs in fcc alloy systems to their nucleation pathways called generalized stacking fault energies (GSFE). We then establish a qualitative dependence of twinning tendency in fcc metals and alloys---specifically, in pure Cu and dilute Cu-xAl (x= 5.0 and 8.3 at.%)---on their twin-energy pathways called the generalized planar fault energies (GPFE). We also link the twinning behavior of Cu-Al alloys to their electronic structure by determining the effect of solute Al on the valence charge density redistribution at the SF through ab initio DFT. Further, while several efforts have been undertaken to incorporate twinning for predicting stress-strain response of fcc materials, a fundamental law for critical twinning stress has not yet emerged. We resolve this long-standing issue by linking quantitatively the twin-energy pathways (GPFE) obtained via ab initio DFT to heterogeneous, dislocation-based twin nucleation models. We establish an analytical expression that quantitatively predicts the critical twinning stress in fcc metals in agreement with experiments without requiring any empiricism at any length scale. Our theory connects twinning stress to twin-energy pathways and predicts a monotonic relation between stress and unstable twin stacking fault energy, revealing the physics of twinning. We further demonstrate that the theory holds for fcc alloys as well. Our theory inherently accounts for the directional nature of twinning, which available

  8. Herd immunity and pneumococcal conjugate vaccine: a quantitative model.

    Science.gov (United States)

    Haber, Michael; Barskey, Albert; Baughman, Wendy; Barker, Lawrence; Whitney, Cynthia G; Shaw, Kate M; Orenstein, Walter; Stephens, David S

    2007-07-20

    Invasive pneumococcal disease in older children and adults declined markedly after introduction in 2000 of the pneumococcal conjugate vaccine for young children. An empirical quantitative model was developed to estimate the herd (indirect) effects on the incidence of invasive disease among persons ≥5 years of age induced by vaccination of young children with 1, 2, or ≥3 doses of the pneumococcal conjugate vaccine, Prevnar (PCV7), containing serotypes 4, 6B, 9V, 14, 18C, 19F and 23F. From 1994 to 2003, cases of invasive pneumococcal disease were prospectively identified in Georgia Health District-3 (eight metropolitan Atlanta counties) by Active Bacterial Core surveillance (ABCs). From 2000 to 2003, vaccine coverage levels of PCV7 for children aged 19-35 months in Fulton and DeKalb counties (of Atlanta) were estimated from the National Immunization Survey (NIS). Based on incidence data and the estimated average number of doses received by 15 months of age, a Poisson regression model was fit, describing the trend in invasive pneumococcal disease in groups not targeted for vaccination (i.e., adults and older children) before and after the introduction of PCV7. Highly significant declines in all the serotypes contained in PCV7 in all unvaccinated populations (5-19, 20-39, 40-64, and >64 years) from 2000 to 2003 were found under the model. No significant change in incidence was seen from 1994 to 1999, indicating rates were stable prior to vaccine introduction. Among unvaccinated persons 5+ years of age, the modeled incidence of disease caused by PCV7 serotypes as a group dropped 38.4%, 62.0%, and 76.6% for 1, 2, and 3 doses, respectively, received on average by the population of children by the time they are 15 months of age. Incidence of serotypes 14 and 23F had consistent significant declines in all unvaccinated age groups. In contrast, the herd immunity effects on vaccine-related serotype 6A incidence were inconsistent. Increasing trends of non
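    The trend model is an ordinary Poisson regression of annual case counts on average dose exposure. The sketch below fits one to synthetic counts standing in for the ABCs surveillance series (statsmodels is assumed to be available), and the simulated per-dose decline is arbitrary.

```python
import numpy as np
import statsmodels.api as sm

# Years 1994-2003; average PCV7 doses per child are zero before 2000 and
# then ramp up (all numbers invented for illustration).
dose = np.array([0.0] * 6 + [0.8, 1.5, 2.2, 2.6])

rng = np.random.default_rng(2)
baseline = 120                                    # pre-vaccine annual cases
cases = rng.poisson(baseline * np.exp(-0.25 * dose))   # simulated herd effect

X = sm.add_constant(dose)
fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
per_dose_decline = 1 - np.exp(fit.params[1])
print(f"estimated decline per average dose: {per_dose_decline:.1%}")
```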

  9. Branching enzyme assay: selective quantitation of the alpha 1,6-linked glucosyl residues involved in the branching points.

    Science.gov (United States)

    Krisman, C R; Tolmasky, D S; Raffo, S

    1985-06-01

    Methods previously described for glycogen or amylopectin branching enzymatic activity are insufficiently sensitive and not quantitative. A new, more sensitive, specific, and quantitative one was developed. It is based upon the quantitation of the glucose residues joined by alpha 1,6 bonds introduced by varying amounts of branching enzyme. The procedure involved the synthesis of a polysaccharide from Glc-1-P and phosphorylase in the presence of the sample to be tested. The branched polysaccharide was then purified and the glucoses involved in the branching points were quantitated after degradation with phosphorylase and debranching enzymes. This method appeared to be useful, not only in enzymatic activity determinations but also in the study of the structure of alpha-D-glucans when combined with those of total polysaccharide quantitation, such as iodine and phenol-sulfuric acid.

  10. The link between perceived human resource management practices, engagement and employee behaviour : A moderated mediation model

    NARCIS (Netherlands)

    Alfes, K.; Shantz, A.D.; Truss, C.; Soane, E.C.

    2013-01-01

    This study contributes to our understanding of the mediating and moderating processes through which human resource management (HRM) practices are linked with behavioural outcomes. We developed and tested a moderated mediation model linking perceived HRM practices to organisational citizenship

  11. Toward linking demographic and economic models for impact assessment

    International Nuclear Information System (INIS)

    Williams, C.A.; Meenan, C.D.

    1991-01-01

    One of the objectives of the Yucca Mountain Project, in Southern Nevada, is to evaluate the effects of the development of a high-level nuclear waste repository. As described in the Section 175 Report to the Congress of the US, the temporal scope of this repository project encompasses approximately 70 years and includes four phases: Site characterization and licensing, construction, operation, and closure and decommissioning. If retrieval of the waste were to be required, the temporal scope of the repository project could be extended to approximately 100 years. The study of the potential socioeconomic effects of this project is the foundation for this paper. This paper focuses on the economic and demographic aspects and a possible method to interface the two. First, the authors briefly discuss general socioeconomic modeling theory from a county level view point, as well as methods for the apportionment of county level data to sub-county areas. Next, the authors describe the unique economic and demographic conditions which exist in Nevada at both the state and county levels. Finally, the authors evaluate a possible procedure for analyzing repository effects at a sub-county level; this involves discussion of an interface linking the economic and demographic aspects, which is based on the reconciliation of supply and demand for labor. The authors conclude that the basis for further model development may rely on the interaction of supply and demand to produce change in wage rates. These changes in expected wages should be a justification for allocating economic migrants (who may respond to Yucca Mountain Project development) into various communities
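    One way to read the proposed interface is as an iteration that adjusts wages until local labor supply meets project labor demand, with the implied in-migrants then allocated to communities. The sketch below is a toy with invented elasticities and numbers, not the Yucca Mountain model itself.

```python
def equilibrate(demand=1200.0, base_supply=1000.0, elasticity=0.6,
                wage=1.0, step=0.05, tol=1e-3):
    """Adjust the wage index until wage-responsive labor supply matches
    project demand; the residual supply beyond the base is read off as
    in-migration."""
    for _ in range(1000):
        supply = base_supply * wage ** elasticity   # workers drawn by wages
        excess = (demand - supply) / demand
        if abs(excess) < tol:
            break
        wage *= 1 + step * excess                   # wages rise with excess demand
    migrants = max(supply - base_supply, 0.0)
    return wage, migrants

w, m = equilibrate()
print(f"equilibrium wage index {w:.2f}, implied in-migrants ~ {m:.0f}")
```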

  12. Linking the M&Rfi Weather Generator with Agrometeorological Models

    Science.gov (United States)

    Dubrovsky, Martin; Trnka, Miroslav

    2015-04-01

    Realistic meteorological inputs (representing the present and/or future climates) for agrometeorological model simulations are often produced by stochastic weather generators (WGs). This contribution presents some methodological issues and results obtained in our recent experiments. We also address selected questions raised in the synopsis of this session. The input meteorological time series for our experiments are produced by the parametric single-site weather generator (WG) M&Rfi, which is calibrated from the available observational data (or interpolated from surrounding stations). To produce meteorological series representing the future climate, the WG parameters are modified by climate change scenarios, which are prepared by the pattern scaling method: the standardised scenarios derived from Global or Regional Climate Models are multiplied by the change in global mean temperature (ΔTG) determined by the simple climate model MAGICC. The presentation will address the following questions: (i) The dependence of the quality of the synthetic weather series and impact results on the WG settings. An emphasis is put on the effect of conditioning the daily WG on a monthly WG (presently one of our hot topics), which aims to improve the reproduction of low-frequency weather variability. Comparison of results obtained with various WG settings is made in terms of climatic and agroclimatic indices (including extreme temperature and precipitation characteristics and drought indices). (ii) Our methodology accounts for the uncertainties coming from various sources. We will show how the climate change impact results are affected by 1. uncertainty in climate modelling, 2. uncertainty in ΔTG, and 3. uncertainty related to the complexity of the climate change scenario (focusing on the effect of including changes in variability in the climate change scenarios). Acknowledgements: This study was funded by project "Building up a multidisciplinary scientific
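    The pattern-scaling step has a compact arithmetic core: a standardised response pattern (change per 1 °C of global warming) is multiplied by ΔTG and added to the generator's baseline parameters. The monthly numbers below are placeholders.

```python
import numpy as np

# Standardised pattern: local warming per 1 degC of global mean warming.
monthly_dT_per_degC = np.array([1.4, 1.3, 1.1, 0.9, 0.8, 0.7,
                                0.7, 0.8, 1.0, 1.1, 1.2, 1.4])
delta_TG = 2.3        # global mean warming (degC), e.g. from MAGICC

# Baseline monthly mean temperatures held by the weather generator.
wg_mean_temp = np.array([-2.0, -0.5, 3.5, 8.9, 13.8, 17.0,
                         18.6, 18.1, 13.9, 8.8, 3.3, -0.6])

scenario = wg_mean_temp + monthly_dT_per_degC * delta_TG
print(np.round(scenario, 1))    # perturbed WG parameters for the future run
```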

  13. Linking changes in epithelial morphogenesis to cancer mutations using computational modeling.

    Directory of Open Access Journals (Sweden)

    Katarzyna A Rejniak

    2010-08-01

    Full Text Available Most tumors arise from epithelial tissues, such as mammary glands and lobules, and their initiation is associated with the disruption of a finely defined epithelial architecture. Progression from intraductal to invasive tumors is related to genetic mutations that occur at a subcellular level but manifest themselves as functional and morphological changes at the cellular and tissue scales, respectively. Elevated proliferation and loss of epithelial polarization are the two most noticeable changes in cell phenotypes during this process. As a result, many three-dimensional cultures of tumorigenic clones show highly aberrant morphologies when compared to regular epithelial monolayers enclosing the hollow lumen (acini. In order to shed light on phenotypic changes associated with tumor cells, we applied the bio-mechanical IBCell model of normal epithelial morphogenesis quantitatively matched to data acquired from the non-tumorigenic human mammary cell line, MCF10A. We then used a high-throughput simulation study to reveal how modifications in model parameters influence changes in the simulated architecture. Three parameters have been considered in our study, which define cell sensitivity to proliferative, apoptotic and cell-ECM adhesive cues. By mapping experimental morphologies of four MCF10A-derived cell lines carrying different oncogenic mutations onto the model parameter space, we identified changes in cellular processes potentially underlying structural modifications of these mutants. As a case study, we focused on MCF10A cells expressing an oncogenic mutant HER2-YVMA to quantitatively assess changes in cell doubling time, cell apoptotic rate, and cell sensitivity to ECM accumulation when compared to the parental non-tumorigenic cell line. By mapping in vitro mutant morphologies onto in silico ones we have generated a means of linking the morphological and molecular scales via computational modeling. Thus, IBCell in combination with 3D acini

  14. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data by minimizing production costs while observing reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of the capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirements and buildup resulting from policy and incentives. This has motivated us to bridge CEM with PCM by building a capacity-expansion-to-production-cost-model Linking Tool (CEPCoLT). The Linking Tool maps capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.
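    One core task of such a linking tool is translating aggregate CEM capacity decisions into the unit-level inputs a PCM expects. The sketch below illustrates that step only; it is a hypothetical stand-in, not CEPCoLT's actual workflow, and the technology names and typical unit sizes are invented.

```python
# Disaggregate CEM capacity (by region and technology) into PCM-sized units.
from dataclasses import dataclass

@dataclass
class Unit:
    region: str
    tech: str
    capacity_mw: float

TYPICAL_UNIT_SIZE_MW = {"gas_cc": 400.0, "wind": 100.0, "pv": 50.0}  # hypothetical

def disaggregate(cem_solution: dict[tuple[str, str], float]) -> list[Unit]:
    """Split aggregate CEM capacity into individual generating units."""
    units = []
    for (region, tech), cap in cem_solution.items():
        size = TYPICAL_UNIT_SIZE_MW[tech]
        n_full, remainder = divmod(cap, size)
        units += [Unit(region, tech, size) for _ in range(int(n_full))]
        if remainder > 1e-6:               # keep the leftover as a smaller unit
            units.append(Unit(region, tech, remainder))
    return units

print(disaggregate({("west", "gas_cc"): 1000.0, ("west", "pv"): 120.0}))
```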

  15. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    Nov 16, 2017 ... enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  16. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    forecasting of quantitative snowfall at 10 meteorological stations in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. At these stations of the Snow and Avalanche Study Establishment (SASE), snow and meteorological data have been recorded twice daily at 08:30 and 17:30 hrs for more than four decades ...

  17. A Transformative Model for Undergraduate Quantitative Biology Education

    Science.gov (United States)

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…

  18. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.

  19. ASSETS MANAGEMENT - A CONCEPTUAL MODEL DECOMPOSING VALUE FOR THE CUSTOMER AND A QUANTITATIVE MODEL

    Directory of Open Access Journals (Sweden)

    Susana Nicola

    2015-03-01

    Full Text Available In this paper we describe the application of a modeling framework, the so-called Conceptual Model Decomposing Value for the Customer (CMDVC), in a footwear industry case study, to ascertain the usefulness of this approach. Value networks were used to identify the participants, the tangible and intangible deliverables, the endogenous and exogenous assets, and to analyse their interactions as the indication for an adequate value proposition. The quantitative model of benefits and sacrifices, using the Fuzzy AHP method, enables the discussion of how the CMDVC can be applied and used in the enterprise environment and provided new relevant relations between perceived benefits (PBs).

  20. Quantitative and predictive model of kinetic regulation by E. coli TPP riboswitches.

    Science.gov (United States)

    Guedich, Sondés; Puffer-Enders, Barbara; Baltzinger, Mireille; Hoffmann, Guillaume; Da Veiga, Cyrielle; Jossinet, Fabrice; Thore, Stéphane; Bec, Guillaume; Ennifar, Eric; Burnouf, Dominique; Dumas, Philippe

    2016-01-01

    Riboswitches are non-coding elements upstream or downstream of mRNAs that, upon binding of a specific ligand, regulate transcription and/or translation initiation in bacteria, or alternative splicing in plants and fungi. We have studied thiamine pyrophosphate (TPP) riboswitches regulating translation of the thiM operon and transcription and translation of the thiC operon in E. coli, and that of THIC in the plant A. thaliana. For all three, we ascertained an induced-fit mechanism involving initial binding of the TPP followed by a conformational change leading to a higher-affinity complex. The experimental values obtained for all kinetic and thermodynamic parameters of TPP binding imply that regulation by the A. thaliana riboswitch is governed by the mass-action law, whereas it is of a kinetic nature for the two bacterial riboswitches. Kinetic regulation requires that the RNA polymerase pause after synthesis of each riboswitch aptamer to leave time for TPP binding, but only when its concentration is sufficient. A quantitative model of regulation highlighted how the pausing time has to be linked to the kinetic rates of initial TPP binding to obtain an ON/OFF switch in the correct concentration range of TPP. We verified the existence of these pauses and the model's prediction of their duration. Our analysis also led to quantitative estimates of the respective efficiencies of kinetic and thermodynamic regulation, which show that kinetically regulated riboswitches react more sharply to concentration variations of their ligand than thermodynamically regulated riboswitches. This rationalizes the interest of kinetic regulation and confirms empirical observations that were obtained by numerical simulations.
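    The kinetic-regulation argument can be made concrete with a toy induced-fit scheme. The sketch below (hypothetical rate constants, not the paper's fitted values) integrates R + TPP ⇌ R·TPP → R*·TPP over a polymerase pause of fixed duration: the fraction reaching the committed state R*·TPP, and hence the ON/OFF outcome, depends jointly on the pause length and the initial binding kinetics.

```python
# Toy induced-fit kinetics over a polymerase pause (illustrative only).
from scipy.integrate import solve_ivp

k_on, k_off, k_conf = 1e5, 0.5, 0.1   # 1/(M*s), 1/s, 1/s (hypothetical)

def induced_fit(t, y, tpp):
    R, RL, RLstar = y                  # free, initial complex, committed complex
    dR = -k_on * tpp * R + k_off * RL
    dRL = k_on * tpp * R - (k_off + k_conf) * RL
    dRLstar = k_conf * RL
    return [dR, dRL, dRLstar]

def switched_fraction(tpp: float, pause_s: float) -> float:
    sol = solve_ivp(induced_fit, (0.0, pause_s), [1.0, 0.0, 0.0], args=(tpp,))
    return sol.y[2, -1]

for tpp in (1e-7, 1e-6, 1e-5):        # ligand concentrations, M
    print(f"[TPP]={tpp:.0e} M -> committed fraction {switched_fraction(tpp, 60.0):.2f}")
```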

  1. Linking river management to species conservation using dynamic landscape scale models

    Science.gov (United States)

    Freeman, Mary C.; Buell, Gary R.; Hay, Lauren E.; Hughes, W. Brian; Jacobson, Robert B.; Jones, John W.; Jones, S.A.; LaFontaine, Jacob H.; Odom, Kenneth R.; Peterson, James T.; Riley, Jeffrey W.; Schindler, J. Stephen; Shea, C.; Weaver, J.D.

    2013-01-01

    Efforts to conserve stream and river biota could benefit from tools that allow managers to evaluate landscape-scale changes in species distributions in response to water management decisions. We present a framework and methods for integrating hydrology, geographic context and metapopulation processes to simulate effects of changes in streamflow on fish occupancy dynamics across a landscape of interconnected stream segments. We illustrate this approach using a 482 km² catchment in the southeastern US supporting 50 or more stream fish species. A spatially distributed, deterministic and physically based hydrologic model is used to simulate daily streamflow for sub-basins composing the catchment. We use geographic data to characterize stream segments with respect to channel size, confinement, position and connectedness within the stream network. Simulated streamflow dynamics are then applied to model fish metapopulation dynamics in stream segments, using hypothesized effects of streamflow magnitude and variability on population processes, conditioned by channel characteristics. The resulting time series simulate spatially explicit, annual changes in species occurrences or assemblage metrics (e.g. species richness) across the catchment as outcomes of management scenarios. Sensitivity analyses using alternative, plausible links between streamflow components and metapopulation processes, or allowing for alternative modes of fish dispersal, demonstrate large effects of ecological uncertainty on model outcomes and highlight needed research and monitoring. Nonetheless, with uncertainties explicitly acknowledged, dynamic, landscape-scale simulations may prove useful for quantitatively comparing river management alternatives with respect to species conservation.
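    A minimal sketch of the kind of streamflow-conditioned occupancy update described above follows. It is not the authors' calibrated model; the persistence and colonization parameters are hypothetical and would in practice be functions of simulated flow metrics and channel characteristics.

```python
# Annual occupancy update for interconnected stream segments (illustrative).
import random

def annual_update(occupied, neighbours, persistence, colonization):
    """One annual transition of segment occupancy states."""
    new_state = {}
    for seg, occ in occupied.items():
        if occ:
            # Persistence could decrease with simulated flow variability.
            new_state[seg] = random.random() < persistence[seg]
        else:
            # Colonization from each occupied neighbouring segment.
            sources = sum(occupied[n] for n in neighbours[seg])
            p_col = 1.0 - (1.0 - colonization) ** sources
            new_state[seg] = random.random() < p_col
    return new_state

occupied = {"s1": True, "s2": False, "s3": False}
neighbours = {"s1": ["s2"], "s2": ["s1", "s3"], "s3": ["s2"]}
persistence = {"s1": 0.9, "s2": 0.7, "s3": 0.6}   # hypothetical
for year in range(5):
    occupied = annual_update(occupied, neighbours, persistence, colonization=0.3)
print(occupied)
```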

  2. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: broadening the matrix approach.

    NARCIS (Netherlands)

    Grootel, L. van; Wesel, F. van; O'Mara-Eves, A.; Thomas, J.; Hox, J.; Boeije, H.

    2017-01-01

    Background: This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the

  3. Network Formation Models With Costs for Establishing Links

    NARCIS (Netherlands)

    Slikker, M.; van den Nouweland, C.G.A.M.

    1999-01-01

    In this paper we study endogenous formation of communication networks in situations where the economic possibilities of groups of players can be described by a cooperative game. We concentrate on the influence that the existence of costs for establishing communication links has on the communication

  4. Mathematical Modelling of a Two – Link Planar Manipulator Arm ...

    African Journals Online (AJOL)

    ... transformation matrices relating the gripper's (generally known as the end-effector) frame to the base/reference frame have been derived using the Denavit–Hartenberg matrix. Keywords: Manipulator Arm, Frame, Link, Joint, End-Effector, Mapping, Kinematics. Journal of the Nigerian Association of Mathematical Physics, ...

  5. A Review of Quantitative Situation Assessment Models for Nuclear Power Plant Operators

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Situation assessment is the process of developing situation awareness, and situation awareness is defined as 'the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.' Situation awareness is an important element influencing human actions because human decision making is based on the result of situation assessment or situation awareness. There are many models of situation awareness, and they can be categorized as qualitative or quantitative. Because the effects of input factors on situation awareness can be investigated through quantitative models, these are more useful than qualitative models for the design of operator interfaces, automation strategies, training programs, and so on. This study presents a review of two quantitative models of situation assessment (SA) for nuclear power plant operators

  6. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Science.gov (United States)

    2011-05-18

    ... COMMISSION NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection... issued for public comment a document entitled: NUREG/CR-XXXX, ``Development of Quantitative Software... development of regulatory guidance for using risk information related to digital systems in the licensing...

  7. Dynamics of childhood growth and obesity development and validation of a quantitative mathematical model

    Science.gov (United States)

    Clinicians and policy makers need the ability to predict quantitatively how childhood bodyweight will respond to obesity interventions. We developed and validated a mathematical model of childhood energy balance that accounts for healthy growth and development of obesity, and that makes quantitative...

  8. Linear regression models for quantitative assessment of left ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-07-04

    Jul 4, 2008 ... computed. Linear regression models for the prediction of left ventricular structures were established. Prediction models for ... study aimed at establishing linear regression models that could be used in the prediction ..... Is white coat hypertension associated with arterial disease or left ventricular hypertrophy?

  9. What should a quantitative model of masking look like and why would we want it?

    Science.gov (United States)

    Francis, Gregory

    2008-07-15

    Quantitative models of backward masking appeared almost as soon as computing technology was available to simulate them, and continued interest in masking has led to the development of new models. Despite this long history, the impact of the models on the field has been limited because they have fundamental shortcomings. This paper discusses these shortcomings and outlines what future quantitative models should look like. It also discusses several issues about modeling and how a model could be used by researchers to better explore masking and other aspects of cognition.

  10. Process-based monitoring and modeling of Karst springs - Linking intrinsic to specific vulnerability.

    Science.gov (United States)

    Epting, Jannis; Page, Rebecca M; Auckenthaler, Adrian; Huggenberger, Peter

    2018-06-01

    The presented work illustrates to what extent field investigations as well as monitoring and modeling approaches are necessary to understand the high discharge dynamics and vulnerability of Karst springs. In complex settings the application of 3D geological models is essential for evaluating the vulnerability of Karst systems. They allow deriving information on catchment characteristics, such as the geometry of aquifers and aquitards as well as their displacements along faults. A series of Karst springs in northwestern Switzerland were compared and Karst system dynamics with respect to qualitative and quantitative issues were evaluated. The main objective of the studies was to combine information on catchment characteristics and data from novel monitoring systems (physicochemical and microbiological parameters) with simulated spring discharges derived from numerical modeling (linear storage models) in order to assess the intrinsic vulnerability of Karst springs to microbiological contamination. The numerically derived relation of fast and slow groundwater flow components enabled us to relate different sources of groundwater recharge and to characterize the dynamics of the Karst springs. Our study illustrates that comparably simple model setups were able to reproduce the overall dynamic intrinsic vulnerability of several Karst systems and that one of the most important processes involved was the temporal variation of groundwater recharge (precipitation, evapotranspiration and snowmelt). Furthermore, we make a first attempt at linking intrinsic to specific vulnerability of Karst springs, which involves activities within the catchment area such as human impacts from agriculture and settlements. Likewise, a more detailed representation of system dynamics made it possible to consider the influence of surface water infiltrating into the Karst system, which is impacted by release events from storm sewers. Overall, we demonstrate that our approach can be the basis for a more flexible and
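    The linear storage models mentioned above can be illustrated with a minimal two-reservoir sketch (assumed structure and hypothetical coefficients): spring discharge is the sum of a fast (conduit) and a slow (matrix) component, each draining linearly as Q = kS, and the fast/slow split is what carries the vulnerability signal.

```python
# Two parallel linear reservoirs for Karst spring discharge (illustrative).
def simulate_spring(recharge, k_fast=0.5, k_slow=0.05, split=0.6):
    """Daily spring discharge from fast (conduit) and slow (matrix) storages."""
    s_fast = s_slow = 0.0
    discharge = []
    for r in recharge:                     # daily recharge, e.g. P - ET (mm)
        s_fast += split * r                # share routed to the conduit system
        s_slow += (1.0 - split) * r        # share routed to the matrix
        q_fast, q_slow = k_fast * s_fast, k_slow * s_slow   # Q = k * S
        s_fast -= q_fast
        s_slow -= q_slow
        discharge.append(q_fast + q_slow)
    return discharge

print([round(q, 2) for q in simulate_spring([10, 0, 0, 5, 0, 0, 0])])
```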

  11. Dynamic root growth and architecture responses to limiting nutrient availability: linking physiological models and experimentation.

    Science.gov (United States)

    Postma, Johannes A; Schurr, Ulrich; Fiorani, Fabio

    2014-01-01

    In recent years the study of root phenotypic plasticity in response to sub-optimal environmental factors and the genetic control of these responses have received renewed attention. As a path to increased productivity, in particular for low fertility soils, several applied research projects worldwide target the improvement of crop root traits both in plant breeding and biotechnology contexts. To assist these tasks and address the challenge of optimizing root growth and architecture for enhanced mineral resource use, the development of realistic simulation models is of great importance. We review this research field from a modeling perspective focusing particularly on nutrient acquisition strategies for crop production on low nitrogen and low phosphorous soils. Soil heterogeneity and the dynamics of nutrient availability in the soil pose a challenging environment in which plants have to forage efficiently for nutrients in order to maintain their internal nutrient homeostasis throughout their life cycle. Mathematical models assist in understanding plant growth strategies and associated root phenes that have potential to be tested and introduced in physiological breeding programs. At the same time, we stress that it is necessary to carefully consider model assumptions and development from a whole plant-resource allocation perspective and to introduce or refine modules simulating explicitly root growth and architecture dynamics through ontogeny with reference to key factors that constrain root growth. In this view it is important to understand negative feedbacks such as plant-plant competition. We conclude by briefly touching on available and developing technologies for quantitative root phenotyping from lab to field, from quantification of partial root profiles in the field to 3D reconstruction of whole root systems. Finally, we discuss how these approaches can and should be tightly linked to modeling to explore the root phenome. © 2013.

  12. Spatial variability in compartmental fate modelling : Linking fugacity models and GIS.

    Science.gov (United States)

    Wania, F

    1996-03-01

    A new approach is presented which is designed to address the spatial heterogeneity of the environment in compartmental mass balance models of chemical fate in the environment. It rests on the assumption of chemical equilibration within one phase despite prevailing environmental heterogeneity. Composite D- and Z-values are derived from sub-unit-specific environmental parameters and are used to solve mass balance equations which can be adopted essentially unchanged from existing compartmental fugacity models. With the resulting common fugacity value for each compartment, sub-unit-specific concentrations and process rates can be calculated. The approach is illustrated using the QWASI lake model to calculate the fate of hexachlorobenzene in a hypothetical lake subdivided into four distinct sub-units. The approach allows the subdivision of each compartment into a large number of sub-units with distinct environmental characteristics without substantially increasing model complexity. This is a necessary condition for linking fugacity models to geographical information systems.
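    The composite-capacity idea can be sketched as follows (hypothetical numbers; Z in mol m⁻³ Pa⁻¹): sub-unit Z-values are volume-weighted into a composite Z for the compartment, one common fugacity f is obtained from the compartment mass balance, and sub-unit-specific concentrations then follow as C_i = Z_i · f.

```python
# Composite fugacity capacity from spatial sub-units (illustrative numbers).
def composite_Z(subunits):
    """subunits: list of (volume_m3, Z_mol_per_m3_Pa)."""
    total_v = sum(v for v, _ in subunits)
    return sum(v * z for v, z in subunits) / total_v

water_subunits = [(1e6, 0.10), (5e5, 0.25), (2e6, 0.08), (8e5, 0.15)]
Z_comp = composite_Z(water_subunits)

total_mass_mol = 50.0
V_total = sum(v for v, _ in water_subunits)
f = total_mass_mol / (V_total * Z_comp)        # common fugacity (Pa)

for i, (v, z) in enumerate(water_subunits, 1):
    print(f"sub-unit {i}: C = {z * f:.3e} mol/m3")
```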

  13. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time performance.

  14. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  15. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  16. Linear regression models for quantitative assessment of left ...

    African Journals Online (AJOL)

    Changes in left ventricular structures and function have been reported in cardiomyopathies. No prediction models have been established in this environment. This study established regression models for prediction of left ventricular structures in normal subjects. A sample of normal subjects was drawn from a large urban ...

  17. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    used to simulate large-scale atmospheric circulation patterns and for determining the effect of changes ... to simulate precipitation and snow cover over the Himalaya. Though this model underestimated pre- ... Wilks D and Wilby R 1999 The weather generation game: A review of stochastic weather models; Progr. Phys.

  18. A Quantitative Causal Model Theory of Conditional Reasoning

    Science.gov (United States)

    Fernbach, Philip M.; Erb, Christopher D.

    2013-01-01

    The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…

  19. Analysis of Water Conflicts across Natural and Societal Boundaries: Integration of Quantitative Modeling and Qualitative Reasoning

    Science.gov (United States)

    Gao, Y.; Balaram, P.; Islam, S.

    2009-12-01

    Water issues and problems have bewildered humankind for a long time, yet a systematic approach for understanding such issues remains elusive. This is partly because many water-related problems are framed from a contested terrain in which many actors (individuals, communities, businesses, NGOs, states, and countries) compete to protect their own and often conflicting interests. We argue that the origin of many water problems may be understood as a dynamic consequence of competition, interconnections, and feedback among variables in the Natural and Societal Systems (NSSs). Within the natural system, we recognize that triple constraints on water (water quantity (Q), water quality (P), and ecosystem (E)) and their interdependencies and feedback may lead to conflicts. Such inherent and multifaceted constraints of the natural water system are often exacerbated at societal boundaries. Within the societal system, interdependencies and feedback among values and norms (V), economy (C), and governance (G) interact in various ways to create intractable contextual differences. The observation that natural and societal systems are linked is not novel. Our argument here, however, is that rigid disciplinary boundaries between these two domains will not produce solutions to the water problems we are facing today. The knowledge needed to address water problems needs to go beyond scientific assessment, in which societal variables (C, G, and V) are treated as exogenous or largely ignored, and beyond policy research that does not consider the impact of natural variables (E, P, and Q) and the coupling among them. Consequently, traditional quantitative methods alone are not appropriate to address the dynamics of water conflicts, because we cannot quantify the societal variables and the exact mathematical relationships among the variables are not fully known. On the other hand, conventional qualitative study in the societal domain has mainly been in the form of individual case studies and therefore

  20. Quantitative proteomics links metabolic pathways to specific developmental stages of the plant-pathogenic oomycete Phytophthora capsici.

    Science.gov (United States)

    Pang, Zhili; Srivastava, Vaibhav; Liu, Xili; Bulone, Vincent

    2017-04-01

    The oomycete Phytophthora capsici is a plant pathogen responsible for important losses to vegetable production worldwide. Its asexual reproduction plays an important role in the rapid propagation and spread of the disease in the field. A global proteomics study was conducted to compare two key asexual life stages of P. capsici, i.e. the mycelium and cysts, to identify stage-specific biochemical processes. A total of 1200 proteins was identified using qualitative and quantitative proteomics. The transcript abundance of some of the enriched proteins was also analysed by quantitative real-time polymerase chain reaction. Seventy-three proteins exhibited different levels of abundance between the mycelium and cysts. The proteins enriched in the mycelium are mainly associated with glycolysis, the tricarboxylic acid (or citric acid) cycle and the pentose phosphate pathway, providing the energy required for the biosynthesis of cellular building blocks and hyphal growth. In contrast, the proteins that are predominant in cysts are essentially involved in fatty acid degradation, suggesting that the early infection stage of the pathogen relies primarily on fatty acid degradation for energy production. The data provide a better understanding of P. capsici biology and suggest potential metabolic targets at the two different developmental stages for disease control. © 2016 BSPP AND JOHN WILEY & SONS LTD.

  1. Digital clocks: simple Boolean models can quantitatively describe circadian systems.

    Science.gov (United States)

    Akman, Ozgur E; Watterson, Steven; Parton, Andrew; Binns, Nigel; Millar, Andrew J; Ghazal, Peter

    2012-09-07

    The gene networks that comprise the circadian clock modulate biological function across a range of scales, from gene expression to performance and adaptive behaviour. The clock functions by generating endogenous rhythms that can be entrained to the external 24-h day-night cycle, enabling organisms to optimally time biochemical processes relative to dawn and dusk. In recent years, computational models based on differential equations have become useful tools for dissecting and quantifying the complex regulatory relationships underlying the clock's oscillatory dynamics. However, optimizing the large parameter sets characteristic of these models places intense demands on both computational and experimental resources, limiting the scope of in silico studies. Here, we develop an approach based on Boolean logic that dramatically reduces the parametrization, making the state and parameter spaces finite and tractable. We introduce efficient methods for fitting Boolean models to molecular data, successfully demonstrating their application to synthetic time courses generated by a number of established clock models, as well as experimental expression levels measured using luciferase imaging. Our results indicate that despite their relative simplicity, logic models can (i) simulate circadian oscillations with the correct, experimentally observed phase relationships among genes and (ii) flexibly entrain to light stimuli, reproducing the complex responses to variations in daylength generated by more detailed differential equation formulations. Our work also demonstrates that logic models have sufficient predictive power to identify optimal regulatory structures from experimental data. By presenting the first Boolean models of circadian circuits together with general techniques for their optimization, we hope to establish a new framework for the systematic modelling of more complex clocks, as well as other circuits with different qualitative dynamics. In particular, we anticipate
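    To illustrate the flavour of such logic models, the sketch below implements a generic two-gene negative-feedback loop with a light input (not one of the specific clock circuits fitted in the paper): states are binary, updates are synchronous, and the loop oscillates autonomously while entraining to a 12 h : 12 h light-dark cycle.

```python
# Generic Boolean oscillator with light input (illustrative, synchronous updates).
def step(state, light):
    a, b = state["A"], state["B"]
    return {
        "A": (not b) or light,   # A is repressed by B, activated by light
        "B": a,                  # B is activated by A (one-step delay)
    }

state = {"A": True, "B": False}
for hour in range(24):
    light = (hour % 24) < 12             # 12 h light : 12 h dark
    state = step(state, light)
    print(hour, int(state["A"]), int(state["B"]))
```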

  2. Digital clocks: simple Boolean models can quantitatively describe circadian systems

    Science.gov (United States)

    Akman, Ozgur E.; Watterson, Steven; Parton, Andrew; Binns, Nigel; Millar, Andrew J.; Ghazal, Peter

    2012-01-01

    The gene networks that comprise the circadian clock modulate biological function across a range of scales, from gene expression to performance and adaptive behaviour. The clock functions by generating endogenous rhythms that can be entrained to the external 24-h day–night cycle, enabling organisms to optimally time biochemical processes relative to dawn and dusk. In recent years, computational models based on differential equations have become useful tools for dissecting and quantifying the complex regulatory relationships underlying the clock's oscillatory dynamics. However, optimizing the large parameter sets characteristic of these models places intense demands on both computational and experimental resources, limiting the scope of in silico studies. Here, we develop an approach based on Boolean logic that dramatically reduces the parametrization, making the state and parameter spaces finite and tractable. We introduce efficient methods for fitting Boolean models to molecular data, successfully demonstrating their application to synthetic time courses generated by a number of established clock models, as well as experimental expression levels measured using luciferase imaging. Our results indicate that despite their relative simplicity, logic models can (i) simulate circadian oscillations with the correct, experimentally observed phase relationships among genes and (ii) flexibly entrain to light stimuli, reproducing the complex responses to variations in daylength generated by more detailed differential equation formulations. Our work also demonstrates that logic models have sufficient predictive power to identify optimal regulatory structures from experimental data. By presenting the first Boolean models of circadian circuits together with general techniques for their optimization, we hope to establish a new framework for the systematic modelling of more complex clocks, as well as other circuits with different qualitative dynamics. In particular, we

  3. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Science.gov (United States)

    2012-07-17

    ...] Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A... Influenza Disease Models to Quantitatively Evaluate the Benefits and Risks of Vaccines: A Technical Workshop... model to quantitatively estimate the benefits and risks of a hypothetical influenza vaccine, and to seek...

  4. Quantitative modeling of chronic myeloid leukemia: insights from radiobiology

    Science.gov (United States)

    Radivoyevitch, Tomas; Hlatky, Lynn; Landaw, Julian

    2012-01-01

    Mathematical models of chronic myeloid leukemia (CML) cell population dynamics are being developed to improve CML understanding and treatment. We review such models in light of relevant findings from radiobiology, emphasizing 3 points. First, the CML models almost all assert that the latency time, from CML initiation to diagnosis, is at most ∼10 years. Meanwhile, current radiobiologic estimates, based on Japanese atomic bomb survivor data, indicate a substantially higher maximum, suggesting longer-term relapses and extra resistance mutations. Second, different CML models assume different numbers, between 400 and 10⁶, of normal HSCs. Radiobiologic estimates favor values > 10⁶ for the number of normal cells (often assumed to be the HSCs) that are at risk for a CML-initiating BCR-ABL translocation. Moreover, there is some evidence for an HSC dead-band hypothesis, consistent with HSC numbers being very different across different healthy adults. Third, radiobiologists have found that sporadic (background, age-driven) chromosome translocation incidence increases with age during adulthood. BCR-ABL translocation incidence increasing with age would provide a hitherto underanalyzed contribution to observed background adult-onset CML incidence acceleration with age, and would cast some doubt on stage-number inferences from multistage carcinogenesis models in general. PMID:22353999

  5. Conceptual Processes for Linking Eutrophication and Network Models

    Science.gov (United States)

    2006-08-01

    PURPOSE: This three-year study investigates the coupling of eutrophication and network models, applies the results to a specific problem, and ... affect water quality problems such as low dissolved oxygen? No straightforward means of coupling the two modeling approaches is available or ... Dorothy H. Tillman, Dr. Carl F. Cerco, and Mr. Mark R. Noel of the Water Quality and Contaminant Modeling Branch, Environmental Laboratory (EL

  6. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  7. Evaluating quantitative and qualitative models: an application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  8. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    Within the framework of the EU project PRATIQUE (KBBE-2007-212459, Enhancements of Pest Risk Analysis Techniques), a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  9. The place of quantitative energy models in a prospective approach

    International Nuclear Information System (INIS)

    Taverdet-Popiolek, N.

    2009-01-01

    Futurology above all depends on having the right mind set. Gaston Berger summarizes the prospective approach in five main thrusts: prepare for the distant future, be open-minded (have a systems and multidisciplinary approach), carry out in-depth analyses (draw out the actors that are really determinant for the future, as well as established trends), take risks (imagine risky but flexible projects) and finally think about humanity, futurology being a technique at the service of man to help him build a desirable future. On the other hand, forecasting is based on quantified models so as to deduce 'conclusions' about the future. In the field of energy, models are used to draw up scenarios which allow, for instance, measuring the medium- or long-term effects of energy policies on greenhouse gas emissions or global welfare. Scenarios are shaped by the model's inputs (parameters, sets of assumptions) and outputs. Resorting to a model or projecting by scenario is useful in a prospective approach as it ensures coherence for most of the variables that have been identified through systems analysis and that the mind on its own has difficulty grasping. Interpretation of each scenario must be carried out in the light of the underlying framework of assumptions (the backdrop) developed during the prospective stage. When the horizon is far away (very long term), the worlds imagined by the futurologist contain breaks (technological, behavioural and organizational) which are hard to integrate into the models. It is here that the main limit to the use of models in futurology is located. (author)

  10. Improved Mental Acuity Forecasting with an Individualized Quantitative Sleep Model

    Directory of Open Access Journals (Sweden)

    Brent D. Winslow

    2017-04-01

    Full Text Available Sleep impairment significantly alters human brain structure and cognitive function, but available evidence suggests that adults in developed nations are sleeping less. A growing body of research has sought to use sleep to forecast cognitive performance by modeling the relationship between the two, but has generally focused on vigilance rather than other cognitive constructs affected by sleep, such as reaction time, executive function, and working memory. Previous modeling efforts have also utilized subjective, self-reported sleep durations and were restricted to laboratory environments. In the current effort, we addressed these limitations by employing wearable systems and mobile applications to gather objective sleep information, assess multi-construct cognitive performance, and model/predict changes to mental acuity. Thirty participants were recruited for participation in the study, which lasted 1 week. Using the Fitbit Charge HR and a mobile version of the automated neuropsychological assessment metric called CogGauge, we gathered a series of features and utilized the unified model of performance to predict mental acuity based on sleep records. Our results suggest that individuals poorly rate their sleep duration, supporting the need for objective sleep metrics to model circadian changes to mental acuity. Participant compliance in using the wearable throughout the week and responding to the CogGauge assessments was 80%. Specific biases were identified in temporal metrics across mobile devices and operating systems and were excluded from the mental acuity metric development. Individualized prediction of mental acuity consistently outperformed group modeling. This effort indicates the feasibility of creating an individualized, mobile assessment and prediction of mental acuity, compatible with the majority of current mobile devices.

  11. Improved Mental Acuity Forecasting with an Individualized Quantitative Sleep Model.

    Science.gov (United States)

    Winslow, Brent D; Nguyen, Nam; Venta, Kimberly E

    2017-01-01

    Sleep impairment significantly alters human brain structure and cognitive function, but available evidence suggests that adults in developed nations are sleeping less. A growing body of research has sought to use sleep to forecast cognitive performance by modeling the relationship between the two, but has generally focused on vigilance rather than other cognitive constructs affected by sleep, such as reaction time, executive function, and working memory. Previous modeling efforts have also utilized subjective, self-reported sleep durations and were restricted to laboratory environments. In the current effort, we addressed these limitations by employing wearable systems and mobile applications to gather objective sleep information, assess multi-construct cognitive performance, and model/predict changes to mental acuity. Thirty participants were recruited for participation in the study, which lasted 1 week. Using the Fitbit Charge HR and a mobile version of the automated neuropsychological assessment metric called CogGauge, we gathered a series of features and utilized the unified model of performance to predict mental acuity based on sleep records. Our results suggest that individuals poorly rate their sleep duration, supporting the need for objective sleep metrics to model circadian changes to mental acuity. Participant compliance in using the wearable throughout the week and responding to the CogGauge assessments was 80%. Specific biases were identified in temporal metrics across mobile devices and operating systems and were excluded from the mental acuity metric development. Individualized prediction of mental acuity consistently outperformed group modeling. This effort indicates the feasibility of creating an individualized, mobile assessment and prediction of mental acuity, compatible with the majority of current mobile devices.
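    The record above uses the unified model of performance (UMP); as a simplified, hypothetical stand-in, the sketch below combines a homeostatic term that decays with time awake and a circadian term to yield a relative acuity score (the parameter values are illustrative, not the UMP's).

```python
# Simplified two-process-style acuity score (illustrative stand-in for the UMP).
import math

def acuity(hours_awake: float, clock_hour: float,
           tau_decay: float = 18.2, amp: float = 0.12, peak_hour: float = 17.0):
    homeostatic = math.exp(-hours_awake / tau_decay)              # falls while awake
    circadian = amp * math.cos(2 * math.pi * (clock_hour - peak_hour) / 24.0)
    return homeostatic + circadian

for h in (0, 8, 16, 24):                  # wake at 07:00 in this example
    print(f"{h:2d} h awake -> relative acuity {acuity(h, (7 + h) % 24):.2f}")
```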

  12. First principles pharmacokinetic modeling: A quantitative study on Cyclosporin

    DEFF Research Database (Denmark)

    Mošat', Andrej; Lueshen, Eric; Heitzig, Martina

    2013-01-01

    ... renal and hepatic clearances, elimination half-life, and mass transfer coefficients, to establish drug biodistribution dynamics in all organs and tissues. This multi-scale model satisfies first principles and conservation of mass, species and momentum. Prediction of organ drug bioaccumulation as a function of cardiac output, physiology, pathology or administration route may be possible with the proposed PBPK framework. Successful application of our model-based drug development method may lead to more efficient preclinical trials, accelerated knowledge gain from animal experiments, and shortened time-to-market...
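    The flavour of a flow-limited PBPK compartment model can be sketched as follows: each well-stirred organ exchanges drug with blood according to its blood flow Q and a tissue:blood partition coefficient Kp. This is a generic two-organ example with hypothetical flows, volumes, partition coefficients and clearance, not the paper's calibrated Cyclosporin model.

```python
# Generic flow-limited PBPK sketch (hypothetical parameters).
from scipy.integrate import solve_ivp

Q = {"liver": 90.0, "kidney": 70.0}               # blood flow, L/h
V = {"blood": 5.0, "liver": 1.8, "kidney": 0.3}   # volumes, L
Kp = {"liver": 12.0, "kidney": 6.0}               # tissue:blood partitioning
CL_hepatic = 20.0                                  # hepatic clearance, L/h

def rhs(t, y):
    c_b, c_li, c_ki = y                            # concentrations, mg/L
    dli = Q["liver"] * (c_b - c_li / Kp["liver"]) - CL_hepatic * c_li / Kp["liver"]
    dki = Q["kidney"] * (c_b - c_ki / Kp["kidney"])
    db = -(Q["liver"] * (c_b - c_li / Kp["liver"])
           + Q["kidney"] * (c_b - c_ki / Kp["kidney"]))
    return [db / V["blood"], dli / V["liver"], dki / V["kidney"]]

# LSODA handles the fast kidney compartment (small volume, high flow).
sol = solve_ivp(rhs, (0, 24), [10.0, 0.0, 0.0], method="LSODA")
print(sol.y[:, -1])   # blood, liver, kidney concentrations after 24 h
```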

  13. Modeling X-linked ancestral origins in multiparental populations

    NARCIS (Netherlands)

    Zheng, Chaozhi

    2015-01-01

    The models for the mosaic structure of an individual's genome from multiparental populations have been developed primarily for autosomes, whereas X chromosomes receive very little attention. In this paper, we extend our previous approach to model ancestral origin processes along two X chromosomes

  14. Theory and Practice: An Integrative Model Linking Class and Field

    Science.gov (United States)

    Lesser, Joan Granucci; Cooper, Marlene

    2006-01-01

    Social work has evolved over the years taking on the challenges of the times. The profession now espouses a breadth of theoretical approaches and treatment modalities. We have developed a model to help graduate social work students master the skill of integrating theory and social work practice. The Integrative Model has five components: (1) The…

  15. Linking statistical bias description to multiobjective model calibration

    Science.gov (United States)

    Reichert, P.; Schuwirth, N.

    2012-09-01

    In the absence of model deficiencies, simulation results at the correct parameter values lead to an unbiased description of observed data with remaining deviations due to observation errors only. However, this ideal cannot be reached in the practice of environmental modeling, because the required simplified representation of the complex reality by the model and errors in model input lead to errors that are reflected in biased model output. This leads to two related problems: First, ignoring bias of output in the statistical model description leads to bias in parameter estimates, model predictions and, in particular, in the quantification of their uncertainty. Second, as there is no objective choice of how much bias to accept in which output variable, it is not possible to design an "objective" model calibration procedure. The first of these problems has been addressed by introducing a statistical (Bayesian) description of bias, the second by suggesting the use of multiobjective calibration techniques that cannot easily be used for uncertainty analysis. We merge the ideas of these two approaches by using the prior of the statistical bias description to quantify the importance of multiple calibration objectives. This leads to probabilistic inference and prediction while still taking multiple calibration objectives into account. The ideas and technical details of the suggested approach are outlined and a didactical example as well as an application to environmental data are provided to demonstrate its practical feasibility and computational efficiency.

  16. Essays on Quantitative Marketing Models and Monte Carlo Integration Methods

    NARCIS (Netherlands)

    R.D. van Oest (Rutger)

    2005-01-01

    textabstractThe last few decades have led to an enormous increase in the availability of large detailed data sets and in the computing power needed to analyze such data. Furthermore, new models and new computing techniques have been developed to exploit both sources. All of this has allowed for

  17. Quantitative Research: A Dispute Resolution Model for FTC Advertising Regulation.

    Science.gov (United States)

    Richards, Jef I.; Preston, Ivan L.

    Noting the lack of a dispute mechanism for determining whether an advertising practice is truly deceptive without generating the costs and negative publicity produced by traditional Federal Trade Commission (FTC) procedures, this paper proposes a model based upon early termination of the issues through jointly commissioned behavioral research. The…

  18. Quantitative modeling of human performance in complex, dynamic systems

    National Research Council Canada - National Science Library

    Baron, Sheldon; Kruser, Dana S; Huey, Beverly Messick

    1990-01-01

    ... Sheldon Baron, Dana S. Kruser, and Beverly Messick Huey, editors. Panel on Human Performance Modeling, Committee on Human Factors, Commission on Behavioral and Social Sciences and Education, National Research Council. National Academy Press, Washington, D.C., 1990.

  19. A quantitative risk model for early lifecycle decision making

    Science.gov (United States)

    Feather, M. S.; Cornford, S. L.; Dunphy, J.; Hicks, K.

    2002-01-01

    Decisions made in the earliest phases of system development have the most leverage to influence the success of the entire development effort, and yet must be made when information is incomplete and uncertain. We have developed a scalable cost-benefit model to support this critical phase of early-lifecycle decision-making.

  20. Quantitative Comparison Between Crowd Models for Evacuation Planning and Evaluation

    NARCIS (Netherlands)

    Viswanathan, V.; Lee, C.E.; Lees, M.H.; Cheong, S.A.; Sloot, P.M.A.

    2014-01-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we

  1. Linking Time and Space Scales in Distributed Hydrological Modelling - a case study for the VIC model

    Science.gov (United States)

    Melsen, Lieke; Teuling, Adriaan; Torfs, Paul; Zappa, Massimiliano; Mizukami, Naoki; Clark, Martyn; Uijlenhoet, Remko

    2015-04-01

    One of the famous paradoxes of the Greek philosopher Zeno of Elea (~450 BC) is the one with the arrow: if one shoots an arrow and cuts its motion into such small time steps that at every step the arrow is standing still, the arrow is motionless, because a concatenation of non-moving parts does not create motion. Nowadays, this reasoning can be refuted easily, because we know that motion is a change in space over time, which thus by definition depends on both time and space. If one disregards time by cutting it into infinitely small steps, motion is also excluded. This example shows that time and space are linked and therefore hard to evaluate separately. As hydrologists we want to understand and predict the motion of water, which means we have to look both in space and in time. In hydrological models we can account for space by using spatially explicit models. With increasing computational power and increased data availability from e.g. satellites, it has become easier to apply models at a higher spatial resolution. Increasing the resolution of hydrological models is also labelled as one of the 'Grand Challenges' in hydrology by Wood et al. (2011) and Bierkens et al. (2014), who call for global modelling at hyperresolution (~1 km and smaller). A literature survey of 242 peer-reviewed articles in which the Variable Infiltration Capacity (VIC) model was used showed that the spatial resolution at which the model is applied has become finer over the past 17 years: from 0.5 to 2 degrees when the model was just developed, to 1/8 and even 1/32 degree nowadays. On the other hand, the literature survey showed that the time step at which the model is calibrated and/or validated has remained the same over the last 17 years: mainly daily or monthly. Klemeš (1983) stresses the fact that space and time scales are connected, and therefore downscaling the spatial scale would also imply downscaling of the temporal scale. Is it worth the effort of downscaling your model from 1 degree to 1

  2. Quantitative properties of clustering within modern microscopic nuclear models

    International Nuclear Information System (INIS)

    Volya, A.; Tchuvil’sky, Yu. M.

    2016-01-01

    A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question extends substantially the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.

  3. Quantitative modeling of selective lysosomal targeting for drug design

    DEFF Research Database (Denmark)

    Trapp, Stefan; Rosania, G.; Horobin, R.W.

    2008-01-01

    Lysosomes are acidic organelles and are involved in various diseases, the most prominent being malaria. Accumulation of molecules in the cell by diffusion from the external solution into the cytosol, lysosome and mitochondrion was calculated with the Fick–Nernst–Planck equation. The cell model considers the diffusion of neutral and ionic molecules across biomembranes, protonation to mono- or bivalent ions, adsorption to lipids, and electrical attraction or repulsion. Based on simulation results, high and selective accumulation in lysosomes was found for weak mono- and bivalent bases with intermediate to high ... predicted by the model and three were close. Five of the antimalarial drugs were lipophilic weak dibasic compounds. The predicted optimum properties for a selective accumulation of weak bivalent bases in lysosomes are consistent with experimental values and are more accurate than any prior calculation
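    The ion-trapping mechanism behind such lysosomal accumulation can be sketched with simple Henderson–Hasselbalch partitioning for a monobasic amine (a deliberate simplification of the paper's full Fick–Nernst–Planck model): only the neutral form is assumed membrane-permeant, so at steady state the total concentration ratio scales with the reciprocal neutral fractions.

```python
# Ion trapping of a weak base in the acidic lysosome (simplified sketch).
def neutral_fraction(pH: float, pKa: float) -> float:
    """Fraction of a monoprotic base in its neutral (membrane-permeant) form."""
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

def lysosome_to_cytosol_ratio(pKa: float, pH_cyt=7.2, pH_lys=5.0) -> float:
    # Neutral species equilibrates across the membrane; ionised species is trapped.
    return neutral_fraction(pH_cyt, pKa) / neutral_fraction(pH_lys, pKa)

for pKa in (6.0, 8.0, 10.0):
    print(f"pKa {pKa}: accumulation ratio ~ {lysosome_to_cytosol_ratio(pKa):.0f}")
```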

  4. Quantitative Risk Modeling of Fire on the International Space Station

    Science.gov (United States)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  5. AddRemove : A new link model for use in QM/MM studies

    NARCIS (Netherlands)

    Swart, M

    2003-01-01

    The division of a system under study in a quantum mechanical (QM) and a classical system in QM/MM molecular mechanical calculations is sometimes very natural, but a problem arises in the case of bonds crossing the QM/MM boundary. A new link model that uses a capping (link) atom to satisfy the

  6. Comparison of Methods for Modeling a Hydraulic Loader Crane With Flexible Translational Links

    DEFF Research Database (Denmark)

    Pedersen, Henrik Clemmensen; Andersen, Torben O.; Nielsen, Brian K.

    2015-01-01

    not hold for translational links. Hence, special care has to be taken when including flexible translational links. In the current paper, different methods for modeling a hydraulic loader crane with a telescopic arm are investigated and compared using both the finite segment (FS) and AMs method...

  7. Modeling the video distribution link in the Next Generation Optical Access Networks

    DEFF Research Database (Denmark)

    Amaya, F.; Cárdenas, A.; Tafur Monroy, Idelfonso

    2011-01-01

    In this work we present a model for the design and optimization of the video distribution link in the next generation optical access network. We analyze the video distribution performance in a SCM-WDM link, including the noise, the distortion and the fiber optic nonlinearities. Additionally, we...

  8. A link based network route choice model with unrestricted choice set

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Frejinger, Emma; Karlstrom, Anders

    2013-01-01

    This paper considers the path choice problem, formulating and discussing an econometric random utility model for the choice of path in a network with no restriction on the choice set. Starting from a dynamic specification of link choices we show that it is equivalent to a static model...... of the multinomial logit form but with infinitely many alternatives. The model can be consistently estimated and used for prediction in a computationally efficient way. Similarly to the path size logit model, we propose an attribute called link size that corrects utilities of overlapping paths but that is link...... additive. The model is applied to data recording path choices in a network with more than 3000 nodes and 7000 links....
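

    The record's "link size" attribute is link-additive; as a hedged sketch of the underlying idea, the closely related path-size logit correction (not the paper's exact formulation) can be computed on a toy network. All path names, lengths and coefficients below are hypothetical.

    ```python
    import math

    # toy choice set: each path is a list of (link_id, link_length)
    paths = {
        "P1": [("a", 2.0), ("b", 3.0)],
        "P2": [("a", 2.0), ("c", 3.5)],   # overlaps P1 on link "a"
        "P3": [("d", 6.0)],
    }

    # how many paths in the choice set use each link
    usage = {}
    for links in paths.values():
        for link, _ in links:
            usage[link] = usage.get(link, 0) + 1

    def path_size(links):
        # penalises paths that share links with many alternatives
        total = sum(length for _, length in links)
        return sum((length / total) / usage[link] for link, length in links)

    beta_time, beta_ps = -1.0, 1.0
    utilities = {}
    for name, links in paths.items():
        travel_time = sum(length for _, length in links)
        utilities[name] = beta_time * travel_time + beta_ps * math.log(path_size(links))

    denom = sum(math.exp(v) for v in utilities.values())
    for name, v in utilities.items():
        print(name, round(math.exp(v) / denom, 3))  # multinomial logit probabilities
    ```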

  9. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental study in wet laboratory. In this way, natural biochemical systems can be better understood.
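
    A minimal sketch of the quantitative stage described here: simulated annealing tuning kinetic rates so that a candidate model reproduces target behaviour. The one-species model, cooling schedule and step sizes are toy stand-ins, not the paper's biochemical candidates.

    ```python
    import math, random

    def simulate(rates, y0=1.0, dt=0.1, steps=100):
        # toy stand-in for a candidate biochemical model: dy/dt = k1 - k2*y
        k1, k2 = rates
        y, traj = y0, []
        for _ in range(steps):
            y += dt * (k1 - k2 * y)
            traj.append(y)
        return traj

    target = simulate((0.8, 0.5))  # pretend these are the observed dynamics

    def cost(rates):
        return sum((a - b) ** 2 for a, b in zip(simulate(rates), target))

    random.seed(0)
    rates, current, temp = [0.1, 0.1], None, 1.0
    current = cost(rates)
    while temp > 1e-3:
        candidate = [max(1e-6, r + random.gauss(0, 0.05)) for r in rates]
        c = cost(candidate)
        # accept improvements always; accept worse moves with Boltzmann probability
        if c < current or random.random() < math.exp((current - c) / temp):
            rates, current = candidate, c
        temp *= 0.995  # geometric cooling schedule
    print(rates, current)
    ```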

  10. Development of a quantitative NS1-capture enzyme-linked immunosorbent assay for early detection of yellow fever virus infection.

    Science.gov (United States)

    Ricciardi-Jorge, Taissa; Bordignon, Juliano; Koishi, Andrea; Zanluca, Camila; Mosimann, Ana Luiza; Duarte Dos Santos, Claudia Nunes

    2017-11-24

    Yellow fever is an arboviral disease that causes thousands of deaths every year in Africa and the Americas. However, few commercial diagnostic kits are available. Non-structural protein 1 (NS1) is an early marker of several flavivirus infections and is widely used to diagnose dengue virus (DENV) infection. Nonetheless, little is known about the dynamics of Yellow fever virus (YFV) NS1 expression and secretion, to encourage its use in diagnosis. To tackle this issue, we developed a quantitative NS1-capture ELISA specific for YFV using a monoclonal antibody and recombinant NS1 protein. This test was used to quantify NS1 in mosquito and human cell line cultures infected with vaccine and wild YFV strains. Our results showed that NS1 was detectable in the culture supernatants of both cell lines; however, a higher concentration was maintained as cell-associated rather than secreted into the extracellular milieu. A panel of 73 human samples was used to demonstrate the suitability of YFV NS1 as a diagnostic tool, resulting in 80% sensitivity, 100% specificity, a 100% positive predictive value and a 95.5% negative predictive value compared with RT-PCR. Overall, the developed NS1-capture ELISA showed potential as a promising assay for the detection of early YF infection.
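
    The reported sensitivity, specificity, PPV and NPV follow from a standard 2×2 confusion table against the RT-PCR reference. The counts below are hypothetical, chosen only to illustrate the arithmetic; they are not the paper's data.

    ```python
    def diagnostic_metrics(tp, fp, tn, fn):
        return {
            "sensitivity": tp / (tp + fn),  # true positives among RT-PCR positives
            "specificity": tn / (tn + fp),  # true negatives among RT-PCR negatives
            "ppv": tp / (tp + fp),          # positive predictive value
            "npv": tn / (tn + fn),          # negative predictive value
        }

    # hypothetical counts for illustration only (not the paper's 2x2 table)
    print(diagnostic_metrics(tp=12, fp=0, tn=58, fn=3))
    ```

    With zero false positives, specificity and PPV are exactly 100%, matching the pattern reported in the record.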

  11. Identifying Quantitative Trait Loci (QTLs) and Developing Diagnostic Markers Linked to Orange Rust Resistance in Sugarcane (Saccharum spp.).

    Science.gov (United States)

    Yang, Xiping; Islam, Md S; Sood, Sushma; Maya, Stephanie; Hanson, Erik A; Comstock, Jack; Wang, Jianping

    2018-01-01

    Sugarcane (Saccharum spp.) is an important economic crop, contributing up to 80% of table sugar used in the world and has become a promising feedstock for biofuel production. Sugarcane production has been threatened by many diseases, and fungicide applications for disease control have been opted out for sustainable agriculture. Orange rust is one of the major diseases impacting sugarcane production worldwide. Identifying quantitative trait loci (QTLs) and developing diagnostic markers are valuable for breeding programs to expedite release of superior sugarcane cultivars for disease control. In this study, an F1 segregating population derived from a cross between two hybrid sugarcane clones, CP95-1039 and CP88-1762, was evaluated for orange rust resistance in replicated trials. Three QTLs controlling orange rust resistance in sugarcane (qORR109, qORR4 and qORR102) were identified for the first time ever, which can explain 58, 12 and 8% of the phenotypic variation, respectively. We also characterized 1,574 sugarcane putative resistance (R) genes. These sugarcane putative R genes and simple sequence repeats in the QTL intervals were further used to develop diagnostic markers for marker-assisted selection of orange rust resistance. A PCR-based Resistance gene-derived marker, G1, was developed, which showed significant association with orange rust resistance. The putative QTLs and marker developed in this study can be effectively utilized in sugarcane breeding programs to facilitate the selection process, thus contributing to sustainable agriculture through orange rust disease control.

  12. Modeling and control of a hydraulically actuated flexible-prismatic link robot

    International Nuclear Information System (INIS)

    Love, L.; Kress, R.; Jansen, J.

    1996-12-01

    Most of the research related to flexible link manipulators to date has focused on single link, fixed length, single plane of vibration test beds. In addition, actuation has been predominantly based upon electromagnetic motors. Ironically, these elements are rarely found in the existing industrial long reach systems. This manuscript describes a new hydraulically actuated, long reach manipulator with a flexible prismatic link at Oak Ridge National Laboratory (ORNL). Focus is directed towards both modeling and control of hydraulic actuators as well as flexible links that have variable natural frequencies.

  13. Linking Experimental Characterization and Computational Modeling in Microstructural Evolution

    Energy Technology Data Exchange (ETDEWEB)

    Demirel, Melik Cumhar [Univ. of Pittsburgh, PA (United States)

    2002-06-01

    It is known that by controlling microstructural development, desirable properties of materials can be achieved. The main objective of our research is to understand and control interface dominated material properties, and finally, to verify experimental results with computer simulations. In order to accomplish this objective, we studied the grain growth in detail with experimental techniques and computational simulations. We obtained 5170-grain data from an Aluminum film (120 μm thick) with a columnar grain structure from the Electron Backscattered Diffraction (EBSD) measurements. Experimentally obtained starting microstructure and grain boundary properties are input for the three-dimensional grain growth simulation. In the computational model, minimization of the interface energy is the driving force for the grain boundary motion. The computed evolved microstructure is compared with the final experimental microstructure, after annealing at 550 °C. Two different measures were introduced as methods of comparing experimental and computed microstructures. Modeling with anisotropic mobility explains a significant amount of mismatch between experiment and isotropic modeling. We have shown that isotropic modeling has very little predictive value. Microstructural evolution in columnar Aluminum foils can be correctly modeled with anisotropic parameters. We observed a strong similarity between grain growth experiments and anisotropic three-dimensional simulations.

  14. Linking Experimental Characterization and Computational Modeling in Microstructural Evolution

    Energy Technology Data Exchange (ETDEWEB)

    Demirel, Melik Cumhur [Univ. of California, Berkeley, CA (United States)

    2002-06-01

    It is known that by controlling microstructural development, desirable properties of materials can be achieved. The main objective of our research is to understand and control interface dominated material properties, and finally, to verify experimental results with computer simulations. In order to accomplish this objective, we studied the grain growth in detail with experimental techniques and computational simulations. We obtained 5170-grain data from an Aluminum film (120 μm thick) with a columnar grain structure from the Electron Backscattered Diffraction (EBSD) measurements. Experimentally obtained starting microstructure and grain boundary properties are input for the three-dimensional grain growth simulation. In the computational model, minimization of the interface energy is the driving force for the grain boundary motion. The computed evolved microstructure is compared with the final experimental microstructure, after annealing at 550 °C. Two different measures were introduced as methods of comparing experimental and computed microstructures. Modeling with anisotropic mobility explains a significant amount of mismatch between experiment and isotropic modeling. We have shown that isotropic modeling has very little predictive value. Microstructural evolution in columnar Aluminum foils can be correctly modeled with anisotropic parameters. We observed a strong similarity between grain growth experiments and anisotropic three-dimensional simulations.

  15. Characterizing Cognitive Aging in Humans with Links to Animal Models

    Directory of Open Access Journals (Sweden)

    Gene E Alexander

    2012-09-01

    With the population of older adults expected to grow rapidly over the next two decades, it has become increasingly important to advance research efforts to elucidate the mechanisms associated with cognitive aging, with the ultimate goal of developing effective interventions and prevention therapies. Although there has been a vast research literature on the use of cognitive tests to evaluate the effects of aging and age-related neurodegenerative disease, the need for a set of standardized measures to characterize the cognitive profiles specific to healthy aging has been widely recognized. Here we present a review of selected methods and approaches that have been applied in human research studies to evaluate the effects of aging on cognition, including executive function, memory, processing speed, language, and visuospatial function. The effects of healthy aging on each of these cognitive domains are discussed with examples from cognitive/experimental and clinical/neuropsychological approaches. Further, we consider those measures that have clear conceptual and methodological links to tasks currently in use for non-human animal studies of aging, as well as those that have the potential for translation to animal aging research. Having a complementary set of measures to assess the cognitive profiles of healthy aging across species provides a unique opportunity to enhance research efforts for cross-sectional, longitudinal, and intervention studies of cognitive aging. Taking a cross-species, translational approach will help to advance cognitive aging research, leading to a greater understanding of associated neurobiological mechanisms with the potential for developing effective interventions and prevention therapies for age-related cognitive decline.

  16. Improved Visibility of Barrett's Esophagus with Linked Color Imaging: Inter- and Intra-Rater Reliability and Quantitative Analysis.

    Science.gov (United States)

    Takeda, Tsutomu; Nagahara, Akihito; Ishizuka, Kei; Okubo, Shoki; Haga, Keiichi; Suzuki, Maiko; Nakajima, Akihito; Komori, Hiroyuki; Akazawa, Yoichi; Izumi, Kentaro; Matsumoto, Kohei; Ueyama, Hiroya; Shimada, Yuji; Matsumoto, Kenshi; Asaoka, Daisuke; Shibuya, Tomoyoshi; Sakamoto, Naoto; Osada, Taro; Hojo, Mariko; Nojiri, Shuko; Watanabe, Sumio

    2018-01-01

    To evaluate the usefulness of linked color imaging (LCI) and blue LASER imaging (BLI) in Barrett's esophagus (BE) compared with white light imaging (WLI). Five expert and trainee endoscopists compared WLI, LCI, and BLI images obtained from 63 patients with short-segment BE. Physicians assessed visibility as follows: 5 (improved), 4 (somewhat improved), 3 (equivalent), 2 (somewhat decreased), and 1 (decreased). Scores were evaluated to assess visibility. The inter- and intra-rater reliability (intra-class correlation coefficient) of image assessments was also evaluated. Images were objectively evaluated based on L* a* b* color values and color differences (ΔE*) in a CIELAB color space system. Improved visibility compared with WLI was achieved for LCI: 44.4%, BLI: 0% for all endoscopists; LCI: 55.6%, BLI: 1.6% for trainees; and LCI: 47.6%, BLI: 0% for experts. The visibility score of trainees compared with experts was significantly higher for LCI (p = 0.02). Intra- and inter-rater reliability ratings for LCI compared with WLI were "moderate" for trainees, and "moderate-substantial" for experts. The ΔE* revealed statistically significant differences between WLI and LCI. LCI improved the visibility of short-segment BE compared with WLI, especially for trainees, when evaluated both subjectively and objectively. © 2018 S. Karger AG, Basel.

  17. Quantitative variation in obesity-related traits and insulin precursors linked to the OB gene region on human chromosome 7

    Energy Technology Data Exchange (ETDEWEB)

    Duggirala, R.; Stern, M.P.; Reinhart, L.J. [Univ. of Texas Health Science Center, San Antonio, TX (United States)] [and others]

    1996-09-01

    Despite the evidence that human obesity has strong genetic determinants, efforts at identifying specific genes that influence human obesity have largely been unsuccessful. Using the sibship data obtained from 32 low-income Mexican American pedigrees ascertained on a type II diabetic proband and a multipoint variance-components method, we tested for linkage between various obesity-related traits plus associated metabolic traits and 15 markers on human chromosome 7. We found evidence for linkage between markers in the OB gene region and various traits, as follows: D7S514 and extremity skinfolds (LOD = 3.1), human carboxypeptidase A1 (HCPA1) and 32,33-split proinsulin level (LOD = 4.2), and HCPA1 and proinsulin level (LOD = 3.2). A putative susceptibility locus linked to the marker D7S514 explained 56% of the total phenotypic variation in extremity skinfolds. Variation at the HCPA1 locus explained 64% of phenotypic variation in proinsulin level and ~73% of phenotypic variation in split proinsulin concentration, respectively. Weaker evidence for linkage to several other obesity-related traits (e.g., waist circumference, body-mass index, fat mass by bioimpedance, etc.) was observed for a genetic location, which is ~15 cM telomeric to OB. In conclusion, our study reveals that the OB region plays a significant role in determining the phenotypic variation of both insulin precursors and obesity-related traits, at least in Mexican Americans. 66 refs., 3 figs., 4 tabs.
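
    The LOD scores and variance shares quoted here come from variance-components likelihood fits; the arithmetic behind both quantities can be sketched with hypothetical log-likelihoods and variance components (the numbers below are invented for illustration).

    ```python
    import math

    def lod_score(loglik_linked, loglik_null):
        """LOD = log10 likelihood ratio of the linked vs. no-linkage model."""
        return (loglik_linked - loglik_null) / math.log(10)

    # hypothetical maximized log-likelihoods from two nested variance-components fits
    print(round(lod_score(-1404.2, -1418.6), 2))  # ~6.25

    # share of phenotypic variance attributed to the QTL in the linked model
    var_qtl, var_polygenic, var_env = 5.6, 2.4, 2.0  # hypothetical components
    print(var_qtl / (var_qtl + var_polygenic + var_env))  # 0.56, cf. the 56% for D7S514
    ```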

  18. Identifying Quantitative Trait Loci (QTLs) and Developing Diagnostic Markers Linked to Orange Rust Resistance in Sugarcane (Saccharum spp.)

    Directory of Open Access Journals (Sweden)

    Xiping Yang

    2018-03-01

    Sugarcane (Saccharum spp.) is an important economic crop, contributing up to 80% of table sugar used in the world and has become a promising feedstock for biofuel production. Sugarcane production has been threatened by many diseases, and fungicide applications for disease control have been opted out for sustainable agriculture. Orange rust is one of the major diseases impacting sugarcane production worldwide. Identifying quantitative trait loci (QTLs) and developing diagnostic markers are valuable for breeding programs to expedite release of superior sugarcane cultivars for disease control. In this study, an F1 segregating population derived from a cross between two hybrid sugarcane clones, CP95-1039 and CP88-1762, was evaluated for orange rust resistance in replicated trials. Three QTLs controlling orange rust resistance in sugarcane (qORR109, qORR4 and qORR102) were identified for the first time ever, which can explain 58, 12 and 8% of the phenotypic variation, respectively. We also characterized 1,574 sugarcane putative resistance (R) genes. These sugarcane putative R genes and simple sequence repeats in the QTL intervals were further used to develop diagnostic markers for marker-assisted selection of orange rust resistance. A PCR-based Resistance gene-derived marker, G1, was developed, which showed significant association with orange rust resistance. The putative QTLs and marker developed in this study can be effectively utilized in sugarcane breeding programs to facilitate the selection process, thus contributing to sustainable agriculture through orange rust disease control.

  19. A Functional Link Between Bir1 and the Saccharomyces cerevisiae Ctf19 Kinetochore Complex Revealed Through Quantitative Fitness Analysis.

    Science.gov (United States)

    Makrantoni, Vasso; Ciesiolka, Adam; Lawless, Conor; Fernius, Josefin; Marston, Adele; Lydall, David; Stark, Michael J R

    2017-09-07

    The chromosomal passenger complex (CPC) is a key regulator of eukaryotic cell division, consisting of the protein kinase Aurora B/Ipl1 in association with its activator (INCENP/Sli15) and two additional proteins (Survivin/Bir1 and Borealin/Nbl1). Here, we report a genome-wide genetic interaction screen in Saccharomyces cerevisiae using the bir1-17 mutant, identifying through quantitative fitness analysis deletion mutations that act as enhancers and suppressors. Gene knockouts affecting the Ctf19 kinetochore complex were identified as the strongest enhancers of bir1-17, while mutations affecting the large ribosomal subunit or the mRNA nonsense-mediated decay pathway caused strong phenotypic suppression. Thus, cells lacking a functional Ctf19 complex become highly dependent on Bir1 function and vice versa. The negative genetic interaction profiles of bir1-17 and the cohesin mutant mcd1-1 showed considerable overlap, underlining the strong functional connection between sister chromatid cohesion and chromosome biorientation. Loss of some Ctf19 components, such as Iml3 or Chl4, impacted differentially on bir1-17 compared with mutations affecting other CPC components: despite the synthetic lethality shown by either iml3∆ or chl4∆ in combination with bir1-17, neither gene knockout showed any genetic interaction with either ipl1-321 or sli15-3. Our data therefore imply a specific functional connection between the Ctf19 complex and Bir1 that is not shared with Ipl1. Copyright © 2017 Makrantoni et al.

  20. The indirect link between perceived parenting and adolescent future orientation : A multiple-step model

    NARCIS (Netherlands)

    Seginer, R.; Vermulst, A.A.; Shoyer, S.

    2004-01-01

    The indirect links between perceived mothers' and fathers' autonomous-accepting parenting and future orientation were examined in a mediational model consisting of five steps: perceived mothers' and fathers' autonomous-accepting parenting, self-evaluation, and the motivational, cognitive

  1. A quantitative confidence signal detection model: 1. Fitting psychometric functions

    Science.gov (United States)

    Yi, Yongwoo

    2016-01-01

    Perceptual thresholds are commonly assayed in the laboratory and clinic. When precision and accuracy are required, thresholds are quantified by fitting a psychometric function to forced-choice data. The primary shortcoming of this approach is that it typically requires 100 trials or more to yield accurate (i.e., small bias) and precise (i.e., small variance) psychometric parameter estimates. We show that confidence probability judgments combined with a model of confidence can yield psychometric parameter estimates that are markedly more precise and/or markedly more efficient than conventional methods. Specifically, both human data and simulations show that including confidence probability judgments for just 20 trials can yield psychometric parameter estimates that match the precision of those obtained from 100 trials using conventional analyses. Such an efficiency advantage would be especially beneficial for tasks (e.g., taste, smell, and vestibular assays) that require more than a few seconds for each trial, but this potential benefit could accrue for many other tasks. PMID:26763777
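
    A minimal sketch of the conventional analysis this record improves upon: fitting a psychometric function to forced-choice data. A cumulative-Gaussian form with a lapse rate is assumed here, and the stimulus levels and proportions are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(x, mu, sigma, lapse):
        # cumulative-Gaussian psychometric function with a symmetric lapse rate
        return lapse + (1 - 2 * lapse) * norm.cdf(x, loc=mu, scale=sigma)

    # hypothetical forced-choice data: stimulus level vs. proportion correct
    levels = np.array([-3, -2, -1, 0, 1, 2, 3], dtype=float)
    p_correct = np.array([0.05, 0.10, 0.35, 0.55, 0.80, 0.92, 0.97])

    params, _ = curve_fit(psychometric, levels, p_correct,
                          p0=[0.0, 1.0, 0.02], bounds=([-5, 0.1, 0], [5, 5, 0.1]))
    print(dict(zip(["mu", "sigma", "lapse"], params)))
    ```

    The record's point is that augmenting such data with confidence probability judgments sharply reduces the number of trials needed for comparably precise estimates of mu and sigma.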

  2. High-response piezoelectricity modeled quantitatively near a phase boundary

    Science.gov (United States)

    Newns, Dennis M.; Kuroda, Marcelo A.; Cipcigan, Flaviu S.; Crain, Jason; Martyna, Glenn J.

    2017-01-01

    Interconversion of mechanical and electrical energy via the piezoelectric effect is fundamental to a wide range of technologies. The discovery in the 1990s of giant piezoelectric responses in certain materials has therefore opened new application spaces, but the origin of these properties remains a challenge to our understanding. A key role is played by the presence of a structural instability in these materials at compositions near the "morphotropic phase boundary" (MPB) where the crystal structure changes abruptly and the electromechanical responses are maximal. Here we formulate a simple, unified theoretical description which accounts for extreme piezoelectric response, its observation at compositions near the MPB, accompanied by ultrahigh dielectric constant and mechanical compliances with rather large anisotropies. The resulting model, based upon a Landau free energy expression, is capable of treating the important domain engineered materials and is found to be predictive while maintaining simplicity. It therefore offers a general and powerful means of accounting for the full set of signature characteristics in these functional materials including volume conserving sum rules and strong substrate clamping effects.
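
    The record's model is based on a Landau free-energy expression; a generic single-order-parameter form with electric-field and strain coupling (an illustrative textbook form, not the paper's exact multi-component functional) is:

    ```latex
    % Generic Landau expansion: polarization P, strain \epsilon, applied field E.
    % a_2(T), a_4, a_6, C and q are illustrative material coefficients.
    F(P,\epsilon) = a_2(T)\,P^{2} + a_4\,P^{4} + a_6\,P^{6}
                  + \tfrac{1}{2}\,C\,\epsilon^{2} - q\,\epsilon\,P^{2} - E\,P
    ```

    Near a composition where the quadratic curvature softens, small fields or stresses produce large changes in P, which is the qualitative origin of the extreme dielectric and piezoelectric responses near the morphotropic phase boundary.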

  3. An integrative model linking feedback environment and organizational citizenship behavior.

    Science.gov (United States)

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  4. Linking density functional and mode coupling models for supercooled liquids

    OpenAIRE

    Premkumar, Leishangthem; Bidhoodi, Neeta; Das, Shankar P.

    2015-01-01

    We compare predictions from two familiar models of the metastable supercooled liquid, respectively constructed with thermodynamic and dynamic approaches. In the so-called density functional theory (DFT) the free energy F[ρ] of the liquid is a functional of the inhomogeneous density ρ(r). The metastable state is identified as a local minimum of F[ρ]. The sharp density profile characterizing ρ(r) is identified as a single particle oscillator, whose frequency is obtained from the parameters of the optimum density function.

  5. Links between fluid mechanics and quantum mechanics: a model for information in economics?

    Science.gov (United States)

    Haven, Emmanuel

    2016-05-28

    This paper tallies the links between fluid mechanics and quantum mechanics, and attempts to show whether those links can aid in beginning to build a formal template which is usable in economics models where time is (a)symmetric and memory is absent or present. An objective of this paper is to contemplate whether those formalisms can allow us to model information in economics in a novel way. © 2016 The Author(s).

  6. Quantitative analysis of crossflow model of the COBRA-IV.1 code

    International Nuclear Information System (INIS)

    Lira, C.A.B.O.

    1983-01-01

    Based on experimental data in a rod bundle test section, the crossflow model of the COBRA-IV.1 code was quantitatively analysed. The analysis showed that it is possible to establish some operational conditions in which the results of the theoretical model are acceptable. (author)

  7. Development of probabilistic models for quantitative pathway analysis of plant pests introduction for the EU territory

    NARCIS (Netherlands)

    Douma, J.C.; Robinet, C.; Hemerik, L.; Mourits, M.C.M.; Roques, A.; Werf, van der W.

    2015-01-01

    The aim of this report is to provide EFSA with probabilistic models for quantitative pathway analysis of plant pest introduction for the EU territory through non-edible plant products or plants. We first provide a conceptualization of two types of pathway models. The individual based PM simulates an

  8. Qualitative to quantitative : linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India

    NARCIS (Netherlands)

    Bailey, Ajay; Hutter, Inge

    2008-01-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million people globally, the epidemic has posed academics the challenge of identifying behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is

  9. Excess Loss Model for Low Elevation Links in Urban Areas for UAVs

    Directory of Open Access Journals (Sweden)

    M. Simunek

    2011-09-01

    In this paper we analyze the link between a UAV and a ground control station in an urban area. This link shows a unique geometry which is somewhere in between the purely terrestrial case (e.g., a macro-cell channel) and the land mobile satellite case (LMS). We describe a measurement campaign which reproduces the UAV link conditions and shows how the excess loss is mainly dependent on the elevation angle and fairly independent of the distance. Finally, we propose a simple physical model for predicting the excess loss based on a combination of diffracted and reflected components. Results from this model are in good agreement with the measurements.
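
    One way to sketch the diffracted component such a model might include is the standard single knife-edge approximation of ITU-R P.526; the rooftop geometry and carrier frequency below are hypothetical, and this is not the paper's fitted model.

    ```python
    import math

    def knife_edge_loss_db(v):
        """ITU-R P.526 single knife-edge approximation (valid for v > -0.78)."""
        if v <= -0.78:
            return 0.0
        return 6.9 + 20 * math.log10(math.sqrt((v - 0.1) ** 2 + 1) + v - 0.1)

    def fresnel_v(h, d1, d2, wavelength):
        # h: obstruction height above the line of sight; d1, d2: distances to the edge
        return h * math.sqrt(2 * (d1 + d2) / (wavelength * d1 * d2))

    # hypothetical rooftop obstruction on a 2 GHz link; lower elevation -> larger h
    lam = 3e8 / 2e9
    for h in (0.0, 2.0, 5.0, 10.0):
        v = fresnel_v(h, d1=50.0, d2=500.0, wavelength=lam)
        print(h, round(knife_edge_loss_db(v), 1), "dB")
    ```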

  10. Modeling water quality, temperature, and flow in Link River, south-central Oregon

    Science.gov (United States)

    Sullivan, Annett B.; Rounds, Stewart A.

    2016-09-09

    The 2.1-km (1.3-mi) Link River connects Upper Klamath Lake to the Klamath River in south-central Oregon. A CE-QUAL-W2 flow and water-quality model of Link River was developed to provide a connection between an existing model of the upper Klamath River and any existing or future models of Upper Klamath Lake. Water-quality sampling at six locations in Link River was done during 2013–15 to support model development and to provide a better understanding of instream biogeochemical processes. The short reach and high velocities in Link River resulted in fast travel times and limited water-quality transformations, except for dissolved oxygen. Reaeration through the reach, especially at the falls in Link River, was particularly important in moderating dissolved oxygen concentrations that at times entered the reach at Link River Dam with marked supersaturation or subsaturation. This reaeration resulted in concentrations closer to saturation downstream at the mouth of Link River.
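
    The reaeration behaviour highlighted here can be caricatured as first-order relaxation of dissolved oxygen toward saturation; CE-QUAL-W2 implements far more complete physics, and the rate constant and concentrations below are made-up illustrations.

    ```python
    import numpy as np

    def reaerate(do0, do_sat, k2, hours):
        # first-order reaeration toward saturation: dDO/dt = k2 * (DO_sat - DO)
        t = np.linspace(0.0, hours, 50)
        return do_sat + (do0 - do_sat) * np.exp(-k2 * t)

    # hypothetical supersaturated inflow (e.g., 12 mg/L vs. 9 mg/L saturation)
    # relaxing over a short travel time with a large reaeration rate at the falls
    print(round(reaerate(do0=12.0, do_sat=9.0, k2=2.0, hours=1.0)[-1], 2))
    ```

    A large k2 over even a short travel time pulls the outflow concentration back toward saturation, which is the qualitative effect the record attributes to the falls.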

  11. A mouse model for nonsyndromic deafness (DFNB12) links hearing loss to defects in tip links of mechanosensory hair cells

    Science.gov (United States)

    Schwander, Martin; Xiong, Wei; Tokita, Joshua; Lelli, Andrea; Elledge, Heather M.; Kazmierczak, Piotr; Sczaniecka, Anna; Kolatkar, Anand; Wiltshire, Tim; Kuhn, Peter; Holt, Jeffrey R.; Kachar, Bechara; Tarantino, Lisa; Müller, Ulrich

    2009-01-01

    Deafness is the most common form of sensory impairment in humans and is frequently caused by single gene mutations. Interestingly, different mutations in a gene can cause syndromic and nonsyndromic forms of deafness, as well as progressive and age-related hearing loss. We provide here an explanation for the phenotypic variability associated with mutations in the cadherin 23 gene (CDH23). CDH23 null alleles cause deaf-blindness (Usher syndrome type 1D; USH1D), whereas missense mutations cause nonsyndromic deafness (DFNB12). In a forward genetic screen, we have identified salsa mice, which suffer from hearing loss due to a Cdh23 missense mutation modeling DFNB12. In contrast to waltzer mice, which carry a CDH23 null allele mimicking USH1D, hair cell development is unaffected in salsa mice. Instead, tip links, which are thought to gate mechanotransduction channels in hair cells, are progressively lost. Our findings suggest that DFNB12 belongs to a new class of disorder that is caused by defects in tip links. We propose that mutations in other genes that cause USH1 and nonsyndromic deafness may also have distinct effects on hair cell development and function. PMID:19270079

  12. Quantitation of pulmonary surfactant protein SP-B in the absence or presence of phospholipids by enzyme-linked immunosorbent assay

    DEFF Research Database (Denmark)

    Oviedo, J M; Valiño, F; Plasencia, I

    2001-01-01

    We have developed an enzyme-linked immunosorbent assay (ELISA) that uses polyclonal or monoclonal anti-surfactant protein SP-B antibodies to quantitate purified SP-B in chloroform/methanol and in chloroform/methanol extracts of whole pulmonary surfactant at nanogram levels. This method has been...... used to explore the effect of the presence of different phospholipids on the immunoreactivity of SP-B. Both polyclonal and monoclonal antibodies produced reproducible ELISA calibration curves for methanolic SP-B solutions with protein concentrations in the range of 20-1000 ng/mL. At these protein...... pronounced changes on the conformation of SP-B when the solvent was evaporated and dry lipid-protein films were formed, a necessary step to expose protein to antibodies in ELISA. Under these conditions, negatively charged lipids, but not zwitterionic ones, induced a marked decrease in the ellipticity of SP-B...

  13. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
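
    A minimal sketch of the sensitivity the record analyses: a single-polygon "vertex model" energy relaxed by explicit Euler steps, where an overly large time step destabilises the relaxation. The energy terms, parameters and step sizes are illustrative, not those of any published implementation.

    ```python
    import numpy as np

    K, GAMMA, A0 = 1.0, 0.1, np.pi  # illustrative area/perimeter moduli and target area

    def energy(verts):
        x, y = verts[:, 0], verts[:, 1]
        area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))  # shoelace
        perim = np.sum(np.linalg.norm(np.roll(verts, -1, axis=0) - verts, axis=1))
        return 0.5 * K * (area - A0) ** 2 + 0.5 * GAMMA * perim ** 2

    def euler_step(verts, dt, eps=1e-6):
        # overdamped dynamics: move each vertex down the (numerical) energy gradient
        grad = np.zeros_like(verts)
        for i in range(len(verts)):
            for j in range(2):
                bumped = verts.copy()
                bumped[i, j] += eps
                grad[i, j] = (energy(bumped) - energy(verts)) / eps
        return verts - dt * grad

    theta = np.linspace(0.0, 2.0 * np.pi, 6, endpoint=False)
    start = 1.5 * np.column_stack([np.cos(theta), np.sin(theta)])  # hexagon, away from rest

    for dt in (0.01, 0.5):  # identical physics, different numerical time step
        verts = start.copy()
        for _ in range(200):
            verts = euler_step(verts, dt)
        print(f"dt={dt}: final energy {energy(verts):.4g}")
    ```

    The small step relaxes smoothly toward the energy minimum, while the large step can overshoot or diverge, illustrating why such implementation parameters must be reported and tested.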

  14. Development of quantitative structure activity relationship (QSAR) model for disinfection byproduct (DBP) research: A review of methods and resources.

    Science.gov (United States)

    Chen, Baiyang; Zhang, Tian; Bond, Tom; Gan, Yiqun

    2015-12-15

    Quantitative structure-activity relationship (QSAR) models are tools for linking chemical activities with molecular structures and compositions. Due to the concern about the proliferating number of disinfection byproducts (DBPs) in water and the associated financial and technical burden, researchers have recently begun to develop QSAR models to investigate the toxicity, formation, property, and removal of DBPs. However, there are no standard procedures or best practices regarding how to develop QSAR models, which potentially limit their wide acceptance. In order to facilitate more frequent use of QSAR models in future DBP research, this article reviews the processes required for QSAR model development, summarizes recent trends in QSAR-DBP studies, and shares some important resources for QSAR development (e.g., free databases and QSAR programs). The paper follows the four steps of QSAR model development, i.e., data collection, descriptor filtration, algorithm selection, and model validation; and finishes by highlighting several research needs. Because QSAR models may have an important role in progressing our understanding of DBP issues, it is hoped that this paper will encourage their future use for this application. Copyright © 2015 Elsevier B.V. All rights reserved.
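
    The four steps the review names can be laid out as a toy workflow. The descriptor matrix below is random stand-in data (a real study would compute molecular descriptors for each DBP), so only the structure of the pipeline is meaningful here.

    ```python
    import numpy as np
    from sklearn.feature_selection import VarianceThreshold
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score, train_test_split

    rng = np.random.default_rng(0)

    # Step 1 -- data collection: stand-in descriptor matrix (rows: DBPs) and endpoint
    X = rng.normal(size=(60, 20))
    y = 1.5 * X[:, 0] - 0.7 * X[:, 3] + rng.normal(scale=0.3, size=60)

    # Step 2 -- descriptor filtration: drop near-constant descriptors
    X = VarianceThreshold(threshold=0.1).fit_transform(X)

    # Step 3 -- algorithm selection: a regularised linear model as a simple baseline
    model = Ridge(alpha=1.0)

    # Step 4 -- model validation: cross-validation plus an external test set
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    print("CV R^2:", cross_val_score(model, X_tr, y_tr, cv=5).mean())
    print("test R^2:", model.fit(X_tr, y_tr).score(X_te, y_te))
    ```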

  15. Development of quantitative structure activity relationship (QSAR) model for disinfection byproduct (DBP) research: A review of methods and resources

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyang, E-mail: poplar_chen@hotmail.com [Harbin Institute of Technology Shenzhen Graduate School, Shenzhen Key Laboratory of Water Resource Utilization and Environmental Pollution Control, Shenzhen 518055 (China); Zhang, Tian [Harbin Institute of Technology Shenzhen Graduate School, Shenzhen Key Laboratory of Water Resource Utilization and Environmental Pollution Control, Shenzhen 518055 (China); Bond, Tom [Department of Civil and Environmental Engineering, Imperial College, London SW7 2AZ (United Kingdom); Gan, Yiqun [Harbin Institute of Technology Shenzhen Graduate School, Shenzhen Key Laboratory of Water Resource Utilization and Environmental Pollution Control, Shenzhen 518055 (China)]

    2015-12-15

    Quantitative structure–activity relationship (QSAR) models are tools for linking chemical activities with molecular structures and compositions. Due to the concern about the proliferating number of disinfection byproducts (DBPs) in water and the associated financial and technical burden, researchers have recently begun to develop QSAR models to investigate the toxicity, formation, property, and removal of DBPs. However, there are no standard procedures or best practices regarding how to develop QSAR models, which potentially limit their wide acceptance. In order to facilitate more frequent use of QSAR models in future DBP research, this article reviews the processes required for QSAR model development, summarizes recent trends in QSAR-DBP studies, and shares some important resources for QSAR development (e.g., free databases and QSAR programs). The paper follows the four steps of QSAR model development, i.e., data collection, descriptor filtration, algorithm selection, and model validation; and finishes by highlighting several research needs. Because QSAR models may have an important role in progressing our understanding of DBP issues, it is hoped that this paper will encourage their future use for this application.

  16. Detection of Prostate Cancer: Quantitative Multiparametric MR Imaging Models Developed Using Registered Correlative Histopathology.

    Science.gov (United States)

    Metzger, Gregory J; Kalavagunta, Chaitanya; Spilseth, Benjamin; Bolan, Patrick J; Li, Xiufeng; Hutter, Diane; Nam, Jung W; Johnson, Andrew D; Henriksen, Jonathan C; Moench, Laura; Konety, Badrinath; Warlick, Christopher A; Schmechel, Stephen C; Koopmeiners, Joseph S

    2016-06-01

    Purpose: To develop multiparametric magnetic resonance (MR) imaging models to generate a quantitative, user-independent, voxel-wise composite biomarker score (CBS) for detection of prostate cancer by using coregistered correlative histopathologic results, and to compare performance of CBS-based detection with that of single quantitative MR imaging parameters. Materials and Methods: Institutional review board approval and informed consent were obtained. Patients with a diagnosis of prostate cancer underwent multiparametric MR imaging before surgery for treatment. All MR imaging voxels in the prostate were classified as cancer or noncancer on the basis of coregistered histopathologic data. Predictive models were developed by using more than one quantitative MR imaging parameter to generate CBS maps. Model development and evaluation of quantitative MR imaging parameters and CBS were performed separately for the peripheral zone and the whole gland. Model accuracy was evaluated by using the area under the receiver operating characteristic curve (AUC), and confidence intervals were calculated with the bootstrap procedure. The improvement in classification accuracy was evaluated by comparing the AUC for the multiparametric model and the single best-performing quantitative MR imaging parameter at the individual level and in aggregate. Results: Quantitative T2, apparent diffusion coefficient (ADC), volume transfer constant (K(trans)), reflux rate constant (kep), and area under the gadolinium concentration curve at 90 seconds (AUGC90) were significantly different between cancer and noncancer voxels. Multiparametric models demonstrated the best performance in both the peripheral zone (AUC, 0.85; P = .010 vs ADC alone) and whole gland (AUC, 0.77; P = .043 vs ADC alone). Individual-level analysis showed statistically significant improvement in AUC in 82% (23 of 28) and 71% (24 of 34) of patients with peripheral-zone and whole-gland models, respectively, compared with ADC alone. Model-based CBS

  17. Linking density functional and mode coupling models for supercooled liquids.

    Science.gov (United States)

    Premkumar, Leishangthem; Bidhoodi, Neeta; Das, Shankar P

    2016-03-28

    We compare predictions from two familiar models of the metastable supercooled liquid, respectively, constructed with thermodynamic and dynamic approaches. In the so-called density functional theory the free energy F[ρ] of the liquid is a functional of the inhomogeneous density ρ(r). The metastable state is identified as a local minimum of F[ρ]. The sharp density profile characterizing ρ(r) is identified as a single particle oscillator, whose frequency is obtained from the parameters of the optimum density function. On the other hand, a dynamic approach to supercooled liquids is taken in the mode coupling theory (MCT), which predicts a sharp ergodicity-non-ergodicity transition at a critical density. The single particle dynamics in the non-ergodic state, treated approximately, represents a propagating mode whose characteristic frequency is computed from the corresponding memory function of the MCT. The mass localization parameters in the above two models (treated in their simplest forms) are obtained, respectively, in terms of the corresponding natural frequencies depicted and are shown to have comparable magnitudes.

  18. Linking density functional and mode coupling models for supercooled liquids

    Energy Technology Data Exchange (ETDEWEB)

    Premkumar, Leishangthem; Bidhoodi, Neeta; Das, Shankar P. [School of Physical Sciences, Jawaharlal Nehru University, New Delhi 110067 (India)

    2016-03-28

    We compare predictions from two familiar models of the metastable supercooled liquid, respectively, constructed with thermodynamic and dynamic approaches. In the so-called density functional theory the free energy F[ρ] of the liquid is a functional of the inhomogeneous density ρ(r). The metastable state is identified as a local minimum of F[ρ]. The sharp density profile characterizing ρ(r) is identified as a single particle oscillator, whose frequency is obtained from the parameters of the optimum density function. On the other hand, a dynamic approach to supercooled liquids is taken in the mode coupling theory (MCT), which predicts a sharp ergodicity-non-ergodicity transition at a critical density. The single particle dynamics in the non-ergodic state, treated approximately, represents a propagating mode whose characteristic frequency is computed from the corresponding memory function of the MCT. The mass localization parameters in the above two models (treated in their simplest forms) are obtained, respectively, in terms of the corresponding natural frequencies depicted and are shown to have comparable magnitudes.

  19. Linking nitrogen deposition to nitrate concentrations in groundwater below nature areas : modelling approach and data requirements

    NARCIS (Netherlands)

    Bonten, L.T.C.; Mol-Dijkstra, J.P.; Wieggers, H.J.J.; Vries, de W.; Pul, van W.A.J.; Hoek, van den K.W.

    2009-01-01

    This study determines the most suitable model and required model improvements to link atmospheric deposition of nitrogen and other elements in the Netherlands to measurements of nitrogen and other elements in the upper groundwater. The deterministic model SMARTml was found to be the most suitable model.

  20. Modelling soil nitrogen: The MAGIC model with nitrogen retention linked to carbon turnover using decomposer dynamics

    International Nuclear Information System (INIS)

    Oulehle, F.; Cosby, B.J.; Wright, R.F.; Hruška, J.; Kopáček, J.; Krám, P.; Evans, C.D.; Moldan, F.

    2012-01-01

    We present a new formulation of the acidification model MAGIC that uses decomposer dynamics to link nitrogen (N) cycling to carbon (C) turnover in soils. The new model is evaluated by application to 15–30 years of water chemistry data at three coniferous-forested sites in the Czech Republic where deposition of sulphur (S) and N have decreased by >80% and 40%, respectively. Sulphate concentrations in waters have declined commensurately with S deposition, but nitrate concentrations have shown much larger decreases relative to N deposition. This behaviour is inconsistent with most conceptual models of N saturation, and with earlier versions of MAGIC which assume N retention to be a first-order function of N deposition and/or controlled by the soil C/N ratio. In comparison with earlier versions, the new formulation more correctly simulates observed short-term changes in nitrate leaching, as well as long-term retention of N in soils. The model suggests that, despite recent deposition reductions and recovery, progressive N saturation will lead to increased future nitrate leaching, ecosystem eutrophication and re-acidification. - Highlights: ► New version of the biogeochemical model MAGIC developed to simulate C/N dynamics. ► New formulation of N retention based directly on the decomposer processes. ► The new formulation simulates observed changes in nitrate leaching and in soil C/N. ► The model suggests progressive N saturation at sites examined. ► The model performance meets a growing need for realistic process-based simulations. - Process-based modelling of nitrogen dynamics and acidification in forest ecosystems.

  1. Interpretation of Quantitative Structure-Activity Relationship Models: Past, Present, and Future.

    Science.gov (United States)

    Polishchuk, Pavel

    2017-11-27

    This paper is an overview of the most significant and impactful interpretation approaches of quantitative structure-activity relationship (QSAR) models, their development, and application. The evolution of the interpretation paradigm from "model → descriptors → (structure)" to "model → structure" is indicated. The latter makes all models interpretable regardless of machine learning methods or descriptors used for modeling. This opens wide prospects for application of corresponding interpretation approaches to retrieve structure-property relationships captured by any models. Issues of separate approaches are discussed as well as general issues and prospects of QSAR model interpretation.

  2. Linking effort and fishing mortality in a mixed fisheries model

    DEFF Research Database (Denmark)

    Thøgersen, Thomas Talund; Hoff, Ayoe; Frost, Hans Staby

    2012-01-01

    Since the implementation of the Common Fisheries Policy of the European Union in 1983, the management of EU fisheries has been enormously challenging. The abundance of many fish stocks has declined because too much fishing capacity has been utilised on healthy fish stocks. Today, this decline...... in fish stocks has led to overcapacity in many fisheries, leading to incentives for overfishing. Recent research has shown that the allocation of effort among fleets can play an important role in mitigating overfishing when the targeting covers a range of species (multi-species—i.e., so-called mixed...... fisheries), while simultaneously optimising the overall economic performance of the fleets. The so-called FcubEcon model, in particular, has elucidated both the biologically and economically optimal method for allocating catches—and thus effort—between fishing fleets, while ensuring that the quotas...

  3. Linking Memories across Time via Neuronal and Dendritic Overlaps in Model Neurons with Active Dendrites

    Directory of Open Access Journals (Sweden)

    George Kastellakis

    2016-11-01

    Memories are believed to be stored in distributed neuronal assemblies through activity-induced changes in synaptic and intrinsic properties. However, the specific mechanisms by which different memories become associated or linked remain a mystery. Here, we develop a simplified, biophysically inspired network model that incorporates multiple plasticity processes and explains linking of information at three different levels: (1) learning of a single associative memory, (2) rescuing of a weak memory when paired with a strong one, and (3) linking of multiple memories across time. By dissecting synaptic from intrinsic plasticity and neuron-wide from dendritically restricted protein capture, the model reveals a simple, unifying principle: linked memories share synaptic clusters within the dendrites of overlapping populations of neurons. The model generates numerous experimentally testable predictions regarding the cellular and sub-cellular properties of memory engrams as well as their spatiotemporal interactions.

  4. Modeling and Representing National Climate Assessment Information using Linked Data

    Science.gov (United States)

    Zheng, J.; Tilmes, C.; Smith, A.; Zednik, S.; Fox, P. A.

    2012-12-01

    Every four years, earth scientists work together on a National Climate Assessment (NCA) report which integrates, evaluates, and interprets the findings of climate change and impacts on affected industries such as agriculture, natural environment, energy production and use, etc. Given the amount of information presented in each report, and the wide range of information sources and topics, it can be difficult for users to find and identify desired information. To ease the user effort of information discovery, well-structured metadata is needed that describes the report's key statements and conclusions and provide for traceable provenance of data sources used. We present an assessment ontology developed to describe terms, concepts and relations required for the NCA metadata. Wherever possible, the assessment ontology reuses terms from well-known ontologies such as Semantic Web for Earth and Environmental Terminology (SWEET) ontology, Dublin Core (DC) vocabulary. We have generated sample National Climate Assessment metadata conforming to our assessment ontology and publicly exposed via a SPARQL-endpoint and website. We have also modeled provenance information for the NCA writing activities using the W3C recommendation-candidate PROV-O ontology. Using this provenance the user will be able to trace the sources of information used in the assessment and therefore make trust decisions. In the future, we are planning to implement a faceted browser over the metadata to enhance metadata traversal and information discovery.
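
    A hedged sketch of how such assessment metadata with PROV-O provenance links might be expressed with rdflib; the namespace, resource names and finding text are invented for illustration and are not the NCA's actual identifiers.

    ```python
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, DCTERMS

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/nca/")  # hypothetical namespace

    g = Graph()
    g.bind("prov", PROV)
    g.bind("dcterms", DCTERMS)

    finding = EX["finding-42"]
    chapter = EX["chapter-agriculture"]
    dataset = EX["noaa-temperature-record"]

    g.add((finding, RDF.type, PROV.Entity))
    g.add((finding, DCTERMS.description, Literal("Example key finding text.")))
    g.add((finding, DCTERMS.isPartOf, chapter))
    g.add((finding, PROV.wasDerivedFrom, dataset))  # provenance link to a source dataset

    print(g.serialize(format="turtle"))
    ```

    Publishing such triples behind a SPARQL endpoint is what lets a reader trace a report statement back to its source data and make the trust decisions the record describes.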

  5. Quantitative modeling of gene networks of biological systems using fuzzy Petri nets and fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raed I. Hamed

    2018-01-01

    Quantitative modelling of biological systems has become an essential computational approach in the design of novel and the analysis of existing biological systems. However, kinetic data that describe the system's dynamics need to be known in order to obtain relevant results with conventional modelling techniques, and such data are often hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modelling approach that can cope with unknown kinetic data and thus produce relevant results even though the kinetic data are incomplete or only vaguely defined. Moreover, the approach can be used in combination with existing state-of-the-art quantitative modelling techniques in selected parts of the system, i.e., where the data are absent. The case study of the approach proposed in this paper is performed on a model of nine genes. We propose a kind of FPN model based on fuzzy sets to handle the quantitative modelling of biological systems. Tests of our model show that it is feasible and quite effective for data representation and reasoning in fuzzy expert systems.
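
    A minimal sketch of fuzzy Petri net reasoning of the kind described: a transition (rule) fires with a truth degree given by the minimum over its input places, scaled by the rule's certainty factor, and output places take the maximum over contributing rules. The two rules and all truth degrees are hypothetical.

    ```python
    # fuzzy truth degrees of the input places (hypothetical gene-expression levels)
    places = {"geneA_high": 0.8, "geneB_high": 0.6, "geneC_high": 0.0}

    transitions = [
        # (input places, output place, certainty factor) -- hypothetical rules
        (("geneA_high", "geneB_high"), "geneC_high", 0.9),
        (("geneB_high",), "geneC_high", 0.5),
    ]

    for inputs, output, cf in transitions:
        degree = min(places[p] for p in inputs) * cf  # max-min inference
        places[output] = max(places[output], degree)

    print(places["geneC_high"])  # 0.54 = min(0.8, 0.6) * 0.9
    ```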

  6. A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon

    Science.gov (United States)

    Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.

    2017-01-01

    The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurements and computational modeling neighborhoods of the Earth-Moon community to ever closer proximity. We are now, however, in the position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.

  7. Implementation of a combined association-linkage model for quantitative traits in linear mixed model procedures of statistical packages

    NARCIS (Netherlands)

    Beem, A. Leo; Boomsma, Dorret I.

    2006-01-01

    A transmission disequilibrium test for quantitative traits, which combines association and linkage analyses, is currently available in several dedicated software packages. We describe how to implement such models in linear mixed model procedures that are available in widely used statistical packages.
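
    One common way to express a combined association-linkage test in a general-purpose mixed-model procedure is a Fulker-style decomposition of each genotype score into between-family and within-family components, with family as a random effect. The sketch below uses statsmodels on simulated data; it illustrates the idea, not the authors' exact specification.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_fam, sibs = 200, 2
    fam = np.repeat(np.arange(n_fam), sibs)
    g = rng.integers(0, 3, size=n_fam * sibs).astype(float)  # additive genotype score

    df = pd.DataFrame({"family": fam, "g": g})
    df["b"] = df.groupby("family")["g"].transform("mean")  # between-family component
    df["w"] = df["g"] - df["b"]                            # within-family component
    df["y"] = 0.5 * df["g"] + rng.normal(size=len(df))     # simulated quantitative trait

    # the within-family effect "w" is the association test robust to stratification
    model = smf.mixedlm("y ~ b + w", df, groups=df["family"]).fit()
    print(model.summary())
    ```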

  8. Pricing of premiums for equity-linked life insurance based on joint mortality models

    Science.gov (United States)

    Riaman; Parmikanti, K.; Irianingsih, I.; Supian, S.

    2018-03-01

    Equity-linked life insurance is a financial product that offers not only protection but also investment. The calculation of equity-linked life insurance premiums generally uses mortality tables. Because of advances in medical technology and reduced birth rates, the use of mortality tables alone has become less relevant for premium calculation. To overcome this problem, we use a combined mortality model, which in this study is determined based on the 2011 Indonesian Mortality Table, to obtain the probabilities of death and survival. In this research, we use a combined mortality model built from the Weibull, Inverse-Weibull, and Gompertz mortality models. After determining the combined mortality model, we numerically calculate the value of the claim to be paid and the premium. By calculating equity-linked life insurance premiums accurately, it is expected that no party will be disadvantaged by errors in the premium calculation.
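
    A minimal sketch of premium calculation under one of the named components, the Gompertz mortality law μ(x) = B·cˣ; the parameters B and c, the interest rate and the benefit are illustrative assumptions, not values from the paper.

    ```python
    import math

    def gompertz_survival(x, t, B=3e-4, c=1.09):
        # P(survive t more years | alive at age x) under mu(x) = B * c**x
        return math.exp(-B * c ** x * (c ** t - 1) / math.log(c))

    def term_insurance_premium(x, n, benefit=1.0, i=0.04, B=3e-4, c=1.09):
        """Net single premium for n-year term insurance paid at end of year of death."""
        v = 1.0 / (1.0 + i)  # annual discount factor
        npv = 0.0
        for k in range(n):
            p_die_in_year_k = (gompertz_survival(x, k, B, c)
                               - gompertz_survival(x, k + 1, B, c))
            npv += benefit * v ** (k + 1) * p_die_in_year_k
        return npv

    print(round(term_insurance_premium(x=40, n=10), 5))
    ```

    A combined model of the kind the record describes would replace the single survival function with a weighted combination of Weibull, Inverse-Weibull and Gompertz survival functions calibrated to the mortality table.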

  9. A Framework for Linking Population Model Development with Ecological Risk Assessment Objectives.

    Science.gov (United States)

    The value of models that link organism-level impacts to the responses of a population in ecological risk assessments (ERAs) has been demonstrated extensively over the past few decades. There is little debate about the utility of these models to translate multiple organism-level…

  10. A Simple Forecasting Model Linking Macroeconomic Policy to Industrial Employment Demand.

    Science.gov (United States)

    Malley, James R.; Hady, Thomas F.

    A study further detailed a model linking monetary and fiscal policy to industrial employment in metropolitan and nonmetropolitan areas of four United States regions. The model was used to simulate the impacts on area and regional employment of three events in the economy: changing real gross national product (GNP) via monetary policy, holding the…

  11. Quantitative modelling of interaction of propafenone with sodium channels in cardiac cells

    Czech Academy of Sciences Publication Activity Database

    Pásek, Michal; Šimurda, J.

    2004-01-01

    Roč. 42, č. 2 (2004), s. 151-157 ISSN 0140-0118 R&D Projects: GA ČR GP204/02/D129 Institutional research plan: CEZ:AV0Z2076919 Keywords : cardiac cell * sodium current block * quantitative modelling Subject RIV: BO - Biophysics Impact factor: 1.070, year: 2004

  12. Quantitative analyses and modelling to support achievement of the 2020 goals for nine neglected tropical diseases

    NARCIS (Netherlands)

    T.D. Hollingsworth (T. Déirdre); E.R. Adams (Emily R.); R.M. Anderson (Roy); K. Atkins (Katherine); S. Bartsch (Sarah); M-G. Basáñez (María-Gloria); M. Behrend (Matthew); D.J. Blok (David); L.A.C. Chapman (Lloyd A. C.); L.E. Coffeng (Luc); O. Courtenay (Orin); R.E. Crump (Ron E.); S.J. de Vlas (Sake); A.P. Dobson (Andrew); L. Dyson (Louise); H. Farkas (Hajnal); A.P. Galvani (Alison P.); M. Gambhir (Manoj); D. Gurarie (David); M.A. Irvine (Michael A.); S. Jervis (Sarah); M.J. Keeling (Matt J.); L. Kelly-Hope (Louise); C. King (Charles); B.Y. Lee (Bruce Y.); E.A. le Rutte (Epke); T.M. Lietman (Thomas M.); M. Ndeffo-Mbah (Martial); G.F. Medley (Graham F.); E. Michael (Edwin); A. Pandey (Abhishek); J.K. Peterson (Jennifer K.); A. Pinsent (Amy); T.C. Porco (Travis C.); J.H. Richardus (Jan Hendrik); L. Reimer (Lisa); K.S. Rock (Kat S.); B.K. Singh (Brajendra K.); W.A. Stolk (Wilma); S. Swaminathan (Subramanian); S.J. Torr (Steve J.); J. Townsend (Jeffrey); J. Truscott (James); M. Walker (Martin); A. Zoueva (Alexandra)

    2015-01-01

    Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an

  13. A quantitative model of the cardiac ventricular cell incorporating the transverse-axial tubular system

    Czech Academy of Sciences Publication Activity Database

    Pásek, Michal; Christé, G.; Šimurda, J.

    2003-01-01

    Roč. 22, č. 3 (2003), s. 355-368 ISSN 0231-5882 R&D Projects: GA ČR GP204/02/D129 Institutional research plan: CEZ:AV0Z2076919 Keywords : cardiac cell * tubular system * quantitative modelling Subject RIV: BO - Biophysics Impact factor: 0.794, year: 2003

  14. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT.

    Science.gov (United States)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R; La Riviere, Patrick J; Alessio, Adam M

    2014-04-07

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml (min g)-1; cardiac output = 3, 5, 8 L min-1). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, while the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated.
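
    For intuition, tissue enhancement in such simulations can be sketched as the convolution of an arterial input function with an exponential impulse residue function. This is a simplified one-compartment stand-in for the models compared above, with an assumed bolus shape and assumed rate constants, not the paper's kinetic model.

        import numpy as np

        t = np.arange(0.0, 60.0, 1.0)                 # 1 s sampling (seconds)
        aif = 100.0 * (t / 8.0) * np.exp(-t / 8.0)    # assumed gamma-variate bolus (HU)
        K1, k2 = 1.0 / 60.0, 0.5 / 60.0               # assumed rate constants (1/s)
        residue = K1 * np.exp(-k2 * t)                # impulse residue function
        tissue = np.convolve(aif, residue)[: t.size]  # dt = 1 s, so no rescaling
        # MBF estimation would then fit K1 (or the curve's upslope) from `tissue`.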

  16. A bibliography of terrain modeling (geomorphometry), the quantitative representation of topography: supplement 4.0

    Science.gov (United States)

    Pike, Richard J.

    2002-01-01

    Terrain modeling, the practice of ground-surface quantification, is an amalgam of Earth science, mathematics, engineering, and computer science. The discipline is known variously as geomorphometry (or simply morphometry), terrain analysis, and quantitative geomorphology. It continues to grow through myriad applications to hydrology, geohazards mapping, tectonics, sea-floor and planetary exploration, and other fields. Dating nominally to the co-founders of academic geography, Alexander von Humboldt (1808, 1817) and Carl Ritter (1826, 1828), the field was revolutionized late in the 20th Century by the computer manipulation of spatial arrays of terrain heights, or digital elevation models (DEMs), which can quantify and portray ground-surface form over large areas (Maune, 2001). Morphometric procedures are implemented routinely by commercial geographic information systems (GIS) as well as specialized software (Harvey and Eash, 1996; Köthe and others, 1996; ESRI, 1997; Drzewiecki et al., 1999; Dikau and Saurer, 1999; Djokic and Maidment, 2000; Wilson and Gallant, 2000; Breuer, 2001; Guth, 2001; Eastman, 2002). The new Earth Surface edition of the Journal of Geophysical Research, specializing in surficial processes, is the latest of many publication venues for terrain modeling. This is the fourth update of a bibliography and introduction to terrain modeling (Pike, 1993, 1995, 1996, 1999) designed to collect the diverse, scattered literature on surface measurement as a resource for the research community. The use of DEMs in science and technology continues to accelerate and diversify (Pike, 2000a). New work appears so frequently that a sampling must suffice to represent the vast literature. This report adds 1636 entries to the 4374 in the four earlier publications. Forty-eight additional entries correct dead Internet links and other errors found in the prior listings. Chronicling the history of terrain modeling, many entries in this report predate the 1999 supplement

  17. Quantitative modeling assesses the contribution of bond strengthening, rebinding and force sharing to the avidity of biomolecule interactions.

    Directory of Open Access Journals (Sweden)

    Valentina Lo Schiavo

    Full Text Available Cell adhesion is mediated by numerous membrane receptors. It is desirable to derive the outcome of a cell-surface encounter from the molecular properties of interacting receptors and ligands. However, conventional parameters such as affinity or kinetic constants are often insufficient to account for receptor efficiency. Avidity is a qualitative concept frequently used to describe biomolecule interactions: this includes incompletely defined properties such as the capacity to form multivalent attachments. The aim of this study is to produce a working description of monovalent attachments formed by a model system, then to measure and interpret the behavior of divalent attachments under force. We investigated attachments between antibody-coated microspheres and surfaces coated with sparse monomeric or dimeric ligands. When bonds were subjected to a pulling force, they exhibited both a force-dependent dissociation consistent with Bell's empirical formula and a force- and time-dependent strengthening well described by a single parameter. Divalent attachments were stronger and less dependent on forces than monovalent ones. The proportion of divalent attachments resisting a force of 30 piconewtons for at least 5 s was 3.7-fold higher than that of monovalent attachments. Quantitative modeling showed that this required rebinding, i.e., additional bond formation between surfaces linked by divalent receptors forming only one bond. Further, experimental data were compatible with but did not require stress sharing between bonds within divalent attachments. Thus many ligand-receptor interactions do not behave as single-step reactions on the millisecond-to-second timescale. Rather, they exhibit progressive stabilization. This explains the high efficiency of multimerized or clustered receptors even when bonds are only subjected to moderate forces. Our approach provides a quantitative way of relating binding avidity to measurable parameters including bond
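
    Bell's empirical formula, referenced above, makes the force dependence concrete: the off-rate grows exponentially with the pulling force. A minimal sketch with assumed, not fitted, parameters:

        import numpy as np

        kB_T = 4.11e-21        # thermal energy at ~298 K (J)
        k0 = 0.2               # assumed spontaneous off-rate (1/s)
        x_b = 0.2e-9           # assumed barrier distance (m)

        def k_off(force_pN):
            # Bell's formula: k_off(F) = k0 * exp(F * x_b / kB_T)
            return k0 * np.exp(force_pN * 1e-12 * x_b / kB_T)

        # Probability that a single bond survives 5 s under a 30 pN pull
        print(np.exp(-k_off(30.0) * 5.0))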

  18. Quantitative detection and biological propagation of scrapie seeding activity in vitro facilitate use of prions as model pathogens for disinfection.

    Directory of Open Access Journals (Sweden)

    Sandra Pritzkow

    Full Text Available Prions are pathogens with an unusually high tolerance to inactivation and constitute a complex challenge to the re-processing of surgical instruments. On the other hand, however, they provide an informative paradigm which has been exploited successfully for the development of novel broad-range disinfectants simultaneously active also against bacteria, viruses and fungi. Here we report on the development of a methodological platform that further facilitates the use of scrapie prions as model pathogens for disinfection. We used specifically adapted serial protein misfolding cyclic amplification (PMCA) for the quantitative detection, on steel wires providing model carriers for decontamination, of 263K scrapie seeding activity converting normal protease-sensitive into abnormal protease-resistant prion protein. Reference steel wires carrying defined amounts of scrapie infectivity were used for assay calibration, while scrapie-contaminated test steel wires were subjected to fifteen different procedures for disinfection that yielded scrapie titre reductions of ≤10^1- to ≥10^5.5-fold. As confirmed by titration in hamsters, the residual scrapie infectivity on test wires could be reliably deduced for all examined disinfection procedures from our quantitative seeding activity assay. Furthermore, we found that scrapie seeding activity present in 263K hamster brain homogenate or multiplied by PMCA of scrapie-contaminated steel wires both triggered accumulation of protease-resistant prion protein and was further propagated in a novel cell assay for 263K scrapie prions, i.e., cerebral glial cell cultures from hamsters. The findings from our PMCA and glial cell culture assays revealed scrapie seeding activity as a biochemically and biologically replicative principle in vitro, with the former being quantitatively linked to prion infectivity detected on steel wires in vivo. When combined, our in vitro assays provide an alternative to titrations of biological

  19. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra whose operational behaviour interacts with a store of constraints, neatly separating product configuration from product behaviour. The resulting probabilistic configurations and behaviour converge seamlessly in a semantics based on DTMCs, thus enabling quantitative analyses ranging from the likelihood of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study.

  20. Potential Investigation of Linking PROSAIL with the Ross-Li BRDF Model for Vegetation Characterization

    Directory of Open Access Journals (Sweden)

    Xiaoning Zhang

    2018-03-01

    Full Text Available Methods that link different models for investigating the retrieval of canopy biophysical/structural variables have been substantially adopted in the remote sensing community. To retrieve global biophysical parameters from multiangle data, the kernel-driven bidirectional reflectance distribution function (BRDF) model has been widely applied to satellite multiangle observations to model (interpolate/extrapolate) the bidirectional reflectance factor (BRF) in an arbitrary direction of viewing and solar geometries. Such modeled BRFs, as an essential information source, are then input into an inversion procedure that is devised through a large number of simulation analyses from some widely used physical models that can generalize such an inversion relationship between the BRFs (or their simple algebraic composite) and the biophysical/structural parameter. Therefore, evaluation of such a link between physical models and kernel-driven models contributes to the development of such inversion procedures to accurately retrieve vegetation properties, particularly based on the operational global BRDF parameters derived from satellite multiangle observations (e.g., MODIS). In this study, the main objective is to investigate the potential for linking a popular physical model (PROSAIL) with the widely used kernel-driven Ross-Li models. To do this, the BRFs and albedo are generated by the physical PROSAIL model run in forward mode, and then the simulated BRFs are input into the kernel-driven BRDF model for retrieval of the BRFs and albedo in the same viewing and solar geometries. To further strengthen such an investigation, a variety of field-measured multiangle reflectances have also been used to investigate the potential for linking these two models. For simulated BRFs generated by the PROSAIL model at 659 and 865 nm, the two models are generally comparable to each other, and the resultant root mean square errors (RMSEs) are 0.0092 and 0.0355, respectively, although some
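
    As a sketch of the kernel-driven side of this link, the Ross-Li model writes the BRF as a linear combination BRF = f_iso + f_vol*K_vol + f_geo*K_geo, so the three coefficients can be fitted by least squares to multiangle BRFs such as the PROSAIL simulations. The RossThick volumetric kernel is compact enough to include below; the geometric (LiSparse-type) kernel values are taken as a given input for brevity.

        import numpy as np

        def ross_thick(theta_s, theta_v, phi):
            # RossThick volumetric scattering kernel (angles in radians)
            cos_xi = (np.cos(theta_s) * np.cos(theta_v)
                      + np.sin(theta_s) * np.sin(theta_v) * np.cos(phi))
            xi = np.arccos(np.clip(cos_xi, -1.0, 1.0))
            return (((np.pi / 2 - xi) * np.cos(xi) + np.sin(xi))
                    / (np.cos(theta_s) + np.cos(theta_v)) - np.pi / 4)

        def fit_ross_li(brf, k_vol, k_geo):
            # Least-squares fit of (f_iso, f_vol, f_geo) to observed/simulated BRFs
            A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
            coeffs, *_ = np.linalg.lstsq(A, brf, rcond=None)
            return coeffs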

  1. [Feasibility of the extended application of near infrared universal quantitative models].

    Science.gov (United States)

    Lei, De-Qing; Hu, Chang-Qin; Feng, Yan-Chun; Feng, Fang

    2010-11-01

    Constructing a successful near infrared analysis model is a complex task. It consumes considerable manpower and material resources, and is constrained by sample collection and model optimization. It is therefore important to study the extended application of existing near infrared (NIR) models. In this paper, a universal quantitative model for cephradine capsules was used as an example to study the feasibility of such extended application. Slope/bias correction and piecewise direct standardization correction methods were used to adapt the universal model to predict intermediates in the manufacturing process of cephradine capsules, such as the content of the powder blend or granules. The results showed that the corrected NIR universal quantitative model can be used for process control, although the results of model correction by slope/bias or piecewise direct standardization were not as good as those of model updating. The results also indicated that the model corrected by slope/bias is better than that corrected by piecewise direct standardization. Model correction thus provides a new application for NIR universal models in process control.
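
    Slope/bias correction itself reduces to a one-line regression: fit a line between the existing model's predictions for the new condition and the corresponding reference values, then apply that line to future predictions. A minimal sketch, with illustrative variable names:

        import numpy as np

        def slope_bias_correction(y_pred_new_condition, y_ref):
            # Fit slope and bias between the master model's predictions on the
            # new samples and their reference values; return a corrector.
            slope, bias = np.polyfit(y_pred_new_condition, y_ref, 1)
            return lambda y: slope * y + bias

        # usage: correct = slope_bias_correction(model.predict(transfer_spectra), ref)
        #        corrected = correct(model.predict(new_spectra))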

  2. A minimal model for stabilization of biomolecules by hydrocarbon cross-linking

    Science.gov (United States)

    Hamacher, K.; Hübsch, A.; McCammon, J. A.

    2006-04-01

    Programmed cell death regulating protein motifs play an essential role in the development of an organism, its immune response, and disease-related cellular mechanisms. Among those motifs the BH3 domain of the BCL-2 family is found to be of crucial importance. Recent experiments showed how the isolated, otherwise unstructured BH3 peptide can be modified by a hydrocarbon linkage to regain function. We parametrized a reduced, dynamic model for the stability effects of such covalent cross-linking and confirmed that the model reproduces the reinforcement of the structural stability of the BH3 motif by cross-linking. We show that an analytically solvable model for thermostability around the native state is not capable of reproducing the stabilization effect. This points to the crucial importance of the peptide dynamics and the fluctuations neglected in the analytic model for the cross-linking system to function properly. This conclusion is supported by a thorough analysis of a simulated Gō model. The resulting model is suitable for the rational design of generic cross-linking systems in silico.

  3. The Chain-Link Fence Model: A Framework for Creating Security Procedures

    OpenAIRE

    Houghton, Robert F.

    2013-01-01

    A long-standing problem in information technology security is how to help reduce the security footprint. Many specific proposals exist to address specific problems in information technology security. Most information technology solutions need to be repeatable throughout the course of an information systems lifecycle. The Chain-Link Fence Model is a new model for creating and implementing information technology procedures. This model was validated by two different methods: the first being int...

  4. Development of quantitative atomic modeling for tungsten transport study Using LHD plasma with tungsten pellet injection

    International Nuclear Information System (INIS)

    Murakami, I.; Sakaue, H.A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2014-10-01

    Quantitative tungsten study with reliable atomic modeling is important for the successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding the tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from currentless plasmas of the Large Helical Device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) lines of W24+ to W33+ ions are very sensitive to electron temperature (Te) and useful for examining the tungsten behavior in edge plasmas. Based on the first quantitative analysis of the measured spatial profile of the W44+ ion, the tungsten concentration is determined to be n(W44+)/ne = 1.4x10^-4 and the total radiation loss is estimated as ~4 MW, roughly half the total NBI power. (author)

  5. Group Active Engagements Using Quantitative Modeling of Physiology Concepts in Large-Enrollment Biology Classes

    Directory of Open Access Journals (Sweden)

    Karen L. Carleton

    2016-12-01

    Full Text Available Organismal Biology is the third introductory biology course taught at the University of Maryland. Students learn about the geometric, physical, chemical, and thermodynamic constraints that are common to all life, and their implications for the evolution of multicellular organisms based on a common genetic “toolbox.” An additional goal is helping students to improve their scientific logic and comfort with quantitative modeling. We recently developed group active engagement exercises (GAEs) for this Organismal Biology class. Currently, our class is built around twelve GAE activities implemented in an auditorium lecture hall in a large enrollment class. The GAEs examine scientific concepts using a variety of models including physical models, qualitative models, and Excel-based quantitative models. Three quantitative GAEs give students an opportunity to build their understanding of key physiological ideas. (1) The Escape from Planet Ranvier exercise reinforces students' understanding that membrane permeability means that ions move through open channels in the membrane. (2) The Stressing and Straining exercise requires students to quantify the elastic modulus from data gathered either in class or from the scientific literature. (3) In the Leveraging Your Options exercise, students learn about lever systems and apply this knowledge to biological systems.

  6. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
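
    A minimal sketch of the sub-model idea using PLS regression from scikit-learn: sub-models are trained on restricted composition ranges, and a full-range model's prediction selects which sub-model applies. For simplicity this sketch switches between sub-models rather than smoothly blending them, and the threshold and component count are assumptions, not ChemCam's settings.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def train_submodels(X, y, split=50.0, n_components=5):
            # X: spectra (rows); y: composition (e.g., wt% of one oxide)
            full = PLSRegression(n_components).fit(X, y)
            low = PLSRegression(n_components).fit(X[y < split], y[y < split])
            high = PLSRegression(n_components).fit(X[y >= split], y[y >= split])
            return full, low, high

        def submodel_predict(models, X, split=50.0):
            full, low, high = models
            guess = full.predict(X).ravel()          # full-range first pass
            return np.where(guess < split, low.predict(X).ravel(),
                            high.predict(X).ravel())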

  8. Discovering link communities in complex networks by an integer programming model and a genetic algorithm.

    Science.gov (United States)

    Li, Zhenping; Zhang, Xiang-Sun; Wang, Rui-Sheng; Liu, Hongwei; Zhang, Shihua

    2013-01-01

    Identification of communities in complex networks is an important topic and issue in many fields such as sociology, biology, and computer science. Communities are often defined as groups of related nodes or links that correspond to functional subunits in the corresponding complex systems. While most conventional approaches have focused on discovering communities of nodes, some recent studies start partitioning links to find overlapping communities straightforwardly. In this paper, we propose a new quantity function for link community identification in complex networks. Based on this quantity function we formulate the link community partition problem into an integer programming model which allows us to partition a complex network into overlapping communities. We further propose a genetic algorithm for link community detection which can partition a network into overlapping communities without knowing the number of communities. We test our model and algorithm on both artificial networks and real-world networks. The results demonstrate that the model and algorithm are efficient in detecting overlapping community structure in complex networks.

  9. Towards the Development of Global Nano-Quantitative Structure–Property Relationship Models: Zeta Potentials of Metal Oxide Nanoparticles

    Directory of Open Access Journals (Sweden)

    Andrey A. Toropov

    2018-04-01

    Full Text Available Zeta potential indirectly reflects the charge on the surface of nanoparticles in solution and can be used to represent the stability of a colloidal solution. As the synthesis, testing and evaluation of new nanomaterials are expensive and time-consuming, it would be helpful to estimate an approximate range of properties for untested nanomaterials using computational modeling. We collected the largest dataset of zeta potential measurements of bare metal oxide nanoparticles in water (87 data points). The dataset was used to develop quantitative structure–property relationship (QSPR) models. Essential features of the nanoparticles were represented using a modified simplified molecular input line entry system (SMILES). SMILES strings reflected the size-dependent behavior of zeta potentials, as the considered quasi-SMILES modification included information about both the chemical composition and the size of the nanoparticles. Three mathematical models were generated using the Monte Carlo method, and their statistical quality was evaluated (R2 for the training set varied from 0.71 to 0.87; for the validation set, from 0.67 to 0.82; root mean square errors for both training and validation sets ranged from 11.3 to 17.2 mV). The developed models were analyzed and linked to aggregation effects in aqueous solutions.

  10. Quantitative chemogenomics: machine-learning models of protein-ligand interaction.

    Science.gov (United States)

    Andersson, Claes R; Gustafsson, Mats G; Strömbergsson, Helena

    2011-01-01

    Chemogenomics is an emerging interdisciplinary field that lies at the interface of biology, chemistry, and informatics. Most of the currently used drugs are small molecules that interact with proteins. Understanding protein-ligand interaction is therefore central to drug discovery and design. In the subfield of chemogenomics known as proteochemometrics, protein-ligand-interaction models are induced from data matrices that consist of both protein and ligand information along with some experimentally measured variable. The two general aims of this quantitative multi-structure-property-relationship modeling (QMSPR) approach are to exploit sparse/incomplete information sources and to obtain more general models covering larger parts of the protein-ligand space than traditional approaches that focus mainly on specific targets or ligands. The data matrices, usually obtained from multiple sparse/incomplete sources, typically contain series of proteins and ligands together with quantitative information about their interactions. A useful model should ideally be easy to interpret and generalize well to new unseen protein-ligand combinations. Resolving this requires sophisticated machine-learning methods for model induction, combined with adequate validation. This review is intended to provide a guide to methods and data sources suitable for this kind of protein-ligand-interaction modeling. An overview of the modeling process is presented, including data collection, protein and ligand descriptor computation, data preprocessing, machine-learning-model induction and validation. Concerns and issues specific to each step in this kind of data-driven modeling will be discussed. © 2011 Bentham Science Publishers

  11. A Model Linked to E. Coli Related to Electrostrictive Energy in Cancer Cell

    Directory of Open Access Journals (Sweden)

    T. K. BASAK

    2010-02-01

    Full Text Available The paper focuses on a new concept regarding the oxidant/antioxidant status in cancer cells following radiation therapy. In this respect, a model has been developed that is linked to an E. coli environment in which TrpRS II is induced after radiation damage. It is interesting to note that electrostrictive energy is the input to the model, the output of which is the oxidant/antioxidant ratio. This ratio is related to the status of electrostrictive energy derived from the capacitance relaxation phenomenon (US Patent No. 5691178, TK Basak, 1997) in cancer cells. The oxidant/antioxidant ratio is linked to electrostrictive energy with increasing pH. The paper also discusses the status of phosphorylation and dephosphorylation after radiation therapy, linked to the E. coli environment against the pH gradient, which is indicative for the treatment of cancer.

  12. Quantitative Analysis of Intra Urban Growth Modeling using socio economic agents by combining cellular automata model with agent based model

    Science.gov (United States)

    Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.

    2017-12-01

    Recent studies indicate that there is significant improvement in urban land use dynamics through modeling at finer spatial resolutions. Geo-computational models such as cellular automata and agent-based models have provided clear evidence for quantifying urban growth patterns within the urban boundary. In recent studies, socio-economic factors such as demography, education rate, household density, parcel price of the current year, and distance to roads, schools, hospitals, commercial centers and police stations are considered the major factors influencing the land use land cover (LULC) pattern of a city. These factors take a unidirectional approach to the land use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with an agent-based model to evaluate the impact of socio-economic factors on the land use pattern. For this purpose, Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from a field survey, the Census of India and the Directorate of Economic Census, Uttarakhand, India. A 3×3 simulation window is used to consider the impact on LULC. The cellular automata model results are examined to identify hot spot areas within the urban area, and the agent-based model uses a logistic regression approach to identify the correlation between each factor and LULC, classifying the available area into low-density, medium-density, high-density residential or commercial areas. In the modeling phase, transition rules, neighborhood effects and cell change factors are used to improve the representation of built-up classes. Significant improvement is observed in the built-up classes, from 84% to 89%. After incorporating the agent-based model with the cellular automata model, the accuracy improved from 89% to 94% for three urban classes, i.e., low density, medium density and commercial.

  13. Summary goodness-of-fit statistics for binary generalized linear models with noncanonical link functions.

    Science.gov (United States)

    Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J

    2016-05-01

    Generalized linear models (GLM) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLM with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of the link function chosen. We generalize the Tsiatis GOF statistic (TG), originally developed for logistic GLMCCs, so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J2) statistics can be applied directly. In a simulation study, TG, HL, and J2 were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J2 were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC. In this case, TG had more power than HL or J2. © 2015 John Wiley & Sons Ltd/London School of Economics.
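
    For reference, a minimal sketch of the Hosmer-Lemeshow statistic with the usual decile-of-risk grouping; under the null hypothesis it is compared against a chi-squared distribution with g - 2 degrees of freedom. This is the textbook construction, not the paper's code.

        import numpy as np
        from scipy.stats import chi2

        def hosmer_lemeshow(y, p, g=10):
            # y: 0/1 outcomes; p: fitted probabilities from any link function
            order = np.argsort(p)
            stat = 0.0
            for idx in np.array_split(order, g):
                n, obs, exp = len(idx), y[idx].sum(), p[idx].sum()
                # group contribution: (O - E)^2 / (n * pi * (1 - pi))
                stat += (obs - exp) ** 2 / (exp * (1.0 - exp / n) + 1e-12)
            return stat, chi2.sf(stat, g - 2)   # statistic and p-value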

  14. [Study on temperature correctional models of quantitative analysis with near infrared spectroscopy].

    Science.gov (United States)

    Zhang, Jun; Chen, Hua-cai; Chen, Xing-dan

    2005-06-01

    The effect of environment temperature on quantitative analysis by near infrared spectroscopy was studied. The temperature correction model was calibrated with 45 wheat samples at different environment temperatures, with temperature as an external variable. The constant temperature model was calibrated with 45 wheat samples at the same temperature. The predicted results of the two models for the protein contents of wheat samples at different temperatures were compared. The results showed that the mean standard error of prediction (SEP) of the temperature correction model was 0.333, while the SEP of the constant temperature (22 degrees C) model increased as the temperature difference grew, up to 0.602 when the model was used at 4 degrees C. This suggests that the temperature correction model improves the analysis precision.

  15. Quantitative explanation of circuit experiments and real traffic using the optimal velocity model

    Science.gov (United States)

    Nakayama, Akihiro; Kikuchi, Macoto; Shibata, Akihiro; Sugiyama, Yuki; Tadaki, Shin-ichi; Yukawa, Satoshi

    2016-04-01

    We have experimentally confirmed that the occurrence of a traffic jam is a dynamical phase transition (Tadaki et al 2013 New J. Phys. 15 103034, Sugiyama et al 2008 New J. Phys. 10 033001). In this study, we investigate whether the optimal velocity (OV) model can quantitatively explain the results of experiments. The occurrence and non-occurrence of jammed flow in our experiments agree with the predictions of the OV model. We also propose a scaling rule for the parameters of the model. Using this rule, we obtain critical density as a function of a single parameter. The obtained critical density is consistent with the observed values for highway traffic.
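
    The OV model referred to above is compact enough to state: each car relaxes toward an optimal velocity set by its headway, dv_n/dt = a[V(Δx_n) - v_n]. A minimal sketch on a circular road with the standard tanh-shaped OV function; the parameter values are illustrative, not the calibrated ones from the paper's scaling rule.

        import numpy as np

        def simulate_ov(n_cars=30, length=230.0, a=1.0, steps=20000, dt=0.01):
            x = np.linspace(0.0, length, n_cars, endpoint=False)
            x += 0.1 * np.random.randn(n_cars)            # small perturbation
            v = np.zeros(n_cars)
            V = lambda h: np.tanh(h - 2.0) + np.tanh(2.0)  # Bando-type OV function
            for _ in range(steps):
                headway = (np.roll(x, -1) - x) % length    # distance to car ahead
                v += a * (V(headway) - v) * dt             # dv/dt = a(V(h) - v)
                x = (x + v * dt) % length
            return x, v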

  16. Study on quantitative reliability analysis by multilevel flow models for nuclear power plants

    International Nuclear Information System (INIS)

    Yang Ming; Zhang Zhijian

    2011-01-01

    Multilevel Flow Models (MFM) is a goal-oriented system modeling method. MFM explicitly describes how a system performs the required functions under stated conditions for a stated period of time. This paper presents a novel system reliability analysis method based on MFM (MRA). The proposed method allows the system knowledge to be described at different levels of abstraction, which makes the reliability model easy to understand, establish, modify and extend. The success probabilities of all main goals and sub-goals can be obtained from a single quantitative analysis. The proposed method is suitable for system analysis and scheme comparison for complex industrial systems such as nuclear power plants. (authors)

  17. A quantitative analysis of instabilities in the linear chiral sigma model

    International Nuclear Information System (INIS)

    Nemes, M.C.; Nielsen, M.; Oliveira, M.M. de; Providencia, J. da

    1990-08-01

    We present a method to construct a complete set of stationary states corresponding to small amplitude motion which naturally includes the continuum solution. The energy-weighted sum rule (EWSR) is shown to provide a quantitative criterion on the importance of instabilities which are known to occur in nonasymptotically free theories. Our results for the linear σ model should be valid for a large class of models. A unified description of baryon and meson properties in terms of the linear σ model is also given. (author)

  18. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    operation will be changed by various parameters of DERs. This article proposed a modelling framework for an overview analysis on the correlation between DERs. Furthermore, to validate the framework, the authors described the reference models of different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation

  19. Polymorphic ethyl alcohol as a model system for the quantitative study of glassy behaviour

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H.E.; Schober, H.; Gonzalez, M.A. [Institut Max von Laue - Paul Langevin (ILL), 38 - Grenoble (France); Bermejo, F.J.; Fayos, R.; Dawidowski, J. [Consejo Superior de Investigaciones Cientificas, Madrid (Spain); Ramos, M.A.; Vieira, S. [Universidad Autonoma de Madrid (Spain)

    1997-04-01

    The nearly universal transport and dynamical properties of amorphous materials or glasses are investigated. Reasonably successful phenomenological models have been developed to account for these properties as well as the behaviour near the glass transition, but quantitative microscopic models have had limited success. One hindrance to these investigations has been the lack of a material which exhibits glass-like properties in more than one phase at a given temperature. This report presents results of neutron-scattering experiments for one such material, ordinary ethyl alcohol, which promises to be a model system for future investigations of glassy behaviour. (author). 8 refs.

  20. A hierarchically adaptable spatial regression model to link aggregated health data and environmental data

    NARCIS (Netherlands)

    Truong Ngoc Phuong, Phuong; Stein, A.

    2017-01-01

    Health data and environmental data are commonly collected at different levels of aggregation. A persistent challenge in using a spatial regression model to link these data is that their associations can vary as a function of aggregation. This results in an ecological fallacy if the association at one

  1. Linking HR strategy, e-HR goals, architectures, and outcomes: a model and case study evidence.

    NARCIS (Netherlands)

    Reddington, Martin; Martin, Graeme; Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Looise, Jan C.

    2011-01-01

    Building on our earlier model of the links between HR strategy, e-HR goals, architectures, and outcomes, we illustrate the relationship between some of these elements with data from three global organizations. In doing so, we aim to help academics and practitioners understand this increasingly

  2. Link Budget Analysis and Modeling of Short-Range UWB Channels

    NARCIS (Netherlands)

    Irahhauten, Z.; Dacuna, J.; Janssen, G.J.M.; Nikookar, H.; Yarovoy, A.G.; Ligthart, L.P.

    2008-01-01

    Ultrawideband (UWB) technology is an attractive alternative for short-range applications, e.g., wireless personal area networks. In these applications, transmit and receive antennas are very close to each other and the far-field condition assumed in most of the link budget models may not be

  3. Evaluation of mobile ad hoc network reliability using propagation-based link reliability model

    International Nuclear Information System (INIS)

    Padmavathy, N.; Chaturvedi, Sanjay K.

    2013-01-01

    A wireless mobile ad hoc network (MANET) is a collection of fully autonomous nodes (which can move randomly around the area of deployment), making the topology highly dynamic; nodes communicate with each other by forming a single-hop/multi-hop network and maintain connectivity in a decentralized manner. MANET is modelled using geometric random graphs rather than random graphs because link existence in MANET is a function of the geometric distance between nodes and the transmission range of the nodes. Among the many factors that contribute to MANET reliability, the reliability of these networks also depends on the robustness of the links between the mobile nodes of the network. Recently, the reliability of such networks has been evaluated for imperfect nodes (transceivers) with a binary model of communication links based on the transmission range of the mobile nodes and the distance between them. However, in reality, the probability of successful communication decreases as the signal strength deteriorates due to noise, fading or interference effects, even within the nodes' transmission range. Hence, in this paper, the use of a propagation-based link reliability model, rather than a binary model with nodes following a known failure distribution, is proposed to evaluate the network reliability (2TRm, ATRm and AoTRm) of MANET through Monte Carlo simulation. The method is illustrated with an application and some important results are also presented
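
    A Monte Carlo sketch of two-terminal reliability (2TRm) with a propagation-based link model: the probability that two nodes share a working link decays continuously with distance instead of switching at the transmission range. The exponential decay and all parameter values below are assumed stand-ins for the paper's propagation model.

        import numpy as np
        import networkx as nx

        def two_terminal_reliability(n=30, area=100.0, alpha=0.05, trials=1000):
            success = 0
            for _ in range(trials):
                pos = np.random.rand(n, 2) * area      # random node placement
                g = nx.Graph()
                g.add_nodes_from(range(n))
                for i in range(n):
                    for j in range(i + 1, n):
                        d = np.linalg.norm(pos[i] - pos[j])
                        # link exists with a distance-dependent probability
                        if np.random.rand() < np.exp(-alpha * d):
                            g.add_edge(i, j)
                success += nx.has_path(g, 0, n - 1)    # source/terminal pair
            return success / trials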

  4. Translational PKPD modeling in schizophrenia: linking receptor occupancy of antipsychotics to efficacy and safety

    NARCIS (Netherlands)

    Pilla Reddy, Venkatesh; Kozielska, Magdalena; Johnson, Martin; Vermeulen, An; Liu, Jing; de Greef, Rik; Groothuis, Genoveva; Danhof, Meindert; Proost, Johannes

    2012-01-01

    Objectives: To link the brain dopamine D2 receptor occupancy (D2RO) of antipsychotic drugs with clinical endpoints of efficacy and safety to assess the therapeutic window of D2RO. Methods: Pharmacokinetic-Pharmacodynamic (PK-PD) models were developed to predict the D2 receptor occupancy of

  5. The Chain-Link Fence Model: A Framework for Creating Security Procedures

    Science.gov (United States)

    Houghton, Robert F.

    2013-01-01

    A long-standing problem in information technology security is how to help reduce the security footprint. Many specific proposals exist to address specific problems in information technology security. Most information technology solutions need to be repeatable throughout the course of an information systems lifecycle. The Chain-Link Fence Model is…

  6. Incorporation of caffeine into a quantitative model of fatigue and sleep.

    Science.gov (United States)

    Puckeridge, M; Fulcher, B D; Phillips, A J K; Robinson, P A

    2011-03-21

    A recent physiologically based model of human sleep is extended to incorporate the effects of caffeine on sleep-wake timing and fatigue. The model includes the sleep-active neurons of the hypothalamic ventrolateral preoptic area (VLPO), the wake-active monoaminergic brainstem populations (MA), their interactions with cholinergic/orexinergic (ACh/Orx) input to MA, and circadian and homeostatic drives. We model two effects of caffeine on the brain due to competitive antagonism of adenosine (Ad): (i) a reduction in the homeostatic drive and (ii) an increase in cholinergic activity. By comparing the model output to experimental data, constraints are determined on the parameters that describe the action of caffeine on the brain. In accord with experiment, the ranges of these parameters imply significant variability in caffeine sensitivity between individuals, with caffeine's effectiveness in reducing fatigue being highly dependent on an individual's tolerance, and past caffeine and sleep history. Although there are wide individual differences in caffeine sensitivity and thus in parameter values, once the model is calibrated for an individual it can be used to make quantitative predictions for that individual. A number of applications of the model are examined, using exemplar parameter values, including: (i) quantitative estimation of the sleep loss and the delay to sleep onset after taking caffeine for various doses and times; (ii) an analysis of the system's stable states showing that the wake state during sleep deprivation is stabilized after taking caffeine; and (iii) comparing model output successfully to experimental values of subjective fatigue reported in a total sleep deprivation study examining the reduction of fatigue with caffeine. This model provides a framework for quantitatively assessing optimal strategies for using caffeine, on an individual basis, to maintain performance during sleep deprivation. Copyright © 2011 Elsevier Ltd. All rights reserved.
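
    A toy sketch of the caffeine mechanism described above: a homeostatic drive that builds during wake, and a caffeine concentration with roughly a five-hour half-life that scales the effective drive down via adenosine antagonism. The functional forms and constants are illustrative assumptions, far simpler than the paper's physiologically based model.

        import numpy as np

        dt = 0.1                                     # hours
        t = np.arange(0.0, 24.0, dt)
        H = 1.0 - np.exp(-t / 18.0)                  # homeostatic build-up (wake)
        dose_time, half_life, k = 8.0, 5.0, 0.3      # assumed dose and potency
        C = np.where(t >= dose_time,
                     np.exp(-(t - dose_time) * np.log(2) / half_life), 0.0)
        H_eff = H * (1.0 - k * C)                    # caffeine-suppressed drive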

  7. From classical genetics to quantitative genetics to systems biology: modeling epistasis.

    Directory of Open Access Journals (Sweden)

    David L Aylor

    2008-03-01

    Full Text Available Gene expression data has been used in lieu of phenotype in both classical and quantitative genetic settings. These two disciplines have separate approaches to measuring and interpreting epistasis, which is the interaction between alleles at different loci. We propose a framework for estimating and interpreting epistasis from a classical experiment that combines the strengths of each approach. A regression analysis step accommodates the quantitative nature of expression measurements by estimating the effect of gene deletions plus any interaction. Effects are selected by significance such that a reduced model describes each expression trait. We show how the resulting models correspond to specific hierarchical relationships between two regulator genes and a target gene. These relationships are the basic units of genetic pathways and genomic system diagrams. Our approach can be extended to analyze data from a variety of experiments, multiple loci, and multiple environments.
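
    A minimal sketch of the regression step on hypothetical data: model an expression trait as the effects of two gene deletions plus their interaction, whose coefficient estimates the epistatic effect.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        g1 = np.repeat([0, 0, 1, 1], 6)              # 0/1 deletion indicators
        g2 = np.repeat([0, 1, 0, 1], 6)
        y = 1.0 * g1 + 0.5 * g2 - 0.8 * g1 * g2 + rng.normal(0, 0.1, 24)
        df = pd.DataFrame({"g1": g1, "g2": g2, "y": y})
        fit = smf.ols("y ~ g1 + g2 + g1:g2", data=df).fit()
        print(fit.params)   # 'g1:g2' estimates the epistatic (interaction) effect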

  8. Refining the statistical model for quantitative immunostaining of surface-functionalized nanoparticles by AFM.

    Science.gov (United States)

    MacCuspie, Robert I; Gorka, Danielle E

    2013-10-01

    Recently, an atomic force microscopy (AFM)-based approach for quantifying the number of biological molecules conjugated to a nanoparticle surface at low number densities was reported. The number of target molecules conjugated to the analyte nanoparticle can be determined with single nanoparticle fidelity using antibody-mediated self-assembly to decorate the analyte nanoparticles with probe nanoparticles (i.e., quantitative immunostaining). This work refines the statistical models used to quantitatively interpret the observations when AFM is used to image the resulting structures. The refinements add terms to the previous statistical models to account for the physical sizes of the analyte nanoparticles, conjugated molecules, antibodies, and probe nanoparticles. Thus, a more physically realistic statistical computation can be implemented for a given sample of known qualitative composition, using the software scripts provided. Example AFM data sets, using horseradish peroxidase conjugated to gold nanoparticles, are presented to illustrate how to implement this method successfully.

  9. Accounting for genetic interactions improves modeling of individual quantitative trait phenotypes in yeast.

    Science.gov (United States)

    Forsberg, Simon K G; Bloom, Joshua S; Sadhu, Meru J; Kruglyak, Leonid; Carlborg, Örjan

    2017-04-01

    Experiments in model organisms report abundant genetic interactions underlying biologically important traits, whereas quantitative genetics theory predicts, and data support, the notion that most genetic variance in populations is additive. Here we describe networks of capacitating genetic interactions that contribute to quantitative trait variation in a large yeast intercross population. The additive variance explained by individual loci in a network is highly dependent on the allele frequencies of the interacting loci. Modeling of phenotypes for multilocus genotype classes in the epistatic networks is often improved by accounting for the interactions. We discuss the implications of these results for attempts to dissect genetic architectures and to predict individual phenotypes and long-term responses to selection.

  10. Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.

    Science.gov (United States)

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This peculiarity is related to the inhibition of extracellular enzymes that occurs through the complexation of peptides by tannins. Neither the nature of these interactions nor, more fundamentally, the structure of these heterogeneous polyphenolic molecules is completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins, on the basis of tannin model compounds, employing in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. The ³¹P NMR analysis of ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundation for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.

  11. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    Science.gov (United States)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

    Residues of harmful dyes such as Auramine O (AO) in herbal and food products threaten people's health, so fast and sensitive techniques to detect these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used for the quantitative determination of AO in this paper, combined with an improved partial least-squares regression (PLSR) model. The absorbance of herbal samples with different concentrations was obtained by THz-TDS in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between different concentrations, provided a clear criterion for input-interval selection, and improved the accuracy of the detection result. The experimental results indicated that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
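
    For reference, the synchronous 2D correlation spectrum in Noda's formalism is just the covariance of the mean-centered, concentration-perturbed spectra; strong peaks can then guide which spectral intervals feed the PLSR model. A minimal sketch:

        import numpy as np

        def synchronous_2dcos(X):
            # X: m spectra (rows) x n frequency points (columns), one spectrum
            # per concentration level
            X_dyn = X - X.mean(axis=0)               # dynamic (mean-centered) spectra
            return X_dyn.T @ X_dyn / (X.shape[0] - 1)  # synchronous spectrum Phi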

  12. Reaction-Diffusion Degradation Model for Delayed Erosion of Cross-Linked Polyanhydride Biomaterials

    OpenAIRE

    Domanskyi, Sergii; Poetz, Katie L.; Shipp, Devon A.; Privman, Vladimir

    2015-01-01

    We develop a theoretical model to explain the long induction interval of water intake that precedes the onset of erosion due to degradation caused by hydrolysis in the recently synthesized and studied cross-linked polyanhydrides. Various kinetic mechanisms are incorporated in the model in an attempt to explain the experimental data for the mass loss profile. Our key finding is that the observed long induction interval is attributable to the nonlinear dependence of the degradation rate constan...

  13. [Application of DOSC combined with SBC in batches transfer of NIR quantitative model].

    Science.gov (United States)

    Jia, Yi-Fei; Zhang, Ying-Ying; Xu, Bing; Wang, An-Dong; Zhan, Xue-Yan

    2017-06-01

    A near infrared model established under one set of conditions can be applied to new sample, environmental or instrument conditions through model transfer. Spectral background correction and model updating are the two types of data processing methods for NIR quantitative model transfer. Orthogonal signal regression (OSR) is a background-correction method in which virtual standard spectra are used to fit a linear relation between master-batch and slave-batch spectra, mapping the slave-batch spectra onto the master-batch spectra to realize the transfer of the NIR quantitative model. However, this method requires the virtual standard spectra to be representative; otherwise, large errors occur in the regression. Therefore, a direct orthogonal signal correction-slope and bias correction (DOSC-SBC) method was proposed in this paper to solve the failure of PLS models to accurately predict the content of target components in different batches of a formula, and to analyze the relationship between the spectral background differences of samples from different sources and the prediction error of PLS models. The DOSC method was used to eliminate spectral background differences unrelated to the target value; combined with the SBC method, the systematic errors between different batches of samples were corrected so that the NIR quantitative model could be transferred between batches. After the DOSC-SBC method was applied to the water-extraction and ethanol-precipitation processes of Lonicerae Japonicae Flos, the prediction error for new batches of samples decreased from 32.3% to 7.30% and from 237% to 4.34%, significantly improving prediction accuracy, so that the target component in new batches can be quickly quantified. The DOSC-SBC method has realized the transfer of NIR quantitative models between different batches, and this method does

  14. Variable selection in near infrared spectroscopy for quantitative models of homologous analogs of cephalosporins

    Directory of Open Access Journals (Sweden)

    Yan-Chun Feng

    2014-07-01

    Full Text Available Two universal spectral ranges (4550–4100 cm-1 and 6190–5510 cm-1) for the construction of quantitative models of homologous analogs of cephalosporins were proposed by evaluating the performance of five spectral ranges and their combinations, using three data sets of cephalosporins for injection, i.e., cefuroxime sodium, ceftriaxone sodium and cefoperazone sodium. Subsequently, the proposed ranges were validated using eight calibration sets of other homologous analogs of cephalosporins for injection, namely cefmenoxime hydrochloride, ceftezole sodium, cefmetazole, cefoxitin sodium, cefotaxime sodium, cefradine, cephazolin sodium and ceftizoxime sodium. All the quantitative models constructed for the eight kinds of cephalosporins using these universal ranges could fulfill the requirements for quick quantification. After that, the competitive adaptive reweighted sampling (CARS) algorithm and infrared (IR)-near infrared (NIR) two-dimensional (2D) correlation spectral analysis were used to determine the scientific basis of these two spectral ranges as the universal regions for the construction of quantitative models of cephalosporins. The CARS algorithm demonstrated that the ranges of 4550–4100 cm-1 and 6190–5510 cm-1 included key wavenumbers that could be attributed to content changes of the cephalosporins. The IR-NIR 2D spectral analysis showed that certain wavenumbers in these two regions have strong correlations to the structures of those cephalosporins that are easy to degrade.

  15. Qualitative and quantitative guidelines for the comparison of environmental model predictions

    International Nuclear Information System (INIS)

    Scott, M.

    1995-03-01

    The question of how to assess or compare predictions from a number of models is one of concern in the validation of models, in understanding the effects of different models and model parameterizations on model output, and ultimately in assessing model reliability. Comparison of model predictions with observed data is the basic tool of model validation while comparison of predictions amongst different models provides one measure of model credibility. The guidance provided here is intended to provide qualitative and quantitative approaches (including graphical and statistical techniques) to such comparisons for use within the BIOMOVS II project. It is hoped that others may find it useful. It contains little technical information on the actual methods but several references are provided for the interested reader. The guidelines are illustrated on data from the VAMP CB scenario. Unfortunately, these data do not permit all of the possible approaches to be demonstrated since predicted uncertainties were not provided. The questions considered are concerned with a) intercomparison of model predictions and b) comparison of model predictions with the observed data. A series of examples illustrating some of the different types of data structure and some possible analyses have been constructed. A bibliography of references on model validation is provided. It is important to note that the results of the various techniques discussed here, whether qualitative or quantitative, should not be considered in isolation. Overall model performance must also include an evaluation of model structure and formulation, i.e. conceptual model uncertainties, and results for performance measures must be interpreted in this context. Consider a number of models which are used to provide predictions of a number of quantities at a number of time points. In the case of the VAMP CB scenario, the results include predictions of total deposition of Cs-137 and time dependent concentrations in various
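
    As a small illustration of the quantitative side of such guidelines, the sketch below computes a few standard measures for comparing one model's predictions with observations (mean bias, RMSE, correlation); the array names are placeholders, not part of the BIOMOVS II toolset.

        import numpy as np

        def comparison_stats(pred, obs):
            # Basic prediction-versus-observation measures.
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            resid = pred - obs
            return {
                "mean_bias": resid.mean(),
                "rmse": np.sqrt(np.mean(resid ** 2)),
                "correlation": np.corrcoef(pred, obs)[0, 1],
            }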

  16. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    Science.gov (United States)

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  17. A prosthesis-specific multi-link segment model of lower-limb amputee sprinting.

    Science.gov (United States)

    Rigney, Stacey M; Simmons, Anne; Kark, Lauren

    2016-10-03

    Lower-limb amputees commonly utilize non-articulating energy storage and return (ESAR) prostheses for high-impact activities such as sprinting. Despite these prostheses lacking an articulating ankle joint, amputee gait analysis conventionally features a two-link segment model of the prosthetic foot. This paper investigated the effects of the selected link segment model's marker-set and geometry on a unilateral amputee sprinter's calculated lower-limb kinematics, kinetics and energetics. A total of five lower-limb models of the Ottobock® 1E90 Sprinter were developed, including two conventional shank-foot models that each used a different version of the Plug-in-Gait (PiG) marker-set to test the effect of prosthesis ankle marker location. Two Hybrid prosthesis-specific models were then developed, also using the PiG marker-sets, with the anatomical shank and foot replaced by prosthesis-specific geometry separated into two segments. Finally, a Multi-link segment (MLS) model was developed, consisting of six segments for the prosthesis as defined by a custom marker-set. All full-body musculoskeletal models were tested using four trials of experimental marker trajectories within OpenSim 3.2 (Stanford, California, USA) to find the affected and unaffected hip, knee and ankle kinematics, kinetics and energetics. The geometry of the selected lower-limb prosthesis model was found to significantly affect all variables on the affected leg (p < 0.05). Omitting prosthesis-specific spatial, inertial and elastic properties from full-body models significantly affects the calculated amputee gait characteristics, and we therefore recommend the implementation of a MLS model. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Fixing the cracks in the crystal ball: A maturity model for quantitative risk assessment

    International Nuclear Information System (INIS)

    Rae, Andrew; Alexander, Rob; McDermid, John

    2014-01-01

    Quantitative risk assessment (QRA) is widely practiced in system safety, but there is insufficient evidence that QRA in general is fit for purpose. Defenders of QRA draw a distinction between poor or misused QRA and correct, appropriately used QRA, but this distinction is only useful if we have robust ways to identify the flaws in an individual QRA. In this paper we present a comprehensive maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature and in a collection of risk assessment peer reviews. We provide initial validation of the completeness and realism of the model. Our risk assessment maturity model provides a way to prioritise both process development within an organisation and empirical research within the QRA community. - Highlights: • Quantitative risk assessment (QRA) is widely practiced, but there is insufficient evidence that it is fit for purpose. • A given QRA may be good, or it may not – we need systematic ways to distinguish this. • We have created a maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature. • We have provided initial validation of the completeness and realism of the model. • The maturity model can also be used to prioritise QRA research discipline-wide

  19. Instantaneous thermal modeling of the DC-link capacitor in PhotoVoltaic systems

    DEFF Research Database (Denmark)

    Yang, Yongheng; Ma, Ke; Wang, Huai

    2015-01-01

    Instantaneous thermal modeling approaches considering mission profiles for the DC-link capacitor in single-phase PV systems are explored in this paper. These thermal modelling approaches are based on: a) fast Fourier transform, b) look-up tables, and c) ripple current reconstruction. Moreover, the thermal models estimate the instantaneous thermal loading from the operating conditions. As a consequence, the approach offers new insights into the temperature monitoring and reliability-oriented design of DC-link capacitors, so that a more reliable operation of single-phase grid-connected PV systems can be achieved. Study results on a 3-kW single-phase grid-connected PV system have been adopted to demonstrate a look-up table based modelling approach, where real-field daily ambient conditions are considered.
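
    A look-up-table thermal model of the kind demonstrated in the paper can be reduced to mapping operating conditions to capacitor power loss and then to a hot-spot temperature rise. The sketch below is a hypothetical minimal version; the thermal resistance and the loss table are invented placeholders, not values from the study.

        import numpy as np

        R_TH = 3.0  # K/W, assumed capacitor thermal resistance

        # Assumed look-up table: PV output power (W) -> capacitor power loss (W)
        POWER_GRID = np.array([0.0, 1000.0, 2000.0, 3000.0])
        LOSS_TABLE = np.array([0.0, 0.6, 1.4, 2.5])

        def hotspot_temperature(p_pv, t_ambient):
            # Interpolate the loss for the operating point, then add the rise.
            p_loss = np.interp(p_pv, POWER_GRID, LOSS_TABLE)
            return t_ambient + R_TH * p_loss

        # For a mission profile, pass arrays of PV power and ambient temperature.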

  20. Dynamic Modelling and Trajectory Tracking of Parallel Manipulator with Flexible Link

    Directory of Open Access Journals (Sweden)

    Chen Zhengsheng

    2013-09-01

    Full Text Available This paper mainly focuses on dynamic modelling and real-time control for a parallel manipulator with flexible link. The Lagrange principle and assumed modes method (AMM) substructure technique is presented to formulate the dynamic modelling of a two-degrees-of-freedom (DOF) parallel manipulator with flexible links. Then, the singular perturbation technique (SPT) is used to decompose the nonlinear dynamic system into slow time-scale and fast time-scale subsystems. Furthermore, the SPT is employed to transform the differential algebraic equations (DAEs) for kinematic constraints into explicit ordinary differential equations (ODEs), which makes real-time control possible. In addition, a novel composite control scheme is presented; the computed torque control is applied for the slow subsystem and the H∞ technique for the fast subsystem, taking account of the model uncertainty and outside disturbance. The simulation results show the composite control can effectively achieve fast and accurate tracking control.

  1. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    Science.gov (United States)

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 (p < 0.05). Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter Ktrans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
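
    The quantitative parameters named above (Ktrans, Ve, and Kep = Ktrans/Ve) are usually estimated by fitting a pharmacokinetic model such as the standard Tofts model to the tissue concentration curve. The sketch below computes that model's forward prediction; it is a generic illustration, not the acquisition or fitting pipeline used in this study.

        import numpy as np

        def tofts_concentration(t, cp, ktrans, ve):
            """Standard Tofts model: Ct(t) = Ktrans * int_0^t Cp(u) exp(-kep (t-u)) du."""
            kep = ktrans / ve
            ct = np.zeros_like(t, dtype=float)
            for i, ti in enumerate(t):
                u = t[: i + 1]
                ct[i] = ktrans * np.trapz(cp[: i + 1] * np.exp(-kep * (ti - u)), u)
            return ct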

  2. CSML2SBML: a novel tool for converting quantitative biological pathway models from CSML into SBML.

    Science.gov (United States)

    Li, Chen; Nagasaki, Masao; Ikeda, Emi; Sekiya, Yayoi; Miyano, Satoru

    2014-07-01

    CSML and SBML are XML-based model definition standards which are developed with the aim of creating exchange formats for modeling, visualizing and simulating biological pathways. In this article we report a release of a format convertor for quantitative pathway models, namely CSML2SBML. It translates models encoded by CSML into SBML without loss of structural and kinetic information. The simulation and parameter estimation of the resulting SBML model can be carried out with the compliant tool CellDesigner for further analysis. The convertor is based on the standards CSML version 3.0 and SBML Level 2 Version 4. In our experiments, 11 out of 15 pathway models in the CSML model repository and 228 models in the Macrophage Pathway Knowledgebase (MACPAK) are successfully converted to SBML models. The consistency of the resulting models is validated by the libSBML Consistency Check of CellDesigner. Furthermore, a converted SBML model assigned with the kinetic parameters translated from its CSML model can reproduce the same dynamics with CellDesigner as the CSML model running on Cell Illustrator. CSML2SBML, along with instructions and examples for use, is available at http://csml2sbml.csml.org. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
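
    At its core, statistical model checking estimates the probability that a random run of the system satisfies a property, using enough simulations to meet a precision target. The sketch below shows that core loop with a Chernoff-Hoeffding sample bound; the simulator callback is hypothetical and this is not PVeStA's actual code.

        import math
        import random

        def estimate_probability(run_satisfies, eps=0.01, delta=0.05):
            # Chernoff-Hoeffding bound: n >= ln(2/delta) / (2 * eps^2) runs
            # give an estimate within eps with probability at least 1 - delta.
            n = math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))
            hits = sum(1 for _ in range(n) if run_satisfies())
            return hits / n

        # Example with a dummy Bernoulli "system":
        # p = estimate_probability(lambda: random.random() < 0.3)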

  4. A proposed model of psychodynamic psychotherapy linked to Erik Erikson's eight stages of psychosocial development.

    Science.gov (United States)

    Knight, Zelda Gillian

    2017-09-01

    Just as Freud used stages of psychosexual development to ground his model of psychoanalysis, it is possible to do the same with Erik Erikson's stages of development with regards to a model of psychodynamic psychotherapy. This paper proposes an eight-stage model of psychodynamic psychotherapy linked to Erik Erikson's eight stages of psychosocial development. Various suggestions are offered. One such suggestion is that as each of Erikson's developmental stages is triggered by a crisis, in therapy it is triggered by the client's search. The resolution of the search often leads to the development of another search, which implies that the therapy process comprises a series of searches. This idea of a series of searches and resolutions leads to the understanding that identity is developmental and therapy is a space in which a new sense of identity may emerge. The notion of hope is linked to Erikson's stage of Basic Trust and the proposed model of therapy views hope and trust as essential for the therapy process. Two clinical vignettes are offered to illustrate these ideas. Psychotherapy can be approached as an eight-stage process and linked to Erikson's eight stages model of development. Psychotherapy may be viewed as a series of searches and thus as a developmental stage resolution process, which leads to the understanding that identity is ongoing throughout the life span. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Curating and Preparing High-Throughput Screening Data for Quantitative Structure-Activity Relationship Modeling.

    Science.gov (United States)

    Kim, Marlene T; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2016-01-01

    Publicly available bioassay data often contains errors. Curating massive bioassay data, especially high-throughput screening (HTS) data, for Quantitative Structure-Activity Relationship (QSAR) modeling requires the assistance of automated data curation tools. Using automated data curation tools is beneficial to users, especially ones without prior computer skills, because many platforms have been developed and optimized based on standardized requirements. As a result, the users do not need to extensively configure the curation tool prior to the application procedure. In this chapter, a freely available automatic tool to curate and prepare HTS data for QSAR modeling purposes will be described.

  6. Critical body residues linked to octanol-water partitioning, organism composition, and LC50 QSARs: Meta-analysis and model

    NARCIS (Netherlands)

    Hendriks, A.J.; Traas, T.P.; Huijbregts, M.A.J.

    2005-01-01

    To protect thousands of species from thousands of chemicals released in the environment, various risk assessment tools have been developed. Here, we link quantitative structure-activity relationships (QSARs) for response concentrations in water (LC50) to critical concentrations in organisms (C-50)

  7. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    Science.gov (United States)

    Valcárcel, M; Lucena, R

    2014-01-01

    Awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation and assessment are nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R+D+I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. The model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data, and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to Environmental Research Centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.

  8. A Quantitative Risk Evaluation Model for Network Security Based on Body Temperature

    Directory of Open Access Journals (Sweden)

    Y. P. Jiang

    2016-01-01

    Full Text Available Traditional network security risk evaluation models have certain limitations in real-time performance, accuracy, and characterization. This paper proposes a quantitative risk evaluation model for network security based on body temperature (QREM-BT), which draws on the mechanism of the biological immune system, where an imbalance of the immune system results in changes of body temperature. First, the r-contiguous-bits nonconstant matching rate algorithm is used to improve the detection quality of the detector and to reduce the missing rate and false detection rate. The dynamic evolution process of the detector is then described in detail. The mechanism of increased antibody concentration, driven by activating mature detectors and cloning memory detectors, is mainly used to assess the network risk caused by various species of attacks. On this basis, the paper establishes not only the equation of the antibody concentration increase factor but also a quantitative calculation model for antibody concentration. Finally, because the mechanism of antibody concentration change is reasonable and effective in reflecting network risk, a body temperature evaluation model is established. The simulation results show that, according to the body temperature value, the proposed model can assess network security risk more effectively and in real time.

  9. Quantitative structure-activity relationship (QSAR) for insecticides: development of predictive in vivo insecticide activity models.

    Science.gov (United States)

    Naik, P K; Singh, T; Singh, H

    2009-07-01

    Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors, including topological, spatial, thermodynamic, information content, lead likeness and E-state indices, were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing value, zero value, simple correlation and multi-collinearity tests, as well as the use of a genetic algorithm, allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both the organophosphate and carbamate groups revealed good predictability, with r(2) values of 0.949 and 0.838 as well as cross-validated q(2) values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data, with r(2) of 0.871 and 0.788 for the organophosphate and carbamate groups respectively, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully against external validation criteria. QSAR models developed in this study should help the further design of novel potent insecticides.
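
    The end product of such a QSAR workflow is a regression over selected descriptors, judged by r2 on training data and on an external test set. The sketch below shows that final step only, with placeholder arrays; descriptor computation and genetic-algorithm selection are outside its scope.

        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import r2_score

        def fit_and_score_qsar(X_train, y_train, X_test, y_test):
            # Fit a multiple linear regression on the selected descriptors.
            model = LinearRegression().fit(X_train, y_train)
            r2_train = r2_score(y_train, model.predict(X_train))
            r2_test = r2_score(y_test, model.predict(X_test))
            return model, r2_train, r2_test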

  10. A Structured Review of Quantitative Models of the Pharmaceutical Supply Chain

    Directory of Open Access Journals (Sweden)

    Carlos Franco

    2017-01-01

    Full Text Available The aim of this review is to identify and provide a structured overview of quantitative models of the pharmaceutical supply chain, a subject not exhaustively studied in previous reviews on healthcare logistics, which relate mostly to quantitative models in healthcare or logistics studies in hospitals. The models are classified into three categories: network design, inventory models, and optimization of the pharmaceutical supply chain. A taxonomy for each category is shown, describing the principal features of each echelon included in the review; this taxonomy allows readers to easily identify a paper based on the actors of the pharmaceutical supply chain. The search process included research articles published between 1984 and November 2016; in total, 46 studies were included. Across the three fields, the most common source of uncertainty is demand, used in 56% of the cases. We conclude that most articles in the literature focus on optimization of the pharmaceutical supply chain and on inventory models, while supply chain network design remains less deeply studied.

  11. Quantitative immunohistochemical method for detection of wheat protein in model sausage

    Directory of Open Access Journals (Sweden)

    Zuzana Řezáčová Lukášková

    2014-01-01

    Full Text Available Since gluten can induce coeliac symptoms in hypersensitive consumers with coeliac disease, it is necessary to label foodstuffs containing it. In order to label foodstuffs, it is essential to find reliable methods to accurately determine the amount of wheat protein in food. The objective of this study was to compare the quantitative detection of wheat protein in model sausages by ELISA and immunohistochemical methods. Immunohistochemistry was combined with stereology to achieve quantitative results. High correlation between the addition of wheat protein and both compared methods was confirmed. For the ELISA method the determined values were r = 0.98, P < 0.01. Although ELISA is an accredited method, it was not reliable, unlike the immunohistochemical methods (stereology, SD = 3.1).

  12. Modeling techniques used in the communications link analysis and simulation system (CLASS)

    Science.gov (United States)

    Braun, W. R.; Mckenzie, T. M.

    1985-01-01

    CLASS (Communications Link Analysis and Simulation System) is a software package developed for NASA to predict the communication and tracking performance of the Tracking and Data Relay Satellite System (TDRSS) services. The modeling techniques used in CLASS are described. The components of TDRSS and the performance parameters to be computed by CLASS are too diverse to permit the use of a single technique to evaluate all performance measures. Hence, each CLASS module applies the modeling approach best suited for a particular subsystem and/or performance parameter in terms of model accuracy and computational speed.

  13. Towards Controlling the Glycoform: A Model Framework Linking Extracellular Metabolites to Antibody Glycosylation

    Directory of Open Access Journals (Sweden)

    Philip M. Jedrzejewski

    2014-03-01

    Full Text Available Glycoproteins represent the largest group of the growing number of biologically-derived medicines. The associated glycan structures and their distribution are known to have a large impact on pharmacokinetics. A modelling framework was developed to provide a link from the extracellular environment and its effect on intracellular metabolites to the distribution of glycans on the constant region of an antibody product. The main focus of this work is the mechanistic in silico reconstruction of the nucleotide sugar donor (NSD) metabolic network by means of 34 species mass balances and the saturation kinetics rates of the 60 metabolic reactions involved. NSDs are the co-substrates of the glycosylation process in the Golgi apparatus and their simulated dynamic intracellular concentration profiles were linked to an existing model describing the distribution of N-linked glycan structures of the antibody constant region. The modelling framework also describes the growth dynamics of the cell population by means of modified Monod kinetics. Simulation results match well to experimental data from a murine hybridoma cell line. The result is a modelling platform which is able to describe the product glycoform based on extracellular conditions. It represents a first step towards the in silico prediction of the glycoform of a biotherapeutic and provides a platform for the optimisation of bioprocess conditions with respect to product quality.
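
    The growth component of such a framework can be illustrated with plain Monod kinetics coupling biomass to a limiting substrate; the sketch below integrates that two-state system with SciPy. All parameter values are invented for illustration, and the paper's modified Monod terms and NSD network are not reproduced here.

        from scipy.integrate import solve_ivp

        MU_MAX, K_S, YIELD = 0.04, 1.0, 5e8  # 1/h, mM, cells/mmol (assumed)

        def monod_rhs(t, state):
            x, s = state                      # biomass (cells/mL), substrate (mM)
            mu = MU_MAX * s / (K_S + s)       # specific growth rate
            return [mu * x, -mu * x / YIELD]  # growth and substrate consumption

        sol = solve_ivp(monod_rhs, (0.0, 120.0), [1e5, 20.0], max_step=1.0)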

  14. Quantitative assessment of manual and robotic microcannulation for eye surgery using new eye model.

    Science.gov (United States)

    Tanaka, Shinichi; Harada, Kanako; Ida, Yoshiki; Tomita, Kyohei; Kato, Ippei; Arai, Fumihito; Ueta, Takashi; Noda, Yasuo; Sugita, Naohiko; Mitsuishi, Mamoru

    2015-06-01

    Microcannulation, a surgical procedure for the eye that requires drug injection into a 60-90 µm retinal vein, is difficult to perform manually. Robotic assistance has been proposed; however, its effectiveness in comparison to manual operation has not been quantified. An eye model has been developed to quantify the performance of manual and robotic microcannulation. The eye model, which is implemented with a force sensor and microchannels, also simulates the mechanical constraints of the instrument's movement. Ten subjects performed microcannulation using the model, with and without robotic assistance. The results showed that the robotic assistance was useful for motion stability when the drug was injected, whereas its positioning accuracy offered no advantage. An eye model was used to quantitatively assess the robotic microcannulation performance in comparison to manual operation. This approach could be valid for a better evaluation of surgical robotic assistance. Copyright © 2014 John Wiley & Sons, Ltd.

  15. SOME USES OF MODELS OF QUANTITATIVE GENETIC SELECTION IN SOCIAL SCIENCE.

    Science.gov (United States)

    Weight, Michael D; Harpending, Henry

    2017-01-01

    The theory of selection of quantitative traits is widely used in evolutionary biology, agriculture and other related fields. The fundamental model known as the breeder's equation is simple, robust over short time scales, and it is often possible to estimate plausible parameters. In this paper it is suggested that the results of this model provide useful yardsticks for the description of social traits and the evaluation of transmission models. The differences on a standard personality test between samples of Old Order Amish and Indiana rural young men from the same county and the decline of homicide in Medieval Europe are used as illustrative examples of the overall approach. It is shown that the decline of homicide is unremarkable under a threshold model while the differences between rural Amish and non-Amish young men are too large to be a plausible outcome of simple genetic selection in which assortative mating by affiliation is equivalent to truncation selection.
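
    The breeder's equation referred to above is short enough to compute directly: the response per generation is R = h^2 * S, and under truncation selection the selection differential is S = i * sigma_p, with intensity i determined by the selected proportion. A minimal sketch, with illustrative inputs:

        from scipy.stats import norm

        def response_to_selection(h2, sigma_p, p_selected):
            """R = h^2 * i * sigma_p for truncation selection of proportion p."""
            x = norm.ppf(1.0 - p_selected)        # truncation point (SD units)
            i = norm.pdf(x) / p_selected          # selection intensity
            return h2 * i * sigma_p

        # e.g. response_to_selection(0.5, 10.0, 0.2) for a trait with SD 10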

  16. Quantitative analysis of anaerobic oxidation of methane (AOM) in marine sediments: A modeling perspective

    Science.gov (United States)

    Regnier, P.; Dale, A. W.; Arndt, S.; LaRowe, D. E.; Mogollón, J.; Van Cappellen, P.

    2011-05-01

    Recent developments in the quantitative modeling of methane dynamics and anaerobic oxidation of methane (AOM) in marine sediments are critically reviewed. The first part of the review begins with a comparison of alternative kinetic models for AOM. The roles of bioenergetic limitations, intermediate compounds and biomass growth are highlighted. Next, the key transport mechanisms in multi-phase sedimentary environments affecting AOM and methane fluxes are briefly treated, while attention is also given to additional controls on methane and sulfate turnover, including organic matter mineralization, sulfur cycling and methane phase transitions. In the second part of the review, the structure, forcing functions and parameterization of published models of AOM in sediments are analyzed. The six-orders-of-magnitude range in rate constants reported for the widely used bimolecular rate law for AOM emphasizes the limited transferability of this simple kinetic model and, hence, the need for more comprehensive descriptions of the AOM reaction system. The derivation and implementation of more complete reaction models, however, are limited by the availability of observational data. In this context, we attempt to rank the relative benefits of potential experimental measurements that should help to better constrain AOM models. The last part of the review presents a compilation of reported depth-integrated AOM rates (ΣAOM). These rates reveal the extreme variability of ΣAOM in marine sediments. The model results are further used to derive quantitative relationships between ΣAOM and the magnitude of externally impressed fluid flow, as well as between ΣAOM and the depth of the sulfate-methane transition zone (SMTZ). This review contributes to an improved understanding of the global significance of the AOM process, and helps identify outstanding questions and future directions in the modeling of methane cycling and AOM in marine sediments.
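
    The bimolecular rate law discussed in the review, and one common more comprehensive alternative with half-saturation terms, can be written down in a few lines. The sketch below is illustrative; units and parameter values must come from the specific sediment setting.

        def aom_bimolecular(k, ch4, so4):
            # R = k * [CH4] * [SO4]; reported values of k span six orders
            # of magnitude, as noted in the review.
            return k * ch4 * so4

        def aom_dual_monod(r_max, ch4, so4, k_ch4, k_so4):
            # Michaelis-Menten-type kinetics in both substrates, a common
            # alternative to the simple bimolecular law.
            return r_max * (ch4 / (k_ch4 + ch4)) * (so4 / (k_so4 + so4))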

  17. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    Full Text Available The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same kind of controller but also different kinds of controllers, providing a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The tests, whose outcomes are similar to those of the traditional qualitative analysis, demonstrate that our approach obtains specific security values for different controllers and produces more accurate results.

  18. Universally applicable model for the quantitative determination of lake sediment composition using fourier transform infrared spectroscopy.

    Science.gov (United States)

    Rosén, Peter; Vogel, Hendrik; Cunningham, Laura; Hahn, Annette; Hausmann, Sonja; Pienitz, Reinhard; Zolitschka, Bernd; Wagner, Bernd; Persson, Per

    2011-10-15

    Fourier transform infrared spectroscopy (FTIRS) can provide detailed information on organic and minerogenic constituents of sediment records. Based on a large number of sediment samples of varying age (0-340,000 yrs) and from very diverse lake settings in Antarctica, Argentina, Canada, Macedonia/Albania, Siberia, and Sweden, we have developed universally applicable calibration models for the quantitative determination of biogenic silica (BSi; n = 816), total inorganic carbon (TIC; n = 879), and total organic carbon (TOC; n = 3164) using FTIRS. These models are based on the differential absorbance of infrared radiation at specific wavelengths with varying concentrations of individual parameters, due to molecular vibrations associated with each parameter. The calibration models have low prediction errors and the predicted values are highly correlated with conventionally measured values (R = 0.94-0.99). Robustness tests indicate the accuracy of the newly developed FTIRS calibration models is similar to that of conventional geochemical analyses. Consequently FTIRS offers a useful and rapid alternative to conventional analyses for the quantitative determination of BSi, TIC, and TOC. The rapidity, cost-effectiveness, and small sample size required enables FTIRS determination of geochemical properties to be undertaken at higher resolutions than would otherwise be possible with the same resource allocation, thus providing crucial sedimentological information for climatic and environmental reconstructions.

  19. Simulink models for performance analysis of high speed DQPSK modulated optical link

    Energy Technology Data Exchange (ETDEWEB)

    Sharan, Lucky, E-mail: luckysharan@pilani.bits-pilani.ac.in; Rupanshi,, E-mail: f2011222@pilani.bits-pilani.ac.in; Chaubey, V. K., E-mail: vkc@pilani.bits-pilani.ac.in [EEE Department, BITS-Pilani, Rajasthan, 333031 (India)

    2016-03-09

    This paper attempts to present the design approach for development of simulation models to study and analyze the transmission of 10 Gbps DQPSK signal over a single channel Peer to Peer link using Matlab Simulink. The simulation model considers the different optical components used in link design with their behavior represented initially by theoretical interpretation, including the transmitter topology, Mach Zehnder Modulator (MZM) module and the propagation model for optical fibers etc., thus allowing scope for direct realization in experimental configurations. It provides the flexibility to incorporate the various photonic components as either user-defined or fixed and, can also be enhanced or removed from the model as per the design requirements. We describe the detailed operation and need of every component model and its representation in Simulink blocksets. Moreover the developed model can be extended in future to support Dense Wavelength Division Multiplexing (DWDM) system, thereby allowing high speed transmission with N × 40 Gbps systems. The various compensation techniques and their influence on system performance can be easily investigated by using such models.

  1. Security in the data link layer of the OSI model on LANs wired Cisco

    OpenAIRE

    Moreira Santos, María Genoveva; Alcívar Marcillo, Pedro Antonio

    2018-01-01

    No technology or protocol in a network infrastructure is completely secure; for this reason, this document aims to demonstrate the importance of configuring security options on network equipment. On this occasion we focus on the data link layer of the OSI model, which is where controls have begun to be implemented at the protocol level. The tools used in the research facilitate the implementation of a virtual laboratory, which consists of a base operating system (windo...

  2. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L(2) penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.

  3. Effects of Noninhibitory Serpin Maspin on the Actin Cytoskeleton: A Quantitative Image Modeling Approach.

    Science.gov (United States)

    Al-Mamun, Mohammed; Ravenhill, Lorna; Srisukkham, Worawut; Hossain, Alamgir; Fall, Charles; Ellis, Vincent; Bass, Rosemary

    2016-04-01

    Recent developments in quantitative image analysis allow us to interrogate confocal microscopy images to answer biological questions. Clumped and layered cell nuclei and cytoplasm in confocal images challenge the ability to identify subcellular compartments. To date, there is no perfect image analysis method to identify cytoskeletal changes in confocal images. Here, we present a multidisciplinary study where an image analysis model was developed to allow quantitative measurements of changes in the cytoskeleton of cells with different maspin exposure. Maspin, a noninhibitory serpin, influences cell migration, adhesion, invasion, proliferation, and apoptosis in ways that are consistent with its identification as a tumor metastasis suppressor. Using different cell types, we tested the hypothesis that reduction in cell migration by maspin would be reflected in the architecture of the actin cytoskeleton. A hybrid marker-controlled watershed segmentation technique was used to segment the nuclei, cytoplasm, and ruffling regions before measuring cytoskeletal changes. This was informed by immunohistochemical staining of cells transfected stably or transiently with maspin proteins, or with added bioactive peptides or protein. Image analysis results showed that the effects of maspin were mirrored by effects on cell architecture, in a way that could be described quantitatively.
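
    A marker-controlled watershed of the general kind described above can be sketched with scikit-image: threshold the image, compute a distance transform, seed markers at its strong peaks, and flood. The threshold choices below are illustrative placeholders, not the authors' hybrid method.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.segmentation import watershed

        def segment_nuclei(image):
            mask = image > threshold_otsu(image)             # foreground mask
            distance = ndi.distance_transform_edt(mask)
            # Seed markers at strong distance-transform peaks.
            markers, _ = ndi.label(distance > 0.6 * distance.max())
            return watershed(-distance, markers, mask=mask)  # labeled regions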

  4. Mechanisms linking socioeconomic status to smoking cessation: a structural equation modeling approach.

    Science.gov (United States)

    Businelle, Michael S; Kendzor, Darla E; Reitzel, Lorraine R; Costello, Tracy J; Cofta-Woerpel, Ludmila; Li, Yisheng; Mazas, Carlos A; Vidrine, Jennifer Irvin; Cinciripini, Paul M; Greisinger, Anthony J; Wetter, David W

    2010-05-01

    Although there has been a socioeconomic gradient in smoking prevalence, cessation, and disease burden for decades, these disparities have become even more pronounced over time. The aim of the current study was to develop and test a conceptual model of the mechanisms linking socioeconomic status (SES) to smoking cessation. The conceptual model was evaluated using a latent variable modeling approach in a sample of 424 smokers seeking treatment (34% African American; 33% Latino; 33% White). Hypothesized mechanisms included social support, neighborhood disadvantage, negative affect/stress, agency, and craving. The primary outcome was Week 4 smoking status. As was hypothesized, SES had significant direct and indirect effects on cessation. Specifically, neighborhood disadvantage, social support, negative affect/stress, and agency mediated the relation between SES and smoking cessation. A multiple group analysis indicated that the model was a good fit across racial/ethnic groups. The present study yielded one of the more comprehensive models illuminating the specific mechanisms that link SES and smoking cessation. Policy, community, and individual-level interventions that target low SES smokers and address the specific pathways identified in the current model could potentially attenuate the impact of SES on cessation. (c) 2010 APA, all rights reserved.

  5. Quantitative model for the generic 3D shape of ICMEs at 1 AU

    Science.gov (United States)

    Démoulin, P.; Janvier, M.; Masías-Meza, J. J.; Dasso, S.

    2016-10-01

    Context. Interplanetary imagers provide 2D projected views of the densest plasma parts of interplanetary coronal mass ejections (ICMEs), while in situ measurements provide magnetic field and plasma parameter measurements along the spacecraft trajectory, that is, along a 1D cut. The data therefore only give a partial view of the 3D structures of ICMEs. Aims: By studying a large number of ICMEs, crossed at different distances from their apex, we develop statistical methods to obtain a quantitative generic 3D shape of ICMEs. Methods: In a first approach we theoretically obtained the expected statistical distribution of the shock-normal orientation from assuming simple models of 3D shock shapes, including distorted profiles, and compared their compatibility with observed distributions. In a second approach we used the shock normal and the flux rope axis orientations together with the impact parameter to provide statistical information across the spacecraft trajectory. Results: The study of different 3D shock models shows that the observations are compatible with a shock that is symmetric around the Sun-apex line as well as with an asymmetry up to an aspect ratio of around 3. Moreover, flat or dipped shock surfaces near their apex can only be rare cases. Next, the sheath thickness and the ICME velocity have no global trend along the ICME front. Finally, regrouping all these new results and those of our previous articles, we provide a quantitative ICME generic 3D shape, including the global shape of the shock, the sheath, and the flux rope. Conclusions: The obtained quantitative generic ICME shape will have implications for several aims. For example, it constrains the output of typical ICME numerical simulations. It is also a base for studying the transport of high-energy solar and cosmic particles during an ICME propagation as well as for modeling and forecasting space weather conditions near Earth.

  6. Modeling, control and simulation of a chain link STATCOM in EMTP-RV

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Nikunj M. [Siemens Energy and Automation, 100 Technology Drive, Alpharetta, GA 30005 (United States); Sood, Vijay K. [Faculty of Engineering and Applied Science, University of Ontario Institute of Technology (UOIT), Oshawa, ON L1H 7K4 (Canada); Ramachandran, V. [Department of Electrical and Computer Engineering, Concordia University, Montreal, QC H3G 1M8 (Canada)

    2009-03-15

    This paper describes an alternative STATic synchronous COMpensator (STATCOM), built by connecting a number of gate turn-off (GTO) thyristor converters in series on the ac side of the system. Each GTO converter forms one 'link' of a 1-phase, full-bridge voltage-source-converter (VSC) and is referred to as a 'Chain Link Converter' (CLC). Each GTO of a chain link STATCOM (CLS) is switched 'ON/OFF' only once per cycle of the fundamental frequency by using a sinusoidal pulse width modulation (SPWM) technique. Approximate models of a 3-phase CLS using the dq-transformation are derived to design two controllers for controlling reactive current and ac voltage to stabilize the system voltage at the point of common coupling (PCC). A novel technique, called the rotated gate signal pattern (RGSP), is used for balancing the voltages of the link dc capacitors. The performance of the CLS when used in a radial line is investigated under steady- and transient-state operating conditions by means of the simulation package EMTP-RV, and the results are presented. (author)

  7. Computational modeling in nanomedicine: prediction of multiple antibacterial profiles of nanoparticles using a quantitative structure-activity relationship perturbation model.

    Science.gov (United States)

    Speck-Planche, Alejandro; Kleandrova, Valeria V; Luan, Feng; Cordeiro, Maria Natália D S

    2015-01-01

    We introduce the first quantitative structure-activity relationship (QSAR) perturbation model for probing multiple antibacterial profiles of nanoparticles (NPs) under diverse experimental conditions. The dataset is based on 300 nanoparticles containing dissimilar chemical compositions, sizes, shapes and surface coatings. In general terms, the NPs were tested against different bacteria, by considering several measures of antibacterial activity and diverse assay times. The QSAR perturbation model was created from 69,231 nanoparticle-nanoparticle (NP-NP) pairs, which were randomly generated using a recently reported perturbation theory approach. The model displayed an accuracy rate of approximately 98% for classifying NPs as active or inactive, and a new copper-silver nanoalloy was correctly predicted by this model with consensus accuracy of 77.73%. Our QSAR perturbation model can be used as an efficacious tool for the virtual screening of antibacterial nanomaterials.

  8. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome in the binary scheme of NAEP has three parts: the first represents the topology structure of the neural network, the second the selection of wavelengths in the spectral data, and the third the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. To demonstrate the effectiveness of the method, partial least squares with the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network with the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method are all used to build component prediction models. Experimental results verify that the proposed method predicts more accurately and robustly and is a practical spectral analysis tool.

  9. Spatiotemporal microbiota dynamics from quantitative in vitro and in silico models of the gut

    Science.gov (United States)

    Hwa, Terence

    The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth behaviors, which ultimately dictate the gut microbiota composition. Combining measurements of bacterial growth physiology with analysis of published data on human physiology into a quantitative modeling framework, we show how hydrodynamic forces in the colon, in concert with other physiological factors, determine the abundances of the major bacterial phyla in the gut. Our model quantitatively explains the observed variation of microbiota composition among healthy adults, and predicts colonic water absorption (manifested as stool consistency) and nutrient intake to be two key factors determining this composition. The model further reveals that both factors, which have been identified in recent correlative studies, exert their effects through the same mechanism: changes in colonic pH that differentially affect the growth of different bacteria. Our findings show that a predictive and mechanistic understanding of microbial ecology in the human gut is possible, and offer the hope for the rational design of intervention strategies to actively control the microbiota. This work is supported by the Bill and Melinda Gates Foundation.

  10. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  11. Quantitative Analysis of Situation Awareness (QASA): modelling and measuring situation awareness using signal detection theory.

    Science.gov (United States)

    Edgar, Graham K; Catherwood, Di; Baker, Steven; Sallis, Geoff; Bertels, Michael; Edgar, Helen E; Nikolla, Dritan; Buckle, Susanna; Goodwin, Charlotte; Whelan, Allana

    2017-12-29

    This paper presents a model of situation awareness (SA) that emphasises that SA is necessarily built using a subset of available information. A technique (Quantitative Analysis of Situation Awareness - QASA), based around signal detection theory, has been developed from this model that provides separate measures of actual SA (ASA) and perceived SA (PSA), together with a feature unique to QASA, a measure of bias (information acceptance). These measures allow the exploration of the relationship between actual SA, perceived SA and information acceptance. QASA can also be used for the measurement of dynamic ASA, PSA and bias. Example studies are presented and full details of the implementation of the QASA technique are provided. Practitioner Summary: This paper presents a new model of situation awareness (SA) together with an associated tool (Quantitative Analysis of Situation Awareness - QASA) that employs signal detection theory to measure several aspects of SA, including actual and perceived SA and information acceptance. Full details are given of the implementation of the tool.
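
    Although the QASA implementation itself is described in the paper, its signal-detection backbone is standard: sensitivity and a bias measure are computed from hit and false-alarm rates. A minimal sketch, assuming rates strictly between 0 and 1:

        from scipy.stats import norm

        def sdt_measures(hit_rate, fa_rate):
            # Classic signal detection theory: d' (sensitivity) and c (bias).
            d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
            criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
            return d_prime, criterion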

  12. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor of alternative nuclear fuel cycle systems. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of the future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.
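
    Read literally, the circuit analogy above treats proliferation risk like a current: a motivation "electromotive force" divided by the total resistance of technical and political barriers. The sketch below is only a toy rendering of that analogy with invented numbers, not the study's actual index.

        def proliferation_risk(motivation, resistances):
            """Ohm's-law analogy: risk ~ motivation / sum of series resistances."""
            return motivation / sum(resistances)

        # e.g. proliferation_risk(2.0, [0.5, 1.2, 0.8])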

  13. Searching for recursive causal structures in multivariate quantitative genetics mixed models.

    Science.gov (United States)

    Valente, Bruno D; Rosa, Guilherme J M; de Los Campos, Gustavo; Gianola, Daniel; Silva, Martinho A

    2010-06-01

    Biology is characterized by complex interactions between phenotypes, such as recursive and simultaneous relationships between substrates and enzymes in biochemical systems. Structural equation models (SEMs) can be used to study such relationships in multivariate analyses, e.g., with multiple traits in a quantitative genetics context. Nonetheless, the number of different recursive causal structures that can be used for fitting a SEM to multivariate data can be huge, even when only a few traits are considered. In recent applications of SEMs in mixed-model quantitative genetics settings, causal structures were preselected on the basis of prior biological knowledge alone. Therefore, the wide range of possible causal structures has not been properly explored. Alternatively, causal structure spaces can be explored using algorithms that, using data-driven evidence, can search for structures that are compatible with the joint distribution of the variables under study. However, the search cannot be performed directly on the joint distribution of the phenotypes as it is possibly confounded by genetic covariance among traits. In this article we propose to search for recursive causal structures among phenotypes using the inductive causation (IC) algorithm after adjusting the data for genetic effects. A standard multiple-trait model is fitted using Bayesian methods to obtain a posterior covariance matrix of phenotypes conditional to unobservable additive genetic effects, which is then used as input for the IC algorithm. As an illustrative example, the proposed methodology was applied to simulated data related to multiple traits measured on a set of inbred lines.
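
    One building block of such a search can be sketched as follows: given the phenotypic covariance matrix conditional on additive genetic effects, partial correlations are obtained from the precision matrix and tested for zero, which is the conditional-independence query the IC algorithm issues repeatedly. The matrix and sample size below are hypothetical, and the full IC algorithm adds edge-orientation steps not shown here.

    ```python
    import numpy as np
    from scipy import stats

    def partial_corr(cov):
        """Partial correlation of each pair of variables given all others,
        read off the precision matrix (inverse covariance)."""
        P = np.linalg.inv(cov)
        d = np.sqrt(np.diag(P))
        return -P / np.outer(d, d) + 2 * np.eye(len(cov))  # fix the diagonal to 1

    def is_cond_independent(pc, n, k, alpha=0.05):
        """Fisher z-test of zero partial correlation (n obs., k conditioning vars)."""
        z = 0.5 * np.log((1 + pc) / (1 - pc))
        p_value = 2 * stats.norm.sf(abs(z) * np.sqrt(n - k - 3))
        return p_value > alpha

    # Hypothetical covariance among 3 traits, conditional on additive genetic effects.
    S = np.array([[1.00, 0.50, 0.25],
                  [0.50, 1.00, 0.50],
                  [0.25, 0.50, 1.00]])
    PC = partial_corr(S)
    print("trait1 _||_ trait3 | trait2 ?", is_cond_independent(PC[0, 2], n=200, k=1))
    ```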

  14. Development and Validation of Quantitative Structure-Activity Relationship Models for Compounds Acting on Serotoninergic Receptors

    Directory of Open Access Journals (Sweden)

    Grażyna Żydek

    2012-01-01

    A quantitative structure-activity relationship (QSAR) study has been made on 20 compounds with serotonin (5-HT) receptor affinity. Thin-layer chromatographic (TLC) data and physicochemical parameters were applied in this study. RP2 TLC 60F254 plates (silanized), impregnated with solutions of propionic acid, ethylbenzene, 4-ethylphenol, and propionamide (used as analogues of the key receptor amino acids) and with their mixtures (denoted as S1–S7), were used as biochromatographic models in two developing phases as a model of drug–5-HT receptor interaction. The semiempirical method AM1 (HyperChem v. 7.0 program) and the ACD/Labs v. 8.0 program were employed to calculate a set of physicochemical parameters for the investigated compounds. Correlation and multiple linear regression analysis were used to search for the best QSAR equations. The correlations obtained for the compounds studied represent their interactions with the proposed biochromatographic models. The good multivariate relationships (R2 = 0.78–0.84) obtained by means of regression analysis can be used for predicting the quantitative effect of biological activity of different compounds with 5-HT receptor affinity. “Leave-one-out” (LOO) and “leave-N-out” (LNO) cross-validation methods were used to judge the predictive power of the final regression equations.
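
    The LOO validation step generalizes directly; the sketch below computes a cross-validated Q² for a multiple linear regression on hypothetical descriptor data (it is not the study's dataset or code). Here Q² = 1 - PRESS/TSS, where PRESS accumulates the squared errors of predictions made with each compound left out in turn.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(1)

    # Hypothetical data: 20 compounds, 3 descriptors (e.g., chromatographic retention
    # plus physicochemical parameters), and a biological activity to model.
    X = rng.normal(size=(20, 3))
    y = X @ np.array([0.8, -0.5, 0.3]) + rng.normal(0, 0.3, 20)

    model = LinearRegression()
    y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())  # leave-one-out predictions

    press = np.sum((y - y_loo) ** 2)    # predictive residual sum of squares
    tss = np.sum((y - y.mean()) ** 2)
    q2 = 1 - press / tss                # cross-validated Q^2
    r2 = model.fit(X, y).score(X, y)    # conventional R^2 of the full fit
    print(f"R^2 = {r2:.3f}, LOO Q^2 = {q2:.3f}")
    ```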

  15. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor in assessing alternative nuclear fuel cycle systems. A model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of a future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and can consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.

  16. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices.This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  17. Tumour-cell killing by X-rays and immunity quantitated in a mouse model system

    International Nuclear Information System (INIS)

    Porteous, D.D.; Porteous, K.M.; Hughes, M.J.

    1979-01-01

    As part of an investigation of the interaction of X-rays and immune cytotoxicity in tumour control, an experimental mouse model system has been used in which quantitative anti-tumour immunity was raised in prospective recipients of tumour-cell suspensions exposed to varying doses of X-rays in vitro before injection. Findings reported here indicate that, whilst X-rays kill a proportion of cells, induced immunity deals with a fixed number dependent upon the immune status of the host, and that X-rays and anti-tumour immunity do not act synergistically in tumour-cell killing. The tumour used was the ascites sarcoma BP8. (author)

  18. Modeling of microfluidic microbial fuel cells using quantitative bacterial transport parameters

    Science.gov (United States)

    Mardanpour, Mohammad Mahdi; Yaghmaei, Soheila; Kalantar, Mohammad

    2017-02-01

    The objective of the present study is to analyze the dynamic modeling of bioelectrochemical processes and to improve the performance of previous models using quantitative data on bacterial transport parameters. The main deficiency of previous MFC models concerning the spatial distribution of biocatalysts is the assumption of an initial distribution of attached/suspended bacteria on the electrode or in the anolyte bulk, which is the foundation for biofilm formation. In order to remedy this imperfection, the quantification of chemotactic motility, aimed at understanding how suspended microorganisms distribute in the anolyte and/or attach to the anode surface to extend the biofilm, is implemented numerically. The spatial and temporal distributions of the bacteria, as well as the dynamic behavior of the anolyte and biofilm, are simulated. The performance of the microfluidic MFC as a chemotaxis assay is assessed by analyzing the bacterial activity, substrate variation, bioelectricity production rate, and the influence of external resistance on the features of the biofilm and anolyte.

  19. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while giving users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that can't be determined without guesswork. It was tested in vulnerability assessment activities on real production systems and, in theory, by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
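
    The theoretical test described above (discrete uniform assumptions on every risk factor, then many rounds of Monte Carlo simulation) can be sketched as follows. The factor grouping mirrors the OWASP risk rating structure of eight likelihood and eight impact factors scored 0-9, but the aggregation and threshold below are illustrative assumptions rather than the paper's exact method.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 1_500_000    # rounds of simulation, as in the study

    # Discrete uniform assumptions (scores 0-9) for every factor; in a real assessment
    # each factor would get its own distribution calibrated by the assessor.
    likelihood_factors = rng.integers(0, 10, size=(N, 8))  # threat-agent + vulnerability
    impact_factors = rng.integers(0, 10, size=(N, 8))      # technical + business impact

    likelihood = likelihood_factors.mean(axis=1)
    impact = impact_factors.mean(axis=1)
    risk = likelihood * impact        # severity score on a 0-81 scale

    print("mean risk:", risk.mean().round(2))
    print("90th percentile:", np.percentile(risk, 90).round(2))
    print("P(risk >= 45, i.e. 'high'):", (risk >= 45).mean().round(4))
    ```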

  20. Ontologies to Support RFID-Based Link between Virtual Models and Construction Components

    DEFF Research Database (Denmark)

    Sørensen, Kristian Birch; Christiansson, Per; Svidt, Kjeld

    2010-01-01

    as the foundation for information sharing between trading partners, reuse of data from one phase in construction to the next, integration of process and product models with enterprise resource planning (ERP) systems, easy access of information, communication of data through networks, reading of data stored...... of errors, it gives a better production basis, and it improves clarity and enhances communication compared to traditional 2D drafting methods. However, there is still much unutilized potential in the virtual models, especially for use in the construction and operation phases. A digital link between...

  1. On the missing link in ecology: improving communication between modellers and experimentalists

    DEFF Research Database (Denmark)

    Heuschele, Jan; Ekvall, Mikael T.; Mariani, Patrizio

    2017-01-01

    Collaboration between modellers and experimentalists is essential in ecological research; however, various obstacles to linking both camps often hinder scientific progress. In this commentary, we discuss several issues of the current state of affairs in this research loop. Backed by an online survey...... amongst fellow ecologists, modellers and experimentalists alike, we identify two major areas that need to be mended. Firstly, differences in language and jargon lead to a lack of exchange of ideas and to unrealistic mutual expectations. And secondly, constrained data sharing, accessibility and quality...... future research collaboration and to increase the impact of single ecological studies alike....

  2. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail, how the theoretical model and its...

  3. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    Science.gov (United States)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth’s climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves--even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m^2 / MPa * s, whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant’s vegetative organs in jeopardy during drought events. Analysis of the thickness-to-span ratio of Asteroxylon’s tracheids suggests that environmental conditions of reduced relative
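
    The kind of calculation underlying such hydraulic models can be illustrated with the Hagen-Poiseuille law for an ideal capillary, whose fourth-power dependence on radius explains why a conduit must be several times wider to match a narrow but efficient conducting cell. The sketch below is a simplification with hypothetical lumen diameters; published models add end-wall and pit resistances that substantially reduce these ideal values.

    ```python
    import math

    def hagen_poiseuille_conductivity(radius_m, viscosity_pa_s=1.002e-3):
        """Volumetric flow per unit pressure gradient of one ideal capillary:
        k = pi * r^4 / (8 * mu), in m^4 / (Pa s)."""
        return math.pi * radius_m ** 4 / (8 * viscosity_pa_s)

    # Hypothetical tracheid lumen diameters, for illustration only.
    for name, diameter in [("narrow tracheid", 10e-6), ("wide tracheid", 30e-6)]:
        k = hagen_poiseuille_conductivity(diameter / 2)
        area = math.pi * (diameter / 2) ** 2
        # Normalizing by lumen area gives a conductivity in m^2/(Pa s), the same kind
        # of quantity (not the same values) as the m^2/(MPa s) figures quoted above.
        print(f"{name}: k = {k:.3e} m^4/(Pa s), k/area = {k / area:.3e} m^2/(Pa s)")
    ```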

  4. Linking Land Change with Driving Forces and Actors: Four Conceptual Models

    Directory of Open Access Journals (Sweden)

    Anna M. Hersperger

    2010-12-01

    Full Text Available Models in land change research are often chosen arbitrarily based on practical rather than theoretical considerations. More specifically, research on land change is often based on a research framework with three crucial elements - driving forces, actors, and land change - in an ad hoc and case-specific configuration. The lack of solid and widely applicable concepts about the conceptual link between these three elements can negatively affect individual research projects and hamper communication and generalizations beyond the individual project. We present four basic models for linking land change with driving forces and actors. These models are illustrated with examples from the research literature. Based on the main characteristics of the models and practical considerations, we propose guidelines for choosing among the four models for specific studies. More generally, we want to raise awareness that land change research is especially demanding with respect to conceptual backgrounds and that conceptual considerations will help improve the scientific quality of individual studies as well as their potential contribution towards generic theories of land change.

  5. Development of quantitative atomic modeling for tungsten transport study using LHD plasma with tungsten pellet injection

    Science.gov (United States)

    Murakami, I.; Sakaue, H. A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2015-09-01

    Quantitative tungsten study with reliable atomic modeling is important for the successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding the tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from plasmas of the Large Helical Device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) emission of W24+ to W33+ ions at 1.5-3.5 nm is sensitive to electron temperature and useful for examining the tungsten behavior in edge plasmas. We can reproduce measured EUV spectra at 1.5-3.5 nm with spectra calculated using the tungsten atomic model, and obtain charge state distributions of tungsten ions in LHD plasmas at different temperatures around 1 keV. Our model is applied to calculate the unresolved transition array (UTA) seen in tungsten spectra at 4.5-7 nm. We analyze the effect of configuration interaction on population kinetics related to the UTA structure in detail and find the importance of two-electron-one-photon transitions between the 4p^5 4d^(n+1) and 4p^6 4d^(n-1) 4f configurations. The radiation power rate of tungsten due to line emissions is also estimated with the model and is consistent with other models within a factor of 2.

  6. Quantitative Structure-activity Relationship (QSAR) Models for Docking Score Correction.

    Science.gov (United States)

    Fukunishi, Yoshifumi; Yamasaki, Satoshi; Yasumatsu, Isao; Takeuchi, Koh; Kurosawa, Takashi; Nakamura, Haruki

    2017-01-01

    In order to improve docking score correction, we developed several structure-based quantitative structure activity relationship (QSAR) models by protein-drug docking simulations and applied these models to public affinity data. The prediction models used descriptor-based regression, and the compound descriptor was a set of docking scores against multiple (∼600) proteins including nontargets. The binding free energy that corresponded to the docking score was approximated by a weighted average of docking scores for multiple proteins, and we tried linear, weighted linear and polynomial regression models considering the compound similarities. In addition, we tried a combination of these regression models for individual data sets such as IC50, Ki, and %inhibition values. The cross-validation results showed that the weighted linear model was more accurate than the simple linear regression model. Thus, the QSAR approaches based on the affinity data of public databases should improve docking scores. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
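
    The descriptor idea is easy to sketch: each compound is represented by its vector of docking scores against several hundred proteins, and measured affinity is regressed on that vector. The sketch below uses ridge regularization to keep the descriptors-greater-than-compounds problem well posed; that choice, and all data, are assumptions for illustration rather than the authors' exact regression.

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)

    n_compounds, n_proteins = 120, 600
    # Hypothetical descriptors: docking scores of each compound against ~600 proteins
    # (targets and nontargets alike).
    D = rng.normal(-7.0, 1.5, size=(n_compounds, n_proteins))

    # Hypothetical affinities: the "true" binding free energy is taken as a weighted
    # average of a few docking scores, plus noise.
    w = np.zeros(n_proteins)
    w[:10] = rng.uniform(0.05, 0.2, 10)
    pKi = -(D @ w) + rng.normal(0, 0.3, n_compounds)

    model = Ridge(alpha=10.0)   # regularization: many more descriptors than compounds
    scores = cross_val_score(model, D, pKi, cv=5, scoring="r2")
    print("5-fold CV R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))
    ```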

  7. Quantitative computational models of molecular self-assembly in systems biology

    Science.gov (United States)

    Thomas, Marcus; Schwartz, Russell

    2017-06-01

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.

  8. Quantitative computational models of molecular self-assembly in systems biology.

    Science.gov (United States)

    Thomas, Marcus; Schwartz, Russell

    2017-05-23

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.

  9. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    Science.gov (United States)

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Mi...

  10. Satellite Contributions to the Quantitative Characterization of Biomass Burning for Climate Modeling

    Science.gov (United States)

    Ichoku, Charles; Kahn, Ralph; Chin, Mian

    2012-01-01

    Characterization of biomass burning from space has been the subject of an extensive body of literature published over the last few decades. Given the importance of this topic, we review how satellite observations contribute toward improving the representation of biomass burning quantitatively in climate and air-quality modeling and assessment. Satellite observations related to biomass burning may be classified into five broad categories: (i) active fire location and energy release, (ii) burned areas and burn severity, (iii) smoke plume physical disposition, (iv) aerosol distribution and particle properties, and (v) trace gas concentrations. Each of these categories involves multiple parameters used in characterizing specific aspects of the biomass-burning phenomenon. Some of the parameters are merely qualitative, whereas others are quantitative, although all are essential for improving the scientific understanding of the overall distribution (both spatial and temporal) and impacts of biomass burning. Some of the qualitative satellite datasets, such as fire locations, aerosol index, and gas estimates have fairly long-term records. They date back as far as the 1970s, following the launches of the DMSP, Landsat, NOAA, and Nimbus series of earth observation satellites. Although there were additional satellite launches in the 1980s and 1990s, space-based retrieval of quantitative biomass burning data products began in earnest following the launch of Terra in December 1999. Starting in 2000, fire radiative power, aerosol optical thickness and particle properties over land, smoke plume injection height and profile, and essential trace gas concentrations at improved resolutions became available. The 2000s also saw a large list of other new satellite launches, including Aqua, Aura, Envisat, Parasol, and CALIPSO, carrying a host of sophisticated instruments providing high quality measurements of parameters related to biomass burning and other phenomena. These improved data

  11. Quartz Crystal Microbalance Model for Quantitatively Probing the Deformation of Adsorbed Particles at Low Surface Coverage.

    Science.gov (United States)

    Gillissen, Jurriaan J J; Jackman, Joshua A; Tabaei, Seyed R; Yoon, Bo Kyeong; Cho, Nam-Joon

    2017-11-07

    Characterizing the deformation of nanoscale, soft-matter particulates at solid-liquid interfaces is a demanding task, and there are limited experimental options to perform quantitative measurements in a nonperturbative manner. Previous attempts, based on the quartz crystal microbalance (QCM) technique, focused on the high surface coverage regime and modeled the adsorbed particles as a homogeneous film, while not considering the coupling between particles and surrounding fluid and hence resulting in an underestimation of the known particle height. In this work, we develop a model for the hydrodynamic coupling between adsorbed particles and surrounding fluid in the limit of a low surface coverage, which can be used to extract shape information from QCM measurement data. We tackle this problem by using hydrodynamic simulations of an ellipsoidal particle on an oscillating surface. From the simulation results, we derived a phenomenological relation between the aspect ratio r of the absorbed particles and the slope and intercept of the line that fits instantaneous, overtone-dependent QCM data on (δ/a, -Δf/n) coordinates where δ is the viscous penetration depth, a is the particle radius, Δf is the QCM frequency shift, and n is the overtone number. The model was applied to QCM measurement data pertaining to the adsorption of 34 nm radius, fluid-phase and gel-phase liposomes onto a titanium oxide-coated surface. The osmotic pressure across the liposomal bilayer was varied to induce shape deformation. By combining these results with a membrane bending model, we determined the membrane bending energy for the gel-phase liposomes, and the results are consistent with literature values. In summary, a phenomenological model is presented and validated in order to show for the first time that QCM experiments can quantitatively measure the deformation of adsorbed particles at low surface coverage.
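
    The fitting recipe in the abstract can be sketched in a few lines: compute the viscous penetration depth for each overtone, place the measurements on (δ/a, -Δf/n) coordinates, and fit a straight line whose slope and intercept the paper's phenomenological relation then maps to the particle aspect ratio. The frequency shifts below are invented numbers, and the mapping to aspect ratio itself is specific to the paper's simulations and is not reproduced here.

    ```python
    import numpy as np

    f0 = 5e6                     # fundamental frequency of the crystal (Hz)
    rho, eta = 1000.0, 1.0e-3    # density (kg/m^3) and viscosity (Pa s) of water
    a = 34e-9                    # particle (liposome) radius, as in the study

    overtones = np.array([3, 5, 7, 9, 11])
    df = np.array([-30.0, -26.0, -23.5, -21.5, -20.0])  # hypothetical shifts (Hz)

    # Viscous penetration depth at each overtone:
    # delta = sqrt(2*eta / (rho*omega)) with omega = 2*pi*n*f0.
    delta = np.sqrt(eta / (np.pi * rho * overtones * f0))

    x = delta / a                # abscissa used by the model
    y = -df / overtones          # ordinate: -Delta f / n
    slope, intercept = np.polyfit(x, y, 1)
    print(f"slope = {slope:.2f} Hz, intercept = {intercept:.2f} Hz")
    ```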

  12. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong

    2013-01-01

    Carbonate matrix acidization extends a well's effective drainage radius by dissolving rock and forming conductive channels (wormholes) from the wellbore. Wormholing is a dynamic process that involves a balance between the acid injection rate and the reaction rate. Generally, the injection rate is well defined where injection profiles can be controlled, whereas the reaction rate can be difficult to obtain due to its complex dependency on interstitial velocity, fluid composition, rock surface properties, etc. Conventional wormhole propagation models largely ignore the impact of reaction products. When implemented in a job design, significant errors can result in the treatment fluid schedule, rate, and volume. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics. It is the purpose of this work to properly account for these effects. This is an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures taken to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT-scan-rendered porosity field to capture the finer details of the rock fabric and then simulates the fluid flow through the rock coupled with reactions. Such a validated model can serve as a base to scale up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  13. Qualitative and quantitative examination of the performance of regional air quality models representing different modeling approaches

    International Nuclear Information System (INIS)

    Bhumralkar, C.M.; Ludwig, F.L.; Shannon, J.D.; McNaughton, D.

    1985-04-01

    The calculations of three different air quality models were compared with the best available observations. The comparisons were made without calibrating the models to improve agreement with the observations. Model performance was poor for short averaging times (less than 24 hours). Some of the poor performance can be traced to errors in the input meteorological fields, but errors exist at all levels. It should be noted that these models were not originally designed for treating short-term episodes. For short-term episodes, much of the variance in the data can arise from small spatial scale features that tend to be averaged out over longer periods. These small spatial scale features cannot be resolved with the coarse grids that are used for the meteorological and emissions inputs. Thus, it is not surprising that the models performed better for the longer averaging times. The models compared were RTM-II, ENAMAP-2 and ACID. (17 refs., 5 figs., 4 tabs.)

  14. Reservoir architecture modeling: Nonstationary models for quantitative geological characterization. Final report, April 30, 1998

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, D.; Epili, D.; Kelkar, M.; Redner, R.; Reynolds, A.

    1998-12-01

    The study comprised four investigations: facies architecture; seismic modeling and interpretation; Markov random field and Boolean models for geologic modeling of facies distribution; and estimation of geological architecture using the Bayesian/maximum entropy approach. This report discusses results from all four investigations. Investigations were performed using data from the E and F units of the Middle Frio Formation, Stratton Field, one of the major reservoir intervals in the Gulf Coast Basin.

  15. Kinematic modeling of mobile robot with rocker-bogie link structure

    Science.gov (United States)

    Gang, Taig-Gi; Yi, Soo-Yeong

    2005-12-01

    A method for kinematic modeling of a mobile robot with a rocker-bogie link mechanism is described. Using the well-known concept of instantaneous coordinates, it derives the kinematic model for the full six-degree-of-freedom motion, including the x, y, and z motions and the pitch, roll, and yaw rotations. The kinematic model here comprises both the forward and the inverse kinematic equations. The forward kinematic equation with the wheel Jacobian matrices can be used to obtain the robot position and orientation from the measured wheel velocities and the rocker-bogie joint angles. Conversely, the inverse kinematic equation yields the resulting robot motion, consisting of body velocity and turning rate, from the individual wheel velocities. The kinematic model of the mobile robot was verified through computer simulation.
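
    The wheel-Jacobian idea can be illustrated with a much simpler planar robot (an illustration of the concept, not the paper's six-degree-of-freedom rocker-bogie model): each wheel constrains the body twist through one row of a stacked Jacobian, the redundant system is solved by least squares to estimate body motion from wheel speeds, and the same matrix maps a desired body motion back to wheel speeds.

    ```python
    import numpy as np

    # Planar toy model: wheel i at lateral offset y_i, rolling in the body x-direction,
    # constrains the body twist (vx, omega) through v_wheel_i = vx - omega * y_i
    # (no lateral slip). The paper's model extends this to 6 DOF with joint angles.
    wheel_y = np.array([0.3, 0.3, 0.3, -0.3, -0.3, -0.3])  # hypothetical offsets (m)
    J = np.column_stack([np.ones(6), -wheel_y])            # maps (vx, omega) -> wheel speeds

    # Estimation: measured wheel speeds -> body motion, by least squares
    # (six wheels overdetermine the two body unknowns).
    v_wheels = np.array([1.04, 0.97, 1.01, 0.79, 0.83, 0.80])   # m/s, with noise
    (vx, omega), *_ = np.linalg.lstsq(J, v_wheels, rcond=None)
    print(f"body velocity = {vx:.2f} m/s, turning rate = {omega:.2f} rad/s")

    # Actuation: desired body motion -> individual wheel speeds.
    print("wheel speeds for (1.0 m/s, 0.5 rad/s):", J @ np.array([1.0, 0.5]))
    ```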

  16. Quantitative evaluation and modeling of two-dimensional neovascular network complexity: the surface fractal dimension

    International Nuclear Information System (INIS)

    Grizzi, Fabio; Russo, Carlo; Colombo, Piergiuseppe; Franceschini, Barbara; Frezza, Eldo E; Cobos, Everardo; Chiriva-Internati, Maurizio

    2005-01-01

    Modeling the complex development and growth of tumor angiogenesis using mathematics and biological data is a burgeoning area of cancer research. Architectural complexity is the main feature of every anatomical system, including organs, tissues, cells and sub-cellular entities. The vascular system is a complex network whose geometrical characteristics cannot be properly defined using the principles of Euclidean geometry, which is only capable of interpreting regular and smooth objects that are almost impossible to find in Nature. However, fractal geometry is a more powerful means of quantifying the spatial complexity of real objects. This paper introduces the surface fractal dimension (D_s) as a numerical index of the two-dimensional (2-D) geometrical complexity of tumor vascular networks, and their behavior during computer-simulated changes in vessel density and distribution. We show that D_s significantly depends on the number of vessels and their pattern of distribution. This demonstrates that the quantitative evaluation of the 2-D geometrical complexity of tumor vascular systems can be useful not only to measure its complex architecture, but also to model its development and growth. Studying the fractal properties of neovascularity induces reflections upon the real significance of the complex form of branched anatomical structures, in an attempt to define more appropriate methods of describing them quantitatively. This knowledge can be used to predict the aggressiveness of malignant tumors and design compounds that can halt the process of angiogenesis and influence tumor growth
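
    A standard way to estimate such a dimension from a binarized image of the vascular network is box counting: cover the image with boxes of decreasing size and regress the log of the number of occupied boxes on the log of the inverse box size. The sketch below applies this to a random test pattern; the paper's exact estimation procedure may differ.

    ```python
    import numpy as np

    def box_counting_dimension(image, sizes=(2, 4, 8, 16, 32)):
        """Estimate the box-counting dimension of a 2-D binary pattern.
        image: square boolean array whose True pixels form the structure."""
        n = image.shape[0]
        counts = []
        for s in sizes:
            # Count boxes of side s containing at least one foreground pixel.
            trimmed = image[:n - n % s, :n - n % s]
            boxes = trimmed.reshape(trimmed.shape[0] // s, s, -1, s)
            counts.append(boxes.any(axis=(1, 3)).sum())
        # D is the slope of log(count) versus log(1/size).
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    # Hypothetical stand-in for a binarized vessel image: a random dust pattern.
    rng = np.random.default_rng(3)
    img = rng.random((256, 256)) < 0.08
    print(f"estimated D_s = {box_counting_dimension(img):.2f}")
    ```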

  17. A Cytomorphic Chip for Quantitative Modeling of Fundamental Bio-Molecular Circuits.

    Science.gov (United States)

    2015-08-01

    We describe a 0.35 μm BiCMOS silicon chip that quantitatively models fundamental molecular circuits via efficient log-domain cytomorphic transistor equivalents. These circuits include those for biochemical binding with automatic representation of non-modular and loading behavior, e.g., in cascade and fan-out topologies; for representing variable Hill-coefficient operation and cooperative binding; for representing inducer, transcription-factor, and DNA binding; for probabilistic gene transcription with analogic representations of log-linear and saturating operation; for gain, degradation, and dynamics of mRNA and protein variables in transcription and translation; and, for faithfully representing biological noise via tunable stochastic transistor circuits. The use of on-chip DACs and ADCs enables multiple chips to interact via incoming and outgoing molecular digital data packets and thus create scalable biochemical reaction networks. The use of off-chip digital processors and on-chip digital memory enables programmable connectivity and parameter storage. We show that published static and dynamic MATLAB models of synthetic biological circuits including repressilators, feed-forward loops, and feedback oscillators are in excellent quantitative agreement with those from transistor circuits on the chip. Computationally intensive stochastic Gillespie simulations of molecular production are also rapidly reproduced by the chip and can be reliably tuned over the range of signal-to-noise ratios observed in biological cells.

  18. Application of a linked stress release model in Corinth Gulf and Central Ionian Islands (Greece)

    Science.gov (United States)

    Mangira, Ourania; Vasiliadis, Georgios; Papadimitriou, Eleftheria

    2017-06-01

    Spatio-temporal stress changes and interactions between adjacent fault segments constitute one of the most important components in seismic hazard assessment, as they can alter the occurrence probability of strong earthquakes on these segments. The investigation of the interactions between adjacent areas by means of the linked stress release model is attempted for moderate earthquakes (M ≥ 5.2) in the Corinth Gulf and the Central Ionian Islands (Greece). The study areas were divided into two subareas, based on seismotectonic criteria. The seismicity of each subarea is investigated by means of a stochastic point process, and its behavior is determined by the conditional intensity function, which usually takes an exponential form. A conditional intensity function of Weibull form is used to identify the most appropriate among the models (simple, independent and linked stress release model) for the interpretation of the earthquake generation process. The appropriateness of the models was decided after evaluation via the Akaike information criterion. Despite the fact that the curves of the conditional intensity functions exhibit similar behavior, the use of the exponential-type conditional intensity function seems to fit the data better.
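
    As an illustration of how such models are fitted and compared, the sketch below evaluates the log-likelihood of a simple (single-area) stress release model with an exponential-type conditional intensity and computes its AIC; the paper's linked model adds stress-transfer terms between subareas and considers a Weibull-form intensity instead. The catalog, stress-drop scaling, and starting values are all hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical catalog: event times (years) and magnitudes of M >= 5.2 shocks.
    t = np.array([2.1, 8.4, 15.0, 21.7, 30.2, 33.9, 41.5, 52.3, 60.8, 69.4])
    m = np.array([5.3, 5.8, 5.2, 6.1, 5.4, 5.5, 5.9, 5.2, 5.6, 5.7])
    T = 75.0                             # observation period (years)
    s = 10 ** (0.75 * (m - 5.2))         # stress drops (Benioff-type scaling)

    def neg_log_lik(params):
        """Simple SRM: lambda(t) = exp(a + b*(rho*t - S(t))), where S(t) is the
        stress released by all events before time t."""
        a, b, rho = params
        grid = np.linspace(0.0, T, 5001)
        S = np.array([s[t < g].sum() for g in grid])
        lam = np.exp(a + b * (rho * grid - S))
        integral = np.sum((lam[1:] + lam[:-1]) * np.diff(grid)) / 2.0  # compensator
        S_ev = np.array([s[t < ti].sum() for ti in t])
        log_lam = a + b * (rho * t - S_ev)
        return integral - log_lam.sum()

    res = minimize(neg_log_lik, x0=[-2.0, 0.1, 1.0], method="Nelder-Mead")
    aic = 2 * len(res.x) + 2 * res.fun   # lower AIC indicates the preferred model
    print("fitted (a, b, rho):", np.round(res.x, 3), " AIC:", round(aic, 1))
    ```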

  19. Latent constructs model explaining the attachment-linked variation in autobiographical remembering.

    Science.gov (United States)

    Öner, Sezin; Gülgöz, Sami

    2016-01-01

    In the current study, we proposed a latent constructs model to characterise the qualitative aspects of autobiographical remembering and investigated the structural relations in the model that may vary across individuals. Primarily, we focused on the memories of romantic relationships and argued that attachment anxiety and avoidance would be reflected in the ways that individuals encode, rehearse, or remember autobiographical memories in close relationships. Participants reported two positive and two negative relationship-specific memories and rated the characteristics for each memory. As predicted, the basic memory model yielded appropriate fit, indicating that event characteristics (EC) predicted the frequency of rehearsal (RC) and phenomenology at retrieval (PC). When attachment variables were integrated, the model showed that rehearsal mediated the link between anxiety and PC, especially for negative memories. On the other hand, for avoidance EC was the key factor mediating the link between avoidance and RC, as well as PC. Findings were discussed with respect to autobiographical memory functions emphasising a systematically, integrated framework.

  20. Quantitative Circulatory Physiology: an integrative mathematical model of human physiology for medical education.

    Science.gov (United States)

    Abram, Sean R; Hodnett, Benjamin L; Summers, Richard L; Coleman, Thomas G; Hester, Robert L

    2007-06-01

    We have developed Quantitative Circulatory Physiology (QCP), a mathematical model of integrative human physiology containing over 4,000 variables of biological interactions. This model provides a teaching environment that mimics clinical problems encountered in the practice of medicine. The model structure is based on documented physiological responses within peer-reviewed literature and serves as a dynamic compendium of physiological knowledge. The model is solved using a desktop, Windows-based program, allowing students to calculate time-dependent solutions and interactively alter over 750 parameters that modify physiological function. The model can be used to understand proposed mechanisms of physiological function and the interactions among physiological variables that may not be otherwise intuitively evident. In addition to open-ended or unstructured simulations, we have developed 30 physiological simulations, including heart failure, anemia, diabetes, and hemorrhage. Additional simulations include 29 patient cases in which students are challenged to diagnose the pathophysiology based on their understanding of integrative physiology. In summary, QCP allows students to examine, integrate, and understand a host of physiological factors without causing harm to patients. This model is available as a free download for Windows computers at http://physiology.umc.edu/themodelingworkshop.

  1. Modelling and Quantitative Analysis of LTRACK–A Novel Mobility Management Algorithm

    Directory of Open Access Journals (Sweden)

    Benedek Kovács

    2006-01-01

    This paper discusses the improvements and parameter optimization issues of LTRACK, a recently proposed mobility management algorithm. Mathematical modelling of the algorithm and of the behavior of the Mobile Node (MN) is used to optimize the parameters of LTRACK. A numerical method is given to determine the optimal values of the parameters. Markov chains are used to model both the base algorithm and the so-called loop removal effect. An extended qualitative and quantitative analysis is carried out to compare LTRACK to existing handover mechanisms such as MIP, Hierarchical Mobile IP (HMIP), Dynamic Hierarchical Mobility Management Strategy (DHMIP), Telecommunication Enhanced Mobile IP (TeleMIP), Cellular IP (CIP) and HAWAII. LTRACK is sensitive to network topology and MN behavior, so MN movement modelling is also introduced and discussed with different topologies. The techniques presented here can be used to model not only the LTRACK algorithm but other algorithms too. Many discussions and calculations support the adequacy of our mathematical model in a wide range of cases. The model is valid on various network levels, is scalable vertically in the ISO-OSI layers and also scales well with the number of network elements.

  2. A pulsatile flow model for in vitro quantitative evaluation of prosthetic valve regurgitation

    Directory of Open Access Journals (Sweden)

    S. Giuliatti

    2000-03-01

    A pulsatile pressure-flow model was developed for in vitro quantitative color Doppler flow mapping studies of valvular regurgitation. The flow through the system was generated by a piston which was driven by stepper motors controlled by a computer. The piston was connected to acrylic chambers designed to simulate "ventricular" and "atrial" heart chambers. Inside the "ventricular" chamber, a prosthetic heart valve was placed at the inflow connection with the "atrial" chamber while another prosthetic valve was positioned at the outflow connection with flexible tubes, elastic balloons and a reservoir arranged to mimic the peripheral circulation. The flow model was filled with a 0.25% corn starch/water suspension to improve Doppler imaging. A continuous flow pump transferred the liquid from the peripheral reservoir to another one connected to the "atrial" chamber. The dimensions of the flow model were designed to permit adequate imaging by Doppler echocardiography. Acoustic windows allowed placement of transducers distal and perpendicular to the valves, so that the ultrasound beam could be positioned parallel to the valvular flow. Strain-gauge and electromagnetic transducers were used for measurements of pressure and flow in different segments of the system. The flow model was also designed to fit different sizes and types of prosthetic valves. This pulsatile flow model was able to generate pressure and flow in the physiological human range, with independent adjustment of pulse duration and rate as well as of stroke volume. This model mimics flow profiles observed in patients with regurgitant prosthetic valves.

  3. A linked simulation-optimization model for solving the unknown groundwater pollution source identification problems.

    Science.gov (United States)

    Ayvaz, M Tamer

    2010-09-20

    This study proposes a linked simulation-optimization model for solving unknown groundwater pollution source identification problems. In the proposed model, the MODFLOW and MT3DMS packages are used to simulate the flow and transport processes in the groundwater system. These models are then integrated with an optimization model based on the heuristic harmony search (HS) algorithm. In the proposed simulation-optimization model, the locations and release histories of the pollution sources are treated as the explicit decision variables and determined through the optimization model. In addition, an implicit solution procedure is proposed to determine the optimum number of pollution sources, which is an advantage of this model. The performance of the proposed model is evaluated on two hypothetical examples for simple and complex aquifer geometries, measurement error conditions, and different HS solution parameter sets. The identification results indicate that the proposed simulation-optimization model is effective and may be used to solve inverse pollution source identification problems. Copyright (c) 2010 Elsevier B.V. All rights reserved.
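
    A minimal harmony search loop is sketched below on a toy stand-in for the source-identification objective (the misfit between observed and simulated quantities, here reduced to recovering a hidden 2-D location). It is not the paper's coupled MODFLOW/MT3DMS objective; the parameter names follow the usual HS conventions.

    ```python
    import numpy as np

    def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05,
                       iters=5000, seed=0):
        """Minimal harmony search. hms: harmony memory size; hmcr: memory-consideration
        rate; par: pitch-adjusting rate; bw: bandwidth (fraction of variable range)."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        dim = len(lo)
        memory = rng.uniform(lo, hi, size=(hms, dim))
        cost = np.array([objective(x) for x in memory])
        for _ in range(iters):
            new = np.empty(dim)
            for j in range(dim):
                if rng.random() < hmcr:                  # draw from memory...
                    new[j] = memory[rng.integers(hms), j]
                    if rng.random() < par:               # ...optionally pitch-adjust
                        new[j] += bw * (hi[j] - lo[j]) * rng.uniform(-1, 1)
                else:                                    # ...or explore at random
                    new[j] = rng.uniform(lo[j], hi[j])
            new = np.clip(new, lo, hi)
            f = objective(new)
            worst = cost.argmax()
            if f < cost[worst]:                          # replace the worst harmony
                memory[worst], cost[worst] = new, f
        best = cost.argmin()
        return memory[best], cost[best]

    # Toy objective: squared misfit to a hidden "source location".
    true_xy = np.array([3.2, -1.7])
    misfit = lambda x: float(np.sum((x - true_xy) ** 2))
    x_best, f_best = harmony_search(misfit, bounds=[(-10, 10), (-10, 10)])
    print("recovered source location:", np.round(x_best, 2), " misfit:", f_best)
    ```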

  4. Use of a plant level logic model for quantitative assessment of systems interactions

    International Nuclear Information System (INIS)

    Chu, B.B.; Rees, D.C.; Kripps, L.P.; Hunt, R.N.; Bradley, M.

    1985-01-01

    The Electric Power Research Institute (EPRI) has sponsored a research program to investigate methods for identifying systems interactions (SIs) and for the evaluation of their importance. Phase I of the EPRI research project focused on the evaluation of methods for identification of SIs. Major results of the Phase I activities are the documentation of four different methodologies for identification of potential SIs and the development of guidelines for performing an effective plant walkdown in support of an SI analysis. Phase II of the project, currently being performed, is utilizing a plant level logic model of a pressurized water reactor (PWR) to determine the quantitative importance of identified SIs. In Phase II, previously reported events involving interactions between systems were screened and selected on the basis of their relevance to the Baltimore Gas and Electric (BG&E) Calvert Cliffs Nuclear Power Plant design and their perceived potential safety significance. Selected events were then incorporated into the BG&E plant level GO logic model. The model is being exercised to calculate the relative importance of these events. Five previously identified event scenarios, extracted from licensee event reports (LERs), are being evaluated during the course of the study. A key feature of the approach being used in Phase II is the use of a logic model in a manner that effectively evaluates the impact of events on the system level and the plant level for the mitigation of transients. Preliminary study results indicate that the developed methodology can be a viable and effective means for determining the quantitative significance of SIs.

  5. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
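
    The three statistics are all simple functions of the eigenvalues of E^-1 H, where H and E are the hypothesis and error cross-product matrices of a multivariate regression. The sketch below computes them on hypothetical data; the approximate F transformations (e.g., Rao's approximation for Wilks's Lambda) and the functional expansion of the variant effects are omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical data: n subjects, p = 3 quantitative traits, q = 4 variant predictors.
    n, p, q = 300, 3, 4
    X = rng.normal(size=(n, q))
    B = np.zeros((q, p))
    B[0, :] = 0.3                       # one variant affects all traits (pleiotropy)
    Y = X @ B + rng.normal(size=(n, p))

    # Multivariate regression: hypothesis (H) and error (E) cross-product matrices.
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Bhat = np.linalg.lstsq(Xc, Yc, rcond=None)[0]
    resid = Yc - Xc @ Bhat
    E = resid.T @ resid
    H = Bhat.T @ (Xc.T @ Xc) @ Bhat

    eig = np.linalg.eigvals(np.linalg.solve(E, H)).real   # eigenvalues of E^-1 H

    wilks = np.prod(1.0 / (1.0 + eig))       # Wilks's Lambda (small = significant)
    pillai = np.sum(eig / (1.0 + eig))       # Pillai-Bartlett trace
    hotelling = np.sum(eig)                  # Hotelling-Lawley trace
    print(f"Wilks = {wilks:.4f}, Pillai = {pillai:.4f}, Hotelling-Lawley = {hotelling:.4f}")
    ```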

  6. A quantitative quasispecies theory-based model of virus escape mutation under immune selection.

    Science.gov (United States)

    Woo, Hyung-June; Reifman, Jaques

    2012-08-07

    Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although fundamental aspects of such a balance of mutation and selection pressure have been established by the quasispecies theory decades ago, its implications have largely remained qualitative. Here, we present a quantitative approach to model the virus evolution under cytotoxic T-lymphocyte immune response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral genome. We stochastically simulated the growth of a viral population originating from a single wild-type founder virus and its recognition and clearance by the immune response, as well as the expansion of its genetic diversity. Applied to the immune escape of a simian immunodeficiency virus epitope, model predictions were quantitatively comparable to the experimental data. Within the model parameter space, we found two qualitatively different regimes of infectious disease pathogenesis, each representing alternative fates of the immune response: It can clear the infection in finite time or eventually be overwhelmed by viral growth and escape mutation. The latter regime exhibits the characteristic disease progression pattern of human immunodeficiency virus, while the former is bounded by maximum mutation rates that can be suppressed by the immune response. Our results demonstrate that, by explicitly representing epitope mutations and thus providing a genotype-phenotype map, the quasispecies theory can form the basis of a detailed sequence-specific model of real-world viral pathogens evolving under immune selection.
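
    The stochastic simulation component can be illustrated with a minimal Gillespie-style birth-death-mutation process: a wild-type population seeds an escape mutant that the immune response clears far less efficiently. All rates are hypothetical and the epitope sequence space is collapsed to two genotypes, so this is a caricature of the paper's model, not a reproduction of it.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    r, mu = 1.5, 1e-4     # replication rate; escape-mutation probability per replication
    c_w, c_e = 1.2, 0.2   # immune clearance of wild-type vs. residual clearance of mutant
    w, e, t = 1, 0, 0.0   # founder wild-type virus

    while 0 < w + e < 10_000 and t < 50.0:
        rates = np.array([r * w * (1 - mu),   # faithful wild-type replication
                          r * w * mu,         # replication with escape mutation
                          r * e,              # mutant replication
                          c_w * w,            # clearance of wild-type
                          c_e * e])           # clearance of mutant
        total = rates.sum()
        t += rng.exponential(1.0 / total)     # waiting time to the next reaction
        event = rng.choice(5, p=rates / total)
        if event == 0:
            w += 1
        elif event in (1, 2):
            e += 1
        elif event == 3:
            w -= 1
        else:
            e -= 1

    # Two fates, mirroring the two regimes described above: extinction (cleared
    # infection) or growth increasingly dominated by the escape mutant.
    print(f"t = {t:.2f}: wild-type = {w}, escape mutant = {e}")
    ```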

  7. Linking the Power and Transport Sectors—Part 2: Modelling a Sector Coupling Scenario for Germany

    Directory of Open Access Journals (Sweden)

    Martin Robinius

    2017-07-01

    “Linking the power and transport sectors—Part 1” describes the general principle of “sector coupling” (SC), develops a working definition of the concept intended to be of utility to the international scientific community, contains a literature review that provides an overview of relevant scientific papers on this topic, and conducts a rudimentary analysis of the linking of the power and transport sectors on a worldwide, EU and German level. The aim of this follow-on paper is to outline an approach to the modelling of SC; to this end, Germany was examined as a case study. The study assumes a high share of renewable energy sources (RES) contributing to the grid and a significant proportion of fuel cell vehicles (FCVs) in the year 2050, along with a dedicated hydrogen pipeline grid to meet hydrogen demand. To construct a model of this nature, the model environment “METIS” (models for energy transformation and integration systems) that we developed is described in more detail in this paper. Within this framework, a detailed model of the power and transport sectors in Germany is presented and the rationale behind its assumptions described. Furthermore, an intensive analysis of the results for the power surplus, the utilization of electrolysis, the hydrogen pipeline and economic considerations has been conducted to show the potential outcomes of modelling SC. It is hoped that this will serve as a basis for researchers to apply this framework in future models and analyses with an international focus.

  8. An open source web interface for linking models to infrastructure system databases

    Science.gov (United States)

    Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.

    2016-12-01

    Models of networked engineered resource systems such as water or energy systems are often built collaboratively with developers from different domains working at different locations. These models can be linked to large scale real world databases, and they are constantly being improved and extended. As the development and application of these models becomes more sophisticated, and the computing power required for simulations and/or optimisations increases, so has the need for online services and tools which enable the efficient development and deployment of these models. Hydra Platform is an open source, web-based data management system, which allows modellers of network-based models to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform uses a web API using JSON to allow external programs (referred to as `Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development in Hydra Platform, the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs and analyse results.

  9. Evaluating the Impacts of Spatial Uncertainties in Quantitative Precipitation Estimation (QPE) Products on Flood Modelling

    Science.gov (United States)

    Gao, Z.; Wu, H.; Li, J.; Hong, Y.; Huang, J.

    2017-12-01

    Precipitation is often the major source of uncertainty in hydrologic modelling, e.g., for flood simulation. Quantitative precipitation estimation (QPE) products, when used as input for hydrologic modelling, can cause significant differences in model performance because of the large variations in their estimates of precipitation intensity, duration, and spatial distribution. Objectively evaluating QPE products and deriving the best estimate of precipitation at the river basin scale represent a bottleneck faced by the hydrometeorological community, even though they are needed by many research efforts, including flood simulation such as the Global Flood Monitoring System using the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model (Wu et al., 2014). Recently we developed a Multiple-product-driven hydrological Modeling Framework (MMF) for objective evaluation of QPE products using the DRIVE model (Wu et al., 2017). In this study, based on the MMF, we (1) compare the location, spatial characteristics, and geometric patterns of precipitation among QPE products at various temporal scales by adopting an object-oriented method; (2) demonstrate their effects on the simulation of flood magnitude and timing through the DRIVE model; and (3) further investigate and understand how different precipitation spatial patterns evolve and result in differences in streamflow and flood peak (magnitude and timing), through a linear routing scheme employed to decompose the contributions to the flood peak during rain-flood events. This study shows that there can be significant differences in the spatial patterns of accumulated precipitation at various temporal scales (from daily to hourly) among QPE products, which cause significant differences in flood simulation, particularly in peak timing prediction. Therefore, the evaluation of the spatial pattern of precipitation should be considered as an important part of the framework for objective evaluation of QPE and the derivation of the best

  10. Identification of X-linked quantitative trait loci affecting cold tolerance in Drosophila melanogaster and fine mapping by selective sweep analysis.

    Science.gov (United States)

    Svetec, Nicolas; Werzner, Annegret; Wilches, Ricardo; Pavlidis, Pavlos; Alvarez-Castro, José M; Broman, Karl W; Metzler, Dirk; Stephan, Wolfgang

    2011-02-01

    Drosophila melanogaster is a cosmopolitan species that colonizes a great variety of environments. One trait that shows abundant evidence for naturally segregating genetic variance in different populations of D. melanogaster is cold tolerance. Previous work has found quantitative trait loci (QTL) exclusively on the second and the third chromosomes. To gain insight into the genetic architecture of cold tolerance on the X chromosome and to compare the results with our analyses of selective sweeps, a mapping population was derived from a cross between substitution lines that solely differed in the origin of their X chromosome: one originates from a European inbred line and the other one from an African inbred line. We found a total of six QTL for cold tolerance factors on the X chromosome of D. melanogaster. Although the composite interval mapping revealed slightly different QTL profiles between sexes, a coherent model suggests that most QTL overlapped between sexes, and each explained around 5-14% of the genetic variance (which may be slightly overestimated). The allelic effects were largely additive, but we also detected two significant interactions. Taken together, this provides evidence for multiple QTL that are spread along the entire X chromosome and whose effects range from low to intermediate. One detected transgressive QTL influences cold tolerance in different ways for the two sexes. While females benefit from the European allele increasing their cold tolerance, males tend to do better with the African allele. Finally, using selective sweep mapping, the candidate gene CG16700 for cold tolerance colocalizing with a QTL was identified. © 2010 Blackwell Publishing Ltd.

  11. Need for collection of quantitative distribution data for dosimetry and metabolic modeling

    International Nuclear Information System (INIS)

    Lathrop, K.A.

    1976-01-01

    Problems in radiation dose distribution studies in humans are discussed. Data show that the effective half-times of 7Be and 75Se in the mouse, rat, monkey, dog, and human exhibit no correlation with weight, body surface, or any other readily apparent factor that could be used to equate nonhuman and human data. Another problem sometimes encountered in attempting to extrapolate animal data to humans involves equivalent doses of the radiopharmaceutical. A usual human dose for a radiopharmaceutical is 1 ml, or about 0.017 ml/kg. The same solution injected into a mouse in a convenient volume of 0.1 ml results in a dose of 4 ml/kg, or 240 times that received by the human. The effect on whole-body retention produced by a dose difference of similar magnitude for selenium in the rat shows that retention is at least twice as great with the smaller amount. With the development of methods for the collection of data throughout the body representing the fractional distribution of radioactivity versus time, not only can more realistic dose estimates be made, but the tools will also be provided for the study of physiological and biochemical interrelationships in the intact subject, from which compartmental models of diagnostic significance may be built. The unique requirement for quantitative biologic data needed for the calculation of radiation absorbed doses is the same as the unique scientific contribution that nuclear medicine can make: the quantitative in vivo study of physiologic and biochemical processes. The technique involved is not the same as quantitation of a radionuclide image, but is a step beyond
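
    The dose-per-kilogram disparity quoted above can be verified with one-line arithmetic; a minimal sketch, assuming typical body masses (60 kg human, 25 g mouse) that are not taken from the report:

```python
# Body masses are assumed typical values (60 kg human, 25 g mouse),
# not figures taken from the report itself.
human_dose_ml, human_mass_kg = 1.0, 60.0
mouse_dose_ml, mouse_mass_kg = 0.1, 0.025

human_ml_per_kg = human_dose_ml / human_mass_kg   # ~0.017 ml/kg
mouse_ml_per_kg = mouse_dose_ml / mouse_mass_kg   # 4 ml/kg
print(f"human: {human_ml_per_kg:.3f} ml/kg, mouse: {mouse_ml_per_kg:.1f} ml/kg, "
      f"ratio ~{mouse_ml_per_kg / human_ml_per_kg:.0f}x")
```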

  12. Tree Root System Characterization and Volume Estimation by Terrestrial Laser Scanning and Quantitative Structure Modeling

    Directory of Open Access Journals (Sweden)

    Aaron Smith

    2014-12-01

    Full Text Available The accurate characterization of three-dimensional (3D) root architecture, volume, and biomass is important for a wide variety of applications in forest ecology and to better understand tree and soil stability. Technological advancements have led to increasingly more digitized and automated procedures, which have been used to more accurately and quickly describe the 3D structure of root systems. Terrestrial laser scanners (TLS) have successfully been used to describe aboveground structures of individual trees and stand structure, but have only recently been applied to the 3D characterization of whole root systems. In this study, 13 recently harvested Norway spruce root systems were mechanically pulled from the soil, cleaned, and their volumes were measured by displacement. The root systems were suspended, scanned with TLS from three different angles, and the root surfaces from the co-registered point clouds were modeled with the 3D Quantitative Structure Model to determine root architecture and volume. The modeling procedure facilitated the rapid derivation of root volume, diameters, break point diameters, linear root length, cumulative percentages, and root fraction counts. The modeled root systems underestimated root system volume by 4.4%. The modeling procedure is widely applicable and easily adapted to derive other important topological and volumetric root variables.

  13. Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015

    Science.gov (United States)

    Sobkowicz, Pawel

    2016-01-01

    We present results of an abstract, agent based model of opinion dynamics simulations based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society, corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions)—which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third party alternative. We provide quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown that, when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be ‘invaded’ by a newcomer third party very quickly, while the second remains immune to such invasion. PMID:27171226
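
    The metastability-and-invasion idea can be illustrated with a toy two-party agent model in which one camp is emotionally "stickier" than the other; this is only a loose sketch of the phenomenon, not the authors' E/I/O model, and the stickiness values are assumed:

```python
import random

# Toy two-party opinion model: supporters of party +1 ignore peer pressure
# more often (higher "stickiness") than supporters of party -1, so the
# -1 camp is the one vulnerable to erosion. Values are assumed.
random.seed(1)
N, steps = 1000, 20000
stickiness = {+1: 0.95, -1: 0.80}          # probability of resisting conversion
agents = [random.choice([+1, -1]) for _ in range(N)]

for _ in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    if agents[i] != agents[j] and random.random() > stickiness[agents[i]]:
        agents[i] = agents[j]              # agent i is converted by agent j

print("final share of +1 supporters:", agents.count(+1) / N)
```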

  14. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    Directory of Open Access Journals (Sweden)

    Alexander Mitsos

    Full Text Available Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cells response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell specific data to result in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: (i) excessive CPU time requirements and (ii) a loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.

  15. Antiproliferative Pt(IV) complexes: synthesis, biological activity, and quantitative structure-activity relationship modeling.

    Science.gov (United States)

    Gramatica, Paola; Papa, Ester; Luini, Mara; Monti, Elena; Gariboldi, Marzia B; Ravera, Mauro; Gabano, Elisabetta; Gaviglio, Luca; Osella, Domenico

    2010-09-01

    Several Pt(IV) complexes of the general formula [Pt(L)2(L')2(L'')2] [axial ligands L are Cl-, RCOO-, or OH-; equatorial ligands L' are two am(m)ine or one diamine; and equatorial ligands L'' are Cl- or glycolato] were rationally designed and synthesized in the attempt to develop a predictive quantitative structure-activity relationship (QSAR) model. Numerous theoretical molecular descriptors were used alongside physicochemical data (i.e., reduction peak potential, Ep, and partition coefficient, log Po/w) to obtain a validated QSAR between in vitro cytotoxicity (half maximal inhibitory concentrations, IC50, on A2780 ovarian and HCT116 colon carcinoma cell lines) and some features of Pt(IV) complexes. In the resulting best models, a lipophilic descriptor (log Po/w or the number of secondary sp3 carbon atoms) plus an electronic descriptor (Ep, the number of oxygen atoms, or the topological polar surface area expressed as the N,O polar contribution) is necessary for modeling, supporting the general finding that the biological behavior of Pt(IV) complexes can be rationalized on the basis of their cellular uptake, the Pt(IV)→Pt(II) reduction, and the structure of the corresponding Pt(II) metabolites. Novel compounds were synthesized on the basis of their predicted cytotoxicity in the preliminary QSAR model, and were experimentally tested. A final QSAR model, based solely on theoretical molecular descriptors to ensure its general applicability, is proposed.
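
    A minimal sketch of the kind of two-descriptor QSAR regression described (one lipophilic plus one electronic descriptor); all numbers are synthetic placeholders, not data from the paper:

```python
import numpy as np

# Fit pIC50 ~ b0 + b1*logP + b2*Ep by ordinary least squares.
# Descriptor and activity values below are invented for illustration.
logP = np.array([-1.2, -0.5, 0.1, 0.8, 1.5, 2.1])          # lipophilic descriptor
Ep = np.array([-0.80, -0.65, -0.55, -0.40, -0.30, -0.20])  # electronic descriptor
pIC50 = 4.0 + 0.9 * logP + 2.0 * Ep \
        + np.random.default_rng(0).normal(0.0, 0.05, 6)    # synthetic response

X = np.column_stack([np.ones_like(logP), logP, Ep])
coef, *_ = np.linalg.lstsq(X, pIC50, rcond=None)
pred = X @ coef
r2 = 1 - ((pIC50 - pred) ** 2).sum() / ((pIC50 - pIC50.mean()) ** 2).sum()
print("intercept, b_logP, b_Ep:", np.round(coef, 2), " R2:", round(r2, 3))
```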

  16. Quantitative evaluation of ultrasonic sound fields in anisotropic austenitic welds using 2D ray tracing model

    Science.gov (United States)

    Kolkoori, S. R.; Rahaman, M.-U.; Chinta, P. K.; Kreutzbruck, M.; Prager, J.

    2012-05-01

    Ultrasonic investigation of inhomogeneous anisotropic materials such as austenitic welds is complicated because their columnar grain structure leads to curved energy paths, beam splitting and asymmetrical beam profiles. A ray tracing model has the potential advantage of analyzing the ultrasonic sound field propagation and thereby optimizing the inspection parameters. In this contribution we present a 2D ray tracing model to predict energy ray paths, ray amplitudes and travel times for the three wave modes quasi-longitudinal, quasi-shear-vertical, and shear-horizontal waves in austenitic weld materials. Inhomogeneity in the austenitic weld material is represented by discretizing the inhomogeneous region into several homogeneous layers. At each interface between the layers the reflection and transmission problem is computed and yields energy direction, amplitude and energy coefficients. The ray amplitudes are computed accurately by taking into account directivity, divergence and density of rays, phase relations as well as transmission coefficients. Ultrasonic sound fields obtained from the ray tracing model are compared quantitatively with the 2D Elastodynamic Finite Integration Technique (EFIT). The excellent agreement between both models confirms the validity of the presented ray tracing results. Experiments are conducted on austenitic weld samples with a longitudinal beam transducer as transmitting probe and amplitudes at the rear surface are scanned by means of electrodynamical probes. Finally, the ray tracing model results are also validated through the experiments.
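
    The layer-discretization idea can be sketched for the simplest possible case: a single ray refracted through a stack of homogeneous layers by Snell's law. The paper's model additionally handles anisotropic wave modes, amplitudes, and reflections; the thicknesses and velocities below are assumed placeholder values:

```python
import math

# One ray through three homogeneous layers; thicknesses in mm,
# assumed isotropic velocities in m/s (placeholders, not weld data).
layers = [(10.0, 5900.0), (10.0, 5400.0), (10.0, 5000.0)]
theta = math.radians(30.0)   # incidence angle in the first layer
x = 0.0                      # horizontal offset, mm
t = 0.0                      # travel time, s

for i, (thickness, v) in enumerate(layers):
    x += thickness * math.tan(theta)
    t += (thickness / 1000.0) / (v * math.cos(theta))  # path length / velocity
    if i + 1 < len(layers):
        s = math.sin(theta) * layers[i + 1][1] / v     # Snell: sin(t2)=sin(t1)*v2/v1
        if abs(s) >= 1.0:
            break                                      # total internal reflection
        theta = math.asin(s)

print(f"exit offset = {x:.2f} mm, travel time = {t * 1e6:.2f} microseconds")
```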

  17. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    Science.gov (United States)

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cells response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell specific data to result in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.
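
    A minimal sketch of the underlying idea, fitting the weights of a tiny two-step pathway with normalized Hill-type transfer functions as a regular nonlinear optimization; the pathway, transfer functions, and data are invented and far simpler than the constrained-fuzzy-logic models in the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Toy pathway: stimulus -> A -> B. The transfer-function weights w1, w2
# are the decision variables; "measured" activities of B are invented.
stim = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
b_obs = np.array([0.02, 0.15, 0.35, 0.55, 0.70])

def simulate(params, s):
    w1, w2 = params
    a = w1 * s / (0.5 + w1 * s)        # activation of A by the stimulus
    return w2 * a / (0.5 + w2 * a)     # activation of B by A

res = minimize(lambda p: ((simulate(p, stim) - b_obs) ** 2).sum(),
               x0=[1.0, 1.0], bounds=[(0.0, 10.0)] * 2)
print("fitted weights:", np.round(res.x, 3), " SSE:", round(res.fun, 5))
```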

  18. Daphnia and fish toxicity of (benzo)triazoles: validated QSAR models, and interspecies quantitative activity-activity modelling.

    Science.gov (United States)

    Cassani, Stefano; Kovarich, Simona; Papa, Ester; Roy, Partha Pratim; van der Wal, Leon; Gramatica, Paola

    2013-08-15

    Due to their chemical properties, synthetic triazoles and benzo-triazoles ((B)TAZs) are mainly distributed to the water compartments in the environment, and because of their wide use their potential effects on aquatic organisms are a cause for concern. Non-testing approaches like those based on quantitative structure-activity relationships (QSARs) are valuable tools to maximize the information contained in existing experimental data and predict missing information while minimizing animal testing. In the present study, externally validated QSAR models for the prediction of acute (B)TAZ toxicity in Daphnia magna and Oncorhynchus mykiss have been developed according to the principles for the validation of QSARs and their acceptability for regulatory purposes proposed by the Organization for Economic Co-operation and Development (OECD). These models are based on theoretical molecular descriptors, and are statistically robust, externally predictive and characterized by a verifiable structural applicability domain. They have been applied to predict acute toxicity for over 300 (B)TAZs without experimental data, many of which are in the pre-registration list of the REACH regulation. Additionally, a model based on quantitative activity-activity relationships (QAAR) has been developed, which allows for interspecies extrapolation from daphnids to fish. The importance of QSAR/QAAR, especially when dealing with specific chemical classes like (B)TAZs, for screening and prioritization of pollutants under REACH, has been highlighted. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Biomine: predicting links between biological entities using network models of heterogeneous databases

    Directory of Open Access Journals (Sweden)

    Eronen Lauri

    2012-06-01

    Full Text Available Abstract Background Biological databases contain large amounts of data concerning the functions and associations of genes and proteins. Integration of data from several such databases into a single repository can aid the discovery of previously unknown connections spanning multiple types of relationships and databases. Results Biomine is a system that integrates cross-references from several biological databases into a graph model with multiple types of edges, such as protein interactions, gene-disease associations and gene ontology annotations. Edges are weighted based on their type, reliability, and informativeness. We present Biomine and evaluate its performance in link prediction, where the goal is to predict pairs of nodes that will be connected in the future, based on current data. In particular, we formulate protein interaction prediction and disease gene prioritization tasks as instances of link prediction. The predictions are based on a proximity measure computed on the integrated graph. We consider and experiment with several such measures, and perform a parameter optimization procedure where different edge types are weighted to optimize link prediction accuracy. We also propose a novel method for disease-gene prioritization, defined as finding a subset of candidate genes that cluster together in the graph. We experimentally evaluate Biomine by predicting future annotations in the source databases and prioritizing lists of putative disease genes. Conclusions The experimental results show that Biomine has strong potential for predicting links when a set of selected candidate links is available. The predictions obtained using the entire Biomine dataset are shown to clearly outperform ones obtained using any single source of data alone, when different types of links are suitably weighted. In the gene prioritization task, an established reference set of disease-associated genes is useful, but the results show that under favorable
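
    One simple proximity measure of the kind used for such link prediction is the best-path probability: the maximum over paths of the product of edge reliabilities, computable as a shortest path on -log(weight). The miniature graph below is made up for illustration, not Biomine data:

```python
import heapq
import math

# Undirected toy graph with edge reliabilities in (0, 1].
edges = {
    ("geneA", "protA"): 0.9, ("protA", "protB"): 0.7,
    ("protB", "disease"): 0.8, ("geneA", "disease"): 0.3,
}
graph = {}
for (u, v), w in edges.items():
    graph.setdefault(u, []).append((v, w))
    graph.setdefault(v, []).append((u, w))

def best_path_prob(src, dst):
    """Dijkstra on -log(weight): shortest distance = most reliable path."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return math.exp(-d)
        if d > dist.get(u, math.inf):
            continue
        for v, w in graph[u]:
            nd = d - math.log(w)
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return 0.0

# Indirect path 0.9*0.7*0.8 = 0.504 beats the direct 0.3 edge.
print(best_path_prob("geneA", "disease"))
```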

  20. Application of cross-linked and hydrolyzed arabinoxylans in baking of model rye bread.

    Science.gov (United States)

    Buksa, Krzysztof; Nowotna, Anna; Ziobro, Rafał

    2016-02-01

    The role of water extractable arabinoxylan with varying molar mass and structure (cross-linked vs. hydrolyzed) in the structure formation of rye bread was examined using a model bread. Instead of the normal flour, the dough contained starch, arabinoxylan and protein, which were isolated from rye wholemeal. It was observed that the applied mixes of these constituents result in a product closely resembling typical rye bread, even if arabinoxylan was modified (by cross-linking or hydrolysis). The levels of arabinoxylan required for bread preparation depended on its modification and mix composition. At 3% protein, the maximum applicable level of poorly soluble cross-linked arabinoxylan was 3%, as higher amounts of this preparation resulted in an extensively viscous dough and diminished bread volume. On the other hand highly soluble, hydrolyzed arabinoxylan could be used at a higher level (6%) together with larger amounts of rye protein (3% or 6%). Further addition of arabinoxylan leads to excessive water absorption, resulting in a decreased viscosity of the dough during baking and insufficient gas retention. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Modeling and Performance Analysis of 10 Gbps Inter-satellite Optical Wireless Communication Link

    Science.gov (United States)

    Singh, Mehtab

    2017-12-01

    Free-space optical (FSO) communication has the advantages of two of the most predominant data transmission technologies - optical fiber communication and wireless communication. Most of the technical aspects of FSO are similar to those of optical fiber communication, with the major difference being the propagation medium for the information signal, which in FSO is free space rather than silica glass. One of the most important applications of FSO is inter-satellite optical wireless communication (IsOWC) links, which will be deployed in space in the future. IsOWC links have many advantages over previously existing microwave satellite communication technologies, such as higher bandwidth, lower power consumption, lower implementation cost, and smaller size and weight. In this paper, modeling and performance analysis of a 10-Gbps inter-satellite communication link with two satellites separated by a distance of 1,200 km has been carried out using OPTISYSTEM simulation software. Performance has been analyzed on the basis of quality factor, signal to noise ratio (SNR), and total power of the received signal.
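
    A back-of-envelope version of the geometric part of such a link budget (no atmospheric loss between satellites) can be sketched as follows; the transmit power, beam divergence, and receiver aperture are assumed values, not the paper's OPTISYSTEM settings:

```python
import math

# Assumed link parameters (placeholders, not the paper's settings).
P_t_dBm = 10.0    # transmit power, dBm (10 mW)
theta = 2e-6      # full beam divergence, rad
d = 1200e3        # link distance, m (matches the 1,200 km scenario)
D_r = 0.25        # receiver aperture diameter, m

beam_diam = theta * d                              # beam footprint at receiver, m
geo_loss_dB = -20 * math.log10(D_r / beam_diam)    # captured/illuminated area ratio
P_r_dBm = P_t_dBm - geo_loss_dB
print(f"footprint {beam_diam:.1f} m, geometric loss {geo_loss_dB:.1f} dB, "
      f"received power {P_r_dBm:.1f} dBm")
```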

  2. Random blebbing motion: A simple model linking cell structural properties to migration characteristics

    Science.gov (United States)

    Woolley, Thomas E.; Gaffney, Eamonn A.; Goriely, Alain

    2017-07-01

    If the plasma membrane of a cell is able to delaminate locally from its actin cortex, a cellular bleb can be produced. Blebs are pressure-driven protrusions, which are noteworthy for their ability to produce cellular motion. Starting from a general continuum mechanics description, we restrict ourselves to considering cell and bleb shapes that maintain approximately spherical forms. From this assumption, we obtain a tractable algebraic system for bleb formation. By including cell-substrate adhesions, we can model blebbing cell motility. Further, by considering mechanically isolated blebbing events, which are randomly distributed over the cell, we can derive equations linking the macroscopic migration characteristics to the microscopic structural parameters of the cell. This multiscale modeling framework is then used to provide parameter estimates, which are in agreement with current experimental data. In summary, the construction of the mathematical model provides testable relationships between the bleb size and cell motility.

  3. Continuum mechanical model for cross-linked actin networks with contractile bundles

    Science.gov (United States)

    Ferreira, J. P. S.; Parente, M. P. L.; Natal Jorge, R. M.

    2018-01-01

    In the context of a mechanical approach to cell biology, there is a close relationship between cellular function and mechanical properties. In recent years, an increasing amount of attention has been given to the coupling between biochemical and mechanical signals by means of constitutive models, in particular to the active contractility of the actin cytoskeleton. Given the importance of actin contraction for physiological function, this study proposes a constitutive model to describe how the filamentous network actively controls its mechanics. Embedded in a soft isotropic ground substance, the network behaves as a viscous mechanical continuum, comprised of isotropically distributed cross-linked actin filaments and actomyosin bundles. Through virtual rheometry experiments, the present model relates the dynamics of the myosin motors to the network stiffness, which is to a large extent governed by the time-scale of the applied deformations/forces.

  4. Link between hopping models and percolation scaling laws for charge transport in mixtures of small molecules

    Directory of Open Access Journals (Sweden)

    Dong-Gwang Ha

    2016-04-01

    Full Text Available Mixed host compositions that combine charge transport materials with luminescent dyes offer superior control over exciton formation and charge transport in organic light emitting devices (OLEDs). Two approaches are typically used to optimize the fraction of charge transport materials in a mixed host composition: either an empirical percolative model, or a hopping transport model. We show that these two commonly-employed models are linked by an analytic expression which relates the localization length to the percolation threshold and critical exponent. The relation is confirmed both numerically and experimentally through measurements of the relative conductivity of tris(4-carbazoyl-9-ylphenyl)amine (TCTA) : 1,3-bis(3,5-dipyrid-3-yl-phenyl)benzene (BmPyPb) mixtures with different concentrations, where the TCTA plays the role of hole conductor and the BmPyPb of hole insulator. The analytic relation may allow the rational design of mixed layers of small molecules for high-performance OLEDs.
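
    The percolative side of the relationship can be sketched with the empirical scaling law for relative conductivity above threshold; the threshold and critical exponent below are assumed illustrative values, not the fitted TCTA:BmPyPb parameters:

```python
# Assumed illustrative values for the percolation threshold x_c and the
# critical exponent t; not the parameters fitted in the paper.
X_C, T = 0.15, 2.0

def relative_conductivity(x):
    """Empirical percolation scaling: sigma ~ (x - x_c)^t above threshold,
    normalized so that a pure conductor (x = 1) gives 1."""
    return ((x - X_C) / (1.0 - X_C)) ** T if x > X_C else 0.0

for x in (0.10, 0.20, 0.40, 0.80, 1.00):
    print(f"conductor fraction {x:.2f} -> sigma/sigma_max = "
          f"{relative_conductivity(x):.4f}")
```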

  5. Linking genes to ecosystem trace gas fluxes in a large-scale model system

    Science.gov (United States)

    Meredith, L. K.; Cueva, A.; Volkmann, T. H. M.; Sengupta, A.; Troch, P. A.

    2017-12-01

    Soil microorganisms mediate biogeochemical cycles through biosphere-atmosphere gas exchange with significant impact on atmospheric trace gas composition. Improving process-based understanding of these microbial populations and linking their genomic potential to the ecosystem scale is a challenge, particularly in soil systems, which are heterogeneous in biodiversity, chemistry, and structure. In oligotrophic systems, such as the Landscape Evolution Observatory (LEO) at Biosphere 2, atmospheric trace gas scavenging may supply critical metabolic needs to microbial communities, thereby promoting tight linkages between microbial genomics and trace gas utilization. This large-scale model system of three initially homogeneous and highly instrumented hillslopes facilitates high temporal resolution characterization of subsurface trace gas fluxes at hundreds of sampling points, making LEO an ideal location to study microbe-mediated trace gas fluxes from the gene to ecosystem scales. Specifically, we focus on the metabolism of the ubiquitous atmospheric reduced trace gases hydrogen (H2), carbon monoxide (CO), and methane (CH4), which may have wide-reaching impacts on microbial community establishment, survival, and function. Additionally, microbial activity on LEO may facilitate weathering of the basalt matrix, which can be studied with trace gas measurements of carbonyl sulfide (COS/OCS) and carbon dioxide (O-isotopes in CO2), and presents an additional opportunity for gene-to-ecosystem study. This work will present initial measurements of this suite of trace gases to characterize soil microbial metabolic activity, as well as links between spatial and temporal variability of microbe-mediated trace gas fluxes in LEO and their relation to genomic-based characterization of microbial community structure (phylogenetic amplicons) and genetic potential (metagenomics). Results from the LEO model system will help build understanding of the importance of atmospheric inputs to

  6. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal
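
    A minimal sketch of the log-normal attenuation statistic at the heart of such models, computing the probability that rain attenuation exceeds a threshold; the median attenuation, log-standard deviation, and rain probability are placeholders, not values from LeRC-SLAM's internal database:

```python
from math import erf, log, sqrt

# Assumed placeholder statistics for the rain attenuation process.
A_MEDIAN_DB = 3.0   # median attenuation during rain, dB
SIGMA_LN = 1.0      # standard deviation of ln(attenuation)
P_RAIN = 0.05       # fraction of the year it rains

def p_exceed(a_dB):
    """P(A > a) = P(rain) * (1 - Phi(ln(a / median) / sigma))."""
    z = log(a_dB / A_MEDIAN_DB) / SIGMA_LN
    phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF
    return P_RAIN * (1.0 - phi)

for a in (1, 3, 10, 20):
    print(f"P(attenuation > {a:2d} dB) = {p_exceed(a):.4%}")
```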

  7. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal

  8. Probabilistic Model for Free-Space Optical Links Under Continental Fog Conditions

    Directory of Open Access Journals (Sweden)

    Marzuki

    2010-09-01

    Full Text Available The error characteristics of a free-space optical (FSO) channel are significantly different from those of fiber based optical links and thus require a deep physical understanding of the propagation channel. In particular, different fog conditions greatly influence optical transmissions, and thus a channel model is required to estimate the detrimental fog effects. In this paper we present a probabilistic model for radiation fog from data measured over an 80 m FSO link installed at Graz, Austria. The fog events are classified into thick fog, moderate fog, light fog and general fog based on the international code of visibility range. We applied several probability distribution functions (PDFs), such as the Kumaraswamy, Johnson SB and Logistic distributions, to the actual measured optical attenuations. The performance of each distribution is evaluated by Q-Q and P-P plots. It is found that the Kumaraswamy distribution is the best fit for general fog, while the Logistic distribution is the optimum choice for thick fog. On the other hand, the Johnson SB distribution best fits the moderate and light fog related attenuation data. The difference between these probabilistic models and the resultant variation in the received signal strength under different fog types needs to be considered in designing an efficient FSO system.
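
    The distribution-fitting step can be sketched with scipy, which provides the Johnson SB and Logistic laws (the Kumaraswamy law would need a custom implementation); the attenuation samples below are synthetic stand-ins for the Graz measurements:

```python
import numpy as np
from scipy import stats

# Synthetic attenuation data standing in for measured fog attenuations (dB).
rng = np.random.default_rng(42)
atten_dB = rng.lognormal(mean=2.0, sigma=0.4, size=500)

# Fit each candidate distribution by maximum likelihood and compare
# log-likelihoods (higher is better); Q-Q/P-P plots would be the next step.
for name, dist in [("Johnson SB", stats.johnsonsb), ("Logistic", stats.logistic)]:
    params = dist.fit(atten_dB)
    ll = np.sum(dist.logpdf(atten_dB, *params))
    print(f"{name:10s} log-likelihood = {ll:.1f}")
```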

  9. SIMO optical wireless links with nonzero boresight pointing errors over M modeled turbulence channels

    Science.gov (United States)

    Varotsos, G. K.; Nistazakis, H. E.; Petkovic, M. I.; Djordjevic, G. T.; Tombras, G. S.

    2017-11-01

    Over the last years terrestrial free-space optical (FSO) communication systems have attracted increasing scientific and commercial interest in response to the growing demands for ultra-high-bandwidth, cost-effective and secure wireless data transmissions. However, due to the signal propagation through the atmosphere, the performance of such links depends strongly on atmospheric conditions such as weather phenomena and the turbulence effect. Additionally, their operation is affected significantly by the pointing errors effect, which is caused by misalignment of the optical beam between the transmitter and the receiver. In order to address this significant performance degradation, several statistical models have been proposed, while particular attention has also been given to diversity methods. Here, the turbulence-induced fading of the received optical signal irradiance is studied through the Málaga (M) distribution, which is an accurate model suitable for weak to strong turbulence conditions and unifies most of the well-known, previously proposed models. Thus, taking into account the atmospheric turbulence conditions along with the pointing errors effect with nonzero boresight and the modulation technique that is used, we derive mathematical expressions for the estimation of the average bit error rate performance of SIMO FSO links. Finally, proper numerical results are given to verify our derived expressions, and Monte Carlo simulations are also provided to further validate the accuracy of the proposed analysis and the obtained mathematical expressions.

  10. Modeling channel interference in an orbital angular momentum-multiplexed laser link

    Science.gov (United States)

    Anguita, Jaime A.; Neifeld, Mark A.; Vasic, Bane V.

    2009-08-01

    We study the effects of optical turbulence on the energy crosstalk among constituent orbital angular momentum (OAM) states in a vortex-based multi-channel laser communication link and determine channel interference in terms of turbulence strength and OAM state separation. We characterize the channel interference as a function of C_n^2 and transmit OAM state, and propose probability models to predict the random fluctuations in the received signals for such an architecture. Simulations indicate that turbulence-induced channel interference is mutually correlated across receive channels.

  11. Linear approaches to intramolecular Förster resonance energy transfer probe measurements for quantitative modeling.

    Directory of Open Access Journals (Sweden)

    Marc R Birtwistle

    Full Text Available Numerous unimolecular, genetically-encoded Förster Resonance Energy Transfer (FRET) probes for monitoring biochemical activities in live cells have been developed over the past decade. As these probes allow for collection of high frequency, spatially resolved data on signaling events in live cells and tissues, they are an attractive technology for obtaining data to develop quantitative, mathematical models of spatiotemporal signaling dynamics. However, to be useful for such purposes the observed FRET from such probes should be related to a biological quantity of interest through a defined mathematical relationship, which is straightforward when this relationship is linear, and can be difficult otherwise. First, we show that only in rare circumstances is the observed FRET linearly proportional to a biochemical activity. Therefore in most cases FRET measurements should only be compared either to explicitly modeled probes or to concentrations of products of the biochemical activity, but not to activities themselves. Importantly, we find that FRET measured by standard intensity-based, ratiometric methods is inherently non-linear with respect to the fraction of probes undergoing FRET. Alternatively, we find that quantifying FRET either via (1) fluorescence lifetime imaging (FLIM) or (2) ratiometric methods where the donor emission intensity is divided by the directly-excited acceptor emission intensity (denoted R_alt) is linear with respect to the fraction of probes undergoing FRET. This linearity property allows one to calculate the fraction of active probes based on the FRET measurement. Thus, our results suggest that either FLIM or ratiometric methods based on R_alt are the preferred techniques for obtaining quantitative data from FRET probe experiments for mathematical modeling purposes.
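
    The linearity argument can be reproduced numerically under simple assumed proportionalities for the donor, sensitized-acceptor, and directly-excited-acceptor intensities:

```python
import numpy as np

# Assumed proportionality: donor emission ~ (1 - E*f), sensitized acceptor
# emission ~ E*f, directly-excited acceptor emission ~ constant. E and the
# constants are illustrative, not calibrated probe values.
E = 0.6                                  # FRET efficiency of the probe
f = np.linspace(0.0, 1.0, 5)             # fraction of probes undergoing FRET

donor = 1.0 - E * f                      # donor quenched by FRET
sensitized = E * f                       # acceptor emission via FRET
direct_acceptor = 0.8                    # from direct acceptor excitation

ratio_std = sensitized / donor           # standard ratiometric signal
r_alt = donor / direct_acceptor          # donor / directly-excited acceptor

for fi, rs, ra in zip(f, ratio_std, r_alt):
    print(f"f={fi:.2f}  acceptor/donor={rs:.3f} (nonlinear)  "
          f"R_alt={ra:.3f} (linear)")
```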

  12. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background Bayesian Network (BN) modeling is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included cocitation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation the Markov Chain Monte Carlo sampling algorithm is adopted, which samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including that from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without the prior knowledge BN modeling is not always better than a random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusions Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
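
    The edge-reservoir construction can be sketched in a few lines: each candidate edge appears with multiplicity proportional to its prior-knowledge likelihood, so proposals drawn from the reservoir favor well-supported edges; the likelihood values below are invented placeholders:

```python
import random

# Candidate edges with prior-knowledge likelihoods (invented values).
edge_likelihood = {
    ("TF1", "geneA"): 0.90,
    ("TF1", "geneB"): 0.45,
    ("TF2", "geneA"): 0.10,
}

# Build the reservoir: copy number proportional to likelihood.
reservoir = []
for edge, lik in edge_likelihood.items():
    reservoir.extend([edge] * max(1, round(lik * 100)))

# An MCMC-style proposal step would draw candidate edges from it.
random.seed(0)
proposals = [random.choice(reservoir) for _ in range(1000)]
for edge in edge_likelihood:
    print(edge, "proposed with frequency", proposals.count(edge) / 1000)
```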

  13. A Quantitative, Time-Dependent Model of Oxygen Isotopes in the Solar Nebula: Step one

    Science.gov (United States)

    Nuth, J. A.; Paquette, J. A.; Farquhar, A.; Johnson, N. M.

    2011-01-01

    The remarkable discovery that oxygen isotopes in primitive meteorites were fractionated along a line of slope 1, rather than along the typical slope-0.52 terrestrial fractionation line, occurred almost 40 years ago. However, a satisfactory, quantitative explanation for this observation has yet to be found, though many different explanations have been proposed. The first of these explanations proposed that the observed line represented the final product produced by mixing molecular cloud dust with a nucleosynthetic component rich in O-16, possibly resulting from a nearby supernova explosion. Donald Clayton suggested that Galactic Chemical Evolution would gradually change the oxygen isotopic composition of the interstellar grain population by steadily producing O-16 in supernovae, then producing the heavier isotopes as secondary products in lower mass stars. Thiemens and collaborators proposed a chemical mechanism that relied on the availability of additional active rotational and vibrational states in otherwise-symmetric molecules, such as CO2, O3 or SiO2, containing two different oxygen isotopes, and a second, photochemical process that suggested that differential photochemical dissociation processes could fractionate oxygen. This second line of research has been pursued by several groups, though none of the current models is quantitative.

  14. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DEFF Research Database (Denmark)

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.

    2017-01-01

    analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. Results: The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes...
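
    A toy example of the flux balance analysis such libraries automate: maximize a "biomass" flux subject to steady-state mass balance S v = 0 and flux bounds; the two-metabolite, three-reaction network is invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Reactions: v0 (uptake -> A), v1 (A -> B), v2 (B -> biomass).
# Rows of S are metabolites A and B; steady state requires S v = 0.
S = np.array([
    [1, -1,  0],   # metabolite A
    [0,  1, -1],   # metabolite B
])
c = np.array([0.0, 0.0, -1.0])        # linprog minimizes, so -v2 maximizes v2
bounds = [(0, 10), (0, 8), (0, None)] # invented flux capacities

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x)       # expect [8, 8, 8]: v1's bound limits flow
```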

  15. Linking normative models of natural tasks to descriptive models of neural response.

    Science.gov (United States)

    Jaini, Priyank; Burge, Johannes

    2017-10-01

    Understanding how nervous systems exploit task-relevant properties of sensory stimuli to perform natural tasks is fundamental to the study of perceptual systems. However, there are few formal methods for determining which stimulus properties are most useful for a given natural task. As a consequence, it is difficult to develop principled models for how to compute task-relevant latent variables from natural signals, and it is difficult to evaluate descriptive models fit to neural responses. Accuracy maximization analysis (AMA) is a recently developed Bayesian method for finding the optimal task-specific filters (receptive fields). Here, we introduce AMA-Gauss, a new, faster form of AMA that incorporates the assumption that the class-conditional filter responses are Gaussian distributed. Then, we use AMA-Gauss to show that its assumptions are justified for two fundamental visual tasks: retinal speed estimation and binocular disparity estimation. Next, we show that AMA-Gauss has striking formal similarities to popular quadratic models of neural response: the energy model and the generalized quadratic model (GQM). Together, these developments deepen our understanding of why energy models of neural response have proven useful, improve our ability to evaluate results from subunit model fits to neural data, and should help accelerate psychophysics and neuroscience research with natural stimuli.

  16. Individual-based modeling of fish: Linking to physical models and water quality.

    Energy Technology Data Exchange (ETDEWEB)

    Rose, K.A.

    1997-08-01

    The individual-based modeling approach for simulating fish population and community dynamics is gaining popularity. Individual-based modeling has been used in many other fields, such as forest succession and astronomy. The popularity of the individual-based approach is partly a result of the lack of success of the more aggregate modeling approaches traditionally used for simulating fish population and community dynamics. Also, recent recognition that it is often the atypical individual that survives has fostered interest in the individual-based approach. Two general types of individual-based models are distribution and configuration. Distribution models follow the probability distributions of individual characteristics, such as length and age. Configuration models explicitly simulate each individual, the sum over individuals being the population. DeAngelis et al (1992) showed that, when distribution and configuration models were formulated from the same common pool of information, both approaches generated similar predictions. The distribution approach was more compact and general, while the configuration approach was more flexible. Simple biological changes, such as making growth rate dependent on previous days' growth rates, were easy to implement in the configuration version but prevented simple analytical solution of the distribution version.

  17. Nonlinear quantitative radiation sensitivity prediction model based on NCI-60 cancer cell lines.

    Science.gov (United States)

    Zhang, Chunying; Girard, Luc; Das, Amit; Chen, Sun; Zheng, Guangqiang; Song, Kai

    2014-01-01

    We proposed a nonlinear model to perform a novel quantitative radiation sensitivity prediction. We used the NCI-60 panel, which consists of nine different cancer types, as the platform to train our model. Important radiation therapy (RT) related genes were selected by significance analysis of microarrays (SAM). Orthogonal latent variables (LVs) were then extracted by the partial least squares (PLS) method as the new compressive input variables. Finally, support vector machine (SVM) regression model was trained with these LVs to predict the SF2 (the surviving fraction of cells after a radiation dose of 2 Gy γ-ray) values of the cell lines. Comparison with the published results showed significant improvement of the new method in various ways: (a) reducing the root mean square error (RMSE) of the radiation sensitivity prediction model from 0.20 to 0.011; and (b) improving prediction accuracy from 62% to 91%. To test the predictive performance of the gene signature, three different types of cancer patient datasets were used. Survival analysis across these different types of cancer patients strongly confirmed the clinical potential utility of the signature genes as a general prognosis platform. The gene regulatory network analysis identified six hub genes that are involved in canonical cancer pathways.
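
    A minimal scikit-learn sketch of the PLS-then-SVR part of the pipeline (the SAM gene-selection step is assumed already done); the expression matrix and SF2 values are random placeholders, not NCI-60 data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR

# Placeholder data: 60 "cell lines" x 200 pre-selected "genes".
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
sf2 = 0.5 + 0.1 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0.0, 0.02, size=60)

pls = PLSRegression(n_components=5).fit(X, sf2)    # supervised compression
latent = pls.transform(X)                          # orthogonal latent variables
svr = SVR(kernel="rbf", C=10.0).fit(latent, sf2)   # regression on the LVs

pred = svr.predict(latent)
rmse = float(np.sqrt(np.mean((pred - sf2) ** 2)))
print("training RMSE:", round(rmse, 4))
```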

  18. Nonlinear Quantitative Radiation Sensitivity Prediction Model Based on NCI-60 Cancer Cell Lines

    Directory of Open Access Journals (Sweden)

    Chunying Zhang

    2014-01-01

    Full Text Available We proposed a nonlinear model to perform a novel quantitative radiation sensitivity prediction. We used the NCI-60 panel, which consists of nine different cancer types, as the platform to train our model. Important radiation therapy (RT) related genes were selected by significance analysis of microarrays (SAM). Orthogonal latent variables (LVs) were then extracted by the partial least squares (PLS) method as the new compressive input variables. Finally, a support vector machine (SVM) regression model was trained with these LVs to predict the SF2 (the surviving fraction of cells after a radiation dose of 2 Gy γ-ray) values of the cell lines. Comparison with the published results showed significant improvement of the new method in various ways: (a) reducing the root mean square error (RMSE) of the radiation sensitivity prediction model from 0.20 to 0.011; and (b) improving prediction accuracy from 62% to 91%. To test the predictive performance of the gene signature, three different types of cancer patient datasets were used. Survival analysis across these different types of cancer patients strongly confirmed the clinical potential utility of the signature genes as a general prognosis platform. The gene regulatory network analysis identified six hub genes that are involved in canonical cancer pathways.

  19. Quantitative structure-activity relationship models of chemical transformations from matched pairs analyses.

    Science.gov (United States)

    Beck, Jeremy M; Springer, Clayton

    2014-04-28

    The concepts of activity cliffs and matched molecular pairs (MMP) are recent paradigms for the analysis of data sets to identify structural changes that may be used to modify the potency of lead molecules in drug discovery projects. Analysis of MMPs was recently demonstrated as a feasible technique for quantitative structure-activity relationship (QSAR) modeling of prospective compounds. However, within a small data set, the lack of matched pairs and the lack of knowledge about specific chemical transformations limit prospective applications. Here we present an alternative technique that determines pairwise descriptors for each matched pair and then uses a QSAR model to estimate the activity change associated with a chemical transformation. The descriptors effectively group similar transformations and incorporate information about the transformation and its local environment. Use of a transformation QSAR model allows one to estimate the activity change for novel transformations and therefore returns predictions for a larger fraction of test set compounds. Application of the proposed methodology to four public data sets results in increased model performance over a benchmark random forest and direct application of chemical transformations using QSAR-by-matched molecular pairs analysis (QSAR-by-MMPA).

  20. Data-driven interdisciplinary mathematical modelling quantitatively unveils competition dynamics of co-circulating influenza strains.

    Science.gov (United States)

    Ho, Bin-Shenq; Chao, Kun-Mao

    2017-07-28

    Co-circulation of influenza strains is common to seasonal epidemics and pandemic emergence. Competition was considered to be involved in the vicissitudes of co-circulating influenza strains but had never been quantitatively studied at the human population level. The main purpose of the study was to explore the competition dynamics of co-circulating influenza strains in a quantitative way. We constructed a heterogeneous dynamic transmission model and ran the model to fit the weekly A/H1N1 influenza virus isolation rate through an influenza season. The construction process started with the 2007-2008 single-clade influenza season and, with the contribution of the clade-based A/H1N1 epidemiological curves, advanced to the 2008-2009 two-clade influenza season. The Pearson method was used to estimate the correlation coefficient between the simulated epidemic curve and the observed weekly A/H1N1 influenza virus isolation rate curve. The model found the potentially best-fit simulation, with a correlation coefficient of up to 96%, and all successful simulations converged to the best fit. The annual effective reproductive number of each co-circulating influenza strain was estimated. We found that, during the 2008-2009 influenza season, the annual effective reproductive number of the succeeding A/H1N1 clade 2B-2, carrying the H275Y mutation in the neuraminidase, was estimated at around 1.65. The preceding A/H1N1 clade 2C-2 would originally have had an equivalent value of 1.65, but it fell to around 0.75 after the emergence of clade 2B-2. The model indicated that clade 2B-2 outcompeted clade 2C-2 in the 2008-2009 influenza season mainly because the latter suffered a reduction in transmission fitness of around 71% on encountering the former. We conclude that interdisciplinary data-driven mathematical modelling could bring to light the transmission dynamics of the A/H1N1 H275Y strains during the 2007-2009 influenza seasons worldwide and may inspire us to tackle the
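
    The competition mechanism can be caricatured with a two-strain SIR model in which the second strain is seeded mid-season and cuts the first strain's transmission by the reported ~71%; parameters are illustrative, not the paper's fitted values:

```python
from scipy.integrate import solve_ivp

# Both strains start with R0 = beta/gamma ~ 1.65; strain 1's transmission
# drops by 71% once strain 2 circulates. All values are illustrative.
beta, gamma, reduction = 0.33, 0.20, 0.71

def rhs(t, y, strain2_present):
    s, i1, i2 = y
    b1 = beta * (1.0 - reduction) if strain2_present else beta
    return [-(b1 * i1 + beta * i2) * s,
            b1 * i1 * s - gamma * i1,
            beta * i2 * s - gamma * i2]

# Phase 1: strain 1 alone; phase 2: strain 2 seeded at day 60.
phase1 = solve_ivp(rhs, (0, 60), [0.999, 1e-3, 0.0],
                   args=(False,), max_step=1.0)
y = phase1.y[:, -1].copy()
y[2] = 1e-4                                   # strain 2 emerges
phase2 = solve_ivp(rhs, (60, 365), y, args=(True,), max_step=1.0)

print("strain 1 peak prevalence:", round(float(phase1.y[1].max()), 4))
print("strain 2 peak prevalence:", round(float(phase2.y[2].max()), 4))
```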

  1. Development of Multidimensional Gap Conductance model using Virtual Link Gap Element

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The gap conductance that determines the temperature gradient between pellet and cladding can be quite sensitive to gap thickness. For instance, once the gap size increases to several micrometers in a certain region, the difference in pellet surface temperature can increase by up to 100 Kelvin. Therefore, iterative thermo-mechanically coupled analysis is required to solve the temperature distribution throughout pellet and cladding. Recently, multidimensional fuel performance codes have been developed in the advanced countries to evaluate the thermal behavior of fuel under off-normal and DBA (design basis accident) conditions using the Finite Element Method (FEM). The FRAPCON-FRAPTRAN code system, well known as a verified and reliable code, incorporates a 1D thermal module and a multidimensional mechanical module; a multidimensional gap conductance model is not applied in this code. ALCYONE, developed by CEA, introduces an equivalent heat convection coefficient that represents multidimensional gap conductance as a function of gap thickness. BISON, a multidimensional fuel performance code developed by INL, has a multidimensional gap conductance model using projected thermal contact. In general, a thermal contact algorithm is a nonlinear calculation, which is a numerically expensive approach. The gap conductance model in multiple dimensions is a difficult issue in terms of convergence and nonlinearity, because gap conductance is a function of gap thickness, which depends on the mechanical analysis at each iteration step. In this paper, a virtual link gap (VLG) element is proposed to resolve the convergence issues and nonlinear characteristics of multidimensional gap conductance. The proposed VLG model was evaluated in terms of calculation accuracy and convergence efficiency. LWR fuel performance codes should incorporate a thermo-mechanical loop to solve the gap conductance problem iteratively. However, gap conductance in a multidimensional model is a difficult issue owing to its nonlinearity and
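
    The convergence difficulty arises from a fixed-point loop of the following shape: conductance depends on gap width, the temperature drop depends on conductance, and thermal expansion feeds back into the gap width. A minimal sketch with rough placeholder material numbers (not the VLG formulation itself):

```python
# Placeholder values, for illustration only: gas-term conductance h = k/gap,
# and a crude linear expansion feedback on the gap width.
k_gas = 0.3          # W/m-K, helium-like fill gas conductivity
q_flux = 1.0e6       # W/m^2, heat flux across the gap
gap0 = 80e-6         # m, cold gap width
alpha_exp = 1.0e-8   # m of gap closure per K of temperature drop (assumed)

gap = gap0
for it in range(50):
    h_gap = k_gas / gap                          # gap conductance, W/m^2-K
    dT = q_flux / h_gap                          # temperature drop across gap
    new_gap = max(1e-6, gap0 - alpha_exp * dT)   # expansion closes the gap
    if abs(new_gap - gap) < 1e-9:
        break
    gap = new_gap

print(f"converged after {it + 1} iterations: "
      f"gap = {gap * 1e6:.1f} um, dT = {dT:.0f} K")
```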

  2. Diffusion-weighted MRI and quantitative biophysical modeling of hippocampal neurite loss in chronic stress.

    Directory of Open Access Journals (Sweden)

    Peter Vestergaard-Poulsen

    Full Text Available Chronic stress has detrimental effects on physiology, learning and memory and is involved in the development of anxiety and depressive disorders. Besides changes in synaptic formation and neurogenesis, chronic stress also induces dendritic remodeling in the hippocampus, amygdala and the prefrontal cortex. Investigations of dendritic remodeling during the development and treatment of stress are currently limited by the invasive nature of histological and stereological methods. Here we show that high-field diffusion-weighted MRI combined with quantitative biophysical modeling of the hippocampal dendritic loss in rats subjected to 21 days of restraint stress correlates highly with earlier histological findings. Our study strongly indicates that diffusion-weighted MRI is sensitive to regional dendritic loss and thus a promising candidate for non-invasive studies of dendritic plasticity in chronic stress and stress-related disorders.

  3. Assessment for Improvement: Two Models for Assessing a Large Quantitative Reasoning Requirement

    Directory of Open Access Journals (Sweden)

    Mary C. Wright

    2015-03-01

    Full Text Available We present two models for assessment of a large and diverse quantitative reasoning (QR) requirement at the University of Michigan. These approaches address two key challenges in assessment: (1) dissemination of findings for curricular improvement and (2) resource constraints associated with measurement of large programs. Approaches we present for data collection include convergent validation of self-report surveys, as well as use of mixed methods and learning analytics. Strategies we present for dissemination of findings include meetings with instructors to share data and best practices, sharing of results through social media, and use of easily accessible dashboards. These assessment approaches may be of particular interest to universities with large numbers of students engaging in a QR experience, projects that involve multiple courses with diverse instructional goals, or those who wish to promote evidence-based curricular improvement.

  4. Linking 1D coastal ocean modelling to environmental management: an ensemble approach

    Science.gov (United States)

    Mussap, Giulia; Zavatarelli, Marco; Pinardi, Nadia

    2017-12-01

    The use of a one-dimensional interdisciplinary numerical model of the coastal ocean as a tool contributing to the formulation of ecosystem-based management (EBM) is explored. The focus is on the definition of an experimental design based on ensemble simulations, integrating variability linked to scenarios (characterised by changes in the system forcing) and to the concurrent variation of selected, and poorly constrained, model parameters. The modelling system used was previously designed specifically for use in "data-rich" areas, so that horizontal dynamics can be resolved by a diagnostic approach and external inputs can be parameterised by properly calibrated nudging schemes. Ensembles determined by changes in the simulated environmental (physical and biogeochemical) dynamics, under joint forcing and parameterisation variations, highlight the uncertainties associated with the application of specific scenarios that are relevant to EBM, providing an assessment of the reliability of the predicted changes. The work has been carried out by implementing the coupled modelling system BFM-POM1D in an area of the Gulf of Trieste (northern Adriatic Sea) considered homogeneous from the point of view of hydrological properties, and forcing it by changing climatic (warming) and anthropogenic (reduction of the land-based nutrient input) pressure. Model parameters affected by considerable uncertainties (due to the lack of relevant observations) were varied jointly with the scenarios of change. The resulting large set of ensemble simulations provided a general estimation of the model uncertainties related to the joint variation of pressures and model parameters. The information on the variability of model results is intended to convey, efficiently and comprehensibly, the uncertainties/reliability of the model results to non-technical EBM planners and stakeholders, so that model-based information can contribute effectively to EBM.

  5. Dynamically linking economic models to ecological condition for coastal zone management: Application to sustainable tourism planning.

    Science.gov (United States)

    Dvarskas, Anthony

    2017-03-01

    While the development of the tourism industry can bring economic benefits to an area, it is important to consider the long-run impact of the industry on a given location. Particularly when the tourism industry relies upon a certain ecological state, those weighing different development options need to consider the long-run impacts of increased tourist numbers upon measures of ecological condition. This paper presents one approach for linking a model of recreational visitor behavior with an ecological model that estimates the impact of the increased visitors upon the environment. Two simulations were run for the model using initial parameters available from survey data and water quality data for beach locations in Croatia. Results suggest that the resilience of a given tourist location to the changes brought by increasing tourism numbers is important in determining its long-run sustainability. Further work should investigate additional model components, including the tourism industry, refinement of the relationships assumed by the model, and application of the proposed model in additional areas. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Linking models of human behaviour and climate alters projected climate change

    Science.gov (United States)

    Beckage, Brian; Gross, Louis J.; Lacasse, Katherine; Carr, Eric; Metcalf, Sara S.; Winter, Jonathan M.; Howe, Peter D.; Fefferman, Nina; Franck, Travis; Zia, Asim; Kinzig, Ann; Hoffman, Forrest M.

    2018-01-01

    Although not considered in climate models, perceived risk stemming from extreme climate events may induce behavioural changes that alter greenhouse gas emissions. Here, we link the C-ROADS climate model to a social model of behavioural change to examine how interactions between perceived risk and emissions behaviour influence projected climate change. Our coupled climate and social model resulted in a global temperature change ranging from 3.4 to 6.2 °C by 2100, compared with 4.9 °C for the C-ROADS model alone, and led to behavioural uncertainty of a magnitude similar to the physical uncertainty (2.8 °C versus 3.5 °C). Model components with the largest influence on temperature were the functional form of the response to extreme events, the interaction of perceived behavioural control with perceived social norms, and behaviours leading to sustained emissions reductions. Our results suggest that policies emphasizing the appropriate attribution of extreme events to climate change and infrastructural mitigation may reduce climate change the most.
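
    A minimal sketch of the kind of coupled climate-behaviour feedback loop described here, assuming a toy one-box climate and a lagged risk-perception response; all coefficients are invented for illustration and bear no relation to the calibrated C-ROADS coupling.

        # Toy loop: the temperature anomaly raises perceived risk, perceived
        # risk damps emissions growth, and emissions drive further warming.
        T, E, risk = 1.2, 40.0, 0.0   # degC anomaly, GtCO2/yr, unitless risk
        for year in range(2020, 2101):
            risk = 0.9 * risk + 0.1 * max(T - 1.0, 0.0)  # lagged response to extremes
            E *= 1.02 - 0.03 * min(risk, 1.0)            # behaviour damps growth
            T += 0.0004 * E                              # crude warming per GtCO2
        print(f"2100: T ~ {T:.1f} degC, emissions ~ {E:.0f} GtCO2/yr")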

  7. Image guided interstitial laser thermotherapy: a canine model evaluated by magnetic resonance imaging and quantitative autoradiography.

    Science.gov (United States)

    Muacevic, A; Peller, M; Ruprecht, L; Berg, D; Fend, L; Sroka, R; Reulen, H J; Reiser, M; Tonn, J Ch; Kreth, F W

    2005-02-01

    To determine the applicability and safety of a new canine model suitable for correlative magnetic resonance imaging (MRI) studies and morphological/pathophysiological examination over time after interstitial laser thermotherapy (ILTT) in brain tissue. A laser fibre (diode laser, 830 nm) with an integrated temperature feedback system was inserted into the right frontal white matter of 18 dogs using a frameless navigation technique. MRI thermometry (phase mapping, i.e. the chemical shift of the proton resonance frequency) during interstitial heating was compared with simultaneously recorded interstitial fiberoptic temperature readings at the border of the lesion. To study brain capillary function in response to ILTT over time, quantitative autoradiography was performed investigating the unidirectional blood-to-tissue transport of carbon-14-labelled alpha-aminoisobutyric acid (transfer constant K of AIB) at 12 and 36 hours, 7 and 14 days, 4 weeks and 3 months after ILTT. All laser procedures were well tolerated; the laser and temperature fibres could be adequately placed in the right frontal lobe in all animals. In 5 animals, MRI-based temperature quantification correlated strongly with invasive temperature measurements. In the remaining animals the temperature fibre was located in the area of susceptibility artifacts, so no temperature correlation was possible. The laser lesions consisted of a central area of calcified necrosis surrounded by an area of reactive brain tissue with increased permeability. Quantitative autoradiography indicated a thin, spherical blood-brain barrier lesion. The magnitude of K of AIB increased from 12 hours to 14 days after ILTT and decreased thereafter. The mean value of K of AIB was 19 times that of normal white matter (2 times that of cortex). ILTT causes transient, highly localised areas of increased capillary permeability surrounding the laser lesion. Phase contrast imaging for MRI thermomonitoring cannot currently be used for

  8. A mediation model linking body weight, cognition, and sleep-disordered breathing.

    Science.gov (United States)

    Spruyt, Karen; Gozal, David

    2012-01-15

    Academic success involves the ability to use cognitive skills in a school environment. Poor academic performance has been linked to disrupted sleep associated with sleep-disordered breathing (SDB). In parallel, poor sleep is associated with increased risk for obesity, and weight management problems have been linked to executive dysfunction, suggesting that interactions may be operational between SDB and obesity that adversely affect neurocognitive outcomes. To test whether mediator relationships exist between body weight, SDB, and cognition, structural equation modeling was conducted on data from 351 children in a community-based cohort assessed with the core subtests of the Differential Abilities Scales after an overnight polysomnogram. Body mass index, apnea-hypopnea index, and cognitive abilities were modeled as latent constructs. In a sample of predominantly white children 6 to 10 years of age, SDB amplified the adverse cognitive and weight outcomes by 0.55- and 0.46-fold, respectively. Weight amplified the risk by 0.39- and 0.40-fold for SDB and cognitive outcomes, respectively. Poor ability to perform complex mental processing functions increased the risk of adverse weight and SDB outcomes by 2.9- and 7.9-fold, respectively. Cognitive functioning in children is adversely affected by frequent health-related problems, such as obesity and SDB. Furthermore, poorer integrative mental processing may place a child at greater risk of adverse health outcomes.
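
    The study fits latent-variable structural equation models; a far simpler observed-variable sketch of the same mediation logic (product of coefficients) on simulated data is shown below. Variable names and effect sizes are invented; only the sample size mirrors the study.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 351  # sample size mirrors the study; the data here are simulated

        # Simulated standardized variables: BMI -> AHI (SDB severity) -> cognition
        bmi = rng.standard_normal(n)
        ahi = 0.4 * bmi + rng.standard_normal(n)                # path a
        cog = -0.5 * ahi - 0.2 * bmi + rng.standard_normal(n)   # paths b and c'

        a = sm.OLS(ahi, sm.add_constant(bmi)).fit().params[1]
        fit2 = sm.OLS(cog, sm.add_constant(np.column_stack([ahi, bmi]))).fit()
        b, c_direct = fit2.params[1], fit2.params[2]

        print(f"indirect (mediated) effect a*b = {a * b:.3f}")
        print(f"direct effect c' = {c_direct:.3f}")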

  9. DFT Modeling of Cross-Linked Polyethylene: Role of Gold Atoms and Dispersion Interactions.

    Science.gov (United States)

    Blaško, Martin; Mach, Pavel; Antušek, Andrej; Urban, Miroslav

    2018-02-08

    Using DFT modeling, we analyze the concerted action of gold atoms and dispersion interactions in cross-linked polyethylene. Our model consists of two oligomer chains (PEn) with 7, 11, 15, 19, or 23 carbon atoms in each oligomer cross-linked with one to three Au atoms through C-Au-C bonds. In structures with a single gold atom the C-Au-C bond is located in the central position of the oligomer. Binding energies (BEs) with respect to two oligomer radical fragments and Au are as high as 362-489 kJ/mol depending on the length of the oligomer chain. When the dispersion contribution in PEn-Au-PEn oligomers is omitted, BE is almost independent of the number of carbon atoms, lying between 293 and 296 kJ/mol. The dispersion energy contributions to BEs in PEn-Au-PEn rise nearly linearly with the number of carbon atoms in the PEn chain. The carbon-carbon distance in the C-Au-C moiety is around 4.1 Å, similar to the bond distance between saturated closed shell chains in the polyethylene crystal. BEs of pure saturated closed shell PEn-PEn oligomers are 51-187 kJ/mol. Both Au atoms and dispersion interactions contribute considerably to the creation of nearly parallel chains of oligomers with reasonably high binding energies.

  10. Quantitative assessments of mantle flow models against seismic observations: Influence of uncertainties in mineralogical parameters

    Science.gov (United States)

    Schuberth, Bernhard S. A.

    2017-04-01

    One of the major challenges in studies of Earth's deep mantle is to bridge the gap between geophysical hypotheses and observations. The largest dataset available for investigating the nature of mantle flow consists of recordings of seismic waveforms. On the other hand, numerical models of mantle convection can nowadays be simulated on a routine basis for Earth-like parameters, and modern thermodynamic mineralogical models allow us to translate the predicted temperature field into seismic structures. The great benefit of the mineralogical models is that they provide the full non-linear relation between temperature and seismic velocities and thus ensure a consistent conversion in terms of magnitudes. This opens the possibility for quantitative assessments of the theoretical predictions. The often-adopted comparison between geodynamic and seismic models is unsuitable in this respect owing to the effects of damping, limited resolving power and non-uniqueness inherent to tomographic inversions. The most relevant issue, however, is related to wavefield effects that reduce the magnitude of seismic signals (e.g., traveltimes of waves), a phenomenon called wavefront healing. Over the past couple of years, we have developed an approach that takes the next step towards a quantitative assessment of geodynamic models and that enables us to test the underlying geophysical hypotheses directly against seismic observations. It is based solely on forward modelling and warrants a physically correct treatment of the seismic wave equation without theoretical approximations. Fully synthetic 3-D seismic wavefields are computed using a spectral element method for 3-D seismic structures derived from mantle flow models. This way, synthetic seismograms are generated independent of any seismic observations. Furthermore, through the wavefield simulations, it is possible to relate the magnitude of lateral temperature variations in the dynamic flow simulations directly to body-wave traveltime residuals.

  11. Linking susceptibility genes and pathogenesis mechanisms using mouse models of systemic lupus erythematosus

    Directory of Open Access Journals (Sweden)

    Steve P. Crampton

    2014-09-01

    Full Text Available Systemic lupus erythematosus (SLE represents a challenging autoimmune disease from a clinical perspective because of its varied forms of presentation. Although broad-spectrum steroids remain the standard treatment for SLE, they have many side effects and only provide temporary relief from the symptoms of the disease. Thus, gaining a deeper understanding of the genetic traits and biological pathways that confer susceptibility to SLE will help in the design of more targeted and effective therapeutics. Both human genome-wide association studies (GWAS and investigations using a variety of mouse models of SLE have been valuable for the identification of the genes and pathways involved in pathogenesis. In this Review, we link human susceptibility genes for SLE with biological pathways characterized in mouse models of lupus, and discuss how the mechanistic insights gained could advance drug discovery for the disease.

  12. Linking susceptibility genes and pathogenesis mechanisms using mouse models of systemic lupus erythematosus

    Science.gov (United States)

    Crampton, Steve P.; Morawski, Peter A.; Bolland, Silvia

    2014-01-01

    Systemic lupus erythematosus (SLE) represents a challenging autoimmune disease from a clinical perspective because of its varied forms of presentation. Although broad-spectrum steroids remain the standard treatment for SLE, they have many side effects and only provide temporary relief from the symptoms of the disease. Thus, gaining a deeper understanding of the genetic traits and biological pathways that confer susceptibility to SLE will help in the design of more targeted and effective therapeutics. Both human genome-wide association studies (GWAS) and investigations using a variety of mouse models of SLE have been valuable for the identification of the genes and pathways involved in pathogenesis. In this Review, we link human susceptibility genes for SLE with biological pathways characterized in mouse models of lupus, and discuss how the mechanistic insights gained could advance drug discovery for the disease. PMID:25147296

  13. Numerical linked-cluster algorithms. II. t-J models on the square lattice.

    Science.gov (United States)

    Rigol, Marcos; Bryant, Tyler; Singh, Rajiv R P

    2007-06-01

    We discuss the application of a recently introduced numerical linked-cluster (NLC) algorithm to strongly correlated itinerant models. In particular, we present a study of thermodynamic observables: chemical potential, entropy, specific heat, and uniform susceptibility for the t-J model on the square lattice, with J/t = 0.5 and 0.3. Our NLC results are compared with those obtained from high-temperature expansions (HTE) and the finite-temperature Lanczos method (FTLM). We show that there is a sizeable window in temperature where NLC results converge without extrapolations whereas HTE diverges. Upon extrapolation, the overall agreement between NLC, HTE, and FTLM is excellent, in some cases down to 0.25t. At intermediate temperatures NLC results are better controlled than those of other methods, making it easier to judge the convergence and numerical accuracy of the method.

  14. A quantitative model for dermal infection and oedema in BALB/c mice pinna.

    Science.gov (United States)

    Marino-Marmolejo, Erika Nahomy; Flores-Hernández, Flor Yohana; Flores-Valdez, Mario Alberto; García-Morales, Luis Felipe; González-Villegas, Ana Cecilia; Bravo-Madrigal, Jorge

    2016-12-12

    The pharmaceutical industry demands innovation for developing new molecules to improve the effectiveness and safety of therapeutic medicines. Preclinical assays are the first tests performed to evaluate new therapeutic molecules in animal models. Several models currently exist for evaluating treatments for dermal oedema or infection; however, inflammation is most commonly induced with chemical substances rather than infectious agents, and such models require histological techniques and the interpretation of pathologies to verify the effectiveness of the therapy under assessment. This work focused on developing a quantitative model of infection and oedema in the mouse pinna. Infection was achieved with a strain of Streptococcus pyogenes inoculated into an injury induced on the auricle of BALB/c mice; the resulting oedema was recorded by measuring ear thickness with a digital micrometer, and histopathological analysis was performed to verify the damage. The presence of S. pyogenes at the infection site was determined daily by culture. Our results showed that S. pyogenes can infect the mouse pinna and be recovered from the infected site for at least 4 days; we also found that S. pyogenes induced greater oedema than the PBS-treated control for at least 7 days. Our results were validated with an antibacterial and anti-inflammatory formulation of ciprofloxacin and hydrocortisone. The model we developed emulates a dermal infection and allowed us to evaluate objectively the increase or decrease of the oedema by measuring the thickness of the ear pinna, and to determine the presence of the pathogen at the infection site. We consider that the model could be useful for the assessment of new anti-inflammatory or antibacterial therapies for dermal infections.

  15. Validation of quantitative structure-activity relationship (QSAR) model for photosensitizer activity prediction.

    Science.gov (United States)

    Frimayanti, Neni; Yam, Mun Li; Lee, Hong Boon; Othman, Rozana; Zain, Sharifuddin M; Rahman, Noorsaadah Abd

    2011-01-01

    Photodynamic therapy is a relatively new treatment method for cancer which utilizes a combination of oxygen, a photosensitizer and light to generate reactive singlet oxygen that eradicates tumors via direct cell-killing, vasculature damage and engagement of the immune system. Most of the photosensitizers that are in clinical and pre-clinical assessments, or that are already approved for clinical use, are mainly based on cyclic tetrapyrroles. In an attempt to discover new effective photosensitizers, we report the use of the quantitative structure-activity relationship (QSAR) method to develop a model that could correlate the structural features of cyclic tetrapyrrole-based compounds with their photodynamic therapy (PDT) activity. In this study, a set of 36 porphyrin derivatives was used in the model development, with 24 of these compounds in the training set and the remaining 12 compounds in the test set. The development of the QSAR model involved the use of the multiple linear regression analysis (MLRA) method. Based on the method, an r2 value, r2(CV) value and r2 prediction value of 0.87, 0.71 and 0.70 were obtained. The QSAR model was also employed to predict the experimental compounds in an external test set. This external test set comprises 20 porphyrin-based compounds with experimental IC50 values ranging from 0.39 μM to 7.04 μM. Thus the model showed good correlative and predictive ability, with a predictive correlation coefficient (r2 prediction for the external test set) of 0.52. The developed QSAR model was used to discover some compounds as new lead photosensitizers from this external test set.
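
    A minimal sketch of the MLRA workflow the study describes (24 training and 12 test compounds, r2 reported on both), assuming randomly generated placeholder descriptors rather than real porphyrin descriptors:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import r2_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n_compounds, n_descriptors = 36, 4       # 36 porphyrins, as in the study
        X = rng.standard_normal((n_compounds, n_descriptors))  # placeholder descriptors
        y = X @ np.array([0.8, -0.5, 0.3, 0.1]) + 0.3 * rng.standard_normal(n_compounds)

        # 24 training / 12 test compounds, mirroring the paper's split.
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=12, random_state=0)
        model = LinearRegression().fit(X_tr, y_tr)
        print("r2 (training)  :", round(model.score(X_tr, y_tr), 2))
        print("r2 (prediction):", round(r2_score(y_te, model.predict(X_te)), 2))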

  16. Validation of Quantitative Structure-Activity Relationship (QSAR) Model for Photosensitizer Activity Prediction

    Science.gov (United States)

    Frimayanti, Neni; Yam, Mun Li; Lee, Hong Boon; Othman, Rozana; Zain, Sharifuddin M.; Rahman, Noorsaadah Abd.

    2011-01-01

    Photodynamic therapy is a relatively new treatment method for cancer which utilizes a combination of oxygen, a photosensitizer and light to generate reactive singlet oxygen that eradicates tumors via direct cell-killing, vasculature damage and engagement of the immune system. Most of the photosensitizers that are in clinical and pre-clinical assessments, or those that are already approved for clinical use, are mainly based on cyclic tetrapyrroles. In an attempt to discover new effective photosensitizers, we report the use of the quantitative structure-activity relationship (QSAR) method to develop a model that could correlate the structural features of cyclic tetrapyrrole-based compounds with their photodynamic therapy (PDT) activity. In this study, a set of 36 porphyrin derivatives was used in the model development where 24 of these compounds were in the training set and the remaining 12 compounds were in the test set. The development of the QSAR model involved the use of the multiple linear regression analysis (MLRA) method. Based on the method, r2 value, r2 (CV) value and r2 prediction value of 0.87, 0.71 and 0.70 were obtained. The QSAR model was also employed to predict the experimental compounds in an external test set. This external test set comprises 20 porphyrin-based compounds with experimental IC50 values ranging from 0.39 μM to 7.04 μM. Thus the model showed good correlative and predictive ability, with a predictive correlation coefficient (r2 prediction for external test set) of 0.52. The developed QSAR model was used to discover some compounds as new lead photosensitizers from this external test set. PMID:22272096

  17. Linking sediment fingerprinting and modeling outputs for a Spanish Pyrenean river catchment.

    Science.gov (United States)

    Palazón, Leticia; Latorre, Borja; Gaspar, Leticia; Blake, Williams H.; Smith, Hugh G.; Navas, Ana

    2015-04-01

    Indirect techniques to study fine sediment redistribution in river catchments can provide unique and diverse information which, when combined, becomes a powerful tool to address catchment management problems. Such combinations can overcome the limitations of individual techniques and provide different lines of information on a particular problem. The Barasona reservoir has suffered from siltation since its construction, with the loss of over one third of its storage volume in around 30 study years (period 1972-1996). Information on sediment production from tributary catchments is required to develop management plans for maintaining reservoir sustainability. Previous studies in the Barasona catchment found large spatial variability in sediment delivery, and the major sediment sources identified included badlands developed in the middle part of the catchment and the agricultural fields in its lower part. Among the diverse range of indirect techniques, sediment source fingerprinting and computer models can be linked to obtain a more holistic view of the processes related to sediment redistribution in the Barasona river catchment (1509 km2, Central Spanish Pyrenees), which comprises agricultural and forest land uses. In the present study, the results from a fingerprinting procedure and the SWAT model were compared and combined to improve knowledge of land use source contributions to the reservoir. Samples from the study catchment were used to define soil parameters for the model and for fingerprinting the land use sources. The fingerprinting approach provided information about the relative contributions of land use sources to the superficial sediment samples taken from the reservoir infill. The calibration and validation of the model provided valuable information, for example on the timescale of sediment production from the different land uses within the catchment. Linking the results from both techniques enabled us to achieve a more integrated understanding of sediment redistribution in the catchment.
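
    Fingerprinting studies of this kind commonly estimate source proportions by unmixing tracer signatures; a minimal sketch using bounded least squares on invented tracer values (three hypothetical sources, one mixture) is shown below. The actual study's tracers, sources, and unmixing procedure may differ.

        import numpy as np
        from scipy.optimize import lsq_linear

        # Hypothetical tracer signatures (rows: tracers; columns: sources
        # badlands, agriculture, forest) and one reservoir sediment mixture.
        A = np.array([[12.0,  5.0,  2.0],
                      [ 0.8,  2.5,  1.2],
                      [30.0, 55.0, 90.0]])
        mixture = np.array([8.1, 1.6, 48.0])

        # Bounded least squares, proportions then renormalised to sum to one
        # (a simple treatment; formal unmixing adds the constraint directly).
        res = lsq_linear(A, mixture, bounds=(0.0, 1.0))
        w = res.x / res.x.sum()
        print("estimated source proportions:", np.round(w, 2))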

  18. Current Challenges in the First Principle Quantitative Modelling of the Lower Hybrid Current Drive in Tokamaks

    Science.gov (United States)

    Peysson, Y.; Bonoli, P. T.; Chen, J.; Garofalo, A.; Hillairet, J.; Li, M.; Qian, J.; Shiraiwa, S.; Decker, J.; Ding, B. J.; Ekedahl, A.; Goniche, M.; Zhai, X.

    2017-10-01

    The Lower Hybrid (LH) wave is widely used in existing tokamaks for tailoring the current density profile or extending pulse duration to steady-state regimes. Its high efficiency makes it particularly attractive for a fusion reactor, and it is being considered for this purpose in the ITER tokamak. Nevertheless, while the basics of the LH wave in tokamak plasmas are well known, quantitative modelling of experimental observations from first principles remains a highly challenging exercise, despite the considerable numerical efforts achieved so far. In this context, a rigorous methodology must be applied in the simulations to identify the minimum number of physical mechanisms that must be considered to reproduce experimental shot-to-shot observations and also scalings (density, power spectrum). Based on recent simulations carried out for the EAST, Alcator C-Mod and Tore Supra tokamaks, the state of the art in LH modelling is reviewed. The capability of fast electron bremsstrahlung, internal inductance li and LH-driven current at zero loop voltage to jointly constrain LH simulations is discussed, as well as the need for further improvements (diagnostics, codes, LH model) for robust interpretative and predictive simulations.

  19. Using Modified Contour Deformable Model to Quantitatively Estimate Ultrasound Parameters for Osteoporosis Assessment

    Science.gov (United States)

    Chen, Yung-Fu; Du, Yi-Chun; Tsai, Yi-Ting; Chen, Tainsong

    Osteoporosis is a systemic skeletal disease characterized by low bone mass and micro-architectural deterioration of bone tissue, leading to bone fragility. Finding an effective method for prevention and early diagnosis of the disease is very important. Several parameters, including broadband ultrasound attenuation (BUA), speed of sound (SOS), and stiffness index (STI), have been used to measure the characteristics of bone tissue. In this paper, we propose a method, the modified contour deformable model (MCDM), based on the active contour model (ACM) and active shape model (ASM), for automatically detecting the calcaneus contour in quantitative ultrasound (QUS) parametric images. The results show that the difference between the contour detected by the MCDM and the true boundary for the phantom is less than one pixel. By comparing the phantom ROIs, a significant relationship was found between the contour mean and bone mineral density (BMD), with R = 0.99. The influence of selecting different ROI diameters (12, 14, 16 and 18 mm) and different region-selecting methods, including a fixed region (ROIfix), an automatic circular region (ROIcir) and the calcaneal contour region (ROIanat), was evaluated for human subjects. Measurements with large ROI diameters, especially using the fixed region, result in high position errors (10-45%). The precision errors of the measured ultrasonic parameters are smaller for ROIanat than for ROIfix and ROIcir. In conclusion, ROIanat provides more accurate measurement of ultrasonic parameters for the evaluation of osteoporosis and is useful for clinical application.

  20. Effect of arterial deprivation on growing femoral epiphysis: Quantitative magnetic resonance imaging using a piglet model

    Energy Technology Data Exchange (ETDEWEB)

    Cheon, Jung Eun; Yoo, Won Joon; Kim, In One; Kim, Woo Sun; Choi, Young Hun [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2015-06-15

    To investigate the usefulness of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and diffusion MRI for the evaluation of femoral head ischemia. Unilateral femoral head ischemia was induced by selective embolization of the medial circumflex femoral artery in 10 piglets. MRI was performed immediately (1 hour) after embolization and at 1, 2, and 4 weeks thereafter. Apparent diffusion coefficients (ADCs) were calculated for the femoral head. The estimated pharmacokinetic parameters (Kep and Ve from a two-compartment model) and semi-quantitative parameters, including peak enhancement, time-to-peak (TTP), and contrast washout, were evaluated. The epiphyseal ADC values of the ischemic hip decreased immediately (1 hour) after embolization. However, they increased rapidly at 1 week after embolization and remained elevated until 4 weeks after embolization. Perfusion MRI of ischemic hips showed decreased epiphyseal perfusion with decreased Kep immediately after embolization. Signal intensity-time curves showed delayed TTP with limited contrast washout immediately post-embolization. At 1-2 weeks after embolization, spontaneous reperfusion was observed in ischemic epiphyses. The changes in ADC (p = 0.043) and Kep (p = 0.043) between immediately (1 hour) after embolization and 1 week post-embolization were significant. Diffusion MRI and the pharmacokinetic parameters obtained from DCE-MRI are useful for depicting early changes of perfusion and tissue damage in this model of femoral head ischemia in skeletally immature piglets.
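
    The Kep and Ve estimates come from a two-compartment pharmacokinetic model; a minimal sketch of a standard Tofts-type tissue enhancement curve computed by numerical convolution is given below. The arterial input function and parameter values are illustrative, not fitted to the piglet data.

        import numpy as np

        # Standard Tofts-type two-compartment tissue curve (a common choice;
        # the study's exact formulation may differ):
        #   Ct(t) = Ktrans * integral of Cp(u) * exp(-kep * (t - u)) du,
        #   with Ktrans = kep * ve.
        t = np.linspace(0.0, 300.0, 601)                 # s
        cp = 5.0 * (t / 30.0) * np.exp(1.0 - t / 30.0)   # toy arterial input function
        kep, ve = 0.02, 0.3                              # 1/s and unitless, illustrative
        ktrans = kep * ve

        dt = t[1] - t[0]
        ct = ktrans * np.convolve(cp, np.exp(-kep * t))[: t.size] * dt
        print("peak tissue enhancement %.3f at t = %.0f s" % (ct.max(), t[np.argmax(ct)]))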

  1. Development of conceptual ecological models linking management of the Missouri River to pallid sturgeon population dynamics

    Science.gov (United States)

    Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.

    2015-01-01

    This report documents the process of developing and refining conceptual ecological models (CEMs) for linking river management to pallid sturgeon (Scaphirhynchus albus) population dynamics in the Missouri River. The refined CEMs are being used in the Missouri River Pallid Sturgeon Effects Analysis to organize, document, and formalize an understanding of pallid sturgeon population responses to past and future management alternatives. The general form of the CEMs, represented by a population-level model and component life-stage models, was determined in workshops held in the summer of 2013. Subsequently, the Missouri River Pallid Sturgeon Effects Analysis team designed a general hierarchical structure for the component models, refined the graphical structure, and reconciled variation among the components and between models developed for the upper river (Upper Missouri & Yellowstone Rivers) and the lower river (Missouri River downstream from Gavins Point Dam). Importance scores attributed to the relations between primary biotic characteristics and survival were used to define a candidate set of working dominant hypotheses about pallid sturgeon population dynamics. These CEMs are intended to guide research and adaptive-management actions to benefit pallid sturgeon populations in the Missouri River.

  2. Quantitative structure-activity relationship modeling of the toxicity of organothiophosphate pesticides to Daphnia magna and Cyprinus carpio

    NARCIS (Netherlands)

    Zvinavashe, E.; Du, T.; Griff, T.; Berg, van den J.H.J.; Soffers, A.E.M.F.; Vervoort, J.J.M.; Murk, A.J.; Rietjens, I.

    2009-01-01

    Within the REACH regulatory framework in the EU, quantitative structure-activity relationship (QSAR) models are expected to help reduce the number of animals used for experimental testing. The objective of this study was to develop QSAR models to describe the acute toxicity of organothiophosphate pesticides to Daphnia magna and Cyprinus carpio.

  3. Linking Murine and Human Plasmodium falciparum Challenge Models in a Translational Path for Antimalarial Drug Development

    Science.gov (United States)

    McCarthy, James S.; Marquart, Louise; Sekuloski, Silvana; Trenholme, Katharine; Elliott, Suzanne; Griffin, Paul; Rockett, Rebecca; O'Rourke, Peter; Sloots, Theo; Angulo-Barturen, Iñigo; Ferrer, Santiago; Jiménez-Díaz, María Belén; Martínez, María-Santos; Duparc, Stephan; Leroy, Didier; Wells, Timothy N. C.; Baker, Mark

    2016-01-01

    Effective progression of candidate antimalarials is dependent on optimal dosing in clinical studies, which is determined by a sound understanding of pharmacokinetics and pharmacodynamics (PK/PD). Recently, two important translational models for antimalarials have been developed: the NOD/SCID/IL2Rγ−/− (NSG) model, whereby mice are engrafted with noninfected and Plasmodium falciparum-infected human erythrocytes, and the induced blood-stage malaria (IBSM) model in human volunteers. The antimalarial mefloquine was used to directly measure the PK/PD in both models, which were compared to previously published trial data for malaria patients. The clinical part was a single-center, controlled study using a blood-stage Plasmodium falciparum challenge inoculum in volunteers to characterize the effectiveness of mefloquine against early malaria. The study was conducted in three cohorts (n = 8 each) using different doses of mefloquine. The characteristic delay in onset of action of about 24 h was seen in both NSG and IBSM systems. In vivo 50% inhibitory concentrations (IC50s) were estimated at 2.0 μg/ml and 1.8 μg/ml in the NSG and IBSM models, respectively, aligning with 1.8 μg/ml reported previously for patients. In the IBSM model, the parasite reduction ratios were 157 and 195 for the 10- and 15-mg/kg doses, within the range of previously reported clinical data for patients but significantly lower than observed in the mouse model. Linking mouse and human challenge models to clinical trial data can accelerate the accrual of critical data on antimalarial drug activity. Such data can guide large clinical trials required for development of urgently needed novel antimalarial combinations. (This trial was registered at the Australian New Zealand Clinical Trials Registry [http://anzctr.org.au] under registration number ACTRN12612000323820.) PMID:27044554
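
    A worked example of what the reported parasite reduction ratios imply, assuming the conventional definition of the PRR over one 48-hour asexual cycle:

        import math

        for prr in (157, 195):              # PRRs reported for 10 and 15 mg/kg
            log_kill = math.log10(prr)      # log10 units of kill per 48-h cycle
            k = math.log(prr) / 48.0        # first-order clearance rate, 1/h
            t_half = math.log(2.0) / k      # parasite clearance half-life, h
            print(f"PRR {prr}: {log_kill:.2f} log10 kill/cycle, "
                  f"clearance half-life {t_half:.1f} h")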

  4. Using Coupled Simulation Models to Link Pastoral Decision Making and Ecosystem Services

    Directory of Open Access Journals (Sweden)

    Randall B. Boone

    2011-06-01

    Full Text Available Historically, pastoral people were able to more freely use the services their semi-arid and arid ecosystems provide, and they adapted to changes in ways that improved their well-being. More recently, their ability to adapt has been constrained due to changes from within and from outside their communities. To compare possible responses by pastoral communities, we modeled ecosystem services and tied those services to decisions that people make at the household level. We created an agent-based household model called DECUMA, joined that model with the ecosystem model SAVANNA, and applied the linked models to southeastern Kajiado District, Kenya. The structure of the new agent-based model and linkages between the models are described, and then we demonstrate the model results using a scenario that shows changes in Maasai well-being in response to drought. We then explore two additional but related scenarios, quantifying household well-being if access to a grazing reserve is lost and if access is lost but those most affected are compensated. In the second scenario, households in group ranches abutting the grazing reserve that lost access had large declines in livestock populations, less food energy from animal sources, increased livestock sales and grain purchases, and increased need for supplemental foods. Households in more distant areas showed no changes or had increases in livestock populations because their herds had fewer animals with which to compete for forage. When households neighboring the grazing reserve were compensated for the lease of the lands they had used, they prospered. We describe some benefits and limitations of the agent-based approach.

  5. Regression models for linking patterns of growth to a later outcome: infant growth and childhood overweight

    Directory of Open Access Journals (Sweden)

    Andrew K. Wills

    2016-04-01

    Full Text Available Abstract Background Regression models are widely used to link serial measures of anthropometric size, or changes in size, to a later outcome. Different parameterisations of these models enable one to target different questions about the effect of growth, but their interpretation can be challenging. Our objective was to formulate and classify several sets of parameterisations by their underlying growth pattern contrast, and to discuss their utility using an expository example. Methods We describe and classify five sets of model parameterisations in accordance with their underlying growth pattern contrast (conditional growth; being bigger v being smaller; becoming bigger and staying bigger; growing faster v being bigger; becoming and staying bigger versus being bigger). The contrasts are estimated by including different sets of repeated measures of size and changes in size in a regression model. We illustrate these models in the setting of linking infant growth (measured on 6 occasions: birth, 6 weeks, 3, 6, 12 and 24 months) in weight-for-height-for-age z-scores to later childhood overweight at 8 y, using complete cases from the Norwegian Childhood Growth study (n = 900). Results In our expository example, conditional growth during all periods, becoming bigger in any interval and staying bigger through infancy, and being bigger from birth were all associated with higher odds of later overweight. The highest odds of later overweight occurred for individuals who experienced high conditional growth or became bigger in the 3 to 6 month period and stayed bigger, and those who were bigger from birth to 24 months. Comparisons between periods and between growth patterns require large sample sizes and need to consider how to scale associations to make comparisons fair; with respect to the latter, we show one approach. Conclusion Studies interested in detrimental growth patterns may gain extra insight from reporting several sets of growth pattern contrasts.
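
    A minimal sketch of the "conditional growth" parameterisation on simulated data: each size measure is regressed on all earlier measures, and the residuals enter a logistic regression for later overweight. Three toy measurement occasions stand in for the six used in the study; all coefficients are invented.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 900                       # complete cases, as in the study; data simulated
        z = np.empty((n, 3))          # weight z-scores at three toy occasions
        z[:, 0] = rng.standard_normal(n)
        z[:, 1] = 0.7 * z[:, 0] + 0.7 * rng.standard_normal(n)
        z[:, 2] = 0.7 * z[:, 1] + 0.7 * rng.standard_normal(n)
        risk = 0.8 * (z[:, 2] - 0.7 * z[:, 1])        # last-interval growth drives risk
        overweight = (risk + rng.standard_normal(n) > 1.0).astype(int)

        # Conditional growth: residual of each measure given all earlier measures.
        cond = np.empty((n, 2))
        for j in (1, 2):
            X = sm.add_constant(z[:, :j])
            cond[:, j - 1] = z[:, j] - sm.OLS(z[:, j], X).fit().predict(X)

        logit = sm.Logit(overweight, sm.add_constant(cond)).fit(disp=0)
        print("odds ratios per SD of conditional growth:",
              np.round(np.exp(logit.params[1:]), 2))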

  6. Linked linear mixed models: A joint analysis of fixation locations and fixation durations in natural reading.

    Science.gov (United States)

    Hohenstein, Sven; Matuschek, Hannes; Kliegl, Reinhold

    2017-06-01

    The complexity of eye-movement control during reading allows measurement of many dependent variables, the most prominent ones being fixation durations and their locations in words. In current practice, either variable may serve as dependent variable or covariate for the other in linear mixed models (LMMs) featuring also psycholinguistic covariates of word recognition and sentence comprehension. Rather than analyzing fixation location and duration with separate LMMs, we propose linking the two according to their sequential dependency. Specifically, we include predicted fixation location (estimated in the first LMM from psycholinguistic covariates) and its associated residual fixation location as covariates in the second, fixation-duration LMM. This linked LMM affords a distinction between direct and indirect effects (mediated through fixation location) of psycholinguistic covariates on fixation durations. Results confirm the robustness of distributed processing in the perceptual span. They also offer a resolution of the paradox of the inverted optimal viewing position (IOVP) effect (i.e., longer fixation durations in the center than at the beginning and end of words) although the opposite (i.e., an OVP effect) is predicted from default assumptions of psycholinguistic processing efficiency: The IOVP effect in fixation durations is due to the residual fixation-location covariate, presumably driven primarily by saccadic error, and the OVP effect (at least the left part of it) is uncovered with the predicted fixation-location covariate, capturing the indirect effects of psycholinguistic covariates. We expect that linked LMMs will be useful for the analysis of other dynamically related multiple outcomes, a conundrum of most psychonomic research.
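
    A two-stage sketch of the linked-LMM idea on simulated data: a first mixed model decomposes fixation location into a covariate-predicted part and a residual part, and both enter a second mixed model for fixation duration. Column names and effect sizes are invented; the authors' actual models include crossed random effects and richer covariates.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n_subj, n_obs = 20, 50
        df = pd.DataFrame({
            "subj": np.repeat(np.arange(n_subj), n_obs),
            "word_len": rng.integers(2, 10, n_subj * n_obs).astype(float),
        })
        df["fixloc"] = 0.4 * df["word_len"] + rng.normal(0, 1, len(df))
        saccadic_error = df["fixloc"] - 0.4 * df["word_len"]
        df["dur"] = 200 + 5 * df["word_len"] - 8 * saccadic_error + rng.normal(0, 20, len(df))

        # Stage 1: fixation location ~ psycholinguistic covariate(s),
        # with a random intercept per subject.
        m1 = smf.mixedlm("fixloc ~ word_len", df, groups=df["subj"]).fit()
        df["loc_hat"] = m1.fittedvalues               # covariate-driven part
        df["loc_res"] = df["fixloc"] - df["loc_hat"]  # residual (e.g., saccadic error)

        # Stage 2: fixation duration ~ predicted location + residual location.
        m2 = smf.mixedlm("dur ~ loc_hat + loc_res", df, groups=df["subj"]).fit()
        print(m2.params[["loc_hat", "loc_res"]])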

  7. Linking Formal and Informal Science Education: A Successful Model using Libraries, Volunteers and NASA Resources

    Science.gov (United States)

    Race, M. S.; Lafayette Library; Learning Center Foundation (Lllcf)

    2011-12-01

    In these times of budget cuts, tight school schedules, and limited opportunities for student field trips and teacher professional development, it is especially difficult to expose elementary and middle school students to the latest STEM information-particularly in the space sciences. Using our library as a facilitator and catalyst, we built a volunteer-based, multi-faceted, curriculum-linked program for students and teachers in local middle schools (Grade 8) and showcased new astronomical and planetary science information using mainly NASA resources and volunteer effort. The project began with the idea of bringing free NASA photo exhibits (FETTU) to the Lafayette and Antioch Libraries for public display. Subsequently, the effort expanded by adding layers of activities that brought space and science information to teachers, students and the public at 5 libraries and schools in the 2 cities, one of which serves a diverse, underserved community. Overall, the effort (supported by a pilot grant from the Bechtel Foundation) included school and library based teacher workshops with resource materials; travelling space museum visits with hands-on activities (Chabot-to-Go); separate PowerPoint presentations for students and adults at the library; and concurrent ancillary space-related themes for young children's programs at the library. This pilot project, based largely on the use of free government resources and online materials, demonstrated that volunteer-based, standards-linked STEM efforts can enhance curriculum at the middle school, with libraries serving a special role. Using this model, we subsequently also obtained a small NASA-Space Grant award to bring star parties and hands-on science activities to three libraries this Fall, linking with numerous Grade 5 teachers and students in two additional underserved areas of our county. It's not necessary to reinvent the wheel, you just collect the pieces and build on what you already have.

  8. On the Performance Analysis of Free-Space Optical Links under Generalized Turbulence and Misalignment Models

    KAUST Repository

    AlQuwaiee, Hessa

    2016-11-01

    One of the potential solutions to the radio frequency (RF) spectrum scarcity problem is optical wireless communications (OWC), which utilizes the unlicensed optical spectrum. Long-range outdoor OWC is usually referred to in the literature as free-space optical (FSO) communications. Unlike RF systems, FSO is immune to interference and multi-path fading. Also, the deployment of FSO systems is flexible and much faster than that of optical fiber. These attractive features make FSO applicable for broadband wireless transmission such as optical fiber backup, metropolitan area networks, and last-mile access. Although FSO communication is a promising technology, it is negatively affected by two physical phenomena, namely scintillation due to atmospheric turbulence and pointing errors. These two critical issues have prompted intensive research over the last decade. To quantify the effect of these two factors on FSO system performance, we need effective mathematical models. In this work, we propose and study a generalized pointing error model based on the Beckmann distribution. We then aim to generalize the FSO channel model to span all turbulence conditions from weak to strong while taking pointing errors into consideration. Since scintillation in FSO is analogous to the fading phenomena in RF, diversity has also been proposed to overcome the effect of irradiance fluctuations. Thus, several combining techniques for not-necessarily-independent dual-branch free-space optical links were investigated over both weak and strong turbulence channels in the presence of pointing errors. On another front, improving the performance, enhancing the capacity and reducing the delay of the communication link has been the motivation behind many newly developed schemes, especially for backhauling. Recently, there has been growing interest in practical systems that integrate RF and FSO technologies to solve the last-mile bottleneck. As such, we also study in this thesis an asymmetric RF-FSO dual-hop relay system.
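
    The Beckmann pointing-error model treats the radial displacement as the length of a 2D Gaussian vector with unequal means and variances; a Monte Carlo sketch of the resulting geometric loss for a Gaussian beam footprint (all offsets, jitters and the beam radius invented) is:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 200_000

        # Beckmann-distributed radial pointing error: the displacement is the
        # length of a 2D Gaussian vector with unequal means and variances.
        x = rng.normal(0.1, 0.3, n)     # horizontal offset + jitter, m
        y = rng.normal(0.2, 0.5, n)     # vertical offset + jitter, m
        r = np.hypot(x, y)

        # Far-field Gaussian-beam approximation of the collected-power fraction.
        w = 2.5                         # beam footprint radius at the receiver, m
        loss = np.exp(-2.0 * r**2 / w**2)
        print(f"mean pointing loss factor ~ {loss.mean():.3f}")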

  9. Introducing technology learning for energy technologies in a national CGE model through soft links to global and national energy models

    International Nuclear Information System (INIS)

    Martinsen, Thomas

    2011-01-01

    This paper describes a method to model the influence of global policy scenarios, particularly spillover of technology learning, on the energy service demand of the non-energy sectors of the national economy, exemplified by Norway. Spillover is obtained from the technology-rich global Energy Technology Perspective model operated by the International Energy Agency. It is provided to a national hybrid model in which a national bottom-up Markal model carries spillover forward into a national top-down CGE model at a disaggregated demand category level. Spillover of technology learning from the global energy technology market will reduce national generation costs of energy carriers. This may in turn increase demand in the non-energy sectors of the economy because of the rebound effect. The influence of spillover on the Norwegian economy is most pronounced for the production level of industrial chemicals and for the demand for electricity for residential energy services. The influence is modest, however, because all existing electricity generating capacity is hydroelectric and thus compatible with the low-emission policy scenario. In countries where most of the existing generating capacity must be replaced by nascent energy technologies or carbon capture and storage, the influence on demand is expected to be more significant. - Highlights: → Spillover of global technology learning may be forwarded into a macroeconomic model. → The national electricity price differs significantly between the different global scenarios. → Soft-linking global and national models facilitates transparency in the technology learning effect chain.

  10. Modeling photosynthesis of discontinuous plant canopies by linking the Geometric Optical Radiative Transfer model with biochemical processes

    Directory of Open Access Journals (Sweden)

    Q. Xin

    2015-06-01

    Full Text Available Modeling vegetation photosynthesis is essential for understanding carbon exchanges between terrestrial ecosystems and the atmosphere. The radiative transfer process within plant canopies is one of the key drivers that regulate canopy photosynthesis. Most vegetation cover consists of discrete plant crowns, whose physical structure departs from the underlying assumption of a homogeneous and uniform medium in classic radiative transfer theory. Here we advance the Geometric Optical Radiative Transfer (GORT) model to simulate photosynthetic activity for discontinuous plant canopies. We separate radiation absorption into two components absorbed by sunlit and shaded leaves, and derive analytical solutions by integrating over the canopy layer. To model leaf-level and canopy-level photosynthesis, leaf light absorption is then linked to the biochemical process of gas diffusion through leaf stomata. The canopy gap probability derived from GORT differs from classic radiative transfer theory, especially when the leaf area index is high, due to leaf clumping effects. Tree characteristics such as tree density, crown shape, and canopy length affect leaf clumping and regulate radiation interception. Modeled gross primary production (GPP) for two deciduous forest stands could explain more than 80% of the variance of flux tower measurements at both near-hourly and daily timescales. We demonstrate that ambient CO2 concentrations influence daytime vegetation photosynthesis, which needs to be considered in biogeochemical models. The proposed model is complementary to classic radiative transfer theory and shows promise in modeling the radiative transfer process and photosynthetic activities over discontinuous forest canopies.
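
    GORT's clumping-adjusted gap probability is not reproduced here, but the classic turbid-medium baseline it is compared against can be sketched with the standard Beer's-law sunlit/shaded partition (extinction coefficient and LAI values illustrative):

        import numpy as np

        def two_leaf_partition(lai, k_beam=0.5):
            """Classic turbid-medium partition (no clumping): beam gap fraction
            and sunlit/shaded leaf area under Beer's law. GORT modifies the gap
            probability for discrete crowns, which this baseline ignores."""
            gap = np.exp(-k_beam * lai)         # downward beam gap probability
            lai_sun = (1.0 - gap) / k_beam      # integrated sunlit leaf area
            return gap, lai_sun, lai - lai_sun

        for lai in (1.0, 3.0, 6.0):
            gap, sun, shade = two_leaf_partition(lai)
            print(f"LAI {lai}: gap {gap:.2f}, sunlit {sun:.2f}, shaded {shade:.2f}")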

  11. Invasive growth of Saccharomyces cerevisiae depends on environmental triggers: a quantitative model.

    Science.gov (United States)

    Zupan, Jure; Raspor, Peter

    2010-04-01

    In this contribution, the influence of various physicochemical factors on the invasive growth of Saccharomyces cerevisiae is examined quantitatively. Agar-invasion assays are generally applied for in vitro studies of S. cerevisiae invasiveness, a phenomenon regarded as a putative virulence trait in this yeast of growing clinical concern. However, the qualitative agar-invasion assays used until now strongly limit the feasibility and interpretation of analyses and therefore needed to be improved. Besides, knowledge of the physiology of invasive growth under stress conditions related to the human alimentary tract and food is poor and should be expanded. For this purpose, the quantitative agar-invasion assay presented in our previous work was applied in this contribution to clarify in greater detail the significance of the stress factors controlling the adhesion and invasion of the yeast. Ten virulent and non-virulent S. cerevisiae strains were assayed at various temperatures, pH values, nutrient starvation levels, modified atmospheres, and different concentrations of NaCl, CaCl2 and preservatives. With the use of specific parameters, such as relative invasion, eight invasive growth models were hypothesized, enabling intelligible interpretation of the results. A strong preference for invasive growth (meaning high relative invasion) was observed when the strains were grown on nitrogen- and glucose-depleted media. A significant increase in invasion was also determined at temperatures typical of human fever (37-39 °C). On the other hand, a strong repressive effect on invasion was found in the presence of salts, anoxia and some preservatives. Copyright 2010 John Wiley & Sons, Ltd.

  12. Towards Linking 3D SAR and Lidar Models with a Spatially Explicit Individual Based Forest Model

    Science.gov (United States)

    Osmanoglu, B.; Ranson, J.; Sun, G.; Armstrong, A. H.; Fischer, R.; Huth, A.

    2017-12-01

    In this study, we present a parameterization of the FORMIND individual-based gap model (IBGM) for old-growth Atlantic lowland rainforest in La Selva, Costa Rica, for the purpose of informing multisensor remote sensing techniques for aboveground biomass estimation. The model was successfully parameterized and calibrated for the study site; results show that the simulated forest reproduces the structural complexity of Costa Rican rainforest, based on comparisons with CARBONO inventory plot data. Though the simulated stem numbers (378) slightly underestimated the plot data (418), particularly for canopy-dominant intermediate shade-tolerant trees and shade-tolerant understory trees, overall there was a 9.7% difference. Aboveground biomass (kg/ha) showed a 0.1% difference between the simulated forest and the inventory plot dataset. The Costa Rica FORMIND simulation was then used to parameterize spatially explicit (3D) SAR and lidar backscatter models. The simulated forest stands were used to generate a look-up table (LUT) as a tractable means to estimate aboveground forest biomass for these complex forests. Various combinations of lidar and radar variables were evaluated in the LUT inversion. To test the capability of future data for estimation of forest height and biomass, we considered 1) L- (or P-) band polarimetric data (backscattering coefficients of HH, HV and VV); 2) L-band dual-pol repeat-pass InSAR data (HH/HV backscattering coefficients and coherences, height of scattering phase center at HH and HV using a DEM or surface height from lidar data as reference); 3) P-band polarimetric InSAR data (canopy height from inversion of PolInSAR data, or the coherences and height of scattering phase center at HH, HV and VV); 4) various height indices from waveform lidar data; and 5) surface and canopy-top height from photon-counting lidar data. The methods for parameterizing the remote sensing models with the IBGM and developing look-up tables are discussed.
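
    A minimal sketch of a look-up-table inversion of the kind described, assuming an invented table of simulated stands with two placeholder observables; the estimate is the mean biomass of the k nearest table entries.

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical LUT built from simulated stands: biomass plus two
        # placeholder observables (a radar backscatter and a lidar height metric).
        # In practice the observables would be standardised before matching.
        biomass = rng.uniform(50.0, 400.0, 500)                 # t/ha
        table = np.column_stack([
            0.002 * biomass + rng.normal(0, 0.05, 500),         # backscatter proxy
            0.1 * np.sqrt(biomass) + rng.normal(0, 0.2, 500),   # height-metric proxy
        ])

        def invert(measurement, k=15):
            """Mean biomass of the k LUT entries nearest the measurement."""
            d = np.linalg.norm(table - measurement, axis=1)
            return biomass[np.argsort(d)[:k]].mean()

        print("estimated biomass: %.0f t/ha" % invert(np.array([0.45, 1.5])))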

  13. Putting the five-factor model into context: evidence linking big five traits to narrative identity.

    Science.gov (United States)

    Raggatt, Peter

    2006-10-01

    The study examined relationships between the Big Five personality traits and thematic content extracted from self-reports of life history data. One hundred and five "mature age" university students (M=30.1 years) completed the NEO PI-R trait measure, and the Personality Web Protocol. The protocol examines constituents of identity by asking participants to describe 24 key "attachments" from their life histories (significant events, people, places, objects, and possessions). Participants sorted these attachments into clusters and provided a self-descriptive label for each cluster (e.g., "adventurous self"). It was predicted that the thematic content of these cluster labels would be systematically related to Big Five trait scores (e.g., that labels referring to strength or positive emotions would be linked to Extraversion). The hypothesized links were obtained for each of the Big Five trait domains except Conscientiousness. Results are discussed with a view to broadening our understanding of the Five-Factor Model in relation to units of personality other than traits.

  14. PDF Estimation and Liquid Water Content Based Attenuation Modeling for Fog in Terrestrial FSO Links

    Directory of Open Access Journals (Sweden)

    S. S. Muhammad

    2010-06-01

    Full Text Available Terrestrial free-space optical (FSO) communication links have yet to achieve mass-market success due to the ever elusive 99.999% availability requirement. Terrestrial FSO links are heavily affected by atmospheric fog. To design systems that can achieve high availability and reliability in the presence of fog, more accurate models of fog attenuation need to be developed. The current article puts forth appropriate probability density function estimates for received signal strength (hereafter RSS) under fog conditions, where variations in the RSS during foggy events have been statistically characterized. Moreover, from surface observations of fog density, the liquid water content (hereafter LWC) of fog is estimated. The actual measured optical attenuations are then compared with the optical attenuations estimated from LWC. The results presented suggest that the fog density measurements carried out are an accurate representation of fog intensity and that the attenuation predictions obtained from the LWC estimate match the actual measured optical attenuations. This suggests that LWC is a useful parameter, besides visibility range, for predicting optical attenuation in the presence of hydrometeors.

  15. Estimating the Burden of Medically Attended Norovirus Gastroenteritis: Modeling Linked Primary Care and Hospitalization Datasets.

    Science.gov (United States)

    Verstraeten, Thomas; Cattaert, Tom; Harris, John; Lopman, Ben; Tam, Clarence C; Ferreira, Germano

    2017-11-15

    Norovirus is the leading cause of community-acquired and nosocomial acute gastroenteritis. Routine testing for norovirus is seldom undertaken, and diagnosis is mainly based on presenting symptoms. This makes understanding the burden of medically attended norovirus-attributable gastroenteritis (MA-NGE) and targeting care and prevention strategies challenging. We used linked population-based healthcare datasets (Clinical Practice Research Datalink General Practice OnLine Database linked with Hospital Episode Statistics Admitted Patient Care) to model the incidence of MA-NGE associated with primary care consultations or hospitalizations by age group in England in the period July 2007-June 2013. Mean annual incidence rates of MA-NGE were 4.9/1000 person-years and 0.7/1000 person-years for episodes involving primary care or hospitalizations, respectively. Incidence rates were highest in children aged <5 years; gastroenteritis hospitalization rates were second highest in adults aged >65 years (1.7/1000 person-years). In this study, the burden of MA-NGE estimated from healthcare datasets was higher than previously estimated in small cohort studies in England. Routinely collected primary care and hospitalization datasets are useful resources for estimating and monitoring the burden of MA-NGE in a population over time. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  16. Modeling and characterization of VCSEL-based avionics full-duplex ethernet (AFDX) gigabit links

    Science.gov (United States)

    Ly, Khadijetou S.; Rissons, A.; Gambardella, E.; Bajon, D.; Mollier, J.-C.

    2008-02-01

    The low cost and intrinsic performance of 850 nm vertical cavity surface emitting lasers (VCSELs), compared to light-emitting diodes, make them very attractive for high-speed, short-distance data communication links over optical fiber. Weight-saving and electromagnetic interference withstanding requirements have led to the need for a reliable solution to upgrade existing avionics high-speed buses (e.g., AFDX) to 1 Gbps over 100 m. To predict and optimize the performance of the link, the physical behavior of the VCSEL must be well understood. First, a theoretical study is performed through the rate equations adapted to VCSELs under large-signal modulation. Averaged turn-on delays and oscillation effects are computed analytically and analyzed for different values of the on- and off-state currents. These affect the eye pattern, timing jitter and bit error rate (BER) of the signal, which must remain within IEEE 802.3 standard limits. In particular, the off-state current is minimized below threshold to allow the highest possible extinction ratio. At this level, spontaneous emission dominates and leads to significant turn-on delay, turn-on jitter and bit-pattern effects. Also, the transverse multimode behavior of VCSELs, caused by spatial hole burning, leads to some dispersion in the fiber and degradation of the BER. A VCSEL-to-multimode-fiber coupling model is provided for the prediction and optimization of modal dispersion. Lastly, turn-on delay measurements are performed on a real mock-up and the results are compared with calculations.
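
    The turn-on-delay behaviour described here can be reproduced qualitatively with textbook single-mode laser rate equations; the sketch below steps the drive current from two different off-state biases and reports the delay to half the steady-state photon number. Parameter values are generic illustrations, not a fitted 850 nm VCSEL.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Generic single-mode laser rate equations. N: carriers, S: photons.
        q = 1.602e-19
        tau_n, tau_p = 2e-9, 2e-12    # carrier / photon lifetimes, s
        g0, N_tr = 1e5, 1e7           # gain per carrier (1/s), transparency number
        beta = 1e-4                   # spontaneous-emission coupling factor

        def rates(t, y, current):
            N, S = y
            G = g0 * (N - N_tr)       # modal gain, 1/s
            return [current / q - N / tau_n - G * S,
                    G * S - S / tau_p + beta * N / tau_n]

        t_eval = np.linspace(0.0, 5e-9, 2001)
        for i_off in (0.1e-3, 0.5e-3):    # off-state bias, below the ~1.2 mA threshold
            off = solve_ivp(rates, (0, 20e-9), [0.0, 0.0], args=(i_off,), method="LSODA")
            on = solve_ivp(rates, (0, 5e-9), off.y[:, -1], args=(2e-3,),
                           method="LSODA", t_eval=t_eval)
            S = on.y[1]
            t_on = t_eval[np.argmax(S >= 0.5 * S[-1])]
            print(f"off-state {i_off * 1e3:.1f} mA -> turn-on delay ~ {t_on * 1e9:.2f} ns")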

  17. Mathematical modeling of cross-linking monomer elution from resin-based dental composites.

    Science.gov (United States)

    Manojlovic, Dragica; Radisic, Marina; Lausevic, Mila; Zivkovic, Slavoljub; Miletic, Vesna

    2013-01-01

    Elution of potentially toxic substances, including monomers, from resin-based dental composites may affect the biocompatibility of these materials in clinical conditions. In addition to quantifying the amounts of eluted monomers, mathematical modeling of elution kinetics reveals composite restorations as potential chronic sources of leachable monomers. The aim of this work was to experimentally quantify the elution of the main cross-linking monomers from four commercial composites and to offer a mathematical model of the elution kinetics. Composite samples (n = 7 per group) of Filtek Supreme XT (3M ESPE), Tetric EvoCeram (Ivoclar Vivadent), Admira (Voco), and Filtek Z250 (3M ESPE) were prepared in 2-mm thick Teflon moulds and cured with a halogen or light-emitting diode light-curing unit (LCU). Monomer elution in ethanol and water was analyzed using high-performance liquid chromatography for up to 28 days post-immersion. The mathematical model was expressed as a sum of two exponential regression functions representing the first-order kinetics law. Elution kinetics in all cases followed the same mathematical model, though differences in the rate constants as well as the extent of monomer elution were material-, LCU- and medium-dependent. The proposed mechanisms of elution indicate fast elution from surface and subsurface layers and up to 100 times slower monomer extraction from the bulk polymer. Copyright © 2012 Wiley Periodicals, Inc.
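
    A minimal sketch of fitting the two-exponential first-order model to cumulative elution data; the time points, amounts and rate constants below are invented, not the paper's measurements.

        import numpy as np
        from scipy.optimize import curve_fit

        # Cumulative elution as a sum of two first-order terms: a fast
        # surface/subsurface pool and a slow bulk pool (values invented).
        def elution(t, m1, k1, m2, k2):
            return m1 * (1 - np.exp(-k1 * t)) + m2 * (1 - np.exp(-k2 * t))

        t_days = np.array([0.25, 1.0, 3.0, 7.0, 14.0, 21.0, 28.0])
        data = elution(t_days, 8.0, 1.5, 4.0, 0.05)
        data += np.random.default_rng(5).normal(0.0, 0.1, t_days.size)

        p, _ = curve_fit(elution, t_days, data, p0=[5, 1, 5, 0.1], maxfev=10000)
        print("fast pool: %.1f ug, k1 = %.2f /day; slow pool: %.1f ug, k2 = %.3f /day"
              % tuple(p))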

  18. A Linked Simulation-Optimization (LSO) Model for Conjunctive Irrigation Management using Clonal Selection Algorithm

    Science.gov (United States)

    Islam, Sirajul; Talukdar, Bipul

    2016-09-01

    A Linked Simulation-Optimization (LSO) model based on a Clonal Selection Algorithm (CSA) was formulated for application in conjunctive irrigation management. A series of measures were taken to reduce the computational burden associated with the LSO approach. Certain modifications were made to the formulated CSA so as to decrease the number of function evaluations. In addition, a simple problem-specific code for a two-dimensional groundwater flow simulation model was developed. The flow model was further simplified by a novel area-reduction approach in order to save computational time in simulation. The LSO model was applied to the irrigation command of the Pagladiya Dam Project in Assam, India. To evaluate the performance of the CSA, a Genetic Algorithm (GA) was used as a comparison base. The results from the CSA compared well with those from the GA. In fact, owing to these modifications, the CSA consumed less computational time than the GA while converging to the optimal solution.

  19. Variable speed limit strategies analysis with link transmission model on urban expressway

    Science.gov (United States)

    Li, Shubin; Cao, Danni

    2018-02-01

    The variable speed limit (VSL) is a form of active traffic management. Most VSL strategies are applied to intercity expressway traffic flow control in order to ensure traffic safety. The urban expressway system, however, is a city's main artery and carries most of the traffic load, while exhibiting traffic characteristics similar to those of intercity expressways. In this paper, an improved link transmission model (LTM) combined with VSL strategies is proposed for urban expressway networks. The model can simulate the movement of vehicles and of shock waves, and strikes a good balance between computational cost and accuracy. Furthermore, the optimal VSL strategy can be identified with the simulation method, providing practical guidance for traffic managers. Finally, a simple example is given to illustrate the model and method, using the average density, average speed and average flow on the network as simulation indexes. The simulation results show that the proposed model and method are feasible: the VSL strategy can effectively alleviate traffic congestion in some cases and greatly improve the efficiency of the transportation system.
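
    The core LTM bookkeeping is compact enough to sketch. The fragment below is a minimal single-link sketch (not the authors' code; the uniform time step and variable names are my assumptions) that propagates cumulative vehicle counts and computes the sending and receiving flows from which densities, speeds and shock waves follow; a VSL strategy would act by lowering the free-flow speed vf and capacity qmax of the controlled links:

      def ltm_flows(N_up, N_down, t, dt, L, vf, w, kj, qmax):
          """Sending/receiving flows of one link over [t, t+dt].

          N_up / N_down: cumulative vehicle counts at the link entrance /
          exit, indexed by time step; L: link length; vf: free-flow speed;
          w: backward (congestion) wave speed; kj: jam density;
          qmax: capacity. Assumes t is large enough that the delayed
          indices are non-negative.
          """
          tau_f = int(round(L / vf / dt))  # free-flow travel time, in steps
          tau_w = int(round(L / w / dt))   # backward-wave travel time, in steps
          # Vehicles that can reach the downstream node, capped by capacity:
          sending = min(N_up[t + 1 - tau_f] - N_down[t], qmax * dt)
          # Space released by the backward wave, capped by capacity:
          receiving = min(N_down[t + 1 - tau_w] + kj * L - N_up[t], qmax * dt)
          return sending, receiving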

  20. Variable pore connectivity model linking gas diffusivity and air-phase tortuosity to soil matric potential

    DEFF Research Database (Denmark)

    Chamindu, Deepagoda; Møldrup, Per; Schjønning, Per

    2012-01-01

    The model introduces a variable pore connectivity factor, X, as a function of soil matric potential, expressed as pF (=log|−ψ|), for pF values ranging from 1.0 to 3.5. The new model takes the form X = X*(F/F*)^A with F = 1 + pF⁻¹, where X* is the pore network tortuosity at the reference F (F*) and A is a model parameter that accounts for water blockage. The X–pF relation can be linked to drained pore size to explain the lower probability of the larger but far fewer air-filled pores at lower pF effectively interconnecting and promoting gas diffusion. The model with X* = 2 and A = 0.5 proved promising for generalizing Dp/Do in inter- and intraaggregate pore regions of aggregated soils. We further suggest that the new model with parameter values of X* = 1.7 and A = 0 may be used for upper-limit Dp/Do predictions in risk assessments of, e.g., fluxes of toxic volatile organics from soil to indoor air at polluted soil sites.

  1. MARKAL-MACRO: A linked model for energy-economy analysis

    International Nuclear Information System (INIS)

    Manne, A.S.; Wene, C.O.

    1992-02-01

    MARKAL-MACRO is an experiment in model linkage for energy and economy analysis. This new tool is intended as an improvement over existing methods for energy strategy assessment. It is designed specifically for estimating the costs and analyzing the technologies proposed for reducing environmental risks such as global climate change or regional air pollution. The greenhouse gas debate illustrates the usefulness of linked energy-economy models. A central issue is the coupling between economic growth, the level of energy demands, and the development of an energy system to supply these demands. The debate is often connected with alternative modeling approaches; the competing philosophies may be labeled ''top-down macroeconomic'' and ''bottom-up engineering'' perspectives. MARKAL is a systems engineering (physical process) analysis built on the concept of a Reference Energy System (RES) and solved by means of dynamic linear programming. In most applications, the end-use demands are fixed, and an economically efficient solution is obtained by minimizing the present value of the energy system's costs over the planning horizon. MACRO is a macroeconomic model with an aggregated view of long-term economic growth; its basic input factors of production are capital, labor and individual forms of energy. MACRO is solved by nonlinear optimization.
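
    As a toy illustration of the MARKAL half of the linkage (all numbers are invented; real MARKAL instances have thousands of technology variables and multi-period discounting), the following sketch minimizes system cost subject to a fixed end-use demand:

      from scipy.optimize import linprog

      # Two supply technologies meeting a fixed demand of 100 energy units.
      cost = [30.0, 55.0]            # present-value cost per unit supplied
      A_eq = [[1.0, 1.0]]            # x1 + x2 == demand (fixed end-use demand)
      b_eq = [100.0]
      bounds = [(0, 80), (0, None)]  # capacity limit on the cheaper technology

      res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print(res.x, res.fun)          # -> [80. 20.] 3500.0

    Linking to MACRO effectively replaces the fixed demand with demand levels chosen endogenously by the growth model, which is what turns the combined problem into a nonlinear optimization.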

  2. A Behavioral Genetic Model of the Mechanisms Underlying the Link Between Obesity and Symptoms of ADHD.

    Science.gov (United States)

    Patte, Karen A; Davis, Caroline A; Levitan, Robert D; Kaplan, Allan S; Carter-Major, Jacqueline; Kennedy, James L

    2016-01-21

    The ADHD-obesity link has been suggested to result from a shared underlying basis of suboptimal dopamine (DA); however, this theory conflicts with evidence that an amplified DA signal increases the risk for overeating and weight gain. A model was tested in which ADHD symptoms, predicted by hypodopaminergic functioning in the prefrontal cortex, in combination with an enhanced appetitive drive, predict hedonic eating and, in turn, higher body mass index (BMI). DRD2 and DRD4 markers were genotyped. The model was tested using structural equation modeling in a nonclinical sample (N = 421 adults). The model was a good fit to the data. Controlling for education, all parameter estimates were significant except for the DRD4-ADHD symptom pathway. The significant indirect effect indicates that overeating mediated the ADHD symptoms-BMI association. Results support the hypothesis that overeating and elevated DA in the ventral striatum, representative of a greater reward response, contribute to the ADHD symptom-obesity relationship. © The Author(s) 2016.

  3. A review of the evidence linking adult attachment theory and chronic pain: presenting a conceptual model.

    Science.gov (United States)

    Meredith, Pamela; Ownsworth, Tamara; Strong, Jenny

    2008-03-01

    It is now well established that pain is a multidimensional phenomenon, affected by a gamut of psychosocial and biological variables. According to diathesis-stress models of chronic pain, some individuals are more vulnerable to developing disability following acute pain because they possess particular psychosocial vulnerabilities which interact with physical pathology to impact negatively upon outcome. Attachment theory, a theory of social and personality development, has been proposed as a comprehensive developmental model of pain, implicating individual adult attachment pattern in the ontogenesis and maintenance of chronic pain. The present paper reviews and critically appraises studies which link adult attachment theory with chronic pain. Together, these papers offer support for the role of insecure attachment as a diathesis (or vulnerability) for problematic adjustment to pain. The Attachment-Diathesis Model of Chronic Pain developed from this body of literature, combines adult attachment theory with the diathesis-stress approach to chronic pain. The evidence presented in this review, and the associated model, advances our understanding of the developmental origins of chronic pain conditions, with potential application in guiding early pain intervention and prevention efforts, as well as tailoring interventions to suit specific patient needs.

  4. Lower-order effects adjustment in quantitative traits model-based multifactor dimensionality reduction.

    Science.gov (United States)

    Mahachie John, Jestinah M; Cattaert, Tom; Lishout, François Van; Gusareva, Elena S; Steen, Kristel Van

    2012-01-01

    Identifying gene-gene interactions or gene-environment interactions in studies of human complex diseases remains a big challenge in genetic epidemiology. An additional challenge, often forgotten, is to account for important lower-order genetic effects. These may hamper the identification of genuine epistasis. If lower-order genetic effects contribute to the genetic variance of a trait, identified statistical interactions may simply be due to a signal boost of these effects. In this study, we restrict attention to quantitative traits and bi-allelic SNPs as genetic markers. Moreover, our interaction study focuses on 2-way SNP-SNP interactions. Via simulations, we assess the performance of different corrective measures for lower-order genetic effects in Model-Based Multifactor Dimensionality Reduction epistasis detection, using additive and co-dominant coding schemes. Performance is evaluated in terms of power and familywise error rate. Our simulations indicate that empirical power estimates are reduced when lower-order effects are corrected for, as are familywise error rates. Easy-to-use automatic SNP selection procedures, SNP selection based on "top" findings, or SNP selection based on a p-value criterion for interesting main effects result in reduced power but also almost zero false positive rates. Always accounting for main effects in the SNP-SNP pair under investigation during Model-Based Multifactor Dimensionality Reduction analysis adequately controls false positive epistasis findings. This is particularly true when adopting a co-dominant corrective coding scheme. In conclusion, automatic search procedures to identify lower-order effects to correct for during epistasis screening should be avoided. The same is true for procedures that adjust for lower-order effects prior to Model-Based Multifactor Dimensionality Reduction and involve using residuals as the new trait. We advocate using "on-the-fly" lower-order effects adjusting when screening for SNP-SNP interactions.

  5. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk: the individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree, and consequence estimation models. The event tree contains seven intermediate events: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Because the estimated probability of an intermediate event may carry large uncertainty, it is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is lowered by 20%, and a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down traffic is more effective than reducing ERT for casualty risk mitigation. 2010 Elsevier Ltd. All rights reserved.
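
    The risk calculus can be made concrete with a small Monte Carlo sketch (a toy with invented distributions, not the paper's fitted values), in which an uncertain event-tree branch probability is itself treated as a random variable, as in the QRA model:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Uncertain crash frequency (crashes/year) and an uncertain
      # event-tree branch probability P(fatal | crash).
      crash_freq = rng.lognormal(mean=np.log(12.0), sigma=0.3, size=n)
      p_fatal = rng.beta(2, 98, size=n)

      individual_fatality_risk = crash_freq * p_fatal
      print(individual_fatality_risk.mean(),
            np.percentile(individual_fatality_risk, 95))

    A societal-risk (F-N) curve follows by additionally sampling the number of casualties per scenario and plotting exceedance frequency against N.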

  6. Rock physics models for constraining quantitative interpretation of ultrasonic data for biofilm growth and development

    Science.gov (United States)

    Alhadhrami, Fathiya Mohammed

    This study examines the use of rock physics modeling for quantitative interpretation of seismic data in the context of microbial growth and biofilm formation in unconsolidated sediment. The impetus for this research comes from geophysical experiments by Davis et al. (2010) and Kwon and Ajo-Franklin (2012). These studies observed that microbial growth has a small effect on P-wave velocities (VP) but a large effect on seismic amplitudes. Davis et al. (2010) and Kwon and Ajo-Franklin (2012) speculated that the amplitude variations were due to a combination of rock mechanical changes from accumulation of microbial growth-related features such as biofilms. A more definite conclusion can be drawn by developing rock physics models that connect rock properties to seismic amplitudes. The primary objective of this work is to provide an explanation for the high amplitude attenuation due to biofilm growth. The results suggest that biofilm formation in the Davis et al. (2010) experiment exhibits two growth styles: a load-bearing style, in which the biofilm behaves like an additional mineral grain, and a non-load-bearing mode, in which the biofilm grows into the pore spaces. In the load-bearing mode, the biofilms contribute to the stiffness of the sediments; we refer to this style as "filler." In the non-load-bearing mode, the biofilms contribute only to a change in the density of the sediments without affecting their strength; we refer to this style as "mushroom." Both growth styles appear to change permeability more than the moduli or the density. As a result, while the VP velocity remains relatively unchanged, the amplitudes can change significantly depending on biofilm saturation. Interpreting seismic data from biofilm growth in terms of rock physics models provides greater insight into the sediment-fluid interaction. The models in turn can be used to understand microbially enhanced oil recovery and to assist in solving environmental issues such as creating bio…

  7. Inference of quantitative models of bacterial promoters from time-series reporter gene data.

    Science.gov (United States)

    Stefan, Diana; Pinel, Corinne; Pinhal, Stéphane; Cinquemani, Eugenio; Geiselmann, Johannes; de Jong, Hidde

    2015-01-01

    The inference of regulatory interactions and quantitative models of gene regulation from time-series transcriptomics data has been extensively studied and applied to a range of problems in drug discovery, cancer research, and biotechnology. The application of existing methods is commonly based on implicit assumptions about the biological processes under study. First, the measurements of mRNA abundance obtained in transcriptomics experiments are taken to be representative of protein concentrations. Second, the observed changes in gene expression are assumed to be solely due to transcription factors and other specific regulators, while changes in the activity of the gene expression machinery and other global physiological effects are neglected. While convenient in practice, these assumptions are often not valid and bias the reverse engineering process. Here we systematically investigate, using a combination of models and experiments, the importance of this bias and possible corrections. We measure in real time and in vivo the activity of genes involved in the FliA-FlgM module of the E. coli motility network. From these data, we estimate protein concentrations and global physiological effects by means of kinetic models of gene expression. Our results indicate that correcting for the bias of commonly made assumptions improves the quality of the models inferred from the data. Moreover, we show by simulation that these improvements are expected to be even stronger for systems in which protein concentrations have longer half-lives and the activity of the gene expression machinery varies more strongly across conditions than in the FliA-FlgM module. The approach proposed in this study is broadly applicable when using time-series transcriptome data to learn about the structure and dynamics of regulatory networks. In the case of the FliA-FlgM module, our results demonstrate the importance of global physiological effects and the active regulation of FliA and FlgM half-lives for…

  8. Generating quantitative models describing the sequence specificity of biological processes with the stabilized matrix method

    Directory of Open Access Journals (Sweden)

    Sette Alessandro

    2005-05-01

    Background: Many processes in molecular biology involve the recognition of short sequences of nucleic or amino acids, such as the binding of immunogenic peptides to major histocompatibility complex (MHC) molecules. From experimental data, a model of the sequence specificity of these processes can be constructed, such as a sequence motif, a scoring matrix or an artificial neural network. The purpose of these models is two-fold. First, they can provide a summary of experimental results, allowing for a deeper understanding of the mechanisms involved in sequence recognition. Second, such models can be used to predict the experimental outcome for yet untested sequences. In the past we reported the development of a method to generate such models called the Stabilized Matrix Method (SMM). This method has been successfully applied to predicting peptide binding to MHC molecules, peptide transport by the transporter associated with antigen presentation (TAP) and proteasomal cleavage of protein sequences. Results: Herein we report the implementation of the SMM algorithm as a publicly available software package. Specific features determining the type of problems the method is most appropriate for are discussed. Advantageous features of the package are: (1) the output generated is easy to interpret, (2) input and output are both quantitative, (3) specific computational strategies to handle experimental noise are built in, (4) the algorithm is designed to effectively handle bounded experimental data, (5) experimental data from randomized peptide libraries and conventional peptides can easily be combined, and (6) it is possible to incorporate pair interactions between positions of a sequence. Conclusion: Making the SMM method publicly available enables bioinformaticians and experimental biologists to easily access it, to compare its performance to other prediction methods, and to extend it to other applications.
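
    The scoring-matrix idea behind SMM is easy to state in code. Below is a minimal sketch of the prediction step only, with random placeholder weights; the actual SMM contribution is fitting such a matrix to quantitative binding data with a stabilizing regularization term, plus the optional pair-interaction terms listed under feature (6):

      import numpy as np

      AA = "ACDEFGHIKLMNPQRSTVWY"
      rng = np.random.default_rng(1)

      # Position-specific scoring matrix for 9-mer peptides
      # (placeholder values; a fitted SMM matrix would go here).
      matrix = [{aa: rng.normal() for aa in AA} for _ in range(9)]

      def predict(peptide):
          """Predicted quantity (e.g. -log IC50) as a sum of
          independent per-position contributions."""
          return sum(matrix[i][aa] for i, aa in enumerate(peptide))

      print(predict("SIINFEKLV"))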

  9. How plants manage food reserves at night: quantitative models and open questions

    Directory of Open Access Journals (Sweden)

    Antonio eScialdone

    2015-03-01

    In order to cope with night-time darkness, plants during the day allocate part of their photosynthate for storage, often as starch. This stored reserve is then degraded at night to sustain metabolism and growth. However, night-time starch degradation must be tightly controlled, as over-rapid turnover results in premature depletion of starch before dawn, leading to starvation. Recent experiments in Arabidopsis have shown that starch degradation proceeds at a constant rate during the night and is set such that starch reserves are exhausted almost precisely at dawn. Intriguingly, this pattern is robust with the degradation rate being adjusted to compensate for unexpected changes in the time of darkness onset. While a fundamental role for the circadian clock is well established, the underlying mechanisms controlling starch degradation remain poorly characterized. Here, we discuss recent quantitative models that have been proposed to explain how plants can compute the appropriate starch degradation rate, a process that requires an effective arithmetic division calculation. We review experimental confirmation of the models, and describe aspects that require further investigation. Overall, the process of night-time starch degradation necessitates a fundamental metabolic role for the circadian clock and, more generally, highlights how cells process information in order to optimally manage their resources.

  10. How plants manage food reserves at night: quantitative models and open questions.

    Science.gov (United States)

    Scialdone, Antonio; Howard, Martin

    2015-01-01

    In order to cope with night-time darkness, plants during the day allocate part of their photosynthate for storage, often as starch. This stored reserve is then degraded at night to sustain metabolism and growth. However, night-time starch degradation must be tightly controlled, as over-rapid turnover results in premature depletion of starch before dawn, leading to starvation. Recent experiments in Arabidopsis have shown that starch degradation proceeds at a constant rate during the night and is set such that starch reserves are exhausted almost precisely at dawn. Intriguingly, this pattern is robust with the degradation rate being adjusted to compensate for unexpected changes in the time of darkness onset. While a fundamental role for the circadian clock is well-established, the underlying mechanisms controlling starch degradation remain poorly characterized. Here, we discuss recent quantitative models that have been proposed to explain how plants can compute the appropriate starch degradation rate, a process that requires an effective arithmetic division calculation. We review experimental confirmation of the models, and describe aspects that require further investigation. Overall, the process of night-time starch degradation necessitates a fundamental metabolic role for the circadian clock and, more generally, highlights how cells process information in order to optimally manage their resources.
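
    In schematic form (notation mine), the division computation both versions of this abstract describe is:

      r = \frac{S(t_0)}{T - t_0}, \qquad S(t) = S(t_0)\,\frac{T - t}{T - t_0}, \quad t_0 \le t \le T

    where S is the starch reserve, t_0 the onset of darkness and T the clock-anticipated time of dawn: degrading at the constant rate r exhausts the reserve exactly at T, and recomputing r after an unexpected shift in t_0 reproduces the observed compensation. The reviewed models implement this division chemically, with the degradation rate set by the ratio of a starch-tracking molecule to a molecule encoding the time remaining until dawn.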

  11. Currency risk and prices of oil and petroleum products: a simulation with a quantitative model

    International Nuclear Information System (INIS)

    Aniasi, L.; Ottavi, D.; Rubino, E.; Saracino, A.

    1992-01-01

    This paper analyzes the relationship between the exchange rates of the US Dollar against the four major European currencies and the prices of oil and its main products in those countries. The sensitivity of prices to exchange-rate movements is of fundamental importance for the refining and distribution industries of importing countries. The analysis shows that neither in free-market conditions, such as those present in Great Britain, France and Germany, nor in regulated markets, i.e. the Italian one, do variations of petroleum product prices fully absorb variations of the exchange rates. To assess the above relationship, we first tested the order of co-integration of the time series of exchange rates of EMS currencies with those of international prices of oil and its derivative products; we then used a transfer-function model to reproduce the quantitative relationships between those variables. Using these results, we reproduced domestic price functions with partial adjustment mechanisms. Finally, we used the above model to run a simulation of the deviation from the steady-state pattern caused by exogenous exchange-rate shocks. 21 refs., 5 figs., 3 tabs

  12. An ex vivo model to quantitatively analyze cell migration in tissue.

    Science.gov (United States)

    O'Leary, Conor J; Weston, Mikail; McDermott, Kieran W

    2018-01-01

    Within the developing central nervous system, the ability of cells to migrate throughout the tissue parenchyma to reach their target destination and undergo terminal differentiation is vital to normal central nervous system (CNS) development. To develop novel therapies to treat the injured CNS, it is essential that the migratory behavior of cell populations is understood. Many studies have examined the ability of individual neurons to migrate through the developing CNS, describing specific modes of migration including locomotion and somal translocation. Few studies have investigated the mass migration of large populations of neural progenitors, particularly in the developing spinal cord. Here, we describe a method to robustly analyze large numbers of migrating cells using a co-culture assay. The ex vivo tissue model promotes the survival and differentiation of co-cultured progenitor cells. Using this assay, we demonstrate that migrating neuroepithelial progenitor cells display region-specific migration patterns within the dorsal and ventral spinal cord at defined developmental time points. The technique described here is a viable ex vivo model to quantitatively analyze cell migration and differentiation, and we demonstrate the ability to detect changes in cell migration within distinct tissue regions across tissue samples. Developmental Dynamics 247:201-211, 2018. © 2017 Wiley Periodicals, Inc.

  13. Probabilistic Quantitative Precipitation Forecasting over East China using Bayesian Model Averaging

    Science.gov (United States)

    Yang, Ai; Yuan, Huiling

    2014-05-01

    The Bayesian model averaging (BMA) is a post-processing method that weights the predictive probability density functions (PDFs) of individual ensemble members. This study investigates the BMA method for calibrating quantitative precipitation forecasts (QPFs) from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) database. The QPFs over East Asia during summer (June-August) 2008-2011 are generated from six operational ensemble prediction systems (EPSs) (ECMWF, UKMO, NCEP, CMC, JMA and CMA) and from multi-center ensembles of their combinations. The satellite-based precipitation estimate product TRMM 3B42 V7 is used as the verification dataset. In the BMA post-processing for precipitation forecasts, the PDF matching method is first applied to bias-correct systematic errors in each forecast member, by adjusting the PDFs of forecasts to match the PDFs of observations. Next, a logistic regression and a two-parameter gamma distribution are used to fit the probability of rainfall occurrence and the precipitation distribution. Through these two steps, the BMA post-processing bias-corrects ensemble forecasts systematically. The 60-70% cumulative density function (CDF) predictions estimate moderate precipitation well compared to the raw ensemble mean, while the 90% upper boundary of the BMA CDF predictions can be set as a threshold for an extreme precipitation alarm. In general, the BMA method is more capable in multi-center ensemble post-processing, improving probabilistic QPFs (PQPFs) with better ensemble spread and reliability. KEYWORDS: Bayesian model averaging (BMA); post-processing; ensemble forecast; TIGGE
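
    The BMA predictive distribution referred to above has the generic mixture form (notation mine):

      p(y \mid f_1, \ldots, f_K) = \sum_{k=1}^{K} w_k\, g_k(y \mid f_k), \qquad w_k \ge 0, \quad \sum_{k=1}^{K} w_k = 1

    where f_k is member k's bias-corrected forecast and each g_k(y | f_k) couples the logistic-regression probability of zero rainfall with the two-parameter gamma density for positive amounts; the weights w_k are estimated on a training window, typically by maximum likelihood via the EM algorithm.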

  14. Overcoming pain thresholds with multilevel models-an example using quantitative sensory testing (QST) data.

    Science.gov (United States)

    Hirschfeld, Gerrit; Blankenburg, Markus R; Süß, Moritz; Zernikow, Boris

    2015-01-01

    The assessment of somatosensory function is a cornerstone of research and clinical practice in neurology. Recent initiatives have developed novel protocols for quantitative sensory testing (QST). Application of these methods led to intriguing findings, such as the presence of lower pain thresholds in healthy children compared to healthy adolescents. In this article, we (re-)introduce the basic concepts of signal detection theory (SDT) as a method to investigate such differences in somatosensory function in detail. SDT describes participants' responses according to two parameters, sensitivity and response bias. Sensitivity refers to individuals' ability to discriminate between painful and non-painful stimulations; response bias refers to individuals' criterion for giving a "painful" response. We describe how multilevel models can be used to estimate these parameters and to overcome central critiques of these methods. As an example, we apply these methods to data from the mechanical pain sensitivity test of the QST protocol. The results show that adolescents are more sensitive to mechanical pain and contradict the idea that younger children simply use more lenient criteria to report pain. Overall, we hope that the wider use of multilevel modeling to describe somatosensory functioning may advance neurology research and practice.
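
    For reference, the two parameters of the standard equal-variance Gaussian SDT model (standard theory, not specific to this paper's extension) are:

      d' = \Phi^{-1}(H) - \Phi^{-1}(F), \qquad c = -\tfrac{1}{2}\left[\Phi^{-1}(H) + \Phi^{-1}(F)\right]

    where H is the hit rate (painful stimuli rated painful), F the false-alarm rate (non-painful stimuli rated painful) and \Phi^{-1} the probit function. The multilevel version replaces these per-person point estimates with a probit regression whose intercept and slope carry person-level random effects, which stabilizes estimates for participants with few trials or extreme response rates.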

  15. DEVELOPMENT OF MODEL FOR QUANTITATIVE EVALUATION OF DYNAMICALLY STABLE FORMS OF RIVER CHANNELS

    Directory of Open Access Journals (Sweden)

    O. V. Zenkin

    2017-01-01

    … systems. The determination of regularities of development of bed forms and of quantitative relations between their parameters is based on modeling the "right" forms of the riverbed. The research has resulted in the establishment and testing of a simulation modeling methodology that allows one to identify dynamically stable forms of the riverbed.

  16. Linking sociological with physiological data: the model of effort-reward imbalance at work.

    Science.gov (United States)

    Siegrist, J; Klein, D; Voigt, K H

    1997-01-01

    While socio-epidemiologic studies have documented impressive associations of indicators of chronic psychosocial stress with cardiovascular (c.v.) disease, evidence on patho-physiologic processes is still limited. In this regard, the concept of heightened c.v. and hormonal reactivity (RE) to mental stress was proposed and explored. Whereas this concept is a static one, we suggest a more dynamic two-stage model of RE, in which recurrent high responsiveness (stage 1) in the long run results in attenuated, reduced maximal RE due to functional adaptation (stage 2). We present results of an indirect test of this hypothesis in a group of 68 healthy middle-aged men undergoing a modified Stroop Test: significantly reduced RE in heart rate, adrenaline and cortisol was found, after adjusting for relevant confounders, in men suffering from high chronic work stress in terms of effort-reward imbalance. In conclusion, the results underscore the potential of linking sociological with physiological data in stress research.

  17. Gene therapy studies in a canine model of X-linked severe combined immunodeficiency.

    Science.gov (United States)

    Felsburg, Peter J; De Ravin, Suk See; Malech, Harry L; Sorrentino, Brian P; Burtner, Christopher; Kiem, Hans-Peter

    2015-03-01

    Since the occurrence of T cell leukemias in the original human γ-retroviral gene therapy trials for X-linked severe combined immunodeficiency (XSCID), considerable effort has been devoted to developing safer vectors. This review summarizes gene therapy studies performed in a canine model of XSCID to evaluate the efficacy of γ-retroviral, lentiviral, and foamy viral vectors for treating XSCID and a novel method of vector delivery. These studies demonstrate that durable T cell reconstitution and thymopoiesis with no evidence of any serious adverse events and, in contrast to the human XSCID patients, sustained marking in myeloid cells and B cells with reconstitution of normal humoral immune function can be achieved for up to 5 years without any pretreatment conditioning. The presence of sustained levels of gene-marked T cells, B cells, and more importantly myeloid cells for almost 5 years is highly suggestive of transduction of either multipotent hematopoietic stem cells or very primitive committed progenitors.

  18. Energy-Aware Topology Evolution Model with Link and Node Deletion in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiaojuan Luo

    2012-01-01

    Based on complex network theory, a new topology evolution model is proposed. The evolution of sensor network topology takes the energy-aware mechanism into account, and the phenomena of link and node deletion in the network are discussed. Theoretical analysis and numerical simulation are conducted to explore the topology characteristics and network performance under different node energy distributions. We find that the node energy distribution has a weak effect on the degree distribution P(k), which evolves into a scale-free state; that nodes with more energy carry more connections; and that the degree correlation is non-trivially disassortative. Moreover, the results show that when node energy is more heterogeneous, the network is better clustered and enjoys higher performance in terms of network efficiency and average path length for transmitting data.
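
    A minimal growth step consistent with this description might look as follows (an illustrative kernel in which the attachment probability is proportional to degree times energy; the paper's exact rule, and its link- and node-deletion steps, are not reproduced here):

      import numpy as np

      rng = np.random.default_rng(2)

      def grow(n_final, m=2, energy_sigma=1.0):
          """Energy-aware preferential attachment: each new node links to
          m existing nodes with probability ~ degree_i * energy_i."""
          energy = rng.lognormal(0.0, energy_sigma, n_final)
          degree = np.zeros(n_final, dtype=int)
          edges = [(0, 1)]
          degree[[0, 1]] = 1
          for new in range(2, n_final):
              p = degree[:new] * energy[:new]
              targets = rng.choice(new, size=min(m, new), replace=False,
                                   p=p / p.sum())
              for t in targets:
                  edges.append((new, int(t)))
                  degree[[new, t]] += 1
          return edges, degree

      edges, degree = grow(1000)  # degree histogram approaches a power law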

  19. Assessing the potential of using telecommunication microwave links in urban drainage modelling.

    Science.gov (United States)

    Fencl, M; Rieckermann, J; Schleiss, M; Stránský, D; Bareš, V

    2013-01-01

    The ability to predict the runoff response of an urban catchment to rainfall is crucial for managing drainage systems effectively and controlling discharges from urban areas. In this paper we assess the potential of commercial microwave links (MWL) to capture the spatio-temporal rainfall dynamics and thus improve urban rainfall-runoff modelling. Specifically, we perform numerical experiments with virtual rainfall fields and compare the results of MWL rainfall reconstructions to those of rain gauge (RG) observations. In a case study, we are able to show that MWL networks in urban areas are sufficiently dense to provide good information on spatio-temporal rainfall variability and can thus considerably improve pipe flow prediction, even in small subcatchments. In addition, the better spatial coverage also improves the control of discharges from urban areas. This is especially beneficial for heavy rainfall, which usually has a high spatial variability that cannot be accurately captured by RG point measurements.

  20. Neonatal estradiol stimulation prevents epilepsy in Arx model of X-linked infantile spasms syndrome.

    Science.gov (United States)

    Olivetti, Pedro R; Maheshwari, Atul; Noebels, Jeffrey L

    2014-01-22

    Infantile spasms are a catastrophic form of pediatric epilepsy with inadequate treatment. In patients, mutation of ARX, a transcription factor selectively expressed in neuronal precursors and adult inhibitory interneurons, impairs cell migration and causes a major inherited subtype of the disease X-linked infantile spasms syndrome. Using an animal model, the Arx((GCG)10+7) mouse, we determined that brief estradiol (E2) administration during early postnatal development prevented spasms in infancy and seizures in adult mutants. E2 was ineffective when delivered after puberty or 30 days after birth. Early E2 treatment altered mRNA levels of three downstream targets of Arx (Shox2, Ebf3, and Lgi1) and restored depleted interneuron populations without increasing GABAergic synaptic density. Postnatal E2 treatment may induce lasting transcriptional changes that lead to enduring disease modification and could potentially serve as a therapy for inherited interneuronopathies.

  1. Linking state-and-transition simulation and timber supply models for forest biomass production scenarios

    Science.gov (United States)

    Costanza, Jennifer; Abt, Robert C.; McKerrow, Alexa; Collazo, Jaime

    2015-01-01

    We linked state-and-transition simulation models (STSMs) with an economics-based timber supply model to examine landscape dynamics in North Carolina through 2050 for three scenarios of forest biomass production. Forest biomass could be an important source of renewable energy in the future, but there is currently much uncertainty about how biomass production would impact landscapes. In the southeastern US, if forests become important sources of biomass for bioenergy, we expect increased land-use change and forest management. STSMs are ideal for simulating these landscape changes, but the amounts of change will depend on drivers such as timber prices and demand for forest land, which are best captured with forest economic models. We first developed state-and-transition model pathways in the ST-Sim software platform for 49 vegetation and land-use types that incorporated each expected type of landscape change. Next, for the three biomass production scenarios, the SubRegional Timber Supply Model (SRTS) was used to determine the annual areas of thinning and harvest in five broad forest types, as well as annual areas converted among those forest types, agricultural, and urban lands. The SRTS output was used to define area targets for STSMs in ST-Sim under two scenarios of biomass production and one baseline, business-as-usual scenario. We show that ST-Sim output matched SRTS targets in most cases. Landscape dynamics results indicate that, compared with the baseline scenario, forest biomass production leads to more forest and, specifically, more intensively managed forest on the landscape by 2050. Thus, the STSMs, informed by forest economics models, provide important information about potential landscape effects of bioenergy production.

  2. Coupled dynamics of node and link states in complex networks: a model for language competition

    International Nuclear Information System (INIS)

    Carro, Adrián; Toral, Raúl; Miguel, Maxi San

    2016-01-01

    Inspired by language competition processes, we present a model of coupled evolution of node and link states. In particular, we focus on the interplay between the use of a language and the preference or attitude of the speakers towards it, which we model, respectively, as a property of the interactions between speakers (a link state) and as a property of the speakers themselves (a node state). Furthermore, we restrict our attention to the case of two socially equivalent languages and to socially inspired network topologies based on a mechanism of triadic closure. As opposed to most of the previous literature, where language extinction is an inevitable outcome of the dynamics, we find a broad range of possible asymptotic configurations, which we classify as: frozen extinction states, frozen coexistence states, and dynamically trapped coexistence states. Moreover, metastable coexistence states with very long survival times and displaying a non-trivial dynamics are found to be abundant. Interestingly, a system size scaling analysis shows, on the one hand, that the probability of language extinction vanishes exponentially for increasing system sizes and, on the other hand, that the time scale of survival of the non-trivial dynamical metastable states increases linearly with the size of the system. Thus, non-trivial dynamical coexistence is the only possible outcome for large enough systems. Finally, we show how this coexistence is characterized by one of the languages becoming clearly predominant while the other one becomes increasingly confined to ‘ghetto-like’ structures: small groups of bilingual speakers arranged in triangles, with a strong preference for the minority language, and using it for their intra-group interactions while they switch to the predominant language for communications with the rest of the population. (paper)

  3. Quantitative structure activity relationship model for predicting the depletion percentage of skin allergic chemical substances of glutathione

    International Nuclear Information System (INIS)

    Si Hongzong; Wang Tao; Zhang Kejun; Duan Yunbo; Yuan Shuping; Fu Aiping; Hu Zhide

    2007-01-01

    A quantitative model was developed to predict the depletion percentage of glutathione (DPG) by skin-allergenic compounds using gene expression programming (GEP). Each compound was represented by several calculated structural descriptors covering constitutional, topological, geometrical, electrostatic and quantum-chemical features. The GEP method produced a nonlinear, five-descriptor quantitative model with a mean error and a correlation coefficient of 10.52 and 0.94 for the training set, and 22.80 and 0.85 for the test set, respectively. The GEP-predicted results are in good agreement with the experimental ones, and better than those of the heuristic method.

  4. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    Science.gov (United States)

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and the SNMMI/CTN oncology phantom. The algorithm was designed to utilize only the PET scan, to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model-fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of the NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at a 9.7:1 activity ratio over background, and CTN phantoms were filled at 4:1 and 2:1 activity ratios over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis of the PET phantom scans by four experts, which represents the current clinical standard approach. The automated analysis method successfully detected and measured all inserts in all test phantom scans. It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom…

  5. Interdiffusion of the aluminum magnesium system. Quantitative analysis and numerical model; Interdiffusion des Aluminium-Magnesium-Systems. Quantitative Analyse und numerische Modellierung

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion, thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminum coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg system are calculated with the Sauer-Freise method for the first time. To resolve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varying discretization of the space coordinate and exhibits excellent quantitative agreement with the experimentally measured concentration profiles, confirming the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also exhibit good correlation between simulated and experimental concentration profiles; thus, the diffusion coefficients are also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.
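
    For reference, the Sauer-Freise evaluation computes the concentration-dependent interdiffusion coefficient directly from a measured profile without locating a Matano plane (standard form of the relation; symbols mine). With the normalized concentration Y(x) = (c(x) - c^-)/(c^+ - c^-) of a couple annealed for time t:

      \tilde{D}(Y^*) = \frac{1}{2t}\left(\frac{dx}{dY}\right)_{x^*}\left[(1 - Y^*)\int_{-\infty}^{x^*} Y\,dx + Y^*\int_{x^*}^{+\infty} (1 - Y)\,dx\right]

    Avoiding the Matano plane is what makes the method practical for multiphase couples such as Al-Mg, where the profile is interrupted by concentration steps at the intermetallic phase boundaries.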

  6. A quantitative model of intracellular growth of Legionella pneumophila in Acanthamoeba castellanii.

    Science.gov (United States)

    Moffat, J F; Tompkins, L S

    1992-01-01

    A model of intracellular growth for Legionella pneumophila in Acanthamoeba castellanii has been developed and provides a quantitative measure of survival and replication after entry. In this model, Acanthamoeba monolayers were incubated with bacteria in tissue culture plates under nutrient-limiting conditions. Gentamicin was used to kill extracellular bacteria following the period of incubation, and the number of intracellular bacteria was determined following lysis of amebae. Intracellular growth of virulent L. pneumophila and other wild-type Legionella species was observed when the assay was performed at 37 degrees C. At room temperature, none of the Legionella strains tested grew intracellularly, while an avirulent L. pneumophila strain was unable to replicate in this assay at either temperature. The effect of nutrient limitation on A. castellanii during the assay prevented multiplication of the amebae and increased the level of infection by Legionella spp. The level of infection of the amebae was directly proportional to the multiplicity of infection with bacteria; at an inoculum of 1.03 x 10(7) bacteria added to wells containing 1.10 x 10(5) amebae (multiplicity of infection of 100), approximately 4.4% of A. castellanii cells became infected. Cytochalasin D reduced the uptake of bacteria by the amebae primarily by causing amebae to lift off the culture dish, reducing the number of target hosts; methylamine also reduced the level of initial infection, yet neither inhibitor was able to prevent intracellular replication of Legionella spp. Consequently, once the bacteria entered the cell, only lowered temperature could restrict replication. This model of intracellular growth provides a one-step growth curve and should be useful to study the molecular basis of the host-parasite interaction. PMID:1729191

  7. Quantitative vertebral morphometry based on parametric modeling of vertebral bodies in 3D.

    Science.gov (United States)

    Stern, D; Njagulj, V; Likar, B; Pernuš, F; Vrtovec, T

    2013-04-01

    Quantitative vertebral morphometry (QVM) was performed by parametric modeling of vertebral bodies in three dimensions (3D). Identification of vertebral fractures in two dimensions is a challenging task due to the projective nature of radiographic images and the variability of vertebral shape. By generating detailed 3D anatomical images, computed tomography (CT) enables accurate measurement of vertebral deformations and fractures. A detailed 3D representation of the vertebral body shape is obtained by automatically aligning a parametric 3D model to vertebral bodies in CT images. The parameters of the 3D model describe clinically meaningful morphometric vertebral body features, and QVM in 3D is performed by comparing the parameters to their statistical values. Thresholds and parameters that best discriminate between normal and fractured vertebral bodies are determined by applying statistical classification analysis. The proposed QVM in 3D was applied to 454 normal and 228 fractured vertebral bodies, yielding a classification sensitivity of 92.5% at a false-positive rate of 7.5%, with corresponding accuracy of 92.5% and precision of 86.1%. The 3D shape parameters that provided the best separation between normal and fractured vertebral bodies were the vertebral body height and the inclination and concavity of both vertebral endplates. The described QVM in 3D is able to efficiently and objectively discriminate between normal and fractured vertebral bodies and identify morphological cases (wedge, (bi)concavity, or crush) and grades (1, 2, or 3) of vertebral body fractures. It may therefore be valuable for diagnosing and predicting vertebral fractures in patients who are at risk of osteoporosis.

  8. A rodent model of traumatic stress induces lasting sleep and quantitative electroencephalographic disturbances.

    Science.gov (United States)

    Nedelcovych, Michael T; Gould, Robert W; Zhan, Xiaoyan; Bubser, Michael; Gong, Xuewen; Grannan, Michael; Thompson, Analisa T; Ivarsson, Magnus; Lindsley, Craig W; Conn, P Jeffrey; Jones, Carrie K

    2015-03-18

    Hyperarousal and sleep disturbances are common, debilitating symptoms of post-traumatic stress disorder (PTSD). PTSD patients also exhibit abnormalities in quantitative electroencephalography (qEEG) power spectra during wake as well as rapid eye movement (REM) and non-REM (NREM) sleep. Selective serotonin reuptake inhibitors (SSRIs), the first-line pharmacological treatment for PTSD, provide modest remediation of the hyperarousal symptoms in PTSD patients, but have little to no effect on the sleep-wake architecture deficits. Development of novel therapeutics for these sleep-wake architecture deficits is limited by a lack of relevant animal models. Thus, the present study investigated whether single prolonged stress (SPS), a rodent model of traumatic stress, induces PTSD-like sleep-wake and qEEG spectral power abnormalities that correlate with changes in central serotonin (5-HT) and neuropeptide Y (NPY) signaling in rats. Rats were implanted with telemetric recording devices to continuously measure EEG before and after SPS treatment. A second cohort of rats was used to measure SPS-induced changes in plasma corticosterone, 5-HT utilization, and NPY expression in brain regions that comprise the neural fear circuitry. SPS caused sustained dysregulation of NREM and REM sleep, accompanied by state-dependent alterations in qEEG power spectra indicative of cortical hyperarousal. These changes corresponded with acute induction of the corticosterone receptor co-chaperone FK506-binding protein 51 and delayed reductions in 5-HT utilization and NPY expression in the amygdala. SPS represents a preclinical model of PTSD-related sleep-wake and qEEG disturbances with underlying alterations in neurotransmitter systems known to modulate both sleep-wake architecture and the neural fear circuitry.

  9. Quantitative microleakage analysis of endodontic temporary filling materials using a glucose penetration model.

    Science.gov (United States)

    Kim, Sin-Young; Ahn, Jin-Soo; Yi, Young-Ah; Lee, Yoon; Hwang, Ji-Yun; Seo, Deog-Gyu

    2015-02-01

    The purpose of this study was to analyze the sealing ability of different temporary endodontic materials over a 6-week period using a glucose penetration model. Standardized holes were formed in 48 dentin discs from human premolars. The thicknesses of the specimens were distributed evenly among 2 mm, 3 mm and 4 mm. Prepared dentin specimens were randomly assigned to six groups (n = 7), and the holes in the dentin specimens were filled with two temporary filling materials as per the manufacturers' instructions, as follows: Caviton (GC Corporation, Tokyo, Japan) 2 mm, 3 mm, 4 mm and IRM (Dentsply International Inc., Milford, DE) 2 mm, 3 mm, 4 mm. The remaining specimens were used as positive and negative controls, and all specimens underwent thermocycling (1000 cycles; 5-55°C). The sealing ability of all samples was evaluated using the leakage model for glucose. The samples were analyzed with a spectrophotometer in a quantitative glucose microleakage test over a period of 6 weeks. For statistical inference, a mixed-effects analysis was applied to analyze serial measurements over time. The Caviton groups showed less glucose penetration than the IRM groups. The Caviton 4 mm group demonstrated relatively low glucose leakage over the test period. High glucose leakage was detected throughout the test period in all IRM groups. The glucose leakage level increased after 1 week in the Caviton 2 mm group and after 4 weeks in the Caviton 3 mm and 4 mm groups (p < 0.05) in the glucose penetration model over the 6 weeks. Temporary filling with Caviton to at least 3 mm in thickness is necessary, and temporary filling periods should not exceed 4 weeks.

  10. Model developments for quantitative estimates of the benefits of the signals on nuclear power plant availability and economics

    International Nuclear Information System (INIS)

    Seong, Poong Hyun

    1993-01-01

    A novel framework for quantitative estimation of the benefits of signals for nuclear power plant availability and economics has been developed in this work. The models developed quantify how perfect signals affect the human operator's success in restoring the power plant to the desired state when it enters undesirable transients, and they quantify the economic benefits of these perfect signals. The models have been applied to the condensate feedwater system of a nuclear power plant for demonstration. (Author)

  11. A quantitative exposure model simulating human norovirus transmission during preparation of deli sandwiches.

    Science.gov (United States)

    Stals, Ambroos; Jacxsens, Liesbeth; Baert, Leen; Van Coillie, Els; Uyttendaele, Mieke

    2015-03-02

    Human noroviruses (HuNoVs) are a major cause of foodborne gastroenteritis worldwide. They are often transmitted via infected and shedding food handlers manipulating foods such as deli sandwiches. The presented study aimed to simulate HuNoV transmission during the preparation of deli sandwiches in a sandwich bar. A quantitative exposure model was developed by combining the GoldSim® and @Risk® software packages. Input data were collected from the scientific literature and from a two-week observational study performed at two sandwich bars. The model included three food handlers working during a three-hour shift on a shared working surface where deli sandwiches are prepared. The model consisted of three components. The first component simulated the preparation of the deli sandwiches and contained the HuNoV reservoirs, locations within the model allowing the accumulation of HuNoV, and the working of intervention measures. The second component covered the contamination sources, namely (1) the initially HuNoV-contaminated lettuce used on the sandwiches and (2) HuNoV originating from a shedding food handler. The third component included four possible intervention measures to reduce HuNoV transmission: hand and surface disinfection during preparation of the sandwiches, hand gloving, and hand washing after a restroom visit. A single HuNoV-shedding food handler could cause mean levels of 43±18, 81±37 and 18±7 HuNoV particles on the deli sandwiches, hands and working surfaces, respectively. Introduction of contaminated lettuce as the only source of HuNoV resulted in the presence of 6.4±0.8 and 4.3±0.4 HuNoV particles on the food and hand reservoirs. The inclusion of hand and surface disinfection or hand gloving as a single intervention measure was not effective in the model, as only marginal reductions of HuNoV levels were noticeable in the different reservoirs. High compliance with hand washing after a restroom visit did reduce HuNoV presence substantially on all reservoirs. The…
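
    The structure of such an exposure model can be imitated with a short Monte Carlo sketch (all distributions and parameter values below are invented placeholders, not the fitted inputs of the paper):

      import numpy as np

      rng = np.random.default_rng(3)
      n = 50_000

      # One shedding handler over a shift: virus load on hands, number of
      # hand-food contacts, per-contact transfer fraction, and the effect
      # of washing hands after a restroom visit (70% compliance assumed,
      # 2-log reduction when compliant).
      virus_on_hands = rng.lognormal(np.log(1e4), 1.0, n)
      contacts = rng.poisson(120, n)
      transfer = rng.beta(1, 99, n)
      washing = np.where(rng.random(n) < 0.7, 1e-2, 1.0)

      on_food = virus_on_hands * washing * (1 - (1 - transfer) ** contacts)
      print(on_food.mean(), np.percentile(on_food, 95))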

  12. Effect of platform, reference material, and quantification model on enumeration of Enterococcus by quantitative PCR methods

    Science.gov (United States)

    Quantitative polymerase chain reaction (qPCR) is increasingly being used for the quantitative detection of fecal indicator bacteria in beach water. qPCR allows for same-day health warnings, and its application is being considered as an option for recreational water quality testing…

  13. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    Science.gov (United States)

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform the management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical…

  14. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    International Nuclear Information System (INIS)

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-01-01

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in the toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in the toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here the results of applying QSTRs to assess the rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol, and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  15. Quantitative assessment of bone defect healing by multidetector CT in a pig model

    International Nuclear Information System (INIS)

    Riegger, Carolin; Kroepil, Patric; Lanzman, Rotem S.; Miese, Falk R.; Antoch, Gerald; Scherer, Axel; Jungbluth, Pascal; Hakimi, Mohssen; Wild, Michael; Hakimi, Ahmad R.

    2012-01-01

    To evaluate multidetector CT volumetry in the assessment of bone defect healing in comparison to histopathological findings in an animal model. In 16 mini-pigs, a circumscribed tibial bone defect was created. Multidetector CT (MDCT) of the tibia was performed on a 64-row scanner 42 days after the operation. The extent of bone healing was estimated quantitatively by MDCT volumetry using a commercially available software programme (syngo Volume, Siemens, Germany).The volume of the entire defect (including all pixels from -100 to 3,000 HU), the nonconsolidated areas (-100 to 500 HU), and areas of osseous consolidation (500 to 3,000 HU) were assessed and the extent of consolidation was calculated. Histomorphometry served as the reference standard. The extent of osseous consolidation in MDCT volumetry ranged from 19 to 92% (mean 65.4 ± 18.5%). There was a significant correlation between histologically visible newly formed bone and the extent of osseous consolidation on MDCT volumetry (r = 0.82, P < 0.0001). A significant negative correlation was detected between osseous consolidation on MDCT and histological areas of persisting defect (r = -0.9, P < 0.0001). MDCT volumetry is a promising tool for noninvasive monitoring of bone healing, showing excellent correlation with histomorphometry. (orig.)
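
    The volumetric classification described here reduces to thresholding the voxels of the defect region by Hounsfield units; a minimal sketch, with a random array standing in for segmented MDCT data, is:

```python
import numpy as np

# Sketch of the HU-threshold volumetry described above: voxels inside the
# defect ROI are binned by Hounsfield units, and the extent of consolidation
# is the consolidated fraction of the total defect volume. The random array
# is a placeholder for a segmented MDCT dataset.

rng = np.random.default_rng(0)
defect_hu = rng.normal(400, 300, size=50_000)   # placeholder voxel HU values

total = (defect_hu >= -100) & (defect_hu <= 3000)        # entire defect
consolidated = (defect_hu > 500) & (defect_hu <= 3000)   # osseous consolidation

extent = consolidated.sum() / total.sum() * 100
print(f"extent of consolidation: {extent:.1f}%")
```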

  16. Advances in the molecular modeling and quantitative structure-activity relationship-based design for antihistamines.

    Science.gov (United States)

    Galvez, Jorge; Galvez-Llompart, Maria; Zanni, Riccardo; Garcia-Domenech, Ramon

    2013-03-01

    Nowadays the use of antihistamines (AH) is increasing steadily. These drugs are able to act on a variety of pathological conditions of the organism. A number of computer-aided (in silico) approaches have been developed to discover and develop novel AH drugs. Among these methods are those based on drug-receptor docking, thermodynamics, and quantitative structure-activity relationships (QSAR). This review collates the most recent advances in the use of computer approaches for the search and characterization of novel AH drugs. Within the QSAR methods, particular attention is paid to those based on molecular topology (MT) because of their demonstrated efficacy in discovering new drugs. Collateral topics, including docking studies, thermodynamic aspects and molecular modeling, are also dealt with to the extent that they complement QSAR-MT. Given the importance of AHs, the search for new drugs in this field has become imperative. In this regard, the use of QSAR methods based on MT, namely QSAR-MT, has proven to be a powerful tool when the goal is discovering new hit or lead structures. It has been shown that antihistaminic activity is complex and differs across the four known receptor types (H1 to H4), and that electronic, steric and physicochemical factors determine drug activity. These factors, along with the purely structural ones, can be deduced from topological and topochemical information.

  17. Quantitative assessment of bone defect healing by multidetector CT in a pig model

    Energy Technology Data Exchange (ETDEWEB)

    Riegger, Carolin; Kroepil, Patric; Lanzman, Rotem S.; Miese, Falk R.; Antoch, Gerald; Scherer, Axel [University Duesseldorf, Medical Faculty, Department of Diagnostic and Interventional Radiology, Duesseldorf (Germany)]; Jungbluth, Pascal; Hakimi, Mohssen; Wild, Michael [University Duesseldorf, Medical Faculty, Department of Traumatology and Hand Surgery, Duesseldorf (Germany)]; Hakimi, Ahmad R. [University Duesseldorf, Medical Faculty, Department of Oral Surgery, Duesseldorf (Germany)]

    2012-05-15

    To evaluate multidetector CT volumetry in the assessment of bone defect healing in comparison to histopathological findings in an animal model. In 16 mini-pigs, a circumscribed tibial bone defect was created. Multidetector CT (MDCT) of the tibia was performed on a 64-row scanner 42 days after the operation. The extent of bone healing was estimated quantitatively by MDCT volumetry using a commercially available software programme (syngo Volume, Siemens, Germany).The volume of the entire defect (including all pixels from -100 to 3,000 HU), the nonconsolidated areas (-100 to 500 HU), and areas of osseous consolidation (500 to 3,000 HU) were assessed and the extent of consolidation was calculated. Histomorphometry served as the reference standard. The extent of osseous consolidation in MDCT volumetry ranged from 19 to 92% (mean 65.4 ± 18.5%). There was a significant correlation between histologically visible newly formed bone and the extent of osseous consolidation on MDCT volumetry (r = 0.82, P < 0.0001). A significant negative correlation was detected between osseous consolidation on MDCT and histological areas of persisting defect (r = -0.9, P < 0.0001). MDCT volumetry is a promising tool for noninvasive monitoring of bone healing, showing excellent correlation with histomorphometry. (orig.)

  18. Quantitative profiling of brain lipid raft proteome in a mouse model of fragile X syndrome.

    Science.gov (United States)

    Kalinowska, Magdalena; Castillo, Catherine; Francesconi, Anna

    2015-01-01

    Fragile X Syndrome, a leading cause of inherited intellectual disability and autism, arises from transcriptional silencing of the FMR1 gene encoding an RNA-binding protein, Fragile X Mental Retardation Protein (FMRP). FMRP can regulate the expression of approximately 4% of brain transcripts through its role in regulation of mRNA transport, stability and translation, thus providing a molecular rationale for its potential pleiotropic effects on neuronal and brain circuitry function. Several intracellular signaling pathways are dysregulated in the absence of FMRP suggesting that cellular deficits may be broad and could result in homeostatic changes. Lipid rafts are specialized regions of the plasma membrane, enriched in cholesterol and glycosphingolipids, involved in regulation of intracellular signaling. Among transcripts targeted by FMRP, a subset encodes proteins involved in lipid biosynthesis and homeostasis, dysregulation of which could affect the integrity and function of lipid rafts. Using a quantitative mass spectrometry-based approach we analyzed the lipid raft proteome of Fmr1 knockout mice, an animal model of Fragile X syndrome, and identified candidate proteins that are differentially represented in Fmr1 knockout mice lipid rafts. Furthermore, network analysis of these candidate proteins reveals connectivity between them and predicts functional connectivity with genes encoding components of myelin sheath, axonal processes and growth cones. Our findings provide insight to aid identification of molecular and cellular dysfunctions arising from Fmr1 silencing and for uncovering shared pathologies between Fragile X syndrome and other autism spectrum disorders.

  19. Evaluation of tongue motor biomechanics during swallowing—From oral feeding models to quantitative sensing methods

    Directory of Open Access Journals (Sweden)

    Takahiro Ono

    2009-09-01

    Full Text Available In today's aging society, dentists are more likely to treat patients with dysphagia and are required to select an optimal treatment option based on a complete understanding of the swallowing function. Although the tongue plays an important role in mastication and swallowing, as described in the human oral feeding models developed in the 1990s, the physiological significance of tongue function has been poorly understood due to the difficulty of monitoring and analyzing it. This review summarizes recent approaches used to evaluate tongue function during swallowing quantitatively, focusing mainly on modern sensing methods such as manofluorography, sensing probes, pressure sensors installed in palatal plates and ultrasound imaging of tongue movement. A basic understanding of the kinematics and biomechanics of tongue movement during swallowing in normal subjects was provided by this series of studies. There have been few studies, however, on pathological changes of tongue function in dysphagic patients. Therefore, further improvements in measurement devices and technologies and additional multidisciplinary studies are needed to establish therapeutic evidence regarding tongue movement, as well as the best prosthodontic approach for dysphagia rehabilitation.

  20. Quantitative studies of animal colour constancy: using the chicken as model

    Science.gov (United States)

    2016-01-01

    Colour constancy is the capacity of visual systems to keep colour perception constant despite changes in the illumination spectrum. Colour constancy has been tested extensively in humans and has also been described in many animals. In humans, colour constancy is often studied quantitatively, but besides humans, this has only been done for the goldfish and the honeybee. In this study, we quantified colour constancy in the chicken by training the birds in a colour discrimination task and testing them in changed illumination spectra to find the largest illumination change in which they were able to remain colour-constant. We used the receptor noise limited model for animal colour vision to quantify the illumination changes, and found that colour constancy performance depended on the difference between the colours used in the discrimination task, the training procedure and the time the chickens were allowed to adapt to a new illumination before making a choice. We analysed literature data on goldfish and honeybee colour constancy with the same method and found that chickens can compensate for larger illumination changes than both. We suggest that future studies on colour constancy in non-human animals could use a similar approach to allow for comparison between species and populations. PMID:27170714

  1. Synthesis, photodynamic activity, and quantitative structure-activity relationship modelling of a series of BODIPYs.

    Science.gov (United States)

    Caruso, Enrico; Gariboldi, Marzia; Sangion, Alessandro; Gramatica, Paola; Banfi, Stefano

    2017-02-01

    Here we report the synthesis of eleven new BODIPYs (14-24) characterized by the presence of an aromatic ring at the 8 (meso) position and of iodine atoms at the pyrrolic 2,6 positions. These molecules, together with twelve BODIPYs already reported by us (1-12), represent a large panel of BODIPYs bearing different atoms or groups as substituents on the aromatic moiety. Two physico-chemical features (singlet oxygen (¹O₂) generation rate and lipophilicity), which can play a fundamental role in their performance as photosensitizers, have been studied. The in vitro photo-induced cell-killing efficacy of the 23 photosensitizers was studied on the SKOV3 cell line, treating the cells for 24 h in the dark and then irradiating for 2 h with a green LED device (fluence 25.2 J/cm²). The cell-killing efficacy was assessed with the MTT test and compared with that of the meso-unsubstituted compound (13). In order to understand the possible effect of the substituents, a predictive quantitative structure-activity relationship (QSAR) regression model, based on theoretical holistic molecular descriptors, was developed. The results clearly indicate that the presence of an aromatic ring is fundamental for an excellent photodynamic response, whereas the electronic effects and the position of the substituents on the aromatic ring do not influence the photodynamic efficacy. Copyright © 2017 Elsevier B.V. All rights reserved.
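
    A generic sketch of the kind of QSAR regression described, with placeholder descriptor and activity values rather than the paper's dataset, might look like this (assuming scikit-learn is available):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Generic QSAR regression sketch: activity modeled as a linear function of
# theoretical molecular descriptors. Descriptors and activities below are
# invented placeholders, not the paper's holistic descriptors or data.

X = np.array([[2.1, 0.8], [3.0, 1.1], [1.5, 0.4], [2.7, 0.9]])  # e.g. lipophilicity, 1O2 rate
y = np.array([0.62, 0.81, 0.40, 0.75])                          # e.g. phototoxic response

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_, model.score(X, y))
```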

  2. Quantitative models of persistence and relapse from the perspective of behavioral momentum theory: Fits and misfits.

    Science.gov (United States)

    Nevin, John A; Craig, Andrew R; Cunningham, Paul J; Podlesnik, Christopher A; Shahan, Timothy A; Sweeney, Mary M

    2017-08-01

    We review quantitative accounts of behavioral momentum theory (BMT), its application to clinical treatment, and its extension to post-intervention relapse of target behavior. We suggest that its extension can account for relapse using reinstatement and renewal models, but that its application to resurgence is flawed both conceptually and in its failure to account for recent data. We propose that the enhanced persistence of target behavior engendered by alternative reinforcers is limited to their concurrent availability within a distinctive stimulus context. However, a failure to find effects of stimulus-correlated reinforcer rates in a Pavlovian-to-Instrumental Transfer (PIT) paradigm challenges even a straightforward Pavlovian account of alternative reinforcer effects. BMT has been valuable in understanding basic research findings and in guiding clinical applications and accounting for their data, but alternatives are needed that can account more effectively for resurgence while encompassing basic data on resistance to change as well as other forms of relapse. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Quantitative three-dimensional modeling of zeotile through discrete electron tomography.

    Science.gov (United States)

    Bals, Sara; Batenburg, K Joost; Liang, Duoduo; Lebedev, Oleg; Van Tendeloo, Gustaaf; Aerts, Alexander; Martens, Johan A; Kirschhock, Christine E A

    2009-04-08

    Discrete electron tomography is a new approach for three-dimensional reconstruction of nanoscale objects. The technique exploits prior knowledge of the object to be reconstructed, which results in an improvement of the quality of the reconstructions. Through the combination of conventional transmission electron microscopy and discrete electron tomography with a model-based approach, quantitative structure determination becomes possible. In the present work, this approach is used to unravel the building scheme of Zeotile-4, a silica material with two levels of structural order. The layer sequence of slab-shaped building units could be identified. Successive layers were found to be related by a rotation of 120 degrees, resulting in a hexagonal space group. The Zeotile-4 material is a demonstration of the concept of successive structuring of silica at two levels. At the first level, the colloid chemical properties of Silicalite-1 precursors are exploited to create building units with a slablike geometry. At the second level, the slablike units are tiled using a triblock copolymer to serve as a mesoscale structuring agent.

  4. Quantitative Validation of the Integrated Medical Model (IMM) for ISS Missions

    Science.gov (United States)

    Young, Millennia; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    Lifetime Surveillance of Astronaut Health (LSAH) provided observed medical event data on 33 ISS and 111 STS person-missions for use in further improving and validating the Integrated Medical Model (IMM). Using only the crew characteristics from these observed missions, the newest development version, IMM v4.0, will simulate these missions to predict medical events and outcomes. Comparing IMM predictions to the actual observed medical event counts will provide external validation and identify areas of possible improvement. In an effort to improve the power of detecting differences in this validation study, the total over each program, ISS and STS, will serve as the main quantitative comparison objective, specifically the following parameters: total medical events (TME), probability of loss of crew life (LOCL), and probability of evacuation (EVAC). Scatter plots of observed versus median predicted TMEs (with error bars reflecting the simulation intervals) will graphically display comparisons, while linear regression will serve as the statistical test of agreement. Two scatter plots will be analyzed: (1) where each point reflects a mission and (2) where each point reflects a condition-specific total number of occurrences. The coefficient of determination (R²) resulting from a linear regression with no intercept bias (intercept fixed at zero) will serve as an overall metric of agreement between IMM and the real-world system (RWS). In an effort to identify as many discrepancies as possible for further inspection, the α-level for all statistical tests comparing IMM predictions to observed data will be set to 0.1. This less stringent criterion, along with the multiple testing being conducted, should detect all perceived differences, including many false-positive signals resulting from random variation. The results of these analyses will reveal areas of the model requiring adjustment to improve overall IMM output, which will thereby provide better decision support for
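
    The agreement metric described, a linear regression of observed on predicted totals with the intercept fixed at zero and its R², can be computed directly; the counts below are placeholders for IMM outputs:

```python
import numpy as np

# Sketch of the zero-intercept agreement metric: regress observed totals on
# predicted totals with the intercept fixed at zero and report R^2.
# The counts are placeholders, not LSAH/IMM data.

predicted = np.array([12.0, 30.0, 7.0, 18.0])   # median predicted TMEs (placeholder)
observed = np.array([10.0, 33.0, 6.0, 20.0])    # observed TMEs (placeholder)

slope = (predicted @ observed) / (predicted @ predicted)  # least squares, no intercept
residuals = observed - slope * predicted
r2 = 1 - (residuals @ residuals) / (observed @ observed)  # uncentered R^2 for zero-intercept fits
print(slope, r2)
```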

  5. A conceptual model linking functional gene expression and reductive dechlorination rates of chlorinated ethenes in clay rich groundwater sediment

    DEFF Research Database (Denmark)

    Bælum, Jacob; Chambon, Julie Claire Claudia; Scheutz, Charlotte

    2013-01-01

    We used current knowledge of cellular processes involved in reductive dechlorination to develop a conceptual model to describe the regulatory system of dechlorination at the cell level; the model links bacterial growth and substrate consumption to the abundance of messenger RNA of functional gene...

  6. An Interpersonal Circumplex Model of Children's Social Goals: Links with Peer-Reported Behavior and Sociometric Status

    Science.gov (United States)

    Ojanen, Tiina; Gronroos, Matti; Salmivalli, Christina

    2005-01-01

    The objective of the present research was to develop an assessment model for children's social goals. The aims were (a) to fit children's social goals to a circumplex model and to examine links between goals and peer-reported social behaviors (aggression, withdrawal, and prosocial behavior) in a sample of 276 participants (134 girls, 11- to…

  7. The role of ecological models in linking ecological risk assessment to ecosystem services in agroecosystems.

    Science.gov (United States)

    Galic, Nika; Schmolke, Amelie; Forbes, Valery; Baveco, Hans; van den Brink, Paul J

    2012-01-15

    Agricultural practices are essential for sustaining the human population, but at the same time they can directly disrupt ecosystem functioning. Ecological risk assessment (ERA) aims to estimate possible adverse effects of human activities on ecosystems and their parts. Current ERA practices, however, incorporate very little ecology and base the risk estimates on the results of standard tests with several standard species. The main obstacles for a more ecologically relevant ERA are the lack of clear protection goals and the inherent complexity of ecosystems that is hard to approach empirically. In this paper, we argue that the ecosystem services framework offers an opportunity to define clear and ecologically relevant protection goals. At the same time, ecological models provide the tools to address ecological complexity to the degree needed to link measurement endpoints and ecosystem services, and to quantify service provision and possible adverse effects from human activities. We focus on the ecosystem services relevant for agroecosystem functioning, including pollination, biocontrol and eutrophication effects and present modeling studies relevant for quantification of each of the services. The challenges of the ecosystem services approach are discussed as well as the limitations of ecological models in the context of ERA. A broad, multi-stakeholder dialog is necessary to aid the definition of protection goals in terms of services delivered by ecosystems and their parts. The need to capture spatio-temporal dynamics and possible interactions among service providers pose challenges for ecological models as a basis for decision making. However, we argue that both fields are advancing quickly and can prove very valuable in achieving more ecologically relevant ERA. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. A biological-based model that links genomic instability, bystander effects, and adaptive response

    International Nuclear Information System (INIS)

    Scott, B.R.

    2004-01-01

    This paper links genomic instability, bystander effects, and adaptive response in mammalian cell communities via a novel biological-based, dose-response model called NEOTRANS3. The model is an extension of the NEOTRANS2 model that addressed stochastic effects (genomic instability, mutations, and neoplastic transformation) associated with brief exposure to low radiation doses. With both models, ionizing radiation produces DNA damage in cells that can be associated with varying degrees of genomic instability. Cells with persistent problematic instability (PPI) are mutants that arise via misrepair of DNA damage. Progeny of PPI cells also have PPI and can undergo spontaneous neoplastic transformation. Unlike NEOTRANS2, with NEOTRANS3 newly induced mutant PPI cells and their neoplastically transformed progeny can be suppressed via our previously introduced protective apoptosis-mediated (PAM) process, which can be activated by low linear energy transfer (LET) radiation. However, with NEOTRANS3 (which like NEOTRANS2 involves cross-talk between nongenomically compromised [e.g., nontransformed, nonmutants] and genomically compromised [e.g., mutants, transformants, etc.] cells), it is assumed that PAM is only activated over a relatively narrow, dose-rate-dependent interval (D_PAM, D_off), where D_PAM is a small stochastic activation threshold and D_off is the stochastic dose above which PAM does not occur. PAM cooperates with activated normal DNA repair and with activated normal apoptosis in guarding against genomic instability. Normal repair involves both error-free repair and misrepair components. Normal apoptosis and the error-free component of normal repair protect mammals by preventing the occurrence of mutant cells. PAM selectively removes mutant cells arising via the misrepair component of normal repair, selectively removes existing neoplastically transformed cells, and probably selectively removes other genomically compromised cells when it is activated

  9. MODIS volcanic ash retrievals vs FALL3D transport model: a quantitative comparison

    Science.gov (United States)

    Corradini, S.; Merucci, L.; Folch, A.

    2010-12-01

    Satellite retrievals and transport models represent the key tools to monitor the evolution of volcanic clouds. Because of the harmful effects of fine ash particles on aircraft, real-time tracking and forecasting of volcanic clouds is key for aviation safety. Alongside safety, the economic consequences of airport disruptions must also be taken into account. The airport closures due to the recent Icelandic Eyjafjöll eruption caused millions of passengers to be stranded not only in Europe, but across the world. IATA (the International Air Transport Association) estimates that the worldwide airline industry lost a total of about 2.5 billion Euro during the disruption. Both safety and economic issues require reliable and robust ash cloud retrievals and trajectory forecasting. Intercomparison between remote sensing and modeling is required to ensure precise and reliable volcanic ash products. In this work we perform a quantitative comparison between Moderate Resolution Imaging Spectroradiometer (MODIS) retrievals of volcanic ash cloud mass and Aerosol Optical Depth (AOD) with the FALL3D ash dispersal model. MODIS, aboard the NASA-Terra and NASA-Aqua polar satellites, is a multispectral instrument with 36 spectral bands operating in the VIS-TIR spectral range and spatial resolution varying between 250 and 1000 m at nadir. The MODIS channels centered around 11 and 12 micron have been used for the ash retrievals through the Brightness Temperature Difference algorithm and MODTRAN simulations. FALL3D is a 3-D time-dependent Eulerian model for the transport and deposition of volcanic particles that outputs, among other variables, cloud column mass and AOD. Three MODIS images collected on 28, 29 and 30 October 2002 during the Mt. Etna eruption have been considered as test cases. The results show a general good agreement between the retrieved and the modeled volcanic clouds in the first 300 km from the vents. Even if the
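
    The Brightness Temperature Difference test itself is simple to sketch: silicate ash tends to make BT(11 µm) - BT(12 µm) negative, the opposite of meteorological cloud. The toy arrays and the zero threshold below are illustrative; the study refines the retrieval with MODTRAN simulations.

```python
import numpy as np

# Sketch of the split-window Brightness Temperature Difference (BTD) test:
# ash-laden pixels tend to show BT(11 um) - BT(12 um) < 0, the reverse of
# water/ice cloud. Arrays and threshold are illustrative placeholders.

bt11 = np.array([[265.0, 270.0], [255.0, 280.0]])   # placeholder brightness temps (K)
bt12 = np.array([[267.5, 269.0], [258.0, 279.0]])

btd = bt11 - bt12
ash_mask = btd < 0.0   # negative BTD flags likely ash-contaminated pixels
print(btd, ash_mask, sep="\n")
```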

  10. Linking Narrative Storylines and Quantitative Models to Combat Desertification in the Guadalentín Watershed (Spain)

    NARCIS (Netherlands)

    Kok, K.; Delden, van H.

    2013-01-01

    Desertification in Spain is largely a society-driven problem, which can be effectively managed only through a thorough understanding of the principal ecological, sociocultural, and economic driving forces (UNCCD 1994). This calls for a more active role of decision-makers and other stakeholders. We

  11. A model linking clinical workforce skill mix planning to health and health care dynamics

    Directory of Open Access Journals (Sweden)

    McDonnell Geoff

    2010-04-01

    Full Text Available Background: In an attempt to devise a simpler computable tool to assist workforce planners in determining what might be an appropriate mix of health service skills, our discussion led us to consider the implications of skill mixing and workforce composition beyond the 'stock and flow' approach of much workforce planning activity. Methods: Taking a dynamic systems approach, we were able to address the interactions, delays and feedbacks that influence the balance between the major components of health and health care. Results: We linked clinical workforce requirements to clinical workforce workload, taking into account the requisite facilities, technologies, other material resources and their funding to support clinical care microsystems; gave recognition to productivity and quality issues; took cognisance of policies, governance and power concerns in the establishment and operation of the health care system; and, going back to the individual, gave due attention to personal behaviour and biology within the socio-political family environment. Conclusion: We have produced the broad endogenous systems model of health and health care which will enable human resource planners to operate within real world variables. We are now considering the development of simple, computable national versions of this model.

  12. A model linking clinical workforce skill mix planning to health and health care dynamics.

    Science.gov (United States)

    Masnick, Keith; McDonnell, Geoff

    2010-04-30

    In an attempt to devise a simpler computable tool to assist workforce planners in determining what might be an appropriate mix of health service skills, our discussion led us to consider the implications of skill mixing and workforce composition beyond the 'stock and flow' approach of much workforce planning activity. Taking a dynamic systems approach, we were able to address the interactions, delays and feedbacks that influence the balance between the major components of health and health care. We linked clinical workforce requirements to clinical workforce workload, taking into account the requisite facilities, technologies, other material resources and their funding to support clinical care microsystems; gave recognition to productivity and quality issues; took cognisance of policies, governance and power concerns in the establishment and operation of the health care system; and, going back to the individual, gave due attention to personal behaviour and biology within the socio-political family environment. We have produced the broad endogenous systems model of health and health care which will enable human resource planners to operate within real world variables. We are now considering the development of simple, computable national versions of this model.

  13. ON THE TRANSITIONAL DISK CLASS: LINKING OBSERVATIONS OF T TAURI STARS AND PHYSICAL DISK MODELS

    International Nuclear Information System (INIS)

    Espaillat, C.; Andrews, S.; Qi, C.; Wilner, D.; Ingleby, L.; Calvet, N.; Hernández, J.; Furlan, E.; D'Alessio, P.; Muzerolle, J.

    2012-01-01

    Two decades ago 'transitional disks' (TDs) described spectral energy distributions (SEDs) of T Tauri stars with small near-IR excesses, but significant mid- and far-IR excesses. Many inferred this indicated dust-free holes in disks possibly cleared by planets. Recently, this term has been applied disparately to objects whose Spitzer SEDs diverge from the expectations for a typical full disk (FD). Here, we use irradiated accretion disk models to fit the SEDs of 15 such disks in NGC 2068 and IC 348. One group has a 'dip' in infrared emission while the others' continuum emission decreases steadily at all wavelengths. We find that the former have an inner disk hole or gap at intermediate radii in the disk and we call these objects 'transitional disks' and 'pre-transitional disks' (PTDs), respectively. For the latter group, we can fit these SEDs with FD models and find that millimeter data are necessary to break the degeneracy between dust settling and disk mass. We suggest that the term 'transitional' only be applied to objects that display evidence for a radical change in the disk's radial structure. Using this definition, we find that TDs and PTDs tend to have lower mass accretion rates than FDs and that TDs have lower accretion rates than PTDs. These reduced accretion rates onto the star could be linked to forming planets. Future observations of TDs and PTDs will allow us to better quantify the signatures of planet formation in young disks.

  14. Splice-correcting oligonucleotides restore BTK function in X-linked agammaglobulinemia model.

    Science.gov (United States)

    Bestas, Burcu; Moreno, Pedro M D; Blomberg, K Emelie M; Mohammad, Dara K; Saleh, Amer F; Sutlu, Tolga; Nordin, Joel Z; Guterstam, Peter; Gustafsson, Manuela O; Kharazi, Shabnam; Piątosa, Barbara; Roberts, Thomas C; Behlke, Mark A; Wood, Matthew J A; Gait, Michael J; Lundin, Karin E; El Andaloussi, Samir; Månsson, Robert; Berglöf, Anna; Wengel, Jesper; Smith, C I Edvard

    2014-09-01

    X-linked agammaglobulinemia (XLA) is an inherited immunodeficiency that results from mutations within the gene encoding Bruton's tyrosine kinase (BTK). Many XLA-associated mutations affect splicing of BTK pre-mRNA and severely impair B cell development. Here, we assessed the potential of antisense, splice-correcting oligonucleotides (SCOs) targeting mutated BTK transcripts for treating XLA. Both the SCO structural design and chemical properties were optimized using 2'-O-methyl, locked nucleic acid, or phosphorodiamidate morpholino backbones. In order to have access to an animal model of XLA, we engineered a transgenic mouse that harbors a BAC with an authentic, mutated, splice-defective human BTK gene. BTK transgenic mice were bred onto a Btk knockout background to avoid interference of the orthologous mouse protein. Using this model, we determined that BTK-specific SCOs are able to correct aberrantly spliced BTK in B lymphocytes, including pro-B cells. Correction of BTK mRNA restored expression of functional protein, as shown both by enhanced lymphocyte survival and reestablished BTK activation upon B cell receptor stimulation. Furthermore, SCO treatment corrected splicing and restored BTK expression in primary cells from patients with XLA. Together, our data demonstrate that SCOs can restore BTK function and that BTK-targeting SCOs have potential as personalized medicine in patients with XLA.

  15. A model linking sources of stress to approach and avoidance coping styles of Turkish basketball referees.

    Science.gov (United States)

    Anshel, Mark Howard; Sutarso, Toto; Ekmekci, Ridvan; Saraswati, Intan W

    2014-01-01

    The purpose of this study was to externally validate and test a conceptual transient model involving six paths that linked sources of acute stress to avoidance and approach coping styles among Turkish basketball referees. The sample consisted of 125 Turkish basketball referees ranging in age from 18 to 36 years (mean = 25.58, σ = 3.69). The path analysis tested the relationships simultaneously from the stressors (in consecutive order: distractions, subpar performance and verbal abuse) to coping styles, first both avoidance-cognitive and approach-cognitive, and then approach-behaviour. Results indicated that the model achieved a good fit and that all paths tested simultaneously were significant. The distractions stressor was positively related to subpar performance, which, in turn, was positively related to verbal abuse. Verbal abuse was negatively associated with an avoidance-cognitive coping style and positively related to the approach-cognitive coping style. The results also supported a crossover effect of both avoidance-cognitive and approach-cognitive styles on approach-behaviour. One implication of this study is that coping should be studied in naturally occurring stages, a process-oriented approach. Another implication is that approach and avoidance coping styles, each sub-divided into cognitive and behavioural categories, provide a meaningful framework that offers sports officials a coherent structure for learning and improving ways to cope with acute stress experienced during a contest.

  16. Linked Gauss-Diffusion processes for modeling a finite-size neuronal network.

    Science.gov (United States)

    Carfora, M F; Pirozzi, E

    2017-11-01

    A Leaky Integrate-and-Fire (LIF) model with stochastic current-based linkages is considered to describe the firing activity of neurons interacting in a (2×2)-size feed-forward network. In the subthreshold regime and under the assumption that no more than one spike is exchanged between coupled neurons, the stochastic evolution of the neuronal membrane voltage is subject to random jumps due to interactions in the network. Linked Gauss-Diffusion processes are proposed to describe this dynamics and to provide estimates of the firing probability density of each neuron. To this end, an iterated integral equation-based approach is applied to evaluate numerically the first passage time density of such processes through the firing threshold. Asymptotic approximations of the firing densities of surrounding neurons are used to obtain closed-form expressions for the mean of the involved processes and to simplify the numerical procedure. An extension of the model to an (N×N)-size network is also given. Histograms of firing times obtained by simulations of the LIF dynamics and numerical firings estimates are compared. Copyright © 2017 Elsevier B.V. All rights reserved.
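
    As an illustration of the building block of such a network, the sketch below integrates a single LIF unit with noisy input using an Euler-Maruyama scheme, with an added voltage jump standing in for a spike received from a linked neuron. All parameters, and the Poisson-like arrival of presynaptic spikes, are illustrative assumptions rather than the paper's formulation.

```python
import numpy as np

# Single leaky integrate-and-fire unit with noisy drive; an incoming spike
# from a coupled neuron adds a voltage jump. Parameters are illustrative.

rng = np.random.default_rng(3)
dt, tau, v_rest, v_th, v_reset = 1e-4, 0.02, -70e-3, -50e-3, -70e-3
mu, sigma, jump = 25e-3, 3e-3, 2e-3   # drift, noise amplitude, coupling jump (assumed)

v, spikes = v_rest, []
for step in range(200_000):
    # Euler-Maruyama update of the subthreshold Ornstein-Uhlenbeck dynamics
    v += (-(v - v_rest) + mu) / tau * dt + sigma * np.sqrt(dt / tau) * rng.standard_normal()
    if rng.random() < 0.001:   # assumed arrival of a presynaptic spike
        v += jump
    if v >= v_th:              # threshold crossing = firing time
        spikes.append(step * dt)
        v = v_reset

print(len(spikes), spikes[:5])
```

    Histogramming the collected firing times over many runs would approximate the first-passage-time density that the paper evaluates with its integral-equation approach.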

  17. New chondrosarcoma cell lines and mouse models to study the link between chondrogenesis and chemoresistance.

    Science.gov (United States)

    Monderer, David; Luseau, Alexandrine; Bellec, Amélie; David, Emmanuelle; Ponsolle, Stéphanie; Saiagh, Soraya; Bercegeay, Sylvain; Piloquet, Philippe; Denis, Marc G; Lodé, Laurence; Rédini, Françoise; Biger, Marine; Heymann, Dominique; Heymann, Marie-Françoise; Le Bot, Ronan; Gouin, François; Blanchard, Frédéric

    2013-10-01

    Chondrosarcomas are cartilage-forming, poorly vascularized tumors. They represent the second most common malignant primary bone tumor of adults after osteosarcoma, but in contrast to osteosarcoma they are resistant to chemotherapy and radiotherapy, surgical excision remaining the only therapeutic option. Few cell lines and animal models are available, and the mechanisms behind their chemoresistance remain largely unknown. Our goal was to establish new cell lines and animal cancer models from human chondrosarcoma biopsies to study their chemoresistance. Between 2007 and 2012, 10 chondrosarcoma biopsies were collected and used for cell culture and transplantation into nude mice. Only one transplanted biopsy and one injected cell line have engrafted successfully, leading to conventional central high-grade chondrosarcoma similar to the original biopsies. In culture, two new stable cell lines were obtained, one from a dedifferentiated and one from a grade III conventional central chondrosarcoma biopsy. Their genetic characterization revealed triploid karyotypes, mutations in IDH1, IDH2, and TP53, and deletion in CDKN2A and/or MDM2 amplification. These cell lines expressed mesenchymal membrane markers (CD44, 73, 90, 105) and were able to produce a hyaline cartilaginous matrix when cultured in chondrogenic three-dimensional (3D) pellets. Using a high-throughput quantitative RT-PCR approach, we observed that cell lines cultured in monolayer had lost expression of several genes implicated in cartilage development (COL2A1, COMP, ACAN) but restored their expression in 3D cultures. Chondrosarcoma cells in monolayer were sensitive to several conventional chemotherapeutic agents but became resistant to low doses of mafosfamide or doxorubicin when cultured in 3D pellets, in parallel with an altered nucleic accumulation of the drug. Our results indicate that the cartilaginous matrix produced by chondrosarcoma cells may impair diffusion of several drugs and thus contribute to chemoresistance

  18. Quantitative Modeling of Microbial Population Responses to Chronic Irradiation Combined with Other Stressors.

    Directory of Open Access Journals (Sweden)

    Igor Shuryak

    Full Text Available Microbial population responses to combined effects of chronic irradiation and other stressors (chemical contaminants, other sub-optimal conditions) are important for ecosystem functioning and bioremediation in radionuclide-contaminated areas. Quantitative mathematical modeling can improve our understanding of these phenomena. To identify general patterns of microbial responses to multiple stressors in radioactive environments, we analyzed three data sets on: (1) bacteria isolated from soil contaminated by nuclear waste at the Hanford site (USA); (2) fungi isolated from the Chernobyl nuclear power plant (Ukraine) buildings after the accident; (3) yeast subjected to continuous γ-irradiation in the laboratory, where radiation dose rate and cell removal rate were independently varied. We applied generalized linear mixed-effects models to describe the first two data sets, whereas the third data set was amenable to mechanistic modeling using differential equations. Machine learning and information-theoretic approaches were used to select the best-supported formalism(s) among biologically plausible alternatives. Our analysis suggests the following: (1) Both radionuclides and co-occurring chemical contaminants (e.g. NO2) are important for explaining microbial responses to radioactive contamination. (2) Radionuclides may produce non-monotonic dose responses: stimulation of microbial growth at low concentrations vs. inhibition at higher ones. (3) The extinction-defining critical radiation dose rate is dramatically lowered by additional stressors. (4) Reproduction suppression by radiation can be more important for determining the critical dose rate than radiation-induced cell mortality. In conclusion, the modeling approaches used here on three diverse data sets provide insight into explaining and predicting multi-stressor effects on microbial communities: (1) the most severe effects (e.g. extinction) on microbial populations may occur when unfavorable environmental
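
    A toy version of the mechanistic, differential-equation style of model described for the yeast data set: the net per-capita growth rate combines a reproduction term suppressed by dose rate with a removal term, and the critical dose rate is where the net rate crosses zero. The functional form and all constants are invented for illustration.

```python
import numpy as np

# Toy mechanistic model: reproduction falls with dose rate, removal is
# constant; extinction occurs where the net per-capita rate crosses zero.
# Functional form and constants are illustrative assumptions.

def net_rate(dose_rate, r0=0.5, k_suppress=0.02, removal=0.1):
    return r0 * np.exp(-k_suppress * dose_rate) - removal

dose_rates = np.linspace(0, 200, 2001)
net = net_rate(dose_rates)
critical = dose_rates[np.argmax(net < 0)]   # first dose rate with negative net rate
print(f"critical dose rate ~ {critical:.1f} (arbitrary units)")
```

    Adding a second stressor as an extra subtraction from the net rate would shift the zero crossing to a lower dose rate, the qualitative pattern reported in point (3) above.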

  19. A Quantitative Human Spacecraft Design Evaluation Model for Assessing Crew Accommodation and Utilization

    Science.gov (United States)

    Fanchiang, Christine

    Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission from commercial space tourism, to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle into cutting edge rocketry with the assumption that the astronauts could be trained and will adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process, and ensure that the crewmembers are well-integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here is done at a very fundamental level starting with identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. While the model is the primary tangible product from this research, the more interesting outcome of

  20. Parasite to patient: A quantitative risk model for Trichinella spp. in pork and wild boar meat.

    Science.gov (United States)

    Franssen, Frits; Swart, Arno; van der Giessen, Joke; Havelaar, Arie; Takumi, Katsuhisa

    2017-01-16

    Consumption of raw or inadequately cooked pork meat may result in trichinellosis, a human disease due to nematodes of the genus Trichinella. In many countries worldwide, individual control of pig carcasses at meat inspection is mandatory but incurs high costs given the absence of positive carcasses among pigs reared under controlled housing. EU regulation 2015/1375 implements an alternative risk-based approach, in view of the absence of positive findings in pigs under controlled housing conditions. Moreover, Codex Alimentarius guidelines for the control of Trichinella spp. in meat of suidae have been published (CAC, 2015) and used in conjunction with the OIE Terrestrial Animal Health Code to provide guidance to governments and industry on risk-based control measures to prevent human exposure to Trichinella spp. and to facilitate international pork trade. To further support such a risk-based approach, we model the risk of human trichinellosis due to consumption of meat from infected pigs, raised under non-controlled housing, and wild boar, using Quantitative Microbial Risk Assessment (QMRA) methods. Our model quantifies the distribution of Trichinella muscle larvae (ML) in swine, test sensitivity at carcass control, partitioning of edible pork parts, Trichinella ML distribution in edible muscle types, heat inactivation by cooking and portion sizes. The resulting exposure estimate is combined with a dose-response model for Trichinella species to estimate the incidence of human illness after consumption of infected meat. Parameter estimation is based on experimental and observational datasets. In Poland, which served as an example, we estimated an average incidence of 0.90 (95%CI: 0.00-3.68) trichinellosis cases per million persons per year (Mpy) due to consumption of pork from pigs that were reared under non-controlled housing, and 1.97 (95%CI: 0.82-4.00) cases per Mpy due to consumption of wild boar. The total estimated incidence of human trichinellosis attributed to
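
    The parasite-to-patient chain lends itself to a compact Monte Carlo sketch: larvae per raw portion, binomial survival through cooking, then an exponential dose-response. Every parameter value below is an illustrative placeholder, not one of the fitted estimates.

```python
import numpy as np

# Sketch of a QMRA exposure chain: larvae in a raw portion, heat inactivation
# as a binomial survival step, then an exponential dose-response for illness.
# All parameter values are illustrative placeholders.

rng = np.random.default_rng(7)

def mean_illness_prob(n_portions=100_000, mean_larvae=0.05,
                      log10_reduction=3.0, r=0.01):
    larvae = rng.poisson(mean_larvae, n_portions)                 # larvae per raw portion
    surviving = rng.binomial(larvae, 10.0 ** -log10_reduction)    # survive cooking
    p_ill = 1.0 - np.exp(-r * surviving)                          # exponential dose-response
    return p_ill.mean()

print(mean_illness_prob())
```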

  1. Quantitative modeling of electron spectroscopy intensities for supported nanoparticles: The hemispherical cap model for non-normal detection

    Science.gov (United States)

    Sharp, James C.; Campbell, Charles T.

    2015-02-01

    Nanoparticles of one element or compound dispersed across the surface of another substrate element or compound form the basis for many materials of great technological importance, such as heterogeneous catalysts, fuel cells and other electrocatalysts, photocatalysts, chemical sensors and biomaterials. They also form during film growth by deposition in many fabrication processes. The average size and number density of such nanoparticles are often very important, and these can be estimated with electron microscopy or scanning tunneling microscopy. However, this is very time consuming and often unavailable with sufficient resolution when the particle size is ~ 1 nm. Because the probe depth of electron spectroscopies like X-Ray Photoelectron Spectroscopy (XPS) or Auger Electron Spectroscopy (AES) is ~ 1 nm, these provide quantitative information on both the total amount of adsorbed material when it is in the form of such small nanoparticles, and the particle thickness. For electron spectroscopy conducted with electron detection normal to the surface, Diebold et al. (1993) derived analytical relationships between the signal intensities for the adsorbate and substrate and the particles' average size and number density, under the assumption that all the particles have hemispherical shape and the same radius. In this paper, we report a simple angle- and particle-size-dependent correction factor that can be applied to these analytical expressions so that they can also be extended to measurements made at other detection angles away from the surface normal. This correction factor is computed using numerical integration and presented for use in future modeling. This correction factor is large (> 2) for angles beyond 60°, so comparing model predictions to measurements at both 0° and ≥ 60° will also provide a new means for testing the model's assumptions (hemispherical shape and fixed size particles). The ability to compare the hemispherical cap model at several angles
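
    The geometry behind the angle-dependent correction can be sketched by Monte Carlo integration: the adsorbate signal from a hemispherical particle is the volume average of exp(-s/λ), where s is the escape path length to the particle surface along the detection direction, and the angle-dependent factor is the ratio I(θ)/I(0). This is an independent, illustrative reimplementation of the idea, not the authors' tabulated correction factors.

```python
import numpy as np

# Monte Carlo volume integral over a hemisphere (flat side on the substrate,
# center at the origin): each interior point's escape path s along the
# detection direction attenuates its signal as exp(-s/lam). Illustrative only.

rng = np.random.default_rng(11)

def signal(theta_deg, radius=1.0, lam=1.0, n=200_000):
    d = np.array([np.sin(np.radians(theta_deg)), 0.0, np.cos(np.radians(theta_deg))])
    p = rng.uniform(-radius, radius, (n, 3))
    p = p[(np.einsum('ij,ij->i', p, p) <= radius**2) & (p[:, 2] >= 0)]  # keep hemisphere
    pd = p @ d
    # positive root of |p + s*d| = radius gives the escape distance s
    s = -pd + np.sqrt(pd**2 + radius**2 - np.einsum('ij,ij->i', p, p))
    return np.exp(-s / lam).mean()

print(signal(60.0) / signal(0.0))   # relative adsorbate signal at 60 deg vs normal
```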

  2. Using ISOS consensus test protocols for development of quantitative life test models in ageing of organic solar cells

    DEFF Research Database (Denmark)

    Kettle, J.; Stoichkov, V.; Kumar, D.

    2017-01-01

    As Organic Photovoltaic (OPV) development matures, the demand grows for rapid characterisation of degradation and application of Quantitative Accelerated Life Tests (QALT) models to predict and improve reliability. To date, most accelerated testing on OPVs has been conducted using ISOS consensus ...

  3. Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships (MCBIOS)

    Science.gov (United States)

    Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships Jie Liu1,2, Richard Judson1, Matthew T. Martin1, Huixiao Hong3, Imran Shah1 1National Center for Computational Toxicology (NCCT), US EPA, RTP, NC...

  4. Physically based dynamic run-out modelling for quantitative debris flow risk assessment: a case study in Tresenda, northern Italy

    Czech Academy of Sciences Publication Activity Database

    Quan Luna, B.; Blahůt, Jan; Camera, C.; Van Westen, C.; Apuani, T.; Jetten, V.; Sterlacchini, S.

    2014-01-01

    Vol. 72, No. 3 (2014), pp. 645-661 ISSN 1866-6280 Institutional support: RVO:67985891 Keywords: debris flow * FLO-2D * run-out * quantitative hazard and risk assessment * vulnerability * numerical modelling Subject RIV: DB - Geology; Mineralogy Impact factor: 1.765, year: 2014

  5. A Quantitative Study of Faculty Perceptions and Attitudes on Asynchronous Virtual Teamwork Using the Technology Acceptance Model

    Science.gov (United States)

    Wolusky, G. Anthony

    2016-01-01

    This quantitative study used a web-based questionnaire to assess the attitudes and perceptions of online and hybrid faculty towards student-centered asynchronous virtual teamwork (AVT) using the technology acceptance model (TAM) of Davis (1989). AVT is online student participation in a team approach to problem-solving culminating in a written…

  6. Quantitative acid-base physiology using the Stewart model. Does it improve our understanding of what is really wrong?

    NARCIS (Netherlands)

    Derksen, R.; Scheffer, G.J.; Hoeven, J.G. van der

    2006-01-01

    Traditional theories of acid-base balance are based on the Henderson-Hasselbalch equation to calculate proton concentration. The recent revival of quantitative acid-base physiology using the Stewart model has increased our understanding of complicated acid-base disorders, but has also led to several

  7. Model development for quantitative evaluation of nuclear fuel cycle alternatives and its application

    International Nuclear Information System (INIS)

    Ko, Won Il

    2000-02-01

    This study addresses the quantitative evaluation of the proliferation resistance and the economics, which are important factors of alternative nuclear fuel cycle systems. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles, and a fuel cycle cost analysis model was suggested to incorporate various uncertainties in the fuel cycle cost calculation. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of the future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, a proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. In this model, the proliferation resistance was described as the relative size of the barrier that must be overcome in order to acquire nuclear weapons. Therefore, a larger barrier means that the risk of failure is great, expenditure of resources is large and the time scale for implementation is long. The electromotive force was expressed as the political motivation of the potential proliferators, such as an unauthorized party or a national group, to acquire nuclear weapons. The electrical current was then defined as the proliferation resistance index. There are two electrical circuit models used in the evaluation of the proliferation resistance: the series and the parallel circuits. In the series circuit model of the proliferation resistance, a potential proliferator has to overcome all resistance barriers to achieve the manufacturing of nuclear weapons. This phenomenon could be explained by the fact that the IAEA (International Atomic Energy Agency)'s safeguards philosophy relies on the defense-in-depth principle against nuclear proliferation at a specific facility. The parallel circuit model was also used to imitate the risk of proliferation for
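
    The circuit analogy reduces to Ohm's law: barriers act as resistances, motivation as an electromotive force, and the index is the resulting current. A minimal sketch with invented barrier values:

```python
# Sketch of the circuit analogy described above: barriers as resistances,
# motivation as an electromotive force (emf), and the proliferation
# resistance index as the resulting current. Values are illustrative.

def series_index(emf, resistances):
    # all barriers must be overcome in sequence (defense in depth)
    return emf / sum(resistances)

def parallel_index(emf, resistances):
    # alternative acquisition paths act like parallel branches
    return emf * sum(1.0 / r for r in resistances)

barriers = [5.0, 3.0, 8.0]   # e.g. safeguards, physical protection, material quality (assumed)
print(series_index(1.0, barriers), parallel_index(1.0, barriers))
```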

  8. Demographic Changes and Real Estate Values. A Quantitative Model for Analyzing the Urban-Rural Linkages

    Directory of Open Access Journals (Sweden)

    Massimiliano Bencardino

    2017-03-01

    Full Text Available Vast metropolitan areas include both urban areas and rural outskirts. Between these areas there are strong links, to the point that they cannot be examined separately. Residential functions and working activities, as well as the traditional agricultural sector, are now all present in the rural outskirts, so the production of goods and services for the city requires a combined analysis of the large territory involved. The evolution of the population of such a large territory can be studied in great detail, down to the single census area, with the use of Geographic Information Systems (GIS). Such demographic developments, in turn, produce effects on urban real estate values. This work demonstrates the existing interconnections between urban areas and rural outskirts. The collection of data on population trends and house prices in the Naples metropolitan area, and the subsequent spatial processing of such data, allows thematic maps to be established, from which a model capable of interpreting population development is defined. A study of the statistical correlations shows the consequences that population dynamics produce for property prices. In addition, a diachronic analysis of the sales prices of residential buildings demonstrates that economic functions once exclusive to certain urban or rural territories end up being distributed and integrated.

  9. Development & validation of a quantitative anti-protective antigen IgG enzyme linked immunosorbent assay for serodiagnosis of cutaneous anthrax

    Directory of Open Access Journals (Sweden)

    N Ghosh

    2015-01-01

    Full Text Available Background & objectives: Anthrax caused by Bacillus anthracis is primarily a disease of herbivorous animals, although several mammals are vulnerable to it. ELISA is the most widely accepted serodiagnostic assay for large-scale surveillance of cutaneous anthrax. The aims of this study were to develop and evaluate a quantitative ELISA for determination of IgG antibodies against B. anthracis protective antigen (PA) in human cutaneous anthrax cases. Methods: A quantitative ELISA was developed using the recombinant PA for coating and the standard reference serum AVR801 for quantification. A total of 116 human test and control serum samples were used in the study. The assay was evaluated for its precision, accuracy and linearity. Results: The minimum detection limit and lower limit of quantification of the assay for anti-PA IgG were 3.2 and 4 µg/ml, respectively. The serum samples collected from the anthrax-infected patients were found to have anti-PA IgG concentrations of 5.2 to 166.3 µg/ml. The intra-assay precision (per cent CV) within an assay and within an operator ranged from 0.99 to 7.4 per cent and 1.7 to 3.9 per cent, respectively. The accuracy of the assay was high, with a per cent error of 6.5-24.1 per cent. The described assay was found to be linear over the range of 4 to 80 ng/ml (R² = 0.9982; slope = 0.9186; intercept = 0.1108). Interpretation & conclusions: The results suggested that the developed assay could be a useful tool for quantification of the anti-PA IgG response in humans after anthrax infection or vaccination.
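
    Quantification in such an assay rests on a standard curve built from reference-serum dilutions: within the stated linear range, a linear calibration can be inverted to read off test-sample concentrations. The optical densities below are placeholders, not the study's readings.

```python
import numpy as np

# Sketch of standard-curve quantification: fit a line through the reference
# dilutions in the assay's stated linear range (4-80 ng/ml), then invert it
# to convert test-well optical densities to concentrations. OD values are
# invented placeholders.

std_conc = np.array([4, 8, 16, 32, 64, 80], dtype=float)   # ng/ml
std_od = np.array([0.11, 0.18, 0.33, 0.62, 1.19, 1.47])    # placeholder readings

slope, intercept = np.polyfit(std_conc, std_od, 1)

def od_to_conc(od):
    return (od - intercept) / slope   # invert the calibration line

print(od_to_conc(np.array([0.25, 0.90])))   # ng/ml for two test wells
```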

  10. Quantitative Modeling of Membrane Transport and Anisogamy by Small Groups Within a Large-Enrollment Organismal Biology Course

    Directory of Open Access Journals (Sweden)

    Eric S. Haag

    2016-12-01

    Full Text Available Quantitative modeling is not a standard part of undergraduate biology education, yet is routine in the physical sciences. Because of the obvious biophysical aspects, classes in anatomy and physiology offer an opportunity to introduce modeling approaches to the introductory curriculum. Here, we describe two in-class exercises for small groups working within a large-enrollment introductory course in organismal biology. Both build and derive biological insights from quantitative models, implemented using spreadsheets. One exercise models the evolution of anisogamy (i.e., small sperm and large eggs) from an initial state of isogamy. Groups of four students work on Excel spreadsheets (from one to four laptops per group). The other exercise uses an online simulator to generate data related to membrane transport of a solute, and a cloud-based spreadsheet to analyze them. We provide tips for implementing these exercises gleaned from two years of experience.
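
    Although the exercises use spreadsheets, the membrane-transport model translates directly into a few lines of code: diffusive flux proportional to the concentration difference, stepped over time like successive spreadsheet rows. Permeability, geometry and concentrations are illustrative assumptions, not the course's simulator settings.

```python
# Sketch of a membrane-transport model of the kind the exercise explores:
# flux J = P * A * (C_out - C_in), stepped over time like spreadsheet rows.
# All parameter values are illustrative assumptions.

P, area, volume, dt = 1e-6, 1e-4, 1e-9, 0.1   # m/s, m^2, m^3, s
c_out, c_in = 10.0, 0.0                        # mol/m^3

for step in range(5):
    flux = P * area * (c_out - c_in)           # mol/s entering the cell
    c_in += flux * dt / volume                 # update internal concentration
    print(f"t={(step + 1) * dt:.1f}s  C_in={c_in:.3f} mol/m^3")
```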

  11. Forging a link between mentoring and collaboration: a new training model for implementation science.

    Science.gov (United States)

    Luke, Douglas A; Baumann, Ana A; Carothers, Bobbi J; Landsverk, John; Proctor, Enola K

    2016-10-13

    publishing peer-reviewed papers. Statistical network models demonstrated that mentoring was strongly and significantly related to subsequent scientific collaboration, supporting a core design principle of the IRI. Future work should establish the link between mentoring and scientific productivity. These results may be of interest to team science researchers, as they suggest the importance of mentoring for future team collaborations and illustrate the utility of network analysis for studying team characteristics and activities.
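
    A toy version of the mentoring-to-collaboration question, sketched in Python with networkx on invented edge lists (the study itself fit statistical network models, which this simple overlap count does not reproduce):

        # Check how many mentoring dyads later appear as collaboration edges.
        # Both edge lists are invented for illustration.
        import networkx as nx

        mentoring = nx.Graph([("A", "B"), ("A", "C"), ("D", "E")])
        collab    = nx.Graph([("A", "B"), ("D", "E"), ("B", "C")])

        overlap = sum(collab.has_edge(u, v) for u, v in mentoring.edges())
        print(f"{overlap}/{mentoring.number_of_edges()} mentoring pairs also collaborate")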

  12. Collagenase-mediated tissue modeling of corneal ectasia and collagen cross-linking treatments.

    Science.gov (United States)

    Hong, Cheng W; Sinha-Roy, Abhijit; Schoenfield, Lynn; McMahon, James T; Dupps, William J

    2012-04-30

    Corneal collagen cross-linking (CXL) is a method for modifying the natural history of keratoconus and other corneal ectatic diseases. The authors assessed the use of collagenase to generate an experimental model of ectasia in which the topographic effects of CXL interventions could be evaluated. Nine human corneoscleral specimens unsuitable for transplantation were used. After epithelial debridement, mounting, and pressurization on an artificial anterior chamber, a solution of 10 mg/mL collagenase type II with 15% dextran was applied to five corneas for three hours. Three of these corneas subsequently underwent riboflavin/UV-A CXL. Scheimpflug-based tomography was performed before collagenase exposure, after collagenase exposure, and after CXL to evaluate changes in maximum axial curvature of the anterior surface (K(max)) at three IOP levels. Results were compared to four control eyes exposed to dextran alone. A statistically significant increase in K(max) was seen across all IOP levels in the collagenase group compared to the control group (+6.6 ± 1.1 diopters [D] and +0.3 ± 0.8 D, respectively, at physiological IOP). After CXL, K(max) decreased (-7.6 ± 2.0 D at physiological IOP). Anterior corneal aberrations increased after collagenase exposure and decreased after CXL. Light microscopy showed loss of normal stromal collagen architecture and localized edema after collagenase exposure. A method for generating the topographic features of corneal ectasia in human tissue is thus demonstrated. No significant sensitivity of K(max) to IOP was observed. CXL caused regression of both the steepening and the induced aberrations in this model, consistent with clinical trends. The model may be useful for testing modifications to standard CXL techniques.

  13. Linking hydrologic and bedload transport models to simulate fluvial response to changing precipitation

    Science.gov (United States)

    Wickert, A. D.; Ng, G. H. C.; Tofelde, S.; Savi, S.; Schildgen, T. F.; Alonso, R. N.; Strecker, M. R.

    2015-12-01

    Changes in precipitation can drive river aggradation or incision through their influence on both hillslope processes, which supply sediment to the channel, and sediment transport capacity, which moves sediment downstream. Whether a particular change in precipitation intensity and/or duration will result in aggradation or incision is difficult to predict because of these competing effects. In particular, fluvial response to climate change is sensitive to (1) thresholds and nonlinearities in sediment production and sediment transport, (2) how different modes of sediment production affect the grain size of the sediment supplied to the channel, and (3) the impact of drainage basin geometry on sediment storage time and on the locations of rapid sediment production and/or transport. A better mechanistic understanding of the relationship between rainfall and river-bed elevation change will help us understand modern channel response to climate change and decipher the causes of fluvial terrace formation. Here we couple a hydrologic model, the Precipitation-Runoff Modeling System (PRMS), with a model of sediment transport through a fluvial network, sedFlow, to predict patterns of bed elevation change. We first perform schematic simulations on an idealized synthetic landscape with a single river channel to show how simple fluvial systems can respond to changes in rainfall. We then expand these numerical tests to full fluvial networks, in which the segments of the tributary network propagate signals of aggradation and incision, leading to a more complex response that embodies the interference between the magnitudes and time scales of sediment transfer in the tributary links. We showcase the possible complexity of the fluvial response with an example from the Quebrada del Toro of NW Argentina, which is currently experiencing rapid and spatially variable aggradation and incision, possibly in response to an increase in extreme rainfall events in the east-central Andes.
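
    The coupling logic can be caricatured in a single-reach Python loop: precipitation sets discharge, discharge sets transport capacity, and the supply-capacity imbalance moves the bed (an Exner-type balance). Every relation and constant below is a schematic stand-in, not PRMS or sedFlow:

        # Schematic one-reach coupling of runoff and bedload transport.
        # Power-law stand-ins with arbitrary constants, for illustration only.
        rainfall = [5, 5, 20, 40, 10, 5, 5]   # mm/day for each day

        z = 100.0                 # bed elevation (m)
        qs_in = 0.5               # sediment supply from upstream (arbitrary units)
        k_q, k_t, n = 0.1, 0.5, 1.5

        for day, p in enumerate(rainfall):
            q = k_q * p                   # runoff: discharge ~ precipitation
            qs_out = k_t * q ** n         # transport capacity ~ Q^n
            z += (qs_in - qs_out) * 0.01  # Exner-type bed change (scaled)
            print(f"day {day}: Q={q:5.2f}  qs_out={qs_out:5.3f}  z={z:8.4f}")

    With these invented constants, low-rainfall days leave capacity below supply (aggradation), while the high-rainfall day pushes capacity above supply (incision), mirroring the competing effects described above.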

  14. Linking human diseases to animal models using ontology-based phenotype annotation.