WorldWideScience

Sample records for modeling quantitative links

  1. Quantitative Raman characterization of cross-linked collagen thin films as a model system for diagnosing early osteoarthritis

    Science.gov (United States)

    Wang, Chao; Durney, Krista M.; Fomovsky, Gregory; Ateshian, Gerard A.; Vukelic, Sinisa

    2016-03-01

    The onset of osteoarthritis (OA) in articular cartilage is characterized by degradation of extracellular matrix (ECM). Specifically, breakage of cross-links between collagen fibrils in the articular cartilage leads to loss of structural integrity of the bulk tissue. Since there are no broadly accepted, non-invasive, label-free tools for diagnosing OA at its early stage, Raman spectroscopy is proposed in this work as a novel, non-destructive diagnostic tool. In this study, collagen thin films were employed as a simplified model system of the cartilage collagen extracellular matrix. Cross-link formation was controlled via exposure to glutaraldehyde (GA), by varying exposure time and concentration levels, and Raman spectral information was collected to quantitatively characterize the cross-link assignments imparted to the collagen thin films during treatment. A novel, quantitative method was developed to analyze the Raman signal obtained from collagen thin films. Segments of the Raman signal were decomposed and modeled as the sum of individual bands, providing an optimization function for subsequent curve fitting against experimental findings. Relative changes in the concentration of the GA-induced pyridinium cross-links were extracted from the model as a function of exposure to GA. Spatially resolved characterization enabled construction of spectral maps of the collagen thin films, which provided detailed information about the variation of cross-link formation at various locations on the specimen. Results showed that Raman spectral data correlate with glutaraldehyde treatment and may therefore be used as a proxy by which to measure loss of collagen cross-links in vivo. This study thus proposes a promising means of identifying the onset of OA and may enable early interventions that slow or prevent osteoarthritis progression.
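
    The band-decomposition step described above can be sketched in a few lines. The following is a minimal illustration, assuming Lorentzian band shapes and SciPy's curve_fit; the band count, centers, and the pyridinium assignment are invented for illustration, not the authors' actual parameters:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lorentzian(x, amp, x0, gamma):
        # A single Raman band modeled as a Lorentzian profile
        return amp * gamma**2 / ((x - x0)**2 + gamma**2)

    def band_sum(x, *params):
        # Sum of individual bands; params = (amp, x0, gamma) per band
        y = np.zeros_like(x)
        for i in range(0, len(params), 3):
            y += lorentzian(x, *params[i:i + 3])
        return y

    # Hypothetical spectrum segment (wavenumber axis in cm^-1) with two bands
    x = np.linspace(1550, 1700, 300)
    rng = np.random.default_rng(0)
    y = band_sum(x, 1.0, 1605, 8, 0.6, 1660, 10) + rng.normal(0, 0.02, x.size)

    # Initial guesses for two bands (e.g., a putative pyridinium band and amide I)
    p0 = [0.8, 1600, 10, 0.5, 1655, 12]
    popt, _ = curve_fit(band_sum, x, y, p0=p0)
    print("fitted (amp, center, width) per band:", popt.reshape(-1, 3))
    ```

    Relative cross-link concentration would then be tracked through the fitted amplitude (or area) of the assigned band across GA exposure conditions.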

  2. Exploring alternative models for sex-linked quantitative trait loci in outbred populations: application to an Iberian x Landrace pig intercross.

    OpenAIRE

    Pérez-Enciso, Miguel; Clop, Alex; Folch, Josep M; Sánchez, Armand; Oliver, Maria A; Ovilo, Cristina; Barragán, C; Varona, Luis; Noguera, José L

    2002-01-01

    We present a very flexible method that allows us to analyze X-linked quantitative trait loci (QTL) in crosses between outbred lines. The dosage compensation phenomenon is modeled explicitly in an identity-by-descent approach. A variety of models can be fitted, ranging from considering alternative fixed alleles within the founder breeds to a model where the only genetic variation is within breeds, as well as mixed models. Different genetic variances within each founder breed can be estimated. ...

  3. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    Full Text Available The study of cognitive development hinges largely on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework, applicable to any infant gaze experiment, and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.
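
    To make the idea of a quantitative linking hypothesis concrete, here is a small Python sketch (not the authors' MATLAB code): two candidate linking functions map a hypothetical internal signal to the probability of a look, and are compared by binomial log-likelihood on simulated look/no-look data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical internal signal for 40 trials; looks generated under a
    # logistic link (the ground truth for this toy example)
    signal = rng.uniform(-2, 2, 40)
    p_true = 1 / (1 + np.exp(-2.0 * signal))
    looks = rng.random(40) < p_true

    def loglik(p, y):
        # Binomial log-likelihood of look/no-look outcomes y under probabilities p
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return np.sum(np.where(y, np.log(p), np.log(1 - p)))

    # Linking hypothesis A: looking probability is a logistic transform of the signal
    pA = 1 / (1 + np.exp(-2.0 * signal))
    # Linking hypothesis B: looking is insensitive to the signal (constant rate)
    pB = np.full_like(signal, looks.mean())

    print("logL A (logistic link):", round(loglik(pA, looks), 2))
    print("logL B (constant rate):", round(loglik(pB, looks), 2))
    ```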

  4. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  5. A quantitative description of the microwave properties of melt cast Bi{sub 2}Sr{sub 2}CaCu{sub 2}O{sub 8} in terms of a weak-link model

    Energy Technology Data Exchange (ETDEWEB)

    Godel, G.; Gold, N.; Hasse, J.; Bock, J.; Halbritter, J. [Phys. Inst., Karlsruhe Univ. (Germany)

    1994-10-01

    The granular structure dominates the RF properties of the material. Below T{sub c} the surface resistance at 11.27 GHz of Bi{sub 2}Sr{sub 2}CaCu{sub 2}O{sub 8} initially drops more slowly than BCS theory predicts. Below T{sub c}/2 it shows a linear temperature dependence and a quadratic frequency and field dependence with an RF critical magnetic field of <130 A m{sup -1} at 4.2 K. This behaviour is attributed to the existence of weak superconducting regions between crystallites, which provides a strikingly good description of the data. The weak links with a boundary resistance R{sub bn} have to be regarded as Josephson junctions with reduced superconducting properties and normal conducting leakage currents. We conclude that the weak-link model gives a consistent description of the DC and microwave properties, not only in the magnitude of the penetration depth and surface resistance but also in their temperature, field and frequency dependence. Conversely, it is possible to obtain from it quantitative information about weak links in the superconductor Bi{sub 2}Sr{sub 2}CaCu{sub 2}O{sub 8}. (author)
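
    As a reading aid, one simple functional form consistent with the stated low-temperature behaviour (linear in T, quadratic in frequency and RF field) is the following; the prefactor R_0, the coefficient alpha and the reference scales are schematic, not values from the paper:

    ```latex
    % Schematic weak-link surface resistance for T < T_c/2
    R_s(T, f, H) \approx R_0 \left(\frac{f}{f_0}\right)^{2}
    \left(1 + \alpha\,\frac{T}{T_c}\right)
    \left[1 + \left(\frac{H}{H_{c,\mathrm{RF}}}\right)^{2}\right]
    ```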

  6. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  7. Quantitative microbiome profiling links gut community variation to microbial load.

    Science.gov (United States)

    Vandeputte, Doris; Kathagen, Gunter; D'hoe, Kevin; Vieira-Silva, Sara; Valles-Colomer, Mireia; Sabino, João; Wang, Jun; Tito, Raul Y; De Commer, Lindsey; Darzi, Youssef; Vermeire, Séverine; Falony, Gwen; Raes, Jeroen

    2017-11-23

    Current sequencing-based analyses of faecal microbiota quantify microbial taxa and metabolic pathways as fractions of the sample sequence library generated by each analysis. Although these relative approaches permit detection of disease-associated microbiome variation, they are limited in their ability to reveal the interplay between microbiota and host health. Comparative analyses of relative microbiome data cannot provide information about the extent or directionality of changes in taxa abundance or metabolic potential. If microbial load varies substantially between samples, relative profiling will hamper attempts to link microbiome features to quantitative data such as physiological parameters or metabolite concentrations. Saliently, relative approaches ignore the possibility that altered overall microbiota abundance itself could be a key identifier of a disease-associated ecosystem configuration. To enable genuine characterization of host-microbiota interactions, microbiome research must exchange ratios for counts. Here we build a workflow for the quantitative microbiome profiling of faecal material, through parallelization of amplicon sequencing and flow cytometric enumeration of microbial cells. We observe up to tenfold differences in the microbial loads of healthy individuals and relate this variation to enterotype differentiation. We show how microbial abundances underpin both microbiota variation between individuals and covariation with host phenotype. Quantitative profiling bypasses compositionality effects in the reconstruction of gut microbiota interaction networks and reveals that the taxonomic trade-off between Bacteroides and Prevotella is an artefact of relative microbiome analyses. Finally, we identify microbial load as a key driver of observed microbiota alterations in a cohort of patients with Crohn's disease, here associated with a low-cell-count Bacteroides enterotype (as defined through relative profiling).
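
    The core arithmetic of quantitative profiling, rescaling relative taxon fractions by a flow-cytometric cell count, is simple; a minimal sketch with invented numbers (not data from the study):

    ```python
    # Hypothetical relative abundances from amplicon sequencing (fractions sum to 1)
    rel = {"Bacteroides": 0.45, "Prevotella": 0.05, "Faecalibacterium": 0.50}

    # Hypothetical flow-cytometry load: microbial cells per gram of faecal material
    cells_per_gram = 1.2e11

    # Quantitative (absolute) profile: cells per gram attributed to each taxon
    abs_profile = {taxon: frac * cells_per_gram for taxon, frac in rel.items()}
    for taxon, count in abs_profile.items():
        print(f"{taxon}: {count:.2e} cells/g")
    ```

    Two samples with identical relative profiles but tenfold different loads yield very different absolute profiles, which is precisely the information relative profiling discards.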

  8. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  9. Mapping protein structural changes by quantitative cross-linking

    Czech Academy of Sciences Publication Activity Database

    Kukačka, Zdeněk; Strohalm, Martin; Kavan, Daniel; Novák, Petr

    2015-01-01

    Roč. 89, NOV 2015 (2015), s. 112-120 ISSN 1046-2023 R&D Projects: GA MŠk(CZ) EE2.3.20.0055; GA MŠk(CZ) EE2.3.30.0003; GA MŠk(CZ) ED1.1.00/02.0109 Grant - others:OPPC(XE) CZ.2.16/3.1.00/24023 Institutional support: RVO:61388971 Keywords : Chemical cross-linking * Proteolysis * Mass spectrometry Subject RIV: CE - Biochemistry Impact factor: 3.503, year: 2015

  10. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Full Text Available Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  11. Quantitative structure - mesothelioma potency model ...

    Science.gov (United States)

    Cancer potencies of mineral and synthetic elongated particle (EP) mixtures, including asbestos fibers, are influenced by changes in fiber dose composition, bioavailability, and biodurability in combination with relevant cytotoxic dose-response relationships. A unique and comprehensive rat intra-pleural (IP) dose characterization data set with a wide variety of EP size, shape, crystallographic, chemical, and bio-durability properties facilitated extensive statistical analyses of 50 rat IP exposure test results for evaluation of alternative dose pleural mesothelioma response models. Utilizing logistic regression, maximum likelihood evaluations of thousands of alternative dose metrics based on hundreds of individual EP dimensional variations within each test sample, four major findings emerged: (1) data for simulations of short-term EP dose changes in vivo (mild acid leaching) provide superior predictions of tumor incidence compared to non-acid-leached data; (2) the sum of the EP surface areas (ΣSA) from these mildly acid-leached samples provides the optimum holistic dose-response model; (3) progressive removal of dose associated with very short and/or thin EPs significantly degrades resultant ΣEP or ΣSA dose-based predictive model fits, as judged by Akaike's Information Criterion (AIC); and (4) alternative, biologically plausible model adjustments provide evidence for reduced potency of EPs with length/width (aspect) ratios 80 µm. Regar...
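
    A sketch of the model-comparison strategy described above, logistic regression of tumor incidence against alternative dose metrics ranked by AIC, using statsmodels; the data are synthetic placeholders, not the rat IP dataset:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 50  # exposure test results

    # Two hypothetical dose metrics per sample: total EP count and total EP surface area
    log_ep = rng.normal(6, 1, n)
    log_sa = 0.8 * log_ep + rng.normal(0, 0.5, n)

    # Synthetic tumor incidence driven by the surface-area metric
    p = 1 / (1 + np.exp(-(log_sa - 6)))
    tumor = rng.random(n) < p

    def fit_aic(x):
        # Logistic regression of incidence on one dose metric; return its AIC
        X = sm.add_constant(x)
        return sm.Logit(tumor.astype(float), X).fit(disp=0).aic

    print("AIC, dose = sum of EP counts:       ", round(fit_aic(log_ep), 1))
    print("AIC, dose = sum of EP surface areas:", round(fit_aic(log_sa), 1))
    ```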

  12. On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry

    Science.gov (United States)

    Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri

    2018-02-01

    Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducibly cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Our results thus demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics, and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.
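
    The reported reproducibility figures are coefficients of variation across repeated runs; computing them from a quantitation table is straightforward. A minimal sketch assuming a small in-memory table (cross-linked residue pairs x replicate MS1 intensities, values invented):

    ```python
    import numpy as np

    # Hypothetical MS1 intensities: rows = cross-linked residue pairs,
    # columns = replicate injections of the BS3-cross-linked HSA digest
    intensities = np.array([
        [1.00e7, 1.10e7, 0.95e7, 1.05e7],
        [3.20e6, 2.80e6, 3.50e6, 3.00e6],
        [8.10e5, 7.90e5, 8.40e5, 8.00e5],
    ])

    # Coefficient of variation per residue pair, in percent
    cv = intensities.std(axis=1, ddof=1) / intensities.mean(axis=1) * 100
    for i, value in enumerate(cv):
        print(f"residue pair {i}: CV = {value:.1f}%")
    ```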

  13. A Direct, Competitive Enzyme-Linked Immunosorbent Assay (ELISA) as a Quantitative Technique for Small Molecules

    Science.gov (United States)

    Powers, Jennifer L.; Rippe, Karen Duda; Imarhia, Kelly; Swift, Aileen; Scholten, Melanie; Islam, Naina

    2012-01-01

    ELISA (enzyme-linked immunosorbent assay) is a widely used technique with applications in disease diagnosis, detection of contaminated foods, and screening for drugs of abuse or environmental contaminants. However, published protocols with a focus on quantitative detection of small molecules designed for teaching laboratories are limited. A…

  14. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered: the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys.

  15. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  16. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  17. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    ...phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  18. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), “University of Defence” of Brno (Czech Republic), and “Pablo de Olavide” University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions. Specifically, it aims to understand which are the driving forces of social decisions. The second Section focuses on the social and public sphere. Indeed, it is oriented on recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  19. Qualitative to quantitative: linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India.

    Science.gov (United States)

    Bailey, Ajay; Hutter, Inge

    2008-10-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million people globally, the epidemic has posed academics the challenge of identifying behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is frequently used to identify risk behaviours and adherence behaviour in the field of HIV/AIDS. Risk behaviour studies that apply HBM have been largely quantitative and use of qualitative methodology is rare. The marriage of qualitative and quantitative methods has never been easy. The challenge is in triangulating the methods. Method triangulation has been largely used to combine insights from the qualitative and quantitative methods but not to link both the methods. In this paper we suggest a linked trajectory of method triangulation (LTMT). The linked trajectory aims to first gather individual level information through in-depth interviews and then to present the information as vignettes in focus group discussions. We thus validate information obtained from in-depth interviews and gather emic concepts that arise from the interaction. We thus capture both the interpretation and the interaction angles of the qualitative method. Further, using the qualitative information gained, a survey is designed. In doing so, the survey questions are grounded and contextualized. We employed this linked trajectory of method triangulation in a study on the risk assessment of HIV/AIDS among migrant and mobile men. Fieldwork was carried out in Goa, India. Data come from two waves of studies, first an explorative qualitative study (2003), second a larger study (2004-2005), including in-depth interviews (25), focus group discussions (21) and a survey (n=1259). By employing the qualitative to quantitative LTMT we can not only contextualize the existing concepts of the HBM, but also validate new concepts and identify new risk groups.

  20. A rigid disulfide-linked nitroxide side chain simplifies the quantitative analysis of PRE data

    Energy Technology Data Exchange (ETDEWEB)

    Fawzi, Nicolas L. [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States); Fleissner, Mark R. [University of California, Jules Stein Eye Institute and Department of Chemistry and Biochemistry (United States); Anthis, Nicholas J. [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States); Kalai, Tamas; Hideg, Kalman [University of Pecs, Institute of Organic and Medicinal Chemistry (Hungary); Hubbell, Wayne L., E-mail: hubbellw@jsei.ucla.edu [University of California, Jules Stein Eye Institute and Department of Chemistry and Biochemistry (United States); Clore, G. Marius, E-mail: mariusc@mail.nih.gov [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States)

    2011-09-15

    The measurement of {sup 1}H transverse paramagnetic relaxation enhancement (PRE) has been used in biomolecular systems to determine long-range distance restraints and to visualize sparsely-populated transient states. The intrinsic flexibility of most nitroxide and metal-chelating paramagnetic spin-labels, however, complicates the quantitative interpretation of PREs due to delocalization of the paramagnetic center. Here, we present a novel, disulfide-linked nitroxide spin label, R1p, as an alternative to these flexible labels for PRE studies. When introduced at solvent-exposed {alpha}-helical positions in two model proteins, calmodulin (CaM) and T4 lysozyme (T4L), EPR measurements show that the R1p side chain exhibits dramatically reduced internal motion compared to the commonly used R1 spin label (generated by reacting cysteine with the spin labeling compound often referred to as MTSL). Further, only a single nitroxide position is necessary to account for the PREs arising from CaM S17R1p, while an ensemble comprising multiple conformations is necessary for those observed for CaM S17R1. Together, these observations suggest that the nitroxide adopts a single, fixed position when R1p is placed at solvent-exposed {alpha}-helical positions, greatly simplifying the interpretation of PRE data by removing the need to account for the intrinsic flexibility of the spin label.
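
    For readers unfamiliar with PRE analysis, the standard Solomon-Bloembergen relation (general background, not specific to this paper) links the measured transverse PRE rate to the electron-proton distance r, which is why delocalization of the paramagnetic center over multiple positions complicates the analysis:

    ```latex
    \Gamma_2 = \frac{1}{15}\left(\frac{\mu_0}{4\pi}\right)^{2}
    \gamma_I^{2}\, g^{2}\mu_B^{2}\, S(S+1)\, r^{-6}
    \left(4\tau_c + \frac{3\tau_c}{1 + \omega_I^{2}\tau_c^{2}}\right)
    ```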

  21. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  22. Link-based quantitative methods to identify differentially coexpressed genes and gene pairs

    Directory of Open Access Journals (Sweden)

    Ye Zhi-Qiang

    2011-08-01

    Full Text Available Abstract Background Differential coexpression analysis (DCEA) is increasingly used for investigating the global transcriptional mechanisms underlying phenotypic changes. Current DCEA methods mostly adopt a gene connectivity-based strategy to estimate differential coexpression, which is characterized by comparing the numbers of gene neighbors in different coexpression networks. Although it simplifies the calculation, this strategy mixes up the identities of the different coexpression neighbors of a gene, and fails to differentiate significant differential coexpression changes from trivial ones. In particular, correlation reversals are easily missed although they probably indicate remarkable biological significance. Results We developed two link-based quantitative methods, DCp and DCe, to identify differentially coexpressed genes and gene pairs (links). By uniquely exploiting the quantitative coexpression change of each gene pair in the coexpression networks, both methods proved superior to currently popular methods in simulation studies. Re-mining of a publicly available type 2 diabetes (T2D) expression dataset from the perspective of differential coexpression analysis led to discoveries beyond those from differential expression analysis. Conclusions This work pointed out a critical weakness of current popular DCEA methods, and proposed two link-based DCEA algorithms that will contribute to the development of DCEA and help extend it to a broader spectrum.
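
    The link-based idea, scoring each gene pair by the change in its correlation between two conditions so that correlation reversals are not lost, can be illustrated as follows; this toy sketch is in the spirit of DCp/DCe but is not the authors' exact algorithm:

    ```python
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(3)
    genes = ["g1", "g2", "g3", "g4"]

    # Hypothetical expression (genes x samples) under two conditions;
    # g1-g2 are engineered to flip from positive to negative correlation
    expr_a = rng.normal(size=(4, 30))
    expr_a[1] = expr_a[0] + rng.normal(scale=0.4, size=30)
    expr_b = rng.normal(size=(4, 30))
    expr_b[1] = -expr_b[0] + rng.normal(scale=0.4, size=30)

    corr_a, corr_b = np.corrcoef(expr_a), np.corrcoef(expr_b)

    for i, j in combinations(range(len(genes)), 2):
        dc = corr_b[i, j] - corr_a[i, j]  # signed coexpression change of this link
        reversal = (corr_a[i, j] * corr_b[i, j] < 0
                    and min(abs(corr_a[i, j]), abs(corr_b[i, j])) > 0.3)
        flag = "  <- correlation reversal" if reversal else ""
        print(f"{genes[i]}-{genes[j]}: dC = {dc:+.2f}{flag}")
    ```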

  23. Quantitative Visualization of ChIP-chip Data by Using Linked Views

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Min-Yu; Weber, Gunther; Li, Xiao-Yong; Biggin, Mark; Hamann, Bernd

    2010-11-05

    Most analyses of ChIP-chip in vivo DNA binding have focused on qualitative descriptions of whether genomic regions are bound or not. There is increasing evidence, however, that factors bind in a highly overlapping manner to the same genomic regions and that it is quantitative differences in occupancy on these commonly bound regions that are the critical determinants of the different biological specificity of factors. As a result, it is critical to have a tool to facilitate the quantitative visualization of differences between transcription factors and the genomic regions they bind to understand each factor's unique roles in the network. We have developed a framework which combines several visualizations via brushing-and-linking to allow the user to interactively analyze and explore in vivo DNA binding data of multiple transcription factors. We describe these visualization types and also provide a discussion of biological examples in this paper.

  24. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
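
    A small sketch of the log-normal treatment mentioned above: each expert's estimate of an event frequency is taken as the median and error factor of a log-normal distribution, and an equal-weight pooling is done in log space. The weighting, bias and dependency corrections of the classical and Bayesian schemes are deliberately omitted:

    ```python
    import numpy as np

    # Hypothetical expert estimates of an initiator event frequency (per year):
    # (median, error factor) pairs; the error factor EF sets the log-normal spread
    experts = [(1e-3, 3.0), (5e-4, 10.0), (2e-3, 3.0)]

    log_medians = np.log([m for m, _ in experts])
    # For a log-normal, EF = exp(1.645 * sigma), so sigma = ln(EF) / 1.645
    sigmas = np.log([ef for _, ef in experts]) / 1.645

    # Equal-weight pooling in log space (geometric-mean aggregation);
    # averaging variances is itself a simplification
    pooled_mu = log_medians.mean()
    pooled_sigma = np.sqrt(np.mean(sigmas**2))
    print(f"pooled median: {np.exp(pooled_mu):.2e} /year")
    print(f"pooled 95th percentile: {np.exp(pooled_mu + 1.645 * pooled_sigma):.2e} /year")
    ```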

  25. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  26. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  27. Linking descriptive geology and quantitative machine learning through an ontology of lithological concepts

    Science.gov (United States)

    Klump, J. F.; Huber, R.; Robertson, J.; Cox, S. J. D.; Woodcock, R.

    2014-12-01

    Despite the recent explosion of quantitative geological data, geology remains a fundamentally qualitative science. Numerical data constitute only a certain part of data collection in the geosciences. In many cases, geological observations are compiled as text into reports and annotations on drill cores, thin sections or drawings of outcrops. The observations are classified into concepts such as lithology, stratigraphy, geological structure, etc. These descriptions are semantically rich and are generally supported by more quantitative observations using geochemical analyses, XRD, hyperspectral scanning, etc., but the goal is geological semantics. In practice it has been difficult to bring the different observations together due to differing perception or granularity of classification in human observation, or the partial observation of only some characteristics using quantitative sensors. In recent years many geological classification schemas have been transferred into ontologies and vocabularies, formalized using RDF and OWL, and published through SPARQL endpoints. Several lithological ontologies were compiled by stratigraphy.net and published through a SPARQL endpoint. This work is complemented by the development of a Python API to integrate this vocabulary into Python-based text mining applications. The applications for the lithological vocabulary and Python API are automated semantic tagging of geochemical data and descriptions of drill cores, machine learning of geochemical compositions that are diagnostic for lithological classifications, and text mining for lithological concepts in reports and geological literature. This combination of applications can be used to identify anomalies in databases, where composition and lithological classification do not match. It can also be used to identify lithological concepts in the literature and infer quantitative values. The resulting semantic tagging opens new possibilities for linking these diverse sources of data.
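
    As an illustration of querying such a published vocabulary from Python, a sketch using SPARQLWrapper; the endpoint URL and graph structure are placeholders, since the actual stratigraphy.net endpoint layout is not described here:

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    # Placeholder endpoint URL; substitute the actual lithology SPARQL endpoint
    sparql = SPARQLWrapper("http://vocab.example.org/lithology/sparql")
    sparql.setQuery("""
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?label WHERE {
            ?concept a skos:Concept ;
                     skos:prefLabel ?label .
            FILTER(CONTAINS(LCASE(STR(?label)), "basalt"))
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for row in results["results"]["bindings"]:
        print(row["concept"]["value"], "->", row["label"]["value"])
    ```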

  28. Linking advanced fracture models to structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chiesa, Matteo

    2001-07-01

    Shell structures with defects occur in many situations. The defects are usually introduced during the welding process necessary for joining different parts of the structure. Higher utilization of structural materials leads to a need for accurate numerical tools for reliable prediction of structural response. The direct discretization of the cracked shell structure with solid finite elements in order to perform an integrity assessment of the structure in question leads to large-size problems, and makes such analysis infeasible in structural applications. In this study a link between local material models and structural analysis is outlined. An "ad hoc" element formulation is used in order to connect complex material models to the finite element framework used for structural analysis. An improved elasto-plastic line spring finite element formulation, used in order to take cracks into account, is linked to shell elements which are further linked to beam elements. In this way one obtains a global model of the shell structure that also accounts for local flexibilities and fractures due to defects. An important advantage of such an approach is a direct fracture mechanics assessment, e.g. via the computed J-integral or CTOD. A recent development in this approach is the notion of two-parameter fracture assessment. This means that the crack tip stress tri-axiality (constraint) is employed in determining the corresponding fracture toughness, giving a much more realistic capacity of cracked structures. The present thesis is organized in six research articles and an introductory chapter that reviews important background literature related to this work. Papers I and II address the performance of shell and line spring finite elements as a cost-effective tool for performing the numerical calculations needed to perform a fracture assessment. In Paper II a failure assessment, based on the testing of a constraint-corrected fracture mechanics specimen under tension, is...

  29. Modeling Cancer Metastasis using Global, Quantitative and Integrative Network Biology

    DEFF Research Database (Denmark)

    Schoof, Erwin; Erler, Janine

    ...understanding of molecular processes which are fundamental to tumorigenesis. In Article 1, we propose a novel framework for how cancer mutations can be studied by taking into account their effect at the protein network level. In Article 2, we demonstrate how global, quantitative data on phosphorylation dynamics can be generated using MS, and how this can be modeled using a computational framework for deciphering kinase-substrate dynamics. This framework is described in depth in Article 3, and covers the design of KinomeXplorer, which allows the prediction of kinases responsible for modulating observed phosphorylation dynamics in a given biological sample. In Chapter III, we move into Integrative Network Biology, where, by combining two fundamental technologies (MS & NGS), we can obtain more in-depth insights into the links between cellular phenotype and genotype. Article 4 describes the proof...

  30. Quantitative histological models suggest endothermy in plesiosaurs

    Directory of Open Access Journals (Sweden)

    Corinna V. Fleischle

    2018-06-01

    Full Text Available Background Plesiosaurs are marine reptiles that arose in the Late Triassic and survived to the Late Cretaceous. They have a unique and uniform bauplan and are known for their very long neck and hydrofoil-like flippers. Plesiosaurs are among the most successful vertebrate clades in Earth’s history. Based on bone mass decrease and cosmopolitan distribution, both of which affect lifestyle, indications of parental care, and oxygen isotope analyses, evidence for endothermy in plesiosaurs has accumulated. Recent bone histological investigations also provide evidence of fast growth and elevated metabolic rates. However, quantitative estimations of metabolic rates and bone growth rates in plesiosaurs have not been attempted before. Methods Phylogenetic eigenvector maps is a method for estimating trait values from a predictor variable while taking into account phylogenetic relationships. As predictor variable, this study employs vascular density, measured in bone histological sections of fossil eosauropterygians and extant comparative taxa. We quantified vascular density as primary osteon density, thus the proportion of vascular area (including lamellar infillings of primary osteons) to total bone area. Our response variables are bone growth rate (expressed as local bone apposition rate) and resting metabolic rate (RMR). Results Our models reveal bone growth rates and RMRs for plesiosaurs that are in the range of birds, suggesting that plesiosaurs were endothermic. Even for basal eosauropterygians we estimate values in the range of mammals or higher. Discussion Our models are influenced by the availability of comparative data, which are lacking for large marine amniotes, potentially skewing our results. However, our statistically robust inference of fast growth and fast metabolism is in accordance with other evidence for plesiosaurian endothermy. Endothermy may explain the success of plesiosaurs, consisting in their survival of the end-Triassic extinction...

  31. Linking agent-based models and stochastic models of financial markets.

    Science.gov (United States)

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.
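
    A toy illustration of the mechanism described, fat tails emerging when traders act in shared clusters rather than independently, in the spirit of cluster-based herding models (e.g., Cont-Bouchaud) rather than the authors' specific model:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_traders, n_steps = 200, 5000

    def one_return(tail_exponent):
        # Partition traders into random clusters; heavy-tailed sizes mimic herding
        sizes, remaining = [], n_traders
        while remaining > 0:
            s = min(int(rng.zipf(tail_exponent)), remaining)
            sizes.append(s)
            remaining -= s
        # Each cluster buys (+1) or sells (-1) as a block; return ~ net imbalance
        signs = rng.choice([-1, 1], size=len(sizes))
        return np.dot(sizes, signs) / n_traders

    nearly_independent = np.array([one_return(6.0) for _ in range(n_steps)])
    strong_herding = np.array([one_return(2.2) for _ in range(n_steps)])

    for name, r in [("independent", nearly_independent), ("herding", strong_herding)]:
        excess_kurtosis = np.mean((r - r.mean())**4) / r.var()**2 - 3
        print(f"{name:11s}: excess kurtosis = {excess_kurtosis:.1f}")
    ```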

  32. A statistical model for telecommunication link design

    Science.gov (United States)

    Yuen, J. H.

    1975-01-01

    An evaluation is conducted of the current telecommunication link design technique and a description is presented of an alternative method, called the probability distribution method (PDM), which is free of the disadvantages of the current technique while retaining its advantages. The PDM preserves the simplicity of the design control table (DCT) format. The use of the DCT as a management design control tool is continued. The telecommunication link margin probability density function used presents the probability of achieving any particular value of link performance. It is, therefore, possible to assess the performance risk and other tradeoffs.

  33. Modeling conflict: research methods, quantitative modeling, and lessons learned.

    Energy Technology Data Exchange (ETDEWEB)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  34. Phytochrome quantitation in crude extracts of Avena by enzyme-linked immunosorbent assay with monoclonal antibodies

    Energy Technology Data Exchange (ETDEWEB)

    Shimazaki, Y; Cordonnier, M M; Pratt, L H

    1983-01-01

    An enzyme-linked immunosorbent assay (ELISA), which uses both rabbit polyclonal and mouse monoclonal antibodies to phytochrome, has been adapted for quantitation of phytochrome in crude plant extracts. The assay has a detection limit of about 100 pg phytochrome and can be completed within 10 h. Quantitation of phytochrome in crude extracts of etiolated oat seedlings by ELISA gave values that agreed well with those obtained by spectrophotometric assay. When etiolated oat seedlings were irradiated continuously for 24 h, the amount of phytochrome detected by ELISA and by spectrophotometric assay decreased by more than 1000-fold and about 100-fold, respectively. This discrepancy indicates that phytochrome in light-treated plants may be antigenically distinct from that found in fully etiolated plants. When these light-grown oat seedlings were kept in darkness for 48 h, phytochrome content detected by ELISA increased by 50-fold in crude extracts of green oat shoots, but only about 12-fold in extracts of herbicide-treated oat shoots. Phytochrome reaccumulation in green oat shoots was initially more rapid in the more mature cells of the primary leaf tip than near the basal part of the shoot. The inhibitory effect of Norflurazon on phytochrome accumulation was much more evident near the leaf tip than the shoot base. A 5-min red irradiation of oat seedlings at the end of a 48-h dark period resulted in a subsequent, massive decrease in phytochrome content in crude extracts from both green and Norflurazon-bleached oat shoots. These observations eliminate the possibility that substantial accumulation of chromophore-free phytochrome was being detected and indicate that Norflurazon has a substantial effect on phytochrome accumulation during a prolonged dark period. 25 references, 9 figures, 3 tables.

  35. Enzyme-linked immunosorbent assay for the quantitative/qualitative analysis of plant secondary metabolites.

    Science.gov (United States)

    Sakamoto, Seiichi; Putalun, Waraporn; Vimolmangkang, Sornkanok; Phoolcharoen, Waranyoo; Shoyama, Yukihiro; Tanaka, Hiroyuki; Morimoto, Satoshi

    2018-01-01

    Immunoassays are antibody-based analytical methods for quantitative/qualitative analysis. Since the principle of immunoassays is based on the specific antigen-antibody reaction, the assays have been utilized worldwide for diagnosis, pharmacokinetic studies by drug monitoring, and the quality control of commercially available products. Berson and Yalow were the first to develop an immunoassay, known as radioimmunoassay (RIA), for detecting endogenous plasma insulin [1], a development for which Yalow was awarded the Nobel Prize in Physiology or Medicine in 1977. Even today, after half a century, immunoassays are widely utilized with some modifications from the originally proposed system; e.g., radioisotopes have been replaced with enzymes because of safety concerns regarding the use of radioactivity, a format referred to as enzyme immunoassay/enzyme-linked immunosorbent assay (ELISA). In addition, progress has been made in ELISA with recent advances in recombinant DNA technology, leading to an increase in the range of antibodies, probes, and even systems. This review article describes ELISA and its applications for the detection of plant secondary metabolites.

  36. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  37. TraceLink: A model of amnesia and consolidation.

    NARCIS (Netherlands)

    Meeter, M.; Murre, J.M.J.

    2005-01-01

    A connectionist model is presented, the TraceLink model, that implements an autonomous "off-line" consolidation process. The model consists of three subsystems: (1) a trace system (neocortex), (2) a link system (hippocampus and adjacent regions), and (3) a modulatory system (basal forebrain and

  38. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physical system are usually constructed from stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification is proposed, based on a hierarchical model structure framework comprising the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe a complex system more comprehensively, and that introducing qualitative models into the quantitative simulation yields a higher survival probability for the target.

  39. Quantitative XRD analysis: tools to investigate the link between hydrous strain and clay mineral CEC

    International Nuclear Information System (INIS)

    Oueslati, W.; Ammar, M.; Ben Rhaiem, H.; Ben Haj Amara, A.

    2012-01-01

    Document available in extended abstract form only. This work aims at examining, by quantitative XRD analysis, the effect of an applied hydrous strain on the cationic exchange process of a di-octahedral smectite (Na-rich montmorillonite SWy-2). The hydrous constraint was created by continuous, in situ hydration-dehydration cycles using a variation of the %RH rate. The starting, intermediate and final stressed samples were placed in contact with saturated Me2+ (i.e. Cd2+, Co2+, Zn2+ and Ni2+) chloride solutions in order to examine the effect of the retained stress on the CEC of the host materials. An XRD profile modelling approach is adopted to describe all structural changes created by the environmental evolution of the %RH rate. This investigation allowed us to determine several structural parameters related to the nature, abundance, size, position and organization of exchangeable cations and water molecules in the inter-lamellar space along the c* axis. The qualitative results show a considerable change in hydration behaviour with the number of hydration-dehydration cycles, from a homogeneous '2W' hydration state to a heterogeneous '1W-2W' one, indicating interstratified hydration phases probably due to a new organization of the inter-lamellar space content. Quantitatively, the theoretical mixed-layer structures (MLS) suggest the coexistence of more than one 'crystallite' species, saturated by more than one exchangeable cation, indicating partial saturation of the exchangeable sites. Using optimum structural parameter values deduced from the XRD profile modelling approach, equations describing the evolution of the exchangeable cation amount versus the applied hydrous strain were derived. (authors)

  40. Probe colorimeter for quantitating enzyme-linked immunosorbent assays and other colorimetric assays performed with microplates.

    Science.gov (United States)

    Ackerman, S B; Kelley, E A

    1983-03-01

    The performance of a fiberoptic probe colorimeter (model PC800; Brinkmann Instruments, Inc., Westbury, N.Y.) for quantitating enzymatic or colorimetric assays in 96-well microtiter plates was compared with the performances of a spectrophotometer (model 240; Gilford Instrument Laboratories, Inc., Oberlin, Ohio) and a commercially available enzyme immunoassay reader (model MR590; Dynatech Laboratories, Inc., Alexandria, Va.). Alkaline phosphatase-p-nitrophenyl phosphate in 3 M NaOH was used as the chromophore source. Six types of plates were evaluated for use with the probe colorimeter; they generated reproducibility values (100% minus the coefficient of variation) ranging from 91 to 98% when one individual made 24 independent measurements on the same dilution of chromophore on each plate. Eleven individuals each performed 24 measurements with the colorimeter on either a visually light (absorbance of 0.10 at 420 nm) or a dark (absorbance of 0.80 at 420 nm) dilution of chromophore; reproducibilities averaged 87% for the light dilution and 97% for the dark dilution. When one individual measured the same chromophore sample at least 20 times in the colorimeter, in the spectrophotometer or in the enzyme immunoassay reader, reproducibility for each instrument was greater than 99%. Measurements of a dilution series of chromophore in a fixed volume indicated that the optical responses of each instrument were linear in a range of 0.05 to 1.10 absorbance units.

  41. Cross-link guided molecular modeling with ROSETTA.

    Directory of Open Access Journals (Sweden)

    Abdullah Kahraman

    Full Text Available Chemical cross-links identified by mass spectrometry generate distance restraints that reveal low-resolution structural information on proteins and protein complexes. The technology to reliably generate such data has become mature and robust enough to shift the focus to the question of how these distance restraints can be best integrated into molecular modeling calculations. Here, we introduce three workflows for incorporating distance restraints generated by chemical cross-linking and mass spectrometry into ROSETTA protocols for comparative and de novo modeling and protein-protein docking. We demonstrate that the cross-link validation and visualization software Xwalk facilitates successful cross-link data integration. Besides the protocols, we introduce XLdb, a database of chemical cross-links from 14 different publications with 506 intra-protein and 62 inter-protein cross-links, where each cross-link can be mapped on an experimental structure from the Protein Data Bank. Finally, we demonstrate, on a protein-protein docking reference data set, the impact of virtual cross-links on protein docking calculations and show that an inter-protein cross-link can reduce the RMSD of a docking prediction by 5.0 Å on average. The methods and results presented here provide guidelines for the effective integration of chemical cross-link data in molecular modeling calculations and should advance the structural analysis of particularly large and transient protein complexes via hybrid structural biology methods.
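
    The way a cross-link constrains docking can be illustrated with a simple distance filter: poses whose linked residues exceed the maximal span of the cross-linker are rejected. A sketch with hypothetical coordinates and a commonly used ~30 Å cutoff for BS3-type linkers; note the paper relies on Xwalk's solvent-accessible surface distance, which is more rigorous than the straight-line distance used here:

    ```python
    import numpy as np

    MAX_SPAN = 30.0  # Angstrom; rough Calpha-Calpha limit for a BS3-type linker

    # Hypothetical cross-links: (residue index in partner A, residue index in partner B)
    crosslinks = [(12, 87), (45, 103)]

    def satisfied(coords_a, coords_b):
        """coords_a/coords_b: (n_residues, 3) Calpha coordinates of the docked partners."""
        return all(np.linalg.norm(coords_a[ra] - coords_b[rb]) <= MAX_SPAN
                   for ra, rb in crosslinks)

    rng = np.random.default_rng(5)
    # Two invented candidate poses: partner B close to A, then far from A
    for k, offset in enumerate((5.0, 60.0)):
        a = rng.normal(size=(120, 3)) * 5.0
        b = rng.normal(size=(120, 3)) * 5.0 + offset
        print(f"pose {k}: {'kept' if satisfied(a, b) else 'rejected by cross-link filter'}")
    ```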

  4. Linking quantitative microbial risk assessment and epidemiological data: informing safe drinking water trials in developing countries.

    Science.gov (United States)

    Enger, Kyle S; Nelson, Kara L; Clasen, Thomas; Rose, Joan B; Eisenberg, Joseph N S

    2012-05-01

    Intervention trials are used extensively to assess household water treatment (HWT) device efficacy against diarrheal disease in developing countries. Using these data for policy, however, requires addressing issues of generalizability (relevance of one trial in other contexts) and systematic bias associated with design and conduct of a study. To illustrate how quantitative microbial risk assessment (QMRA) can address water safety and health issues, we analyzed a published randomized controlled trial (RCT) of the LifeStraw Family Filter in the Congo. The model accounted for bias due to (1) incomplete compliance with filtration, (2) unexpected antimicrobial activity by the placebo device, and (3) incomplete recall of diarrheal disease. Effectiveness was measured using the longitudinal prevalence ratio (LPR) of reported diarrhea. The Congo RCT observed an LPR of 0.84 (95% CI: 0.61, 1.14). Our model predicted LPRs, assuming a perfect placebo, ranging from 0.50 (2.5-97.5 percentile: 0.33, 0.77) to 0.86 (2.5-97.5 percentile: 0.68, 1.09) for high (but not perfect) and low (but not zero) compliance, respectively. The calibration step provided estimates of the concentrations of three pathogen types (modeled as diarrheagenic E. coli, Giardia, and rotavirus) in drinking water, consistent with the longitudinal prevalence of reported diarrhea measured in the trial, and constrained by epidemiological data from the trial. Use of a QMRA model demonstrated the importance of compliance in HWT efficacy, the need for pathogen data from source waters, the effect of quantifying biases associated with epidemiological data, and the usefulness of generalizing the effectiveness of HWT trials to other contexts. © 2012 American Chemical Society
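    For readers unfamiliar with the mechanics of QMRA, the sketch below illustrates the kind of calculation such a model is built from: a standard exponential dose-response relation, P(inf) = 1 - exp(-r*dose), combined with an assumed filter log-removal value and varying compliance. All parameter values here are illustrative assumptions rather than the calibrated values of the study; the r used is a commonly cited exponential dose-response parameter for Giardia.

```python
import numpy as np

def p_infection(dose, r):
    """Exponential dose-response model: P(inf) = 1 - exp(-r * dose)."""
    return 1.0 - np.exp(-r * dose)

r_giardia = 0.0199   # commonly cited exponential parameter for Giardia
conc = 0.1           # assumed cysts per litre in source water
volume = 1.5         # assumed litres consumed per person per day
lrv = 3.0            # assumed log10 removal by the household filter

for compliance in (1.0, 0.8, 0.5, 0.0):
    # A fraction `compliance` of the daily volume is filtered; the rest is not.
    dose = conc * volume * (compliance * 10**-lrv + (1 - compliance))
    print(f"compliance={compliance:.0%}: daily P(inf) = {p_infection(dose, r_giardia):.2e}")
```

    Even this toy calculation reproduces the qualitative finding of the study: because unfiltered water dominates the ingested dose, imperfect compliance rapidly erodes the benefit of an otherwise effective device.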

  5. Quantum Link Models and Quantum Simulation of Gauge Theories

    International Nuclear Information System (INIS)

    Wiese, U.J.

    2015-01-01

    This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing and introduces pioneers of quantum computing and quantum simulations of quantum spin systems. The second part covers high-temperature superconductors versus QCD, Wilson's lattice QCD and Abelian quantum link models. The third part deals with quantum simulators for Abelian lattice gauge theories and non-Abelian quantum link models. The last part discusses quantum simulators mimicking 'nuclear' physics and the continuum limit of D-theory models. (nowak)

  6. Rational Models for Inflation-Linked Derivatives

    DEFF Research Database (Denmark)

    Dam, Henrik; Macrina, Andrea; Skovmand, David

    2018-01-01

    in a multiplicative manner that allows for closed-form pricing of vanilla inflation products such as zero-coupon swaps, caps and floors, year-on-year swaps, caps and floors, and the exotic limited price index swap. The model retains the attractive features of a nominal multi-curve interest rate model, such as closed...

  7. Quantitative trait loci linked to PRNP gene controlling health and production traits in INRA 401 sheep

    Directory of Open Access Journals (Sweden)

    Brunel Jean-Claude

    2007-07-01

    Full Text Available Abstract In this study, the potential association of PrP genotypes with health and productive traits was investigated. Data were recorded on animals of the INRA 401 breed from the Bourges-La Sapinière INRA experimental farm. The population consisted of 30 rams and 852 ewes, which produced 1310 lambs. The animals were categorized into three PrP genotype classes: ARR homozygous, ARR heterozygous, and animals without any ARR allele. Two analyses differing in approach were carried out. Firstly, the potential association of the PrP genotype with disease (Salmonella resistance) and production (wool and carcass) traits was studied. The data used included 1042, 1043 and 1013 genotyped animals for the Salmonella resistance, wool and carcass traits, respectively. The different traits were analyzed using an animal model, where the PrP genotype effect was included as a fixed effect. The association analyses do not indicate any evidence of an effect of PrP genotypes on the traits studied in this breed. Secondly, a quantitative trait locus (QTL) detection approach using the PRNP gene as a marker was applied on ovine chromosome 13, using interval mapping. Evidence for one QTL affecting mean fiber diameter was found at 25 cM from the PRNP gene. However, linkage between PRNP and this QTL does not imply unfavorable linkage disequilibrium for PRNP selection purposes.

  8. A VGI data integration framework based on linked data model

    Science.gov (United States)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for a cooperative application environment over online VGI sources, aimed at a target class of geospatial problems. Based on linked data technologies - a core component of the semantic web - we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modelling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications spanning multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a uniform data representation model across different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature-name attribute similarity as the measure for comparing and matching geographic features in different VGI data sets. Our work focuses on applying Markov logic networks to interlink representations of the same entity across different VGI-based linked data sets; the automatic generation of a co-reference object identification model for geographic linked data is discussed in detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. Experiments built on our framework and an evaluation of the method show that the framework is reasonable and practicable.
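    As a concrete illustration of the linked-data mechanics described above, the following sketch uses the rdflib library to publish the same landmark from two VGI sources as RDF and to add an owl:sameAs interlink when a mixed name-plus-location similarity score clears a threshold. The namespace, feature URIs, similarity weights and threshold are all hypothetical, and the hand-set score is only a stand-in for the Markov-logic-network co-reference model the paper actually learns.

```python
from difflib import SequenceMatcher
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import OWL, RDF, RDFS

GEO = Namespace("http://www.w3.org/2003/01/geo/wgs84_pos#")
EX = Namespace("http://example.org/vgi/")  # hypothetical namespace

g = Graph()
g.bind("geo", GEO)

# The same landmark as represented by two hypothetical VGI sources.
features = [
    (URIRef(EX["sourceA/123"]), "Central Park", 40.785, -73.968),
    (URIRef(EX["sourceB/456"]), "Central park", 40.786, -73.969),
]
for node, name, lat, lon in features:
    g.add((node, RDF.type, GEO.SpatialThing))
    g.add((node, RDFS.label, Literal(name)))
    g.add((node, GEO.lat, Literal(lat)))
    g.add((node, GEO.long, Literal(lon)))

# Mixed matching score: name similarity plus a crude spatial proximity term.
(a, name_a, lat_a, lon_a), (b, name_b, lat_b, lon_b) = features
name_sim = SequenceMatcher(None, name_a, name_b).ratio()
spatial_sim = 1.0 - min(1.0, abs(lat_a - lat_b) + abs(lon_a - lon_b))
if 0.5 * name_sim + 0.5 * spatial_sim > 0.9:
    g.add((a, OWL.sameAs, b))  # interlink co-referent features

print(g.serialize(format="turtle"))
```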

  9. Human eyeball model reconstruction and quantitative analysis.

    Science.gov (United States)

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important for diagnosing eyeball diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involved image segmentation, registration, B-spline surface fitting and subdivision surface fitting, none of which required manual interaction. From the resulting high-resolution models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we propose two novel metrics, Gaussian curvature analysis and sphere distance deviation, to quantify the cornea shape and the whole eyeball surface, respectively. The experimental results showed that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects and can potentially be used for eye disease diagnosis.
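    The paper's exact definition of the sphere distance deviation metric is not given in the abstract, but one plausible reading is the deviation of surface points from a best-fit sphere. The sketch below fits a sphere by the standard linear least-squares (algebraic) formulation and reports the RMS deviation on simulated "eyeball" points; all data, and the metric definition itself, are assumptions for illustration.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit: |p - c|^2 = R^2 rewritten as a linear system."""
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, t = sol[:3], sol[3]
    radius = np.sqrt(t + center @ center)  # t = R^2 - |c|^2
    return center, radius

# Toy "eyeball" surface: noisy samples of a 12 mm radius sphere.
rng = np.random.default_rng(0)
u = rng.normal(size=(2000, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)
pts = 12.0 * u + rng.normal(scale=0.05, size=u.shape)

c, R = fit_sphere(pts)
dev = np.abs(np.linalg.norm(pts - c, axis=1) - R)
print(f"radius = {R:.2f} mm, RMS sphere distance deviation = {np.sqrt((dev**2).mean()):.3f} mm")
```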

  10. Modeling with Young Students--Quantitative and Qualitative.

    Science.gov (United States)

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  11. A general method for targeted quantitative cross-linking mass spectrometry

    Science.gov (United States)

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NM...

  12. Quantitative occupational risk model: Single hazard

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Aneziris, O.N.; Bellamy, L.J.; Ale, B.J.M.; Oh, J.

    2017-01-01

    A model for the quantification of occupational risk of a worker exposed to a single hazard is presented. The model connects the working conditions and worker behaviour to the probability of an accident resulting in one of three types of consequence: recoverable injury, permanent injury and death. Working conditions and safety barriers in place to reduce the likelihood of an accident are included. Logical connections are modelled through an influence diagram. Quantification of the model is based on two sources of information: (a) the number of accidents observed over a period of time and (b) assessment of exposure data of activities and working conditions over the same period of time and the same working population. The effectiveness of risk-reducing measures affecting the working conditions, worker behaviour and/or safety barriers can be quantified through the effect of these measures on occupational risk. - Highlights: • Quantification of occupational risk from a single hazard. • Influence diagram connects working conditions, worker behaviour and safety barriers. • Necessary data include the number of accidents and the total exposure of the worker. • Effectiveness of risk-reducing measures is quantified through the impact on the risk. • An example illustrates the methodology.

  13. Shuttle/TDRSS modelling and link simulation study

    Science.gov (United States)

    Braun, W. R.; Mckenzie, T. M.; Biederman, L.; Lindsey, W. C.

    1979-01-01

    A Shuttle/TDRSS S-band and Ku-band link simulation package called LinCsim was developed for the evaluation of link performance for specific Shuttle signal designs. The link models were described in detail and the transmitter distortion parameters or user constraints were carefully defined. The overall link degradation (excluding hardware degradations) relative to an ideal BPSK channel were given for various sets of user constraint values. The performance sensitivity to each individual user constraint was then illustrated. The effect of excessive Spacelab clock jitter on the return link BER performance was also investigated as was the problem of subcarrier recovery for the K-band Shuttle return link signal.

  14. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, James A.

    1988-01-01

    A theoretical model for evaluating human spatial habitability (HuSH) in the proposed U.S. Space Station is developed. Optimizing the fitness of the space station environment for human occupancy will help reduce environmental stress due to long-term isolation and confinement in its small habitable volume. The development of tools that operationalize the behavioral bases of spatial volume for visual, kinesthetic, and social logic considerations is suggested. This report further calls for systematic scientific investigations of how much real and how much perceived volume people need in order to function normally and with minimal stress in space-based settings. The theoretical model presented in this report can be applied to any size or shape of interior, at any scale of consideration, from the Space Station as a whole down to an individual enclosure or work station. Using as a point of departure the Isovist model developed by Dr. Michael Benedikt of the University of Texas, the report suggests that spatial habitability can become as amenable to careful assessment as engineering and life support concerns.

  15. Protein analysis by 31P NMR spectroscopy in ionic liquid: quantitative determination of enzymatically created cross-links.

    Science.gov (United States)

    Monogioudi, Evanthia; Permi, Perttu; Filpponen, Ilari; Lienemann, Michael; Li, Bin; Argyropoulos, Dimitris; Buchert, Johanna; Mattinen, Maija-Liisa

    2011-02-23

    Cross-linking of β-casein by Trichoderma reesei tyrosinase (TrTyr) and Streptoverticillium mobaraense transglutaminase (Tgase) was analyzed by (31)P nuclear magnetic resonance (NMR) spectroscopy in ionic liquid (IL). According to (31)P NMR, 91% of the tyrosine side chains were cross-linked by TrTyr at high dosages. When Tgase was used, no changes were observed because a different cross-linking mechanism was operational. However, this verified the success of the phosphitylation of phenolics within the protein matrix in the IL. Atomic force microscopy (AFM) in the solid state showed that disk-shaped nanoparticles were formed in the reactions, with average diameters of 80 and 20 nm for TrTyr and Tgase, respectively. These data further advance the current understanding of the action of tyrosinases on proteins at the molecular and chemical-bond levels. Quantitative (31)P NMR in IL was shown to be a simple and efficient method for the study of protein modification.

  16. Extended model of restricted beam for FSO links

    Science.gov (United States)

    Poliak, Juraj; Wilfert, Otakar

    2012-10-01

    Modern wireless optical communication systems in many aspects surpass wire or radio communications; their advantages are license-free operation and the broad bandwidth they offer. The medium in free-space optical (FSO) links is the atmosphere, and outdoor FSO links contend with many atmospheric phenomena that deteriorate the phase and amplitude of the transmitted optical beam. This beam originates in the transmitter and is affected by its individual parts, especially by the lens socket and the transmitter aperture, where attenuation and diffraction effects take place. Both of these phenomena unfavourably influence the beam and degrade link availability, or cause total malfunction. Therefore, both phenomena should be modelled and simulated so that the link function can be judged prior to realization of the system; not only link availability and reliability are at stake, but also economic aspects. In addition, the transmitted beam is not, generally speaking, circularly symmetrical, which makes link simulation more difficult. A comprehensive model must take into account the ellipticity of the beam, which is restricted by a circularly symmetrical aperture where attenuation and diffraction then occur. The general model is too computationally expensive; therefore, simplifications of the calculations by means of analytical and numerical approaches are discussed. The presented model is not only simulated on a computer but also verified experimentally. One can then judge the ability of the model to describe reality and estimate how far the approximations can be taken, i.e. the limitations of the model are discussed.
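    To make the aperture-restriction effect concrete, the sketch below numerically integrates an elliptical Gaussian intensity profile over a circular transmitter aperture to estimate the geometric truncation loss. The beam radii and aperture size are assumed values, and diffraction, which the full model must also handle, is deliberately ignored here.

```python
import numpy as np

# Grid over the aperture plane (mm).
x = np.linspace(-20, 20, 801)
X, Y = np.meshgrid(x, x)

# Elliptical Gaussian intensity profile: different beam half-widths per axis.
wx, wy = 6.0, 10.0   # assumed beam radii (mm)
I = np.exp(-2 * ((X / wx) ** 2 + (Y / wy) ** 2))

a = 8.0              # assumed circular aperture radius (mm)
inside = X**2 + Y**2 <= a**2

transmitted = (I * inside).sum() / I.sum()
print(f"fraction of beam power passing the aperture: {transmitted:.3f}")
```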

  17. Link-topic model for biomedical abbreviation disambiguation.

    Science.gov (United States)

    Kim, Seonho; Yoon, Juntae

    2015-02-01

    The ambiguity of biomedical abbreviations is one of the challenges in biomedical text mining systems. In particular, the handling of term variants and abbreviations without nearby definitions is a critical issue. In this study, we adopt the concepts of document topic and word link to disambiguate biomedical abbreviations. We propose the link-topic model, inspired by the latent Dirichlet allocation model, in which each document is perceived as a random mixture of topics, where each topic is characterized by a distribution over words. Thus, the most probable expansions of the abbreviations in a given abstract are determined by word-topic, document-topic, and word-link distributions estimated from a document collection through the link-topic model. The model allows two distinct modes of word generation to incorporate semantic dependencies among words, particularly long-form words of abbreviations and their sententially co-occurring words; a word can be generated either dependently on the long form of the abbreviation or independently. The semantic dependency between two words is defined as a link, and a new random parameter for the link is assigned to each word in addition to a topic parameter. Because the link status indicates whether the word constitutes a link with a given specific long form, it has the effect of determining whether a word forms a unigram or a skipping/consecutive bigram with respect to the long form. Furthermore, we place a constraint on the model so that a word has the same topic as a specific long form if it is generated in reference to the long form. Consequently, documents are generated from the two hidden parameters, i.e. topic and link, and the most probable expansion of a specific abbreviation is estimated from the parameters. Our model relaxes the bag-of-words assumption of the standard topic model, in which word order is neglected, and it captures a richer structure of text than does the standard topic model by considering

  18. Linking spatial and dynamic models for traffic maneuvers

    DEFF Research Database (Denmark)

    Olderog, Ernst-Rüdiger; Ravn, Anders Peter; Wisniewski, Rafal

    2015-01-01

    For traffic maneuvers of multiple vehicles on highways we build an abstract spatial and a concrete dynamic model. In the spatial model we show the safety (collision freedom) of lane-change maneuvers. By linking the spatial and dynamic model via suitable refinements of the spatial atoms to distance...

  19. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  20. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  1. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in Pir-Panjal and Great Himalayan mountain ranges of Indian Himalaya. The model predicts snowfall for two days in advance using daily recorded nine meteorological variables of past 20 winters from 1992–2012. There are six ...
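    A minimal sketch of the HMM machinery (not the authors' model, which uses nine meteorological variables over 20 winters and issues quantitative two-day-ahead snowfall forecasts) can be written with the hmmlearn library on toy two-regime data:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # pip install hmmlearn

rng = np.random.default_rng(1)

# Toy stand-in for a meteorological record: rows are days, columns are
# hypothetical variables (e.g. temperature, humidity, pressure anomaly).
X = np.vstack([
    rng.normal([0.0, 70.0, 0.0], 1.0, (300, 3)),    # "dry" regime
    rng.normal([-5.0, 90.0, -2.0], 1.0, (300, 3)),  # "snowy" regime
])

model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100)
model.fit(X)

states = model.predict(X)  # decoded hidden weather states
print("state means:\n", model.means_.round(1))
print("fraction of days in state 1:", (states == 1).mean())
```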

  2. Methodologies for quantitative systems pharmacology (QSP) models : Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development.

  3. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development.

  4. A novel multilayer model for missing link prediction and future link forecasting in dynamic complex networks

    Science.gov (United States)

    Yasami, Yasser; Safaei, Farshad

    2018-02-01

    Traditional complex network theory is particularly focused on network models in which all network constituents are treated equivalently, failing to consider supplementary information related to the dynamic properties of network interactions. This is a major constraint, leading to incorrect descriptions of some real-world phenomena or incompletely capturing the details of certain real-life problems. To cope with this problem, this paper addresses the multilayer aspects of dynamic complex networks by analyzing the properties of intrinsically multilayered co-authorship networks, DBLP and Astro Physics, and presenting a novel multilayer model of dynamic complex networks. The model examines layer evolution (layer birth/death processes and lifetimes) throughout the network's evolution. In particular, this paper models the evolution of each node's membership in different layers by an Infinite Factorial Hidden Markov Model considering feature cascade, and thereby formulates the link generation process for intra-layer and inter-layer links. Although adjacency matrices are useful for describing traditional single-layer networks, such a representation is not sufficient to describe and analyze multilayer dynamic networks. This paper therefore extends a generalized mathematical infrastructure to address the problems posed by multilayer complex networks. Model inference is performed using Markov Chain Monte Carlo sampling strategies, given synthetic and real complex network data. Experimental results indicate a tremendous improvement in the performance of the proposed multilayer model in terms of sensitivity, specificity, positive and negative predictive values, positive and negative likelihood ratios, F1-score, Matthews correlation coefficient, and accuracy for two important applications: missing link prediction and future link forecasting. The experimental results also indicate the strong predictive power of the proposed model for the application of
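    All of the evaluation metrics listed above derive from a confusion matrix of predicted versus held-out links, so a small helper (with hypothetical counts) shows how each is computed:

```python
import math

def link_metrics(tp, fp, tn, fn):
    """Confusion-matrix metrics commonly used to evaluate link prediction."""
    sens = tp / (tp + fn)              # sensitivity (recall)
    spec = tn / (tn + fp)              # specificity
    ppv = tp / (tp + fp)               # positive predictive value
    npv = tn / (tn + fn)               # negative predictive value
    lr_pos = sens / (1 - spec)         # positive likelihood ratio
    lr_neg = (1 - sens) / spec         # negative likelihood ratio
    f1 = 2 * ppv * sens / (ppv + sens)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    acc = (tp + tn) / (tp + fp + tn + fn)
    return dict(sensitivity=sens, specificity=spec, ppv=ppv, npv=npv,
                lr_pos=lr_pos, lr_neg=lr_neg, f1=f1, mcc=mcc, accuracy=acc)

# Hypothetical counts from scoring candidate links against held-out edges.
for name, value in link_metrics(tp=80, fp=20, tn=880, fn=20).items():
    print(f"{name}: {value:.3f}")
```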

  5. Generalized PSF modeling for optimized quantitation in PET imaging.

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF
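    The qualitative effect of an under-, matched, or over-estimated PSF on contrast recovery can be illustrated in one dimension. The sketch below blurs a toy hot-lesion phantom with a "true" Gaussian PSF and then applies Richardson-Lucy iterations with various assumed PSF widths. This is a greatly simplified stand-in for PSF modelling inside OS-EM reconstruction, and every width and intensity value is an assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def richardson_lucy_1d(blurred, sigma, iters=30):
    """Minimal 1-D Richardson-Lucy deconvolution with a symmetric Gaussian PSF."""
    est = np.full_like(blurred, blurred.mean())
    for _ in range(iters):
        conv = gaussian_filter1d(est, sigma) + 1e-12
        est *= gaussian_filter1d(blurred / conv, sigma)
    return est

# 1-D phantom: warm background with a small hot "tumour".
truth = np.ones(200)
truth[95:105] = 4.0
true_sigma = 3.0
measured = gaussian_filter1d(truth, true_sigma)

for assumed in (0.0, 2.0, 3.0, 4.0):  # no-PSF, under-, matched, over-estimated
    img = measured if assumed == 0 else richardson_lucy_1d(measured, assumed)
    crc = (img[95:105].mean() - 1) / (truth[95:105].mean() - 1)
    print(f"assumed PSF sigma = {assumed}: contrast recovery = {crc:.2f}")
```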

  6. Modeling of Atmospheric Turbulence Effect on Terrestrial FSO Link

    Directory of Open Access Journals (Sweden)

    A. Prokes

    2009-04-01

    Full Text Available Atmospheric turbulence results in many effects causing fluctuations in the received optical power. Terrestrial laser beam communication is affected above all by scintillation. The paper deals with modeling the influence of scintillation on link performance using the modified Rytov theory. The probability of correct signal detection in a direct detection system, as a function of many parameters such as link distance, power link margin, and refractive-index structure parameter, is discussed, and different approaches to evaluating the scintillation effect are compared. The simulations are performed for horizontal-path propagation of a Gaussian-beam wave.
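    As a concrete, simplified illustration of scintillation modelling (a weak-turbulence calculation rather than the paper's modified Rytov treatment), the sketch below evaluates the standard plane-wave Rytov variance and a lognormal fade probability for assumed link parameters:

```python
import numpy as np
from scipy.stats import norm

wavelength = 1550e-9   # m, assumed
L = 2000.0             # link distance (m), assumed
Cn2 = 1e-14            # refractive-index structure parameter (m^-2/3), assumed

k = 2 * np.pi / wavelength
rytov_var = 1.23 * Cn2 * k ** (7 / 6) * L ** (11 / 6)  # plane-wave Rytov variance

# Weak-turbulence lognormal fading with scintillation index ~ Rytov variance.
si2 = rytov_var
sigma2 = np.log(1 + si2)   # variance of ln(I) for unit-mean intensity
mu = -sigma2 / 2

margin_db = 3.0                   # assumed power link margin
I_th = 10 ** (-margin_db / 10)    # fade threshold relative to mean intensity
p_fade = norm.cdf((np.log(I_th) - mu) / np.sqrt(sigma2))
print(f"Rytov variance = {rytov_var:.3f}, P(fade deeper than {margin_db} dB) = {p_fade:.2e}")
```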

  7. Port-based modeling of a flexible link

    NARCIS (Netherlands)

    Macchelli, A.; Macchelli, A.; Hirohika, A.; Lynch, K.; Melchiorri, C.; Park, F.C.; Stramigioli, Stefano; Parker, L.E.

    In this paper, a simple way to model flexible robotic links is presented. This is different from classical approaches and from the Euler–Bernoulli or Timoshenko theory, in that the proposed model is able to describe large deflections in 3-D space and does not rely on any finite-dimensional

  8. Evaluation of recent quantitative magnetospheric magnetic field models

    International Nuclear Information System (INIS)

    Walker, R.J.

    1976-01-01

    Recent quantitative magnetospheric field models contain many features not found in earlier models. Magnetopause models which include the effects of the dipole tilt were presented. More realistic models of the tail field include tail currents which close on the magnetopause, cross-tail currents of finite thickness, and cross-tail current models which model the position of the neutral sheet as a function of tilt. Finally, models have attempted to calculate the field of currents distributed in the inner magnetosphere. As the purpose of a magnetospheric model is to provide a mathematical description of the field that reasonably reproduces the observed magnetospheric field, several recent models were compared with the observed ΔB (B_observed − B_main field) contours. Models containing only contributions from magnetopause and tail current systems are able to reproduce the observed quiet time field only in an extremely qualitative way. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. However, the distributed current models are valid only for zero tilt. Even the models which reproduce the average observed field reasonably well may not give physically reasonable field gradients. Three of the models evaluated contain regions in the near tail in which the field gradient reverses direction. One region in which all the models fall short is that around the polar cusp, though most can be used to calculate the position of the last closed field line reasonably well

  9. Performance Theories for Sentence Coding: Some Quantitative Models

    Science.gov (United States)

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  10. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  11. Analysis of sensory ratings data with cumulative link models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Brockhoff, Per B.

    2013-01-01

    Examples of categorical rating scales include discrete preference, liking and hedonic rating scales. Data obtained on these scales are often analyzed with normal linear regression methods or with omnibus Pearson chi2 tests. In this paper we propose to use cumulative link models that allow for reg...
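    Cumulative link models of this kind are often fit with R's ordinal package; an equivalent Python sketch using statsmodels' OrderedModel is shown below on toy liking ratings for two hypothetical products (all data simulated):

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(2)
n = 300
product = rng.integers(0, 2, n)                 # two hypothetical products
latent = 0.8 * product + rng.logistic(size=n)   # latent liking
rating = pd.cut(latent, [-np.inf, -1, 0, 1, np.inf],
                labels=["dislike", "neutral", "like", "love"], ordered=True)

model = OrderedModel(rating, product.reshape(-1, 1).astype(float), distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```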

  12. Assessment of the link between quantitative biexponential diffusion-weighted imaging and contrast-enhanced MRI in the liver

    NARCIS (Netherlands)

    Dijkstra, Hildebrand; Oudkerk, Matthijs; Kappert, Peter; Sijens, Paul E.

    Purpose: To investigate if intravoxel incoherent motion (IVIM) modeled diffusion-weighted imaging (DWI) can be linked to contrast-enhanced (CE-)MRI in liver parenchyma and liver lesions. Methods: Twenty-five patients underwent IVIM-DWI followed by multiphase CE-MRI using Gd-EOB-DTPA (n = 20) or
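    The IVIM signal model referenced above has the standard biexponential closed form S(b)/S0 = f·exp(−b·D*) + (1 − f)·exp(−b·D). A minimal fitting sketch on synthetic liver-like data follows; all parameter values are assumed, and a direct simultaneous fit is used rather than the segmented strategies often preferred in practice.

```python
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, f, d_star, d):
    """Biexponential IVIM signal model."""
    return s0 * (f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d))

b_values = np.array([0, 10, 20, 40, 80, 150, 300, 500, 750, 1000.0])  # s/mm^2

# Synthetic signal: pseudo-diffusion D* much faster than true diffusion D.
true = dict(s0=1.0, f=0.25, d_star=0.05, d=0.0011)
noise = 1 + np.random.default_rng(3).normal(0, 0.01, b_values.size)
signal = ivim(b_values, **true) * noise

popt, _ = curve_fit(ivim, b_values, signal,
                    p0=[1.0, 0.2, 0.02, 0.001],
                    bounds=([0, 0, 0.003, 0], [2, 1, 1, 0.003]))
print(dict(zip(["S0", "f", "D*", "D"], popt.round(4))))
```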

  13. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    Science.gov (United States)

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. We here report in a perspective article a summary of presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  14. Quantitation of the receptor for urokinase plasminogen activator by enzyme-linked immunosorbent assay

    DEFF Research Database (Denmark)

    Rønne, E; Behrendt, N; Ploug, M

    1994-01-01

    Binding of the urokinase plasminogen activator (uPA) to a specific cell surface receptor (uPAR) plays a crucial role in proteolysis during tissue remodelling and cancer invasion. An immunosorbent assay for the quantitation of uPAR has now been developed. This assay is based on two monoclonal antibodies recognizing the non-ligand-binding part of this receptor, and it detects both free and occupied uPAR, in contrast to the ligand-binding assays used previously. In a variant of the assay, the occupied fraction of uPAR is selectively detected with a uPA antibody. To be used as a standard, a soluble variant of uPAR, suPAR, has been constructed by recombinant technique and the protein content of a purified suPAR standard preparation was determined by amino acid composition analysis. The sensitivity of the assay (0.6 ng uPAR/ml) is strong enough to measure uPAR in extracts of cultured cells and cancer...

  15. Molecular Model for HNBR with Tunable Cross-Link Density.

    Science.gov (United States)

    Molinari, N; Khawaja, M; Sutton, A P; Mostofi, A A

    2016-12-15

    We introduce a chemically inspired, all-atom model of hydrogenated nitrile butadiene rubber (HNBR) and assess its performance by computing the mass density and glass-transition temperature as a function of cross-link density in the structure. Our HNBR structures are created by a procedure that mimics the real process used to produce HNBR, that is, saturation of the carbon-carbon double bonds in NBR, either by hydrogenation or by cross-linking. The atomic interactions are described by the all-atom "Optimized Potentials for Liquid Simulations" (OPLS-AA). In this paper, first, we assess the use of OPLS-AA in our models, especially using NBR bulk properties, and second, we evaluate the validity of the proposed model for HNBR by investigating mass density and glass transition as a function of the tunable cross-link density. Experimental densities are reproduced within 3% for both elastomers, and qualitatively correct trends in the glass-transition temperature as a function of monomer composition and cross-link density are obtained.

  16. Quantitative metal magnetic memory reliability modeling for welded joints

    Science.gov (United States)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments, with X-ray testing carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution; K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, a quantitative MMM reliability model is presented for the first time, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with decreasing residual life ratio T, and that the maximal error between the predicted reliability degree R1 and the verification reliability degree R2 is 9.15%. This method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.

  17. A Review on Quantitative Models for Sustainable Food Logistics Management

    Directory of Open Access Journals (Sweden)

    M. Soysal

    2012-12-01

    Full Text Available Over the last two decades, food logistics systems have seen the transition from a focus on traditional supply chain management to food supply chain management and, successively, to sustainable food supply chain management. The main aim of this study is to identify the key logistical aims in these three phases and to analyse currently available quantitative models in order to point out modelling challenges in sustainable food logistics management (SFLM). A literature review of quantitative studies is conducted, and qualitative studies are also consulted to understand the key logistical aims more clearly and to identify relevant system scope issues. Results show that research on SFLM has been progressively developing according to the needs of the food industry. However, the intrinsic characteristics of food products and processes have not yet been handled properly in the identified studies. The majority of the works reviewed have not addressed sustainability problems, apart from a few recent studies. Therefore, the study concludes that new and advanced quantitative models are needed that take specific SFLM requirements from practice into consideration to support business decisions and capture food supply chain dynamics.

  18. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Full Text Available Abstract Background Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible, but the collective contribution of all loci is usually significant. Genome selection, which uses markers of the entire genome to predict the genomic values of individual plants or animals, can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is contributed by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross-validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions This study provides an excellent example of the application of genome selection to plant breeding.
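    The gain from adding marker-pair interactions can be mimicked with generic tools. The sketch below is a stand-in that uses ridge regression rather than the Bayesian genome-selection machinery of the paper; it builds pairwise interaction features and compares cross-validated squared correlations on simulated data matching the study's dimensions (126 lines, 80 markers):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(4)
n_lines, n_markers = 126, 80
X = rng.integers(0, 2, (n_lines, n_markers)).astype(float)  # RIL genotypes

# Simulated trait: two additive effects plus one epistatic marker pair.
y = X[:, 3] - 0.8 * X[:, 40] + 2.0 * X[:, 10] * X[:, 55] \
    + rng.normal(0, 1.0, n_lines)

interactions = PolynomialFeatures(degree=2, interaction_only=True,
                                  include_bias=False)
for label, feats in [("additive only", X),
                     ("additive + epistatic", interactions.fit_transform(X))]:
    pred = cross_val_predict(Ridge(alpha=10.0), feats, y, cv=5)
    r2 = np.corrcoef(pred, y)[0, 1] ** 2
    print(f"{label}: squared correlation = {r2:.2f}")
```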

  19. Linking Adverse Outcome Pathways to Dynamic Energy Budgets: A Conceptual Model

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Cheryl [Michigan State University, East Lansing; Nisbet, Roger [University of California Santa Barbara; Antczak, Philipp [University of Liverpool, UK; Reyero, Natalia [Army Corps of Engineers, Vicksburg; Gergs, Andre [Gaiac; Lika, Dina [University of Crete; Mathews, Teresa J. [ORNL; Muller, Eric [University of California, Santa Barbara; Nacci, Dianne [U.S. Environmental Protection Agency (EPA); Peace, Angela L. [ORNL; Remien, Chris [University of Idaho; Schulz, Irv [Pacific Northwest National Laboratory (PNNL); Watanabe, Karen [Arizona State University

    2018-02-01

    Ecological risk assessment quantifies the likelihood of undesirable impacts of stressors, primarily at high levels of biological organization. Data used to inform ecological risk assessments come primarily from tests on individual organisms or from suborganismal studies, indicating a disconnect between primary data and protection goals. We know how to relate individual responses to population dynamics using individual-based models, and there are emerging ideas on how to make connections to ecosystem services. However, there is no established methodology to connect effects seen at higher levels of biological organization with suborganismal dynamics, despite progress made in identifying Adverse Outcome Pathways (AOPs) that link molecular initiating events to ecologically relevant key events. This chapter is a product of a working group at the National Institute for Mathematical and Biological Synthesis (NIMBioS) that assessed the feasibility of using dynamic energy budget (DEB) models of individual organisms as a “pivot” connecting suborganismal processes to higher-level ecological processes. AOP models quantify explicit molecular, cellular or organ-level processes, but do not offer a route to linking suborganismal damage to adverse effects on individual growth, reproduction, and survival, which can be propagated to the population level through individual-based models. DEB models describe these processes, but use abstract variables with undetermined connections to suborganismal biology. We propose linking DEB and quantitative AOP models by interpreting AOP key events as measures of damage-inducing processes in a DEB model. Here, we present a conceptual model for linking AOPs to DEB models and review existing modeling tools available for both AOP and DEB.

  20. Wires in the soup: quantitative models of cell signaling

    Science.gov (United States)

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, necessitating sophisticated computational modeling coupled with precise experimentation for their unraveling. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  1. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m/s air velocity. The maximum power is 3.4 W, and the power conversion factor from kinetic to electric energy is c_p = 0.15. The v^3 power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively. (paper)
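    The reported figures are easy to check against the standard wind-power relation P = ½ρAv³ (with an assumed air density):

```python
from math import pi

rho = 1.2    # air density (kg/m^3), assumed
d = 0.12     # rotor diameter (m)
v = 15.0     # air velocity (m/s)
cp = 0.15    # measured power conversion factor

area = pi * (d / 2) ** 2
p_kinetic = 0.5 * rho * area * v ** 3   # power carried by the air stream
p_electric = cp * p_kinetic

print(f"kinetic power: {p_kinetic:.1f} W, electric power: {p_electric:.1f} W")
# roughly 22.9 W and 3.4 W, consistent with the reported maximum output
```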

  2. Quantitative Systems Pharmacology: A Case for Disease Models.

    Science.gov (United States)

    Musante, C J; Ramanujan, S; Schmidt, B J; Ghobrial, O G; Lu, J; Heatherington, A C

    2017-01-01

    Quantitative systems pharmacology (QSP) has emerged as an innovative approach in model-informed drug discovery and development, supporting program decisions from exploratory research through late-stage clinical trials. In this commentary, we discuss the unique value of disease-scale "platform" QSP models that are amenable to reuse and repurposing to support diverse clinical decisions in ways distinct from other pharmacometrics strategies. © 2016 The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of The American Society for Clinical Pharmacology and Therapeutics.

  3. Quantitative model of New Zealand's energy supply industry

    Energy Technology Data Exchange (ETDEWEB)

    Smith, B. R. [Victoria Univ., Wellington, (New Zealand); Lucas, P. D. [Ministry of Energy Resources (New Zealand)

    1977-10-15

    A mathematical model is presented to assist in an analysis of energy policy options available. The model is based on an engineering orientated description of New Zealand's energy supply and distribution system. The system is cast as a linear program, in which energy demand is satisfied at least cost. The capacities and operating modes of process plant (such as power stations, oil refinery units, and LP-gas extraction plants) are determined by the model, as well as the optimal mix of fuels supplied to the final consumers. Policy analysis with the model enables a wide ranging assessment of the alternatives and uncertainties within a consistent quantitative framework. It is intended that the model be used as a tool to investigate the relative effects of various policy options, rather than to present a definitive plan for satisfying the nation's energy requirements.
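    A toy version of such a least-cost linear program, with fuel names, costs, and capacities entirely invented for illustration, can be written with scipy:

```python
import numpy as np
from scipy.optimize import linprog

fuels = ["hydro", "gas", "coal", "oil"]
cost = np.array([3.0, 6.0, 5.0, 9.0])   # $/PJ delivered (illustrative)
capacity = np.array([60, 40, 50, 80])   # PJ/yr available per source
demand = 150.0                          # PJ/yr of final demand to satisfy

res = linprog(c=cost,
              A_ub=np.eye(len(fuels)), b_ub=capacity,        # capacity limits
              A_eq=np.ones((1, len(fuels))), b_eq=[demand],  # meet demand
              bounds=[(0, None)] * len(fuels), method="highs")

print(dict(zip(fuels, res.x.round(1))), "total cost:", round(res.fun, 1))
```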

  4. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    Science.gov (United States)

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.

  5. A cell-based model system links chromothripsis with hyperploidy

    DEFF Research Database (Denmark)

    Mardin, Balca R; Drainas, Alexandros P; Waszak, Sebastian M

    2015-01-01

    A remarkable observation emerging from recent cancer genome analyses is the identification of chromothripsis as a one-off genomic catastrophe, resulting in massive somatic DNA structural rearrangements (SRs). Largely due to lack of suitable model systems, the mechanistic basis of chromothripsis h...... in hyperploid cells. Analysis of primary medulloblastoma cancer genomes verified the link between hyperploidy and chromothripsis in vivo. CAST provides the foundation for mechanistic dissection of complex DNA rearrangement processes....

  6. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Full Text Available Given the growth of e-commerce, websites play an essential role in business success. Many authors have therefore offered website evaluation models since 1995. However, the multiplicity and diversity of these evaluation models make it difficult to integrate them into a single comprehensive model. In this paper, a quantitative method is used to integrate previous models into a comprehensive model that is compatible with them. In this approach, the researcher's judgment plays no role in the integration of models, and the new model takes its validity from the 93 previous models and the systematic quantitative approach.

  7. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  8. Modeling online social networks based on preferential linking

    International Nuclear Information System (INIS)

    Hu Hai-Bo; Chen Jun; Guo Jin-Li

    2012-01-01

    We study the phenomena of preferential linking in a large-scale evolving online social network and find that the linear preference holds for preferential creation, preferential acceptance, and preferential attachment. Based on the linear preference, we propose an analyzable model, which illustrates the mechanism of network growth and reproduces the process of network evolution. Our simulations demonstrate that the degree distribution of the network produced by the model is in good agreement with that of the real network. This work provides a possible bridge between the micro-mechanisms of network growth and the macrostructures of online social networks
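    Linear preferential attachment is the mechanism behind Barabási-Albert growth, so a quick simulation shows the kind of heavy-tailed degree distribution such a model reproduces. This sketch covers plain attachment only, not the separate preferential creation and acceptance processes analyzed in the paper:

```python
import networkx as nx
import numpy as np

# Barabasi-Albert growth implements linear preferential attachment.
G = nx.barabasi_albert_graph(n=20000, m=3, seed=5)

degrees = np.array([d for _, d in G.degree()])
ks = np.arange(3, 103)
ccdf = np.array([(degrees >= k).mean() for k in ks])

# Slope of the CCDF on log-log axes; a slope near -2 implies P(k) ~ k^-3.
slope = np.polyfit(np.log(ks), np.log(ccdf), 1)[0]
print(f"CCDF log-log slope ~ {slope:.2f} (degree exponent ~ {1 - slope:.2f})")
```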

  9. APPLICATION OF RIGID LINKS IN STRUCTURAL DESIGN MODELS

    Directory of Open Access Journals (Sweden)

    Sergey Yu. Fialko

    2017-09-01

    Full Text Available A special finite element for modelling rigid links is proposed for linear static and buckling analysis. Unlike the classical approach based on the theorems of rigid body kinematics, the proposed approach preserves the similarity between the adjacency graph for the sparse matrix and the adjacency graph for the nodes of the finite element model, which allows sparse direct solvers to be applied more effectively. Besides, the proposed approach significantly reduces the number of nonzero entries in the factored stiffness matrix in comparison with the classical one, which greatly reduces the solution time. For buckling problems of structures containing rigid bodies, this approach gives correct results. Several examples demonstrate its efficiency.

  10. Quantitative aspects and dynamic modelling of glucosinolate metabolism

    DEFF Research Database (Denmark)

    Vik, Daniel

    ...This enables comparison of transcript and protein levels across mutants and upon induction. I find that unchallenged plants show good correspondence between protein and transcript, but that treatment with methyljasmonate results in significant differences (chapter 1). Functional genomics are used to study... The construction of a dynamic quantitative model of GLS hydrolysis is described. Simulations reveal potential effects on auxin signalling that could reflect defensive strategies (chapter 4). The results presented grant insights not only into the dynamics of GLS biosynthesis and hydrolysis, but also into the relationship...

  11. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment, an advance of modern biotechnology, has been used successfully in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment, and it must be administered continuously and without interruption. Since 2001, enzyme replacement therapy with Cerezyme (Genzyme) has been formally available in Bulgaria, but it was at times interrupted for 1-2 months and patient doses were not optimal. The aim of our work is to find a mathematical model for the quantitative evaluation of ERT in Gaucher disease. The model applies software called "Statistika 6" to the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The output of the model provides a quantitative evaluation of the individual trends in the development of each child's disease and their correlations. On the basis of these results, we might recommend suitable changes in ERT.

  12. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  13. Quantifying Zika: Advancing the Epidemiology of Zika With Quantitative Models.

    Science.gov (United States)

    Keegan, Lindsay T; Lessler, Justin; Johansson, Michael A

    2017-12-16

    When Zika virus (ZIKV) emerged in the Americas, little was known about its biology, pathogenesis, and transmission potential, and the scope of the epidemic was largely hidden, owing to generally mild infections and no established surveillance systems. Surges in congenital defects and Guillain-Barré syndrome alerted the world to the danger of ZIKV. In the context of limited data, quantitative models were critical in reducing uncertainties and guiding the global ZIKV response. Here, we review some of the models used to assess the risk of ZIKV-associated severe outcomes, the potential speed and size of ZIKV epidemics, and the geographic distribution of ZIKV risk. These models provide important insights and highlight significant unresolved questions related to ZIKV and other emerging pathogens. Published by Oxford University Press for the Infectious Diseases Society of America 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  14. Groundwater Pollution Source Identification using Linked ANN-Optimization Model

    Science.gov (United States)

    Ayaz, Md; Srivastava, Rajesh; Jain, Ashu

    2014-05-01

Groundwater is the principal source of drinking water in several parts of the world. Contamination of groundwater has become a serious health and environmental problem. Human activities, including industrial and agricultural activities, are generally responsible for this contamination. Identification of a groundwater pollution source is a major step in groundwater pollution remediation, and complete knowledge of the source characteristics is essential to adopt an effective remediation strategy. A groundwater pollution source is said to be identified completely when its characteristics - location, strength and release period - are known. Identification of an unknown groundwater pollution source is an ill-posed inverse problem. It becomes more difficult under real field conditions, when the lag time between the first reading at the observation well and the time at which the source becomes active is not known. We developed a linked ANN-Optimization model for complete identification of an unknown groundwater pollution source. The model comprises two parts: an optimization model and an ANN model. The decision variables of the linked ANN-Optimization model are the location and release period of the pollution source. An objective function is formulated using the spatial and temporal data of observed and simulated concentrations, and then minimized to identify the pollution source parameters. The formulation of the objective function requires the lag time, which is not known; an ANN model with one hidden layer is therefore trained with the Levenberg-Marquardt algorithm to find it. Different combinations of source locations and release periods are used as inputs, and the lag time is obtained as the output. Performance of the proposed model is evaluated for two- and three-dimensional cases with error-free and erroneous data. Erroneous data were generated by adding uniformly distributed random error (error level 0-10%) to the analytically computed concentrations.
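
As a sketch of how the linked ANN-Optimization idea fits together, the toy below couples a stand-in lag-time predictor (replacing the trained Levenberg-Marquardt ANN) with a global optimizer that recovers source parameters by minimizing the misfit between observed and simulated concentrations. The transport response is a crude 1-D stand-in, not the authors' simulator; all constants are invented.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy 1-D setting: one observation well at x_obs = 100 m, sampled at ten times.
# A "true" source (location 40 m, strength 100, release time 2) generates the
# observations; the plume response below is a crude stand-in solver.
x_obs, t_obs, v, D = 100.0, np.arange(1.0, 11.0), 10.0, 20.0

def simulate(loc, strength, t_release, t):
    tau = np.clip(t - t_release, 1e-6, None)   # time since release
    return strength / np.sqrt(4 * np.pi * D * tau) * \
        np.exp(-((x_obs - loc) - v * tau) ** 2 / (4 * D * tau))

def lag_time(loc, t_release):
    # Stand-in for the trained ANN: the paper maps candidate source locations
    # and release periods to a lag time with a Levenberg-Marquardt-trained net.
    return 0.5 * (x_obs - loc) / v + t_release

observed = simulate(40.0, 100.0, 2.0, t_obs)

def objective(p):
    loc, strength, t_release = p
    mask = t_obs >= lag_time(loc, t_release)   # use post-arrival data only
    if not mask.any():
        return 1e12
    resid = observed[mask] - simulate(loc, strength, t_release, t_obs[mask])
    return float(np.sum(resid ** 2))

best = differential_evolution(objective, [(0, 90), (10, 500), (0, 5)], seed=1)
print("recovered (location, strength, release time):", best.x.round(2))
```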

  15. Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.

    Science.gov (United States)

    Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P

    2017-01-11

Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
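
For illustration, the reversible (Nernstian) limit of the Laviron surface-bound response can be fit to a voltammogram to extract E0'. The sketch below does this on synthetic data with SciPy; a one-electron couple at 298 K is assumed, and the idealized surface-confined peak shape stands in for the authors' full fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

F, R, T = 96485.0, 8.314, 298.0
n = 1  # one-electron couple assumed here (illustrative)

def surface_cv_peak(E, E0, i_max):
    # Reversible limit of the surface-bound (Laviron-type) response:
    # i(E) = 4 * i_max * exp(xi) / (1 + exp(xi))^2, xi = nF(E - E0)/RT
    xi = n * F * (E - E0) / (R * T)
    return 4.0 * i_max * np.exp(xi) / (1.0 + np.exp(xi)) ** 2

# Synthetic noisy peak standing in for a measured TERS CV
E = np.linspace(-0.6, 0.0, 200)
rng = np.random.default_rng(0)
i_obs = surface_cv_peak(E, -0.35, 1.0) + rng.normal(0, 0.03, E.size)

(E0_fit, imax_fit), _ = curve_fit(surface_cv_peak, E, i_obs, p0=(-0.3, 0.8))
print(f"fitted formal potential E0' = {E0_fit:.3f} V")
```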

  16. Modelling and Intelligent Control of an Elastic Link Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Malik Loudini

    2013-01-01

Full Text Available In this paper, precise control of the end-point position of a planar single-link elastic manipulator robot is discussed. The Timoshenko beam theory (TBT) has been used to characterize the structural link elasticity, including important damping mechanisms. A suitable nonlinear model is derived based on the Lagrangian assumed modes method. Elastic link manipulators are classified as systems possessing highly complex dynamics. In addition, the environment in which they operate may have a lot of disturbances. These give rise to special problems that may be solved using intelligent control techniques. The application of two advanced control strategies based on fuzzy set theory is investigated. The first closed-loop control scheme to be applied is the standard Proportional-Derivative (PD) type fuzzy logic controller (FLC), also known as the PD-type Mamdani's FLC (MPDFLC). Then, a genetic algorithm (GA) is used to optimize the MPDFLC parameters with innovative tuning procedures. Both the MPDFLC and the GA-optimized FLC (GAOFLC) are implemented and tested to achieve precise control of the manipulator end-point. The performances of the adopted closed-loop intelligent control strategies are examined via simulation experiments.

  17. Curing critical links in oscillator networks as power flow models

    International Nuclear Information System (INIS)

    Rohden, Martin; Meyer-Ortmanns, Hildegard; Witthaut, Dirk; Timme, Marc

    2017-01-01

    Modern societies crucially depend on the robust supply with electric energy so that blackouts of power grids can have far reaching consequences. Typically, large scale blackouts take place after a cascade of failures: the failure of a single infrastructure component, such as a critical transmission line, results in several subsequent failures that spread across large parts of the network. Improving the robustness of a network to prevent such secondary failures is thus key for assuring a reliable power supply. In this article we analyze the nonlocal rerouting of power flows after transmission line failures for a simplified AC power grid model and compare different strategies to improve network robustness. We identify critical links in the grid and compute alternative pathways to quantify the grid’s redundant capacity and to find bottlenecks along the pathways. Different strategies are developed and tested to increase transmission capacities to restore stability with respect to transmission line failures. We show that local and nonlocal strategies typically perform alike: one can equally well cure critical links by providing backup capacities locally or by extending the capacities of bottleneck links at remote locations. (paper)
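
A minimal sketch of the critical-link analysis described above, using networkx on a toy grid: each link is removed in turn, an alternative pathway is sought, and the smallest-capacity link along that pathway is reported as the bottleneck. Topology and capacities are invented for the example.

```python
import networkx as nx

# Small illustrative grid; each edge carries a transmission capacity.
G = nx.Graph()
G.add_weighted_edges_from(
    [("A", "B", 10), ("B", "C", 4), ("A", "D", 6), ("D", "C", 3), ("B", "D", 5)],
    weight="capacity",
)

for u, v in list(G.edges):
    H = G.copy()
    H.remove_edge(u, v)                      # simulate failure of link u-v
    if not nx.has_path(H, u, v):
        print(f"{u}-{v}: critical, no reroute possible")
        continue
    # Alternative pathway and its bottleneck (smallest-capacity link on it)
    path = nx.shortest_path(H, u, v)
    caps = [H[a][b]["capacity"] for a, b in zip(path, path[1:])]
    print(f"{u}-{v}: reroute {path}, bottleneck capacity {min(caps)}")
```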

  18. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    Full Text Available based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  19. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    Directory of Open Access Journals (Sweden)

    Cobbs Gary

    2012-08-01

Full Text Available Abstract Background Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Results Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes that annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the

  20. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction.

    Science.gov (United States)

    Cobbs, Gary

    2012-08-16

Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes that annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the literature. They also give better estimates of
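
The stepwise idea, computing an efficiency for each cycle from the current reagent amounts and applying it cycle by cycle, can be sketched as follows. The saturating efficiency expression below is a simple stand-in, not the equilibrium solutions of the paper's Model 1 or Model 2, and all constants are illustrative.

```python
import numpy as np

def qpcr_curve(target0, primer0=5e12, K=1e12, cycles=40):
    # Stepwise qPCR sketch: the per-cycle efficiency is recomputed from the
    # current primer and target amounts (a saturating stand-in for the
    # annealing-equilibrium solutions), then applied to the target pool.
    target, primer, curve = float(target0), float(primer0), []
    for _ in range(cycles):
        # efficiency falls as primers deplete and product self-anneals
        eff = primer / (primer + K + target)
        new_strands = eff * target
        target += new_strands
        primer = max(primer - new_strands, 0.0)  # each strand uses one primer
        curve.append(target)
    return np.array(curve)

for n0 in (1e3, 1e5, 1e7):
    ct = np.argmax(qpcr_curve(n0) > 1e11)       # crude threshold cycle
    print(f"initial copies {n0:.0e} -> threshold cycle ~{ct}")
```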

  1. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps of the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
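
A minimal discrete-event sketch of the logistic mechanism described above: a FIFO shelf receives daily deliveries, stochastic demand pulls the oldest lots, and the resulting storage times feed an exponential growth model whose upper tail drives the risk estimate. Delivery size, demand range, and growth rate are invented for the example.

```python
import math
import random

random.seed(7)

shelf, storage_times = [], []            # shelf holds [arrival_day, units]
for day in range(200):
    shelf.append([day, 30])              # daily delivery of 30 units
    demand = random.randint(15, 45)      # stochastic daily demand
    while demand > 0 and shelf:
        arrived, units = shelf[0]
        sold = min(units, demand)        # oldest lots leave the shelf first
        storage_times.extend([day - arrived] * sold)
        demand -= sold
        if sold == units:
            shelf.pop(0)
        else:
            shelf[0][1] = units - sold

mu = 0.4                                 # assumed growth rate per day (toy)
growth = sorted(math.exp(mu * t) for t in storage_times)
p99 = growth[int(0.99 * len(growth))]
print(f"99th-percentile growth factor: {p99:.1f} (the tail drives the risk)")
```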

  2. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  3. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard

  4. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    2000-07-01

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard
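
The three accuracy measures named in the abstract (daily standard deviation, root-mean-square error, and correlation coefficient between data and model) are straightforward to compute; a sketch on synthetic model/data pairs, with invented values:

```python
import numpy as np

def skill(observed, modelled):
    # The study's three accuracy measures: standard deviation of the data,
    # root-mean-square error, and correlation coefficient.
    err = modelled - observed
    return {
        "std_obs": np.std(observed),
        "rmse": np.sqrt(np.mean(err ** 2)),
        "corr": np.corrcoef(observed, modelled)[0, 1],
    }

# Illustrative daily F-region departures from the climatological mean
rng = np.random.default_rng(3)
obs = rng.normal(0.0, 1.0, 50)
mod = 0.6 * obs + rng.normal(0.0, 0.6, 50)   # an imperfect model
print(skill(obs, mod))
```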

  5. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  6. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  7. The Pseudomonas community in metal-contaminated sediments as revealed by quantitative PCR: a link with metal bioavailability.

    Science.gov (United States)

    Roosa, Stéphanie; Wauven, Corinne Vander; Billon, Gabriel; Matthijs, Sandra; Wattiez, Ruddy; Gillan, David C

    2014-10-01

Pseudomonas bacteria are ubiquitous Gram-negative and aerobic microorganisms that are known to harbor metal resistance mechanisms such as efflux pumps and intracellular redox enzymes. Specific Pseudomonas bacteria have been quantified in some metal-contaminated environments, but the entire Pseudomonas population has been poorly investigated under these conditions, and the link with metal bioavailability was not previously examined. In the present study, quantitative PCR and cell cultivation were used to monitor and characterize the Pseudomonas population at 4 different sediment sites contaminated with various levels of metals. At the same time, total metals and metal bioavailability (as estimated using a 1 M HCl extraction) were measured. It was found that the total level of Pseudomonas, as determined by qPCR using two different genes (oprI and the 16S rRNA gene), was positively and significantly correlated with total and HCl-extractable Cu, Co, Ni, Pb and Zn, with high correlation coefficients (>0.8). Metal-contaminated sediments featured isolates of the Pseudomonas putida, Pseudomonas fluorescens, Pseudomonas lutea and Pseudomonas aeruginosa groups, with other bacterial genera such as Mycobacterium, Klebsiella and Methylobacterium. It is concluded that Pseudomonas bacteria do proliferate in metal-contaminated sediments, but are still part of a complex community. Copyright © 2014 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.
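
The reported correlations can be reproduced in form (not in value) with a few lines; since both gene copy numbers and extractable metal concentrations span orders of magnitude, correlating on a log scale is a reasonable choice. All numbers below are invented.

```python
import numpy as np
from scipy.stats import pearsonr

# Illustrative site values: oprI copies per g sediment (qPCR) and
# 1 M HCl-extractable Cu (mg/kg); numbers are made up for the sketch.
oprI = np.array([2.1e6, 8.5e6, 3.0e7, 9.2e7])
cu_hcl = np.array([12.0, 55.0, 140.0, 310.0])

# Correlate on a log scale, as both quantities span orders of magnitude
r, p = pearsonr(np.log10(oprI), np.log10(cu_hcl))
print(f"r = {r:.2f} (p = {p:.3f})")
```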

  8. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    Science.gov (United States)

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

Bioluminescent cancer models are widely used but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is superior with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  9. Towards Quantitative Spatial Models of Seabed Sediment Composition.

    Directory of Open Access Journals (Sweden)

    David Stephens

Full Text Available There is a need for fit-for-purpose maps for accurately depicting the types of seabed substrate and habitat and the properties of the seabed for the benefits of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom's parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel) using the random forest algorithm. The compositional data is analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables are explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method.
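
A compact sketch of the modelling chain described above, assuming scikit-learn: synthetic mud/sand/gravel fractions are transformed to two additive log-ratio (ALR) variables, a random forest is fit to environmental predictors, and predictions are back-transformed to compositions. Data and predictor names are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 3))          # stand-ins for depth, currents, optics
# Synthetic mud/sand/gravel fractions whose balance depends on the predictors
alpha = np.exp(np.column_stack([X[:, 0], 0.5 * X[:, 1], -X[:, 0]]))
comp = np.array([rng.dirichlet(a + 0.5) for a in alpha])  # rows sum to 1

# Additive log-ratio transform with sand (column 1) as the reference part
eps = 1e-6
alr = np.log((comp[:, [0, 2]] + eps) / (comp[:, [1]] + eps))

Xtr, Xte, ytr, yte = train_test_split(X, alr, random_state=1)
rf = RandomForestRegressor(n_estimators=300, random_state=1).fit(Xtr, ytr)
pred = rf.predict(Xte)

# Inverse ALR back to mud/sand/gravel fractions
expd = np.exp(pred)
sand = 1.0 / (1.0 + expd.sum(axis=1))
mud, gravel = expd[:, 0] * sand, expd[:, 1] * sand
print("first predicted composition (mud, sand, gravel):",
      np.round([mud[0], sand[0], gravel[0]], 3))
```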

  10. A quantitative phase field model for hydride precipitation in zirconium alloys: Part I. Development of quantitative free energy functional

    International Nuclear Information System (INIS)

    Shi, San-Qiang; Xiao, Zhihua

    2015-01-01

    A temperature dependent, quantitative free energy functional was developed for the modeling of hydride precipitation in zirconium alloys within a phase field scheme. The model takes into account crystallographic variants of hydrides, interfacial energy between hydride and matrix, interfacial energy between hydrides, elastoplastic hydride precipitation and interaction with externally applied stress. The model is fully quantitative in real time and real length scale, and simulation results were compared with limited experimental data available in the literature with a reasonable agreement. The work calls for experimental and/or theoretical investigations of some of the key material properties that are not yet available in the literature

  11. The link between physics and chemistry in track modelling

    International Nuclear Information System (INIS)

    Green, N.J.B.; Bolton, C.E.; Spencer-Smith, R.D.

    1999-01-01

The physical structure of a radiation track provides the initial conditions for the modelling of radiation chemistry. These initial conditions are not perfectly understood, because there are important gaps between what is provided by a typical track structure model and what is required to start the chemical model. This paper addresses the links between the physics and chemistry of tracks, with the intention of identifying those problems that need to be solved in order to obtain an accurate picture of the initial conditions for the purposes of modelling chemistry. These problems include the reasons for the increased yield of ionisation relative to homolytic bond breaking in comparison with the gas phase. A second area of great importance is the physical behaviour of low-energy electrons in condensed matter (including thermalisation and solvation). Many of these processes are not well understood, but they can have profound effects on the transient chemistry in the track. Several phenomena are discussed, including the short distance between adjacent energy loss events, the molecular nature of the underlying medium, dissociative attachment resonances and the ability of low-energy electrons to excite optically forbidden molecular states. Each of these phenomena has the potential to modify the transient chemistry substantially and must therefore be properly characterised before the physical model of the track can be considered to be complete. (orig.)

  12. Integrative modelling reveals mechanisms linking productivity and plant species richness.

    Science.gov (United States)

    Grace, James B; Anderson, T Michael; Seabloom, Eric W; Borer, Elizabeth T; Adler, Peter B; Harpole, W Stanley; Hautier, Yann; Hillebrand, Helmut; Lind, Eric M; Pärtel, Meelis; Bakker, Jonathan D; Buckley, Yvonne M; Crawley, Michael J; Damschen, Ellen I; Davies, Kendi F; Fay, Philip A; Firn, Jennifer; Gruner, Daniel S; Hector, Andy; Knops, Johannes M H; MacDougall, Andrew S; Melbourne, Brett A; Morgan, John W; Orrock, John L; Prober, Suzanne M; Smith, Melinda D

    2016-01-21

    How ecosystem productivity and species richness are interrelated is one of the most debated subjects in the history of ecology. Decades of intensive study have yet to discern the actual mechanisms behind observed global patterns. Here, by integrating the predictions from multiple theories into a single model and using data from 1,126 grassland plots spanning five continents, we detect the clear signals of numerous underlying mechanisms linking productivity and richness. We find that an integrative model has substantially higher explanatory power than traditional bivariate analyses. In addition, the specific results unveil several surprising findings that conflict with classical models. These include the isolation of a strong and consistent enhancement of productivity by richness, an effect in striking contrast with superficial data patterns. Also revealed is a consistent importance of competition across the full range of productivity values, in direct conflict with some (but not all) proposed models. The promotion of local richness by macroecological gradients in climatic favourability, generally seen as a competing hypothesis, is also found to be important in our analysis. The results demonstrate that an integrative modelling approach leads to a major advance in our ability to discern the underlying processes operating in ecological systems.

  13. Quantitative genetic models of sexual selection by male choice.

    Science.gov (United States)

    Nakahashi, Wataru

    2008-09-01

    There are many examples of male mate choice for female traits that tend to be associated with high fertility. I develop quantitative genetic models of a female trait and a male preference to show when such a male preference can evolve. I find that a disagreement between the fertility maximum and the viability maximum of the female trait is necessary for directional male preference (preference for extreme female trait values) to evolve. Moreover, when there is a shortage of available male partners or variance in male nongenetic quality, strong male preference can evolve. Furthermore, I also show that males evolve to exhibit a stronger preference for females that are more feminine (less resemblance to males) than the average female when there is a sexual dimorphism caused by fertility selection which acts only on females.

  14. Quantitative Modelling of Trace Elements in Hard Coal.

    Science.gov (United States)

    Smoliński, Adam; Howaniec, Natalia

    2016-01-01

The significance of coal in the world economy has remained unquestionable for decades. It is also expected to be the dominant fossil fuel in the foreseeable future. The increased awareness of sustainable development reflected in the relevant regulations implies, however, the need for the development and implementation of clean coal technologies on the one hand, and adequate analytical tools on the other. The paper presents the application of the quantitative Partial Least Squares method in modeling the concentrations of trace elements (As, Ba, Cd, Co, Cr, Cu, Mn, Ni, Pb, Rb, Sr, V and Zn) in hard coal based on the physical and chemical parameters of coal, and coal ash components. The study was focused on trace elements potentially hazardous to the environment when emitted from coal processing systems. The studied data included 24 parameters determined for 132 coal samples provided by 17 coal mines of the Upper Silesian Coal Basin, Poland. Since the data set contained outliers, the construction of robust Partial Least Squares models for the contaminated data set and the correct identification of outlying objects based on the robust scales were required. These enabled the development of correct Partial Least Squares models, characterized by good fit and prediction abilities. The root mean square error was below 10% for all but one of the final Partial Least Squares models constructed, and the prediction error (root mean square error of cross-validation) exceeded 10% for only three of the models constructed. The study is of both cognitive and applicative importance. It presents a unique application of chemometric methods of data exploration in modeling the content of trace elements in coal. In this way it contributes to the development of useful tools for coal quality assessment.
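
A sketch of the Partial Least Squares workflow on synthetic data with scikit-learn, reporting the cross-validated prediction error (RMSECV) as in the study. Note that sklearn's PLSRegression is the ordinary, not the robust, variant the authors used for their outlier-contaminated data; sample and parameter counts echo the study, everything else is invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
n, p = 132, 24                      # sample and parameter counts from the study
X = rng.normal(size=(n, p))         # stand-ins for proximate/ash-chemistry data
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 0.5, 0.8]
y = X @ beta + rng.normal(0, 0.5, n)    # synthetic trace-element concentration

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV = {rmsecv:.2f} ({100 * rmsecv / np.ptp(y):.1f}% of range)")
```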

  15. The Linked Dual Representation model of vocal perception and production

    Directory of Open Access Journals (Sweden)

    Sean eHutchins

    2013-11-01

    Full Text Available The voice is one of the most important media for communication, yet there is a wide range of abilities in both the perception and production of the voice. In this article, we review this range of abilities, focusing on pitch accuracy as a particularly informative case, and look at the factors underlying these abilities. Several classes of models have been posited describing the relationship between vocal perception and production, and we review the evidence for and against each class of model. We look at how the voice is different from other musical instruments and review evidence about both the association and the dissociation between vocal perception and production abilities. Finally, we introduce the Linked Dual Representation model, a new approach which can account for the broad patterns in prior findings, including trends in the data which might seem to be countervailing. We discuss how this model interacts with higher-order cognition and examine its predictions about several aspects of vocal perception and production.

  16. An Object-oriented Knowledge Link Model for General Knowledge Management

    OpenAIRE

    Xiao-hong, CHEN; Bang-chuan, LAI

    2005-01-01

The knowledge link is the basis of knowledge sharing and an indispensable part of knowledge standardization management. In this paper, an object-oriented knowledge link model is proposed for general knowledge management, using an object-oriented representation built on a system of knowledge levels. In the model, knowledge links are divided into general knowledge links and integrated knowledge links, each with corresponding link properties and methods. In addition, the model's BNF syntax is described and designed.

  17. Melanoma screening: Informing public health policy with quantitative modelling.

    Directory of Open Access Journals (Sweden)

    Stephen Gilmore

Full Text Available Australia and New Zealand share the highest incidence rates of melanoma worldwide. Despite the substantial increase in public and physician awareness of melanoma in Australia over the last 30 years, a result of the publicly funded mass media campaigns introduced in the early 1980s, mortality has steadily increased during this period. This increased mortality has led investigators to question the relative merits of primary versus secondary prevention; that is, sensible sun exposure practices versus early detection. Increased melanoma vigilance on the part of the public and among physicians has resulted in large increases in public health expenditure, primarily from screening costs and increased rates of office surgery. Has this attempt at secondary prevention been effective? Unfortunately, epidemiologic studies addressing the causal relationship between the level of secondary prevention and mortality are prohibitively difficult to implement; it is currently unknown whether increased melanoma surveillance reduces mortality, and if so, whether such an approach is cost-effective. Here I address the issue of secondary prevention of melanoma with respect to incidence and mortality (and cost per life saved) by developing a Markov model of melanoma epidemiology based on Australian incidence and mortality data. The advantages of developing a methodology that can determine constraint-based surveillance outcomes are twofold: first, it can address the issue of effectiveness; and second, it can quantify the trade-off between cost and utilisation of medical resources on one hand, and reduced morbidity and lives saved on the other. With respect to melanoma, implementing the model facilitates the quantitative determination of the relative effectiveness and trade-offs associated with different levels of secondary and tertiary prevention, both retrospectively and prospectively. For example, I show that the surveillance enhancement that began in
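
A toy version of such a Markov model: four annual health states, with a surveillance parameter that moves early melanoma to cure before it progresses. Transition probabilities are illustrative, not calibrated to Australian data.

```python
import numpy as np

# Annual Markov chain over four states: Healthy, Early melanoma, Advanced, Dead.
def transition(p_detect):
    # Higher surveillance (p_detect) returns Early cases to Healthy (cure)
    # before progression; a fraction of Advanced cases die each year.
    return np.array([
        [0.9990, 0.0010, 0.000, 0.000],                           # Healthy
        [0.60 * p_detect, 0.90 - 0.60 * p_detect, 0.095, 0.005],  # Early
        [0.0000, 0.0000, 0.900, 0.100],                           # Advanced
        [0.0000, 0.0000, 0.000, 1.000],                           # Dead
    ])

def mortality_after(years, p_detect):
    state = np.array([1.0, 0.0, 0.0, 0.0])   # everyone starts healthy
    P = transition(p_detect)
    for _ in range(years):
        state = state @ P
    return state[3]

for p in (0.1, 0.5, 0.9):
    print(f"detection level {p:.1f}: "
          f"cumulative mortality {mortality_after(30, p):.4%}")
```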

  18. A Quantitative Trait Locus (LSq-1) on Mouse Chromosome 7 Is Linked to the Absence of Tissue Loss After Surgical Hindlimb Ischemia

    Science.gov (United States)

    Dokun, Ayotunde O.; Keum, Sehoon; Hazarika, Surovi; Li, Yongjun; Lamonte, Gregory M.; Wheeler, Ferrin; Marchuk, Douglas A.; Annex, Brian H.

    2010-01-01

    Background Peripheral arterial disease (PAD) caused by occlusive atherosclerosis of the lower extremity has 2 major clinical manifestations. Critical limb ischemia is characterized by rest pain and/or tissue loss and has a ≥40% risk of death and major amputation. Intermittent claudication causes pain on walking, has no tissue loss, and has amputation plus mortality rates of 2% to 4% per year. Progression from claudication to limb ischemia is infrequent. Risk factors in most PAD patients overlap. Thus, we hypothesized that genetic variations may be linked to presence or absence of tissue loss in PAD. Methods and Results Hindlimb ischemia (murine model of PAD) was induced in C57BL/6, BALB/c, C57BL/6×BALB/c (F1), F1×BALB/c (N2), A/J, and C57BL/6J-Chr7A/J/NaJ chromosome substitution strains. Mice were monitored for perfusion recovery and tissue necrosis. Genome-wide scanning with polymorphic markers across the 19 murine autosomes was performed on the N2 mice. Greater tissue loss and poorer perfusion recovery occurred in BALB/c than in the C57BL/6 strain. Analysis of 105 N2 progeny identified a single quantitative trait locus on chromosome 7 that exhibited significant linkage to both tissue necrosis and extent of perfusion recovery. Using the appropriate chromosome substitution strain, we demonstrate that C57BL/6-derived chromosome 7 is required for tissue preservation. Conclusions We have identified a quantitative trait locus on murine chromosome 7 (LSq-1) that is associated with the absence of tissue loss in a preclinical model of PAD and may be useful in identifying gene(s) that influence PAD in humans. PMID:18285563

  19. Linking an ecosystem model and a landscape model to study forest species response to climate warming

    Science.gov (United States)

    Hong S. He; David J. Mladenoff; Thomas R. Crow

    1999-01-01

No single model can address forest change from single tree to regional scales. We discuss a framework linking an ecosystem process model (LINKAGES) with a spatial landscape model (LANDIS) to examine forest species responses to climate warming for a large, heterogeneous landscape in northern Wisconsin, USA. Individual species response at the ecosystem scale was...

  20. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    Directory of Open Access Journals (Sweden)

    Mondry Adrian

    2004-08-01

Full Text Available Abstract Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods
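
The flavor of the approach can be shown with a classic excitable-media cellular automaton (Greenberg-Hastings) on a 2-D grid; the whole-heart model extends this idea to quantitative states, anatomical geometry, and OpenMP/MPI parallelism. Grid size, refractory period, and stimulus below are arbitrary.

```python
import numpy as np

# Greenberg-Hastings excitable cellular automaton on a 2-D grid.
# States: 0 = resting, 1 = excited, 2..R = refractory.
R, steps = 5, 60
grid = np.zeros((100, 100), dtype=int)
grid[50, 10:14] = 1                      # initial stimulus strip

for _ in range(steps):
    excited = grid == 1
    # count excited von Neumann neighbours via array rolls
    neighbours = sum(np.roll(excited, s, a) for s in (-1, 1) for a in (0, 1))
    nxt = np.where((grid >= 1) & (grid < R), grid + 1, 0)  # advance refractory
    nxt[(grid == 0) & (neighbours > 0)] = 1                # resting cells fire
    grid = nxt

print("cells currently excited:", int((grid == 1).sum()))
```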

  1. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting pa...

  2. Link between laboratory/field observations and models

    International Nuclear Information System (INIS)

    Cole, C.R.; Foley, M.G.

    1985-10-01

    The various linkages in system performance assessments that integrate disposal program elements must be understood. The linkage between model development and field/laboratory observations is described as the iterative program of site and system characterization for development of an observational-confirmatory data base to develop, improve, and support conceptual models for site and system behavior. The program consists of data gathering and experiments to demonstrate understanding at various spatial and time scales and degrees of complexity. Understanding and accounting for the decreasing characterization certainty that arises with increasing space and time scales is an important aspect of the link between models and observations. The performance allocation process for setting performance goals and confidence levels coupled with a performance assessment approach that provides these performance and confidence estimates will resolve when sufficient characterization has been achieved. At each iteration performance allocation goals are reviewed and revised as necessary. The updated data base and appropriate performance assessment tools and approaches are utilized to identify and design additional tests and data needs necessary to meet current performance allocation goals. 9 refs

  3. The link between laboratory/field observations and models

    International Nuclear Information System (INIS)

    Cole, C.R.; Foley, M.G.

    1986-01-01

    The various linkages in system performance assessments that integrate disposal program elements must be understood. The linkage between model development and field/laboratory observations is described as the iterative program of site and system characterization for development of an observational-confirmatory data base. This data base is designed to develop, improve, and support conceptual models for site and system behavior. The program consists of data gathering and experiments to demonstrate understanding at various spatial and time scales and degrees of complexity. Understanding and accounting for the decreasing characterization certainty that arises with increasing space and time scales is an important aspect of the link between models and observations. The performance allocation process for setting performance goals and confidence levels, coupled with a performance assessment approach that provides these performance and confidence estimates, will determine when sufficient characterization has been achieved. At each iteration, performance allocation goals are reviewed and revised as necessary. The updated data base and appropriate performance assessment tools and approaches are utilized to identify and design additional tests and data needs necessary to meet current performance allocation goals

  4. Linking Geomechanical Models with Observations of Microseismicity during CCS Operations

    Science.gov (United States)

    Verdon, J.; Kendall, J.; White, D.

    2012-12-01

During CO2 injection for the purposes of carbon capture and storage (CCS), injection-induced fracturing of the overburden represents a key risk to storage integrity. Fractures in a caprock provide a pathway along which buoyant CO2 can rise and escape the storage zone. Therefore the ability to link field-scale geomechanical models with field geophysical observations is of paramount importance to guarantee secure CO2 storage. Accurate location of microseismic events identifies where brittle failure has occurred on fracture planes. This is a manifestation of the deformation induced by CO2 injection. As the pore pressure is increased during injection, effective stress is decreased, leading to inflation of the reservoir and deformation of surrounding rocks, which creates microseismicity. The deformation induced by injection can be simulated using finite-element mechanical models. Such a model can be used to predict when and where microseismicity is expected to occur. However, typical elements in field-scale mechanical models have decameter scales, while the rupture size for a microseismic event is typically of the order of 1 square meter. This means that mapping modeled stress changes to predictions of microseismic activity can be challenging. Where larger-scale faults have been identified, they can be included explicitly in the geomechanical model. Where movement is simulated along these discrete features, it can be assumed that microseismicity will occur. However, microseismic events typically occur on fracture networks that are too small to be simulated explicitly in a field-scale model. Therefore, the likelihood of microseismicity occurring must be estimated within a finite element that does not contain explicitly modeled discontinuities. This can be done in a number of ways, including the utilization of measures such as the closeness of the stress state to predetermined failure criteria, either for planes with a defined orientation (the Mohr-Coulomb criteria) for
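
One of the measures mentioned, the closeness of a modelled stress state to a Mohr-Coulomb criterion on a plane of defined orientation, can be sketched in a few lines. Stress values, friction coefficient, and pore pressure below are illustrative (compression positive).

```python
import numpy as np

def coulomb_margin(sigma, normal, mu=0.6, cohesion=0.0):
    # Distance to Mohr-Coulomb failure on a plane with unit normal `normal`:
    # margin = cohesion + mu * sigma_n - tau; margin < 0 indicates slip.
    t = sigma @ normal                            # traction on the plane
    sn = float(normal @ t)                        # normal stress
    tau = float(np.linalg.norm(t - sn * normal))  # shear stress magnitude
    return cohesion + mu * sn - tau

sigma = np.diag([30.0, 20.0, 10.0])               # principal stresses, MPa
pore_pressure = 8.0                               # injection-elevated, MPa
sigma_eff = sigma - pore_pressure * np.eye(3)     # effective stress

# Fracture plane dipping 30 degrees from the x-axis (illustrative orientation)
n = np.array([np.cos(np.deg2rad(30)), 0.0, np.sin(np.deg2rad(30))])
print(f"Coulomb margin: {coulomb_margin(sigma_eff, n):.2f} MPa "
      "(negative would indicate slip, hence likely microseismicity)")
```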

  5. Implementation of a vibrationally linked chemical reaction model for DSMC

    Science.gov (United States)

    Carlson, A. B.; Bird, Graeme A.

    1994-01-01

    A new procedure closely linking dissociation and exchange reactions in air to the vibrational levels of the diatomic molecules has been implemented in both one- and two-dimensional versions of Direct Simulation Monte Carlo (DSMC) programs. The previous modeling of chemical reactions with DSMC was based on the continuum reaction rates for the various possible reactions. The new method is more closely related to the actual physics of dissociation and is more appropriate to the particle nature of DSMC. Two cases are presented: the relaxation to equilibrium of undissociated air initially at 10,000 K, and the axisymmetric calculation of shuttle forebody heating during reentry at 92.35 km and 7500 m/s. Although reaction rates are not used in determining the dissociations or exchange reactions, the new method produces rates which agree astonishingly well with the published rates derived from experiment. The results for gas properties and surface properties also agree well with the results produced by earlier DSMC models, equilibrium air calculations, and experiment.

  6. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    Science.gov (United States)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, the ability to quantitatively communicate this information is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphology, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
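
A minimal implementation of elliptic Fourier analysis (the Kuhl-Giardina formulation) for a closed outline; the harmonic coefficients it returns are the kind of multivariate shape descriptors that would feed CVA. For a shape-only analysis the coefficients would additionally be normalized for size and rotation, which this sketch omits; the test contour is an invented "crater rim".

```python
import numpy as np

def elliptic_fourier(contour, order=10):
    # Kuhl & Giardina (1982) elliptic Fourier coefficients of a closed
    # contour given as an (N, 2) array of x, y points.
    d = np.diff(np.vstack([contour, contour[:1]]), axis=0)   # segment steps
    dt = np.hypot(d[:, 0], d[:, 1])                          # segment lengths
    t = np.concatenate([[0.0], np.cumsum(dt)])               # arc parameter
    T = t[-1]                                                # total perimeter
    coeffs = np.empty((order, 4))
    for n in range(1, order + 1):
        c, s = np.cos(2 * np.pi * n * t / T), np.sin(2 * np.pi * n * t / T)
        k = T / (2 * n ** 2 * np.pi ** 2)
        coeffs[n - 1] = [
            k * np.sum(d[:, 0] / dt * np.diff(c)),   # a_n
            k * np.sum(d[:, 0] / dt * np.diff(s)),   # b_n
            k * np.sum(d[:, 1] / dt * np.diff(c)),   # c_n
            k * np.sum(d[:, 1] / dt * np.diff(s)),   # d_n
        ]
    return coeffs

# A slightly elliptical outline standing in for a digitized crater rim
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
rim = np.column_stack([1.3 * np.cos(theta), np.sin(theta)])
print(elliptic_fourier(rim, order=3).round(4))
```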

  7. Quantitative Model for Supply Chain Visibility: Process Capability Perspective

    Directory of Open Access Journals (Sweden)

    Youngsu Lee

    2016-01-01

Full Text Available Currently, the intensity of enterprise competition has increased as a result of a greater diversity of customer needs and the persistence of a long-term recession. The results of competition have become severe enough to determine the survival of a company. To survive global competition, each firm must focus on achieving innovation excellence and operational excellence as core competencies for sustainable competitive advantage. Supply chain management is now regarded as one of the most effective innovation initiatives to achieve operational excellence, and its importance has become ever more apparent. However, few companies effectively manage their supply chains, and the greatest difficulty is in achieving supply chain visibility. Many companies still suffer from a lack of visibility, and in spite of extensive research and the availability of modern technologies, the concepts and quantification methods to increase supply chain visibility are still ambiguous. Based on the extant research in supply chain visibility, this study proposes an extended visibility concept focusing on a process capability perspective and suggests a more quantitative model using the Z score of the Six Sigma methodology to evaluate and improve the level of supply chain visibility.
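
The process-capability reading of visibility suggests a Z score of the familiar Six Sigma form; a sketch under invented specification limits for a hypothetical "time until shipment events become visible" metric:

```python
import numpy as np

def visibility_z(measurements, lsl, usl):
    # Six Sigma style short-term capability: Z = distance from the mean to
    # the nearest specification limit, in standard deviations.
    m, s = np.mean(measurements), np.std(measurements, ddof=1)
    return min(usl - m, m - lsl) / s

# Hypothetical metric: hours until a shipment event is visible to the
# planner, with a lower spec of 0 h and an upper spec of 24 h.
rng = np.random.default_rng(5)
lead_times = rng.normal(10.0, 4.0, 500)
z = visibility_z(lead_times, lsl=0.0, usl=24.0)
print(f"visibility Z score: {z:.2f} (higher = more capable process)")
```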

  8. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten ePachur

    2013-09-01

Full Text Available This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called similarity. In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.

  9. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
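
For reference, the priority heuristic itself is simple to state for two-outcome gambles in the gain domain; the sketch below follows Brandstätter et al. (2006) but omits their rounding of aspiration levels to prominent numbers.

```python
def priority_heuristic(g1, g2):
    # Priority heuristic for simple two-outcome gambles in the gain domain.
    # Each gamble is a list of (outcome, probability) pairs.
    def min_out(g): return min(o for o, _ in g)
    def max_out(g): return max(o for o, _ in g)
    def p_min(g):   return [p for o, p in g if o == min_out(g)][0]

    aspiration = 0.1 * max(max_out(g1), max_out(g2))  # 1/10 of maximum gain
    # Reason 1: minimum gains
    if abs(min_out(g1) - min_out(g2)) >= aspiration:
        return g1 if min_out(g1) > min_out(g2) else g2
    # Reason 2: probabilities of the minimum gains (aspiration 0.1)
    if abs(p_min(g1) - p_min(g2)) >= 0.1:
        return g1 if p_min(g1) < p_min(g2) else g2
    # Reason 3: maximum gains
    return g1 if max_out(g1) > max_out(g2) else g2

A = [(2000, 0.60), (500, 0.40)]
B = [(2500, 0.40), (200, 0.60)]
print("choice:", priority_heuristic(A, B))  # stops at the first decisive reason
```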

  10. Non-constant link tension coefficient in the tumbling-snake model subjected to simple shear

    Science.gov (United States)

    Stephanou, Pavlos S.; Kröger, Martin

    2017-11-01

    The authors of the present study have recently presented evidence that the tumbling-snake model for polymeric systems has the necessary capacity to predict the appearance of pronounced undershoots in the time-dependent shear viscosity as well as an absence of equally pronounced undershoots in the transient two normal stress coefficients. The undershoots were found to appear due to the tumbling behavior of the director u when a rotational Brownian diffusion term is considered within the equation of motion of polymer segments, and a theoretical basis had been provided for the use of a link tension coefficient given through the nematic order parameter. The current work elaborates on the quantitative predictions of the tumbling-snake model to demonstrate its capacity to predict undershoots in the time-dependent shear viscosity. These predictions are shown to compare favorably with experimental rheological data for both polymer melts and solutions, help to clarify the microscopic origin of the observed phenomena, and demonstrate in detail why a constant link tension coefficient has to be abandoned.

  11. Mesoscale models for stacking faults, deformation twins and martensitic transformations: Linking atomistics to continuum

    Science.gov (United States)

    Kibey, Sandeep A.

    We present a hierarchical approach that spans multiple length scales to describe defect formation---in particular, formation of stacking faults (SFs) and deformation twins---in fcc crystals. We link the energy pathways (calculated here via ab initio density functional theory, DFT) associated with formation of stacking faults and twins to corresponding heterogeneous defect nucleation models (described through mesoscale dislocation mechanics). Through the generalized Peierls-Nabarro model, we first correlate the width of intrinsic SFs in fcc alloy systems to their nucleation pathways called generalized stacking fault energies (GSFE). We then establish a qualitative dependence of twinning tendency in fcc metals and alloys---specifically, in pure Cu and dilute Cu-xAl (x = 5.0 and 8.3 at.%)---on their twin-energy pathways called the generalized planar fault energies (GPFE). We also link the twinning behavior of Cu-Al alloys to their electronic structure by determining the effect of solute Al on the valence charge density redistribution at the SF through ab initio DFT. Further, while several efforts have been undertaken to incorporate twinning for predicting the stress-strain response of fcc materials, a fundamental law for critical twinning stress has not yet emerged. We resolve this long-standing issue by linking quantitatively the twin-energy pathways (GPFE) obtained via ab initio DFT to heterogeneous, dislocation-based twin nucleation models. We establish an analytical expression that quantitatively predicts the critical twinning stress in fcc metals in agreement with experiments without requiring any empiricism at any length scale. Our theory connects twinning stress to twin-energy pathways and predicts a monotonic relation between stress and unstable twin stacking fault energy, revealing the physics of twinning. We further demonstrate that the theory holds for fcc alloys as well. Our theory inherently accounts for directional nature of twinning which available

  12. Branching enzyme assay: selective quantitation of the alpha 1,6-linked glucosyl residues involved in the branching points.

    Science.gov (United States)

    Krisman, C R; Tolmasky, D S; Raffo, S

    1985-06-01

    Methods previously described for glycogen or amylopectin branching enzymatic activity are insufficiently sensitive and not quantitative. A new, more sensitive, specific, and quantitative method was developed. It is based upon the quantitation of the glucose residues joined by alpha 1,6 bonds introduced by varying amounts of branching enzyme. The procedure involved the synthesis of a polysaccharide from Glc-1-P and phosphorylase in the presence of the sample to be tested. The branched polysaccharide was then purified, and the glucoses involved in the branching points were quantitated after degradation with phosphorylase and debranching enzymes. This method appeared to be useful not only for enzymatic activity determinations but also in the study of the structure of alpha-D-glucans when combined with methods of total polysaccharide quantitation, such as the iodine and phenol-sulfuric acid assays.

  13. The link between perceived human resource management practices, engagement and employee behaviour : A moderated mediation model

    NARCIS (Netherlands)

    Alfes, K.; Shantz, A.D.; Truss, C.; Soane, E.C.

    2013-01-01

    This study contributes to our understanding of the mediating and moderating processes through which human resource management (HRM) practices are linked with behavioural outcomes. We developed and tested a moderated mediation model linking perceived HRM practices to organisational citizenship

  14. Toward linking demographic and economic models for impact assessment

    International Nuclear Information System (INIS)

    Williams, C.A.; Meenan, C.D.

    1991-01-01

    One of the objectives of the Yucca Mountain Project, in Southern Nevada, is to evaluate the effects of the development of a high-level nuclear waste repository. As described in the Section 175 Report to the Congress of the US, the temporal scope of this repository project encompasses approximately 70 years and includes four phases: site characterization and licensing, construction, operation, and closure and decommissioning. If retrieval of the waste were to be required, the temporal scope of the repository project could be extended to approximately 100 years. The study of the potential socioeconomic effects of this project is the foundation for this paper. This paper focuses on the economic and demographic aspects and a possible method to interface the two. First, the authors briefly discuss general socioeconomic modeling theory from a county-level viewpoint, as well as methods for the apportionment of county-level data to sub-county areas. Next, the authors describe the unique economic and demographic conditions which exist in Nevada at both the state and county levels. Finally, the authors evaluate a possible procedure for analyzing repository effects at a sub-county level; this involves discussion of an interface linking the economic and demographic aspects, which is based on the reconciliation of supply and demand for labor. The authors conclude that the basis for further model development may rely on the interaction of supply and demand to produce change in wage rates. These changes in expected wages should be a justification for allocating economic migrants (who may respond to Yucca Mountain Project development) into various communities

  15. Linking the M&Rfi Weather Generator with Agrometeorological Models

    Science.gov (United States)

    Dubrovsky, Martin; Trnka, Miroslav

    2015-04-01

    Realistic meteorological inputs (representing the present and/or future climates) for agrometeorological model simulations are often produced by stochastic weather generators (WGs). This contribution presents some methodological issues and results obtained in our recent experiments. We also address selected questions raised in the synopsis of this session. The input meteorological time series for our experiments are produced by the parametric single-site weather generator M&Rfi, which is calibrated from the available observational data (or interpolated from surrounding stations). To produce meteorological series representing the future climate, the WG parameters are modified by climate change scenarios, which are prepared by the pattern scaling method: the standardised scenarios derived from Global or Regional Climate Models are multiplied by the change in global mean temperature (ΔTG) determined by the simple climate model MAGICC. The presentation will address the following questions: (i) The dependence of the quality of the synthetic weather series and impact results on the WG settings. An emphasis will be put on the effect of conditioning the daily WG on a monthly WG (presently one of our hot topics), which aims to improve the reproduction of low-frequency weather variability. Comparison of results obtained with various WG settings is made in terms of climatic and agroclimatic indices (including extreme temperature and precipitation characteristics and drought indices). (ii) Our methodology accounts for the uncertainties coming from various sources. We will show how the climate change impact results are affected by 1. uncertainty in climate modelling, 2. uncertainty in ΔTG, and 3. uncertainty related to the complexity of the climate change scenario (focusing on the effect of inclusion of changes in variability into the climate change scenarios). Acknowledgements: This study was funded by project "Building up a multidisciplinary scientific

  16. Comprehensive Regional Modeling for Long-Range Planning: Linking Integrated Urban Models and Geographic Information Systems

    OpenAIRE

    Johnston, Robert; de la Barra, Thomas

    2000-01-01

    This study demonstrates the sequential linking of two types of models to permit the comprehensive evaluation of regional transportation and land use policies. First, we operate an integrated urban model (TRANUS), which represents both land and travel markets with zones and networks. The travel and land use projections from TRANUS are outlined, to demonstrate the general reasonableness of the results, as this is the first application of a market-based urban model in the US. Second, the land us...

  17. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data by minimizing production costs and following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system that is based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity-expansion-to-production-cost-model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally-defined ReEDS scenarios.

  18. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    Science.gov (United States)

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
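
    The core computation described here (integrating a quantity determined by the link function over the distribution of latent values) is easy to sketch. The Python below recovers the observed-scale population mean for a Poisson/log GLMM by numerical integration and checks it against the closed form exp(mu + var/2); it is a toy illustration of the idea, not the QGglmm implementation.

    ```python
    import numpy as np
    from scipy import stats, integrate

    def observed_scale_mean(mu, var, inv_link):
        """Mean on the observed (data) scale: integrate the inverse link
        over the latent normal distribution of latent values."""
        sd = np.sqrt(var)
        f = lambda l: inv_link(l) * stats.norm.pdf(l, mu, sd)
        mean, _ = integrate.quad(f, mu - 10 * sd, mu + 10 * sd)
        return mean

    mu, var = 1.0, 0.5   # latent-scale mean and variance (illustrative)
    print(observed_scale_mean(mu, var, np.exp))  # numerical integral
    print(np.exp(mu + var / 2))                  # Poisson/log closed form
    ```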

  19. A generalized quantitative antibody homeostasis model: maintenance of global antibody equilibrium by effector functions.

    Science.gov (United States)

    Prechl, József

    2017-11-01

    The homeostasis of antibodies can be characterized as a balance of production, target binding and receptor-mediated elimination, regulated by an interaction network that controls B-cell development and selection. Recently, we proposed a quantitative model to describe how the concentration and affinity of interacting partners generates a network. Here we argue that this physical, quantitative approach can be extended to the interpretation of effector functions of antibodies. We define global antibody equilibrium as the zone of molar equivalence of free antibody, free antigen and immune complex concentrations and of the dissociation constant of apparent affinity: [Ab] = [Ag] = [AbAg] = K_D. This zone corresponds to the biologically relevant K_D range of reversible interactions. We show that thermodynamic and kinetic properties of antibody-antigen interactions correlate with immunological functions. The formation of stable, long-lived immune complexes corresponds to a decrease of entropy and is a prerequisite for the generation of higher-order complexes. As the energy of formation of complexes increases, we observe a gradual shift from silent clearance to inflammatory reactions. These rules can also be applied to complement activation-related immune effector processes, linking the physicochemical principles of innate and adaptive humoral responses. The receptors mediating effector functions show a wide range of affinities, allowing the continuous sampling of antibody-bound antigen over the complete range of concentrations. The generation of multivalent, multicomponent complexes triggers effector functions by crosslinking these receptors on effector cells with increasing enzymatic degradation potential. Thus, antibody homeostasis is a thermodynamic system with complex network properties, nested into the host organism by proper immunoregulatory and effector pathways. Maintenance of global antibody equilibrium is achieved by innate qualitative signals modulating a
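
    The equilibrium zone above follows directly from mass action; as a brief check (standard chemistry, not taken from the abstract), setting all three concentrations equal to a common value c forces that value to equal K_D:

    ```latex
    \mathrm{Ab} + \mathrm{Ag} \rightleftharpoons \mathrm{AbAg},
    \qquad
    K_D = \frac{[\mathrm{Ab}]\,[\mathrm{Ag}]}{[\mathrm{AbAg}]}
    \quad\Longrightarrow\quad
    [\mathrm{Ab}]=[\mathrm{Ag}]=[\mathrm{AbAg}]=c
    \;\Rightarrow\;
    K_D = \frac{c \cdot c}{c} = c .
    ```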

  20. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    Science.gov (United States)

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
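
    The difference between the two calculation models reduces to which residues are counted as dye-binding sites. A toy Python sketch (with a made-up sequence, not one of the study's protein standards):

    ```python
    # M1 counts Arg + Lys as Coomassie-binding residues; M2 also counts His.
    def dye_binding_residues(seq, model="M2"):
        residues = {"M1": "RK", "M2": "RKH"}[model]
        return sum(seq.count(aa) for aa in residues)

    seq = "MKRHLLRASGGKKHAR"               # illustrative sequence only
    print(dye_binding_residues(seq, "M1"))  # 6 (Arg + Lys)
    print(dye_binding_residues(seq, "M2"))  # 8 (Arg + Lys + His)
    ```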

  1. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    ... enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  2. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy consumption. We employ the stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.

  3. A neural network model of semantic memory linking feature-based object representation and words.

    Science.gov (United States)

    Cuppini, C; Magosso, E; Ursino, M

    2009-06-01

    Recent theories in cognitive neuroscience suggest that semantic memory is a distributed process, which involves many cortical areas and is based on a multimodal representation of objects. The aim of this work is to extend a previous model of object representation to realize a semantic memory, in which sensory-motor representations of objects are linked with words. The model assumes that each object is described as a collection of features, coded in different cortical areas via a topological organization. Features in different objects are segmented via gamma-band synchronization of neural oscillators. The feature areas are further connected with a lexical area, devoted to the representation of words. Synapses among the feature areas, and among the lexical area and the feature areas are trained via a time-dependent Hebbian rule, during a period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from acoustic inputs), can correctly associate objects with words and segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with shared features. These results support the idea that semantic memory can be described as an integrated process, whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories, and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits).
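
    The learning mechanism described here, a time-dependent Hebbian rule binding feature areas to a lexical area, can be caricatured in a few lines. The sketch below is my own simplification, not the authors' oscillator network (no gamma-band synchronization or topological feature maps): it strengthens word-feature synapses under co-activation and lets unused synapses decay.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_feature, n_lexical = 20, 5
    # Each object is a fixed collection of features (binary prototype).
    prototypes = (rng.random((n_lexical, n_feature)) < 0.3).astype(float)
    W = np.zeros((n_lexical, n_feature))   # lexical <- feature synapses
    eta, decay = 0.05, 0.01                # learning rate, passive decay

    for _ in range(200):                   # paired object/word presentations
        obj = rng.integers(n_lexical)      # object shown with its word
        word = np.eye(n_lexical)[obj]      # one-hot lexical activity
        # Hebbian co-activation strengthens word-feature links; unused decay.
        W += eta * np.outer(word, prototypes[obj]) - decay * W

    # Retrieval: word evoked by a feature pattern (typically recovers object 2).
    print(np.argmax(W @ prototypes[2]))
    ```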

  4. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: broadening the matrix approach.

    NARCIS (Netherlands)

    Grootel, L. van; Wesel, F. van; O'Mara-Eves, A.; Thomas, J.; Hox, J.; Boeije, H.

    2017-01-01

    Background: This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the

  5. A novel animal model linking adiposity to altered circadian rhythms

    Science.gov (United States)

    Researchers have provided evidence for a link between obesity and altered circadian rhythms (e.g., shift work, disrupted sleep), but the mechanism for this association is still unknown. Adipocytes possess an intrinsic circadian clock, and circadian rhythms in adipocytokines and adipose tissue metab...

  6. Quantitation of pulmonary surfactant protein SP-B in the absence or presence of phospholipids by enzyme-linked immunosorbent assay

    DEFF Research Database (Denmark)

    Oviedo, J M; Valiño, F; Plasencia, I

    2001-01-01

    We have developed an enzyme-linked immunosorbent assay (ELISA) that uses polyclonal or monoclonal anti-surfactant protein SP-B antibodies to quantitate purified SP-B in chloroform/methanol and in chloroform/methanol extracts of whole pulmonary surfactant at nanogram levels. This method has been used to explore the effect of the presence of different phospholipids on the immunoreactivity of SP-B. Both polyclonal and monoclonal antibodies produced reproducible ELISA calibration curves for methanolic SP-B solutions with protein concentrations in the range of 20-1000 ng/mL. At these protein...
  7. A Linked Model for Simulating Stand Development and Growth Processes of Loblolly Pine

    Science.gov (United States)

    V. Clark Baldwin; Phillip M. Dougherty; Harold E. Burkhart

    1998-01-01

    Linking models of different scales (e.g., process, tree-stand-ecosystem) is essential for furthering our understanding of stand, climatic, and edaphic effects on tree growth and forest productivity. Moreover, linking existing models that differ in scale and levels of resolution quickly identifies knowledge gaps in information required to scale from one level to another...

  8. Decentralising Curriculum Reform: The Link Teacher Model of In-Service Training.

    Science.gov (United States)

    Wildy, Helen; And Others

    1996-01-01

    Presents a (Western Australian) case study of the link teacher model, a decentralized, "train the trainer" approach to inservice education. Discusses the model's perceived effectiveness, the link teachers' role, central authority support, and new experimentation opportunities. Combining centralized syllabus change with decentralized…

  9. A Review of Quantitative Situation Assessment Models for Nuclear Power Plant Operators

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Situation assessment is the process of developing situation awareness, and situation awareness is defined as 'the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.' Situation awareness is an important element influencing human actions because human decision making is based on the result of situation assessment or situation awareness. There are many models of situation awareness, and they can be categorized as qualitative or quantitative. As the effects of some input factors on situation awareness can be investigated through the quantitative models, the quantitative models are more useful than the qualitative models for the design of operator interfaces, automation strategies, training programs, and so on. This study presents a review of two quantitative models of situation assessment (SA) for nuclear power plant operators

  10. Quantitative N-linked Glycoproteomics of Myocardial Ischemia and Reperfusion Injury Reveals Early Remodeling in the Extracellular Environment

    DEFF Research Database (Denmark)

    Parker, Benjamin L; Palmisano, Giuseppe; Edwards, Alistair V G

    2011-01-01

    Extracellular and cell surface proteins are generally modified with N-linked glycans, and glycopeptide enrichment is an attractive tool to analyze these proteins. The role of N-linked glycoproteins in cardiovascular disease, particularly ischemia and reperfusion injury, is poorly understood. ... while dimethyl labeling confirmed 46 of these and revealed an additional 62 significant changes. These were mainly from predicted extracellular matrix and basement membrane proteins that are implicated in cardiac remodeling. Analysis of N-glycans released from myocardial proteins suggests that the observed changes were not due to significant alterations in N-glycan structures. Altered proteins included the collagen-laminin-integrin complexes and collagen assembly enzymes, cadherins, mast cell proteases, proliferation-associated secreted protein acidic and rich in cysteine, and microfibril...

  11. Improvement of the ID model for quantitative network data

    DEFF Research Database (Denmark)

    Sørensen, Peter Borgen; Damgaard, Christian Frølund; Dupont, Yoko Luise

    2015-01-01

    Many interactions are often poorly registered or even unobserved in empirical quantitative networks. Hence, the output of the statistical analyses may fail to differentiate between patterns that are statistical artefacts and those which are real characteristics of ecological networks [1]. This presentation will illustrate the application of the ID method based on a data set which consists of counts of visits by 152 pollinator species to 16 plant species. The method is based on two definitions of the underlying probabilities for each combination of pollinator and plant species: (1), pi... reproduce the high number of zero-valued cells in the data set and mimic the sampling distribution. [1] Sørensen et al., Journal of Pollination Ecology, 6(18), 2011, pp. 129-139

  12. A general mixture model for mapping quantitative trait loci by using molecular markers

    NARCIS (Netherlands)

    Jansen, R.C.

    1992-01-01

    In a segregating population a quantitative trait may be considered to follow a mixture of (normal) distributions, the mixing proportions being based on Mendelian segregation rules. A general and flexible mixture model is proposed for mapping quantitative trait loci (QTLs) by using molecular markers.
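
    To make the mixture idea concrete, here is a small Python sketch (illustrative values, not from the paper) of the likelihood for an F2 population, where a QTL genotype segregates QQ : Qq : qq in Mendelian proportions 1/4 : 1/2 : 1/4 and the trait is a mixture of three normals:

    ```python
    import numpy as np
    from scipy import stats

    props = np.array([0.25, 0.5, 0.25])    # Mendelian mixing proportions (F2)
    means = np.array([10.0, 12.0, 14.0])   # genotype means (hypothetical)
    sd = 1.5                               # common residual SD (hypothetical)

    def loglik(y):
        """Log-likelihood of trait values under the normal mixture."""
        dens = stats.norm.pdf(y[:, None], means[None, :], sd)  # n x 3
        return np.log(dens @ props).sum()

    y = np.array([9.8, 12.1, 13.7, 11.5, 12.9])
    print(loglik(y))
    ```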

  13. Quantitative proteomics links metabolic pathways to specific developmental stages of the plant-pathogenic oomycete Phytophthora capsici.

    Science.gov (United States)

    Pang, Zhili; Srivastava, Vaibhav; Liu, Xili; Bulone, Vincent

    2017-04-01

    The oomycete Phytophthora capsici is a plant pathogen responsible for important losses to vegetable production worldwide. Its asexual reproduction plays an important role in the rapid propagation and spread of the disease in the field. A global proteomics study was conducted to compare two key asexual life stages of P. capsici, i.e. the mycelium and cysts, to identify stage-specific biochemical processes. A total of 1200 proteins was identified using qualitative and quantitative proteomics. The transcript abundance of some of the enriched proteins was also analysed by quantitative real-time polymerase chain reaction. Seventy-three proteins exhibited different levels of abundance between the mycelium and cysts. The proteins enriched in the mycelium are mainly associated with glycolysis, the tricarboxylic acid (or citric acid) cycle and the pentose phosphate pathway, providing the energy required for the biosynthesis of cellular building blocks and hyphal growth. In contrast, the proteins that are predominant in cysts are essentially involved in fatty acid degradation, suggesting that the early infection stage of the pathogen relies primarily on fatty acid degradation for energy production. The data provide a better understanding of P. capsici biology and suggest potential metabolic targets at the two different developmental stages for disease control. © 2016 BSPP AND JOHN WILEY & SONS LTD.

  14. A two-locus model of spatially varying stabilizing or directional selection on a quantitative trait.

    Science.gov (United States)

    Geroldinger, Ludwig; Bürger, Reinhard

    2014-06-01

    The consequences of spatially varying, stabilizing or directional selection on a quantitative trait in a subdivided population are studied. A deterministic two-locus two-deme model is employed to explore the effects of migration, the degree of divergent selection, and the genetic architecture, i.e., the recombination rate and ratio of locus effects, on the maintenance of genetic variation. The possible equilibrium configurations are determined as functions of the migration rate. They depend crucially on the strength of divergent selection and the genetic architecture. The maximum migration rates are investigated below which a stable fully polymorphic equilibrium or a stable single-locus polymorphism can exist. Under stabilizing selection, but with different optima in the demes, strong recombination may facilitate the maintenance of polymorphism. Usually, however, and in particular with directional selection in opposite directions, the critical migration rates are maximized by a concentrated genetic architecture, i.e., by a major locus and a tightly linked minor one. Thus, complementing previous work on the evolution of genetic architectures in subdivided populations subject to diversifying selection, it is shown that concentrated architectures may aid the maintenance of polymorphism. Conditions are obtained when this is the case. Finally, the dependence of the phenotypic variance, linkage disequilibrium, and various measures of local adaptation and differentiation on the parameters is elaborated. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  15. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time.

  16. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  17. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  18. A quantitative evaluation of multiple biokinetic models using an assembled water phantom: A feasibility study.

    Directory of Open Access Journals (Sweden)

    Da-Ming Yeh

    Full Text Available This study examined the feasibility of quantitatively evaluating multiple biokinetic models and established the validity of the different compartment models using an assembled water phantom. Most commercialized phantoms are made to survey the imaging system, since this is essential to increase diagnostic accuracy for quality assurance. In contrast, few customized phantoms are specifically made to represent multi-compartment biokinetic models. This is because the complicated calculations required to solve the biokinetic models and the time-consuming verification of the obtained solutions have greatly impeded progress over the past decade. Nevertheless, in this work, five biokinetic models were separately defined by five groups of simultaneous differential equations to obtain the time-dependent radioactive concentration changes inside the water phantom. The water phantom was assembled from seven acrylic boxes in four different sizes, and the boxes were linked by varying combinations of hoses to signify the multiple biokinetic models from the biomedical perspective. The boxes that were connected by hoses were then regarded as a closed water loop with only one infusion and one drain. 129.1±24.2 MBq of Tc-99m labeled methylene diphosphonate (MDP) solution was thoroughly infused into the water boxes before gamma scanning; then the water was replaced with de-ionized water to simulate the biological removal rate among the boxes. The water was driven by an automatic infusion pump at 6.7 c.c./min, while the biological half-life of the four different-sized boxes (64, 144, 252, and 612 c.c.) was 4.8, 10.7, 18.8, and 45.5 min, respectively. The five models of derived time-dependent concentrations for the boxes were estimated either by a self-developed program run in MATLAB or by scanning via a gamma camera facility. Either agreement or disagreement between the practical scanning and the theoretical prediction in the five models was thoroughly discussed.
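
    For readers unfamiliar with compartment models, the kind of simultaneous differential equations described here can be sketched as follows. The Python below solves a two-box chain using the abstract's 6.7 c.c./min flow rate and two of its box volumes; the actual study used up to seven boxes and five different topologies (and MATLAB), so this is a deliberately reduced illustration.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    Q = 6.7                    # infusion/drain flow rate (c.c./min)
    V1, V2 = 64.0, 144.0       # box volumes from the abstract (c.c.)

    def dCdt(C, t):
        """Two-compartment chain: box 1 is flushed with clean water,
        box 2 receives box 1's outflow (simplified topology)."""
        c1, c2 = C
        return [-(Q / V1) * c1,
                (Q / V2) * (c1 - c2)]

    t = np.linspace(0, 60, 121)            # one hour, 0.5-min steps
    C = odeint(dCdt, [1.0, 1.0], t)        # normalized initial activity
    print(C[-1])                           # remaining relative activity
    ```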

  19. Linear regression models for quantitative assessment of left ...

    African Journals Online (AJOL)

    Changes in left ventricular structures and function have been reported in cardiomyopathies. No prediction models have been established in this environment. This study established regression models for prediction of left ventricular structures in normal subjects. A sample of normal subjects was drawn from a large urban ...

  20. Towards the quantitative evaluation of visual attention models.

    Science.gov (United States)

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient for quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Analysis of Water Conflicts across Natural and Societal Boundaries: Integration of Quantitative Modeling and Qualitative Reasoning

    Science.gov (United States)

    Gao, Y.; Balaram, P.; Islam, S.

    2009-12-01

    Water issues and problems have bewildered humankind for a long time, yet a systematic approach for understanding such issues remains elusive. This is partly because many water-related problems are framed from a contested terrain in which many actors (individuals, communities, businesses, NGOs, states, and countries) compete to protect their own and often conflicting interests. We argue that the origin of many water problems may be understood as a dynamic consequence of competition, interconnections, and feedback among variables in the Natural and Societal Systems (NSSs). Within the natural system, we recognize that triple constraints on water - water quantity (Q), water quality (P), and ecosystem (E) - and their interdependencies and feedback may lead to conflicts. Such inherent and multifaceted constraints of the natural water system are often exacerbated at the societal boundaries. Within the societal system, interdependencies and feedback among values and norms (V), economy (C), and governance (G) interact in various ways to create intractable contextual differences. The observation that natural and societal systems are linked is not novel. Our argument here, however, is that rigid disciplinary boundaries between these two domains will not produce solutions to the water problems we are facing today. The knowledge needed to address water problems needs to go beyond scientific assessment, in which societal variables (C, G, and V) are treated as exogenous or largely ignored, and policy research that does not consider the impact of natural variables (E, P, and Q) and the coupling among them. Consequently, traditional quantitative methods alone are not appropriate to address the dynamics of water conflicts, because we cannot quantify the societal variables and the exact mathematical relationships among the variables are not fully known. On the other hand, conventional qualitative study in the societal domain has mainly been in the form of individual case studies and therefore

  2. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior...

  3. A Tiered Model for Linking Students to the Community

    Science.gov (United States)

    Meyer, Laura Landry; Gerard, Jean M.; Sturm, Michael R.; Wooldridge, Deborah G.

    2016-01-01

    A tiered practice model (introductory, pre-internship, and internship) embedded in the curriculum facilitates community engagement and creates relevance for students as they pursue a professional identity in Human Development and Family Studies. The tiered model integrates high-impact teaching practices (HIP) and student engagement pedagogies…

  4. Modeling X-linked ancestral origins in multiparental populations

    NARCIS (Netherlands)

    Zheng, Chaozhi

    2015-01-01

    The models for the mosaic structure of an individual's genome from multiparental populations have been developed primarily for autosomes, whereas X chromosomes receive very little attention. In this paper, we extend our previous approach to model ancestral origin processes along two X chromosomes

  5. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    J. Earth Syst. Sci. (2017) 126: 33 ... climate change, glaciology and crop models in agriculture. ... In areas where local topography strongly influences precipitation ... (vii) cloud amount, (viii) cloud type and (ix) sunshine hours.
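
    Since the record above survives only in fragments, the following is a generic illustration of the titular technique rather than the paper's model: a two-state hidden Markov model (snow day vs. dry day) scored with the forward algorithm, using made-up parameters.

    ```python
    import numpy as np

    A = np.array([[0.7, 0.3],     # state transitions: snow -> {snow, dry}
                  [0.4, 0.6]])    #                    dry  -> {snow, dry}
    B = np.array([[0.1, 0.9],     # P(observation | state); obs 0 = clear,
                  [0.8, 0.2]])    # obs 1 = overcast
    pi = np.array([0.5, 0.5])     # initial state distribution

    def forward(obs):
        """Likelihood of an observation sequence via the forward algorithm."""
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha.sum()

    print(forward([1, 1, 0, 1]))  # likelihood of overcast-overcast-clear-overcast
    ```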

  6. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    Full Text Available In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex systems' properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity-theory-based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented, i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems' properties that are required to model transitions to sustainability.

  7. Strengthening the weak link: Built Environment modelling for loss analysis

    Science.gov (United States)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value is typically known. Exactly what that risk is in terms of detailed property descriptors including structure type and rebuild cost - and therefore its vulnerability to loss - is often omitted. This data deficiency is a primary source of variation between modelled losses and the actual claims value. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industry exposures. Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high resolution

  8. Modeling the video distribution link in the Next Generation Optical Access Networks

    DEFF Research Database (Denmark)

    Amaya, F.; Cárdenas, A.; Tafur Monroy, Idelfonso

    2011-01-01

    In this work we present a model for the design and optimization of the video distribution link in the next generation optical access network. We analyze the video distribution performance in a SCM-WDM link, including the noise, the distortion and the fiber optic nonlinearities. Additionally, we consider in the model the effect of distributed Raman amplification, used to extend the capacity and the reach of the optical link. In the model, we use the nonlinear Schrödinger equation with the purpose of obtaining capacity limitations and design constraints of the next generation optical access networks.

  9. Modeling the video distribution link in the Next Generation Optical Access Networks

    International Nuclear Information System (INIS)

    Amaya, F; Cardenas, A; Tafur, I

    2011-01-01

    In this work we present a model for the design and optimization of the video distribution link in the next generation optical access network. We analyze the video distribution performance in a SCM-WDM link, including the noise, the distortion and the fiber optic nonlinearities. Additionally, we consider in the model the effect of distributed Raman amplification, used to extend the capacity and the reach of the optical link. In the model, we use the nonlinear Schroedinger equation with the purpose of obtaining capacity limitations and design constraints of the next generation optical access networks.

  10. Comparison of Methods for Modeling a Hydraulic Loader Crane With Flexible Translational Links

    DEFF Research Database (Denmark)

    Pedersen, Henrik Clemmensen; Andersen, Torben O.; Nielsen, Brian K.

    2015-01-01

    not hold for translational links. Hence, special care has to be taken when including flexible translational links. In the current paper, different methods for modeling a hydraulic loader crane with a telescopic arm are investigated and compared using both the finite segment (FS) and assumed modes (AM) methods...

  11. Modeling and control of lateral vibration of an axially translating flexible link

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Heon Seop; Rhim, Sung Soo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-01-15

    Manipulators used for the transportation of large panel-shape payloads often adopt long and slender links (or forks) with translational joints to carry the payloads. As the size of the payload increases, the length of the links also increases to hold the payload securely. The increased length of the link inevitably amplifies the effect of the flexure in the link. Intuitively, the translational motion of the link in its longitudinal direction should have no effect on the lateral vibration of the link because of the orthogonality between the direction of the translational motion and the lateral vibration. If, however, the link were flexible and translated horizontally (perpendicular to the gravitational field), the asymmetric deflection of the link caused by gravity would break the orthogonality between the two directions, and the longitudinal motion of the link would excite lateral motion in the link. In this paper, the lateral oscillatory motion of the flexible link in a large-scale solar cell panel handling robot is investigated, where the links carry the panel in its longitudinal direction. The Newtonian approach in conjunction with the assumed modes method is used for derivation of the equation of motion for the flexible forks, where a non-zero control force is applied at the base of the link. The analysis illustrates the effect of longitudinal motion on the lateral vibration and the dynamic stiffening effect (variation of the natural frequency) of the link due to the translational velocity. Lateral vibration behavior is simulated using the derived equations of motion. A robust vibration control scheme, the input shaping filter technique, is implemented on the model and the effectiveness of the scheme is verified numerically.
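
    Input shaping, the control scheme named above, is a standard technique and easy to sketch: the command is convolved with a short impulse sequence timed so that the impulses' vibration contributions cancel. Below is a generic two-impulse zero-vibration (ZV) shaper in Python with illustrative frequency and damping values; the paper's exact filter design is not given in the abstract.

    ```python
    import numpy as np

    def zv_shaper(wn, zeta, dt):
        """Two-impulse zero-vibration (ZV) input shaper for a mode with
        natural frequency wn (rad/s) and damping ratio zeta."""
        wd = wn * np.sqrt(1.0 - zeta**2)          # damped frequency
        K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))
        amps = np.array([1.0, K]) / (1.0 + K)     # impulse amplitudes
        t2 = np.pi / wd                           # second impulse: half period
        shaper = np.zeros(int(round(t2 / dt)) + 1)
        shaper[0], shaper[-1] = amps
        return shaper

    # Shape a unit step command so it excites little residual vibration.
    dt = 0.001
    cmd = np.ones(2000)
    shaped = np.convolve(cmd, zv_shaper(wn=2 * np.pi * 1.5, zeta=0.05, dt=dt))
    ```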

  12. Modeling and control of lateral vibration of an axially translating flexible link

    International Nuclear Information System (INIS)

    Shin, Heon Seop; Rhim, Sung Soo

    2015-01-01

    Manipulators used for the transportation of large panel-shape payloads often adopt long and slender links (or forks) with translational joints to carry the payloads. As the size of the payload increases, the length of the links also increases to hold the payload securely. The increased length of the link inevitably amplifies the effect of the flexure in the link. Intuitively, the translational motion of the link in its longitudinal direction should have no effect on the lateral vibration of the link because of the orthogonality between the direction of the translational motion and the lateral vibration. If, however, the link were flexible and translated horizontally (perpendicular to the gravitational field), the asymmetric deflection of the link caused by gravity would break the orthogonality between the two directions, and the longitudinal motion of the link would excite lateral motion in the link. In this paper, the lateral oscillatory motion of the flexible link in a large-scale solar cell panel handling robot is investigated, where the links carry the panel in its longitudinal direction. The Newtonian approach in conjunction with the assumed modes method is used for derivation of the equation of motion for the flexible forks, where a non-zero control force is applied at the base of the link. The analysis illustrates the effect of longitudinal motion on the lateral vibration and the dynamic stiffening effect (variation of the natural frequency) of the link due to the translational velocity. Lateral vibration behavior is simulated using the derived equations of motion. A robust vibration control scheme, the input shaping filter technique, is implemented on the model and the effectiveness of the scheme is verified numerically.

  13. Linking Time and Space Scales in Distributed Hydrological Modelling - a case study for the VIC model

    Science.gov (United States)

    Melsen, Lieke; Teuling, Adriaan; Torfs, Paul; Zappa, Massimiliano; Mizukami, Naoki; Clark, Martyn; Uijlenhoet, Remko

    2015-04-01

    One of the famous paradoxes of the Greek philosopher Zeno of Elea (~450 BC) is the one with the arrow: If one shoots an arrow, and cuts its motion into such small time steps that at every step the arrow is standing still, the arrow is motionless, because a concatenation of non-moving parts does not create motion. Nowadays, this reasoning can be refuted easily, because we know that motion is a change in space over time, which thus by definition depends on both time and space. If one disregards time by cutting it into infinitely small steps, motion is also excluded. This example shows that time and space are linked and therefore hard to evaluate separately. As hydrologists we want to understand and predict the motion of water, which means we have to look both in space and in time. In hydrological models we can account for space by using spatially explicit models. With increasing computational power and increased data availability from e.g. satellites, it has become easier to apply models at a higher spatial resolution. Increasing the resolution of hydrological models is also labelled as one of the 'Grand Challenges' in hydrology by Wood et al. (2011) and Bierkens et al. (2014), who call for global modelling at hyperresolution (~1 km and smaller). A literature survey on 242 peer-reviewed articles in which the Variable Infiltration Capacity (VIC) model was used showed that the spatial resolution at which the model is applied has decreased over the past 17 years: from 0.5 to 2 degrees when the model was just developed, to 1/8 and even 1/32 degree nowadays. On the other hand, the literature survey showed that the time step at which the model is calibrated and/or validated remained the same over the last 17 years; mainly daily or monthly. Klemeš (1983) stresses the fact that space and time scales are connected, and therefore downscaling the spatial scale would also imply downscaling of the temporal scale. Is it worth the effort of downscaling your model from 1 degree to 1

  14. A robust quantitative near infrared modeling approach for blend monitoring.

    Science.gov (United States)

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing Near-Infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development approach (blender-scale method) were compared by simultaneously monitoring a 1 kg batch-size blend run. Both models demonstrated similar performance. The small-scale method significantly reduces the total resources expended to develop Near-Infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.
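
    The abstract does not name the chemometric method, but partial least squares (PLS) regression is the usual workhorse for NIR blend calibration, so a minimal, hedged sketch with synthetic spectra is given below; every variable here is invented for illustration.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Synthetic 'spectra': 40 samples x 200 wavelengths, with the analyte
    # signal placed at one wavelength purely for demonstration.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 200))
    y = 0.8 * X[:, 50] + rng.normal(scale=0.1, size=40)  # 'API concentration'

    pls = PLSRegression(n_components=3)
    pls.fit(X[:30], y[:30])                   # calibration set
    pred = pls.predict(X[30:]).ravel()        # validation set
    print(np.corrcoef(pred, y[30:])[0, 1])    # calibration quality check
    ```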

  15. Modeling and control of a hydraulically actuated flexible-prismatic link robot

    International Nuclear Information System (INIS)

    Love, L.; Kress, R.; Jansen, J.

    1996-12-01

    Most of the research related to flexible link manipulators to date has focused on single link, fixed length, single plane of vibration test beds. In addition, actuation has been predominantly based upon electromagnetic motors. Ironically, these elements are rarely found in the existing industrial long reach systems. This manuscript describes a new hydraulically actuated, long reach manipulator with a flexible prismatic link at Oak Ridge National Laboratory (ORNL). Focus is directed towards both modeling and control of hydraulic actuators as well as flexible links that have variable natural frequencies

  16. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Science.gov (United States)

    2012-07-17

    Notice of a public technical workshop, 'Use of Influenza Disease Models to Quantitatively Evaluate the Benefits and Risks of Vaccines,' on the use of influenza disease models to generate quantitative estimates of the benefits and risks of influenza vaccination.

  17. Identifying Quantitative Trait Loci (QTLs) and Developing Diagnostic Markers Linked to Orange Rust Resistance in Sugarcane (Saccharum spp.).

    Science.gov (United States)

    Yang, Xiping; Islam, Md S; Sood, Sushma; Maya, Stephanie; Hanson, Erik A; Comstock, Jack; Wang, Jianping

    2018-01-01

    Sugarcane (Saccharum spp.) is an important economic crop, contributing up to 80% of the table sugar used in the world, and has become a promising feedstock for biofuel production. Sugarcane production has been threatened by many diseases, and fungicide applications for disease control have been opted out of for sustainable agriculture. Orange rust is one of the major diseases impacting sugarcane production worldwide. Identifying quantitative trait loci (QTLs) and developing diagnostic markers are valuable for breeding programs to expedite the release of superior sugarcane cultivars for disease control. In this study, an F1 segregating population derived from a cross between two hybrid sugarcane clones, CP95-1039 and CP88-1762, was evaluated for orange rust resistance in replicated trials. Three QTLs controlling orange rust resistance in sugarcane (qORR109, qORR4 and qORR102) were identified for the first time, explaining 58, 12 and 8% of the phenotypic variation, respectively. We also characterized 1,574 sugarcane putative resistance (R) genes. These putative R genes and simple sequence repeats in the QTL intervals were further used to develop diagnostic markers for marker-assisted selection of orange rust resistance. A PCR-based resistance-gene-derived marker, G1, was developed, which showed significant association with orange rust resistance. The putative QTLs and the marker developed in this study can be effectively utilized in sugarcane breeding programs to facilitate the selection process, thus contributing to sustainable agriculture for orange rust disease control.

  18. Exploiting linkage disequilibrium in statistical modelling in quantitative genomics

    DEFF Research Database (Denmark)

    Wang, Lei

    Alleles at two loci are said to be in linkage disequilibrium (LD) when they are correlated or statistically dependent. Genomic prediction and gene mapping rely on the existence of LD between genetic markers and causal variants of complex traits. In the first part of the thesis, a novel method...... to quantify and visualize local variation in LD along chromosomes is described, and applied to characterize LD patterns at the local and genome-wide scale in three Danish pig breeds. In the second part, different ways of taking LD into account in genomic prediction models are studied. One approach is to use...... the recently proposed antedependence models, which treat neighbouring marker effects as correlated; another approach involves the use of haplotype block information derived using the program Beagle. The overall conclusion is that taking LD information into account in genomic prediction models potentially improves......

  19. The place of quantitative energy models in a prospective approach

    International Nuclear Information System (INIS)

    Taverdet-Popiolek, N.

    2009-01-01

    Futurology above all depends on having the right mindset. Gaston Berger summarizes the prospective approach in five main thrusts: prepare for the distant future, be open-minded (have a systems and multidisciplinary approach), carry out in-depth analyses (draw out the actors that are really determinant for the future, as well as established trends), take risks (imagine risky but flexible projects) and finally think about humanity, futurology being a technique at the service of man to help him build a desirable future. On the other hand, forecasting is based on quantified models so as to deduce 'conclusions' about the future. In the field of energy, models are used to draw up scenarios which allow, for instance, measuring the medium- or long-term effects of energy policies on greenhouse gas emissions or global welfare. Scenarios are shaped by the model's inputs (parameters, sets of assumptions) and outputs. Resorting to a model or projecting by scenario is useful in a prospective approach as it ensures coherence for most of the variables that have been identified through systems analysis and that the mind on its own has difficulty grasping. Interpretation of each scenario must be carried out in the light of the underlying framework of assumptions (the backdrop), developed during the prospective stage. When the horizon is far away (very long term), the worlds imagined by the futurologist contain breaks (technological, behavioural and organizational) which are hard to integrate into the models. It is here that the main limit on the use of models in futurology lies. (author)

  20. Improved Mental Acuity Forecasting with an Individualized Quantitative Sleep Model

    Directory of Open Access Journals (Sweden)

    Brent D. Winslow

    2017-04-01

    Full Text Available Sleep impairment significantly alters human brain structure and cognitive function, but available evidence suggests that adults in developed nations are sleeping less. A growing body of research has sought to use sleep to forecast cognitive performance by modeling the relationship between the two, but has generally focused on vigilance rather than other cognitive constructs affected by sleep, such as reaction time, executive function, and working memory. Previous modeling efforts have also utilized subjective, self-reported sleep durations and were restricted to laboratory environments. In the current effort, we addressed these limitations by employing wearable systems and mobile applications to gather objective sleep information, assess multi-construct cognitive performance, and model/predict changes to mental acuity. Thirty participants were recruited for participation in the study, which lasted 1 week. Using the Fitbit Charge HR and a mobile version of the automated neuropsychological assessment metric called CogGauge, we gathered a series of features and utilized the unified model of performance to predict mental acuity based on sleep records. Our results suggest that individuals poorly rate their sleep duration, supporting the need for objective sleep metrics to model circadian changes to mental acuity. Participant compliance in using the wearable throughout the week and responding to the CogGauge assessments was 80%. Specific biases were identified in temporal metrics across mobile devices and operating systems and were excluded from the mental acuity metric development. Individualized prediction of mental acuity consistently outperformed group modeling. This effort indicates the feasibility of creating an individualized, mobile assessment and prediction of mental acuity, compatible with the majority of current mobile devices.

  1. First principles pharmacokinetic modeling: A quantitative study on Cyclosporin

    DEFF Research Database (Denmark)

    Mošat', Andrej; Lueshen, Eric; Heitzig, Martina

    2013-01-01

    renal and hepatic clearances, elimination half-life, and mass transfer coefficients, to establish drug biodistribution dynamics in all organs and tissues. This multi-scale model satisfies first principles and conservation of mass, species and momentum.Prediction of organ drug bioaccumulation...... as a function of cardiac output, physiology, pathology or administration route may be possible with the proposed PBPK framework. Successful application of our model-based drug development method may lead to more efficient preclinical trials, accelerated knowledge gain from animal experiments, and shortened time-to-market...

  2. On the usability of quantitative modelling in operations strategy decision making

    NARCIS (Netherlands)

    Akkermans, H.A.; Bertrand, J.W.M.

    1997-01-01

    Quantitative modelling seems admirably suited to help managers in their strategic decision making on operations management issues, but in practice models are rarely used for this purpose. Investigates the reasons why, based on a detailed cross-case analysis of six cases of modelling-supported

  3. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based studies and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  5. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  7. Linking Essential Tremor to the Cerebellum-Animal Model Evidence.

    Science.gov (United States)

    Handforth, Adrian

    2016-06-01

    In this review, we hope to stimulate interest in animal models as opportunities to understand tremor mechanisms within the cerebellar system. We begin by considering the harmaline model of essential tremor (ET), which has ET-like anatomy and pharmacology. Harmaline induces the inferior olive (IO) to burst fire rhythmically, recruiting rhythmic activity in Purkinje cells (PCs) and deep cerebellar nuclei (DCN). This model has fostered the IO hypothesis of ET, which postulates that factors that promote excess IO, and hence PC complex spike synchrony, also promote tremor. In contrast, the PC hypothesis postulates that partial PC cell loss underlies tremor of ET. We describe models in which chronic partial PC loss is associated with tremor, such as the Weaver mouse, and others with PC loss that do not show tremor, such as the Purkinje cell degeneration mouse. We postulate that partial PC loss with tremor is associated with terminal axonal sprouting. We then discuss tremor that occurs with large lesions of the cerebellum in primates. This tremor has variable frequency and is an ataxic tremor not related to ET. Another tremor type that is not likely related to ET is tremor in mice with mutations that cause prolonged synaptic GABA action. This tremor is probably due to mistiming within cerebellar circuitry. In the final section, we catalog tremor models involving neurotransmitter and ion channel perturbations. Some appear to be related to the IO hypothesis of ET, while in others tremor may be ataxic or due to mistiming. In summary, we offer a tentative framework for classifying animal action tremor, such that various models may be considered potentially relevant to ET, subscribing to IO or PC hypotheses, or not likely relevant, as with mistiming or ataxic tremor. Considerable further research is needed to elucidate the mechanisms of tremor in animal models.

  8. Quantitative experimental modelling of fragmentation during explosive volcanism

    Science.gov (United States)

    Thordén Haug, Ø.; Galland, O.; Gisler, G.

    2012-04-01

    Phreatomagmatic eruptions result from the violent interaction between magma and an external source of water, such as groundwater or a lake. This interaction causes fragmentation of the magma and/or the host rock, resulting in coarse-grained (lapilli) to very fine-grained (ash) material. The products of phreatomagmatic explosions are classically described by their fragment size distribution, which commonly follows power laws of exponent D. Such a descriptive approach, however, considers the final products only and does not provide information on the dynamics of fragmentation. The aim of this contribution is thus to address the following fundamental questions. What physics governs fragmentation processes? How does fragmentation occur through time? What mechanisms produce power-law fragment size distributions? And what scaling laws control the exponent D? To address these questions, we performed a quantitative experimental study. The setup consists of a Hele-Shaw cell filled with a layer of cohesive silica flour, at the base of which a pulse of pressurized air is injected, leading to fragmentation of the layer of flour. The fragmentation process is monitored through time using a high-speed camera. By varying systematically the air pressure (P) and the thickness of the flour layer (h), we observed two morphologies of fragmentation: "lift off", where the silica flour above the injection inlet is ejected upwards, and "channeling", where the air pierces through the layer along a sub-vertical conduit. By building a phase diagram, we show that the morphology is controlled by P/dgh, where d is the density of the flour and g is the gravitational acceleration. To quantify the fragmentation process, we developed a Matlab image analysis program, which calculates the number and sizes of the fragments, and so the fragment size distribution, during the experiments. The fragment size distributions are in general described by power law distributions of
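
    Since this record centres on power-law fragment size distributions of exponent D, a generic way to estimate D from measured fragment sizes is a log-log fit of the complementary cumulative count; the sketch below illustrates that idea on synthetic data and is not the authors' Matlab program.

```python
# Generic sketch: estimate the power-law exponent D from fragment sizes via
# a log-log fit of the complementary cumulative count N(>s) ~ s**(-D).
import numpy as np

def powerlaw_exponent(sizes):
    s = np.sort(np.asarray(sizes))
    n_greater = np.arange(len(s), 0, -1)   # fragments at least as large as s
    slope, _ = np.polyfit(np.log(s), np.log(n_greater), 1)
    return -slope

# Synthetic fragments drawn from a Pareto law with true exponent D = 1.5.
rng = np.random.default_rng(1)
sizes = (1.0 - rng.uniform(size=5000)) ** (-1.0 / 1.5)
print("estimated D:", round(powerlaw_exponent(sizes), 2))
```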

  9. Quantitative variation in obesity-related traits and insulin precursors linked to the OB gene region on human chromosome 7

    Energy Technology Data Exchange (ETDEWEB)

    Duggirala, R.; Stern, M.P.; Reinhart, L.J. [Univ. of Texas Health Science Center, San Antonio, TX (United States)] [and others]

    1996-09-01

    Despite the evidence that human obesity has strong genetic determinants, efforts at identifying specific genes that influence human obesity have largely been unsuccessful. Using the sibship data obtained from 32 low-income Mexican American pedigrees ascertained on a type II diabetic proband and a multipoint variance-components method, we tested for linkage between various obesity-related traits plus associated metabolic traits and 15 markers on human chromosome 7. We found evidence for linkage between markers in the OB gene region and various traits, as follows: D7S514 and extremity skinfolds (LOD = 3.1), human carboxypeptidase A1 (HCPA1) and 32,33-split proinsulin level (LOD = 4.2), and HCPA1 and proinsulin level (LOD = 3.2). A putative susceptibility locus linked to the marker D7S514 explained 56% of the total phenotypic variation in extremity skinfolds. Variation at the HCPA1 locus explained 64% of phenotypic variation in proinsulin level and approximately 73% of phenotypic variation in split proinsulin concentration, respectively. Weaker evidence for linkage to several other obesity-related traits (e.g., waist circumference, body-mass index, fat mass by bioimpedance, etc.) was observed for a genetic location which is approximately 15 cM telomeric to OB. In conclusion, our study reveals that the OB region plays a significant role in determining the phenotypic variation of both insulin precursors and obesity-related traits, at least in Mexican Americans. 66 refs., 3 figs., 4 tabs.

  10. Quantitative analysis of intramolecular exciplex and electron transfer in a double-linked zinc porphyrin-fullerene dyad.

    Science.gov (United States)

    Al-Subi, Ali Hanoon; Niemi, Marja; Tkachenko, Nikolai V; Lemmetyinen, Helge

    2012-10-04

    Photoinduced charge transfer in a double-linked zinc porphyrin-fullerene dyad is studied. When the dyad is excited at the absorption band of the charge-transfer complex (780 nm), an intramolecular exciplex is formed, followed by the complete charge-separated (CCS) state. By analyzing the results obtained from time-resolved transient absorption and emission decay measurements in a range of solvents with different polarities, we derived a dependence between the observable lifetimes and internal parameters controlling the reaction rate constants based on the semiquantum Marcus electron-transfer theory. The critical value of the solvent polarity was found to be ε_r ≈ 6.5: in solvents with higher dielectric constants, the energy of the CCS state is lower than that of the exciplex and the relaxation takes place via the CCS state predominantly, whereas in solvents with lower polarities the energy of the CCS state is higher and the exciplex relaxes directly to the ground state. In solvents with moderate polarities the exciplex and the CCS state are in equilibrium and cannot be separated spectroscopically. The degree of the charge shift in the exciplex relative to that in the CCS state was estimated to be 0.55 ± 0.02. The electronic coupling matrix elements for the charge recombination process and for the direct relaxation of the exciplex to the ground state were found to be 0.012 ± 0.001 and 0.245 ± 0.022 eV, respectively.
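
    For orientation, the classical high-temperature limit of the Marcus rate expression underlying this analysis reads as below; the semiquantum form used in the study additionally treats high-frequency intramolecular modes quantum mechanically, so this is context rather than the paper's exact working equation.

```latex
k_{\mathrm{ET}} \;=\; \frac{2\pi}{\hbar}\,\lvert V\rvert^{2}\,
\frac{1}{\sqrt{4\pi\lambda k_{\mathrm{B}}T}}\,
\exp\!\left[-\frac{\left(\Delta G^{\circ}+\lambda\right)^{2}}{4\lambda k_{\mathrm{B}}T}\right]
```

    Here V is the electronic coupling matrix element, λ the reorganization energy, and ΔG° the driving force of the electron transfer.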

  11. Quantitative Comparison Between Crowd Models for Evacuation Planning and Evaluation

    NARCIS (Netherlands)

    Viswanathan, V.; Lee, C.E.; Lees, M.H.; Cheong, S.A.; Sloot, P.M.A.

    2014-01-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we

  12. Quantitative Research: A Dispute Resolution Model for FTC Advertising Regulation.

    Science.gov (United States)

    Richards, Jef I.; Preston, Ivan L.

    Noting the lack of a dispute mechanism for determining whether an advertising practice is truly deceptive without generating the costs and negative publicity produced by traditional Federal Trade Commission (FTC) procedures, this paper proposes a model based upon early termination of the issues through jointly commissioned behavioral research. The…

  13. Characterizing Cognitive Aging in Humans with Links to Animal Models

    Directory of Open Access Journals (Sweden)

    Gene E Alexander

    2012-09-01

    Full Text Available With the population of older adults expected to grow rapidly over the next two decades, it has become increasingly important to advance research efforts to elucidate the mechanisms associated with cognitive aging, with the ultimate goal of developing effective interventions and prevention therapies. Although there has been a vast research literature on the use of cognitive tests to evaluate the effects of aging and age-related neurodegenerative disease, the need for a set of standardized measures to characterize the cognitive profiles specific to healthy aging has been widely recognized. Here we present a review of selected methods and approaches that have been applied in human research studies to evaluate the effects of aging on cognition, including executive function, memory, processing speed, language, and visuospatial function. The effects of healthy aging on each of these cognitive domains are discussed with examples from cognitive/experimental and clinical/neuropsychological approaches. Further, we consider those measures that have clear conceptual and methodological links to tasks currently in use for non-human animal studies of aging, as well as those that have the potential for translation to animal aging research. Having a complementary set of measures to assess the cognitive profiles of healthy aging across species provides a unique opportunity to enhance research efforts for cross-sectional, longitudinal, and intervention studies of cognitive aging. Taking a cross-species, translational approach will help to advance cognitive aging research, leading to a greater understanding of associated neurobiological mechanisms with the potential for developing effective interventions and prevention therapies for age-related cognitive decline.

  14. New ways of supporting decision making: Linking qualitative storylines with quantitative modelling

    NARCIS (Netherlands)

    Delden, van H.; Hagen - Zanker, A.H.; Geertman, S; Stillwell, J

    2009-01-01

    To explore how people will live and work in Europe, what the landscape will look like and what the environmental consequences will be in some 35 years from now, the PRELUDE project (EEA 2007) of the European Environment Agency developed five different land-use scenarios for Europe. The project was

  15. Linking plate reconstructions with deforming lithosphere to geodynamic models

    Science.gov (United States)

    Müller, R. D.; Gurnis, M.; Flament, N.; Seton, M.; Spasojevic, S.; Williams, S.; Zahirovic, S.

    2011-12-01

    While global computational models are rapidly advancing in terms of their capabilities, there is an increasing need for assimilating observations into these models and/or ground-truthing model outputs. The open-source and platform independent GPlates software fills this gap. It was originally conceived as a tool to interactively visualize and manipulate classical rigid plate reconstructions and represent them as time-dependent topological networks of editable plate boundaries. The user can export time-dependent plate velocity meshes that can be used either to define initial surface boundary conditions for geodynamic models or alternatively impose plate motions throughout a geodynamic model run. However, tectonic plates are not rigid, and neglecting plate deformation, especially that of the edges of overriding plates, can result in significant misplacing of plate boundaries through time. A new, substantially re-engineered version of GPlates is now being developed that allows an embedding of deforming plates into topological plate boundary networks. We use geophysical and geological data to define the limit between rigid and deforming areas, and the deformation history of non-rigid blocks. The velocity field predicted by these reconstructions can then be used as a time-dependent surface boundary condition in regional or global 3-D geodynamic models, or alternatively as an initial boundary condition for a particular plate configuration at a given time. For time-dependent models with imposed plate motions (e.g. using CitcomS) we incorporate the continental lithosphere by embedding compositionally distinct crust and continental lithosphere within the thermal lithosphere. We define three isostatic columns of different thickness and buoyancy based on the tectonothermal age of the continents: Archean, Proterozoic and Phanerozoic. In the fourth isostatic column, the oceans, the thickness of the thermal lithosphere is assimilated using a half-space cooling model. We also
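
    The half-space cooling model mentioned for the oceanic lithosphere has the standard form below; defining the base of the thermal lithosphere by an isotherm T_L gives a thickness growing as the square root of plate age (a textbook relation, included here for context).

```latex
T(z,t) \;=\; T_{s} + \left(T_{m}-T_{s}\right)\,
\operatorname{erf}\!\left(\frac{z}{2\sqrt{\kappa t}}\right),
\qquad
z_{L}(t) \;=\; 2\,\operatorname{erf}^{-1}\!\left(\frac{T_{L}-T_{s}}{T_{m}-T_{s}}\right)\sqrt{\kappa t}
```

    with T_s the surface temperature, T_m the mantle temperature, κ the thermal diffusivity, and z depth.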

  16. Linking Aerosol Optical Properties Between Laboratory, Field, and Model Studies

    Science.gov (United States)

    Murphy, S. M.; Pokhrel, R. P.; Foster, K. A.; Brown, H.; Liu, X.

    2017-12-01

    The optical properties of aerosol emissions from biomass burning have a significant impact on the Earth's radiative balance. Based on measurements made during the Fourth Fire Lab in Missoula Experiment, our group published a series of parameterizations that related optical properties (single scattering albedo and absorption due to brown carbon at multiple wavelengths) to the elemental to total carbon ratio of aerosols emitted from biomass burning. In this presentation, the ability of these parameterizations to simulate the optical properties of ambient aerosol is assessed using observations collected in 2017 from our mobile laboratory chasing wildfires in the Western United States. The ambient data includes measurements of multi-wavelength absorption, scattering, and extinction, size distribution, chemical composition, and volatility. In addition to testing the laboratory parameterizations, this combination of measurements allows us to assess the ability of core-shell Mie Theory to replicate observations and to assess the impact of brown carbon and mixing state on optical properties. Finally, both laboratory and ambient data are compared to the optical properties generated by a prominent climate model (Community Earth System Model (CESM) coupled with the Community Atmosphere Model (CAM 5)). The discrepancies between lab observations, ambient observations and model output will be discussed.

  17. First attempts of linking modelling, Postharvest behaviour and Melon Genetics

    NARCIS (Netherlands)

    Tijskens, L.M.M.; Santos, Don N.; Obando-Ulloa, J.M.; Moreno, E.; Schouten, R.E.

    2008-01-01

    The onset of climacteric is associated with the end of melon fruit shelf-life. The aim of this research was to develop practical and applicable models of fruit ripening changes (hardness, moisture loss) also able to discriminate between climacteric and non-climacteric behaviour. The decrease in

  18. Quantitative properties of clustering within modern microscopic nuclear models

    International Nuclear Information System (INIS)

    Volya, A.; Tchuvil’sky, Yu. M.

    2016-01-01

    A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question extends substantially the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.

  19. Quantitative Risk Modeling of Fire on the International Space Station

    Science.gov (United States)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  20. Demonstration of Linked UAV Observations and Atmospheric Model Predictions in Chem/Bio Attack Response

    National Research Council Canada - National Science Library

    Davidson, Kenneth

    2003-01-01

    ... meteorological data, and the means for linking the UAV data to real-time dispersion prediction. The primary modeling effort focused on an adaptation of the 'Wind On Constant Streamline Surfaces...

  1. Linking the Pilot Structural Model and Pilot Workload

    Science.gov (United States)

    Bachelder, Edward; Hess, Ronald; Aponso, Bimal; Godfroy-Cooper, Martine

    2018-01-01

    Behavioral models are developed that closely reproduce the pulsive control responses of two pilots using markedly different control techniques while conducting a tracking task. An intriguing finding was that the pilots appeared to: 1) produce a continuous, internally-generated stick signal that they integrated in time; 2) integrate the actual stick position; and 3) compare the two integrations to either issue or cease a pulse command. This suggests that the pilots utilized kinesthetic feedback in order to sense and integrate stick position, supporting the hypothesis that pilots can access and employ the proprioceptive inner feedback loop proposed by Hess's pilot Structural Model. A Pilot Cost Index was developed, whose elements include estimated workload, performance, and the degree to which the pilot employs kinesthetic feedback. Preliminary results suggest that a pilot's operating point (parameter values) may be based on control style and index minimization.

  2. An integrative model linking feedback environment and organizational citizenship behavior.

    Science.gov (United States)

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  3. Links between fluid mechanics and quantum mechanics: a model for information in economics?

    Science.gov (United States)

    Haven, Emmanuel

    2016-05-28

    This paper tallies the links between fluid mechanics and quantum mechanics, and attempts to show whether those links can aid in beginning to build a formal template which is usable in economics models where time is (a)symmetric and memory is absent or present. An objective of this paper is to contemplate whether those formalisms can allow us to model information in economics in a novel way. © 2016 The Author(s).

  4. Interleukin-1 may link helplessness-hopelessness with cancer progression: A proposed model

    OpenAIRE

    Argaman, M; Gidron, Y; Ariad, S

    2005-01-01

    A model of the relations between psychological factors and cancer progression should include brain and systemic components and their link with critical cellular stages in cancer progression. We present a psychoneuroimmunological (PNI) model that links helplessness-hopelessness (HH) with cancer progression via interleukin-1β (IL-1β). IL-1β was elevated in the brain following exposure to inescapable shock, and HH was minimized by antagonizing cerebral IL-1β. Elevated cerebral IL-1β increased ca...

  5. Calibration of Linked Hydrodynamic and Water Quality Model for Santa Margarita Lagoon

    Science.gov (United States)

    2016-07-01

    was used to drive the transport and water quality kinetics for the simulation of 2007–2009. The sand berm, which controlled the opening/closure of... Technical Report 3015, July 2016: Calibration of Linked Hydrodynamic and Water Quality Model for Santa Margarita Lagoon, Final Report. Pei-Fang Wang, Chuck Katz, Ripan Barua, SSC Pacific.

  6. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
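
    The quantitative half of the framework optimises kinetic rates with simulated annealing; a minimal generic annealing loop is sketched below, with a hypothetical placeholder objective standing in for the distance between simulated and target behaviours.

```python
# Minimal simulated-annealing sketch for tuning kinetic rates. 'model_error'
# is a hypothetical placeholder objective, not the paper's fitness function.
import math, random

def model_error(rates):
    target = [0.8, 0.3, 1.2]              # assumed 'true' rates (placeholder)
    return sum((r - t) ** 2 for r, t in zip(rates, target))

def anneal(rates, t0=1.0, cooling=0.995, steps=5000):
    cur, cur_e = list(rates), model_error(rates)
    best, best_e = list(cur), cur_e
    temp = t0
    for _ in range(steps):
        cand = [max(1e-6, r + random.gauss(0.0, 0.05)) for r in cur]
        e = model_error(cand)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if e < cur_e or random.random() < math.exp((cur_e - e) / temp):
            cur, cur_e = cand, e
            if e < best_e:
                best, best_e = list(cand), e
        temp *= cooling
    return best, best_e

print(anneal([0.1, 0.1, 0.1]))
```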

  7. Quantitative modeling of selective lysosomal targeting for drug design

    DEFF Research Database (Denmark)

    Trapp, Stefan; Rosania, G.; Horobin, R.W.

    2008-01-01

    log Kow. These findings were validated with experimental results and by a comparison to the properties of antimalarial drugs in clinical use. For ten active compounds, nine were predicted to accumulate to a greater extent in lysosomes than in other organelles, six of these were in the optimum range...... predicted by the model and three were close. Five of the antimalarial drugs were lipophilic weak dibasic compounds. The predicted optimum properties for a selective accumulation of weak bivalent bases in lysosomes are consistent with experimental values and are more accurate than any prior calculation...

  8. Qualitative to quantitative : linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India

    NARCIS (Netherlands)

    Bailey, Ajay; Hutter, Inge

    2008-01-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million people globally, the epidemic has posed academics the challenge of identifying behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is

  9. Modeling Prairie Pothole Lakes: Linking Satellite Observation and Calibration (Invited)

    Science.gov (United States)

    Schwartz, F. W.; Liu, G.; Zhang, B.; Yu, Z.

    2009-12-01

    This paper examines the response of a complex lake wetland system to variations in climate. The focus is on the lakes and wetlands of the Missouri Coteau, which is part of the larger Prairie Pothole Region of the Central Plains of North America. Information on lake size was enumerated from satellite images, and yielded power law relationships for different hydrological conditions. More traditional lake-stage data were made available to us from the USGS Cottonwood Lake Study Site in North Dakota. A Probabilistic Hydrologic Model (PHM) was developed to simulate lake complexes comprised of tens-of-thousands or more individual closed-basin lakes and wetlands. What is new about this model is a calibration scheme that utilizes remotely-sensed data on lake area as well as stage data for individual lakes. Some ¼ million individual data points are used within a Genetic Algorithm to calibrate the model by comparing the simulated results with observed lake area-frequency power law relationships derived from Landsat images and water depths from seven individual lakes and wetlands. The simulated lake behaviors show good agreement with the observations under average, dry, and wet climatic conditions. The calibrated model is used to examine the impact of climate variability on a large lake complex in ND, in particular the “Dust Bowl Drought” of the 1930s. This most famous drought of the 20th Century devastated the agricultural economy of the Great Plains with health and social impacts lingering for years afterwards. Interestingly, the drought of the 1930s is unremarkable in relation to others of greater intensity and frequency before AD 1200 in the Great Plains. Major droughts and deluges have the ability to create marked variability of the power law function (e.g. up to one and a half orders of magnitude variability from the extreme Dust Bowl Drought to the extreme 1993-2001 deluge). This new probabilistic modeling approach provides a novel tool to examine the response of the

  10. Linking seasonal climate forecasts with crop models in Iberian Peninsula

    Science.gov (United States)

    Capa, Mirian; Ines, Amor; Baethgen, Walter; Rodriguez-Fonseca, Belen; Han, Eunjin; Ruiz-Ramos, Margarita

    2015-04-01

    Translating seasonal climate forecasts into agricultural production forecasts could help to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse conditions. In this study, we use seasonal rainfall forecasts and crop models to improve the predictability of wheat yield in the Iberian Peninsula (IP). Additionally, we estimate economic margins and production risks associated with extreme scenarios of seasonal rainfall forecast. This study evaluates two methods for disaggregating seasonal climate forecasts into daily weather data: 1) a stochastic weather generator (CondWG), and 2) a forecast tercile resampler (FResampler). Both methods were used to generate 100 (with FResampler) and 110 (with CondWG) weather series/sequences for three scenarios of seasonal rainfall forecasts. Simulated wheat yield is computed with the crop model CERES-wheat (Ritchie and Otter, 1985), which is included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at two locations in northeastern Spain, where the crop model was calibrated and validated with independent field data. Once simulated yields were obtained, an assessment of farmers' gross margins for different seasonal climate forecasts was carried out to estimate production risks under different climate scenarios. This methodology allows farmers to assess the benefits and risks of a seasonal weather forecast in the IP prior to the crop growing season. The results of this study may have important implications for both the public (agricultural planning) and private (decision support to farmers, insurance companies) sectors. Acknowledgements Research by M. Capa-Morocho has been partly supported by a PICATA predoctoral fellowship of the Moncloa Campus of International Excellence (UCM-UPM) and the MULCLIVAR project (CGL2012-38923-C02-02). References Hoogenboom, G. et al., 2010. The Decision
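
    One plausible minimal reading of the "forecast tercile resampler" (FResampler) is sketched below: historical seasons are binned into rainfall terciles, then whole seasons are redrawn with the forecast tercile probabilities. This is an illustration of the idea, not the actual FResampler code, and all numbers are synthetic.

```python
# Hedged sketch of tercile-based resampling of historical seasons.
import numpy as np

rng = np.random.default_rng(2)
hist_rain = rng.gamma(shape=4.0, scale=50.0, size=30)  # 30 historical seasons (mm)

t1, t2 = np.quantile(hist_rain, [1/3, 2/3])
tercile = np.digitize(hist_rain, [t1, t2])             # 0=dry, 1=normal, 2=wet

def resample_seasons(p_dry, p_normal, p_wet, n=100):
    """Draw n historical season indices consistent with forecast probabilities."""
    cats = rng.choice(3, size=n, p=[p_dry, p_normal, p_wet])
    return np.array([rng.choice(np.flatnonzero(tercile == c)) for c in cats])

idx = resample_seasons(0.15, 0.35, 0.50)               # a wet-leaning forecast
print("mean resampled seasonal rainfall:", round(hist_rain[idx].mean(), 1), "mm")
```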

  11. Linking effort and fishing mortality in a mixed fisheries model

    DEFF Research Database (Denmark)

    Thøgersen, Thomas Talund; Hoff, Ayoe; Frost, Hans Staby

    2012-01-01

    in fish stocks has led to overcapacity in many fisheries, leading to incentives for overfishing. Recent research has shown that the allocation of effort among fleets can play an important role in mitigating overfishing when the targeting covers a range of species (multi-species—i.e., so-called mixed...... fisheries), while simultaneously optimising the overall economic performance of the fleets. The so-called FcubEcon model, in particular, has elucidated both the biologically and economically optimal method for allocating catches—and thus effort—between fishing fleets, while ensuring that the quotas...

  12. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  14. A quantitative and dynamic model for plant stem cell regulation.

    Directory of Open Access Journals (Sweden)

    Florian Geier

    Full Text Available Plants maintain pools of totipotent stem cells throughout their entire life. These stem cells are embedded within specialized tissues called meristems, which form the growing points of the organism. The shoot apical meristem of the reference plant Arabidopsis thaliana is subdivided into several distinct domains, which execute diverse biological functions, such as tissue organization, cell proliferation and differentiation. The number of cells required for growth and organ formation changes over the course of a plant's life, while the structure of the meristem remains remarkably constant. Thus, regulatory systems must be in place that allow for an adaptation of cell proliferation within the shoot apical meristem, while maintaining the organization at the tissue level. To advance our understanding of this dynamic tissue behavior, we measured domain sizes as well as cell division rates of the shoot apical meristem under various environmental conditions, which cause adaptations in meristem size. Based on our results, we developed a mathematical model to explain the observed changes by a cell pool size dependent regulation of cell proliferation and differentiation, which is able to correctly predict CLV3 and WUS over-expression phenotypes. While the model shows stem cell homeostasis under constant growth conditions, it predicts a variation in stem cell number under changing conditions. Consistent with our experimental data, this behavior is correlated with variations in cell proliferation. Therefore, we investigate different signaling mechanisms that could stabilize stem cell number despite variations in cell proliferation. Our results shed light on the dynamic constraints of stem cell pool maintenance in the shoot apical meristem of Arabidopsis in different environmental conditions and developmental states.

  15. Linking density functional and mode coupling models for supercooled liquids

    Energy Technology Data Exchange (ETDEWEB)

    Premkumar, Leishangthem; Bidhoodi, Neeta; Das, Shankar P. [School of Physical Sciences, Jawaharlal Nehru University, New Delhi 110067 (India)

    2016-03-28

    We compare predictions from two familiar models of the metastable supercooled liquid, constructed respectively with thermodynamic and dynamic approaches. In the so-called density functional theory, the free energy F[ρ] of the liquid is a functional of the inhomogeneous density ρ(r). The metastable state is identified as a local minimum of F[ρ]. The sharp density profile characterizing ρ(r) is identified as a single particle oscillator, whose frequency is obtained from the parameters of the optimum density function. On the other hand, a dynamic approach to supercooled liquids is taken in the mode coupling theory (MCT), which predicts a sharp ergodicity-non-ergodicity transition at a critical density. The single particle dynamics in the non-ergodic state, treated approximately, represents a propagating mode whose characteristic frequency is computed from the corresponding memory function of the MCT. The mass localization parameters in the above two models (treated in their simplest forms) are obtained, respectively, in terms of the corresponding natural frequencies described above, and are shown to have comparable magnitudes.

  17. A Dual-Process Model of the Alcohol-Behavior Link for Social Drinking

    Science.gov (United States)

    Moss, Antony C.; Albery, Ian P.

    2009-01-01

    A dual-process model of the alcohol-behavior link is presented, synthesizing 2 of the major social-cognitive approaches: expectancy and myopia theories. Substantial evidence has accrued to support both of these models, and recent neurocognitive models of the effects of alcohol on thought and behavior have provided evidence to support both as well.…

  18. Virtual Models Linked with Physical Components in Construction

    DEFF Research Database (Denmark)

    Sørensen, Kristian Birch

    The use of virtual models supports a fundamental change in the working practice of the construction industry. It changes the primary information carrier (drawings) from simple manually created depictions of the building under construction to visually realistic digital representations that also...... engineering and business development in an iterative and user needs centred system development process. The analysis of future business perspectives presents an extensive number of new working processes that can assist in solving major challenges in the construction industry. Three of the most promising...... practices and development of new ontologies. Based on the experiences gained in this PhD project, some of the important future challenges are also to show the benefits of using modern information and communication technology to practitioners in the construction industry and to communicate this knowledge...

  19. Modelling soil nitrogen: The MAGIC model with nitrogen retention linked to carbon turnover using decomposer dynamics

    International Nuclear Information System (INIS)

    Oulehle, F.; Cosby, B.J.; Wright, R.F.; Hruška, J.; Kopáček, J.; Krám, P.; Evans, C.D.; Moldan, F.

    2012-01-01

    We present a new formulation of the acidification model MAGIC that uses decomposer dynamics to link nitrogen (N) cycling to carbon (C) turnover in soils. The new model is evaluated by application to 15–30 years of water chemistry data at three coniferous-forested sites in the Czech Republic where deposition of sulphur (S) and N have decreased by >80% and 40%, respectively. Sulphate concentrations in waters have declined commensurately with S deposition, but nitrate concentrations have shown much larger decreases relative to N deposition. This behaviour is inconsistent with most conceptual models of N saturation, and with earlier versions of MAGIC which assume N retention to be a first-order function of N deposition and/or controlled by the soil C/N ratio. In comparison with earlier versions, the new formulation more correctly simulates observed short-term changes in nitrate leaching, as well as long-term retention of N in soils. The model suggests that, despite recent deposition reductions and recovery, progressive N saturation will lead to increased future nitrate leaching, ecosystem eutrophication and re-acidification. - Highlights: ► New version of the biogeochemical model MAGIC developed to simulate C/N dynamics. ► New formulation of N retention based directly on the decomposer processes. ► The new formulation simulates observed changes in nitrate leaching and in soil C/N. ► The model suggests progressive N saturation at sites examined. ► The model performance meets a growing need for realistic process-based simulations. - Process-based modelling of nitrogen dynamics and acidification in forest ecosystems.

  20. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    NARCIS (Netherlands)

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by

  1. Quantitative modelling of HDPE spurt experiments using wall slip and generalised Newtonian flow

    NARCIS (Netherlands)

    Doelder, den C.F.J.; Koopmans, R.J.; Molenaar, J.

    1998-01-01

    A quantitative model to describe capillary rheometer experiments is presented. The model can generate ‘two-branched’ discontinuous flow curves and the associated pressure oscillations. Polymer compressibility in the barrel, incompressible axisymmetric generalised Newtonian flow in the die, and a

  2. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
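
    The time-step sensitivity discussed in this record can be illustrated on a deliberately tiny surrogate: forward-Euler integration of a single overdamped degree of freedom (one "vertex" relaxing toward equilibrium). This is not a vertex model, just a demonstration of how the numerical parameter dt changes the predicted trajectory.

```python
# Toy illustration of time-step sensitivity in overdamped (vertex-style) dynamics.
import math

def relax(x0, dt, t_end=1.0, k=10.0):
    """Forward-Euler integration of dx/dt = -k * x up to t_end."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (-k * x)
    return x

exact = math.exp(-10.0)                    # analytic solution at t = 1
for dt in (0.15, 0.01, 0.001):
    print(f"dt={dt:<6} euler={relax(1.0, dt): .5f}  exact={exact:.5f}")
```

    With dt = 0.15 the explicit scheme overshoots and oscillates, while smaller steps converge to the analytic decay, mirroring the paper's point that implementation parameters can qualitatively change predictions.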

  3. Development of quantitative structure activity relationship (QSAR) model for disinfection byproduct (DBP) research: A review of methods and resources

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyang, E-mail: poplar_chen@hotmail.com [Harbin Institute of Technology Shenzhen Graduate School, Shenzhen Key Laboratory of Water Resource Utilization and Environmental Pollution Control, Shenzhen 518055 (China); Zhang, Tian [Harbin Institute of Technology Shenzhen Graduate School, Shenzhen Key Laboratory of Water Resource Utilization and Environmental Pollution Control, Shenzhen 518055 (China); Bond, Tom [Department of Civil and Environmental Engineering, Imperial College, London SW7 2AZ (United Kingdom); Gan, Yiqun [Harbin Institute of Technology Shenzhen Graduate School, Shenzhen Key Laboratory of Water Resource Utilization and Environmental Pollution Control, Shenzhen 518055 (China)

    2015-12-15

    Quantitative structure–activity relationship (QSAR) models are tools for linking chemical activities with molecular structures and compositions. Due to the concern about the proliferating number of disinfection byproducts (DBPs) in water and the associated financial and technical burden, researchers have recently begun to develop QSAR models to investigate the toxicity, formation, property, and removal of DBPs. However, there are no standard procedures or best practices regarding how to develop QSAR models, which potentially limit their wide acceptance. In order to facilitate more frequent use of QSAR models in future DBP research, this article reviews the processes required for QSAR model development, summarizes recent trends in QSAR-DBP studies, and shares some important resources for QSAR development (e.g., free databases and QSAR programs). The paper follows the four steps of QSAR model development, i.e., data collection, descriptor filtration, algorithm selection, and model validation; and finishes by highlighting several research needs. Because QSAR models may have an important role in progressing our understanding of DBP issues, it is hoped that this paper will encourage their future use for this application.
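
    The four development steps the review names (data collection, descriptor filtration, algorithm selection, model validation) can be lined up in a few lines of code; the sketch below uses synthetic descriptors and one arbitrary algorithm choice purely to make the workflow concrete.

```python
# Sketch of the four QSAR steps on synthetic data (placeholders throughout).
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# 1) Data collection: molecular descriptors (rows = compounds) and an activity.
X = rng.normal(size=(60, 30))
y = 2.0 * X[:, 0] - X[:, 3] + rng.normal(scale=0.3, size=60)

# 2) Descriptor filtration: drop near-constant descriptors.
X_f = VarianceThreshold(threshold=0.1).fit_transform(X)

# 3) Algorithm selection: a regularised linear model as one simple choice.
model = Ridge(alpha=1.0)

# 4) Model validation: cross-validated predictive performance.
scores = cross_val_score(model, X_f, y, cv=5, scoring="r2")
print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```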

  5. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    The paper presents a mathematical model for optimal security-technology investment evaluation and decision-making, based on quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of quantitative analysis of the different security measures that counteract individual risks, identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security incident, together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of the selected security measures, and economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike existing models for evaluating security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. It supports detailed analyses and computations that provide quantitative assessments of different investment options, which translate into recommendations facilitating the selection of the best solution and the corresponding decision-making. The model was tested using empirical examples with data from a real business environment.
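
    The record does not give the underlying economic metrics, but analyses of this kind commonly build on annualized loss expectancy (ALE) and return on security investment (ROSI). The sketch below illustrates those standard calculations under invented figures; it is not the paper's actual model.

    ```python
    # Minimal sketch of standard security-investment metrics (not the paper's model).
    # ALE = ARO * SLE; ROSI = (risk reduction - cost) / cost. All inputs hypothetical.

    def ale(aro: float, sle: float) -> float:
        """Annualized loss expectancy: annual rate of occurrence x single loss expectancy."""
        return aro * sle

    def rosi(ale_before: float, ale_after: float, annual_cost: float) -> float:
        """Return on security investment for one measure."""
        return (ale_before - ale_after - annual_cost) / annual_cost

    # Hypothetical example: a measure that halves the incident rate.
    before = ale(aro=4.0, sle=25_000.0)   # 4 incidents/year, 25 k loss each
    after = ale(aro=2.0, sle=25_000.0)
    print(before, after, rosi(before, after, annual_cost=20_000.0))
    ```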

  6. Quantitative modeling of gene networks of biological systems using fuzzy Petri nets and fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raed I. Hamed

    2018-01-01

    Quantitative modeling of biological systems has become an essential computational approach in the design of novel and the analysis of existing biological systems. However, kinetic data describing the system's dynamics need to be known in order to obtain relevant results with conventional modeling techniques, and such data are frequently hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modeling approach that can cope with unknown kinetic data and thus produce relevant results even though the dynamic data are incomplete or only vaguely defined. Moreover, the approach can be used in combination with existing state-of-the-art quantitative modeling techniques in only certain parts of the system, i.e., where the data are missing. The case study of the approach proposed in this paper is performed on a model of nine genes. We propose a type of fuzzy Petri net (FPN) model based on fuzzy sets to handle the quantitative modeling of biological systems. Tests of our model show that it is practical and quite powerful for knowledge representation and the reasoning of fuzzy expert systems.
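
    The record gives no formalism, but a flavor of the fuzzy machinery such models rely on can be shown with one Mamdani-style rule for a gene's expression level. Membership parameters and the rule itself are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical fuzzy evaluation of a single regulatory rule:
    # IF activator is HIGH AND inhibitor is LOW THEN target expression is HIGH.

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def rule_strength(activator, inhibitor):
        high_act = tri(activator, 0.4, 1.0, 1.6)   # made-up membership parameters
        low_inh = tri(inhibitor, -0.6, 0.0, 0.6)
        return min(high_act, low_inh)              # AND = minimum (Mamdani)

    # Centroid defuzzification over the output universe [0, 1].
    y = np.linspace(0.0, 1.0, 101)
    strength = rule_strength(activator=0.8, inhibitor=0.1)
    clipped = np.minimum(tri(y, 0.5, 1.0, 1.5), strength)  # clipped HIGH output set
    expression = np.sum(y * clipped) / np.sum(clipped)
    print(expression)
    ```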

  7. Pricing of premiums for equity-linked life insurance based on joint mortality models

    Science.gov (United States)

    Riaman; Parmikanti, K.; Irianingsih, I.; Supian, S.

    2018-03-01

    Equity-linked life insurance is a financial product that offers not only protection but also investment. The calculation of equity-linked life insurance premiums generally uses mortality tables. Because of advances in medical technology and reduced birth rates, the use of mortality tables alone appears less relevant for calculating premiums. To overcome this problem, we use a combined mortality model, which in this study is determined based on the Indonesian Mortality Table 2011, to determine the probabilities of death and survival. In this research, we use a combined mortality model built from the Weibull, Inverse-Weibull, and Gompertz mortality models. After determining the combined mortality model, we calculate the value of the claim to be paid and the premium price numerically. By calculating equity-linked life insurance premiums accurately, it is expected that no party will be disadvantaged by inaccurate calculation results.
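
    One ingredient of such pricing is a parametric mortality law. As a hedged illustration, the sketch below uses Gompertz's law to compute a survival probability and a single net premium for a hypothetical pure endowment; the parameters are assumed, not fitted to the Indonesian Mortality Table 2011.

    ```python
    import math

    # Gompertz force of mortality: mu(x) = B * c**x, with assumed parameters.
    B, c = 3e-5, 1.09  # hypothetical values, not fitted to any actual table

    def survival(x: float, t: float) -> float:
        """P(a life aged x survives t more years) under Gompertz's law."""
        return math.exp(-B / math.log(c) * (c ** (x + t) - c ** x))

    def pure_endowment_premium(x: float, t: float, benefit: float, i: float) -> float:
        """Single net premium: discounted benefit times survival probability."""
        v = 1.0 / (1.0 + i) ** t
        return benefit * v * survival(x, t)

    print(survival(40, 10))                          # 10-year survival probability at age 40
    print(pure_endowment_premium(40, 10, 100_000, i=0.05))
    ```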

  8. A Framework for Linking Population Model Development with Ecological Risk Assessment Objectives.

    Science.gov (United States)

    The value of models that link organism‐level impacts to the responses of a population in ecological risk assessments (ERAs) has been demonstrated extensively over the past few decades. There is little debate about the utility of these models to translate multiple organism…

  9. A Simple Forecasting Model Linking Macroeconomic Policy to Industrial Employment Demand.

    Science.gov (United States)

    Malley, James R.; Hady, Thomas F.

    A study further detailed a model linking monetary and fiscal policy to industrial employment in metropolitan and nonmetropolitan areas of four United States regions. The model was used to simulate the impacts on area and regional employment of three events in the economy: changing real gross national product (GNP) via monetary policy, holding the…

  10. Linking linear programming and spatial simulation models to predict landscape effects of forest management alternatives

    Science.gov (United States)

    Eric J. Gustafson; L. Jay Roberts; Larry A. Leefers

    2006-01-01

    Forest management planners require analytical tools to assess the effects of alternative strategies on the sometimes disparate benefits from forests such as timber production and wildlife habitat. We assessed the spatial patterns of alternative management strategies by linking two models that were developed for different purposes. We used a linear programming model (...

  11. Cumulative t-link threshold models for the genetic analysis of calving ease scores

    Directory of Open Access Journals (Sweden)

    Tempelman Robert J

    2003-09-01

    In this study, a hierarchical threshold mixed model based on a cumulative t-link specification for the analysis of ordinal data or, more specifically, calving ease scores, was developed. The validation of this model and the Markov chain Monte Carlo (MCMC) algorithm was carried out on simulated data from normally and t4 (i.e., a t-distribution with four degrees of freedom) distributed populations, using the deviance information criterion (DIC) and a pseudo Bayes factor (PBF) measure to validate recently proposed model choice criteria. The simulation study indicated that although inference on the degrees of freedom parameter is possible, MCMC mixing was problematic. Nevertheless, the DIC and PBF were validated to be satisfactory measures of model fit to data. A sire and maternal grandsire cumulative t-link model was applied to a calving ease dataset from 8847 Italian Piemontese first parity dams. The cumulative t-link model was shown to lead to posterior means of direct and maternal heritabilities (0.40 ± 0.06, 0.11 ± 0.04) and a direct-maternal genetic correlation (-0.58 ± 0.15) that were not different from the corresponding posterior means of the heritabilities (0.42 ± 0.07, 0.14 ± 0.04) and the genetic correlation (-0.55 ± 0.14) inferred under the conventional cumulative probit link threshold model. Furthermore, the correlation (> 0.99) between posterior means of sire progeny merit from the two models suggested no meaningful rerankings. Nevertheless, the cumulative t-link model was decisively chosen as the better fitting model for this calving ease data using DIC and PBF.
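
    For reference, the cumulative t-link assigns to ordinal category k the probability mass between successive thresholds under a Student t CDF. A minimal sketch with made-up thresholds and linear predictor:

    ```python
    import numpy as np
    from scipy.stats import t

    # Cumulative t-link: P(Y = k) = F_t(tau_k - eta) - F_t(tau_{k-1} - eta),
    # with tau_0 = -inf and tau_K = +inf. All parameter values are hypothetical.
    df = 4.0                                             # degrees of freedom of the t link
    taus = np.array([-np.inf, -0.5, 0.8, 2.0, np.inf])   # thresholds for 4 categories
    eta = 0.3                                            # linear predictor (fixed + random effects)

    probs = np.diff(t.cdf(taus - eta, df))
    print(probs, probs.sum())                            # category probabilities summing to 1
    ```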

  12. Potential Investigation of Linking PROSAIL with the Ross-Li BRDF Model for Vegetation Characterization

    Directory of Open Access Journals (Sweden)

    Xiaoning Zhang

    2018-03-01

    Methods that link different models for investigating the retrieval of canopy biophysical/structural variables have been widely adopted in the remote sensing community. To retrieve global biophysical parameters from multiangle data, the kernel-driven bidirectional reflectance distribution function (BRDF) model has been widely applied to satellite multiangle observations to model (interpolate/extrapolate) the bidirectional reflectance factor (BRF) in an arbitrary direction of viewing and solar geometries. Such modeled BRFs, as an essential information source, are then input into an inversion procedure that is devised through a large number of simulation analyses from some widely used physical models that can generalize such an inversion relationship between the BRFs (or their simple algebraic composite) and the biophysical/structural parameter. Therefore, evaluation of such a link between physical models and kernel-driven models contributes to the development of such inversion procedures to accurately retrieve vegetation properties, particularly based on the operational global BRDF parameters derived from satellite multiangle observations (e.g., MODIS). In this study, the main objective is to investigate the potential for linking a popular physical model (PROSAIL) with the widely used kernel-driven Ross-Li models. To do this, the BRFs and albedo are generated by the physical PROSAIL model in forward mode, and then the simulated BRFs are input into the kernel-driven BRDF model for retrieval of the BRFs and albedo in the same viewing and solar geometries. To further strengthen such an investigation, a variety of field-measured multiangle reflectances have also been used to investigate the potential for linking these two models. For simulated BRFs generated by the PROSAIL model at 659 and 865 nm, the two models are generally comparable to each other, and the resultant root mean square errors (RMSEs) are 0.0092 and 0.0355, respectively, although some
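
    Because the kernel-driven Ross-Li model is linear in its three coefficients, fitting it to multiangle BRFs is an ordinary least-squares problem. A minimal sketch, assuming the RossThick (volumetric) and LiSparse (geometric) kernel values are already computed for each geometry; the numbers are placeholders:

    ```python
    import numpy as np

    # Kernel-driven Ross-Li model: BRF = f_iso + f_vol * K_vol + f_geo * K_geo.
    # K_vol/K_geo (RossThick/LiSparse kernels) are assumed precomputed per geometry;
    # the numbers below are placeholders, not real kernel values.
    k_vol = np.array([0.02, -0.01, 0.05, 0.08, 0.03])
    k_geo = np.array([-1.2, -0.9, -1.5, -1.7, -1.1])
    brf_obs = np.array([0.31, 0.28, 0.35, 0.38, 0.30])   # observed multiangle BRFs

    A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
    (f_iso, f_vol, f_geo), *_ = np.linalg.lstsq(A, brf_obs, rcond=None)

    brf_fit = A @ np.array([f_iso, f_vol, f_geo])
    rmse = np.sqrt(np.mean((brf_fit - brf_obs) ** 2))
    print(f_iso, f_vol, f_geo, rmse)
    ```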

  13. A quantitative model of the cardiac ventricular cell incorporating the transverse-axial tubular system

    Czech Academy of Sciences Publication Activity Database

    Pásek, Michal; Christé, G.; Šimurda, J.

    2003-01-01

    Vol. 22, No. 3 (2003), pp. 355-368 ISSN 0231-5882 R&D Projects: GA ČR GP204/02/D129 Institutional research plan: CEZ:AV0Z2076919 Keywords: cardiac cell * tubular system * quantitative modelling Subject RIV: BO - Biophysics Impact factor: 0.794, year: 2003

  14. Systematic Analysis of Quantitative Logic Model Ensembles Predicts Drug Combination Effects on Cell Signaling Networks

    Science.gov (United States)

    2016-08-27

  15. Quantitative detection and biological propagation of scrapie seeding activity in vitro facilitate use of prions as model pathogens for disinfection.

    Directory of Open Access Journals (Sweden)

    Sandra Pritzkow

    Prions are pathogens with an unusually high tolerance to inactivation and constitute a complex challenge to the re-processing of surgical instruments. On the other hand, however, they provide an informative paradigm which has been exploited successfully for the development of novel broad-range disinfectants simultaneously active also against bacteria, viruses and fungi. Here we report on the development of a methodological platform that further facilitates the use of scrapie prions as model pathogens for disinfection. We used specifically adapted serial protein misfolding cyclic amplification (PMCA) for the quantitative detection, on steel wires providing model carriers for decontamination, of 263K scrapie seeding activity converting normal protease-sensitive into abnormal protease-resistant prion protein. Reference steel wires carrying defined amounts of scrapie infectivity were used for assay calibration, while scrapie-contaminated test steel wires were subjected to fifteen different procedures for disinfection that yielded scrapie titre reductions of ≤10^1- to ≥10^5.5-fold. As confirmed by titration in hamsters, the residual scrapie infectivity on test wires could be reliably deduced for all examined disinfection procedures from our quantitative seeding activity assay. Furthermore, we found that scrapie seeding activity present in 263K hamster brain homogenate or multiplied by PMCA of scrapie-contaminated steel wires both triggered accumulation of protease-resistant prion protein and was further propagated in a novel cell assay for 263K scrapie prions, i.e., cerebral glial cell cultures from hamsters. The findings from our PMCA and glial cell culture assays revealed scrapie seeding activity as a biochemically and biologically replicative principle in vitro, with the former being quantitatively linked to prion infectivity detected on steel wires in vivo. When combined, our in vitro assays provide an alternative to titrations of biological

  16. Quantitative modeling assesses the contribution of bond strengthening, rebinding and force sharing to the avidity of biomolecule interactions.

    Directory of Open Access Journals (Sweden)

    Valentina Lo Schiavo

    Cell adhesion is mediated by numerous membrane receptors. It is desirable to derive the outcome of a cell-surface encounter from the molecular properties of interacting receptors and ligands. However, conventional parameters such as affinity or kinetic constants are often insufficient to account for receptor efficiency. Avidity is a qualitative concept frequently used to describe biomolecule interactions: this includes incompletely defined properties such as the capacity to form multivalent attachments. The aim of this study is to produce a working description of monovalent attachments formed by a model system, then to measure and interpret the behavior of divalent attachments under force. We investigated attachments between antibody-coated microspheres and surfaces coated with sparse monomeric or dimeric ligands. When bonds were subjected to a pulling force, they exhibited both a force-dependent dissociation consistent with Bell's empirical formula and a force- and time-dependent strengthening well described by a single parameter. Divalent attachments were stronger and less dependent on forces than monovalent ones. The proportion of divalent attachments resisting a force of 30 piconewtons for at least 5 s was 3.7-fold higher than that of monovalent attachments. Quantitative modeling showed that this required rebinding, i.e. additional bond formation between surfaces linked by divalent receptors forming only one bond. Further, experimental data were compatible with but did not require stress sharing between bonds within divalent attachments. Thus many ligand-receptor interactions do not behave as single-step reactions in the millisecond to second timescale. Rather, they exhibit progressive stabilization. This explains the high efficiency of multimerized or clustered receptors even when bonds are only subjected to moderate forces. Our approach provides a quantitative way of relating binding avidity to measurable parameters including bond maturation, rebinding and force sharing.
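
    Bell's empirical formula, cited in the abstract, takes the bond dissociation rate to grow exponentially with applied force. A minimal sketch of single-bond survival under constant load, with illustrative parameter values rather than the paper's fits:

    ```python
    import numpy as np

    # Bell's model: k_off(F) = k0 * exp(F * x_b / (kB * T)).
    # k0, x_b and the forces are illustrative values, not fitted quantities.
    kB_T = 4.11e-21          # thermal energy at ~298 K, in joules
    k0 = 0.5                 # unstressed off-rate, 1/s
    x_b = 0.5e-9             # distance to the transition state, m

    def k_off(force_pN: float) -> float:
        force = force_pN * 1e-12
        return k0 * np.exp(force * x_b / kB_T)

    def survival(force_pN: float, t: float) -> float:
        """Probability that a single bond survives to time t under constant force."""
        return np.exp(-k_off(force_pN) * t)

    print(survival(0.0, 5.0), survival(30.0, 5.0))   # 5 s survival at 0 vs 30 pN
    ```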

  18. A bibliography of terrain modeling (geomorphometry), the quantitative representation of topography: supplement 4.0

    Science.gov (United States)

    Pike, Richard J.

    2002-01-01

    Terrain modeling, the practice of ground-surface quantification, is an amalgam of Earth science, mathematics, engineering, and computer science. The discipline is known variously as geomorphometry (or simply morphometry), terrain analysis, and quantitative geomorphology. It continues to grow through myriad applications to hydrology, geohazards mapping, tectonics, sea-floor and planetary exploration, and other fields. Dating nominally to the co-founders of academic geography, Alexander von Humboldt (1808, 1817) and Carl Ritter (1826, 1828), the field was revolutionized late in the 20th century by the computer manipulation of spatial arrays of terrain heights, or digital elevation models (DEMs), which can quantify and portray ground-surface form over large areas (Maune, 2001). Morphometric procedures are implemented routinely by commercial geographic information systems (GIS) as well as specialized software (Harvey and Eash, 1996; Köthe and others, 1996; ESRI, 1997; Drzewiecki et al., 1999; Dikau and Saurer, 1999; Djokic and Maidment, 2000; Wilson and Gallant, 2000; Breuer, 2001; Guth, 2001; Eastman, 2002). The new Earth Surface edition of the Journal of Geophysical Research, specializing in surficial processes, is the latest of many publication venues for terrain modeling. This is the fourth update of a bibliography and introduction to terrain modeling (Pike, 1993, 1995, 1996, 1999) designed to collect the diverse, scattered literature on surface measurement as a resource for the research community. The use of DEMs in science and technology continues to accelerate and diversify (Pike, 2000a). New work appears so frequently that a sampling must suffice to represent the vast literature. This report adds 1636 entries to the 4374 in the four earlier publications. Forty-eight additional entries correct dead Internet links and other errors found in the prior listings. Chronicling the history of terrain modeling, many entries in this report predate the 1999 supplement.

  19. Toward a Dexter-based model for open hypermedia: Unifying embedded references and link objects

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Trigg, Randall Hagner

    1996-01-01

    Nominated for the Doug Engelbart best paper award. This paper discusses experiences and lessons learned from the design of an open hypermedia system, one that integrates applications and data not ''owned'' by the hypermedia. The Dexter Hypertext Reference Model was used as the basis for the design. Though our experiences were generally positive, we found the model constraining in certain ways and underdeveloped in others. For instance, Dexter argues against dangling links, but we found several situations where permitting and supporting dangling links was advisable. In Dexter, the data objects

  20. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    International Nuclear Information System (INIS)

    Bindschadler, Michael; Alessio, Adam M; Modgil, Dimple; La Riviere, Patrick J; Branch, Kelley R

    2014-01-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml (min g)^−1; cardiac output = 3, 5, 8 L min^−1). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, while the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated. This
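
    A common form of the qualitative slope method referenced above estimates perfusion from the peak upslope of the tissue time-attenuation curve normalized by the peak arterial enhancement. The sketch below illustrates that calculation on synthetic curves; it is not the study's simulation framework:

    ```python
    import numpy as np

    # Slope-based MBF index: max tissue upslope / peak arterial enhancement.
    # Synthetic gamma-variate curves stand in for measured time-attenuation data.
    t = np.arange(0.0, 30.0, 1.0)                     # 1 s sampling, in seconds

    def gamma_variate(t, t0, A, alpha, beta):
        s = np.clip(t - t0, 0.0, None)
        return A * s**alpha * np.exp(-s / beta)

    aif = gamma_variate(t, 2.0, 40.0, 2.0, 2.0)       # arterial input function (HU)
    tissue = gamma_variate(t, 6.0, 1.5, 2.0, 4.0)     # myocardial curve (HU)

    dt = t[1] - t[0]
    max_upslope = np.max(np.diff(tissue) / dt)        # HU per second
    mbf_index = max_upslope / np.max(aif)             # per second; scale to ml/(min g)
    print(mbf_index * 60.0)                           # rough perfusion index per minute
    ```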

  1. Quantitative analysis of the synthesis and secretion of type VII collagen in cultured human dermal fibroblasts with a sensitive sandwich enzyme-linked immunoassay.

    Science.gov (United States)

    Amano, Satoshi; Ogura, Yuki; Akutsu, Nobuko; Nishiyama, Toshio

    2007-02-01

    Type VII collagen is the major component of anchoring fibrils in the epidermal basement membrane. Its expression has been analyzed by immunostaining or Northern blotting, but rarely at the protein level. In this study, we have quantitatively examined the effects of ascorbic acid and various cytokines/growth factors on the protein synthesis and secretion of type VII collagen by human dermal fibroblasts in culture, using a newly developed, highly sensitive sandwich enzyme-linked immunoassay with two kinds of specific monoclonal antibodies against the non-collagenous domain-1. Ascorbic acid and its derivative induced a twofold increase in type VII collagen synthesis, and markedly increased the secretion of type VII collagen into the medium when compared with the control culture. This effect was not influenced by the presence of transforming growth factor-beta1 (TGF-beta1). The synthesis of type VII collagen was elevated by TGF-beta1, platelet-derived growth factor, tumor necrosis factor-alpha, and interleukin-1beta, but not by TGF-alpha. Thus, our data indicate that the synthesis and secretion of type VII collagen in human dermal fibroblasts are regulated by ascorbate, and that the enhancement of type VII collagen gene expression by cytokines/growth factors is accompanied by elevated production of type VII collagen at the protein level.

  2. A simple, specific high-throughput enzyme-linked immunosorbent assay (ELISA) for quantitative determination of melatonin in cell culture medium.

    Science.gov (United States)

    Li, Ye; Cassone, Vincent M

    2015-09-01

    A simple, specific, high-throughput enzyme-linked immunosorbent assay (ELISA) for the quantitative determination of melatonin was developed for directly measuring melatonin in cell culture medium with 10% FBS. The assay adopts a commercial monoclonal melatonin antibody and a melatonin-HRP conjugate, so it can be adopted rapidly by multiple labs at low cost compared with commercial RIA and ELISA kits. In addition, the procedure is much simpler, with only four steps: 1) sample/conjugate incubation, 2) plate washing, 3) TMB color reaction and 4) reading of results. The standards of the assay cover a wide working range from 100 pg/mL to 10 ng/mL. The sensitivity was 68 pg/mL in cell culture medium with 10% FBS and 26 pg/mL in PBS, with as little as 25 μL sample volume. The recovery of melatonin from cell culture medium was 101.0%. The principal cross-reacting compound was 5-methoxytryptophol (0.1%). The coefficients of variation of the assay, within and between runs, ranged between 6.68% and 15.76% in cell culture medium. The mean linearity of a serially diluted cell culture medium sample was 105% (CV = 5%), ranging between 98% and 111%; y = 5.5263x + 0.0646, R² = 0.99. The assay enables small research and teaching labs to reliably measure this important neurohormone.
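
    Concentrations in such an assay are typically read off a four-parameter logistic (4PL) standard curve fitted to the calibrators. A minimal sketch of that routine workflow, with synthetic absorbances rather than the paper's data:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Four-parameter logistic (4PL) standard curve, the usual ELISA calibration model.
    def four_pl(x, a, b, c, d):
        """a = response at zero dose, d = response at infinite dose,
        c = inflection point (EC50), b = slope factor."""
        return d + (a - d) / (1.0 + (x / c) ** b)

    std_conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])      # ng/mL calibrators
    od = np.array([1.85, 1.60, 1.05, 0.55, 0.25])        # synthetic absorbances

    popt, _ = curve_fit(four_pl, std_conc, od, p0=[2.0, 1.0, 1.0, 0.1])

    def conc_from_od(y, a, b, c, d):
        """Invert the 4PL curve to interpolate an unknown's concentration."""
        return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

    print(conc_from_od(0.80, *popt))                     # unknown sample, ng/mL
    ```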

  3. Instantaneous thermal modeling of the DC-link capacitor in PhotoVoltaic systems

    DEFF Research Database (Denmark)

    Yang, Yongheng; Ma, Ke; Wang, Huai

    2015-01-01

    Instantaneous thermal modeling approaches considering mission profiles for the DC-link capacitor in single-phase PV systems are explored in this paper. These thermal modelling approaches are based on: a) fast Fourier transform, b) look-up tables, and c) ripple current reconstruction. The thermal modelling approaches for the DC-link capacitors take into account the instantaneous thermal characteristics, which are more challenging to the capacitor reliability during operation, and enable a translation of instantaneous capacitor power losses to capacitor… A grid-connected PV system has been adopted to demonstrate a look-up table based modelling approach, where real-field daily ambient conditions are considered.

  4. Bayesian inference in an item response theory model with a generalized student t link function

    Science.gov (United States)

    Azevedo, Caio L. N.; Migon, Helio S.

    2012-10-01

    In this paper we introduce a new item response theory (IRT) model with a generalized Student t link function with unknown degrees of freedom (df), named the generalized t-link (GtL) IRT model. In this model we consider only the difficulty parameter in the item response function. GtL is an alternative to the two-parameter logit and probit models, since the degrees of freedom play a role similar to that of the discrimination parameter. However, the behavior of the curves of the GtL is different from those of the two-parameter models and the usual Student t link, since in GtL the curves obtained from different df's can cross the probit curves at more than one latent trait level. The GtL model has properties similar to those of generalized linear mixed models, such as the existence of sufficient statistics and easy parameter interpretation. Also, many techniques of parameter estimation, model fit assessment and residual analysis developed for those models can be used for the GtL model. We develop fully Bayesian estimation and model fit assessment tools through a Metropolis-Hastings step within a Gibbs sampling algorithm. We consider a prior sensitivity analysis concerning the degrees of freedom. The simulation study indicates that the algorithm recovers all parameters properly. In addition, some Bayesian model fit assessment tools are considered. Finally, a real data set is analyzed using our approach and other usual models. The results indicate that our model fits the data better than the two-parameter models.
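
    With only a difficulty parameter, the GtL item response function is simply the Student t CDF evaluated at the latent trait minus the difficulty. A minimal sketch comparing curves across degrees of freedom (values are illustrative):

    ```python
    import numpy as np
    from scipy.stats import t, norm

    # Generalized t-link item response function: P(correct | theta) = F_t(theta - b; df).
    theta = np.linspace(-3.0, 3.0, 7)   # latent trait levels
    b = 0.5                             # item difficulty (illustrative)

    for df in (1.0, 4.0, 30.0):         # heavier tails -> flatter curve
        print(df, t.cdf(theta - b, df).round(3))

    print("probit", norm.cdf(theta - b).round(3))   # probit comparison curve
    ```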

  5. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra…, enabling the analysis of quantitative properties ranging from the likelihood of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study.

  6. Quantitative analysis of CT brain images: a statistical model incorporating partial volume and beam hardening effects

    International Nuclear Information System (INIS)

    McLoughlin, R.F.; Ryan, M.V.; Heuston, P.M.; McCoy, C.T.; Masterson, J.B.

    1992-01-01

    The purpose of this study was to construct and evaluate a statistical model for the quantitative analysis of computed tomographic brain images. Data were derived from standard sections in 34 normal studies. A model representing the intracranial pure tissue and partial volume areas, with allowance for beam hardening, was developed. The average percentage error in the estimation of areas, derived from phantom tests using the model, was 28.47%. We conclude that our model is not sufficiently accurate to be of clinical use, even though allowance was made for partial volume and beam hardening effects. (author)

  7. Summary goodness-of-fit statistics for binary generalized linear models with noncanonical link functions.

    Science.gov (United States)

    Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J

    2016-05-01

    Generalized linear models (GLMs) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for the logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLMs with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of the link function chosen. We generalize the Tsiatis GOF statistic originally developed for logistic GLMCCs (TG) so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J²) statistics can be applied directly. In a simulation study, TG, HL, and J² were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J² were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC. In this case, TG had more power than HL or J².
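
    For orientation, the Hosmer-Lemeshow statistic mentioned above bins observations by fitted probability and compares observed with expected event counts per bin. A minimal sketch on synthetic data; the fitted probabilities of any GLM could be substituted:

    ```python
    import numpy as np
    from scipy.stats import chi2

    # Hosmer-Lemeshow GOF: C = sum_g (O_g - E_g)^2 / (E_g * (1 - E_g / n_g)),
    # computed over g groups of observations binned by fitted probability.
    rng = np.random.default_rng(0)
    p_hat = np.sort(rng.uniform(0.05, 0.95, 500))      # fitted probabilities
    y = rng.binomial(1, p_hat)                         # synthetic outcomes

    g = 10
    groups = np.array_split(np.arange(len(p_hat)), g)  # deciles of risk
    C = 0.0
    for idx in groups:
        n_g = len(idx)
        O_g = y[idx].sum()
        E_g = p_hat[idx].sum()
        C += (O_g - E_g) ** 2 / (E_g * (1.0 - E_g / n_g))

    p_value = chi2.sf(C, df=g - 2)                     # g - 2 df for the logistic case
    print(C, p_value)
    ```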

  8. Discussions on the non-equilibrium effects in the quantitative phase field model of binary alloys

    International Nuclear Information System (INIS)

    Zhi-Jun, Wang; Jin-Cheng, Wang; Gen-Cang, Yang

    2010-01-01

    All quantitative phase field models try to eliminate the artificial effects of solutal drag, interface diffusion and interface stretch in the diffuse interface. These artificial non-equilibrium effects, due to the introduction of a diffuse interface, are analysed based on the thermodynamic status across the diffuse interface in the quantitative phase field model of binary alloys. Results indicate that the non-equilibrium effects are related to the negative driving force in the local region of the solid side across the diffuse interface. The negative driving force results from the fact that the phase field model is derived from an equilibrium condition but used to simulate the non-equilibrium solidification process. The interface thickness dependence of the non-equilibrium effects and its restriction on large-scale simulation are also discussed. (cross-disciplinary physics and related areas of science and technology)

  9. Development of quantitative atomic modeling for tungsten transport study Using LHD plasma with tungsten pellet injection

    International Nuclear Information System (INIS)

    Murakami, I.; Sakaue, H.A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2014-10-01

    Quantitative tungsten study with reliable atomic modeling is important for the successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding the tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from currentless plasmas of the Large Helical Device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) lines of W24+ to W33+ ions are very sensitive to electron temperature (Te) and useful for examining the tungsten behavior in edge plasmas. Based on the first quantitative analysis of the measured spatial profile of the W44+ ion, the tungsten concentration is determined to be n(W44+)/ne = 1.4×10^−4 and the total radiation loss is estimated as ∼4 MW, roughly half the total NBI power. (author)

  10. D1.3 -- Short Report on the First Draft Multi-link Channel Model

    DEFF Research Database (Denmark)

    Pedersen, Troels; Raulefs, Ronald; Steinboeck, Gerhard

    This deliverable is a preliminary report on the activities towards multi-link channel models. It summarizes the activities and achievements of the investigations of WP1 Task 1.2 in the first year of the project. In this deliverable, work focuses on the characterization of the cross-correlation of multi

  11. Modelling man-made ground to link the above- and below- ground urban domains

    NARCIS (Netherlands)

    Schokker, J.

    2017-01-01

    This report describes the results of STSM TU1206-36204. During a visit to GEUS (DK) between 23 and 27 January 2017, Jeroen Schokker (TNO-GSN, NL) has focussed on the modelling of man-made ground as a linking pin between the above- and below-ground urban domains. Key results include: • Man-made

  12. Translational PKPD modeling in schizophrenia: linking receptor occupancy of antipsychotics to efficacy and safety

    NARCIS (Netherlands)

    Pilla Reddy, Venkatesh; Kozielska, Magdalena; Johnson, Martin; Vermeulen, An; Liu, Jing; de Greef, Rik; Groothuis, Genoveva; Danhof, Meindert; Proost, Johannes

    2012-01-01

    Objectives: To link the brain dopamine D2 receptor occupancy (D2RO) of antipsychotic drugs with clinical endpoints of efficacy and safety to assess the therapeutic window of D2RO. Methods: Pharmacokinetic-Pharmacodynamic (PK-PD) models were developed to predict the D2 receptor occupancy of

  13. Ontologies to Support RFID-Based Link between Virtual Models and Construction Components

    DEFF Research Database (Denmark)

    Sørensen, Kristian Birch; Christiansson, Per; Svidt, Kjeld

    2010-01-01

    A link between the virtual models and the physical components in the construction process can improve the information handling and sharing in construction and building operation management. Such a link can be created by means of Radio Frequency Identification (RFID) technology. Ontologies play an important role

  14. Evaluation of mobile ad hoc network reliability using propagation-based link reliability model

    International Nuclear Information System (INIS)

    Padmavathy, N.; Chaturvedi, Sanjay K.

    2013-01-01

    A wireless mobile ad hoc network (MANET) is a collection of fully independent nodes (which can move randomly around the area of deployment), making the topology highly dynamic; nodes communicate with each other by forming a single-hop/multi-hop network and maintain connectivity in a decentralized manner. MANETs are modelled using geometric random graphs rather than random graphs, because link existence in a MANET is a function of the geometric distance between the nodes and the transmission range of the nodes. Among the many factors that contribute to MANET reliability is the robustness of the links between the mobile nodes of the network. Recently, the reliability of such networks has been evaluated for imperfect nodes (transceivers) with a binary model of communication links based on the transmission range of the mobile nodes and the distance between them. However, in reality, the probability of successful communication decreases as the signal strength deteriorates due to noise, fading or interference effects, even within the nodes' transmission range. Hence, in this paper, a propagation-based link reliability model, rather than a binary model, with nodes following a known failure distribution is proposed for evaluating the network reliability (2TRm, ATRm and AoTRm) of a MANET through Monte Carlo simulation. The method is illustrated with an application and some important results are also presented
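
    The difference between a binary and a propagation-based link model can be made concrete with a small Monte Carlo estimate of two-terminal reliability over random node placements. The sketch below assumes a log-distance path-loss model with Gaussian shadowing; all radio parameters are invented:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Two-terminal reliability of a random geometric network, with link success
    # probability from a log-distance path-loss + shadowing model (illustrative values).
    rng = np.random.default_rng(1)
    P_TX, PL0, N_EXP, SIGMA, P_MIN = 0.0, 40.0, 3.0, 6.0, -90.0  # dBm/dB, assumed

    def link_prob(d):
        """P(received power > receiver sensitivity) at distance d (meters)."""
        mean_rx = P_TX - (PL0 + 10.0 * N_EXP * np.log10(np.maximum(d, 1.0)))
        return norm.sf(P_MIN, loc=mean_rx, scale=SIGMA)

    def connected(adj, s, t):
        """Depth-first search for an s-t path in the sampled topology."""
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            if u == t:
                return True
            for v in np.flatnonzero(adj[u]):
                if v not in seen:
                    seen.add(int(v)); stack.append(int(v))
        return False

    n, trials, hits = 20, 2000, 0
    for _ in range(trials):
        pts = rng.uniform(0.0, 300.0, size=(n, 2))          # nodes in a 300 m square
        d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        adj = rng.random((n, n)) < link_prob(d)              # sample each link
        adj = np.triu(adj, 1); adj = adj | adj.T             # symmetric, no self-loops
        hits += connected(adj, 0, n - 1)
    print(hits / trials)                                     # 2-terminal reliability est.
    ```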

  15. The Chain-Link Fence Model: A Framework for Creating Security Procedures

    Science.gov (United States)

    Houghton, Robert F.

    2013-01-01

    A long-standing problem in information technology security is how to help reduce the security footprint. Many specific proposals exist to address specific problems in information technology security. Most information technology solutions need to be repeatable throughout the course of an information system's lifecycle. The Chain-Link Fence Model is…

  16. Exploring Alternative Characteristic Curve Approaches to Linking Parameter Estimates from the Generalized Partial Credit Model.

    Science.gov (United States)

    Roberts, James S.; Bao, Han; Huang, Chun-Wei; Gagne, Phill

    Characteristic curve approaches for linking parameters from the generalized partial credit model were examined for cases in which common (anchor) items are calibrated separately in two groups. Three of these approaches are simple extensions of the test characteristic curve (TCC), item characteristic curve (ICC), and operating characteristic curve…

  17. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
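
    A schematic of the sub-model idea: train range-restricted regressions and blend their predictions using a full-range model's initial estimate for routing. This is an illustration of the concept, not the ChemCam calibration; the split point and blending ramp are invented:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Schematic "sub-model" LIBS calibration: one full-range PLS model routes each
    # spectrum to low/high composition sub-models, whose predictions are blended.
    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 50))                     # stand-in for LIBS spectra
    y = 50.0 + X[:, 0] * 20.0 + rng.normal(0, 2, 200)  # stand-in for wt% of an oxide

    full = PLSRegression(n_components=5).fit(X, y)
    low_idx, high_idx = y < 50.0, y >= 50.0            # invented composition split
    low = PLSRegression(n_components=5).fit(X[low_idx], y[low_idx])
    high = PLSRegression(n_components=5).fit(X[high_idx], y[high_idx])

    def predict_blended(X_new, width=10.0):
        """Blend sub-model outputs with a weight ramp around the 50 wt% boundary."""
        y0 = full.predict(X_new).ravel()               # routing estimate
        w = np.clip((y0 - (50.0 - width)) / (2 * width), 0.0, 1.0)
        return (1 - w) * low.predict(X_new).ravel() + w * high.predict(X_new).ravel()

    print(predict_blended(X[:5]), y[:5])
    ```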

  18. Towards the Development of Global Nano-Quantitative Structure–Property Relationship Models: Zeta Potentials of Metal Oxide Nanoparticles

    Directory of Open Access Journals (Sweden)

    Andrey A. Toropov

    2018-04-01

    Zeta potential indirectly reflects the charge of the surface of nanoparticles in solutions and can be used to represent the stability of a colloidal solution. Because the synthesis, testing and evaluation of new nanomaterials are expensive and time-consuming, it would be helpful to estimate an approximate range of properties for untested nanomaterials using computational modeling. We collected the largest dataset of zeta potential measurements of bare metal oxide nanoparticles in water (87 data points). The dataset was used to develop quantitative structure–property relationship (QSPR) models. Essential features of nanoparticles were represented using a modified simplified molecular input line entry system (SMILES). The SMILES strings reflected the size-dependent behavior of zeta potentials, as the considered quasi-SMILES modification included information about both the chemical composition and the size of the nanoparticles. Three mathematical models were generated using the Monte Carlo method, and their statistical quality was evaluated (R² for the training set varied from 0.71 to 0.87; for the validation set, from 0.67 to 0.82; root mean square errors for both training and validation sets ranged from 11.3 to 17.2 mV). The developed models were analyzed and linked to aggregation effects in aqueous solutions.

  19. Vision-based stress estimation model for steel frame structures with rigid links

    Science.gov (United States)

    Park, Hyo Seon; Park, Jun Su; Oh, Byung Kwan

    2017-07-01

    This paper presents a stress estimation model for the safety evaluation of steel frame structures with rigid links using a vision-based monitoring system. In this model, the deformed shape of a structure under external loads is estimated via displacements measured by a motion capture system (MCS), a non-contact displacement measurement device. During the estimation of the deformed shape, the effective lengths of the rigid link ranges in the frame structure are identified. The radius of curvature of the structural member to be monitored is calculated using the estimated deformed shape and is employed to estimate stress. Using the MCS in the presented model, the safety of a structure can be assessed without attaching strain gauges. In addition, because the stress is directly extracted from the radius of curvature obtained from the measured deformed shape, information on the loadings and boundary conditions of the structure is not required. Furthermore, the model, which includes the identification of the effective lengths of the rigid links, can consider the influence of the stiffness of the connections and supports on the deformation in the stress estimation. To verify the applicability of the presented model, static loading tests on a steel frame specimen were conducted. By comparing the stress estimated by the model with the measured stress, the validity of the model was confirmed.
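
    The mechanical core is classical beam theory: given a local radius of curvature ρ recovered from the deformed shape, the extreme-fiber bending stress is σ = E·c/ρ. A minimal sketch that fits a circle through three measured points; material and geometry values are illustrative:

    ```python
    import numpy as np

    # Bending stress from a measured deformed shape: sigma = E * c / rho,
    # where rho is the local radius of curvature. Values are illustrative.
    E = 200e9        # Young's modulus of steel, Pa
    c = 0.15         # distance from neutral axis to extreme fiber, m

    def radius_from_three_points(p1, p2, p3):
        """Radius of the circle through three (x, y) points on the deformed member."""
        a = np.linalg.norm(p2 - p1)
        b = np.linalg.norm(p3 - p2)
        c_len = np.linalg.norm(p3 - p1)
        area = 0.5 * abs((p2[0] - p1[0]) * (p3[1] - p1[1])
                         - (p2[1] - p1[1]) * (p3[0] - p1[0]))  # triangle area
        return a * b * c_len / (4.0 * area)

    # Three hypothetical points from motion-capture displacements (meters).
    p1, p2, p3 = np.array([0.0, 0.0]), np.array([1.0, 0.004]), np.array([2.0, 0.0])
    rho = radius_from_three_points(p1, p2, p3)
    print(rho, E * c / rho / 1e6)                      # radius (m), stress (MPa)
    ```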

  20. Quantitative chemogenomics: machine-learning models of protein-ligand interaction.

    Science.gov (United States)

    Andersson, Claes R; Gustafsson, Mats G; Strömbergsson, Helena

    2011-01-01

    Chemogenomics is an emerging interdisciplinary field that lies at the interface of biology, chemistry, and informatics. Most currently used drugs are small molecules that interact with proteins, so understanding protein-ligand interaction is central to drug discovery and design. In the subfield of chemogenomics known as proteochemometrics, protein-ligand-interaction models are induced from data matrices that consist of both protein and ligand information along with some experimentally measured variable. The two general aims of this quantitative multi-structure-property-relationship (QMSPR) modeling approach are to exploit sparse/incomplete information sources and to obtain more general models covering larger parts of the protein-ligand space than traditional approaches that focus mainly on specific targets or ligands. The data matrices, usually obtained from multiple sparse/incomplete sources, typically contain series of proteins and ligands together with quantitative information about their interactions. A useful model should ideally be easy to interpret and generalize well to new, unseen protein-ligand combinations. Resolving this requires sophisticated machine-learning methods for model induction, combined with adequate validation. This review is intended to provide a guide to methods and data sources suitable for this kind of protein-ligand-interaction modeling. An overview of the modeling process is presented, including data collection, protein and ligand descriptor computation, data preprocessing, machine-learning-model induction and validation. Concerns and issues specific to each step in this kind of data-driven modeling are discussed.
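
    The characteristic data arrangement in proteochemometrics is a matrix whose rows concatenate protein and ligand descriptors for each measured interaction. A minimal sketch of inducing and validating such a model with a random forest, using synthetic descriptors as stand-ins:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # Proteochemometric layout: each row = protein descriptors + ligand descriptors;
    # target = measured interaction strength (e.g., pKi). All data here is synthetic.
    rng = np.random.default_rng(3)
    n_pairs = 400
    prot = rng.normal(size=(n_pairs, 30))     # stand-in protein descriptors
    lig = rng.normal(size=(n_pairs, 20))      # stand-in ligand descriptors
    X = np.hstack([prot, lig])
    y = prot[:, 0] * lig[:, 0] + rng.normal(0, 0.3, n_pairs)  # cross-term interaction

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")  # validation step
    print(scores.mean())
    ```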

  1. Quantitative Analysis of Intra Urban Growth Modeling using socio economic agents by combining cellular automata model with agent based model

    Science.gov (United States)

    Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.

    2017-12-01

    Recent studies indicate that there is significant improvement in modeling urban land use dynamics at finer spatial resolutions. Geo-computational models such as cellular automata and agent-based models have provided clear evidence regarding the quantification of urban growth patterns within an urban boundary. In recent studies, socio-economic factors such as demography, education rate, household density, parcel price of the current year, and distance to roads, schools, hospitals, commercial centers and police stations are considered the major factors influencing the land use/land cover (LULC) pattern of a city. These factors have a unidirectional relationship to the land use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with an agent-based model to evaluate the impact of socio-economic factors on the land use pattern. For this purpose, Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from field surveys, the Census of India, and the Directorate of Economic Census, Uttarakhand, India. A 3x3 simulating window is used to consider the impact on LULC. The cellular automata model results are examined for the identification of hot spot areas within the urban area, and the agent-based model uses a logistic regression approach to identify the correlation between each factor and LULC and to classify the available area into low-density, medium-density, high-density residential or commercial areas. In the modeling phase, transition rules, neighborhood effects and cell change factors are used to improve the representation of built-up classes. Significant improvement is observed in the built-up classes, from 84% to 89%. After incorporating the agent-based model with the cellular automata model, the accuracy improved further, from 89% to 94%, in three urban classes, i.e., low density, medium density and commercial.
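
    A minimal sketch of the cellular-automata half of such a coupling: a 3x3 neighborhood rule that converts a cell to built-up when neighborhood density and a suitability score (which the agent-based layer could supply) pass thresholds. All thresholds are invented:

    ```python
    import numpy as np

    # One cellular-automata step on a binary built-up grid with a 3x3 window.
    # 'suitability' stands in for agent-based socio-economic scores; thresholds invented.
    rng = np.random.default_rng(4)
    grid = (rng.random((50, 50)) < 0.2).astype(int)   # 1 = built-up
    suitability = rng.random((50, 50))                # e.g., from logistic regression

    def step(grid, suitability, density_thr=3, suit_thr=0.6):
        padded = np.pad(grid, 1)
        # Count built-up neighbors in the 3x3 window (excluding the cell itself).
        nbrs = sum(np.roll(np.roll(padded, i, 0), j, 1)
                   for i in (-1, 0, 1) for j in (-1, 0, 1))[1:-1, 1:-1] - grid
        grow = (grid == 0) & (nbrs >= density_thr) & (suitability > suit_thr)
        return np.where(grow, 1, grid)

    for _ in range(5):
        grid = step(grid, suitability)
    print(grid.sum())   # built-up cells after five iterations
    ```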

  2. A quantitative analysis of instabilities in the linear chiral sigma model

    International Nuclear Information System (INIS)

    Nemes, M.C.; Nielsen, M.; Oliveira, M.M. de; Providencia, J. da

    1990-08-01

    We present a method to construct a complete set of stationary states corresponding to small amplitude motion which naturally includes the continuum solution. The energy weighted sum rule (EWSR) is shown to provide a quantitative criterion on the importance of instabilities which are known to occur in nonasymptotically free theories. Our results for the linear σ model should be valid for a large class of models. A unified description of baryon and meson properties in terms of the linear σ model is also given. (author)

  3. Polymorphic ethyl alcohol as a model system for the quantitative study of glassy behaviour

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H E; Schober, H; Gonzalez, M A [Institut Max von Laue - Paul Langevin (ILL), 38 - Grenoble (France); Bermejo, F J; Fayos, R; Dawidowski, J [Consejo Superior de Investigaciones Cientificas, Madrid (Spain); Ramos, M A; Vieira, S [Universidad Autonoma de Madrid (Spain)

    1997-04-01

    The nearly universal transport and dynamical properties of amorphous materials or glasses are investigated. Reasonably successful phenomenological models have been developed to account for these properties as well as the behaviour near the glass transition, but quantitative microscopic models have had limited success. One hindrance to these investigations has been the lack of a material which exhibits glass-like properties in more than one phase at a given temperature. This report presents results of neutron-scattering experiments for one such material, ordinary ethyl alcohol, which promises to be a model system for future investigations of glassy behaviour. (author). 8 refs.

  4. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model having multiple stages, which contains the information flow. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload.
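
    Quantifications of this kind rest on Shannon's entropy and mutual information; the sketch below shows that type of calculation for a hypothetical signal set and operator response channel. It illustrates the general information-theoretic step, not Conant's specific model:

    ```python
    import numpy as np

    # Shannon entropy H = -sum(p * log2 p): uncertainty of incoming signals (bits).
    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    signals = np.array([0.5, 0.25, 0.125, 0.125])   # hypothetical alarm probabilities
    print(entropy(signals))                         # 1.75 bits per signal

    # Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y): information actually
    # transmitted through the operator, given a hypothetical joint distribution.
    joint = np.array([[0.40, 0.10],
                      [0.05, 0.45]])
    I = entropy(joint.sum(1)) + entropy(joint.sum(0)) - entropy(joint.ravel())
    print(I)                                        # bits processed per event
    ```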

  5. Parts of the Whole: Strategies for the Spread of Quantitative Literacy: What Models Can Tell Us

    Directory of Open Access Journals (Sweden)

    Dorothy Wallace

    2014-07-01

    Two conceptual frameworks, one from graph theory and one from dynamical systems, have been offered as explanations for complex phenomena in biology and also as possible models for the spread of ideas. The two models are based on different assumptions and thus predict quite different outcomes for the fate of either biological species or ideas. We argue that, depending on the culture in which they exist, one can identify which model is more likely to reflect the survival of two competing ideas. Based on this argument we suggest how two strategies for embedding and normalizing quantitative literacy in a given institution are likely to succeed or fail.

  6. A prosthesis-specific multi-link segment model of lower-limb amputee sprinting.

    Science.gov (United States)

    Rigney, Stacey M; Simmons, Anne; Kark, Lauren

    2016-10-03

    Lower-limb amputees commonly utilize non-articulating energy storage and return (ESAR) prostheses for high impact activities such as sprinting. Despite these prostheses lacking an articulating ankle joint, amputee gait analysis conventionally features a two-link segment model of the prosthetic foot. This paper investigated the effects of the selected link segment model's marker-set and geometry on a unilateral amputee sprinter's calculated lower-limb kinematics, kinetics and energetics. A total of five lower-limb models of the Ottobock® 1E90 Sprinter were developed, including two conventional shank-foot models that each used a different version of the Plug-in-Gait (PiG) marker-set to test the effect of prosthesis ankle marker location. Two Hybrid prosthesis-specific models were then developed, also using the PiG marker-sets, with the anatomical shank and foot replaced by prosthesis-specific geometry separated into two segments. Finally, a Multi-link segment (MLS) model was developed, consisting of six segments for the prosthesis as defined by a custom marker-set. All full-body musculoskeletal models were tested using four trials of experimental marker trajectories within OpenSim 3.2 (Stanford, California, USA) to find the affected and unaffected hip, knee and ankle kinematics, kinetics and energetics. The geometry of the selected lower-limb prosthesis model was found to significantly affect all variables on the affected leg (p < 0.05). Excluding prosthesis-specific spatial, inertial and elastic properties from full-body models significantly affects the calculated amputee gait characteristics, and we therefore recommend the implementation of a MLS model.

  7. Input Harmonic Analysis on the Slim DC-Link Drive Using Harmonic State Space Model

    DEFF Research Database (Denmark)

    Yang, Feng; Kwon, Jun Bum; Wang, Xiongfei

    2017-01-01

    The slim dc-link adjustable speed drive has shown good harmonic performance in some studies but poor performance in others. The contradiction indicates that a feasible theoretical analysis to characterize the harmonic distortion of the slim dc-link drive is still lacking. Considering this, a harmonic state space model is applied that captures the variation according to the switching instant, the harmonics at the steady-state condition, as well as the coupling between the multiple harmonic impedances. By using this model, the impact of the film capacitor and the grid inductance on the harmonic performance is derived. Simulation and experimental results of the slim dc-link drive, loaded up to 2.0 kW, are presented to validate the theoretical analysis.

  8. A quantitative and dynamic model of the Arabidopsis flowering time gene regulatory network.

    Directory of Open Access Journals (Sweden)

    Felipe Leal Valentim

    Various environmental signals integrate into a network of floral regulatory genes leading to the final decision on when to flower. Although a wealth of qualitative knowledge is available on how flowering time genes regulate each other, only a few studies have incorporated this knowledge into predictive models. Such models are invaluable as they enable us to investigate how various types of inputs are combined to give a quantitative readout. To investigate the effect of gene expression disturbances on flowering time, we developed a dynamic model for the regulation of flowering time in Arabidopsis thaliana. Model parameters were estimated based on expression time-courses for relevant genes, and a consistent set of flowering times for plants of various genetic backgrounds. Validation was performed by predicting changes in expression level in mutant backgrounds and comparing these predictions with independent expression data, and by comparison of predicted and experimental flowering times for several double mutants. Remarkably, the model predicts that a disturbance in a particular gene does not necessarily have the largest impact on directly connected genes. For example, the model predicts that a SUPPRESSOR OF OVEREXPRESSION OF CONSTANS (SOC1) mutation has a larger impact on APETALA1 (AP1), which is not directly regulated by SOC1, compared to its effect on LEAFY (LFY), which is under direct control of SOC1. This was confirmed by expression data. Another model prediction involves the importance of cooperativity in the regulation of AP1 by LFY, a prediction supported by experimental evidence. Concluding, our model for flowering time gene regulation enables us to address how different quantitative inputs are combined into one quantitative output, flowering time.
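
    Cooperativity of the kind invoked for the regulation of AP1 by LFY is commonly encoded as a Hill term inside an ODE. A toy two-gene sketch, not the paper's full network; all rate constants are invented:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy two-gene model: LFY activates AP1 through a Hill term; n > 1 = cooperativity.
    # All parameters are invented for illustration.
    k_lfy, d_lfy = 1.0, 0.2                    # LFY production/degradation rates
    v_max, K, n, d_ap1 = 2.0, 1.5, 4.0, 0.3    # AP1 activation and decay

    def rhs(t, y):
        lfy, ap1 = y
        dlfy = k_lfy - d_lfy * lfy
        dap1 = v_max * lfy**n / (K**n + lfy**n) - d_ap1 * ap1
        return [dlfy, dap1]

    sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.0], dense_output=True)
    print(sol.y[:, -1])                        # near-steady-state LFY and AP1 levels

    # A 'soc1-like' disturbance could be mimicked by lowering k_lfy and comparing
    # the downstream AP1 response, which is what such model predictions hinge on.
    ```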

  9. Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.

    Science.gov (United States)

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This peculiarity is related to the inhibition of extracellular enzymes that occurs through the complexation of peptides by tannins. Not only the nature of these interactions but, more fundamentally, the structures of these heterogeneous polyphenolic molecules are not completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins on the basis of tannin model compounds, employing in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. The ³¹P NMR analysis of ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundations for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.

  10. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China*

    Science.gov (United States)

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-01-01

    Ecological compensation is becoming one of the key multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value so as to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile the conflicts among benefits in the economic, social, and ecological sectors. PMID:19353749

  11. Refining the statistical model for quantitative immunostaining of surface-functionalized nanoparticles by AFM.

    Science.gov (United States)

    MacCuspie, Robert I; Gorka, Danielle E

    2013-10-01

    Recently, an atomic force microscopy (AFM)-based approach for quantifying the number of biological molecules conjugated to a nanoparticle surface at low number densities was reported. The number of target molecules conjugated to the analyte nanoparticle can be determined with single nanoparticle fidelity using antibody-mediated self-assembly to decorate the analyte nanoparticles with probe nanoparticles (i.e., quantitative immunostaining). This work refines the statistical models used to quantitatively interpret the observations when AFM is used to image the resulting structures. The refinements add terms to the previous statistical models to account for the physical sizes of the analyte nanoparticles, conjugated molecules, antibodies, and probe nanoparticles. Thus, a more physically realistic statistical computation can be implemented for a given sample of known qualitative composition, using the software scripts provided. Example AFM data sets, using horseradish peroxidase conjugated to gold nanoparticles, are presented to illustrate how to implement this method successfully.

  12. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    Science.gov (United States)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

    Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten the health of consumers, so fast and sensitive techniques for detecting these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used here for the quantitative determination of AO in combination with an improved partial least-squares regression (PLSR) model. Absorbance spectra of herbal samples with different concentrations were obtained by THz-TDS in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for input interval selection, and improved the accuracy of the detection results. The experimental results indicate that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
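
    As a hedged illustration of the calibration step only, the sketch below fits a PLS regression from absorbance spectra to concentration and scores it by cross-validation. The synthetic Gaussian band stands in for real THz-TDS data, and the paper's 2DCOS-based interval selection is not reproduced.

```python
# Synthetic THz-like calibration: spectra scale with concentration plus noise;
# a PLS regression is scored by 5-fold cross-validation (RMSECV).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
freqs = np.linspace(0.2, 1.6, 200)                 # THz axis
conc = rng.uniform(0, 10, size=40)                 # AO concentration (arbitrary units)
band = np.exp(-((freqs - 0.9) / 0.1) ** 2)         # one assumed absorption feature
X = conc[:, None] * band + 0.05 * rng.standard_normal((40, 200))

pls = PLSRegression(n_components=3)
y_cv = cross_val_predict(pls, X, conc, cv=5).ravel()
print(f"RMSECV = {np.sqrt(np.mean((conc - y_cv) ** 2)):.3f}")
```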

  13. Analytical model and figures of merit for filtered Microwave Photonic Links.

    Science.gov (United States)

    Gasulla, Ivana; Capmany, José

    2011-09-26

    The concept of filtered Microwave Photonic Links is proposed in order to provide the most general and versatile description of complex analog photonic systems. We develop a field propagation model in which a global optical filter, characterized by its optical transfer function, embraces all the intermediate optical components in a linear link. We assume a non-monochromatic light source with an arbitrary spectral distribution of finite linewidth, and consider both intensity modulation and phase modulation with balanced and single detection. Expressions leading to the computation of the main figures of merit concerning link gain, noise and intermodulation distortion are provided which, to our knowledge, are not available in the literature. The usefulness of this derivation resides in its ability to directly provide performance results for complex links, simply by substituting the numerical or measured optical transfer function characterizing the link into the overall closed-form formulas. This theory is thus presented as a potential tool for a wide range of relevant microwave photonic applications, and it is extendable to multiport radio-over-fiber systems. © 2011 Optical Society of America

  14. Moving contact lines: linking molecular dynamics and continuum-scale modelling.

    Science.gov (United States)

    Smith, Edward R; Theodorakis, Panagiotis E; Craster, Richard V; Matar, Omar K

    2018-05-04

    Despite decades of research, the modelling of moving contact lines has remained a formidable challenge in fluid dynamics whose resolution will impact numerous industrial, biological, and daily-life applications. On the one hand, molecular dynamics (MD) simulation has the ability to provide unique insight into the microscopic details that determine the dynamic behavior of the contact line, which is not possible with either continuum-scale simulations or experiments. On the other hand, continuum-based models provide the link to the macroscopic description of the system. In this Feature Article, we explore the complex range of physical factors, including the presence of surfactants, which govern the contact line motion through MD simulations. We also discuss links between continuum- and molecular-scale modelling, and highlight the opportunities for future developments in this area.

  15. A proposed model of psychodynamic psychotherapy linked to Erik Erikson's eight stages of psychosocial development.

    Science.gov (United States)

    Knight, Zelda Gillian

    2017-09-01

    Just as Freud used stages of psychosexual development to ground his model of psychoanalysis, it is possible to do the same with Erik Erikson's stages of development with regard to a model of psychodynamic psychotherapy. This paper proposes an eight-stage model of psychodynamic psychotherapy linked to Erik Erikson's eight stages of psychosocial development. Various suggestions are offered. One such suggestion is that, whereas each of Erikson's developmental stages is triggered by a crisis, in therapy each stage is triggered by the client's search. The resolution of the search often leads to the development of another search, which implies that the therapy process comprises a series of searches. This idea of a series of searches and resolutions leads to the understanding that identity is developmental and that therapy is a space in which a new sense of identity may emerge. The notion of hope is linked to Erikson's stage of Basic Trust, and the proposed model of therapy views hope and trust as essential for the therapy process. Two clinical vignettes are offered to illustrate these ideas. Psychotherapy can be approached as an eight-stage process linked to Erikson's eight-stage model of development. Psychotherapy may be viewed as a series of searches and thus as a developmental stage-resolution process, which leads to the understanding that identity is ongoing throughout the life span. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Integration of CFD codes and advanced combustion models for quantitative burnout determination

    Energy Technology Data Exchange (ETDEWEB)

    Javier Pallares; Inmaculada Arauzo; Alan Williams [University of Zaragoza, Zaragoza (Spain). Centre of Research for Energy Resources and Consumption (CIRCE)

    2007-10-15

    CFD codes and advanced kinetics combustion models are extensively used to predict coal burnout in large utility boilers. Modelling approaches based on CFD codes can accurately solve the fluid dynamics equations involved in the problem, but this is usually achieved by including simple combustion models. On the other hand, advanced kinetics combustion models can give a detailed description of coal combustion behaviour by using a simplified description of the flow field, usually obtained from a zone-method approach. Both approaches correctly describe general trends in coal burnout but fail to predict quantitative values. In this paper a new methodology which takes advantage of both approaches is described. First, CFD solutions were obtained for the combustion conditions in the furnace of the Lamarmora power plant (ASM Brescia, Italy) for a number of different conditions and for three coals. These furnace conditions were then used as inputs for a more detailed chemical combustion model to predict coal burnout. In this model, devolatilization was modelled using a commercial macromolecular network pyrolysis model (FG-DVC). For char oxidation, an intrinsic reactivity approach including thermal annealing, ash inhibition and maceral effects was used. Results from the simulations were compared against plant experimental values, showing a reasonable agreement in trends and quantitative values. 28 refs., 4 figs., 4 tabs.

  17. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    Science.gov (United States)

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome, as were pathological EEG patterns such as generalized periodic discharges. Model-based EEG analysis (state space analysis) thus provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
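
    One way to picture a "state space velocity" measure is as the speed at which the normalized power spectrum moves between consecutive epochs. The sketch below computes such a quantity for a synthetic signal; the exact state-space formulation of the paper may differ, and this velocity definition is an assumption made for illustration.

```python
# Synthetic EEG whose dominant frequency drifts; "velocity" here is the
# Euclidean step between consecutive normalized spectrogram columns, per second.
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(0)
fs = 250.0
t = np.arange(0, 60, 1 / fs)
eeg = np.sin(2 * np.pi * (8 + 2 * np.sin(0.1 * t)) * t) + 0.5 * rng.standard_normal(t.size)

f, epochs, S = spectrogram(eeg, fs=fs, nperseg=512, noverlap=256)
S = S / S.sum(axis=0, keepdims=True)                # normalize each epoch's spectrum
steps = np.linalg.norm(np.diff(S, axis=1), axis=0)  # spectral distance per epoch step
print(f"mean spectral velocity: {steps.mean() / np.diff(epochs).mean():.4f} 1/s")
```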

  18. Variable selection in near infrared spectroscopy for quantitative models of homologous analogs of cephalosporins

    Directory of Open Access Journals (Sweden)

    Yan-Chun Feng

    2014-07-01

    Two universal spectral ranges (4550–4100 cm-1 and 6190–5510 cm-1) for the construction of quantitative models of homologous analogs of cephalosporins were proposed by evaluating the performance of five spectral ranges and their combinations, using three data sets of cephalosporins for injection, i.e., cefuroxime sodium, ceftriaxone sodium and cefoperazone sodium. Subsequently, the proposed ranges were validated using eight calibration sets of other homologous analogs of cephalosporins for injection, namely cefmenoxime hydrochloride, ceftezole sodium, cefmetazole, cefoxitin sodium, cefotaxime sodium, cefradine, cephazolin sodium and ceftizoxime sodium. All the quantitative models constructed for the eight kinds of cephalosporins using these universal ranges could fulfill the requirements for quick quantification. The competitive adaptive reweighted sampling (CARS) algorithm and infrared (IR)–near infrared (NIR) two-dimensional (2D) correlation spectral analysis were then used to determine the scientific basis of these two spectral ranges as the universal regions for the construction of quantitative models of cephalosporins. The CARS algorithm demonstrated that the ranges of 4550–4100 cm-1 and 6190–5510 cm-1 include key wavenumbers which can be attributed to content changes of cephalosporins. The IR–NIR 2D spectral analysis showed that certain wavenumbers in these two regions correlate strongly with the structures of those cephalosporins that are easy to degrade.

  19. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    Science.gov (United States)

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  20. Qualitative and quantitative guidelines for the comparison of environmental model predictions

    International Nuclear Information System (INIS)

    Scott, M.

    1995-03-01

    The question of how to assess or compare predictions from a number of models is of concern in the validation of models, in understanding the effects of different models and model parameterizations on model output, and ultimately in assessing model reliability. Comparison of model predictions with observed data is the basic tool of model validation, while comparison of predictions amongst different models provides one measure of model credibility. The guidance provided here is intended to offer qualitative and quantitative approaches (including graphical and statistical techniques) to such comparisons for use within the BIOMOVS II project; it is hoped that others may find it useful. It contains little technical information on the actual methods, but several references are provided for the interested reader. The guidelines are illustrated on data from the VAMP CB scenario. Unfortunately, these data do not permit all of the possible approaches to be demonstrated, since predicted uncertainties were not provided. The questions considered are concerned with (a) intercomparison of model predictions and (b) comparison of model predictions with the observed data. A series of examples illustrating some of the different types of data structure and some possible analyses have been constructed. A bibliography of references on model validation is provided. It is important to note that the results of the various techniques discussed here, whether qualitative or quantitative, should not be considered in isolation. Overall model performance must also include an evaluation of model structure and formulation, i.e. conceptual model uncertainties, and results for performance measures must be interpreted in this context. Consider a number of models which are used to provide predictions of a number of quantities at a number of time points: in the case of the VAMP CB scenario, the results include predictions of total deposition of Cs-137 and time-dependent concentrations in various environmental media.
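
    A minimal sketch of the quantitative side of such comparisons: scoring several models' predictions against one observed series with bias, RMSE and correlation. The numbers below are invented placeholders, not VAMP CB data.

```python
# Compare predictions from two hypothetical models against observations
# using three simple performance measures.
import numpy as np

obs = np.array([12.0, 9.5, 7.1, 5.2, 3.9])         # observed values (placeholder)
preds = {"model A": np.array([11.0, 9.9, 7.8, 5.0, 4.2]),
         "model B": np.array([14.5, 11.0, 9.0, 7.5, 6.0])}

for name, p in preds.items():
    bias = np.mean(p - obs)
    rmse = np.sqrt(np.mean((p - obs) ** 2))
    r = np.corrcoef(p, obs)[0, 1]
    print(f"{name}: bias={bias:+.2f}, RMSE={rmse:.2f}, r={r:.3f}")
```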

  1. A linked hydrodynamic and water quality model for the Salton Sea

    Science.gov (United States)

    Chung, E.G.; Schladow, S.G.; Perez-Losada, J.; Robertson, Dale M.

    2008-01-01

    A linked hydrodynamic and water quality model was developed and applied to the Salton Sea. The hydrodynamic component is based on the one-dimensional numerical model, DLM. The water quality model is based on a new conceptual model for nutrient cycling in the Sea, and simulates temperature, total suspended sediment concentration, nutrient concentrations, including PO4³⁻, NO3⁻ and NH4⁺, DO concentration and chlorophyll a concentration as functions of depth and time. Existing water temperature data from 1997 were used to verify that the model could accurately represent the onset and breakup of thermal stratification. 1999 is the only year with a near-complete dataset for water quality variables for the Salton Sea. The linked hydrodynamic and water quality model was run for 1999, and by adjustment of rate coefficients and other water quality parameters, a good match with the data was obtained. In this article, the model is fully described and the model results for reductions in external phosphorus load on chlorophyll a distribution are presented. © 2008 Springer Science+Business Media B.V.

  2. A quantitative dynamic systems model of health-related quality of life among older adults

    Science.gov (United States)

    Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela

    2015-01-01

    Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722

  3. Fixing the cracks in the crystal ball: A maturity model for quantitative risk assessment

    International Nuclear Information System (INIS)

    Rae, Andrew; Alexander, Rob; McDermid, John

    2014-01-01

    Quantitative risk assessment (QRA) is widely practiced in system safety, but there is insufficient evidence that QRA in general is fit for purpose. Defenders of QRA draw a distinction between poor or misused QRA and correct, appropriately used QRA, but this distinction is only useful if we have robust ways to identify the flaws in an individual QRA. In this paper we present a comprehensive maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature and in a collection of risk assessment peer reviews. We provide initial validation of the completeness and realism of the model. Our risk assessment maturity model provides a way to prioritise both process development within an organisation and empirical research within the QRA community. - Highlights: • Quantitative risk assessment (QRA) is widely practiced, but there is insufficient evidence that it is fit for purpose. • A given QRA may be good, or it may not – we need systematic ways to distinguish this. • We have created a maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature. • We have provided initial validation of the completeness and realism of the model. • The maturity model can also be used to prioritise QRA research discipline-wide

  4. Towards Controlling the Glycoform: A Model Framework Linking Extracellular Metabolites to Antibody Glycosylation

    Directory of Open Access Journals (Sweden)

    Philip M. Jedrzejewski

    2014-03-01

    Glycoproteins represent the largest group of the growing number of biologically-derived medicines. The associated glycan structures and their distribution are known to have a large impact on pharmacokinetics. A modelling framework was developed to provide a link from the extracellular environment and its effect on intracellular metabolites to the distribution of glycans on the constant region of an antibody product. The main focus of this work is the mechanistic in silico reconstruction of the nucleotide sugar donor (NSD) metabolic network by means of 34 species mass balances and the saturation kinetics rates of the 60 metabolic reactions involved. NSDs are the co-substrates of the glycosylation process in the Golgi apparatus, and their simulated dynamic intracellular concentration profiles were linked to an existing model describing the distribution of N-linked glycan structures on the antibody constant region. The modelling framework also describes the growth dynamics of the cell population by means of modified Monod kinetics. Simulation results match well with experimental data from a murine hybridoma cell line. The result is a modelling platform which is able to describe the product glycoform based on extracellular conditions. It represents a first step towards the in silico prediction of the glycoform of a biotherapeutic and provides a platform for the optimisation of bioprocess conditions with respect to product quality.

  5. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    Science.gov (United States)

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, E max), quantitative DCE-MRI parameters (volume transfer constant, K trans; interstitial volume, V e; and efflux rate constant, K ep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and the other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the E max values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The K trans values decreased significantly compared with those of the control group from week 3 onward. Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter K trans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
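
    For readers unfamiliar with how K trans, V e and K ep are obtained, the sketch below fits the standard Tofts model to a synthetic tissue curve, with K ep = K trans / V e. The arterial input function, noise level and "true" parameters are assumptions made for illustration.

```python
# Standard Tofts model: Ct(t) = Ktrans * (Ca convolved with exp(-Kep*t)),
# fitted by nonlinear least squares on a noisy synthetic curve.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 5, 120)                         # minutes
Ca = 5.0 * t * np.exp(-t / 0.8)                    # assumed arterial input function

def tofts(t, ktrans, ve):
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * (t[1] - t[0]) * np.convolve(Ca, kernel)[: t.size]

rng = np.random.default_rng(0)
ct = tofts(t, 0.25, 0.35) + 0.01 * rng.standard_normal(t.size)
(ktrans, ve), _ = curve_fit(tofts, t, ct, p0=(0.1, 0.2), bounds=(1e-3, 2.0))
print(f"Ktrans={ktrans:.3f}/min  Ve={ve:.3f}  Kep={ktrans / ve:.3f}/min")
```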

  6. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    Science.gov (United States)

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method

  7. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
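
    The core computation that such parallelization accelerates can be sketched as Monte Carlo estimation of a reachability probability with a confidence interval, distributed over worker processes. The toy Markov chain below is a stand-in for a Maude-specified probabilistic rewrite theory, and the batching scheme is an assumption for illustration.

```python
# Estimate P(reach state 2 within 100 steps) for a toy DTMC by sampling
# trajectories in parallel batches, then report a 95% confidence interval.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

P = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.0, 0.0, 1.0]]          # state 2 is absorbing

def run_batch(args):
    seed, n = args
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n):
        state, steps = 0, 0
        while state != 2 and steps < 100:
            state = rng.choice(3, p=P[state])
            steps += 1
        hits += (state == 2)
    return hits

if __name__ == "__main__":
    batches = [(seed, 2500) for seed in range(4)]   # four parallel workers
    with ProcessPoolExecutor() as pool:
        hits = sum(pool.map(run_batch, batches))
    n = 4 * 2500
    p = hits / n
    print(f"P = {p:.3f} +/- {1.96 * (p * (1 - p) / n) ** 0.5:.3f}")
```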

  8. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

    Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy of conventional conjugate view methods is limited by the overlap of projections from different organs and background activity, and by attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image-degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates from the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and from a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millenium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of In-111 ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. From the results of the MCS experiment, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method with accurate modelling of the physical factors increased the quantitative accuracy at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. We conclude that the QPlanar method, based on ML estimation with accurate modelling of the image-degrading effects, combines quantitative accuracy approaching that of QSPECT with the speed and simplicity of planar imaging.
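
    A hedged sketch of the QPlanar idea follows: maximum-likelihood estimation of organ activities from planar projection counts through a system matrix that models the image-degrading effects. The 3-organ system matrix is a made-up stand-in, and the update shown is the classic multiplicative MLEM step for Poisson data rather than the authors' exact implementation.

```python
# MLEM-style ML estimation of organ activities a from Poisson counts y ~ P a.
import numpy as np

P = np.array([[0.80, 0.30, 0.10],   # rows: projection bins, cols: organ VOIs
              [0.20, 0.60, 0.20],
              [0.10, 0.20, 0.70],
              [0.05, 0.10, 0.30]])
true_a = np.array([100.0, 50.0, 80.0])
rng = np.random.default_rng(0)
counts = rng.poisson(P @ true_a)

a = np.ones(3)                       # initial activity estimates
for _ in range(200):                 # a <- a * P^T(y / Pa) / P^T 1
    a *= (P.T @ (counts / (P @ a))) / P.T.sum(axis=1)
print("estimated organ activities:", np.round(a, 1))
```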

  9. Interleukin-1 may link helplessness-hopelessness with cancer progression: a proposed model.

    Science.gov (United States)

    Argaman, Miriam; Gidron, Yori; Ariad, Shmuel

    2005-01-01

    A model of the relations between psychological factors and cancer progression should include brain and systemic components and their link with critical cellular stages in cancer progression. We present a psychoneuroimmunological (PNI) model that links helplessness-hopelessness (HH) with cancer progression via interleukin-1beta (IL-1beta). IL-1beta was elevated in the brain following exposure to inescapable shock, and HH was minimized by antagonizing cerebral IL-1beta. Elevated cerebral IL-1beta increased cancer metastasis in animals. Inescapable shock was associated with systemic elevations of IL-1beta and peripheral IL-1beta was associated with escape from apoptosis, angiogenesis, and metastasis. Involvement of the sympathetic nervous system and the hypothalamic-pituitary-adrenal axis are discussed. Future studies need to identify the role of additional factors in this PNI pathway.

  10. Simulink models for performance analysis of high speed DQPSK modulated optical link

    International Nuclear Information System (INIS)

    Sharan, Lucky; Rupanshi; Chaubey, V. K.

    2016-01-01

    This paper presents the design approach for the development of simulation models to study and analyze the transmission of a 10 Gbps DQPSK signal over a single-channel peer-to-peer link using Matlab Simulink. The simulation model considers the different optical components used in link design, with their behavior initially represented by theoretical interpretation, including the transmitter topology, the Mach-Zehnder Modulator (MZM) module and the propagation model for optical fibers, thus allowing scope for direct realization in experimental configurations. It provides the flexibility to incorporate the various photonic components as either user-defined or fixed, and these can also be enhanced or removed from the model as per the design requirements. We describe the detailed operation and need of every component model and its representation in Simulink blocksets. Moreover, the developed model can be extended in future to support Dense Wavelength Division Multiplexing (DWDM) systems, thereby allowing high-speed transmission with N × 40 Gbps systems. The various compensation techniques and their influence on system performance can be easily investigated using such models.

  11. Simulink models for performance analysis of high speed DQPSK modulated optical link

    Energy Technology Data Exchange (ETDEWEB)

    Sharan, Lucky, E-mail: luckysharan@pilani.bits-pilani.ac.in; Rupanshi, E-mail: f2011222@pilani.bits-pilani.ac.in; Chaubey, V. K., E-mail: vkc@pilani.bits-pilani.ac.in [EEE Department, BITS-Pilani, Rajasthan, 333031 (India)]

    2016-03-09

    This paper presents the design approach for the development of simulation models to study and analyze the transmission of a 10 Gbps DQPSK signal over a single-channel peer-to-peer link using Matlab Simulink. The simulation model considers the different optical components used in link design, with their behavior initially represented by theoretical interpretation, including the transmitter topology, the Mach-Zehnder Modulator (MZM) module and the propagation model for optical fibers, thus allowing scope for direct realization in experimental configurations. It provides the flexibility to incorporate the various photonic components as either user-defined or fixed, and these can also be enhanced or removed from the model as per the design requirements. We describe the detailed operation and need of every component model and its representation in Simulink blocksets. Moreover, the developed model can be extended in future to support Dense Wavelength Division Multiplexing (DWDM) systems, thereby allowing high-speed transmission with N × 40 Gbps systems. The various compensation techniques and their influence on system performance can be easily investigated using such models.

  12. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrate that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence or inaccessibility of reference materials.
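
    The reference-free scheme can be illustrated as follows: unmix overlapping component spectra from a set of mixture spectra with ICA, recovering source spectra and mixing coefficients proportional to concentrations. The Gaussian "spectra" below are synthetic assumptions, and the final scaling against a calibration mixture is omitted.

```python
# Eight synthetic mixtures of two overlapping Gaussian "spectra"; FastICA
# recovers the source spectra, and the mixing-matrix rows are proportional
# to the unknown concentrations (up to ICA's scale/sign ambiguity).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
wl = np.linspace(250, 450, 300)                    # wavelength axis, nm
s1 = np.exp(-((wl - 300) / 15) ** 2)
s2 = np.exp(-((wl - 330) / 25) ** 2)               # overlaps s1
C = rng.uniform(0.2, 1.0, size=(8, 2))             # true concentrations
X = C @ np.vstack([s1, s2]) + 1e-3 * rng.standard_normal((8, wl.size))

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X.T)                   # (wavelengths, 2) spectra
A = ica.mixing_                                    # (8, 2) ~ concentrations
print("recovered mixing matrix (up to scale/sign):\n", np.round(A, 3))
```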

  13. Accuracy test for link prediction in terms of similarity index: The case of WS and BA models

    Science.gov (United States)

    Ahn, Min-Woo; Jung, Woo-Sung

    2015-07-01

    Link prediction is a technique that uses the topological information in a given network to infer the missing links in it. Since past research on link prediction has primarily focused on enhancing performance for given empirical systems, negligible attention has been devoted to link prediction with regard to network models. In this paper, we thus apply link prediction to two network models: The Watts-Strogatz (WS) model and Barabási-Albert (BA) model. We attempt to gain a better understanding of the relation between accuracy and each network parameter (mean degree, the number of nodes and the rewiring probability in the WS model) through network models. Six similarity indices are used, with precision and area under the ROC curve (AUC) value as the accuracy metrics. We observe a positive correlation between mean degree and accuracy, and size independence of the AUC value.
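
    The experiment's skeleton is straightforward to reproduce: hide a fraction of edges in a WS (or BA) graph, score the absent pairs with a similarity index, and measure AUC. The sketch below uses the Jaccard coefficient as the index; the network size, mean degree and probe fraction are arbitrary choices, not the paper's settings.

```python
# Hide 10% of edges in a WS graph, score all absent pairs with the Jaccard
# index, and estimate AUC as the chance that a hidden (probe) link outscores
# a randomly chosen non-existent link.
import random
import networkx as nx

random.seed(1)
G = nx.watts_strogatz_graph(500, k=8, p=0.1, seed=1)   # or nx.barabasi_albert_graph(500, 4)
probe = random.sample(list(G.edges()), G.number_of_edges() // 10)
G.remove_edges_from(probe)                             # probe plays the "missing links"

pairs = list(nx.non_edges(G))
scores = {frozenset((u, v)): s for u, v, s in nx.jaccard_coefficient(G, pairs)}
probe_set = {frozenset(e) for e in probe}
absent = [p for p in pairs if frozenset(p) not in probe_set]

wins = 0.0
for _ in range(10000):
    sm = scores[frozenset(random.choice(probe))]       # hidden link score
    sa = scores[frozenset(random.choice(absent))]      # true non-link score
    wins += 1.0 if sm > sa else 0.5 if sm == sa else 0.0
print(f"AUC ~ {wins / 10000:.3f}")
```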

  14. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong; Zhao, Weishu; Chang, Frank; Dyer, Steve

    2013-01-01

    Conventional wormhole propagation models largely ignore the impact of reaction products. When implemented in a job design, this can result in significant errors in treatment fluid schedule, rate, and volume. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics. It is the purpose of this work to properly account for these effects, an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures taken to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments, where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT-scan-rendered porosity field to capture the finer details of the rock fabric and then simulates the fluid flow through the rock coupled with reactions. Such a validated model can serve as a base to scale up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  15. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    Science.gov (United States)

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation and assessment are nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to environmental research centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.

  16. Quantitative structure-activity relationship (QSAR) for insecticides: development of predictive in vivo insecticide activity models.

    Science.gov (United States)

    Naik, P K; Singh, T; Singh, H

    2009-07-01

    Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors, including topological, spatial, thermodynamic, information content, lead likeness and E-state indices, were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing-value, zero-value, simple-correlation and multi-collinearity tests, as well as the use of a genetic algorithm, allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both the organophosphate and carbamate groups revealed good predictability, with r(2) values of 0.949 and 0.838 and cross-validated q(2) values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data, with r(2) of 0.871 and 0.788 for the organophosphate and carbamate groups respectively, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully against external validation criteria. QSAR models developed in this study should help the further design of novel potent insecticides.

  17. A conceptual knowledge-link model for supporting dental implant process

    OpenAIRE

    Szejka, Anderson Luis; Canciglieri, Osiris; Rudek, Marcelo; Panetto, Hervé

    2014-01-01

    Computer-aided techniques widely used as diagnostic and surgical-procedure tools are scarcely applied in implantology, which continues to rely on the visualization of CT images to define the parameters for the dental implant process, leaving the implant determination to the dentist's discretion, since image analysis alone is non-deterministic. This research therefore proposes the development of a knowledge-link model, integrated with a reasoner system, to support the dental implant process...

  18. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easily readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept that provides detailed information on the estimated health impact in a given exposure situation. These graphs will facilitate the discussions on appropriate risk reduction measures to be taken.

  19. Function of dynamic models in systems biology: linking structure to behaviour.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens

    2013-10-08

    Dynamic models in Systems Biology are used in computational simulation experiments for addressing biological questions. The complexity of the modelled biological systems and the growing number and size of the models call for computer support for modelling and simulation in Systems Biology. This computer support has to be based on formal representations of relevant knowledge fragments. In this paper we describe different functional aspects of dynamic models. This description is conceptually embedded in our "meaning facets" framework, which systematises the interpretation of dynamic models in structural, functional and behavioural facets. Here we focus on how function links the structure and the behaviour of a model. Models play a specific role (teleological function) in the scientific process of finding explanations for dynamic phenomena. In order to fulfil this role a model has to be used in simulation experiments (pragmatical function). A simulation experiment always refers to a specific situation and a state of the model and the modelled system (conditional function). We claim that the function of dynamic models refers to both the simulation experiment executed by software (intrinsic function) and the biological experiment which produces the phenomena under investigation (extrinsic function). We use the presented conceptual framework for the function of dynamic models to review formal accounts for functional aspects of models in Systems Biology, such as checklists, ontologies, and formal languages. Furthermore, we identify missing formal accounts for some of the functional aspects. In order to fill one of these gaps we propose an ontology for the teleological function of models. We have thoroughly analysed the role and use of models in Systems Biology. The resulting conceptual framework for the function of models is an important first step towards a comprehensive formal representation of the functional knowledge involved in the modelling and simulation process.

  20. New Ghost-node method for linking different models with varied grid refinement

    International Nuclear Information System (INIS)

    Mehl, Steffen W.; Hill, Mary Catherine; James, Scott Carlton; Leake, Stanley A.; Zyvoloski, George A.; Dickinson, Jesse E.; Eddebbarh, Al A.

    2006-01-01

    A flexible, robust method for linking grids of locally refined models constructed with different numerical methods is needed to address a variety of hydrologic problems. This work outlines and tests a new ghost-node model-linking method for a refined 'child' model that is contained within a larger and coarser 'parent' model that is based on the iterative method of Mehl and Hill (2002, 2004). The method is applicable to steady-state solutions for ground-water flow. Tests are presented for a homogeneous two-dimensional system that has either matching grids (parent cells border an integer number of child cells; Figure 2a) or non-matching grids (parent cells border a non-integer number of child cells; Figure 2b). The coupled grids are simulated using the finite-difference and finite-element models MODFLOW and FEHM, respectively. The simulations require no alteration of the MODFLOW or FEHM models and are executed using a batch file on Windows operating systems. Results indicate that when the grids are matched spatially so that nodes and child cell boundaries are aligned, the new coupling technique has error nearly equal to that when coupling two MODFLOW models (Mehl and Hill, 2002). When the grids are non-matching, model accuracy is slightly increased over matching-grid cases. Overall, results indicate that the ghost-node technique is a viable means to accurately couple distinct models because the overall error is less than if only the regional model was used to simulate flow in the child model's domain

  1. New ghost-node method for linking different models with varied grid refinement

    Science.gov (United States)

    James, S.C.; Dickinson, J.E.; Mehl, S.W.; Hill, M.C.; Leake, S.A.; Zyvoloski, G.A.; Eddebbarh, A.-A.

    2006-01-01

    A flexible, robust method for linking grids of locally refined ground-water flow models constructed with different numerical methods is needed to address a variety of hydrologic problems. This work outlines and tests a new ghost-node model-linking method for a refined "child" model that is contained within a larger and coarser "parent" model that is based on the iterative method of Steffen W. Mehl and Mary C. Hill (2002, Advances in Water Res., 25, p. 497-511; 2004, Advances in Water Res., 27, p. 899-912). The method is applicable to steady-state solutions for ground-water flow. Tests are presented for a homogeneous two-dimensional system that has matching grids (parent cells border an integer number of child cells) or nonmatching grids. The coupled grids are simulated by using the finite-difference and finite-element models MODFLOW and FEHM, respectively. The simulations require no alteration of the MODFLOW or FEHM models and are executed using a batch file on Windows operating systems. Results indicate that when the grids are matched spatially so that nodes and child-cell boundaries are aligned, the new coupling technique has error nearly equal to that when coupling two MODFLOW models. When the grids are nonmatching, model accuracy is slightly increased compared to that for matching-grid cases. Overall, results indicate that the ghost-node technique is a viable means to couple distinct models because the overall head and flow errors relative to the analytical solution are less than if only the regional coarse-grid model was used to simulate flow in the child model's domain.

  2. Influence of gender constancy and social power on sex-linked modeling.

    Science.gov (United States)

    Bussey, K; Bandura, A

    1984-12-01

    Competing predictions derived from cognitive-developmental theory and social learning theory concerning sex-linked modeling were tested. In cognitive-developmental theory, gender constancy is considered a necessary prerequisite for the emulation of same-sex models, whereas according to social learning theory, sex-role development is promoted through a vast system of social influences with modeling serving as a major conveyor of sex role information. In accord with social learning theory, even children at a lower level of gender conception emulated same-sex models in preference to opposite-sex ones. Level of gender constancy was associated with higher emulation of both male and female models rather than operating as a selective determinant of modeling. This finding corroborates modeling as a basic mechanism in the sex-typing process. In a second experiment we explored the limits of same-sex modeling by pitting social power against the force of collective modeling of different patterns of behavior by male and female models. Social power over activities and rewarding resources produced cross-sex modeling in boys, but not in girls. This unexpected pattern of cross-sex modeling is explained by the differential sex-typing pressures that exist for boys and girls and socialization experiences that heighten the attractiveness of social power for boys.

  3. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models.

    Science.gov (United States)

    Allen, R J; Rieger, T R; Musante, C J

    2016-03-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed "virtual patients." In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations.
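
    A hedged sketch of the generate-then-select step: sample many virtual patients, simulate an observable for each, and accept patients with probability proportional to the ratio between the target (clinical) density and the empirical density of the simulated outputs, so the selected subpopulation matches the data without weighting. The one-parameter "model" and the target distribution are toys, not the authors' exact algorithm.

```python
# Acceptance sampling by density ratio: keep virtual patients so that the
# selected biomarker distribution matches an assumed clinical normal(5, 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k = rng.uniform(0.1, 2.0, size=5000)           # plausible parameter range
biomarker = 10.0 * np.exp(-k)                  # toy model output per patient

target = stats.norm(loc=5.0, scale=1.0)        # observed clinical statistics
prop = stats.gaussian_kde(biomarker)           # empirical density of outputs
w = target.pdf(biomarker) / prop(biomarker)    # density ratio
keep = rng.uniform(0, w.max(), size=k.size) < w
vpop = biomarker[keep]
print(f"{vpop.size} virtual patients selected; "
      f"mean={vpop.mean():.2f}, sd={vpop.std():.2f} (target 5.00, 1.00)")
```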

  4. Quantitative assessment of manual and robotic microcannulation for eye surgery using new eye model.

    Science.gov (United States)

    Tanaka, Shinichi; Harada, Kanako; Ida, Yoshiki; Tomita, Kyohei; Kato, Ippei; Arai, Fumihito; Ueta, Takashi; Noda, Yasuo; Sugita, Naohiko; Mitsuishi, Mamoru

    2015-06-01

    Microcannulation, a surgical procedure for the eye that requires drug injection into a 60-90 µm retinal vein, is difficult to perform manually. Robotic assistance has been proposed; however, its effectiveness in comparison to manual operation has not been quantified. An eye model has been developed to quantify the performance of manual and robotic microcannulation. The eye model, which is implemented with a force sensor and microchannels, also simulates the mechanical constraints of the instrument's movement. Ten subjects performed microcannulation using the model, with and without robotic assistance. The results showed that the robotic assistance was useful for motion stability when the drug was injected, whereas its positioning accuracy offered no advantage. An eye model was used to quantitatively assess the robotic microcannulation performance in comparison to manual operation. This approach could be valid for a better evaluation of surgical robotic assistance. Copyright © 2014 John Wiley & Sons, Ltd.

  5. Quantitative Decision Making Model for Carbon Reduction in Road Construction Projects Using Green Technologies

    Directory of Open Access Journals (Sweden)

    Woosik Jang

    2015-08-01

    Full Text Available Numerous countries have established policies for reducing greenhouse gas emissions and have suggested goals pertaining to these reductions. To reach the target reduction amounts, studies on the reduction of carbon emissions have been conducted with regard to all stages and processes in construction projects. According to a study on carbon emissions, the carbon emissions generated during the construction stage of road projects account for approximately 76 to 86% of the total carbon emissions, far exceeding the other stages, such as maintenance or demolition. Therefore, this study aims to develop a quantitative decision making model that supports the application of green technologies (GTs to reduce carbon emissions during the construction stage of road construction projects. First, the authors selected environmental soundness, economic feasibility and constructability as the key assessment indices for evaluating 20 GTs. Second, a fuzzy set/qualitative comparative analysis (FS/QCA was used to establish an objective decision-making model for the assessment of both the quantitative and qualitative characteristics of the key indices. To support the developed model, an expert survey was performed to assess the applicability of each GT from a practical perspective, which was verified with a case study using two additional GTs. The proposed model is expected to support practitioners in the application of suitable GTs to road projects and reduce carbon emissions, resulting in better decision making during road construction projects.

  6. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    Science.gov (United States)

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of salmonella on carcass in poultry slaughterhouses and to find effective interventions to reduce salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using the data on process parameters in poultry and the salmonella concentration surveillance of Jinan in 2012. The MPRM was simulated with the @Risk software. The concentration of salmonella on carcass after chilling calculated by the model was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients of the concentration of salmonella after defeathering and in the chilling pool were 0.84 and 0.34, respectively, making them the primary factors determining the concentration of salmonella on carcass after chilling. The study provided a quantitative assessment model structure for salmonella on carcass in poultry slaughterhouses. Risk managers could control the contamination of salmonella on carcass after chilling by reducing the concentration of salmonella after defeathering and in the chilling pool.
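
    To make the modular structure concrete, below is a minimal Monte Carlo sketch of an MPRM-style calculation in Python. All distributions and module effects are invented placeholders, not the Jinan surveillance data; the Spearman rank correlations mirror the kind of sensitivity analysis reported above.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Module inputs on a log10 MPN/g scale (all hypothetical):
after_defeathering = rng.normal(1.0, 0.5, n)   # contamination entering the line
evisceration_shift = rng.normal(0.2, 0.1, n)   # cross-contamination added
chill_pool = rng.normal(-1.0, 0.4, n)          # salmonella level in chilling water
chill_reduction = rng.normal(0.8, 0.2, n) - 0.3 * chill_pool  # weaker in dirtier pools

after_chilling = after_defeathering + evisceration_shift - chill_reduction

def spearman(x, y):
    """Rank (Spearman) correlation, as used in the sensitivity analysis."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

print(f"geometric mean after chilling: {10 ** after_chilling.mean():.2f} MPN/g")
print(f"sensitivity to defeathering level: {spearman(after_defeathering, after_chilling):.2f}")
print(f"sensitivity to chill-pool level:   {spearman(chill_pool, after_chilling):.2f}")
```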

  7. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    Full Text Available The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same kind of controller but also different kinds of controllers, and provide a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The test, whose outcomes are similar to those of traditional qualitative analysis, demonstrates that with our approach we are able to obtain specific security values for different controllers and present more accurate results.

  8. Quantitative stress measurement of elastic deformation using mechanoluminescent sensor: An intensity ratio model

    Science.gov (United States)

    Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng

    2018-04-01

    The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and its ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on an ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provided a physical explanation for the relationship between ML intensity and stress condition. The proposed model is applicable to various SrAl2O4:Eu2+, Dy3+-based ML measurements in elastic deformation, and could provide a useful reference for quantitative stress measurement using ML sensors in general.

  9. Conformational analysis of a covalently cross-linked Watson-Crick base pair model.

    Science.gov (United States)

    Jensen, Erik A; Allen, Benjamin D; Kishi, Yoshito; O'Leary, Daniel J

    2008-11-15

    Low-temperature NMR experiments and molecular modeling have been used to characterize the conformational behavior of a covalently cross-linked DNA base pair model. The data suggest that Watson-Crick or reverse Watson-Crick hydrogen bonding geometries have similar energies and can interconvert at low temperatures. This low-temperature process involves rotation about the crosslink CH(2)C(5') (psi) carbon-carbon bond, which is energetically preferred over the alternate CH(2)N(3) (phi) carbon-nitrogen bond rotation.

  10. Conformational Analysis of a Covalently Cross-Linked Watson-Crick Base Pair Model

    OpenAIRE

    Jensen, Erik A.; Allen, Benjamin D.; Kishi, Yoshito; O'Leary, Daniel J.

    2008-01-01

    Low temperature NMR experiments and molecular modeling have been used to characterize the conformational behavior of a covalently cross-linked DNA base pair model. The data suggest that Watson-Crick or reverse Watson-Crick hydrogen bonding geometries have similar energies and can interconvert at low temperatures. This low-temperature process involves rotation about the crosslink CH2–C(5′) (ψ) carbon-carbon bond, which is energetically preferred over the alternate CH2–N(3) (ϕ) carbon-nitrogen ...

  11. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L(2) penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
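
    The following minimal sketch illustrates the modified-Cholesky idea on simulated data: each response is regressed on its predecessors with an L2 (ridge) penalty, and the covariance is rebuilt from the fitted factors. The AR(1) data generator and penalty value are assumptions for illustration, not the paper's estimator or its mouse genome data.

```python
import numpy as np

rng = np.random.default_rng(2)
n, T, lam = 200, 8, 0.5                  # subjects, time points, L2 penalty

# Simulate centered longitudinal data with AR(1) dependence.
Y = np.zeros((n, T))
Y[:, 0] = rng.normal(size=n)
for t in range(1, T):
    Y[:, t] = 0.7 * Y[:, t - 1] + rng.normal(scale=0.6, size=n)

# Regress each response on its predecessors (penalized least squares);
# the coefficients fill the unit lower-triangular factor, the residual
# variances the diagonal matrix D.
Tmat = np.eye(T)
d = np.empty(T)
d[0] = Y[:, 0].var()
for t in range(1, T):
    X = Y[:, :t]
    phi = np.linalg.solve(X.T @ X + lam * np.eye(t), X.T @ Y[:, t])
    Tmat[t, :t] = -phi
    d[t] = np.var(Y[:, t] - X @ phi)

# Reconstruct the regularized covariance estimate: Sigma = T^-1 D T^-T.
Tinv = np.linalg.inv(Tmat)
Sigma = Tinv @ np.diag(d) @ Tinv.T
print(np.round(Sigma[:4, :4], 2))
```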

  12. Quantitative laser diagnostic and modeling study of C2 and CH chemistry in combustion.

    Science.gov (United States)

    Köhler, Markus; Brockhinke, Andreas; Braun-Unkhoff, Marina; Kohse-Höinghaus, Katharina

    2010-04-15

    Quantitative concentration measurements of CH and C(2) have been performed in laminar, premixed, flat flames of propene and cyclopentene with varying stoichiometry. A combination of cavity ring-down (CRD) spectroscopy and laser-induced fluorescence (LIF) was used to enable sensitive detection of these species with high spatial resolution. Previously, CH and C(2) chemistry had been studied, predominantly in methane flames, to understand potential correlations of their formation and consumption. For flames of larger hydrocarbon fuels, however, quantitative information on these small intermediates is scarce, especially under fuel-rich conditions. Also, the combustion chemistry of C(2) in particular has not been studied in detail, and although it has often been observed, its role in potential build-up reactions of higher hydrocarbon species is not well understood. The quantitative measurements performed here are the first to detect both species with good spatial resolution and high sensitivity in the same experiment in flames of C(3) and C(5) fuels. The experimental profiles were compared with results of combustion modeling to reveal details of the formation and consumption of these important combustion molecules, and the investigation was devoted to assist the further understanding of the role of C(2) and of its potential chemical interdependences with CH and other small radicals.

  13. Quantitative model for the generic 3D shape of ICMEs at 1 AU

    Science.gov (United States)

    Démoulin, P.; Janvier, M.; Masías-Meza, J. J.; Dasso, S.

    2016-10-01

    Context. Interplanetary imagers provide 2D projected views of the densest plasma parts of interplanetary coronal mass ejections (ICMEs), while in situ measurements provide magnetic field and plasma parameter measurements along the spacecraft trajectory, that is, along a 1D cut. The data therefore only give a partial view of the 3D structures of ICMEs. Aims: By studying a large number of ICMEs, crossed at different distances from their apex, we develop statistical methods to obtain a quantitative generic 3D shape of ICMEs. Methods: In a first approach we theoretically obtained the expected statistical distribution of the shock-normal orientation from assuming simple models of 3D shock shapes, including distorted profiles, and compared their compatibility with observed distributions. In a second approach we used the shock normal and the flux rope axis orientations together with the impact parameter to provide statistical information across the spacecraft trajectory. Results: The study of different 3D shock models shows that the observations are compatible with a shock that is symmetric around the Sun-apex line as well as with an asymmetry up to an aspect ratio of around 3. Moreover, flat or dipped shock surfaces near their apex can only be rare cases. Next, the sheath thickness and the ICME velocity have no global trend along the ICME front. Finally, regrouping all these new results and those of our previous articles, we provide a quantitative ICME generic 3D shape, including the global shape of the shock, the sheath, and the flux rope. Conclusions: The obtained quantitative generic ICME shape will have implications for several aims. For example, it constrains the output of typical ICME numerical simulations. It is also a base for studying the transport of high-energy solar and cosmic particles during an ICME propagation as well as for modeling and forecasting space weather conditions near Earth.

  14. Model experiments on the sensitization of polyethylene cross-linking of oligobutadienes

    International Nuclear Information System (INIS)

    Brede, O.; Beckert, D.; Hoesselbarth, B.; Specht, W.; Tannert, F.; Wunsch, K.

    1988-01-01

    In the presence of ≥ 1 % of 1,2-oligobutadiene, the efficiency of the radiation-induced cross-linking of polyethylene was found to be increased in comparison to the pure matrix. Model experiments with solutions of the sensitizer in long-chain n-alkanes showed that, after addition of alkyl radicals onto the oligobutadiene (reaction with the vinyl groups), the sensitizer forms a network of its own which is grafted by the alkyl groups. While this grafting reaction proceeds with a G value of about 5, the vinyl consumption proceeds at about three times that rate, indicating a short (intra- and intermolecular) vinyl reaction chain. Pulse radiolysis measurements in solutions of the 1,2-oligobutadiene in n-hexadecane and in molten PE blends resulted in the observation of radical transients of the cross-linking reaction. (author)

  15. Tumour-cell killing by X-rays and immunity quantitated in a mouse model system

    International Nuclear Information System (INIS)

    Porteous, D.D.; Porteous, K.M.; Hughes, M.J.

    1979-01-01

    As part of an investigation of the interaction of X-rays and immune cytotoxicity in tumour control, an experimental mouse model system has been used in which quantitative anti-tumour immunity was raised in prospective recipients of tumour-cell suspensions exposed to varying doses of X-rays in vitro before injection. Findings reported here indicate that, whilst X-rays kill a proportion of cells, induced immunity deals with a fixed number dependent upon the immune status of the host, and that X-rays and anti-tumour immunity do not act synergistically in tumour-cell killing. The tumour used was the ascites sarcoma BP8. (author)

  16. Impact Assessment of Abiotic Resources in LCA: Quantitative Comparison of Selected Characterization Models

    DEFF Research Database (Denmark)

    Rørbech, Jakob Thaysen; Vadenbo, Carl; Hellweg, Stefanie

    2014-01-01

    Resources have received significant attention in recent years, resulting in the development of a wide range of resource depletion indicators within life cycle assessment (LCA). Understanding the differences in assessment principles used to derive these indicators and the effects on the impact assessment results is critical for indicator selection and interpretation of the results. Eleven resource depletion methods were evaluated quantitatively with respect to resource coverage, characterization factors (CF), impact contributions from individual resources, and total impact scores. We included 2247 … groups, according to method focus and modeling approach, to aid method selection within LCA.

  17. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices. This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere…

  18. A quantitative speciation model for the adsorption of organic pollutants on activated carbon.

    Science.gov (United States)

    Grivé, M; García, D; Domènech, C; Richard, L; Rojo, I; Martínez, X; Rovira, M

    2013-01-01

    Granular activated carbon (GAC) is commonly used as an adsorbent in water treatment plants given its high capacity for retaining organic pollutants in aqueous phase. The current knowledge on GAC behaviour is essentially empirical, and no quantitative description of the chemical relationships between GAC surface groups and pollutants has been proposed. In this paper, we describe a quantitative model for the adsorption of atrazine onto the GAC surface. The model is based on results of potentiometric titrations and three types of adsorption experiments which have been carried out in order to determine the nature and distribution of the functional groups on the GAC surface, and evaluate the adsorption characteristics of GAC towards atrazine. Potentiometric titrations have indicated the existence of at least two different families of chemical groups on the GAC surface, including phenolic- and benzoic-type surface groups. Adsorption experiments with atrazine have been satisfactorily modelled with the geochemical code PhreeqC, assuming that atrazine is sorbed onto the GAC surface in equilibrium (log Ks = 5.1 ± 0.5). Independent thermodynamic calculations suggest a possible adsorption of atrazine on a benzoic derivative. The present work opens a new approach for improving the adsorption capabilities of GAC towards organic pollutants by modifying its chemical properties.
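
    As a worked illustration of the reported equilibrium constant, the sketch below solves the one-site sorption equilibrium K = [S-Atz]/([S][Atz]) with log Ks = 5.1; the total site and atrazine concentrations are assumed values chosen for illustration, not the paper's fitted quantities.

```python
import math

K = 10 ** 5.1        # equilibrium constant from the abstract
S_tot = 1e-4         # total surface sites, mol/L (assumed)
Atz_tot = 1e-6       # total atrazine, mol/L (assumed)

# Mass balances give a quadratic in the free atrazine concentration x:
#   K*x^2 + (1 + K*(S_tot - Atz_tot))*x - Atz_tot = 0
a = K
b = 1 + K * (S_tot - Atz_tot)
c = -Atz_tot
x = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)   # positive root

sorbed = Atz_tot - x
print(f"fraction of atrazine sorbed: {sorbed / Atz_tot:.1%}")
```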

  19. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome in the binary scheme of NAEP has three parts. The first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. In order to demonstrate the effectiveness of the method, the partial least squares with full spectrum, the partial least squares combined with genetic algorithm, the uninformative variable elimination method, the backpropagation neural network with full spectrum, the backpropagation neural network combined with genetic algorithm, and the proposed method are used for building the component prediction model. Experimental results verify that the proposed method predicts more accurately and robustly, making it a practical spectral analysis tool.

  20. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor for an alternative nuclear fuel cycle system. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of the future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.
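
    On the circuit analogy described above, a schematic reading (our notation, not the report's) treats proliferation risk as a current driven by a motivation "electromotive force" against the summed barrier "resistances" of the fuel cycle:

```latex
% Schematic circuit analogy (our notation): proliferation "current" I
% driven by a motivation index M (the electromotive force) against the
% summed barrier "resistances" R_i of all facilities and activities.
\[
  I = \frac{M}{\sum_{i=1}^{n} R_i},
  \qquad
  \text{proliferation resistance} \propto \sum_{i=1}^{n} R_i
\]
```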

  1. Spatiotemporal microbiota dynamics from quantitative in vitro and in silico models of the gut

    Science.gov (United States)

    Hwa, Terence

    The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth behaviors, which ultimately dictate the gut microbiota composition. Combining measurements of bacterial growth physiology with analysis of published data on human physiology into a quantitative modeling framework, we show how hydrodynamic forces in the colon, in concert with other physiological factors, determine the abundances of the major bacterial phyla in the gut. Our model quantitatively explains the observed variation of microbiota composition among healthy adults, and predicts colonic water absorption (manifested as stool consistency) and nutrient intake to be two key factors determining this composition. The model further reveals that both factors, which have been identified in recent correlative studies, exert their effects through the same mechanism: changes in colonic pH that differentially affect the growth of different bacteria. Our findings show that a predictive and mechanistic understanding of microbial ecology in the human gut is possible, and offer the hope for the rational design of intervention strategies to actively control the microbiota. This work is supported by the Bill and Melinda Gates Foundation.

  2. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor for an alternative nuclear fuel cycle system. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of the future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.

  3. A didactical structural model linking analysis of teaching and analysis of educational media

    DEFF Research Database (Denmark)

    Graf, Stefan Ting

    1. Gap between general didactics and textbook/media research. There seems to be a gap between general didactics (theory of teaching) and research on textbooks or educational media in general, at least in the Nordic and German-speaking countries. General didactics and its models seem to underestimate … related questions (e.g. readability) without establishing a link to what is useful for the teacher's tasks on the levels of preparation, practice and reflection, i.e. without an explicit theory of teaching. 2. Media in general didactics. I will discuss the status of media in some current models … of reflection in general didactics (Hiim/Hippe, Meyer, Klafki) and present a reconstruction of a didactical model of structure (Strukturmodell), whose cornerstones are ‘intentional content’, ‘media/expression’ and ‘teaching method/activity’. The inclusion of media/expression in the model resumes a seemingly …

  4. From Rivers to Oceans and Back: Linking Models to Encompass the Full Salmon Life Cycle

    Science.gov (United States)

    Danner, E.; Hendrix, N.; Martin, B.; Lindley, S. T.

    2016-02-01

    Pacific salmon are a promising study subject for investigating the linkages between freshwater and coastal ocean ecosystems. Salmon use a wide range of habitats throughout their life cycle as they move with water from mountain streams, mainstem rivers, estuaries, bays, and coastal oceans, with adult fish swimming back through the same migration route they took as juveniles. Conditions in one habitat can have growth and survival consequences that manifest in the following habitat, so it is key that full life cycle models are used to further our understanding of salmon population dynamics. Given the wide range of habitats and potential stressors, this approach requires the coordination of a multidisciplinary suite of physical and biological models, including climate, hydrologic, hydraulic, food web, circulation, bioenergetic, and ecosystem models. Here we present current approaches to linking physical and biological models that capture the foundational drivers for salmon in complex and dynamic systems.

  5. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while providing users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on the work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork. It was tested in vulnerability assessment activities on real production systems, and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
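
    A minimal sketch of this kind of simulation is shown below, assuming discrete uniform draws on the 0-9 OWASP scale for every likelihood and impact factor; the factor counts, the multiplicative severity combination, and the reduced round count are our illustrative choices, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(3)
rounds = 150_000   # the paper reports 1.5 million rounds; reduced here

# Discrete uniform assumptions on the 0-9 OWASP scale for every factor;
# likelihood and impact are each the mean of their factor scores.
likelihood = rng.integers(0, 10, size=(rounds, 8)).mean(axis=1)
impact = rng.integers(0, 10, size=(rounds, 8)).mean(axis=1)
severity = likelihood * impact   # one simple way of combining the two

print(f"median severity: {np.median(severity):.2f}")
print(f"90th percentile: {np.percentile(severity, 90):.2f}")
```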

  6. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    Science.gov (United States)

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-06-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented.
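
    For reference, the LOD score in standard interval mapping compares the maximized mixture likelihood against a single-normal fit; schematically (standard notation, not quoted verbatim from the paper):

```latex
% LOD score at genome position x: log10 likelihood ratio of the
% normal-mixture QTL model against a single-normal (no-QTL) model.
\[
  \mathrm{LOD}(x) = \log_{10}
  \frac{\max_{\theta_1} \prod_{i=1}^{n} \sum_{g} p_{ig}(x)\,
        \phi\!\left(y_i ;\, \mu_g, \sigma^2\right)}
       {\max_{\theta_0} \prod_{i=1}^{n} \phi\!\left(y_i ;\, \mu_0, \sigma_0^2\right)}
\]
% p_ig(x): probability of QTL genotype g for individual i given markers;
% phi: normal density; theta_1, theta_0: the parameters of each model.
```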

  7. A quantitative microbial risk assessment model for Listeria monocytogenes in RTE sandwiches

    DEFF Research Database (Denmark)

    Tirloni, E.; Stella, S.; de Knegt, Leonardo

    2018-01-01

    A Quantitative Microbial Risk Assessment (QMRA) was performed to estimate the expected number of listeriosis cases due to the consumption, on the last day of shelf life, of 20 000 servings of multi-ingredient sandwiches produced by a medium scale food producer in Italy, by different population … within each serving. Then, two dose-response models were alternatively applied: the first used a fixed r value for each of the three population groups, while the second considered a variable r value (lognormal distribution), taking into account the variability in strain virulence and different host subpopulations' susceptibility. The stochastic model predicted zero cases for the total population for both substrates by using the fixed r approach, while 3 cases were expected when a higher variability (in virulence and susceptibility) was considered in the model; the number of cases increased to 45 …

  8. Quantitative model for the blood pressure‐lowering interaction of valsartan and amlodipine

    Science.gov (United States)

    Heo, Young‐A; Holford, Nick; Kim, Yukyung; Son, Mijeong

    2016-01-01

    Aims The objective of this study was to develop a population pharmacokinetic (PK) and pharmacodynamic (PD) model to quantitatively describe the antihypertensive effect of combined therapy with amlodipine and valsartan. Methods PK modelling was used with data collected from 48 healthy volunteers receiving a single dose of a combined formulation of 10 mg amlodipine and 160 mg valsartan. Systolic (SBP) and diastolic blood pressure (DBP) were recorded during combined administration. SBP and DBP data for each drug alone were gathered from the literature. PKPD models of each drug and for combined administration were built with NONMEM 7.3. Results A two‐compartment model with zero-order absorption best described the PK data of both drugs. Amlodipine and valsartan monotherapy effects on SBP and DBP were best described by an Imax model with an effect compartment delay. Combined therapy was described using a proportional interaction term as follows: (D1 + D2) + ALPHA × (D1 × D2). D1 and D2 are the predicted drug effects of amlodipine and valsartan monotherapy respectively. ALPHA is the interaction term for combined therapy. Quantitative estimates of ALPHA were −0.171 (95% CI: −0.218, −0.143) for SBP and −0.0312 (95% CI: −0.07739, −0.00283) for DBP. These infra‐additive interaction terms for both SBP and DBP were consistent with literature results for combined administration of drugs in these classes. Conclusion PKPD models for SBP and DBP successfully described the time course of the antihypertensive effects of amlodipine and valsartan. An infra‐additive interaction between amlodipine and valsartan when used in combined administration was confirmed and quantified. PMID:27504853
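
    The reported interaction can be reproduced directly from the quoted equation. In the sketch below, the monotherapy effects are assumed fractional blood-pressure reductions (the abstract does not report them in this form), while ALPHA is the SBP estimate quoted above.

```python
def combined_effect(d1, d2, alpha):
    """Combined effect under the proportional interaction model:
    E = (D1 + D2) + ALPHA * (D1 * D2)."""
    return d1 + d2 + alpha * d1 * d2

# Assumed monotherapy effects as fractional SBP reductions (illustrative).
amlodipine = 0.12
valsartan = 0.10
alpha_sbp = -0.171          # SBP interaction estimate from the abstract

print(f"additive:         {amlodipine + valsartan:.4f}")
print(f"with interaction: {combined_effect(amlodipine, valsartan, alpha_sbp):.4f}")
```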

  9. Application of a linked stress release model in Corinth Gulf and Central Ionian Islands (Greece)

    Science.gov (United States)

    Mangira, Ourania; Vasiliadis, Georgios; Papadimitriou, Eleftheria

    2017-06-01

    Spatio-temporal stress changes and interactions between adjacent fault segments are among the most important components in seismic hazard assessment, as they can alter the occurrence probability of strong earthquakes on these segments. The investigation of the interactions between adjacent areas by means of the linked stress release model is attempted for moderate earthquakes (M ≥ 5.2) in the Corinth Gulf and the Central Ionian Islands (Greece). The study areas were divided into two subareas, based on seismotectonic criteria. The seismicity of each subarea is investigated by means of a stochastic point process and its behavior is determined by the conditional intensity function, which usually takes an exponential form. A conditional intensity function of Weibull form is used for identifying the most appropriate among the models (simple, independent and linked stress release model) for the interpretation of the earthquake generation process. The appropriateness of the models was assessed via the Akaike information criterion. Despite the fact that the curves of the conditional intensity functions exhibit similar behavior, the exponential-type conditional intensity function seems to fit the data better.
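
    For orientation, a schematic form of the (linked) stress release model common in the literature is given below; the notation is ours, and the paper replaces the classical exponential Ψ with a Weibull-type form.

```latex
% Stress in subarea i: tectonic loading at rate rho_i, coseismic release
% S_i, and transfer from the neighbouring subarea j scaled by theta_ij;
% the earthquake occurrence intensity is a function Psi of the stress.
\[
  X_i(t) = X_i(0) + \rho_i t - S_i(t) + \theta_{ij} S_j(t),
  \qquad
  \lambda_i(t) = \Psi\bigl(X_i(t)\bigr)
\]
% Classical choice: Psi(X) = exp(mu + nu X); a Weibull-type form is used
% in the study above. The sign of theta_ij determines whether activity
% in the neighbouring subarea excites or damps the local intensity.
```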

  10. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    Science.gov (United States)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth’s climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves--even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m^2 MPa^-1 s^-1, whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant’s vegetative organs in jeopardy during drought events. Analysis of the thickness-to-span ratio of Asteroxylon’s tracheids suggests that environmental conditions of reduced relative …

  11. LinkImputeR: user-guided genotype calling and imputation for non-model organisms.

    Science.gov (United States)

    Money, Daniel; Migicovsky, Zoë; Gardner, Kyle; Myles, Sean

    2017-07-10

    Genomic studies such as genome-wide association and genomic selection require genome-wide genotype data. All existing technologies used to create these data result in missing genotypes, which are often then inferred using genotype imputation software. However, existing imputation methods most often make use only of genotypes that are successfully inferred after having passed a certain read depth threshold. Because of this, any read information for genotypes that did not pass the threshold, and were thus set to missing, is ignored. Most genomic studies also choose read depth thresholds and quality filters without investigating their effects on the size and quality of the resulting genotype data. Moreover, almost all genotype imputation methods require ordered markers and are therefore of limited utility in non-model organisms. Here we introduce LinkImputeR, a software program that exploits the read count information that is normally ignored, and makes use of all available DNA sequence information for the purposes of genotype calling and imputation. It is specifically designed for non-model organisms since it requires neither ordered markers nor a reference panel of genotypes. Using next-generation DNA sequence (NGS) data from apple, cannabis and grape, we quantify the effect of varying read count and missingness thresholds on the quantity and quality of genotypes generated from LinkImputeR. We demonstrate that LinkImputeR can increase the number of genotype calls by more than an order of magnitude, can improve genotyping accuracy by several percent and can thus improve the power of downstream analyses. Moreover, we show that the effects of quality and read depth filters can differ substantially between data sets and should therefore be investigated on a per-study basis. By exploiting DNA sequence data that is normally ignored during genotype calling and imputation, LinkImputeR can significantly improve both the quantity and quality of genotype data generated from
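
    The read-depth trade-off discussed above can be illustrated with a small simulation: raising the depth threshold increases per-call confidence but sets more genotypes to missing. The Poisson depth matrix below is invented for illustration; this is not LinkImputeR's own code or API.

```python
import numpy as np

rng = np.random.default_rng(5)
depth = rng.poisson(6, size=(500, 2000))   # samples x markers read depths

# Genotypes below the threshold would be set to missing; count what remains.
for threshold in (2, 4, 8, 12):
    called = depth >= threshold
    print(f"depth >= {threshold:2d}: {called.mean():6.1%} of genotypes callable")
```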

  12. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    Science.gov (United States)

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  13. First experiences with model based iterative reconstructions influence on quantitative plaque volume and intensity measurements in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Kitslaar, Pieter H.; Broersen, Alexander

    2017-01-01

    Purpose: Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based IR (Veo) reconstruction algorithm in coronary computed tomography angiography (CCTA) images on quantitative measurements in coronary arteries for plaque volumes and intensities. Methods …

  14. Development of quantitative atomic modeling for tungsten transport study using LHD plasma with tungsten pellet injection

    Science.gov (United States)

    Murakami, I.; Sakaue, H. A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2015-09-01

    Quantitative tungsten study with reliable atomic modeling is important for the successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding the tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from plasmas of the large helical device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) emission of W24+ to W33+ ions at 1.5-3.5 nm is sensitive to electron temperature and useful for examining the tungsten behavior in edge plasmas. We can reproduce measured EUV spectra at 1.5-3.5 nm with spectra calculated from the tungsten atomic model and obtain charge state distributions of tungsten ions in LHD plasmas at different temperatures around 1 keV. Our model is applied to calculate the unresolved transition array (UTA) seen in tungsten spectra at 4.5-7 nm. We analyze the effect of configuration interaction on population kinetics related to the UTA structure in detail and find the importance of two-electron-one-photon transitions between the 4p(5)4d(n+1) and 4p(6)4d(n-1)4f configurations. The radiation power rate of tungsten due to line emissions is also estimated with the model and is consistent with other models within a factor of 2.

  15. Quantitative computational models of molecular self-assembly in systems biology.

    Science.gov (United States)

    Thomas, Marcus; Schwartz, Russell

    2017-05-23

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.

  16. Linking the Power and Transport Sectors—Part 2: Modelling a Sector Coupling Scenario for Germany

    Directory of Open Access Journals (Sweden)

    Martin Robinius

    2017-07-01

    Full Text Available “Linking the power and transport sectors—Part 1” describes the general principle of “sector coupling” (SC), develops a working definition of the concept intended to be of utility to the international scientific community, contains a literature review that provides an overview of relevant scientific papers on this topic, and conducts a rudimentary analysis of the linking of the power and transport sectors on a worldwide, EU and German level. The aim of this follow-on paper is to outline an approach to the modelling of SC; to this end, a case study of Germany was conducted. This study assumes a high share of renewable energy sources (RES) contributing to the grid and a significant proportion of fuel cell vehicles (FCVs) in the year 2050, along with a dedicated hydrogen pipeline grid to meet hydrogen demand. To construct a model of this nature, the model environment “METIS” (models for energy transformation and integration systems) that we developed is described in more detail in this paper. Within this framework, a detailed model of the power and transport sectors in Germany is presented and the rationale behind its assumptions described. Furthermore, an intensive analysis of the results for power surplus, utilization of electrolysis, the hydrogen pipeline and economic considerations has been conducted to show the potential outcomes of modelling SC. It is hoped that this will serve as a basis for researchers to apply this framework in future models and analyses with an international focus.

  17. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong

    2013-01-01

    Carbonate matrix acidization extends a well's effective drainage radius by dissolving rock and forming conductive channels (wormholes) from the wellbore. Wormholing is a dynamic process that involves a balance between the acid injection rate and the reaction rate. Generally, the injection rate is well defined where injection profiles can be controlled, whereas the reaction rate can be difficult to obtain due to its complex dependency on interstitial velocity, fluid composition, rock surface properties, etc. Conventional wormhole propagation models largely ignore the impact of reaction products; when implemented in a job design, this can result in significant errors in the treatment fluid schedule, rate, and volume. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics. It is the purpose of this work to properly account for these effects. This is an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures taken to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT-scan-rendered porosity field to capture the finer details of the rock fabric and then simulates the fluid flow through the rock coupled with reactions. Such a validated model can serve as a base to scale up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  18. Qualitative and quantitative examination of the performance of regional air quality models representing different modeling approaches

    International Nuclear Information System (INIS)

    Bhumralkar, C.M.; Ludwig, F.L.; Shannon, J.D.; McNaughton, D.

    1985-04-01

    The calculations of three different air quality models were compared with the best available observations. The comparisons were made without calibrating the models to improve agreement with the observations. Model performance was poor for short averaging times (less than 24 hours). Some of the poor performance can be traced to errors in the input meteorological fields, but errors exist at all levels. It should be noted that these models were not originally designed for treating short-term episodes. For short-term episodes, much of the variance in the data can arise from small spatial scale features that tend to be averaged out over longer periods. These small spatial scale features cannot be resolved with the coarse grids that are used for the meteorological and emissions inputs. Thus, it is not surprising that the models performed better for the longer averaging times. The models compared were RTM-II, ENAMAP-2 and ACID. (17 refs., 5 figs., 4 tabs.)

  19. Biomine: predicting links between biological entities using network models of heterogeneous databases

    Directory of Open Access Journals (Sweden)

    Eronen Lauri

    2012-06-01

    Full Text Available Abstract Background Biological databases contain large amounts of data concerning the functions and associations of genes and proteins. Integration of data from several such databases into a single repository can aid the discovery of previously unknown connections spanning multiple types of relationships and databases. Results Biomine is a system that integrates cross-references from several biological databases into a graph model with multiple types of edges, such as protein interactions, gene-disease associations and gene ontology annotations. Edges are weighted based on their type, reliability, and informativeness. We present Biomine and evaluate its performance in link prediction, where the goal is to predict pairs of nodes that will be connected in the future, based on current data. In particular, we formulate protein interaction prediction and disease gene prioritization tasks as instances of link prediction. The predictions are based on a proximity measure computed on the integrated graph. We consider and experiment with several such measures, and perform a parameter optimization procedure where different edge types are weighted to optimize link prediction accuracy. We also propose a novel method for disease-gene prioritization, defined as finding a subset of candidate genes that cluster together in the graph. We experimentally evaluate Biomine by predicting future annotations in the source databases and prioritizing lists of putative disease genes. Conclusions The experimental results show that Biomine has strong potential for predicting links when a set of selected candidate links is available. The predictions obtained using the entire Biomine dataset are shown to clearly outperform ones obtained using any single source of data alone, when different types of links are suitably weighted. In the gene prioritization task, an established reference set of disease-associated genes is useful, but the results show that under favorable

  20. An open source web interface for linking models to infrastructure system databases

    Science.gov (United States)

    Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.

    2016-12-01

    Models of networked engineered resource systems such as water or energy systems are often built collaboratively with developers from different domains working at different locations. These models can be linked to large scale real world databases, and they are constantly being improved and extended. As the development and application of these models become more sophisticated, and the computing power required for simulations and/or optimisations increases, so does the need for online services and tools which enable the efficient development and deployment of these models. Hydra Platform is an open source, web-based data management system, which allows modellers of network-based models to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform exposes a JSON-based web API to allow external programs (referred to as 'Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development in Hydra Platform, the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs and analyse results.

  1. Identification of X-linked quantitative trait loci affecting cold tolerance in Drosophila melanogaster and fine mapping by selective sweep analysis.

    Science.gov (United States)

    Svetec, Nicolas; Werzner, Annegret; Wilches, Ricardo; Pavlidis, Pavlos; Alvarez-Castro, José M; Broman, Karl W; Metzler, Dirk; Stephan, Wolfgang

    2011-02-01

    Drosophila melanogaster is a cosmopolitan species that colonizes a great variety of environments. One trait that shows abundant evidence for naturally segregating genetic variance in different populations of D. melanogaster is cold tolerance. Previous work has found quantitative trait loci (QTL) exclusively on the second and the third chromosomes. To gain insight into the genetic architecture of cold tolerance on the X chromosome and to compare the results with our analyses of selective sweeps, a mapping population was derived from a cross between substitution lines that solely differed in the origin of their X chromosome: one originates from a European inbred line and the other one from an African inbred line. We found a total of six QTL for cold tolerance factors on the X chromosome of D. melanogaster. Although the composite interval mapping revealed slightly different QTL profiles between sexes, a coherent model suggests that most QTL overlapped between sexes, and each explained around 5-14% of the genetic variance (which may be slightly overestimated). The allelic effects were largely additive, but we also detected two significant interactions. Taken together, this provides evidence for multiple QTL that are spread along the entire X chromosome and whose effects range from low to intermediate. One detected transgressive QTL influences cold tolerance in different ways for the two sexes. While females benefit from the European allele increasing their cold tolerance, males tend to do better with the African allele. Finally, using selective sweep mapping, the candidate gene CG16700 for cold tolerance colocalizing with a QTL was identified. © 2010 Blackwell Publishing Ltd.

  2. Reservoir architecture modeling: Nonstationary models for quantitative geological characterization. Final report, April 30, 1998

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, D.; Epili, D.; Kelkar, M.; Redner, R.; Reynolds, A.

    1998-12-01

    The study comprised four investigations: facies architecture; seismic modeling and interpretation; Markov random field and Boolean models for geologic modeling of facies distribution; and estimation of geological architecture using the Bayesian/maximum entropy approach. This report discusses results from all four investigations. Investigations were performed using data from the E and F units of the Middle Frio Formation, Stratton Field, one of the major reservoir intervals in the Gulf Coast Basin.

  3. Modeling and Performance Analysis of 10 Gbps Inter-satellite Optical Wireless Communication Link

    Science.gov (United States)

    Singh, Mehtab

    2017-12-01

    Free-space optical (FSO) communication has the advantages of two of the most predominant data transmission technologies - optical fiber communication and wireless communication. Most of the technical aspects of FSO are similar to those of optical fiber communication, with the major difference being the signal propagation medium, which is free space in the case of FSO rather than silica glass in optical fiber communication. One of the most important applications of FSO is inter-satellite optical wireless communication (IsOWC) links, which will be deployed in space in the future. IsOWC links have many advantages over previously existing microwave satellite communication technologies, such as higher bandwidth, lower power consumption, lower cost of implementation, and smaller size and weight. In this paper, modeling and performance analysis of a 10-Gbps inter-satellite communication link with two satellites separated by a distance of 1,200 km has been done using OPTISYSTEM simulation software. Performance has been analyzed on the basis of quality factor, signal to noise ratio (SNR), and total power of the received signal.
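
    For context, a back-of-the-envelope link budget for the stated 1,200 km geometry can be computed from the standard free-space optical link equation; every component value below is an illustrative assumption, not a setting of the OptiSystem model described above.

```python
import math

P_t = 0.01            # transmit power, W (10 mW, assumed)
wavelength = 1550e-9  # m (assumed)
d_t = 0.15            # transmitter aperture diameter, m (assumed)
d_r = 0.15            # receiver aperture diameter, m (assumed)
eta_t = eta_r = 0.8   # optics efficiencies (assumed)
L = 1_200e3           # link range, m (from the abstract)

# Far-field gains of circular apertures and the free-space path term:
g_t = (math.pi * d_t / wavelength) ** 2
g_r = (math.pi * d_r / wavelength) ** 2
path = (wavelength / (4 * math.pi * L)) ** 2

P_r = P_t * eta_t * g_t * path * g_r * eta_r
print(f"received power: {P_r * 1e9:.1f} nW ({10 * math.log10(P_r / 1e-3):.1f} dBm)")
```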

  4. Application of cross-linked and hydrolyzed arabinoxylans in baking of model rye bread.

    Science.gov (United States)

    Buksa, Krzysztof; Nowotna, Anna; Ziobro, Rafał

    2016-02-01

    The role of water extractable arabinoxylan with varying molar mass and structure (cross-linked vs. hydrolyzed) in the structure formation of rye bread was examined using a model bread. Instead of the normal flour, the dough contained starch, arabinoxylan and protein, which were isolated from rye wholemeal. It was observed that the applied mixes of these constituents result in a product closely resembling typical rye bread, even if the arabinoxylan was modified (by cross-linking or hydrolysis). The levels of arabinoxylan required for bread preparation depended on its modification and mix composition. At 3% protein, the maximum applicable level of poorly soluble cross-linked arabinoxylan was 3%, as higher amounts of this preparation resulted in an excessively viscous dough and diminished bread volume. On the other hand, highly soluble, hydrolyzed arabinoxylan could be used at a higher level (6%) together with larger amounts of rye protein (3% or 6%). Further addition of arabinoxylan leads to excessive water absorption, resulting in a decreased viscosity of the dough during baking and insufficient gas retention. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. An atomistic model for cross-linked HNBR elastomers used in seals

    Science.gov (United States)

    Molinari, Nicola; Sutton, Adrian; Stevens, John; Mostofi, Arash

    2015-03-01

    Hydrogenated nitrile butadiene rubber (HNBR) is one of the most common elastomeric materials used for seals in the oil and gas industry. These seals sometimes suffer "explosive decompression," a costly problem in which gases permeate a seal at the elevated temperatures and pressures prevailing in oil and gas wells, leading to rupture when the seal is brought back to the surface. The experimental evidence that HNBR and its unsaturated parent NBR have markedly different swelling properties suggests that cross-linking may occur during hydrogenation of NBR to produce HNBR. We have developed a code compatible with the LAMMPS molecular dynamics package to generate fully atomistic HNBR configurations by hydrogenating initial NBR structures. This can be done with any desired degree of cross-linking. The code uses a model of atomic interactions based on the OPLS-AA force field. We present calculations of the dependence of a number of bulk properties on the degree of cross-linking. Using our atomistic representations of HNBR and NBR, we hope to develop a better molecular understanding of the mechanisms that result in explosive decompression.

  6. Introduction to the Special Section: Linking the MMPI-2-RF to Contemporary Models of Psychopathology.

    Science.gov (United States)

    Sellbom, Martin; Arbisi, Paul A

    2017-01-01

    This special section considers 9 independent articles that seek to link the Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008/2011) to contemporary models of psychopathology. Sellbom (this issue) maps the Specific Problems scales onto hierarchical psychopathology structures, whereas Romero, Toorabally, Burchett, Tarescavage, and Glassmire (this issue) and Shkalim, Almagor, and Ben-Porath (this issue) show evidence of linking the instruments' scales to diagnostic representations of common higher order psychopathology constructs. McCord, Achee, Cannon, Harrop, and Poynter (this issue) link the MMPI-2-RF scales to psychophysiological constructs inspired by the National Institute of Mental Health (NIMH) Research Domain Criteria. Sellbom and Smith (this issue) find support for MMPI-2-RF scale hypotheses in covering personality psychopathology in general, whereas Klein Haneveld, Kamphuis, Smid, and Forbey (this issue) and Kutchen et al. (this issue) demonstrate the utility of the MMPI-2-RF in capturing contemporary conceptualizations of the psychopathic personality. Finally, Franz, Harrop, and McCord (this issue) and Rogers et al. (this issue) mapped the MMPI-2-RF scales onto more specific transdiagnostic constructs reflecting interpersonal functioning and suicide behavior proneness, respectively.

  7. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector AutoRegressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail how the theoretical model and its structure... Furthermore, it is demonstrated how other controversial hypotheses, such as Rational Expectations, can be formulated directly as restrictions on the CVAR parameters. A simple example of a "Neoclassical synthetic" AS-AD model is also formulated. Finally, the partial-general equilibrium distinction is related to the CVAR as well... Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations), are left for future papers.

  8. Quantitative evaluation and modeling of two-dimensional neovascular network complexity: the surface fractal dimension

    International Nuclear Information System (INIS)

    Grizzi, Fabio; Russo, Carlo; Colombo, Piergiuseppe; Franceschini, Barbara; Frezza, Eldo E; Cobos, Everardo; Chiriva-Internati, Maurizio

    2005-01-01

    Modeling the complex development and growth of tumor angiogenesis using mathematics and biological data is a burgeoning area of cancer research. Architectural complexity is the main feature of every anatomical system, including organs, tissues, cells and sub-cellular entities. The vascular system is a complex network whose geometrical characteristics cannot be properly defined using the principles of Euclidean geometry, which is only capable of interpreting regular and smooth objects that are almost impossible to find in Nature. However, fractal geometry is a more powerful means of quantifying the spatial complexity of real objects. This paper introduces the surface fractal dimension (D_s) as a numerical index of the two-dimensional (2-D) geometrical complexity of tumor vascular networks, and their behavior during computer-simulated changes in vessel density and distribution. We show that D_s significantly depends on the number of vessels and their pattern of distribution. This demonstrates that the quantitative evaluation of the 2-D geometrical complexity of tumor vascular systems can be useful not only to measure its complex architecture, but also to model its development and growth. Studying the fractal properties of neovascularity induces reflections upon the real significance of the complex form of branched anatomical structures, in an attempt to define more appropriate methods of describing them quantitatively. This knowledge can be used to predict the aggressiveness of malignant tumors and design compounds that can halt the process of angiogenesis and influence tumor growth.
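
    The surface fractal dimension of a binary 2-D vessel map is commonly estimated by box counting; the abstract does not state the authors' exact estimator, so the numpy sketch below, with a synthetic random walk standing in for a real vessel image, is only illustrative of the idea.

        import numpy as np

        def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
            """Count occupied boxes N(s) at each box size s and fit
            log N(s) = -D*log(s) + c; the slope estimate is D_s."""
            counts = []
            for s in sizes:
                h = img.shape[0] // s * s
                w = img.shape[1] // s * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope

        # Synthetic "vessel network": a random walk on a 256x256 grid
        rng = np.random.default_rng(1)
        img = np.zeros((256, 256), dtype=bool)
        x, y = 128, 128
        for _ in range(20000):
            img[x % 256, y % 256] = True
            x += int(rng.integers(-1, 2))
            y += int(rng.integers(-1, 2))
        print(f"D_s = {box_count_dimension(img):.2f}")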

  9. Preclinical Magnetic Resonance Fingerprinting (MRF) at 7 T: Effective Quantitative Imaging for Rodent Disease Models

    Science.gov (United States)

    Gao, Ying; Chen, Yong; Ma, Dan; Jiang, Yun; Herrmann, Kelsey A.; Vincent, Jason A.; Dell, Katherine M.; Drumm, Mitchell L.; Brady-Kalnay, Susann M.; Griswold, Mark A.; Flask, Chris A.; Lu, Lan

    2015-01-01

    High field, preclinical magnetic resonance imaging (MRI) scanners are now commonly used to quantitatively assess disease status and efficacy of novel therapies in a wide variety of rodent models. Unfortunately, conventional MRI methods are highly susceptible to respiratory and cardiac motion artifacts resulting in potentially inaccurate and misleading data. We have developed an initial preclinical, 7.0 T MRI implementation of the highly novel Magnetic Resonance Fingerprinting (MRF) methodology that has been previously described for clinical imaging applications. The MRF technology combines a priori variation in the MRI acquisition parameters with dictionary-based matching of acquired signal evolution profiles to simultaneously generate quantitative maps of T1 and T2 relaxation times and proton density. This preclinical MRF acquisition was constructed from a Fast Imaging with Steady-state Free Precession (FISP) MRI pulse sequence to acquire 600 MRF images with both evolving T1 and T2 weighting in approximately 30 minutes. This initial high field preclinical MRF investigation demonstrated reproducible and differentiated estimates of in vitro phantoms with different relaxation times. In vivo preclinical MRF results in mouse kidneys and brain tumor models demonstrated an inherent resistance to respiratory motion artifacts as well as sensitivity to known pathology. These results suggest that MRF methodology may offer the opportunity for quantification of numerous MRI parameters for a wide variety of preclinical imaging applications. PMID:25639694
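
    At its core, the MRF reconstruction matches each voxel's measured signal evolution against a precomputed dictionary by maximum normalized inner product. The sketch below uses a deliberately crude two-exponential signal model in place of a real Bloch/EPG simulation of the FISP sequence, so the dictionary itself is fake; only the matching step is representative.

        import numpy as np

        def toy_signal(t1, t2, tr=0.01, n=600):
            # Stand-in for a Bloch/EPG-simulated fingerprint (hypothetical)
            t = np.arange(n) * tr
            return (1 - np.exp(-t / t1)) * np.exp(-t / t2)

        t1s = np.linspace(0.1, 3.0, 60)      # candidate T1 values, s
        t2s = np.linspace(0.01, 0.3, 40)     # candidate T2 values, s
        grid = [(t1, t2) for t1 in t1s for t2 in t2s]
        D = np.array([toy_signal(t1, t2) for t1, t2 in grid])
        D /= np.linalg.norm(D, axis=1, keepdims=True)

        def match(signal):
            """Return the (T1, T2) pair whose dictionary entry best matches."""
            s = signal / np.linalg.norm(signal)
            return grid[int(np.argmax(D @ s))]

        noisy = toy_signal(1.2, 0.08) + 0.01 * np.random.default_rng(0).normal(size=600)
        print(match(noisy))   # recovers approximately (1.2, 0.08)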

  10. A Cytomorphic Chip for Quantitative Modeling of Fundamental Bio-Molecular Circuits.

    Science.gov (United States)

    2015-08-01

    We describe a 0.35 μm BiCMOS silicon chip that quantitatively models fundamental molecular circuits via efficient log-domain cytomorphic transistor equivalents. These circuits include those for biochemical binding with automatic representation of non-modular and loading behavior, e.g., in cascade and fan-out topologies; for representing variable Hill-coefficient operation and cooperative binding; for representing inducer, transcription-factor, and DNA binding; for probabilistic gene transcription with analogic representations of log-linear and saturating operation; for gain, degradation, and dynamics of mRNA and protein variables in transcription and translation; and, for faithfully representing biological noise via tunable stochastic transistor circuits. The use of on-chip DACs and ADCs enables multiple chips to interact via incoming and outgoing molecular digital data packets and thus create scalable biochemical reaction networks. The use of off-chip digital processors and on-chip digital memory enables programmable connectivity and parameter storage. We show that published static and dynamic MATLAB models of synthetic biological circuits including repressilators, feed-forward loops, and feedback oscillators are in excellent quantitative agreement with those from transistor circuits on the chip. Computationally intensive stochastic Gillespie simulations of molecular production are also rapidly reproduced by the chip and can be reliably tuned over the range of signal-to-noise ratios observed in biological cells.
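
    The computationally intensive Gillespie simulations that the chip reproduces can be written compactly for the simplest gene expression motif. A minimal sketch of the stochastic simulation algorithm for an mRNA birth-death process, with made-up rate constants:

        import numpy as np

        def gillespie_mrna(k_tx=2.0, k_deg=0.1, t_end=500.0, seed=0):
            """Gillespie SSA: production (rate k_tx) and first-order
            degradation (rate k_deg*m) of a single mRNA species."""
            rng = np.random.default_rng(seed)
            t, m = 0.0, 0
            times, counts = [0.0], [0]
            while t < t_end:
                rates = (k_tx, k_deg * m)
                total = rates[0] + rates[1]
                t += rng.exponential(1.0 / total)      # time to next reaction
                m += 1 if rng.random() < rates[0] / total else -1
                times.append(t)
                counts.append(m)
            return np.array(times), np.array(counts)

        t, m = gillespie_mrna()
        print(f"mean copy number = {m.mean():.1f} (theory: k_tx/k_deg = 20)")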

  11. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal
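
    Once the log-normal attenuation statistics for a site are fixed, fade-depth exceedance probabilities follow from the distribution's survival function. A sketch with hypothetical site parameters (LeRC-SLAM derives the actual values from its internal rain-rate database):

        import numpy as np
        from scipy import stats

        median_attenuation_db = 2.0   # hypothetical median rain attenuation (dB)
        sigma_ln = 1.1                # hypothetical std dev of ln(attenuation)

        fade = stats.lognorm(s=sigma_ln, scale=median_attenuation_db)
        for depth_db in (3, 6, 10, 20):
            # probability that attenuation exceeds the given fade depth
            print(f"P(fade > {depth_db:2d} dB) = {fade.sf(depth_db):.4f}")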

  12. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal

  13. Linking genes to ecosystem trace gas fluxes in a large-scale model system

    Science.gov (United States)

    Meredith, L. K.; Cueva, A.; Volkmann, T. H. M.; Sengupta, A.; Troch, P. A.

    2017-12-01

    Soil microorganisms mediate biogeochemical cycles through biosphere-atmosphere gas exchange with significant impact on atmospheric trace gas composition. Improving process-based understanding of these microbial populations and linking their genomic potential to the ecosystem-scale is a challenge, particularly in soil systems, which are heterogeneous in biodiversity, chemistry, and structure. In oligotrophic systems, such as the Landscape Evolution Observatory (LEO) at Biosphere 2, atmospheric trace gas scavenging may supply critical metabolic needs to microbial communities, thereby promoting tight linkages between microbial genomics and trace gas utilization. This large-scale model system of three initially homogenous and highly instrumented hillslopes facilitates high temporal resolution characterization of subsurface trace gas fluxes at hundreds of sampling points, making LEO an ideal location to study microbe-mediated trace gas fluxes from the gene to ecosystem scales. Specifically, we focus on the metabolism of ubiquitous atmospheric reduced trace gases hydrogen (H2), carbon monoxide (CO), and methane (CH4), which may have wide-reaching impacts on microbial community establishment, survival, and function. Additionally, microbial activity on LEO may facilitate weathering of the basalt matrix, which can be studied with trace gas measurements of carbonyl sulfide (COS/OCS) and carbon dioxide (O-isotopes in CO2), and presents an additional opportunity for gene to ecosystem study. This work will present initial measurements of this suite of trace gases to characterize soil microbial metabolic activity, as well as links between spatial and temporal variability of microbe-mediated trace gas fluxes in LEO and their relation to genomic-based characterization of microbial community structure (phylogenetic amplicons) and genetic potential (metagenomics). Results from the LEO model system will help build understanding of the importance of atmospheric inputs to

  14. Random blebbing motion: A simple model linking cell structural properties to migration characteristics

    Science.gov (United States)

    Woolley, Thomas E.; Gaffney, Eamonn A.; Goriely, Alain

    2017-07-01

    If the plasma membrane of a cell is able to delaminate locally from its actin cortex, a cellular bleb can be produced. Blebs are pressure-driven protrusions, which are noteworthy for their ability to produce cellular motion. Starting from a general continuum mechanics description, we restrict ourselves to considering cell and bleb shapes that maintain approximately spherical forms. From this assumption, we obtain a tractable algebraic system for bleb formation. By including cell-substrate adhesions, we can model blebbing cell motility. Further, by considering mechanically isolated blebbing events, which are randomly distributed over the cell, we can derive equations linking the macroscopic migration characteristics to the microscopic structural parameters of the cell. This multiscale modeling framework is then used to provide parameter estimates, which are in agreement with current experimental data. In summary, the construction of the mathematical model provides testable relationships between the bleb size and cell motility.
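
    The macroscopic consequence of randomly oriented, mechanically isolated blebs can be caricatured as a 2-D random walk whose step length is set by the bleb size, so the mean squared displacement grows linearly with the number of blebbing events and quadratically with bleb size. A sketch under that drastic simplification (the paper's mechanical model is far more detailed):

        import numpy as np

        def bleb_walk(n_blebs=1000, bleb_size=1.0, seed=0):
            """Each bleb displaces the cell centre by ~bleb_size in a
            uniformly random direction."""
            rng = np.random.default_rng(seed)
            angles = rng.uniform(0.0, 2.0 * np.pi, n_blebs)
            steps = bleb_size * np.column_stack([np.cos(angles), np.sin(angles)])
            return np.cumsum(steps, axis=0)

        for size in (0.5, 1.0, 2.0):
            ends = np.array([bleb_walk(bleb_size=size, seed=s)[-1] for s in range(200)])
            msd = (ends ** 2).sum(axis=1).mean()
            print(f"bleb size {size}: MSD after 1000 blebs = {msd:8.1f}")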

  15. Random blebbing motion: A simple model linking cell structural properties to migration characteristics.

    Science.gov (United States)

    Woolley, Thomas E; Gaffney, Eamonn A; Goriely, Alain

    2017-07-01

    If the plasma membrane of a cell is able to delaminate locally from its actin cortex, a cellular bleb can be produced. Blebs are pressure-driven protrusions, which are noteworthy for their ability to produce cellular motion. Starting from a general continuum mechanics description, we restrict ourselves to considering cell and bleb shapes that maintain approximately spherical forms. From this assumption, we obtain a tractable algebraic system for bleb formation. By including cell-substrate adhesions, we can model blebbing cell motility. Further, by considering mechanically isolated blebbing events, which are randomly distributed over the cell, we can derive equations linking the macroscopic migration characteristics to the microscopic structural parameters of the cell. This multiscale modeling framework is then used to provide parameter estimates, which are in agreement with current experimental data. In summary, the construction of the mathematical model provides testable relationships between the bleb size and cell motility.

  16. SIMO optical wireless links with nonzero boresight pointing errors over M modeled turbulence channels

    Science.gov (United States)

    Varotsos, G. K.; Nistazakis, H. E.; Petkovic, M. I.; Djordjevic, G. T.; Tombras, G. S.

    2017-11-01

    Over the last years, terrestrial free-space optical (FSO) communication systems have attracted increasing scientific and commercial interest in response to the growing demands for ultra-high-bandwidth, cost-effective and secure wireless data transmissions. However, due to the signal propagation through the atmosphere, the performance of such links depends strongly on atmospheric conditions such as weather phenomena and the turbulence effect. Additionally, their operation is affected significantly by the pointing errors effect, which is caused by the misalignment of the optical beam between the transmitter and the receiver. In order to address this significant performance degradation, several statistical models have been proposed, while particular attention has also been given to diversity methods. Here, the turbulence-induced fading of the received optical signal irradiance is studied through the Málaga (M) distribution, which is an accurate model suitable for weak to strong turbulence conditions and unifies most of the well-known, previously emerged models. Thus, taking into account the atmospheric turbulence conditions along with the pointing errors effect with nonzero boresight and the modulation technique that is used, we derive mathematical expressions for the estimation of the average bit error rate performance of SIMO FSO links. Finally, proper numerical results are given to verify our derived expressions, and Monte Carlo simulations are also provided to further validate the accuracy of the proposed analysis and the obtained mathematical expressions.
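
    The average BER of such a link is the conditional BER integrated over the irradiance distribution, which is straightforward to approximate by Monte Carlo. The sketch below deliberately substitutes a unit-mean lognormal irradiance for the Málaga model and a generic Q(sqrt(SNR)) conditional BER, and uses equal-gain combining across apertures; only that averaging-and-combining structure, not the exact statistics, mirrors the paper.

        import numpy as np
        from scipy.special import erfc

        def avg_ber(n_rx=2, snr_db=15.0, sigma_x=0.3, n_trials=200_000, seed=0):
            """Monte Carlo average BER with equal-gain combining over n_rx
            apertures; lognormal fading stands in for the Malaga model."""
            rng = np.random.default_rng(seed)
            # unit-mean lognormal irradiance, log-amplitude std dev sigma_x
            I = rng.lognormal(-2 * sigma_x**2, 2 * sigma_x, (n_trials, n_rx))
            snr = 10 ** (snr_db / 10) * I.mean(axis=1)
            ber = 0.5 * erfc(np.sqrt(snr / 2.0))   # Q(sqrt(snr)) placeholder
            return ber.mean()

        for n in (1, 2, 4):
            print(f"SIMO 1x{n}: average BER = {avg_ber(n_rx=n):.2e}")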

  17. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve derived specifically for our case study area were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with the building area and number of floors

  18. Preparation and Antioxidant Activity of Ethyl-Linked Anthocyanin-Flavanol Pigments from Model Wine Solutions.

    Science.gov (United States)

    Li, Lingxi; Zhang, Minna; Zhang, Shuting; Cui, Yan; Sun, Baoshan

    2018-05-03

    Anthocyanin-flavanol pigments, formed during red wine fermentation and storage by condensation reactions between anthocyanins and flavanols (monomers, oligomers, and polymers), are one of the major groups of polyphenols in aged red wine. However, knowledge of their biological activities is lacking. This is probably due to the structural diversity and complexity of these molecules, which makes the large-scale separation and isolation of the individual compounds very difficult, thus restricting their further study. In this study, anthocyanins (i.e., malvidin-3-glucoside, cyanidin-3-glucoside, and peonidin-3-glucoside) and (−)-epicatechin were first isolated at a preparative scale by high-speed counter-current chromatography. The condensation reaction between each of the isolated anthocyanins and (−)-epicatechin, mediated by acetaldehyde, was conducted in model wine solutions to obtain ethyl-linked anthocyanin-flavanol pigments. The effects of pH, molar ratio, and temperature on the reaction rate were investigated, and the reaction conditions of pH 1.7, molar ratio 1:6:10 (anthocyanin/(−)-epicatechin/acetaldehyde), and reaction temperature of 35 °C were identified as optimal for conversion of anthocyanins to ethyl-linked anthocyanin-flavanol pigments. Six ethyl-linked anthocyanin-flavanol pigments were isolated in larger quantities and collected under optimal reaction conditions, and their chemical structures were identified by HPLC-QTOF-MS and ECD analyses. Furthermore, DPPH, ABTS, and FRAP assays indicate that ethyl-linked anthocyanin-flavanol pigments show stronger antioxidant activities than their precursor anthocyanins.

  19. Preparation and Antioxidant Activity of Ethyl-Linked Anthocyanin-Flavanol Pigments from Model Wine Solutions

    Directory of Open Access Journals (Sweden)

    Lingxi Li

    2018-05-01

    Full Text Available Anthocyanin-flavanol pigments, formed during red wine fermentation and storage by condensation reactions between anthocyanins and flavanols (monomers, oligomers, and polymers), are one of the major groups of polyphenols in aged red wine. However, knowledge of their biological activities is lacking. This is probably due to the structural diversity and complexity of these molecules, which makes the large-scale separation and isolation of the individual compounds very difficult, thus restricting their further study. In this study, anthocyanins (i.e., malvidin-3-glucoside, cyanidin-3-glucoside, and peonidin-3-glucoside) and (−)-epicatechin were first isolated at a preparative scale by high-speed counter-current chromatography. The condensation reaction between each of the isolated anthocyanins and (−)-epicatechin, mediated by acetaldehyde, was conducted in model wine solutions to obtain ethyl-linked anthocyanin-flavanol pigments. The effects of pH, molar ratio, and temperature on the reaction rate were investigated, and the reaction conditions of pH 1.7, molar ratio 1:6:10 (anthocyanin/(−)-epicatechin/acetaldehyde), and reaction temperature of 35 °C were identified as optimal for conversion of anthocyanins to ethyl-linked anthocyanin-flavanol pigments. Six ethyl-linked anthocyanin-flavanol pigments were isolated in larger quantities and collected under optimal reaction conditions, and their chemical structures were identified by HPLC-QTOF-MS and ECD analyses. Furthermore, DPPH, ABTS, and FRAP assays indicate that ethyl-linked anthocyanin-flavanol pigments show stronger antioxidant activities than their precursor anthocyanins.

  20. Modeling channel interference in an orbital angular momentum-multiplexed laser link

    Science.gov (United States)

    Anguita, Jaime A.; Neifeld, Mark A.; Vasic, Bane V.

    2009-08-01

    We study the effects of optical turbulence on the energy crosstalk among constituent orbital angular momentum (OAM) states in a vortex-based multi-channel laser communication link and determine channel interference in terms of turbulence strength and OAM state separation. We characterize the channel interference as a function of C_n^2 and transmit OAM state, and propose probability models to predict the random fluctuations in the received signals for such an architecture. Simulations indicate that turbulence-induced channel interference is mutually correlated across receive channels.

  1. A regionally-linked, dynamic material flow modelling tool for rolled, extruded and cast aluminium products

    DEFF Research Database (Denmark)

    Bertram, M.; Ramkumar, S.; Rechberger, H.

    2017-01-01

    A global aluminium flow modelling tool, comprising nine trade-linked regions, namely China, Europe, Japan, Middle East, North America, Other Asia, Other Producing Countries, South America and Rest of World, has been developed. The purpose of the Microsoft Excel-based tool is the quantification of regional stocks and flows of rolled, extruded and casting alloys across space and over time, giving the industry the ability to evaluate the potential to recycle aluminium scrap most efficiently. The International Aluminium Institute will update the tool annually and publish a visualisation of results...

  2. Quantitative Circulatory Physiology: an integrative mathematical model of human physiology for medical education.

    Science.gov (United States)

    Abram, Sean R; Hodnett, Benjamin L; Summers, Richard L; Coleman, Thomas G; Hester, Robert L

    2007-06-01

    We have developed Quantitative Circulatory Physiology (QCP), a mathematical model of integrative human physiology containing over 4,000 variables of biological interactions. This model provides a teaching environment that mimics clinical problems encountered in the practice of medicine. The model structure is based on documented physiological responses within peer-reviewed literature and serves as a dynamic compendium of physiological knowledge. The model is solved using a desktop, Windows-based program, allowing students to calculate time-dependent solutions and interactively alter over 750 parameters that modify physiological function. The model can be used to understand proposed mechanisms of physiological function and the interactions among physiological variables that may not be otherwise intuitively evident. In addition to open-ended or unstructured simulations, we have developed 30 physiological simulations, including heart failure, anemia, diabetes, and hemorrhage. Additional simulations include 29 patients in which students are challenged to diagnose the pathophysiology based on their understanding of integrative physiology. In summary, QCP allows students to examine, integrate, and understand a host of physiological factors without causing harm to patients. This model is available as a free download for Windows computers at http://physiology.umc.edu/themodelingworkshop.

  3. Plutonium chemistry: a synthesis of experimental data and a quantitative model for plutonium oxide solubility

    International Nuclear Information System (INIS)

    Haschke, J.M.; Oversby, V.M.

    2002-01-01

    The chemistry of plutonium is important for assessing potential behavior of radioactive waste under conditions of geologic disposal. This paper reviews experimental data on dissolution of plutonium oxide solids, describes a hybrid kinetic-equilibrium model for predicting steady-state Pu concentrations, and compares laboratory results with predicted Pu concentrations and oxidation-state distributions. The model is based on oxidation of PuO2 by water to produce PuO2+x, an oxide that can release Pu(V) to solution. Kinetic relationships between formation of PuO2+x, dissolution of Pu(V), disproportionation of Pu(V) to Pu(IV) and Pu(VI), and reduction of Pu(VI) are given and used in model calculations. Data from tests of pyrochemical salt wastes in brines are discussed and interpreted using the conceptual model. Essential data for quantitative modeling at conditions relevant to nuclear waste repositories are identified, and laboratory experiments to determine rate constants for use in the model are discussed.

  4. A pulsatile flow model for in vitro quantitative evaluation of prosthetic valve regurgitation

    Directory of Open Access Journals (Sweden)

    S. Giuliatti

    2000-03-01

    Full Text Available A pulsatile pressure-flow model was developed for in vitro quantitative color Doppler flow mapping studies of valvular regurgitation. The flow through the system was generated by a piston which was driven by stepper motors controlled by a computer. The piston was connected to acrylic chambers designed to simulate "ventricular" and "atrial" heart chambers. Inside the "ventricular" chamber, a prosthetic heart valve was placed at the inflow connection with the "atrial" chamber while another prosthetic valve was positioned at the outflow connection with flexible tubes, elastic balloons and a reservoir arranged to mimic the peripheral circulation. The flow model was filled with a 0.25% corn starch/water suspension to improve Doppler imaging. A continuous flow pump transferred the liquid from the peripheral reservoir to another one connected to the "atrial" chamber. The dimensions of the flow model were designed to permit adequate imaging by Doppler echocardiography. Acoustic windows allowed placement of transducers distal and perpendicular to the valves, so that the ultrasound beam could be positioned parallel to the valvular flow. Strain-gauge and electromagnetic transducers were used for measurements of pressure and flow in different segments of the system. The flow model was also designed to fit different sizes and types of prosthetic valves. This pulsatile flow model was able to generate pressure and flow in the physiological human range, with independent adjustment of pulse duration and rate as well as of stroke volume. This model mimics flow profiles observed in patients with regurgitant prosthetic valves.

  5. Quantitative models for predicting adsorption of oxytetracycline, ciprofloxacin and sulfamerazine to swine manures with contrasting properties.

    Science.gov (United States)

    Cheng, Dengmiao; Feng, Yao; Liu, Yuanwang; Li, Jinpeng; Xue, Jianming; Li, Zhaojun

    2018-09-01

    Understanding antibiotic adsorption in livestock manures is crucial to assessing the fate and risk of antibiotics in the environment. In this study, three quantitative models were developed from swine manure-water distribution coefficients (log K_d) for oxytetracycline (OTC), ciprofloxacin (CIP) and sulfamerazine (SM1) in swine manures. Physicochemical parameters (n=12) of the swine manure were used as independent variables in partial least-squares (PLS) analyses. The cumulative cross-validated regression coefficients (Q²_cum), standard deviations (SDs) and external validation coefficients (Q²_ext) ranged from 0.761 to 0.868, 0.027 to 0.064, and 0.743 to 0.827 for the three models; as such, the internal and external predictability of the models were strong. pH, soluble organic carbon (SOC) and nitrogen (SON), and Ca were important explanatory variables for the OTC model; pH, SOC, and SON for the CIP model; and pH, total organic nitrogen (TON), and SOC for the SM1 model. The high VIPs (variable importance in the projections) of pH (1.178-1.396), SOC (0.968-1.034), and SON (0.822 and 0.865) established these physicochemical parameters as likely being dominant (associatively) in affecting the transport of antibiotics in swine manures. Copyright © 2018 Elsevier B.V. All rights reserved.
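
    A PLS model of log K_d from manure descriptors, in the spirit of the paper, can be assembled with scikit-learn. Everything below is synthetic: the descriptor matrix, the effect sizes, and the descriptor indices are hypothetical, and the VIP computation simply follows the standard definition.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 12))      # 30 manures x 12 descriptors (synthetic)
        log_kd = (1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2]
                  + rng.normal(0.0, 0.3, 30))

        pls = PLSRegression(n_components=3)
        q2 = cross_val_score(pls, X, log_kd, cv=5, scoring="r2")
        pls.fit(X, log_kd)

        # VIP scores flag descriptors that dominate the projection
        t = pls.transform(X)                         # X scores
        w, q = pls.x_weights_, pls.y_loadings_
        ssy = (t ** 2).sum(axis=0) * (q ** 2).ravel()
        vip = np.sqrt(X.shape[1] * (w ** 2 @ ssy) / ssy.sum())
        print("cross-validated R2:", q2.mean().round(2))
        print("descriptors ranked by VIP:", np.argsort(vip)[::-1][:3])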

  6. Modelling and Quantitative Analysis of LTRACK–A Novel Mobility Management Algorithm

    Directory of Open Access Journals (Sweden)

    Benedek Kovács

    2006-01-01

    Full Text Available This paper discusses the improvements and parameter optimization issues of LTRACK, a recently proposed mobility management algorithm. Mathematical modelling of the algorithm and the behavior of the Mobile Node (MN) are used to optimize the parameters of LTRACK. A numerical method is given to determine the optimal values of the parameters. Markov chains are used to model both the base algorithm and the so-called loop removal effect. An extended qualitative and quantitative analysis is carried out to compare LTRACK to existing handover mechanisms such as MIP, Hierarchical Mobile IP (HMIP), Dynamic Hierarchical Mobility Management Strategy (DHMIP), Telecommunication Enhanced Mobile IP (TeleMIP), Cellular IP (CIP) and HAWAII. LTRACK is sensitive to network topology and MN behavior, so MN movement modelling is also introduced and discussed with different topologies. The techniques presented here can not only be used to model the LTRACK algorithm, but other algorithms too. There are many discussions and calculations to support our mathematical model and to show that it is adequate in many cases. The model is valid on various network levels, scalable vertically in the ISO-OSI layers and also scales well with the number of network elements.
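
    The Markov-chain analysis underlying such comparisons reduces to computing a stationary distribution and weighting per-state signalling costs by it. A generic three-state sketch (the states, transition probabilities, and costs are invented, not LTRACK's actual ones):

        import numpy as np

        # hypothetical MN states: idle, moving, handover
        P = np.array([[0.90, 0.08, 0.02],
                      [0.20, 0.70, 0.10],
                      [0.50, 0.30, 0.20]])

        # stationary distribution: solve pi @ P = pi with sum(pi) = 1
        A = np.vstack([P.T - np.eye(3), np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi = np.linalg.lstsq(A, b, rcond=None)[0]

        cost = np.array([0.0, 1.0, 5.0])   # hypothetical per-state cost
        print("stationary distribution:", pi.round(3))
        print("mean signalling cost per step:", round(float(pi @ cost), 3))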

  7. Implementation and Evaluation of Linked Parenting Models in a Large Urban Child Welfare System

    Science.gov (United States)

    Feldman, Sara Wolf; Wulczyn, Fred; Saldana, Lisa; Forgatch, Marion

    2015-01-01

    During the past decade, there have been increased efforts to implement evidence-based practices into child welfare systems to improve outcomes for children in foster care and their families. In this paper, the implementation and evaluation of a policy-driven large system-initiated reform is described. Over 250 caseworkers and supervisors were trained and supported to implement two evidence-based parent focused interventions in five private agencies serving over 2,000 children and families. At the request of child welfare system leaders, a third intervention was developed and implemented to train the social work workforce to use evidence-based principles in everyday interactions with caregivers (including foster, relative, adoptive, and biological parents). In this paper, we describe the policy context and the targeted outcomes of the reform. We discuss the theory of the interventions and the logistics of how they were linked to create consistency and synergy. Training and ongoing consultation strategies used are described as are some of the barriers and opportunities that arose during the implementation. The strategy for creating a path to sustainability is also discussed. The reform effort was evaluated using both qualitative and quantitative methods; the evaluation design, research questions and preliminary results are provided. PMID:26602831

  8. [Quantitative models between canopy hyperspectrum and its component features at apple tree prosperous fruit stage].

    Science.gov (United States)

    Wang, Ling; Zhao, Geng-xing; Zhu, Xi-cun; Lei, Tong; Dong, Fang

    2010-10-01

    Hyperspectral techniques have become the basis of quantitative remote sensing. The hyperspectrum of an apple tree canopy at the prosperous fruit stage carries the combined information of fruits, leaves, stems, soil and reflecting films, and is mostly determined by the component features of the canopy at this stage. First, the hyperspectrum of 18 sample apple trees with reflecting films was compared with that of 44 trees without reflecting films. The impact of reflecting films on reflectance was obvious, so sample trees with ground reflecting films were analyzed separately from those without. Second, nine indices of canopy components were constructed based on classified digital photos of the 44 apple trees without ground films. Third, the correlation between the nine indices and canopy reflectance, including several kinds of transformed data, was analyzed. The results showed that reflectance correlated best with the fruit-to-leaf ratio, with a maximum coefficient of 0.815, and that the correlation between reflectance and the leaf ratio was slightly better than that between reflectance and fruit density. Models based on correlation analysis, linear regression, BP neural network and support vector regression were then used to describe the quantitative relationship between hyperspectral reflectance and the fruit-to-leaf ratio, using the DPS and LIBSVM software packages. All four models in the 611-680 nm characteristic band were feasible for prediction, while the accuracy of the BP neural network and support vector regression models exceeded that of one-variable and multi-variable linear regression, with the support vector regression model being the most accurate. This study can serve as a reliable theoretical reference for apple yield estimation based on remote sensing data.

  9. Use of a plant level logic model for quantitative assessment of systems interactions

    International Nuclear Information System (INIS)

    Chu, B.B.; Rees, D.C.; Kripps, L.P.; Hunt, R.N.; Bradley, M.

    1985-01-01

    The Electric Power Research Institute (EPRI) has sponsored a research program to investigate methods for identifying systems interactions (SIs) and for evaluating their importance. Phase I of the EPRI research project focused on the evaluation of methods for identification of SIs. Major results of the Phase I activities are the documentation of four different methodologies for identification of potential SIs and the development of guidelines for performing an effective plant walkdown in support of an SI analysis. Phase II of the project, currently being performed, is utilizing a plant level logic model of a pressurized water reactor (PWR) to determine the quantitative importance of identified SIs. In Phase II, previously reported events involving interactions between systems were screened and selected on the basis of their relevance to the Baltimore Gas and Electric (BG&E) Calvert Cliffs Nuclear Power Plant design and their perceived potential safety significance. Selected events were then incorporated into the BG&E plant level GO logic model. The model is being exercised to calculate the relative importance of these events. Five previously identified event scenarios, extracted from licensee event reports (LERs), are being evaluated during the course of the study. A key feature of the approach being used in Phase II is the use of a logic model in a manner that effectively evaluates the impact of events at the system level and the plant level for the mitigation of transients. Preliminary study results indicate that the developed methodology can be a viable and effective means for determining the quantitative significance of SIs.

  10. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
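
    The three multivariate statistics named above are available off the shelf: statsmodels' MANOVA reports Wilks' lambda, Pillai's trace, and the Hotelling-Lawley trace in a single call. The toy below regresses three simulated traits on one additively coded variant; the paper's method additionally works with functional (basis-expanded) representations of many variants in a region, which this sketch omits.

        import numpy as np
        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        rng = np.random.default_rng(0)
        n = 500
        g = rng.integers(0, 3, n)                  # genotype coded 0/1/2
        df = pd.DataFrame({
            "trait1": 0.3 * g + rng.normal(size=n),
            "trait2": 0.2 * g + rng.normal(size=n),
            "trait3": rng.normal(size=n),          # null trait
            "genotype": g,
        })

        mod = MANOVA.from_formula("trait1 + trait2 + trait3 ~ genotype", data=df)
        print(mod.mv_test())   # Wilks, Pillai, Hotelling-Lawley, Roy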

  11. A quantitative quasispecies theory-based model of virus escape mutation under immune selection.

    Science.gov (United States)

    Woo, Hyung-June; Reifman, Jaques

    2012-08-07

    Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although fundamental aspects of such a balance of mutation and selection pressure have been established by the quasispecies theory decades ago, its implications have largely remained qualitative. Here, we present a quantitative approach to model the virus evolution under cytotoxic T-lymphocyte immune response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral genome. We stochastically simulated the growth of a viral population originating from a single wild-type founder virus and its recognition and clearance by the immune response, as well as the expansion of its genetic diversity. Applied to the immune escape of a simian immunodeficiency virus epitope, model predictions were quantitatively comparable to the experimental data. Within the model parameter space, we found two qualitatively different regimes of infectious disease pathogenesis, each representing alternative fates of the immune response: It can clear the infection in finite time or eventually be overwhelmed by viral growth and escape mutation. The latter regime exhibits the characteristic disease progression pattern of human immunodeficiency virus, while the former is bounded by maximum mutation rates that can be suppressed by the immune response. Our results demonstrate that, by explicitly representing epitope mutations and thus providing a genotype-phenotype map, the quasispecies theory can form the basis of a detailed sequence-specific model of real-world viral pathogens evolving under immune selection.

  12. Linking market interaction intensity of 3D Ising type financial model with market volatility

    Science.gov (United States)

    Fang, Wen; Ke, Jinchuan; Wang, Jun; Feng, Ling

    2016-11-01

    Microscopic interaction models in physics have been used to investigate the complex phenomena of economic systems. The simple interactions involved can lead to complex behaviors and help the understanding of mechanisms in the financial market at a systemic level. This article aims to develop a financial time series model through a 3D (three-dimensional) Ising dynamic system, which is widely used as an interacting-spins model to explain ferromagnetism in physics. Through Monte Carlo simulations of the financial model and numerical analysis of both the simulated return time series and historical return data of the Hushen 300 (HS300) index in the Chinese stock market, we show that despite its simplicity, this model displays stylized facts similar to those seen in real financial markets. We demonstrate a possible underlying link between the volatility fluctuations of the real stock market and changes in the interaction strengths of market participants in the financial model. In particular, the stochastic interaction strength in our model demonstrates that the real market may be consistently operating near the critical point of the system.
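
    One way to see the spin-to-price mapping concretely is to evolve a small 3D Ising lattice with Metropolis dynamics and read the log-change of magnetisation per sweep as the return; this coarse-graining rule is an illustration, not necessarily the paper's exact construction.

        import numpy as np

        def ising3d_returns(L=8, beta=0.21, sweeps=400, seed=0):
            """Metropolis dynamics on an LxLxL Ising lattice; returns the
            per-sweep log-change of |magnetisation| as a 'return' series."""
            rng = np.random.default_rng(seed)
            s = rng.choice([-1, 1], size=(L, L, L))
            mags = []
            for _ in range(sweeps):
                for _ in range(L ** 3):
                    i, j, k = rng.integers(0, L, 3)
                    nb = (s[(i+1) % L, j, k] + s[(i-1) % L, j, k]
                          + s[i, (j+1) % L, k] + s[i, (j-1) % L, k]
                          + s[i, j, (k+1) % L] + s[i, j, (k-1) % L])
                    dE = 2 * s[i, j, k] * nb       # J = 1
                    if dE <= 0 or rng.random() < np.exp(-beta * dE):
                        s[i, j, k] *= -1
                mags.append(abs(int(s.sum())) / L ** 3 + 1e-6)  # avoid log(0)
            return np.diff(np.log(mags))

        r = ising3d_returns()
        kurt = ((r - r.mean()) ** 4).mean() / r.var() ** 2
        print(f"excess kurtosis of returns: {kurt - 3:.2f}")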

  13. Individual-based modeling of fish: Linking to physical models and water quality.

    Energy Technology Data Exchange (ETDEWEB)

    Rose, K.A.

    1997-08-01

    The individual-based modeling approach for simulating fish population and community dynamics is gaining popularity. Individual-based modeling has been used in many other fields, such as forest succession and astronomy. The popularity of the individual-based approach is partly a result of the lack of success of the more aggregate modeling approaches traditionally used for simulating fish population and community dynamics. Also, the recent recognition that it is often the atypical individual that survives has fostered interest in the individual-based approach. Two general types of individual-based models are distribution and configuration. Distribution models follow the probability distributions of individual characteristics, such as length and age. Configuration models explicitly simulate each individual, the sum over individuals being the population. DeAngelis et al. (1992) showed that, when distribution and configuration models were formulated from the same common pool of information, both approaches generated similar predictions. The distribution approach was more compact and general, while the configuration approach was more flexible. Simple biological changes, such as making growth rate dependent on previous days' growth rates, were easy to implement in the configuration version but prevented simple analytical solution of the distribution version.

  14. Need for collection of quantitative distribution data for dosimetry and metabolic modeling

    International Nuclear Information System (INIS)

    Lathrop, K.A.

    1976-01-01

    Problems in radiation dose distribution studies in humans are discussed. Data show that the effective half-times for ⁷Be and ⁷⁵Se in the mouse, rat, monkey, dog, and human show no correlation with weight, body surface, or any other readily apparent factor that could be used to equate nonhuman and human data. Another problem sometimes encountered in attempting to extrapolate animal data to humans involves equivalent doses of the radiopharmaceutical. A usual human dose for a radiopharmaceutical is 1 ml, or 0.017 mg/kg. The same solution injected into a mouse in a convenient volume of 0.1 ml results in a dose of 4 ml/kg, or 240 times that received by the human. The effect on whole-body retention produced by a dose difference of similar magnitude for selenium in the rat shows that retention is at least twice as great with the smaller amount. With the development of methods for the collection of data throughout the body representing the fractional distribution of radioactivity versus time, not only can more realistic dose estimates be made, but also the tools will be provided for the study of physiological and biochemical interrelationships in the intact subject, from which compartmental models may be made which have diagnostic significance. The unique requirement for quantitative biologic data needed for calculation of radiation absorbed doses is the same as the unique scientific contribution that nuclear medicine can make, which is the quantitative in vivo study of physiologic and biochemical processes. The technique involved is not the same as quantitation of a radionuclide image, but is a step beyond

  15. Development of Multidimensional Gap Conductance model using Virtual Link Gap Element

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    The gap conductance that determines the temperature gradient between pellet and cladding can be quite sensitive to gap thickness. For instance, once the gap size increases up to several micrometers in certain regions, the difference in pellet surface temperature increases by up to 100 K. Therefore, iterative thermo-mechanical coupled analysis is required to solve the temperature distribution throughout pellet and cladding. Recently, multidimensional fuel performance codes have been developed in the advanced countries to evaluate the thermal behavior of fuel under off-normal and design-basis-accident (DBA) conditions using the finite element method (FEM). The FRAPCON-FRAPTRAN code system, which is well known as a verified and reliable code, incorporates a 1D thermal module and a multidimensional mechanical module; a multidimensional gap conductance model is not applied in this code. ALCYONE, developed by CEA, introduces an equivalent heat convection coefficient that represents multidimensional gap conductance as a function of gap thickness. BISON, a multidimensional fuel performance code developed by INL, includes a multidimensional gap conductance model based on projected thermal contact. In general, the thermal contact algorithm is a nonlinear calculation, which is a numerically expensive approach. The gap conductance model in multiple dimensions is a difficult issue in terms of convergence and nonlinearity, because gap conductance is a function of gap thickness, which in turn depends on the mechanical analysis at each iteration step. In this paper, a virtual link gap (VLG) element is proposed to resolve the convergence issues and the nonlinear characteristics of multidimensional gap conductance. The proposed VLG model was evaluated in terms of calculation accuracy and convergence efficiency. LWR fuel performance codes should incorporate a thermo-mechanical loop to solve the gap conductance problem iteratively. However, gap conductance in a multidimensional model is a difficult issue owing to its nonlinearity and
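
    The nonlinearity is easy to exhibit in a single radial zone: conductance depends on the gap width, the gap width depends on the temperature drop through pellet thermal expansion, and the two must be iterated to consistency. A damped fixed-point sketch with illustrative (not fuel-qualified) property values:

        import numpy as np

        k_gas = 3.0e-4     # gas conductivity, W/(mm*K)      (illustrative)
        g_jump = 1.0e-2    # temperature-jump distance, mm   (illustrative)
        q_line = 20.0      # linear heat rate, W/mm (= 20 kW/m)
        alpha = 1.0e-5     # pellet thermal expansion coeff., 1/K
        gap_cold = 8.0e-2  # as-fabricated radial gap, mm
        r_pel = 4.1        # pellet radius, mm

        gap = gap_cold
        for it in range(100):
            h = k_gas / (gap + g_jump)                # gap conductance, W/(mm^2*K)
            dT = q_line / (2 * np.pi * r_pel * h)     # temperature drop across gap
            new_gap = gap_cold - alpha * r_pel * dT   # expansion closes the gap
            if abs(new_gap - gap) < 1e-9:
                break
            gap = 0.5 * (gap + new_gap)               # damping aids convergence
        print(f"iter {it}: gap = {gap*1e3:.1f} um, "
              f"h = {h*1e6:.0f} W/(m^2*K), dT = {dT:.0f} K")

    Without the damped update, the strong feedback between h and the gap width can oscillate, which is the kind of convergence problem the VLG element is designed to sidestep in the full multidimensional setting.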

  16. Exploring the roles of cannot-link constraint in community detection via Multi-variance Mixed Gaussian Generative Model

    Science.gov (United States)

    Ge, Meng; Jin, Di; He, Dongxiao; Fu, Huazhu; Wang, Jing; Cao, Xiaochun

    2017-01-01

    Due to the demand for performance improvement and the existence of prior information, semi-supervised community detection with pairwise constraints becomes a hot topic. Most existing methods have been successfully encoding the must-link constraints, but neglect the opposite ones, i.e., the cannot-link constraints, which can force the exclusion between nodes. In this paper, we are interested in understanding the role of cannot-link constraints and effectively encoding pairwise constraints. Towards these goals, we define an integral generative process jointly considering the network topology, must-link and cannot-link constraints. We propose to characterize this process as a Multi-variance Mixed Gaussian Generative (MMGG) Model to address diverse degrees of confidences that exist in network topology and pairwise constraints and formulate it as a weighted nonnegative matrix factorization problem. The experiments on artificial and real-world networks not only illustrate the superiority of our proposed MMGG, but also, most importantly, reveal the roles of pairwise constraints. That is, though the must-link is more important than cannot-link when either of them is available, both must-link and cannot-link are equally important when both of them are available. To the best of our knowledge, this is the first work on discovering and exploring the importance of cannot-link constraints in semi-supervised community detection. PMID:28678864

  17. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    Science.gov (United States)

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on logic are relatively simple but can describe how signals propagate from one protein to the next, and they have led to the construction of models that simulate the cells' response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell-specific data, resulting in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.
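
    Cast as a regular NLP, training reduces to smooth least-squares over transfer-function parameters. The two-edge toy below uses normalized Hill functions and synthetic readouts, solved with scipy; the paper's formulation handles much larger topologies and includes the network pre/post-processing omitted here.

        import numpy as np
        from scipy.optimize import minimize

        def hill(x, k, n=2.0):
            # normalized Hill transfer function for one signaling edge
            return x ** n / (k ** n + x ** n)

        stimulus = np.linspace(0.0, 1.0, 20)
        rng = np.random.default_rng(0)
        observed = hill(hill(stimulus, 0.3), 0.5) + rng.normal(0.0, 0.02, 20)

        def sse(theta):
            k1, k2 = np.abs(theta)   # keep half-saturation constants positive
            pred = hill(hill(stimulus, k1), k2)   # stimulus -> kinase -> target
            return ((pred - observed) ** 2).sum()

        fit = minimize(sse, x0=[0.5, 0.5], method="Nelder-Mead")
        print("fitted constants:", np.abs(fit.x).round(2), "SSE:", round(fit.fun, 4))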

  18. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    Directory of Open Access Journals (Sweden)

    Alexander Mitsos

    Full Text Available Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cells response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell specific data to result in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: (i) excessive CPU time requirements and (ii) loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.

  19. Quantitative model of super-Arrhenian behavior in glass forming materials

    Science.gov (United States)

    Caruthers, J. M.; Medvedev, G. A.

    2018-05-01

    The key feature of glass forming liquids is the super-Arrhenian temperature dependence of the mobility, where the mobility can increase by ten orders of magnitude or more as the temperature is decreased if crystallization does not intervene. A fundamental description of the super-Arrhenian behavior has been developed; specifically, the logarithm of the relaxation time is a linear function of 1/Ūx, where Ūx is the independently determined excess molar internal energy and B is a material constant. This one-parameter mobility model quantitatively describes data for 21 glass forming materials, which are all the materials where there are sufficient experimental data for analysis. The effect of pressure on the log a mobility is also described using the same Ūx(T, p) function determined from the difference between the liquid and crystalline internal energies. It is also shown that B is well correlated with the heat of fusion. The prediction of the B/Ūx model is compared to the Adam and Gibbs 1/(T S̄x) model, where the B/Ūx model is significantly better in unifying the full complement of mobility data. The implications of the B/Ūx model for the development of a fundamental description of glass are discussed.
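
    Fitting the one-parameter model amounts to a straight line of the logarithm of the relaxation time against 1/Ūx. A sketch with invented (Ūx, tau) pairs, purely to show the regression (the paper determines Ūx independently from thermodynamic data):

        import numpy as np

        Ux = np.array([5.0, 4.0, 3.0, 2.5, 2.0])           # kJ/mol, hypothetical
        log_tau = np.array([-6.1, -4.0, -0.5, 1.6, 4.8])   # log10(tau/s), hypothetical

        # log10(tau) = log10(tau0) + B/Ux
        B, log_tau0 = np.polyfit(1.0 / Ux, log_tau, 1)
        print(f"B = {B:.1f} kJ/mol, log10(tau0) = {log_tau0:.1f}")

        # a modest change in Ux spans many decades of mobility:
        print(f"predicted spread: {B / Ux.min() - B / Ux.max():.1f} decades")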

  20. Multivariate characterisation and quantitative structure-property relationship modelling of nitroaromatic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Joensson, S. [Man-Technology-Environment Research Centre, Department of Natural Sciences, Orebro University, 701 82 Orebro (Sweden)], E-mail: sofie.jonsson@nat.oru.se; Eriksson, L.A. [Department of Natural Sciences and Orebro Life Science Center, Orebro University, 701 82 Orebro (Sweden); Bavel, B. van [Man-Technology-Environment Research Centre, Department of Natural Sciences, Orebro University, 701 82 Orebro (Sweden)

    2008-07-28

    A multivariate model to characterise nitroaromatics and related compounds based on molecular descriptors was calculated. Descriptors were collected from the literature and through empirical, semi-empirical and density functional theory-based calculations. Principal components were used to describe the distribution of the compounds in a multidimensional space. Four components described 76% of the variation in the dataset. PC1 separated the compounds by molecular weight, PC2 separated the different isomers, PC3 arranged the compounds according to functional groups such as nitrobenzoic acids, nitrobenzenes, nitrotoluenes and nitroesters, and PC4 differentiated the chlorine-containing compounds from the others. Quantitative structure-property relationship models were calculated using partial least squares (PLS) projection to latent structures to predict gas chromatographic (GC) retention times and the distribution between the water phase and air using solid-phase microextraction (SPME). GC retention time was found to depend on the presence of polar amine groups, electronic descriptors including the highest occupied molecular orbital, dipole moments, and the melting point. The model of GC retention time fitted well, but its precision was not sufficient for practical use. An important environmental parameter, the distribution between headspace (air) and the water phase, was measured using SPME. This parameter was mainly dependent on Henry's law constant, vapour pressure, log P, hydroxyl group content and the atmospheric OH rate constant. The predictive capacity of the model improved substantially when the model was recalculated using only these five descriptors.
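
    The PLS step described above can be sketched as follows; the descriptor matrix and retention times here are random placeholders, not the study's data:

        # Sketch of a QSPR workflow like the one described: PLS regression from
        # molecular descriptors to GC retention time, with cross-validation.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 12))        # 40 compounds x 12 descriptors (placeholder)
        y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.3, size=40)

        pls = PLSRegression(n_components=4)  # number of latent variables
        r2_cv = cross_val_score(pls, X, y, cv=5, scoring="r2")
        print("cross-validated r2:", r2_cv.mean())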

  1. Combining quantitative trait loci analysis with physiological models to predict genotype-specific transpiration rates.

    Science.gov (United States)

    Reuning, Gretchen A; Bauerle, William L; Mullen, Jack L; McKay, John K

    2015-04-01

    Transpiration is controlled by evaporative demand and stomatal conductance (gs), and there can be substantial genetic variation in gs. A key parameter in empirical models of transpiration is minimum stomatal conductance (g0), a trait that can be measured and has a large effect on gs and transpiration. In Arabidopsis thaliana, g0 exhibits both environmental and genetic variation, and quantitative trait loci (QTL) have been mapped. We used this information to create a genetically parameterized empirical model to predict the transpiration of genotypes. For the parental lines, this worked well. However, in a recombinant inbred population, the predictions proved less accurate. When based only upon their genotype at a single g0 QTL, genotypes were less distinct than our model predicted. Follow-up experiments indicated that both genotype-by-environment interaction and polygenic inheritance complicate the incorporation of genetic effects into physiological models. The use of ecophysiological or 'crop' models for predicting the transpiration of novel genetic lines will benefit from incorporating further knowledge of the genetic control and degree of independence of the core traits/parameters underlying gs variation. © 2014 John Wiley & Sons Ltd.
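
    For context, one common empirical form in which g0 enters such models is a Ball-Berry-style conductance equation; the sketch below uses illustrative parameter values and is not necessarily the model form used in this study:

        # Illustrative Ball-Berry-style empirical model showing how the minimum
        # stomatal conductance g0 enters transpiration predictions. All parameter
        # values are placeholders; the paper's actual model may differ.
        def stomatal_conductance(g0, g1, a_net, rh, ca):
            """gs = g0 + g1 * A_net * rh / ca  (mol m-2 s-1)."""
            return g0 + g1 * a_net * rh / ca

        def transpiration(gs, vpd, p_atm=101.3):
            """Simple diffusion approximation: E = gs * VPD / P  (mol m-2 s-1)."""
            return gs * vpd / p_atm

        gs = stomatal_conductance(g0=0.03, g1=9.0, a_net=12.0, rh=0.65, ca=400.0)
        print(transpiration(gs, vpd=1.5))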

  2. Linking electricity and water models to assess electricity choices at water-relevant scales

    International Nuclear Information System (INIS)

    Sattler, S; Rogers, J; Macknick, J; Lopez, A; Yates, D; Flores-Lopez, F

    2012-01-01

    Hydrology/water management and electricity generation projections have been modeled separately, but there has been little effort in intentionally and explicitly linking the two sides of the water–energy nexus. This paper describes a platform for assessing power plant cooling water withdrawals and consumption under different electricity pathways at geographic and time scales appropriate for both electricity and hydrology/water management. This platform uses estimates of regional electricity generation by the Regional Energy Deployment System (ReEDS) as input to a hydrologic and water management model—the Water Evaluation and Planning (WEAP) system. In WEAP, this electricity use represents thermoelectric cooling water withdrawals and consumption within the broader, regional water resource context. Here we describe linking the electricity and water models, including translating electricity generation results from ReEDS-relevant geographies to the water-relevant geographies of WEAP. The result of this analysis is water use by the electric sector at the regional watershed level, which is used to examine the water resource implications of these electricity pathways. (letter)
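
    The geography-translation step can be pictured as disaggregating regional generation into watersheds through a share matrix; the regions, basins, shares and water-use intensity below are hypothetical:

        # Sketch of the ReEDS-to-WEAP geography translation: disaggregate regional
        # generation into watersheds using an assumed share matrix. All names and
        # numbers are hypothetical placeholders.
        generation_twh = {"regionA": 120.0, "regionB": 80.0}   # electricity-model output
        shares = {  # fraction of each region's thermoelectric capacity per watershed
            ("regionA", "basin1"): 0.7, ("regionA", "basin2"): 0.3,
            ("regionB", "basin2"): 1.0,
        }
        withdrawal_gal_per_mwh = 20000.0   # cooling water intensity (placeholder)

        water_by_basin = {}
        for (region, basin), share in shares.items():
            mwh = generation_twh[region] * 1e6 * share   # TWh -> MWh
            water_by_basin[basin] = water_by_basin.get(basin, 0.0) + mwh * withdrawal_gal_per_mwh
        print(water_by_basin)   # watershed-level withdrawals for the water model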

  3. DFT Modeling of Cross-Linked Polyethylene: Role of Gold Atoms and Dispersion Interactions.

    Science.gov (United States)

    Blaško, Martin; Mach, Pavel; Antušek, Andrej; Urban, Miroslav

    2018-02-08

    Using DFT modeling, we analyze the concerted action of gold atoms and dispersion interactions in cross-linked polyethylene. Our model consists of two oligomer chains (PEn) with 7, 11, 15, 19, or 23 carbon atoms in each oligomer, cross-linked with one to three Au atoms through C-Au-C bonds. In structures with a single gold atom, the C-Au-C bond is located at the central position of the oligomer. Binding energies (BEs) with respect to two oligomer radical fragments and Au range from 362 to 489 kJ/mol, depending on the length of the oligomer chain. When the dispersion contribution in PEn-Au-PEn oligomers is omitted, the BE is almost independent of the number of carbon atoms, lying between 293 and 296 kJ/mol. The dispersion energy contributions to BEs in PEn-Au-PEn rise nearly linearly with the number of carbon atoms in the PEn chain. The carbon-carbon distance in the C-Au-C moiety is around 4.1 Å, similar to the bond distance between saturated closed-shell chains in the polyethylene crystal. BEs of pure saturated closed-shell PEn-PEn oligomers are 51-187 kJ/mol. Both Au atoms and dispersion interactions contribute considerably to the creation of nearly parallel chains of oligomers with reasonably high binding energies.

  4. Linking intended visitation to regional economic impact models of bison and elk management

    Science.gov (United States)

    Loomis, J.; Caughlan, L.

    2004-01-01

    This article links intended National Park visitation estimates to regional economic models to calculate the employment impacts of alternative bison and elk management strategies. The survey described alternative National Elk Refuge (NER) management actions and the effects on elk and bison populations at the NER and adjacent Grand Teton National Park (GTNP). Park visitors were then asked if they would change their number of visits with each potential management action. Results indicate there would be a 10% decrease in visitation if bison populations were reduced from 600 to 400 animals and elk populations were reduced in GTNP and the NER. The related decrease in jobs in Teton counties of Wyoming and Idaho is estimated at 5.5%. Adopting a “no active management” option of never feeding elk and bison on the NER yields about one-third the current bison population (200 bison) and about half the elk population. Visitors surveyed about this management option would take about 20% fewer trips, resulting in an 11.3% decrease in employment. Linking intended visitation surveys and regional economic models represents a useful tool for natural resource planners who must present the consequences of potential actions in Environmental Impact Statements and plans to the public and decision makers prior to any action being implemented.

  5. Linking 1D coastal ocean modelling to environmental management: an ensemble approach

    Science.gov (United States)

    Mussap, Giulia; Zavatarelli, Marco; Pinardi, Nadia

    2017-12-01

    The use of a one-dimensional interdisciplinary numerical model of the coastal ocean as a tool contributing to the formulation of ecosystem-based management (EBM) is explored. The focus is on the definition of an experimental design based on ensemble simulations, integrating the variability linked to scenarios (characterised by changes in the system forcing) with the concurrent variation of selected, and poorly constrained, model parameters. The modelling system used was previously designed specifically for use in "data-rich" areas, so that horizontal dynamics can be resolved by a diagnostic approach and external inputs can be parameterised by properly calibrated nudging schemes. Ensembles determined by changes in the simulated environmental (physical and biogeochemical) dynamics, under joint variation of forcing and parameterisation, highlight the uncertainties associated with the application of specific scenarios that are relevant to EBM, providing an assessment of the reliability of the predicted changes. The work was carried out by implementing the coupled modelling system BFM-POM1D in an area of the Gulf of Trieste (northern Adriatic Sea) considered homogeneous from the point of view of hydrological properties, and forcing it with changing climatic (warming) and anthropogenic (reduced land-based nutrient input) pressures. Model parameters affected by considerable uncertainty (due to the lack of relevant observations) were varied jointly with the scenarios of change. The resulting large set of ensemble simulations provided a general estimate of the model uncertainties related to the joint variation of pressures and model parameters. The information on model result variability is meant to convey, efficiently and comprehensibly, the uncertainty/reliability of the model results to non-technical EBM planners and stakeholders, so that model-based information can contribute effectively to EBM.
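
    The experimental design amounts to crossing forcing scenarios with perturbed parameter values; a minimal sketch of such an ensemble grid, with invented scenario labels and parameter ranges:

        # Sketch of the ensemble design described above: cross every forcing
        # scenario with sampled values of poorly constrained parameters.
        # Scenario labels and parameter ranges are illustrative placeholders.
        import itertools
        import numpy as np

        scenarios = ["baseline", "warming", "nutrient_reduction", "combined"]
        rng = np.random.default_rng(1)
        param_sets = [{"zoo_grazing": g, "bg_light_atten": k}
                      for g, k in zip(rng.uniform(0.2, 0.8, 5), rng.uniform(0.03, 0.1, 5))]

        ensemble = [(s, p) for s, p in itertools.product(scenarios, param_sets)]
        print(len(ensemble), "runs")   # 4 scenarios x 5 parameter sets = 20 members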

  6. Linking models of human behaviour and climate alters projected climate change

    Science.gov (United States)

    Beckage, Brian; Gross, Louis J.; Lacasse, Katherine; Carr, Eric; Metcalf, Sara S.; Winter, Jonathan M.; Howe, Peter D.; Fefferman, Nina; Franck, Travis; Zia, Asim; Kinzig, Ann; Hoffman, Forrest M.

    2018-01-01

    Although not considered in climate models, perceived risk stemming from extreme climate events may induce behavioural changes that alter greenhouse gas emissions. Here, we link the C-ROADS climate model to a social model of behavioural change to examine how interactions between perceived risk and emissions behaviour influence projected climate change. Our coupled climate and social model resulted in a global temperature change ranging from 3.4 to 6.2 °C by 2100, compared with 4.9 °C for the C-ROADS model alone, and led to behavioural uncertainty of a similar magnitude to physical uncertainty (2.8 °C versus 3.5 °C). The model components with the largest influence on temperature were the functional form of the response to extreme events, the interaction of perceived behavioural control with perceived social norms, and behaviours leading to sustained emissions reductions. Our results suggest that policies emphasizing the appropriate attribution of extreme events to climate change and infrastructural mitigation may reduce climate change the most.
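
    The coupling idea can be caricatured in a few lines: warming raises the frequency of extremes, perceived risk responds with a lag, and emissions behaviour feeds back on temperature. All functional forms and constants below are invented for illustration and bear no relation to the calibrated C-ROADS coupling:

        # Toy illustration of a behaviour-climate feedback loop. Every constant
        # and functional form here is invented for illustration only.
        temp, emissions, risk = 1.0, 10.0, 0.1   # anomaly (C), GtC/yr, risk in [0, 1]
        climate_sensitivity = 0.02               # C per GtC (placeholder)

        for year in range(2020, 2101):
            event_prob = min(1.0, 0.1 + 0.15 * temp)   # extremes rise with warming
            risk = 0.8 * risk + 0.2 * event_prob       # lagged risk perception
            emissions *= (1.0 - 0.02 * risk)           # behavioural mitigation
            temp += climate_sensitivity * emissions * 0.1
        print(round(temp, 2))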

  7. A Linked Hydro-Economic Model to Examine the Effects of Water Policy on Rural Poverty

    Science.gov (United States)

    Maneta, M. P.; Torres, M.; Vosti, S. A.; Wallender, W. W.; Howitt, R.; Rodrigues, L. N.; Bassoi, L. H.; Pfeiffer, L.; Young, J.

    2006-12-01

    The sustainable intensification of small-scale agriculture is a necessary condition for reducing rural poverty in developing countries. Increasing the amount of irrigated cropland and the economic efficiency of irrigation are two key components of most intensification strategies. Improved access to water generally increases farm income, but richer farmers use a disproportionate share of the available water, reducing poor farmers' chances of meeting their water needs. Furthermore, water and poverty have strong spatial components that have so far been neglected in water planning. In particular, too little is known about the short- and long-term hydrological effects, especially the externality effects, of changes in on-farm water use and their implications for nearby farmers. To address this gap in knowledge, a spatially distributed and transient description of changes in surface and groundwater allocation under different agricultural management scenarios is needed. We propose a hydro-economic model providing a realistic spatio-temporal description of the linkages between the economic and hydrologic subsystems. This hydro-economic model is composed of a basin-level 3D spatially distributed transient hydrologic model (MOD-HMS) and a farm-level, spatially distributed agricultural production model. Both models are explicitly linked through the boundary conditions of the hydrologic model. The linkage accounts for the spatial and temporal impact of different crop mixes, irrigation techniques and groundwater pumpage on water availability at the farm level, in order to assess the effects of policy action on the hydro-economic components of the system.

  8. Dynamically linking economic models to ecological condition for coastal zone management: Application to sustainable tourism planning.

    Science.gov (United States)

    Dvarskas, Anthony

    2017-03-01

    While the development of the tourism industry can bring economic benefits to an area, it is important to consider the long-run impact of the industry on a given location. Particularly when the tourism industry relies upon a certain ecological state, those weighing different development options need to consider the long-run impacts of increased tourist numbers upon measures of ecological condition. This paper presents one approach for linking a model of recreational visitor behavior with an ecological model that estimates the impact of the increased visitors upon the environment. Two simulations were run for the model using initial parameters available from survey data and water quality data for beach locations in Croatia. Results suggest that the resilience of a given tourist location to the changes brought by increasing tourism numbers is important in determining its long-run sustainability. Further work should investigate additional model components, including the tourism industry, refinement of the relationships assumed by the model, and application of the proposed model in additional areas. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background Bayesian Network (BN) modeling is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, it is desirable to integrate multiple sources of prior knowledge and utilize their consensus. Results We introduce a new method to incorporate quantitative information from multiple sources of prior knowledge. It first uses a Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge; in this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In the network simulation, the Markov Chain Monte Carlo sampling algorithm is adopted, and edges are sampled from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including data from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a roughly two-fold increase in the number of known transcription regulations recovered, without a significant change in the false positive rate. In contrast, without the prior knowledge, BN modeling is not always better than random selection, demonstrating the necessity of supplementing gene expression data with additional information in network modeling. Conclusion Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves performance.
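
    The edge reservoir idea can be sketched directly: each candidate edge is replicated in proportion to its prior-knowledge likelihood, so that MCMC proposals preferentially draw well-supported edges. The gene names and scores below are invented:

        # Sketch of the candidate-edge reservoir: copy number per edge is
        # proportional to its prior-knowledge likelihood, so sampling favors
        # well-supported edges. Likelihood scores are invented placeholders.
        import random

        prior_likelihood = {("geneA", "geneB"): 0.9, ("geneA", "geneC"): 0.3,
                            ("geneB", "geneC"): 0.6}
        reservoir = []
        for edge, score in prior_likelihood.items():
            reservoir.extend([edge] * max(1, round(score * 10)))  # copies ~ likelihood

        random.seed(0)
        proposal = random.choice(reservoir)   # edge proposed for the next candidate network
        print(proposal)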

  10. Linking Earth Observations and Models to Societal Information Needs: The Case of Coastal Flooding

    Science.gov (United States)

    Buzzanga, B. A.; Plag, H. P.

    2016-12-01

    Coastal flooding is expected to increase in many areas due to sea level rise (SLR). Many societal applications, such as emergency planning and designing public services, depend on information on how the flooding spectrum may change as a result of SLR. To identify the societal information needs, a conceptual model is needed that identifies the key stakeholders, applications, and information and observation needs. In the context of the development of the Global Earth Observation System of Systems (GEOSS), which is implemented by the Group on Earth Observations (GEO), the Socio-Economic and Environmental Information Needs Knowledge Base (SEE-IN KB) is being developed as part of the GEOSS Knowledge Base. A core function of the SEE-IN KB is to facilitate the linkage of societal information needs to observations, models, information and knowledge. To achieve this, the SEE-IN KB collects information on objects such as user types, observational requirements, societal goals, models, and datasets. Comprehensive information concerning the interconnections between instances of these objects is used to capture the connectivity and to establish a conceptual model as a network of networks. The captured connectivity can be used in searches to allow users to discover products and services for their information needs, and providers to search for users and applications benefiting from their products. It also allows users to answer "What if?" questions and supports knowledge creation. We have used the SEE-IN KB to develop a conceptual model capturing the stakeholders in coastal flooding and their information needs, and to link these elements to objects. We show how the knowledge base enables the transition of scientific data to usable information by connecting individuals such as city managers to flood maps. Within the knowledge base, these same users can request information that improves their ability to make specific planning decisions. These needs are linked to entities within research

  11. A Quantitative, Time-Dependent Model of Oxygen Isotopes in the Solar Nebula: Step one

    Science.gov (United States)

    Nuth, J. A.; Paquette, J. A.; Farquhar, A.; Johnson, N. M.

    2011-01-01

    The remarkable discovery that oxygen isotopes in primitive meteorites were fractionated along a line of slope 1, rather than along the typical slope-0.52 terrestrial fractionation line, occurred almost 40 years ago. However, a satisfactory, quantitative explanation for this observation has yet to be found, though many different explanations have been proposed. The first of these proposed that the observed line represented the final product of mixing molecular cloud dust with a nucleosynthetic component, rich in O-16, possibly resulting from a nearby supernova explosion. Donald Clayton suggested that Galactic Chemical Evolution would gradually change the oxygen isotopic composition of the interstellar grain population by steadily producing O-16 in supernovae, then producing the heavier isotopes as secondary products in lower mass stars. Thiemens and collaborators proposed a chemical mechanism that relied on the availability of additional active rotational and vibrational states in otherwise-symmetric molecules, such as CO2, O3 or SiO2, containing two different oxygen isotopes, and a second, photochemical process in which differential photochemical dissociation could fractionate oxygen. This second line of research has been pursued by several groups, though none of the current models is quantitative.

  12. Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers

    Science.gov (United States)

    Kowalski, Benjamin Andrew

    Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of feature size (0.35 to 500 microns).
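
    The underlying reaction/diffusion competition can be sketched with a simple explicit finite-difference scheme for monomer concentration under a patterned exposure; the rate constants and exposure profile below are invented, not the characterized material's values:

        # Minimal 1-D explicit finite-difference sketch of coupled monomer
        # consumption (photoreaction) and diffusion. Rate constants and the
        # exposure profile are invented placeholders; boundaries are periodic.
        import numpy as np

        nx, dx, dt = 200, 0.1, 0.01           # grid spacing (um), time step (s)
        D, k_rxn = 0.5, 2.0                   # diffusivity (um^2/s), reaction rate (1/s)
        m = np.ones(nx)                       # normalized monomer concentration
        I = np.exp(-((np.arange(nx) - nx / 2) * dx) ** 2 / 4.0)  # Gaussian exposure

        for _ in range(500):
            lap = (np.roll(m, 1) - 2 * m + np.roll(m, -1)) / dx**2
            m += dt * (D * lap - k_rxn * I * m)   # diffusion in, reaction out
        print(m.min(), m.max())   # depleted in the bright fringe, replenished at edges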

  13. Accurate Simulation of 802.11 Indoor Links: A "Bursty" Channel Model Based on Real Measurements

    Directory of Open Access Journals (Sweden)

    Agüero Ramón

    2010-01-01

    Full Text Available We propose a novel channel model to be used for simulating indoor wireless propagation environments. An extensive measurement campaign was carried out to assess the performance of different transport protocols over 802.11 links. This enabled us to better adjust our approach, which is based on an autoregressive filter. One of the main advantages of this proposal lies in its ability to reflect the "bursty" behavior which characterizes indoor wireless scenarios, having a great impact on the behavior of upper layer protocols. We compare this channel model, integrated within the Network Simulator (ns-2) platform, with other traditional approaches, showing that it is able to better reflect the real behavior which was empirically assessed.
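
    The autoregressive idea can be sketched in a few lines: a correlated channel state drives packet errors, so losses cluster in bursts rather than occurring independently. The AR coefficient and fade threshold below are illustrative, not the fitted values from the measurement campaign:

        # Sketch of an autoregressive "bursty" link model: a correlated channel
        # state drives packet errors, so losses cluster in time. Coefficient and
        # threshold are illustrative placeholders.
        import numpy as np

        rng = np.random.default_rng(2)
        rho, n = 0.95, 2000
        state = np.zeros(n)
        for t in range(1, n):
            state[t] = rho * state[t - 1] + rng.normal(scale=np.sqrt(1 - rho**2))

        packet_lost = state < -1.0            # deep-fade threshold -> error burst
        print("loss rate:", packet_lost.mean())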

  14. Linking susceptibility genes and pathogenesis mechanisms using mouse models of systemic lupus erythematosus

    Science.gov (United States)

    Crampton, Steve P.; Morawski, Peter A.; Bolland, Silvia

    2014-01-01

    Systemic lupus erythematosus (SLE) represents a challenging autoimmune disease from a clinical perspective because of its varied forms of presentation. Although broad-spectrum steroids remain the standard treatment for SLE, they have many side effects and only provide temporary relief from the symptoms of the disease. Thus, gaining a deeper understanding of the genetic traits and biological pathways that confer susceptibility to SLE will help in the design of more targeted and effective therapeutics. Both human genome-wide association studies (GWAS) and investigations using a variety of mouse models of SLE have been valuable for the identification of the genes and pathways involved in pathogenesis. In this Review, we link human susceptibility genes for SLE with biological pathways characterized in mouse models of lupus, and discuss how the mechanistic insights gained could advance drug discovery for the disease. PMID:25147296

  15. Linking susceptibility genes and pathogenesis mechanisms using mouse models of systemic lupus erythematosus

    Directory of Open Access Journals (Sweden)

    Steve P. Crampton

    2014-09-01

    Full Text Available Systemic lupus erythematosus (SLE) represents a challenging autoimmune disease from a clinical perspective because of its varied forms of presentation. Although broad-spectrum steroids remain the standard treatment for SLE, they have many side effects and only provide temporary relief from the symptoms of the disease. Thus, gaining a deeper understanding of the genetic traits and biological pathways that confer susceptibility to SLE will help in the design of more targeted and effective therapeutics. Both human genome-wide association studies (GWAS) and investigations using a variety of mouse models of SLE have been valuable for the identification of the genes and pathways involved in pathogenesis. In this Review, we link human susceptibility genes for SLE with biological pathways characterized in mouse models of lupus, and discuss how the mechanistic insights gained could advance drug discovery for the disease.

  16. Channel modelling for free-space optical inter-HAP links using adaptive ARQ transmission

    Science.gov (United States)

    Parthasarathy, S.; Giggenbach, D.; Kirstädter, A.

    2014-10-01

    Free-space optical (FSO) communication systems have seen significant developments in recent years due to the growing need for very high data rates and tap-proof communication. The operation of an FSO link is suited to a diverse variety of applications such as satellites, High Altitude Platforms (HAPs), Unmanned Aerial Vehicles (UAVs), aircraft, ground stations and other areas involving both civil and military situations. FSO communication systems face challenges due to different effects of the atmospheric channel. The FSO channel primarily suffers from scintillation effects due to Index of Refraction Turbulence (IRT). In addition, acquisition and pointing become more difficult because of the high directivity of the transmitted beam: mispointing of the transmitted beam and tracking errors at the receiver generate additional fading of the optical signal. High Altitude Platforms (HAPs) are quasi-stationary vehicles operating in the stratosphere. The slowly varying but precisely determined time-of-flight of the inter-HAP channel adds to its characteristics. To propose a suitable ARQ scheme, a proper theoretical understanding of optical atmospheric propagation and a model of the specific scenario's FSO channel are required. In this paper, a bi-directional symmetrical inter-HAP link has been selected and modeled. The inter-HAP channel model is then investigated via simulations in terms of optical scintillation induced by IRT and in the presence of pointing errors. The performance of the model is then quantified in terms of fading statistics, from which the Packet Error Probability (PEP) is calculated. Based on the PEP characteristics, we propose suitable ARQ schemes.
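
    The step from fading statistics to PEP can be sketched by Monte Carlo; the lognormal scintillation assumption (weak turbulence) and all numbers below are placeholders for illustration:

        # Sketch: estimate packet error probability (PEP) from fading statistics,
        # assuming lognormal scintillation (weak-turbulence regime) and an SNR
        # threshold. All numbers are illustrative placeholders.
        import numpy as np

        rng = np.random.default_rng(3)
        sigma_ln = 0.3                          # log-irradiance standard deviation
        irradiance = rng.lognormal(mean=-sigma_ln**2 / 2, sigma=sigma_ln, size=100_000)

        mean_snr_db = 12.0
        snr_db = mean_snr_db + 10 * np.log10(irradiance)
        pep = np.mean(snr_db < 9.0)             # packet fails below threshold SNR
        print(f"PEP ~ {pep:.3f}")               # feeds the choice of ARQ persistence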

  17. Linking sediment fingerprinting and modeling outputs for a Spanish Pyrenean river catchment.

    Science.gov (United States)

    Palazón, Leticia; Latorre, Borja; Gaspar, Leticia; Blake, Williams H.; Smith, Hugh G.; Navas, Ana

    2015-04-01

    Indirect techniques for studying fine sediment redistribution in river catchments can provide unique and diverse information which, when combined, becomes a powerful tool for addressing catchment management problems. Such combinations can overcome the limitations of individual techniques and provide different lines of information to address a particular problem. The Barasona reservoir has suffered from siltation since its construction, losing over one third of its storage volume in around 30 years (study period 1972-1996). Information on sediment production from tributary catchments of the reservoir is required to develop management plans for maintaining reservoir sustainability. Large spatial variability in sediment delivery was found in previous studies in the Barasona catchment, and the major sediment sources identified included badlands developed in the middle part of the catchment and the agricultural fields in its lower part. Among the diverse range of indirect techniques, sediment source fingerprinting and computer models can be linked to obtain a more holistic view of the processes related to sediment redistribution in the Barasona river catchment (1509 km2, Central Spanish Pyrenees), which comprises agricultural and forest land uses. In the present study, the results from a fingerprinting procedure and the SWAT model were compared and combined to improve knowledge of land use sediment source contributions to the reservoir. Samples from the study catchment were used to define soil parameters for the model and to fingerprint the land use sources. The fingerprinting approach provided information about the relative contributions of land use sources to the superficial sediment samples taken from the reservoir infill. The calibration and validation of the model provided valuable information, for example on the timescale of sediment production from the different land uses within the catchment. Linking results from both techniques enabled us to achieve a

  18. A Model for the Detailed Analysis of Radio Links Involving Tree Canopies

    Directory of Open Access Journals (Sweden)

    F. Perez-Fontan

    2016-12-01

    Full Text Available Detailed analysis of tree canopy interaction with incident radiowaves has mainly been limited to remote sensing for the purpose of forest classification, among many other applications. This represents a monostatic configuration, unlike the case of communication links, which are bistatic. In general, link analyses have been limited to the application of simple, empirical formulas based on the use of specific attenuation values in dB/m and the traversed vegetated mass, e.g., the model in Recommendation ITU-R P.833-8 [1]. In remote sensing, two main techniques are used: Multiple Scattering Theory (MST) [2]-[5] and Radiative Transfer Theory (RT) [5], [6]. We have paid attention in the past to MST [7]-[10]. It was shown that a full application of MST leads to very long computation times, which are unacceptable when we have to analyze a scenario with several trees. Extensive work using MST has also been presented by others in [11]-[16], showing the interest in this technique. We have proposed a simplified model for scattering from tree canopies based on a hybridization of MST and a modified physical optics (PO) approach [16]. We assume that propagation through a canopy is accounted for by using the complex-valued propagation constant obtained by MST. Unlike the case when the full MST is applied, the proposed approach offers significant benefits, including a direct software implementation and acceptable computation times even for high frequencies and electrically large canopies. The proposed model thus replaces the coherent component in MST, significant in the forward direction, but keeps the incoherent or diffuse scattering component present in all directions. The incoherent component can be calculated within reasonable times. Here, we present tests of the proposed model against MST using an artificial single-tree scenario at 2 GHz and 10 GHz.

  19. How predictive quantitative modelling of tissue organisation can inform liver disease pathogenesis.

    Science.gov (United States)

    Drasdo, Dirk; Hoehme, Stefan; Hengstler, Jan G

    2014-10-01

    From the more than 100 liver diseases described, many of those with high incidence rates manifest themselves through histopathological changes, such as hepatitis, alcoholic liver disease, fatty liver disease, fibrosis and, in its later stages, cirrhosis, hepatocellular carcinoma, primary biliary cirrhosis and other disorders. Studies of disease pathogeneses are largely based on integrating -omics data pooled from cells at different locations with spatial information from stained liver structures in animal models. Even though this has led to significant insights, the complexity of interactions as well as the involvement of processes at many different time and length scales constrains the possibility of condensing disease processes in illustrations, schemes and tables. The combination of modern imaging modalities with image processing and analysis, and mathematical models, opens up a promising new approach towards a quantitative understanding of pathologies and of disease processes. This strategy is discussed for two examples: ammonia metabolism after drug-induced acute liver damage, and the recovery of liver mass as well as architecture during the subsequent regeneration process. This interdisciplinary approach permits the integration of biological mechanisms and models of processes contributing to disease progression at various scales into mathematical models. These can be used to perform in silico simulations that help unravel the relation between architecture and function, as illustrated below for liver regeneration, and to bridge from the in vitro situation and animal models to humans. In the near future, novel mechanisms will usually not be directly elucidated by modelling. However, models will falsify hypotheses and guide towards the most informative experimental design. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  20. A quantitative model for estimating mean annual soil loss in cultivated land using 137Cs measurements

    International Nuclear Information System (INIS)

    Yang Hao; Zhao Qiguo; Du Mingyuan; Minami, Katsuyuki; Hatta, Tamao

    2000-01-01

    The radioisotope 137Cs has been widely used to determine rates of cultivated soil loss. Many calibration relationships (including both empirical relationships and theoretical models) have been employed to estimate erosion rates from the amount of 137Cs lost from the cultivated soil profile. However, important limitations restrict the reliability of these models, which consider only a uniform distribution of 137Cs in the plough layer and the plough depth. As a result, erosion rates may be overestimated or underestimated. This article presents a quantitative model for the relation between the amount of 137Cs lost from the cultivated soil profile and the rate of soil erosion. The model is based on a mass balance and considers the following parameters: the remaining fraction of the surface enrichment layer (FR), the thickness of the surface enrichment layer (Hs), the depth of the plough layer (Hp), the input fraction of the total 137Cs fallout deposition during a given year t (Ft), the radioactive decay of 137Cs (k), and the sampling year (t). The simulation results showed that the erosion rates estimated using this model are very sensitive to changes in the values of the parameters FR, Hs, and Hp. We also observed that the relationship between the rate of soil loss and 137Cs depletion is neither linear nor logarithmic, and is very complex. Although the model is an improvement over existing approaches to deriving calibration relationships for cultivated soil, it requires empirical information on local soil properties and the behavior of 137Cs in the soil profile. There is clearly still a need for more precise information on the latter aspect and, in particular, on the retention of 137Cs fallout in the top few millimeters of the soil profile and on the enrichment and depletion effects associated with soil redistribution (i.e. for determining accurate values of FR and Hs). (author)
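
    The flavor of such a mass-balance iteration can be sketched as an annual update of the plough-layer 137Cs inventory; the fallout series, plough depth and bulk density below are invented, and the surface-enrichment terms (FR, Hs) are omitted for brevity:

        # Minimal annual mass-balance iteration for the plough-layer 137Cs
        # inventory under a constant erosion rate, with radioactive decay and
        # fallout input. All parameter values are invented placeholders.
        import numpy as np

        LAMBDA = np.log(2) / 30.17        # 137Cs decay constant (1/yr)
        H_P = 0.20                        # plough depth (m)
        BULK = 1300.0                     # bulk density (kg/m^3)
        erosion = 2.0                     # soil loss (kg/m^2/yr)

        inventory = 0.0                   # areal activity (Bq/m^2)
        for year in range(1954, 2001):
            fallout = 100.0 if 1958 <= year <= 1967 else 10.0   # crude input (Bq/m^2/yr)
            inventory += fallout
            loss_frac = erosion / (H_P * BULK)   # fraction of mixed plough layer removed
            inventory *= (1.0 - loss_frac) * np.exp(-LAMBDA)
        print(round(inventory, 1))   # compare with a reference (uneroded) profile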

  1. Modulation and modeling of monoclonal antibody N-linked glycosylation in mammalian cell perfusion reactors.

    Science.gov (United States)

    Karst, Daniel J; Scibona, Ernesto; Serra, Elisa; Bielser, Jean-Marc; Souquet, Jonathan; Stettler, Matthieu; Broly, Hervé; Soos, Miroslav; Morbidelli, Massimo; Villiger, Thomas K

    2017-09-01

    Mammalian cell perfusion cultures are gaining renewed interest as an alternative to traditional fed-batch processes for the production of therapeutic proteins, such as monoclonal antibodies (mAb). The steady state operation at high viable cell density allows the continuous delivery of antibody product with increased space-time yield and reduced in-process variability of critical product quality attributes (CQA). In particular, the production of a confined mAb N-linked glycosylation pattern has the potential to increase therapeutic efficacy and bioactivity. In this study, we show that accurate control of flow rates, media composition and cell density in a Chinese hamster ovary (CHO) cell perfusion bioreactor allowed the production of a constant glycosylation profile for over 20 days. Steady state was reached after an initial transition phase of 6 days, required for the stabilization of extra- and intracellular processes. The possibility to modulate the glycosylation profile was further investigated in a Design of Experiments (DoE) at different viable cell densities and media supplement concentrations. This strategy was implemented in a sequential screening approach, where various steady states were achieved sequentially during one culture. It was found that, whereas the high ammonia levels reached at high viable cell density (VCD) values inhibited processing to complex glycan structures, supplementation of either galactose or manganese, as well as their synergy, significantly increased the proportion of complex forms. The obtained experimental data set was used to compare the reliability of a statistical response surface model (RSM) with a mechanistic model of N-linked glycosylation. The latter outperformed the response surface predictions in its capability and reliability in predicting the system behavior (i.e., the glycosylation pattern) outside the experimental space covered by the DoE design used for the model parameter estimation. Therefore, we can

  2. Diffusion-weighted MRI and quantitative biophysical modeling of hippocampal neurite loss in chronic stress.

    Directory of Open Access Journals (Sweden)

    Peter Vestergaard-Poulsen

    Full Text Available Chronic stress has detrimental effects on physiology, learning and memory, and is involved in the development of anxiety and depressive disorders. Besides changes in synaptic formation and neurogenesis, chronic stress also induces dendritic remodeling in the hippocampus, amygdala and prefrontal cortex. Investigations of dendritic remodeling during the development and treatment of stress are currently limited by the invasive nature of histological and stereological methods. Here we show that high-field diffusion-weighted MRI, combined with quantitative biophysical modeling of hippocampal dendritic loss in rats subjected to 21 days of restraint stress, correlates highly with previous histological findings. Our study strongly indicates that diffusion-weighted MRI is sensitive to regional dendritic loss and is thus a promising candidate for non-invasive studies of dendritic plasticity in chronic stress and stress-related disorders.

  3. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  4. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DEFF Research Database (Denmark)

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.

    2017-01-01

    analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed.Results: The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes......, it introduces the capability to use C-13 labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale C-13 Metabolic Flux Analysis (2S-C-13 MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable...... insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs.Conclusions: jQMM will facilitate the design...
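
    As a flavor of the kind of computation such a library performs (this is a generic flux balance analysis sketch, not the jQMM API), a toy network can be optimized with a linear program:

        # Minimal flux balance analysis (FBA) sketch: maximize a "biomass" flux
        # subject to steady-state mass balance S @ v = 0. The toy network
        # (2 metabolites, 3 reactions) is invented for illustration.
        import numpy as np
        from scipy.optimize import linprog

        S = np.array([[1, -1,  0],    # metabolite A: produced by R1, consumed by R2
                      [0,  1, -1]])   # metabolite B: produced by R2, consumed by R3
        c = [0, 0, -1]                # maximize v3 (biomass) => minimize -v3
        bounds = [(0, 10), (0, 100), (0, 100)]   # uptake reaction R1 capped at 10

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal biomass flux:", -res.fun)   # -> 10.0, limited by uptake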

  5. Linking Formal and Informal Science Education: A Successful Model using Libraries, Volunteers and NASA Resources

    Science.gov (United States)

    Race, M. S.; Lafayette Library; Learning Center Foundation (Lllcf)

    2011-12-01

    In these times of budget cuts, tight school schedules, and limited opportunities for student field trips and teacher professional development, it is especially difficult to expose elementary and middle school students to the latest STEM information, particularly in the space sciences. Using our library as a facilitator and catalyst, we built a volunteer-based, multi-faceted, curriculum-linked program for students and teachers in local middle schools (Grade 8) and showcased new astronomical and planetary science information using mainly NASA resources and volunteer effort. The project began with the idea of bringing free NASA photo exhibits (FETTU) to the Lafayette and Antioch Libraries for public display. Subsequently, the effort expanded by adding layers of activities that brought space and science information to teachers, students and the public at 5 libraries and schools in the 2 cities, one of which serves a diverse, underserved community. Overall, the effort (supported by a pilot grant from the Bechtel Foundation) included school- and library-based teacher workshops with resource materials; travelling space museum visits with hands-on activities (Chabot-to-Go); separate powerpoint presentations for students and adults at the library; and concurrent ancillary space-related themes for young children's programs at the library. This pilot project, based largely on the use of free government resources and online materials, demonstrated that volunteer-based, standards-linked STEM efforts can enhance curriculum at the middle school, with libraries serving a special role. Using this model, we subsequently also obtained a small NASA-Space Grant award to bring star parties and hands-on science activities to three libraries this Fall, linking with numerous Grade 5 teachers and students in two additional underserved areas of our county. It's not necessary to reinvent the wheel; you just collect the pieces and build on what you already have.

  6. Methods for quantitative measurement of tooth wear using the area and volume of virtual model cusps.

    Science.gov (United States)

    Kim, Soo-Hyun; Park, Young-Seok; Kim, Min-Kyoung; Kim, Sulhee; Lee, Seung-Pyo

    2018-04-01

    Clinicians must examine tooth wear to make a proper diagnosis. However, qualitative methods of measuring tooth wear have many disadvantages. Therefore, this study aimed to develop and evaluate quantitative parameters using the cusp area and volume of virtual dental models. The subjects of this study were the same virtual models that were used in our former study. The same age group classification and new tooth wear index (NTWI) scoring system were also reused. A virtual occlusal plane was generated with the highest cusp points and lowered vertically from 0.2 to 0.8 mm to create offset planes. The area and volume of each cusp was then measured and added together. In addition to the former analysis, the differential features of each cusp were analyzed. The scores of the new parameters differentiated the age and NTWI groups better than those analyzed in the former study. The Spearman ρ coefficients between the total area and the area of each cusp also showed higher scores at the levels of 0.6 mm (0.6A) and 0.8A. The mesiolingual cusp (MLC) showed a statistically significant difference (P < 0.01) from the other cusps in the paired t-test. Additionally, the MLC exhibited the highest percentage of change at 0.6A in some age and NTWI groups. Regarding the age groups, the MLC showed the highest score in groups 1 and 2. For the NTWI groups, the MLC was not significantly different in groups 3 and 4. These results support the proposal that the lingual cusp exhibits rapid wear because it serves as a functional cusp. Although this study has limitations due to its cross-sectional nature, it suggests better quantitative parameters and analytical tools for characterizing cusp wear.

  7. Quantitative assessment of biological impact using transcriptomic data and mechanistic network models

    International Nuclear Information System (INIS)

    Thomson, Ty M.; Sewer, Alain; Martin, Florian; Belcastro, Vincenzo; Frushour, Brian P.; Gebel, Stephan; Park, Jennifer; Schlage, Walter K.; Talikka, Marja; Vasilyev, Dmitry M.; Westra, Jurjen W.; Hoeng, Julia; Peitsch, Manuel C.

    2013-01-01

    Exposure to biologically active substances such as therapeutic drugs or environmental toxicants can impact biological systems at various levels, affecting individual molecules, signaling pathways, and overall cellular processes. The ability to derive mechanistic insights from the resulting system responses requires the integration of experimental measures with a priori knowledge about the system and the interacting molecules therein. We developed a novel systems biology-based methodology that leverages mechanistic network models and transcriptomic data to quantitatively assess the biological impact of exposures to active substances. Hierarchically organized network models were first constructed to provide a coherent framework for investigating the impact of exposures at the molecular, pathway and process levels. We then validated our methodology using novel and previously published experiments. For both in vitro systems with simple exposure and in vivo systems with complex exposures, our methodology was able to recapitulate known biological responses matching expected or measured phenotypes. In addition, the quantitative results were in agreement with experimental endpoint data for many of the mechanistic effects that were assessed, providing further objective confirmation of the approach. We conclude that our methodology evaluates the biological impact of exposures in an objective, systematic, and quantifiable manner, enabling the computation of a systems-wide and pan-mechanistic biological impact measure for a given active substance or mixture. Our results suggest that various fields of human disease research, from drug development to consumer product testing and environmental impact analysis, could benefit from using this methodology. - Highlights: • The impact of biologically active substances is quantified at multiple levels. • The systems-level impact integrates the perturbations of individual networks. • The networks capture the relationships between

  8. Using Coupled Simulation Models to Link Pastoral Decision Making and Ecosystem Services

    Directory of Open Access Journals (Sweden)

    Randall B. Boone

    2011-06-01

    Full Text Available Historically, pastoral people were able to more freely use the services their semi-arid and arid ecosystems provide, and they adapted to changes in ways that improved their well-being. More recently, their ability to adapt has been constrained due to changes from within and from outside their communities. To compare possible responses by pastoral communities, we modeled ecosystem services and tied those services to decisions that people make at the household level. We created an agent-based household model called DECUMA, joined that model with the ecosystem model SAVANNA, and applied the linked models to southeastern Kajiado District, Kenya. The structure of the new agent-based model and linkages between the models are described, and then we demonstrate the model results using a scenario that shows changes in Maasai well-being in response to drought. We then explore two additional but related scenarios, quantifying household well-being if access to a grazing reserve is lost and if access is lost but those most affected are compensated. In the second scenario, households in group ranches abutting the grazing reserve that lost access had large declines in livestock populations, less food energy from animal sources, increased livestock sales and grain purchases, and increased need for supplemental foods. Households in more distant areas showed no changes or had increases in livestock populations because their herds had fewer animals with which to compete for forage. When households neighboring the grazing reserve were compensated for the lease of the lands they had used, they prospered. We describe some benefits and limitations of the agent-based approach.

  9. Underestimation of boreal soil carbon stocks by mathematical soil carbon models linked to soil nutrient status

    Science.gov (United States)

    Ťupek, Boris; Ortiz, Carina A.; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-08-01

    Inaccurate estimates of the largest terrestrial carbon pool, the soil organic carbon (SOC) stock, are the major source of uncertainty in simulating the feedback of climate warming on ecosystem-atmosphere carbon dioxide exchange with process-based ecosystem and soil carbon models. Although the models need to simplify complex environmental processes of soil carbon sequestration, in a large mosaic of environments a missing key driver could lead to a modeling bias in predictions of SOC stock change. We aimed to evaluate the SOC stock estimates of process-based models (Yasso07, Q, and CENTURY soil sub-model v4) against a massive Swedish forest soil inventory data set (3230 samples) organized by a recursive partitioning method into distinct soil groups with underlying SOC stock development linked to physicochemical conditions. For two-thirds of the measurements, all models predicted accurate SOC stock levels regardless of the detail of input data, e.g., whether they ignored or included soil properties. However, in fertile sites with high N deposition, high cation exchange capacity, or moderately increased soil water content, the Yasso07 and Q models underestimated SOC stocks. In comparison to Yasso07 and Q, accounting for site-specific soil characteristics (e.g., clay content and topsoil mineral N) by CENTURY improved SOC stock estimates for sites with high clay content, but not for sites with high N deposition. Our analysis suggested that the soils with poorly predicted SOC stocks, characterized by high nutrient status and well-sorted parent material, have had other predominant drivers of SOC stabilization that are lacking in the models, presumably mycorrhizal organic uptake and organo-mineral stabilization processes. Our results imply that the role of soil nutrient status as a regulator of organic matter mineralization has to be re-evaluated, since correct SOC stocks are decisive for predicting future SOC change and soil CO2 efflux.

  10. Regression models for linking patterns of growth to a later outcome: infant growth and childhood overweight

    Directory of Open Access Journals (Sweden)

    Andrew K. Wills

    2016-04-01

    Full Text Available Abstract Background Regression models are widely used to link serial measures of anthropometric size, or changes in size, to a later outcome. Different parameterisations of these models enable one to target different questions about the effect of growth, but their interpretation can be challenging. Our objective was to formulate and classify several sets of parameterisations by their underlying growth pattern contrast, and to discuss their utility using an expository example. Methods We describe and classify five sets of model parameterisations in accordance with their underlying growth pattern contrast (conditional growth; being bigger vs being smaller; becoming bigger and staying bigger; growing faster vs being bigger; becoming and staying bigger vs being bigger). The contrasts are estimated by including different sets of repeated measures of size and changes in size in a regression model. We illustrate these models in the setting of linking infant growth (measured on 6 occasions: birth, 6 weeks, 3, 6, 12 and 24 months) in weight-for-height-for-age z-scores to later childhood overweight at 8 y, using complete cases from the Norwegian Childhood Growth study (n = 900). Results In our expository example, conditional growth during all periods, becoming bigger in any interval and staying bigger through infancy, and being bigger from birth were all associated with higher odds of later overweight. The highest odds of later overweight occurred for individuals who experienced high conditional growth or became bigger in the 3 to 6 month period and stayed bigger, and for those who were bigger from birth to 24 months. Comparisons between periods and between growth patterns require large sample sizes and need to consider how to scale associations to make comparisons fair; with respect to the latter, we show one approach. Conclusion Studies interested in detrimental growth patterns may gain extra insight from reporting several sets of growth pattern contrasts.
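
    The conditional-growth parameterisation, for instance, regresses size at each age on earlier sizes and uses the standardized residuals as predictors of the later outcome. A minimal two-occasion sketch on simulated data (not the Norwegian Childhood Growth study):

        # Sketch of the "conditional growth" contrast: residuals of size at 12 m
        # regressed on size at birth serve as conditional-growth predictors of a
        # later binary outcome. All data below are simulated placeholders.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 500
        z0 = rng.normal(size=n)                        # weight-for-height z at birth
        z1 = 0.7 * z0 + rng.normal(scale=0.7, size=n)  # z at 12 months
        overweight = rng.binomial(1, p=1 / (1 + np.exp(-0.5 * z1)))  # outcome at 8 y

        resid = sm.OLS(z1, sm.add_constant(z0)).fit().resid
        cond_growth = (resid - resid.mean()) / resid.std()   # conditional growth 0-12 m

        logit = sm.Logit(overweight, sm.add_constant(np.column_stack([z0, cond_growth])))
        print(logit.fit(disp=0).params)   # [intercept, size at birth, conditional growth]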

  11. On the Performance Analysis of Free-Space Optical Links under Generalized Turbulence and Misalignment Models

    KAUST Repository

    AlQuwaiee, Hessa

    2016-11-01

    One of the potential solutions to the radio frequency (RF) spectrum scarcity problem is optical wireless communications (OWC), which utilizes the unlicensed optical spectrum. Long-range outdoor OWC is usually referred to in the literature as free-space optical (FSO) communications. Unlike RF systems, FSO is immune to interference and multi-path fading. Also, the deployment of FSO systems is flexible and much faster than that of optical fibers. These attractive features make FSO applicable for broadband wireless transmission such as optical fiber backup, metropolitan area networks, and last-mile access. Although FSO communication is a promising technology, it is negatively affected by two physical phenomena, namely, scintillation due to atmospheric turbulence and pointing errors. These two critical issues have prompted intensive research in the last decade. To quantify the effect of these two factors on FSO system performance, we need effective mathematical models. In this work, we propose and study a generalized pointing error model based on the Beckmann distribution. We then aim to generalize the FSO channel model to span all turbulence conditions from weak to strong while taking pointing errors into consideration. Since scintillation in FSO is analogous to the fading phenomenon in RF, diversity has also been proposed to overcome the effect of irradiance fluctuations. Thus, several combining techniques for not-necessarily-independent dual-branch free-space optical links were investigated over both weak and strong turbulence channels in the presence of pointing errors. On another front, improving the performance, enhancing the capacity and reducing the delay of the communication link have been the motivation behind newly developed schemes, especially for backhauling. Recently, there has been growing interest in practical systems that integrate RF and FSO technologies to solve the last-mile bottleneck. As such, we also study in this thesis an asymmetric RF-FSO dual-hop relay
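
    The generalized pointing error model can be illustrated by Monte Carlo: per-axis jitter with nonzero means and unequal variances (the Beckmann setting) yields a radial displacement that attenuates the collected power. All numbers below are placeholders:

        # Monte Carlo sketch of a Beckmann-type pointing error model: per-axis
        # jitter with nonzero means and unequal variances gives the radial
        # displacement; a Gaussian-beam factor attenuates the collected power.
        import numpy as np

        rng = np.random.default_rng(5)
        mu_x, mu_y = 0.1, 0.05            # boresight offsets (m), placeholders
        sx, sy = 0.08, 0.12               # unequal jitter standard deviations (m)
        w_eq = 0.25                       # equivalent beam width at receiver (m)
        A0 = 0.9                          # maximum fraction of collected power

        r2 = (rng.normal(mu_x, sx, 100_000) ** 2 + rng.normal(mu_y, sy, 100_000) ** 2)
        h_p = A0 * np.exp(-2 * r2 / w_eq**2)   # pointing-loss factor per sample
        print("mean pointing loss factor:", h_p.mean())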

  12. Validation of Quantitative Structure-Activity Relationship (QSAR) Model for Photosensitizer Activity Prediction

    Directory of Open Access Journals (Sweden)

    Sharifuddin M. Zain

    2011-11-01

    Full Text Available Photodynamic therapy is a relatively new treatment method for cancer which utilizes a combination of oxygen, a photosensitizer and light to generate reactive singlet oxygen that eradicates tumors via direct cell-killing, vasculature damage and engagement of the immune system. Most photosensitizers that are in clinical and pre-clinical assessment, or those already approved for clinical use, are mainly based on cyclic tetrapyrroles. In an attempt to discover new effective photosensitizers, we report the use of the quantitative structure-activity relationship (QSAR) method to develop a model that could correlate the structural features of cyclic tetrapyrrole-based compounds with their photodynamic therapy (PDT) activity. In this study, a set of 36 porphyrin derivatives was used in the model development, where 24 of these compounds were in the training set and the remaining 12 compounds were in the test set. The development of the QSAR model involved the use of the multiple linear regression analysis (MLRA) method. Based on the method, r2, r2(CV) and r2 prediction values of 0.87, 0.71 and 0.70 were obtained. The QSAR model was also employed to predict the experimental compounds in an external test set. This external test set comprises 20 porphyrin-based compounds with experimental IC50 values ranging from 0.39 µM to 7.04 µM. The model showed good correlative and predictive ability, with a predictive correlation coefficient (r2 prediction) for the external test set of 0.52. The developed QSAR model was used to discover some compounds from this external test set as new lead photosensitizers.
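
    The MLRA step can be sketched as follows; the descriptors and activities are random placeholders rather than the porphyrin data, and the r²(CV) here is computed by leave-one-out cross-validation as an assumption about the validation scheme:

        # Sketch of an MLRA-based QSAR fit: multiple linear regression from a few
        # molecular descriptors to activity, with leave-one-out cross-validation.
        # Descriptors and activities are random placeholders, not the study data.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(6)
        X = rng.normal(size=(24, 4))              # 24 training compounds x 4 descriptors
        y = X @ [0.8, -0.5, 0.3, 0.1] + rng.normal(scale=0.3, size=24)

        model = LinearRegression().fit(X, y)
        pred_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
        r2 = model.score(X, y)
        r2_cv = 1 - np.sum((y - pred_loo) ** 2) / np.sum((y - y.mean()) ** 2)
        print(f"r2 = {r2:.2f}, r2(CV) = {r2_cv:.2f}")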

  13. GMM - a general microstructural model for qualitative and quantitative studies of smectite clays

    International Nuclear Information System (INIS)

    Pusch, R.; Karnland, O.; Hoekmark, H.

    1990-12-01

    A few years ago an attempt was made to accommodate a number of basic ideas on the fabric and interparticle forces that are assumed to be valid in montmorillonite clay in an integrated microstructural model, which resulted in the SKB report 'Outlines of models of water and gas flow through smectite clay buffers'. That model gave reasonable agreement between predicted hydraulic conductivity values and actually recorded ones for room temperature and porewater that is poor in electrolytes. The present report describes an improved model that also accounts for effects generated by salt porewater and heating, and that provides a basis both for quantitative determination of transport capacities in a more general way and for analysis and prediction of rheological behaviour in bulk. Investigators in this scientific field understood early on that a full understanding of the physical state of porewater is required in order to develop models of clay particle interaction. In particular, deep insight into the nature of the interlamellar water, the hydration mechanisms leading to an equilibrium state between the two types of water, and the force fields in matured smectite clay requires highly qualified multi-disciplinary research, and the senior author has attempted to initiate and coordinate such work over the last 30 years. Despite this effort a unanimous understanding of these matters has not been reached, but a number of major features have become clearer through the work carried out in the current SKB research programme. Thus, NMR studies and precision measurements of the density of porewater, as well as comprehensive electron microscopy and rheological testing in combination with the application of stochastic mechanics, have led to the hypothetical microstructural model - the GMM - presented in this report. (au)

  14. Observing and modeling links between soil moisture, microbes and CH4 fluxes from forest soils

    Science.gov (United States)

    Christiansen, Jesper; Levy-Booth, David; Barker, Jason; Prescott, Cindy; Grayston, Sue

    2017-04-01

    Soil moisture is a key driver of methane (CH4) fluxes in forest soils, both of the net uptake of atmospheric CH4 and of emission from the soil. Climate and land use change will alter spatial patterns of soil moisture as well as its temporal variability, impacting the net CH4 exchange. The resultant net CH4 exchange, however, is linked to the underlying spatial and temporal distribution of the soil microbial communities involved in CH4 cycling, as well as to the response of the soil microbial community to environmental changes. Significant progress has been made in targeting specific CH4-consuming and CH4-producing soil organisms, which is invaluable for understanding the microbial regulation of the CH4 cycle in forest soils. However, it is not clear to which extent soil moisture shapes the structure, function and abundance of CH4-specific microorganisms and how this is linked to observed net CH4 exchange under contrasting soil moisture regimes. Here we report the results of a research project aiming to understand how the net CH4 exchange is shaped by the interactive effects of soil moisture and the spatial distribution of CH4-consuming (methanotrophs) and CH4-producing (methanogens) microorganisms. We studied the growing-season variations of in situ CH4 fluxes, microbial gene abundances of methanotrophs and methanogens, soil hydrology, and nutrient availability in three typical forest types across a soil moisture gradient in a temperate rainforest on the Canadian Pacific coast. Furthermore, we conducted laboratory experiments to determine whether the net CH4 exchange from hydrologically contrasting forest soils responded differently to changes in soil moisture. Lastly, we modelled the microbial mediation of net CH4 exchange along the soil moisture gradient using structural equation modeling. Our study shows that it is possible to link spatial patterns of in situ net exchange of CH4 to microbial abundance of CH4-consuming and -producing organisms. We also show that the microbial

  15. Modeling and characterization of VCSEL-based avionics full-duplex ethernet (AFDX) gigabit links

    Science.gov (United States)

    Ly, Khadijetou S.; Rissons, A.; Gambardella, E.; Bajon, D.; Mollier, J.-C.

    2008-02-01

    Low cost and the intrinsic performance of 850 nm Vertical Cavity Surface Emitting Lasers (VCSELs) compared to Light Emitting Diodes make them very attractive for high-speed, short-distance data communication links through optical fibers. Weight-saving and Electromagnetic Interference withstanding requirements have led to the need for a reliable solution to improve existing avionics high-speed buses (e.g. AFDX) up to 1 Gbps over 100 m. To predict and optimize the performance of the link, the physical behavior of the VCSEL must be well understood. First, a theoretical study is performed through the rate equations adapted to VCSELs under large-signal modulation. Averaged turn-on delays and oscillation effects are analytically computed and analyzed for different values of the on- and off-state currents. These affect the eye pattern, timing jitter and Bit Error Rate (BER) of the signal, which must remain within IEEE 802.3 standard limits. In particular, the off-state current is minimized below threshold to allow the highest possible extinction ratio. At this level, spontaneous emission dominates and leads to significant turn-on delay, turn-on jitter and bit-pattern effects. Also, the transverse multimode behavior of VCSELs, caused by Spatial Hole Burning, leads to some dispersion in the fiber and degradation of the BER. A VCSEL-to-multimode-fiber coupling model is provided for prediction and optimization of modal dispersion. Lastly, turn-on delay measurements are performed on a real mock-up and the results are compared with calculations.
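
    To make the large-signal rate-equation analysis concrete, here is a Python sketch integrating a dimensionless class-B laser rate-equation pair through a current step from an off-state bias to the on-state, and reading off a turn-on delay; the parameter values are illustrative rather than fitted to an 850 nm VCSEL.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Dimensionless class-B laser rate equations driven by a current step
      # from an off-state bias to the on-state.  Time is in units of the
      # photon lifetime and threshold sits at pump p = 1; parameter values
      # are illustrative, not fitted to a real device.
      gamma = 1e-3          # photon-to-carrier lifetime ratio
      beta = 1e-5           # spontaneous-emission coupling into the mode
      p_off, p_on = 0.9, 2.0

      def rates(t, y):
          n, s = y                           # carrier and photon densities
          p = p_on if t > 0 else p_off       # current step at t = 0
          dn = gamma * (p - n * (1.0 + s))
          ds = (n - 1.0) * s + beta * n      # stimulated + spontaneous terms
          return [dn, ds]

      sol = solve_ivp(rates, (0.0, 20_000.0), [p_off, beta * p_off], max_step=1.0)
      s = sol.y[1]
      turn_on = sol.t[np.argmax(s > 0.5 * s.max())]
      print(f"turn-on delay ~ {turn_on:.0f} photon lifetimes")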

  16. Putting the five-factor model into context: evidence linking big five traits to narrative identity.

    Science.gov (United States)

    Raggatt, Peter

    2006-10-01

    The study examined relationships between the Big Five personality traits and thematic content extracted from self-reports of life history data. One hundred and five "mature age" university students (M=30.1 years) completed the NEO PI-R trait measure, and the Personality Web Protocol. The protocol examines constituents of identity by asking participants to describe 24 key "attachments" from their life histories (significant events, people, places, objects, and possessions). Participants sorted these attachments into clusters and provided a self-descriptive label for each cluster (e.g., "adventurous self"). It was predicted that the thematic content of these cluster labels would be systematically related to Big Five trait scores (e.g., that labels referring to strength or positive emotions would be linked to Extraversion). The hypothesized links were obtained for each of the Big Five trait domains except Conscientiousness. Results are discussed with a view to broadening our understanding of the Five-Factor Model in relation to units of personality other than traits.

  17. Introducing technology learning for energy technologies in a national CGE model through soft links to global and national energy models

    International Nuclear Information System (INIS)

    Martinsen, Thomas

    2011-01-01

    This paper describes a method to model the influence of global policy scenarios, particularly spillover of technology learning, on the energy service demand of the non-energy sectors of the national economy, exemplified by Norway. Spillover is obtained from the technology-rich global Energy Technology Perspectives model operated by the International Energy Agency. It is provided to a national hybrid model, where a national bottom-up Markal model carries the spillover forward into a national top-down CGE model at a disaggregated demand-category level. Spillover of technology learning from the global energy technology market will reduce national generation costs of energy carriers. This may in turn increase demand in the non-energy sectors of the economy because of the rebound effect. The influence of spillover on the Norwegian economy is most pronounced for the production level of industrial chemicals and for the demand for electricity for residential energy services. The influence is modest, however, because all existing electricity generating capacity is hydroelectric and thus compatible with the low-emission policy scenario. In countries where most of the existing generating capacity must be replaced by nascent energy technologies or carbon capture and storage, the influence on demand is expected to be more significant. - Highlights: → Spillover of global technology learning may be forwarded into a macroeconomic model. → The national electricity price differs significantly between the different global scenarios. → Soft-linking global and national models facilitates transparency in the technology-learning effect chain.

  18. Disentangling the Complexity of HGF Signaling by Combining Qualitative and Quantitative Modeling.

    Directory of Open Access Journals (Sweden)

    Lorenza A D'Alessandro

    2015-04-01

    Full Text Available Signaling pathways are characterized by crosstalk, feedback and feedforward mechanisms giving rise to highly complex and cell-context-specific signaling networks. Dissecting the underlying relations is crucial to predict the impact of targeted perturbations. However, a major challenge in identifying cell-context-specific signaling networks is the enormous number of potential interactions. Here, we report a novel hybrid mathematical modeling strategy to systematically unravel hepatocyte growth factor (HGF)-stimulated phosphoinositide-3-kinase (PI3K) and mitogen-activated protein kinase (MAPK) signaling, which critically contribute to liver regeneration. By combining time-resolved quantitative experimental data generated in primary mouse hepatocytes with interaction-graph and ordinary-differential-equation modeling, we identify and experimentally validate a network structure that represents the experimental data best and indicates specific crosstalk mechanisms. Whereas the identified network is robust against single perturbations, combinatorial inhibition strategies are predicted that result in strong reduction of Akt and ERK activation. Thus, by capitalizing on the advantages of the two modeling approaches, we reduce the high combinatorial complexity and identify cell-context-specific signaling networks.
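
    As a toy illustration of the ODE part of such a hybrid strategy, the Python sketch below couples two pathway outputs (stand-ins for Akt and ERK) through invented crosstalk terms and compares single against combinatorial inhibition; the rate constants and network are placeholders, far simpler than the identified data-driven model.

      from scipy.integrate import solve_ivp

      # Toy ODE of two crosstalking pathway outputs under a common HGF
      # stimulus, comparing single and combinatorial inhibition.  All rate
      # constants and crosstalk terms are invented for illustration.
      def model(t, y, hgf, inh_pi3k, inh_mek):
          akt, erk = y
          d_akt = hgf * (1 - inh_pi3k) * (1 + 0.2 * erk) - 0.5 * akt
          d_erk = hgf * (1 - inh_mek) * (1 + 0.2 * akt) - 0.5 * erk
          return [d_akt, d_erk]

      for label, (i1, i2) in [("no inhibitor", (0.0, 0.0)),
                              ("PI3K inhibitor", (0.9, 0.0)),
                              ("MEK inhibitor", (0.0, 0.9)),
                              ("combination", (0.9, 0.9))]:
          sol = solve_ivp(model, (0, 50), [0.0, 0.0], args=(1.0, i1, i2))
          print(f"{label:>14}: Akt = {sol.y[0, -1]:.2f}, ERK = {sol.y[1, -1]:.2f}")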

  19. Observing Clonal Dynamics across Spatiotemporal Axes: A Prelude to Quantitative Fitness Models for Cancer.

    Science.gov (United States)

    McPherson, Andrew W; Chan, Fong Chun; Shah, Sohrab P

    2018-02-01

    The ability to accurately model evolutionary dynamics in cancer would allow for prediction of progression and response to therapy. As a prelude to quantitative understanding of evolutionary dynamics, researchers must gather observations of in vivo tumor evolution. High-throughput genome sequencing now provides the means to profile the mutational content of evolving tumor clones from patient biopsies. Together with the development of models of tumor evolution, reconstructing evolutionary histories of individual tumors generates hypotheses about the dynamics of evolution that produced the observed clones. In this review, we provide a brief overview of the concepts involved in predicting evolutionary histories, and provide a workflow based on bulk and targeted-genome sequencing. We then describe the application of this workflow to time series data obtained for transformed and progressed follicular lymphomas (FL), and contrast the observed evolutionary dynamics between these two subtypes. We next describe results from a spatial sampling study of high-grade serous (HGS) ovarian cancer, propose mechanisms of disease spread based on the observed clonal mixtures, and provide examples of diversification through subclonal acquisition of driver mutations and convergent evolution. Finally, we state implications of the techniques discussed in this review as a necessary but insufficient step on the path to predictive modelling of disease dynamics. Copyright © 2018 Cold Spring Harbor Laboratory Press; all rights reserved.

  20. Current Challenges in the First Principle Quantitative Modelling of the Lower Hybrid Current Drive in Tokamaks

    Science.gov (United States)

    Peysson, Y.; Bonoli, P. T.; Chen, J.; Garofalo, A.; Hillairet, J.; Li, M.; Qian, J.; Shiraiwa, S.; Decker, J.; Ding, B. J.; Ekedahl, A.; Goniche, M.; Zhai, X.

    2017-10-01

    The Lower Hybrid (LH) wave is widely used in existing tokamaks for tailoring the current density profile or extending pulse duration to steady-state regimes. Its high efficiency makes it particularly attractive for a fusion reactor, and it is being considered for this purpose in the ITER tokamak. Nevertheless, while the basics of the LH wave in tokamak plasmas are well known, quantitative modeling of experimental observations based on first principles remains a highly challenging exercise, despite the considerable numerical efforts achieved so far. In this context, a rigorous methodology must be applied in the simulations to identify the minimum number of physical mechanisms that must be considered to reproduce experimental shot-to-shot observations and also scalings (density, power spectrum). Based on recent simulations carried out for the EAST, Alcator C-Mod and Tore Supra tokamaks, the state of the art in LH modeling is reviewed. The capability of fast electron bremsstrahlung, internal inductance li and LH-driven current at zero loop voltage to constrain LH simulations all together is discussed, as well as the need for further improvements (diagnostics, codes, LH model) for robust interpretative and predictive simulations.

  1. A review of the evidence linking adult attachment theory and chronic pain: presenting a conceptual model.

    Science.gov (United States)

    Meredith, Pamela; Ownsworth, Tamara; Strong, Jenny

    2008-03-01

    It is now well established that pain is a multidimensional phenomenon, affected by a gamut of psychosocial and biological variables. According to diathesis-stress models of chronic pain, some individuals are more vulnerable to developing disability following acute pain because they possess particular psychosocial vulnerabilities which interact with physical pathology to impact negatively upon outcome. Attachment theory, a theory of social and personality development, has been proposed as a comprehensive developmental model of pain, implicating individual adult attachment pattern in the ontogenesis and maintenance of chronic pain. The present paper reviews and critically appraises studies which link adult attachment theory with chronic pain. Together, these papers offer support for the role of insecure attachment as a diathesis (or vulnerability) for problematic adjustment to pain. The Attachment-Diathesis Model of Chronic Pain developed from this body of literature, combines adult attachment theory with the diathesis-stress approach to chronic pain. The evidence presented in this review, and the associated model, advances our understanding of the developmental origins of chronic pain conditions, with potential application in guiding early pain intervention and prevention efforts, as well as tailoring interventions to suit specific patient needs.

  2. MARKAL-MACRO: A linked model for energy-economy analysis

    International Nuclear Information System (INIS)

    Manne, A.S.; Wene, C.O.

    1992-02-01

    MARKAL-MACRO is an experiment in model linkage for energy and economy analysis. This new tool is intended as an improvement over existing methods for energy strategy assessment. It is designed specifically for estimating the costs and analyzing the technologies proposed for reducing environmental risks such as global climate change or regional air pollution. The greenhouse gas debate illustrates the usefulness of linked energy-economy models. A central issue is the coupling between economic growth, the level of energy demands, and the development of an energy system to supply these demands. The debate is often connected with alternative modeling approaches, whose competing philosophies may be labeled "top-down macroeconomic" and "bottom-up engineering" perspectives. MARKAL is a systems-engineering (physical process) analysis built on the concept of a Reference Energy System (RES). MARKAL is solved by means of dynamic linear programming. In most applications, the end-use demands are fixed, and an economically efficient solution is obtained by minimizing the present value of the energy system's costs throughout the planning horizon. MACRO is a macroeconomic model with an aggregated view of long-term economic growth. The basic input factors of production are capital, labor and individual forms of energy. MACRO is solved by nonlinear optimization.
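
    A tiny Python sketch of the linear-programming core of the bottom-up step: choose activity levels of two supply technologies to meet a fixed end-use demand at minimum cost. Costs and the capacity limit are invented; MARKAL's actual formulation is dynamic and far larger.

      from scipy.optimize import linprog

      # Minimal cost-minimizing supply choice, the LP kernel of a bottom-up
      # energy system model.  All numbers are invented placeholders.
      cost = [30.0, 55.0]                  # unit cost of technologies A and B
      demand = 100.0
      res = linprog(c=cost,
                    A_ub=[[-1.0, -1.0]], b_ub=[-demand],   # supply >= demand
                    bounds=[(0, 70), (0, None)])           # A capacity-limited
      print("optimal activities:", res.x.round(1), "| total cost:", res.fun)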

  3. Variable speed limit strategies analysis with link transmission model on urban expressway

    Science.gov (United States)

    Li, Shubin; Cao, Danni

    2018-02-01

    The variable speed limit (VSL) is a kind of active traffic management method. Most VSL strategies are applied to expressway traffic flow control in order to ensure traffic safety. The urban expressway system, however, is the main artery of a city, carrying most of the traffic pressure, and it has traffic characteristics similar to those of intercity expressways. In this paper, an improved link transmission model (LTM) combined with VSL strategies is proposed for urban expressway networks. The model can simulate the movement of vehicles and of shock waves, and balances computational cost against accuracy. Furthermore, the optimal VSL strategy can be derived from the simulation, providing management strategies for traffic managers. Finally, a simple example is given to illustrate the model and method, using the average density, average speed and average flow on the network as simulation indexes. The simulation results show that the proposed model and method are feasible, and that the VSL strategy can effectively alleviate traffic congestion in some cases and greatly improve the efficiency of the transportation system.
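
    As a minimal illustration of how a VSL enters such a kinematic-wave-style link model, the Python sketch below caps the free-flow branch of a triangular fundamental diagram with the speed limit and evaluates the resulting sending flow of a link; all parameter values are illustrative.

      # The speed limit caps the free-flow branch of a triangular fundamental
      # diagram, which in turn caps the link's sending flow.  Jam density,
      # backward wave speed and capacity values are illustrative.
      def sending_flow(density, v_limit, k_jam=120.0, w=20.0, q_max=1800.0):
          """Flow (veh/h) a link can discharge under speed limit v_limit (km/h)."""
          return min(v_limit * density,        # free-flow branch, capped by VSL
                     w * (k_jam - density),    # congested (backward-wave) branch
                     q_max)                    # capacity

      for v in (100, 80, 60):
          print(f"VSL {v:3d} km/h -> sending flow {sending_flow(15.0, v):7.1f} veh/h")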

  4. Towards Linking 3D SAR and Lidar Models with a Spatially Explicit Individual Based Forest Model

    Science.gov (United States)

    Osmanoglu, B.; Ranson, J.; Sun, G.; Armstrong, A. H.; Fischer, R.; Huth, A.

    2017-12-01

    In this study, we present a parameterization of the FORMIND individual-based gap model (IBGM) for old-growth Atlantic lowland rainforest in La Selva, Costa Rica, for the purpose of informing multisensor remote sensing techniques for aboveground biomass estimation. The model was successfully parameterized and calibrated for the study site; results show that the simulated forest reproduces the structural complexity of Costa Rican rainforest based on comparisons with CARBONO inventory plot data. Though the simulated stem numbers (378) slightly underestimated the plot data (418), particularly for canopy-dominant intermediate shade-tolerant trees and shade-tolerant understory trees, overall there was a 9.7% difference. Aboveground biomass (kg/ha) showed a 0.1% difference between the simulated forest and the inventory plot dataset. The Costa Rica FORMIND simulation was then used to parameterize spatially explicit (3D) SAR and lidar backscatter models. The simulated forest stands were used to generate a Look Up Table (LUT) as a tractable means to estimate aboveground forest biomass for these complex forests. Various combinations of lidar and radar variables were evaluated in the LUT inversion. To test the capability of future data for estimation of forest height and biomass, we considered 1) L- (or P-) band polarimetric data (backscattering coefficients of HH, HV and VV); 2) L-band dual-pol repeat-pass InSAR data (HH/HV backscattering coefficients and coherences, height of the scattering phase center at HH and HV using a DEM or surface height from lidar data as reference); 3) P-band polarimetric InSAR data (canopy height from inversion of PolInSAR data, or the coherences and height of the scattering phase center at HH, HV and VV); 4) various height indices from waveform lidar data; and 5) surface and canopy-top height from photon-counting lidar data. The methods for parameterizing the remote sensing models with the IBGM and developing Look Up Tables will be discussed.
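
    A minimal Python sketch of the LUT-inversion step described above: simulated stands provide pairs of remote sensing observables and biomass, and a new observation is inverted by averaging the biomass of its nearest LUT entries. The observables and the linear biomass relation are synthetic stand-ins for the FORMIND/SAR/lidar outputs.

      import numpy as np

      # Look-up-table inversion in miniature: nearest-neighbour average of
      # biomass over the k closest simulated entries.  All data are mock.
      rng = np.random.default_rng(2)
      lut_obs = rng.normal(size=(1000, 3))      # e.g. HH, HV backscatter, RH100
      lut_agb = 120 + 25 * (lut_obs @ np.array([1.0, 0.5, 1.5])) \
                + rng.normal(0, 5, 1000)        # mock aboveground biomass (Mg/ha)

      def invert(obs, k=10):
          d = np.linalg.norm(lut_obs - obs, axis=1)     # distance to LUT entries
          return lut_agb[np.argsort(d)[:k]].mean()

      print("estimated biomass:", round(invert(np.array([0.2, -0.1, 0.5])), 1), "Mg/ha")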

  5. Quantitative structure-activity relationship modeling of the toxicity of organothiophosphate pesticides to Daphnia magna and Cyprinus carpio

    NARCIS (Netherlands)

    Zvinavashe, E.; Du, T.; Griff, T.; Berg, van den J.H.J.; Soffers, A.E.M.F.; Vervoort, J.J.M.; Murk, A.J.; Rietjens, I.

    2009-01-01

    Within the REACH regulatory framework in the EU, quantitative structure-activity relationship (QSAR) models are expected to help reduce the number of animals used for experimental testing. The objective of this study was to develop QSAR models to describe the acute toxicity of organothiophosphate pesticides to Daphnia magna and Cyprinus carpio.

  6. Pathophysiology of white-nose syndrome in bats: a mechanistic model linking wing damage to mortality.

    Science.gov (United States)

    Warnecke, Lisa; Turner, James M; Bollinger, Trent K; Misra, Vikram; Cryan, Paul M; Blehert, David S; Wibbelt, Gudrun; Willis, Craig K R

    2013-08-23

    White-nose syndrome is devastating North American bat populations but we lack basic information on disease mechanisms. Altered blood physiology owing to epidermal invasion by the fungal pathogen Geomyces destructans (Gd) has been hypothesized as a cause of disrupted torpor patterns of affected hibernating bats, leading to mortality. Here, we present data on blood electrolyte concentration, haematology and acid-base balance of hibernating little brown bats, Myotis lucifugus, following experimental inoculation with Gd. Compared with controls, infected bats showed electrolyte depletion (i.e. lower plasma sodium), changes in haematology (i.e. increased haematocrit and decreased glucose) and disrupted acid-base balance (i.e. lower CO2 partial pressure and bicarbonate). These findings indicate hypotonic dehydration, hypovolaemia and metabolic acidosis. We propose a mechanistic model linking tissue damage to altered homeostasis and morbidity/mortality.

  7. Linked cluster expansion in the SU(2) lattice Higgs model at strong gauge coupling

    International Nuclear Information System (INIS)

    Wagner, C.E.M.

    1989-01-01

    A linked cluster expansion is developed for the β=0 limit of the SU(2) Higgs model. This method, when combined with strong gauge coupling expansions, is used to obtain the phase transition surface and the behaviour of scalar and vector masses in the lattice regularized theory. The method, in spite of the low order of truncation of the series applied, gives a reasonable agreement with Monte Carlo data for the phase transition surface and a qualitatively good picture of the behaviour of Higgs, glueball and gauge vector boson masses, in the strong coupling limit. Some limitations of the method are discussed, and an intuitive picture of the different behaviour for small and large bare self-coupling λ is given. (orig.)

  8. Energy-Aware Topology Evolution Model with Link and Node Deletion in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiaojuan Luo

    2012-01-01

    Full Text Available Based on complex network theory, a new topological evolving model is proposed. In the evolution of the topology of sensor networks, an energy-aware mechanism is taken into account, and the deletion of links and nodes in the network is discussed. Theoretical analysis and numerical simulation are conducted to explore the topology characteristics and network performance under different node energy distributions. We find that the node energy distribution has a weak effect on the degree distribution P(k), which evolves into a scale-free state; that nodes with more energy carry more connections; and that the degree correlation is nontrivially disassortative. Moreover, the results show that when node energy is more heterogeneous, the network is better clustered and enjoys higher performance in terms of network efficiency and average path length for transmitting data.
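
    A minimal Python sketch of an energy-aware growth step consistent with the findings above: new nodes attach preferentially to nodes with a high degree-energy product, so higher-energy nodes accumulate more connections. The link/node deletion mechanisms of the full model are omitted and all parameters are illustrative.

      import numpy as np

      # Energy-aware preferential attachment: attachment probability is
      # proportional to degree * residual energy.  Deletion steps omitted.
      rng = np.random.default_rng(3)
      m = 2                                     # links added per new node
      degree = [1, 1]                           # seed: one connected pair
      energy = list(rng.uniform(0.5, 1.5, 2))   # heterogeneous node energy

      for _ in range(500):
          w = np.array(degree) * np.array(energy)
          targets = rng.choice(len(degree), size=m, replace=False, p=w / w.sum())
          for t in targets:
              degree[t] += 1
          degree.append(m)
          energy.append(rng.uniform(0.5, 1.5))

      print("max degree:", max(degree), "| mean degree:", round(float(np.mean(degree)), 2))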

  9. Coupled dynamics of node and link states in complex networks: a model for language competition

    International Nuclear Information System (INIS)

    Carro, Adrián; Toral, Raúl; Miguel, Maxi San

    2016-01-01

    Inspired by language competition processes, we present a model of coupled evolution of node and link states. In particular, we focus on the interplay between the use of a language and the preference or attitude of the speakers towards it, which we model, respectively, as a property of the interactions between speakers (a link state) and as a property of the speakers themselves (a node state). Furthermore, we restrict our attention to the case of two socially equivalent languages and to socially inspired network topologies based on a mechanism of triadic closure. As opposed to most of the previous literature, where language extinction is an inevitable outcome of the dynamics, we find a broad range of possible asymptotic configurations, which we classify as: frozen extinction states, frozen coexistence states, and dynamically trapped coexistence states. Moreover, metastable coexistence states with very long survival times and displaying a non-trivial dynamics are found to be abundant. Interestingly, a system size scaling analysis shows, on the one hand, that the probability of language extinction vanishes exponentially for increasing system sizes and, on the other hand, that the time scale of survival of the non-trivial dynamical metastable states increases linearly with the size of the system. Thus, non-trivial dynamical coexistence is the only possible outcome for large enough systems. Finally, we show how this coexistence is characterized by one of the languages becoming clearly predominant while the other one becomes increasingly confined to ‘ghetto-like’ structures: small groups of bilingual speakers arranged in triangles, with a strong preference for the minority language, and using it for their intra-group interactions while they switch to the predominant language for communications with the rest of the population. (paper)

  10. Linking landscape characteristics to local grizzly bear abundance using multiple detection methods in a hierarchical model

    Science.gov (United States)

    Graves, T.A.; Kendall, Katherine C.; Royle, J. Andrew; Stetz, J.B.; Macleod, A.C.

    2011-01-01

    Few studies link habitat to grizzly bear (Ursus arctos) abundance, and those that do have not accounted for variation in detection or for spatial autocorrelation. We collected and genotyped bear hair in and around Glacier National Park in northwestern Montana during the summer of 2000. We developed a hierarchical Markov chain Monte Carlo model that extends existing occupancy and count models by accounting for (1) spatially explicit variables that we hypothesized might influence abundance; (2) separate sub-models of detection probability for two distinct sampling methods (hair traps and rub trees) targeting different segments of the population; (3) covariates to explain variation in each sub-model of detection; (4) a conditional autoregressive term to account for spatial autocorrelation; and (5) weights to identify the most important variables. Road density and per cent mesic habitat best explained variation in female grizzly bear abundance; spatial autocorrelation was not supported. More female bears were predicted in places with lower road density and more mesic habitat. Detection rates of females increased with rub tree sampling effort. Road density best explained variation in male grizzly bear abundance, and spatial autocorrelation was supported. More male bears were predicted in areas of low road density. Detection rates of males increased with rub tree and hair trap sampling effort and decreased over the sampling period. We provide a new method to (1) incorporate multiple detection methods into hierarchical models of abundance and (2) determine whether spatial autocorrelation should be included in final models. Our results suggest that the influence of landscape variables is consistent between habitat selection and abundance in this system.

  11. Linking state-and-transition simulation and timber supply models for forest biomass production scenarios

    Science.gov (United States)

    Costanza, Jennifer; Abt, Robert C.; McKerrow, Alexa; Collazo, Jaime

    2015-01-01

    We linked state-and-transition simulation models (STSMs) with an economics-based timber supply model to examine landscape dynamics in North Carolina through 2050 for three scenarios of forest biomass production. Forest biomass could be an important source of renewable energy in the future, but there is currently much uncertainty about how biomass production would impact landscapes. In the southeastern US, if forests become important sources of biomass for bioenergy, we expect increased land-use change and forest management. STSMs are ideal for simulating these landscape changes, but the amounts of change will depend on drivers such as timber prices and demand for forest land, which are best captured with forest economic models. We first developed state-and-transition model pathways in the ST-Sim software platform for 49 vegetation and land-use types that incorporated each expected type of landscape change. Next, for the three biomass production scenarios, the SubRegional Timber Supply Model (SRTS) was used to determine the annual areas of thinning and harvest in five broad forest types, as well as annual areas converted among those forest types, agricultural, and urban lands. The SRTS output was used to define area targets for STSMs in ST-Sim under two scenarios of biomass production and one baseline, business-as-usual scenario. We show that ST-Sim output matched SRTS targets in most cases. Landscape dynamics results indicate that, compared with the baseline scenario, forest biomass production leads to more forest and, specifically, more intensively managed forest on the landscape by 2050. Thus, the STSMs, informed by forest economics models, provide important information about potential landscape effects of bioenergy production.

  12. The Arctic Marine Pulses Model: Linking Contiguous Domains in the Pacific Arctic Region

    Science.gov (United States)

    Moore, S. E.; Stabeno, P. J.

    2016-02-01

    The Pacific Arctic marine ecosystem extends from the northern Bering Sea, across the Chukchi and into the East Siberian and Beaufort seas. Food webs in this domain are short, a simplicity that belies the biophysical complexity underlying trophic linkages from primary production to humans. Existing biophysical models, such as pelagic-benthic coupling and advective processes, provide frameworks for connecting certain aspects of the marine food web, but do not offer a full accounting of events that occur seasonally across the Pacific Arctic. In the course of the Synthesis of Arctic Research (SOAR) project, a holistic Arctic Marine Pulses (AMP) model was developed that depicts seasonal biophysical 'pulses' across a latitudinal gradient and links four previously described contiguous domains: (i) the Pacific-Arctic domain (the focal region); (ii) the seasonal ice zone domain; (iii) the Pacific marginal domain; and (iv) the riverine coastal domain. The AMP model provides a spatial-temporal framework to guide research on dynamic ecosystem processes during this period of rapid biophysical change in the Pacific Arctic. Some of the processes included in the model, such as pelagic-benthic coupling in the northern Bering and Chukchi seas, and advection and upwelling along the Beaufort shelf, are already the focus of sampling via the Distributed Biological Observatory (DBO) and other research programs. Other aspects, such as biological processes associated with the seasonal ice zone and trophic responses to riverine outflow, have received less attention. The AMP model could be enhanced by the application of visualization tools to provide a means to watch a season unfold in space and time. The capability to track sea ice dynamics and water masses and to move nutrients, prey and upper-trophic predators in space and time would provide a strong foundation for the development of predictive human-inclusive ecosystem models for the Pacific Arctic.

  13. Experimental and model based investigation of the links between snow bidirectional reflectance and snow microstructure

    Science.gov (United States)

    Dumont, M.; Flin, F.; Malinka, A.; Brissaud, O.; Hagenmuller, P.; Dufour, A.; Lapalus, P.; Lesaffre, B.; Calonne, N.; Rolland du Roscoat, S.; Ando, E.

    2017-12-01

    Snow optical properties are unique among Earth surfaces and crucial for a wide range of applications. The bi-directional reflectance, hereafter BRDF, of snow is sensitive to snow microstructure. However, the complex interplay between the different parameters of snow microstructure, namely size parameters and shape parameters, and reflectance is challenging to disentangle both theoretically and experimentally. An accurate understanding and modelling of snow BRDF is required to correctly process satellite data. BRDF measurements might also provide a means of characterizing snow morphology. This study presents one of the very few datasets that combine bi-directional reflectance measurements over 500-2500 nm with X-ray tomography of the snow microstructure, for three different snow samples and two snow types. The dataset is used to evaluate the approach of Malinka (2014), which relates snow optical properties to the chord length distribution of the snow microstructure. For low and medium absorption, the model accurately reproduces the measurements but tends to slightly overestimate the anisotropy of the reflectance. The model indicates that the deviation of the ice chord length distribution from an exponential distribution, which can be understood as a characterization of snow type, does not impact the reflectance at such absorptions. The simulations are also affected by uncertainties in the ice refractive index values. At high absorption and high viewing/incident zenith angles, the simulations and the measurements disagree, indicating that some of the assumptions made in the model are no longer met. The study also indicates that crystal habits might play a significant role in the reflectance under such geometries and wavelengths. However, a quantitative relationship between crystal habits and reflectance, along with potential optical methodologies to classify snow morphology, would require an extended dataset covering more snow types. This extended dataset can likely be obtained

  14. Stability in a fiber bundle model: Existence of strong links and the effect of disorder

    Science.gov (United States)

    Roy, Subhadeep

    2018-05-01

    The present paper deals with a fiber bundle model that contains a fraction α of infinitely strong fibers. The inclusion of such an unbreakable fraction has been shown in earlier studies to affect the failure process, especially around a critical value αc. The present work has a twofold purpose: (i) a study of failure abruptness, mainly the brittle to quasi-brittle transition point, with varying α, and (ii) the variation of αc as we change the strength of the disorder introduced in the model. The brittle to quasi-brittle transition is confirmed from the failure abruptness. The value of αc, on the other hand, is obtained from the failure abruptness as well as from the statistics of avalanches. It is observed that the brittle to quasi-brittle transition point scales to lower values, suggesting more quasi-brittle-like continuous failure, as α is increased. At the same time, the bundle becomes stronger, as there are larger numbers of strong links to support the external stress. A high α in a highly disordered bundle leads to an ideal situation where the bundle strength, as well as the predictability of the failure process, is very high. Also, the critical fraction αc required to make the model deviate from the conventional results increases with decreasing strength of disorder. The analytical expression for αc shows good agreement with the numerical results. Finally, the findings of the paper are compared with previous results and with real-life applications of composite materials.
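
    The Python sketch below implements an equal-load-sharing bundle with a fraction α of unbreakable fibers and uniformly distributed thresholds (an assumed disorder window), iterating the load redistribution to a fixed point; it reproduces the qualitative picture of the strong fraction stabilizing the bundle beyond the critical load.

      import numpy as np

      # Equal-load-sharing fiber bundle with an unbreakable fraction alpha.
      # Thresholds are uniform in [0.5, 1.5] (illustrative disorder);
      # unbreakable fibers get an infinite threshold.
      rng = np.random.default_rng(7)
      N, alpha = 10_000, 0.05
      thresholds = rng.uniform(0.5, 1.5, N)
      thresholds[rng.random(N) < alpha] = np.inf    # infinitely strong fraction

      def surviving_fraction(sigma):
          alive = 1.0
          while True:                 # load per fiber grows as fibers break
              new_alive = np.mean(thresholds > sigma / alive)
              if new_alive == alive:  # survivor set stopped changing
                  return alive
              alive = new_alive

      for sigma in (0.40, 0.55, 0.70):
          print(f"load {sigma:.2f}: surviving fraction = {surviving_fraction(sigma):.3f}")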

  15. Establishment of quantitative severity evaluation model for spinal cord injury by metabolomic fingerprinting.

    Directory of Open Access Journals (Sweden)

    Jin Peng

    Full Text Available Spinal cord injury (SCI) is a devastating event with limited hope for recovery and represents an enormous public health issue. It is crucial to understand the disturbances in the metabolic network after SCI to identify injury mechanisms and opportunities for treatment intervention. Through plasma 1H-nuclear magnetic resonance (NMR) screening, we identified 15 metabolites that made up an "Eigen-metabolome" capable of distinguishing rats with severe SCI from healthy control rats. Forty enzymes regulated these 15 metabolites in the metabolic network. We also found that 16 metabolites regulated by 130 enzymes in the metabolic network impacted neurobehavioral recovery. Using the Eigen-metabolome, we established a linear discrimination model to cluster rats with severe and mild SCI and control rats into separate groups and to identify the interactive relationships between metabolic biomarkers in the global metabolic network. We identified 10 clusters in the global metabolic network and defined them as distinct metabolic disturbance domains of SCI. These included metabolic pathways such as retinal, glycerophospholipid, and arachidonic acid metabolism; the NAD-NADPH conversion process; tyrosine metabolism; and cadaverine and putrescine metabolism. In summary, we present a novel interdisciplinary method that integrates metabolomics and global metabolic network analysis to visualize metabolic network disturbances after SCI. Our study demonstrates that this systems-biology paradigm, integrating 1H-NMR metabolomics and global metabolic network analysis, is useful for visualizing complex metabolic disturbances after severe SCI. Furthermore, our findings may provide a new quantitative injury-severity evaluation model for clinical use.
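
    A minimal Python sketch of the linear-discrimination step on synthetic stand-in data (15 "metabolites", three groups), not the study's NMR measurements; it shows how an Eigen-metabolome-style feature set separates control, mild and severe groups.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Linear discriminant analysis on mock metabolite intensities: three
      # groups of 20 subjects, each with 15 features.  Shifts are invented.
      rng = np.random.default_rng(42)
      groups = {"control": 0.0, "mild SCI": 0.8, "severe SCI": 1.6}
      X = np.vstack([rng.normal(shift, 1.0, size=(20, 15))
                     for shift in groups.values()])
      y = np.repeat(list(groups), 20)

      lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
      print("training accuracy:", round(lda.score(X, y), 2))
      for g in groups:                      # group means in discriminant space
          print(f"  {g:>10}:", lda.transform(X[y == g]).mean(axis=0).round(2))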

  16. Currency risk and prices of oil and petroleum products: a simulation with a quantitative model

    International Nuclear Information System (INIS)

    Aniasi, L.; Ottavi, D.; Rubino, E.; Saracino, A.

    1992-01-01

    This paper analyzes the relationship between the exchange rates of the US Dollar against the four major European currencies and the prices of oil and its main products in those countries. The sensitivity of prices to exchange rate movements is of fundamental importance for the refining and distribution industries of importing countries. The result of the analysis shows that neither under free market conditions, such as those in Great Britain, France and Germany, nor in regulated markets, i.e. the Italian one, do variations in petroleum product prices fully absorb variations in the exchange rates. In order to assess the above relationship, we first tested the order of co-integration of the time series of exchange rates of EMS currencies with those of international prices of oil and its derivative products; then we used a transfer-function model to reproduce the quantitative relationships between those variables. Using these results, we then reproduced domestic price functions with partial adjustment mechanisms. Finally, we used the above model to run a simulation of the deviation from the steady-state pattern caused by exogenous exchange-rate shocks. 21 refs., 5 figs., 3 tabs

  17. Quantitative modelling of amyloidogenic processing and its influence by SORLA in Alzheimer's disease.

    Science.gov (United States)

    Schmidt, Vanessa; Baum, Katharina; Lao, Angelyn; Rateitschak, Katja; Schmitz, Yvonne; Teichmann, Anke; Wiesner, Burkhard; Petersen, Claus Munck; Nykjaer, Anders; Wolf, Jana; Wolkenhauer, Olaf; Willnow, Thomas E

    2012-01-04

    The extent of proteolytic processing of the amyloid precursor protein (APP) into neurotoxic amyloid-β (Aβ) peptides is central to the pathology of Alzheimer's disease (AD). Accordingly, modifiers that increase Aβ production rates are risk factors in the sporadic form of AD. In a novel systems-biology approach, we combined quantitative biochemical studies with mathematical modelling to establish a kinetic model of amyloidogenic processing, and to evaluate the influence of SORLA/SORL1, an inhibitor of APP processing and an important genetic risk factor. Contrary to previous hypotheses, our studies demonstrate that secretases represent allosteric enzymes that require cooperativity by APP oligomerization for efficient processing. Cooperativity enables swift adaptive changes in secretase activity with even small alterations in APP concentration. We also show that SORLA prevents APP oligomerization both in cultured cells and in the brain in vivo, eliminating the preferred form of the substrate and causing secretases to switch to a less efficient non-allosteric mode of action. These data represent the first mathematical description of the contribution of genetic risk factors to AD, substantiating the relevance of subtle changes in SORLA levels for amyloidogenic processing, as proposed for patients carrying SORL1 risk alleles.
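
    To illustrate the role of cooperativity, the Python sketch below contrasts a Hill-type (allosteric) rate with a Michaelis-Menten (non-allosteric) rate; a Hill coefficient n > 1 stands in for oligomerization-enabled cooperativity, and dropping to n = 1 mimics the modelled SORLA effect. All constants are illustrative.

      # Allosteric (Hill, n > 1) vs non-allosteric (n = 1) processing rates
      # as a function of substrate concentration; constants are illustrative.
      def rate(app, vmax=1.0, km=1.0, n=1):
          return vmax * app**n / (km**n + app**n)

      for app in (0.5, 1.0, 2.0):
          print(f"[APP] = {app}: allosteric {rate(app, n=4):.2f}"
                f" vs non-allosteric {rate(app, n=1):.2f}")

    The steeper Hill response around the half-saturation point is what allows small changes in APP concentration to produce large changes in processing rate.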

  18. Automatic and quantitative measurement of collagen gel contraction using model-guided segmentation

    Science.gov (United States)

    Chen, Hsin-Chen; Yang, Tai-Hua; Thoreson, Andrew R.; Zhao, Chunfeng; Amadio, Peter C.; Sun, Yung-Nien; Su, Fong-Chin; An, Kai-Nan

    2013-08-01

    Quantitative measurement of collagen gel contraction plays a critical role in the field of tissue engineering because it provides spatial-temporal assessment (e.g., changes of gel area and diameter during the contraction process) reflecting cell behavior and tissue material properties. So far, the assessment of collagen gels has relied on manual segmentation, which is time-consuming and suffers from serious intra- and inter-observer variability. In this study, we propose an automatic method combining various image-processing techniques to resolve these problems. The proposed method first detects the maximal feasible contraction range from circular references (e.g., the culture dish) and avoids interference from irrelevant objects in the given image. Then, a three-step color conversion strategy is applied to normalize and enhance the contrast between the gel and the background. We subsequently introduce a deformable circular model that utilizes regional intensity contrast and a circular shape constraint to locate the gel boundary. An adaptive weighting scheme coordinates the model behavior, so that the proposed system can overcome variations in gel boundary appearance at different contraction stages. Two measurements of collagen gels (i.e., area and diameter) can readily be obtained from the segmentation results. Experimental results, including 120 gel images used for accuracy validation, showed high agreement between the proposed method and manual segmentation, with an average Dice similarity coefficient larger than 0.95. The results also demonstrated an obvious improvement in gel contours obtained by the proposed method over two popular generic segmentation methods.
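
    As a simplified stand-in for the deformable circular model (not the authors' algorithm), the Python sketch below fits a circle to a synthetic gel image with a generic Hough transform and reports the diameter, the kind of measurement extracted at each contraction stage.

      import numpy as np
      from skimage.draw import disk
      from skimage.feature import canny
      from skimage.transform import hough_circle, hough_circle_peaks

      # Generic Hough-circle fit to a synthetic "contracted gel" disk; a
      # placeholder for the paper's adaptive deformable circular model.
      img = np.zeros((200, 200))
      rr, cc = disk((100, 95), 60, shape=img.shape)   # synthetic gel region
      img[rr, cc] = 1.0

      edges = canny(img, sigma=2.0)
      radii = np.arange(40, 80)
      accum = hough_circle(edges, radii)
      _, cx, cy, rad = hough_circle_peaks(accum, radii, total_num_peaks=1)
      print(f"centre ({cx[0]}, {cy[0]}), estimated diameter {2 * rad[0]} px")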

  19. Generating quantitative models describing the sequence specificity of biological processes with the stabilized matrix method

    Directory of Open Access Journals (Sweden)

    Sette Alessandro

    2005-05-01

    Full Text Available Background: Many processes in molecular biology involve the recognition of short sequences of nucleic or amino acids, such as the binding of immunogenic peptides to major histocompatibility complex (MHC) molecules. From experimental data, a model of the sequence specificity of these processes can be constructed, such as a sequence motif, a scoring matrix or an artificial neural network. The purpose of these models is two-fold. First, they can provide a summary of experimental results, allowing for a deeper understanding of the mechanisms involved in sequence recognition. Second, such models can be used to predict the experimental outcome for yet untested sequences. In the past we reported the development of a method to generate such models called the Stabilized Matrix Method (SMM). This method has been successfully applied to predicting peptide binding to MHC molecules, peptide transport by the transporter associated with antigen presentation (TAP) and proteasomal cleavage of protein sequences. Results: Herein we report the implementation of the SMM algorithm as a publicly available software package. Specific features determining the types of problems the method is most appropriate for are discussed. Advantageous features of the package are: (1) the output generated is easy to interpret, (2) input and output are both quantitative, (3) specific computational strategies to handle experimental noise are built in, (4) the algorithm is designed to effectively handle bounded experimental data, (5) experimental data from randomized peptide libraries and conventional peptides can easily be combined, and (6) it is possible to incorporate pair interactions between positions of a sequence. Conclusion: Making the SMM method publicly available enables bioinformaticians and experimental biologists to easily access it, to compare its performance to other prediction methods, and to extend it to other applications.
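
    A minimal Python sketch of the core SMM idea: encode peptides position-wise, then fit a position-specific scoring matrix by ridge-regularized least squares, where the regularizer plays the stabilizing, noise-suppressing role described above. Alphabet, peptides and measurements are toy data.

      import numpy as np

      # One-hot encode each peptide position and fit a scoring matrix by
      # ridge-regularized least squares (the stabilization in "SMM").
      alphabet = "ACDE"
      peptides = ["ACDA", "CDEA", "AADE", "ECDA", "ADCE", "CCDA"]
      y = np.array([1.2, 0.4, 0.9, 0.3, 1.1, 0.6])   # mock measured values

      def one_hot(seq):
          v = np.zeros(len(seq) * len(alphabet))
          for pos, aa in enumerate(seq):
              v[pos * len(alphabet) + alphabet.index(aa)] = 1.0
          return v

      X = np.array([one_hot(p) for p in peptides])
      lam = 0.5                                       # regularization strength
      w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

      print("scoring matrix (positions x letters):")
      print(w.reshape(4, len(alphabet)).round(2))
      print("fitted:", (X @ w).round(2), "observed:", y)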

  1. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of an intermediate event may carry large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on Southeast Michigan work zone crash data is carried out. The numerical results show a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed down by 20%, compared with a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT for casualty risk mitigation. 2010 Elsevier Ltd. All rights reserved.
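
    A compact Python sketch of the probabilistic event-tree calculation: branch probabilities are treated as random variables (here beta-distributed) and propagated by Monte Carlo to an individual fatality risk with uncertainty bounds. The tree is collapsed to two branches and every number is an invented placeholder, not a Michigan estimate.

      import numpy as np

      # Monte Carlo propagation of uncertain event-tree branch probabilities
      # to an annual individual fatality risk.  All numbers are placeholders.
      rng = np.random.default_rng(0)
      n_sim = 100_000
      crash_freq = 12.0                       # assumed work zone crashes per year

      p_casualty = rng.beta(30, 70, n_sim)    # P(casualty crash | crash)
      p_fatal    = rng.beta(2, 98, n_sim)     # P(fatality | casualty crash)

      risk = crash_freq * p_casualty * p_fatal     # expected fatalities per year
      lo, hi = np.percentile(risk, [2.5, 97.5])
      print(f"individual fatality risk: {risk.mean():.4f}/yr (95% band {lo:.4f}-{hi:.4f})")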

  2. Quantitative Phosphoproteomics Reveals Wee1 Kinase as a Therapeutic Target in a Model of Proneural Glioblastoma.

    Science.gov (United States)

    Lescarbeau, Rebecca S; Lei, Liang; Bakken, Katrina K; Sims, Peter A; Sarkaria, Jann N; Canoll, Peter; White, Forest M

    2016-06-01

    Glioblastoma (GBM) is the most common malignant primary brain cancer. With a median survival of about a year, new approaches to treating this disease are necessary. To identify signaling molecules regulating GBM progression in a genetically engineered murine model of proneural GBM, we quantified phosphotyrosine-mediated signaling using mass spectrometry. Oncogenic signals, including phosphorylated ERK MAPK, PI3K, and PDGFR, were found to be increased in the murine tumors relative to brain. Phosphorylation of CDK1 pY15, associated with the G2 arrest checkpoint, was identified as the most differentially phosphorylated site, with a 14-fold increase in phosphorylation in the tumors. To assess the role of this checkpoint as a potential therapeutic target, syngeneic primary cell lines derived from these tumors were treated with MK-1775, an inhibitor of Wee1, the kinase responsible for CDK1 Y15 phosphorylation. MK-1775 treatment led to mitotic catastrophe, as defined by increased DNA damage and cell death by apoptosis. To assess the extensibility of targeting Wee1/CDK1 in GBM, patient-derived xenograft (PDX) cell lines were also treated with MK-1775. Although the response was more heterogeneous, on-target Wee1 inhibition led to decreased CDK1 Y15 phosphorylation and increased DNA damage and apoptosis in each line. These results were also validated in vivo, where single-agent MK-1775 demonstrated an antitumor effect on a flank PDX tumor model, increasing mouse survival by 1.74-fold. This study highlights the ability of unbiased quantitative phosphoproteomics to reveal therapeutic targets in tumor models, and the potential for Wee1 inhibition as a treatment approach in preclinical models of GBM. Mol Cancer Ther; 15(6); 1332-43. ©2016 AACR. ©2016 American Association for Cancer Research.

  3. The Union Health Center: a working model of clinical care linked to preventive occupational health services.

    Science.gov (United States)

    Herbert, R; Plattus, B; Kellogg, L; Luo, J; Marcus, M; Mascolo, A; Landrigan, P J

    1997-03-01

    As health care provision in the United States shifts to primary care settings, it is vital that new models of occupational health services be developed that link clinical care to prevention. The model program described in this paper was developed at the Union Health Center (UHC), a comprehensive health care center supported by the International Ladies Garment Workers Union (now the Union of Needletrades, Industrial and Textile Employees) serving a population of approximately 50,000 primarily minority, female garment workers in New York City. The objective of this paper is to describe a model occupational medicine program in a union-based comprehensive health center linking accessible clinical care with primary and secondary disease prevention efforts. To assess the presence of symptoms suggestive of occupational disease, a health status questionnaire was administered to female workers attending the UHC for routine health maintenance. Based on the results of this survey, an occupational medicine clinic was developed that integrated direct clinical care with worker and employer education and workplace hazard abatement. To assess the success of this new approach, selected cases of sentinel health events were tracked and a chart review was conducted after 3 years of clinic operation. Prior to initiation of the occupational medicine clinic, 64% (648) of the workers surveyed reported symptoms indicative of occupational illness. However, only 42 (4%) reported having been told by a physician that they had an occupational illness, and only 4 (0.4%) reported having filed a workers' compensation claim for an occupational disease. In the occupational medicine clinic established at the UHC, a health and safety specialist acts as a case manager, coordinating worker and employer education as well as workplace hazard abatement focused on disease prevention, ensuring that every case of occupational disease is treated as a potential sentinel health event. As examples of the success

  4. DEVELOPMENT OF MODEL FOR QUANTITATIVE EVALUATION OF DYNAMICALLY STABLE FORMS OF RIVER CHANNELS

    Directory of Open Access Journals (Sweden)

    O. V. Zenkin

    2017-01-01

    systems. The determination of regularities in the development of bed forms and of quantitative relations between their parameters is based on modeling the "right" forms of the riverbed. The research has resulted in establishing and testing a simulation modeling methodology that allows one to identify dynamically stable forms of the riverbed.

  5. Quantitative structure activity relationship model for predicting the depletion percentage of skin allergic chemical substances of glutathione

    International Nuclear Information System (INIS)

    Si Hongzong; Wang Tao; Zhang Kejun; Duan Yunbo; Yuan Shuping; Fu Aiping; Hu Zhide

    2007-01-01

    A quantitative model was developed by gene expression programming (GEP) to predict the depletion percentage of glutathione (DPG) by skin-allergenic chemical substances. Each compound was represented by several calculated structural descriptors covering constitutional, topological, geometrical, electrostatic and quantum-chemical features. The GEP method produced a nonlinear, five-descriptor quantitative model with a mean error and correlation coefficient of 10.52 and 0.94 for the training set, and 22.80 and 0.85 for the test set, respectively. The GEP predictions are in good agreement with the experimental values, and better than those of the heuristic method.

  6. Improved quantitative 90 Y bremsstrahlung SPECT/CT reconstruction with Monte Carlo scatter modeling.

    Science.gov (United States)

    Dewaraja, Yuni K; Chun, Se Young; Srinivasa, Ravi N; Kaza, Ravi K; Cuneo, Kyle C; Majdalany, Bill S; Novelli, Paula M; Ljungberg, Michael; Fessler, Jeffrey A

    2017-12-01

    In 90Y microsphere radioembolization (RE), accurate post-therapy imaging-based dosimetry is important for establishing absorbed dose versus outcome relationships and for developing future treatment planning strategies. Additionally, accurately assessing microsphere distributions is important because of concerns about unexpected activity deposition outside the liver. Quantitative 90Y imaging by either SPECT or PET is challenging. In 90Y SPECT, model-based methods are necessary for scatter correction because energy-window-based methods are not feasible with the continuous bremsstrahlung energy spectrum. The objective of this work was to implement and evaluate a scatter estimation method for accurate 90Y bremsstrahlung SPECT/CT imaging. Since a fully Monte Carlo (MC) approach to 90Y SPECT reconstruction is computationally very demanding, in the present study the scatter estimate generated by an MC simulator was combined with an analytical projector in the 3D OS-EM reconstruction model. A single window (105-195 keV) was used for both the acquisition and the projector modeling. A liver/lung torso phantom with intrahepatic lesions and low-uptake extrahepatic objects was imaged to evaluate SPECT/CT reconstruction without and with scatter correction. Clinical application was demonstrated by applying the reconstruction approach to five patients treated with RE to determine lesion and normal-liver activity concentrations using a (liver) relative calibration. There was convergence of the scatter estimate after just two updates, greatly reducing the computational requirements. In the phantom study, compared with reconstruction without scatter correction, MC scatter modeling substantially improved activity recovery in intrahepatic lesions (from > 55% to > 86%), normal liver (from 113% to 104%), and lungs (from 227% to 104%), with only a small degradation in noise (13% vs. 17%). Similarly, with scatter modeling, contrast improved substantially both visually and in
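
    To show where the MC scatter estimate enters the reconstruction, here is a toy Python MLEM loop (ordered subsets omitted) with the scatter term added to the forward projection in the denominator of the multiplicative update; the system matrix, activity and scatter are small random stand-ins.

      import numpy as np

      # MLEM with an additive scatter background s in the forward model.
      # System matrix and activities are random stand-ins, not SPECT data.
      rng = np.random.default_rng(5)
      A = rng.random((64, 16))          # projection matrix: 64 bins x 16 voxels
      x_true = 10.0 * rng.random(16)    # "true" activity
      s = 0.3 * (A @ x_true)            # stand-in for the MC scatter estimate
      y = rng.poisson(A @ x_true + s)   # measured projections

      x = np.ones(16)                   # uniform initial estimate
      sens = A.sum(axis=0)              # sensitivity (backprojection of ones)
      for _ in range(50):
          x *= (A.T @ (y / (A @ x + s))) / sens

      print("relative error:", round(np.linalg.norm(x - x_true)
                                      / np.linalg.norm(x_true), 3))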

  7. Interdiffusion of the aluminum-magnesium system. Quantitative analysis and numerical model; Interdiffusion des Aluminium-Magnesium-Systems. Quantitative Analyse und numerische Modellierung

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion, thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminum coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg system are calculated with the Sauer-Freise method for the first time. To resolve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varying discretization of the spatial coordinate. It exhibits excellent quantitative agreement with the experimentally measured concentration profiles, which confirms the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also show good agreement between simulated and experimental concentration profiles; the diffusion coefficients are thus also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.
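
    The kind of numerical model the abstract describes can be illustrated with a much simpler fixed-grid scheme. The sketch below solves the 1-D interdiffusion equation with a composition-dependent coefficient; the functional form of D(c), the grid, and the boundary treatment are all assumptions for illustration, not the thesis' variable-discretization scheme.

```python
import numpy as np

def interdiffuse(c0, D_of_c, dx, dt, steps):
    """Explicit 1-D finite-difference solution of dc/dt = d/dx( D(c) dc/dx ).
    Boundary cells are held fixed to mimic infinite Al and Mg reservoirs.
    Stability requires dt <= dx**2 / (2 * max(D))."""
    c = c0.astype(float).copy()
    for _ in range(steps):
        D_mid = 0.5 * (D_of_c(c[:-1]) + D_of_c(c[1:]))   # D at cell interfaces
        flux = -D_mid * np.diff(c) / dx                  # Fick's first law
        c[1:-1] -= dt / dx * np.diff(flux)               # conservation update
    return c

# Hypothetical composition-dependent interdiffusion coefficient (m^2/s)
D_of_c = lambda c: 1e-14 * (1.0 + 4.0 * c * (1.0 - c))
c0 = np.where(np.arange(200) < 100, 1.0, 0.0)            # Al | Mg diffusion couple
profile = interdiffuse(c0, D_of_c, dx=1e-7, dt=0.05, steps=2000)
```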

  8. A quantitative exposure model simulating human norovirus transmission during preparation of deli sandwiches.

    Science.gov (United States)

    Stals, Ambroos; Jacxsens, Liesbeth; Baert, Leen; Van Coillie, Els; Uyttendaele, Mieke

    2015-03-02

    Human noroviruses (HuNoVs) are a major cause of foodborne gastroenteritis worldwide. They are often transmitted via infected and shedding food handlers manipulating foods such as deli sandwiches. The present study aimed to simulate HuNoV transmission during the preparation of deli sandwiches in a sandwich bar. A quantitative exposure model was developed by combining the GoldSim® and @Risk® software packages. Input data were collected from the scientific literature and from a two-week observational study performed at two sandwich bars. The model included three food handlers working a three-hour shift on a shared working surface where deli sandwiches are prepared. The model consisted of three components. The first component simulated the preparation of the deli sandwiches and contained the HuNoV reservoirs, the locations within the model where HuNoV can accumulate, and the operation of intervention measures. The second component covered the contamination sources: (1) initially HuNoV-contaminated lettuce used on the sandwiches and (2) HuNoV originating from a shedding food handler. The third component included four possible intervention measures to reduce HuNoV transmission: hand and surface disinfection during preparation of the sandwiches, hand gloving, and hand washing after a restroom visit. A single HuNoV-shedding food handler could cause mean levels of 43 ± 18, 81 ± 37, and 18 ± 7 HuNoV particles on the deli sandwiches, hands, and working surfaces, respectively. Introduction of contaminated lettuce as the only source of HuNoV resulted in the presence of 6.4 ± 0.8 and 4.3 ± 0.4 HuNoV particles on the food and hand reservoirs. The inclusion of hand and surface disinfection and hand gloving as single intervention measures was not effective in the model, as only marginal reductions of HuNoV levels were noticeable in the different reservoirs. High compliance with hand washing after a restroom visit did reduce HuNoV presence substantially on all reservoirs. The
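
    The structure of such an exposure model can be sketched without the GoldSim®/@Risk® machinery. The toy Monte Carlo below tracks virus particles across hand, surface, and sandwich reservoirs with a hand-washing intervention; every rate and probability is an illustrative placeholder, not a calibrated input of the published model.

```python
import numpy as np

rng = np.random.default_rng(1)

def shift(n_sandwiches=120, shed_per_touch=50, p_hand_to_food=0.3,
          p_hand_to_surface=0.1, wash_compliance=0.0, wash_log_reduction=2.0):
    """Toy Monte Carlo of one shedding handler preparing sandwiches over a
    shift; all shedding rates and transfer probabilities are illustrative."""
    hands, surface = 0.0, 0.0
    food = np.zeros(n_sandwiches)
    for i in range(n_sandwiches):
        hands += shed_per_touch                           # contamination while shedding
        if rng.random() < wash_compliance:                # occasional hand wash
            hands *= 10.0 ** -wash_log_reduction
        food[i] = rng.binomial(int(hands), p_hand_to_food)
        hands -= food[i]
        moved = rng.binomial(int(hands), p_hand_to_surface)
        hands -= moved
        surface += moved
    return food.mean(), hands, surface

print(shift())                     # baseline: no hand washing
print(shift(wash_compliance=0.9))  # high hand-washing compliance
```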

  9. A biological-based model that links genomic instability, bystander effects, and adaptive response

    International Nuclear Information System (INIS)

    Scott, B.R.

    2004-01-01

    This paper links genomic instability, bystander effects, and adaptive response in mammalian cell communities via a novel biological-based, dose-response model called NEOTRANS3. The model is an extension of the NEOTRANS2 model, which addressed stochastic effects (genomic instability, mutations, and neoplastic transformation) associated with brief exposure to low radiation doses. With both models, ionizing radiation produces DNA damage in cells that can be associated with varying degrees of genomic instability. Cells with persistent problematic instability (PPI) are mutants that arise via misrepair of DNA damage. Progeny of PPI cells also have PPI and can undergo spontaneous neoplastic transformation. Unlike NEOTRANS2, with NEOTRANS3 newly induced mutant PPI cells and their neoplastically transformed progeny can be suppressed via our previously introduced protective apoptosis-mediated (PAM) process, which can be activated by low linear energy transfer (LET) radiation. However, with NEOTRANS3 (which, like NEOTRANS2, involves cross-talk between nongenomically compromised [e.g., nontransformed, nonmutant] and genomically compromised [e.g., mutant, transformed, etc.] cells), it is assumed that PAM is only activated over a relatively narrow, dose-rate-dependent interval (D_PAM, D_off), where D_PAM is a small stochastic activation threshold and D_off is the stochastic dose above which PAM does not occur. PAM cooperates with activated normal DNA repair and with activated normal apoptosis in guarding against genomic instability. Normal repair involves both error-free repair and misrepair components. Normal apoptosis and the error-free component of normal repair protect mammals by preventing the occurrence of mutant cells. PAM selectively removes mutant cells arising via the misrepair component of normal repair, selectively removes existing neoplastically transformed cells, and probably selectively removes other genomically compromised cells when it is activated.

  10. A conceptual model linking functional gene expression and reductive dechlorination rates of chlorinated ethenes in clay rich groundwater sediment

    DEFF Research Database (Denmark)

    Bælum, Jacob; Chambon, Julie Claire Claudia; Scheutz, Charlotte

    2013-01-01

    We used current knowledge of cellular processes involved in reductive dechlorination to develop a conceptual model to describe the regulatory system of dechlorination at the cell level; the model links bacterial growth and substrate consumption to the abundance of messenger RNA of functional gene...

  11. An Interpersonal Circumplex Model of Children's Social Goals: Links with Peer-Reported Behavior and Sociometric Status

    Science.gov (United States)

    Ojanen, Tiina; Gronroos, Matti; Salmivalli, Christina

    2005-01-01

    The objective of the present research was to develop an assessment model for children's social goals. The aims were (a) to fit children's social goals to a circumplex model and to examine links between goals and peer-reported social behaviors (aggression, withdrawal, and prosocial behavior) in a sample of 276 participants (134 girls, 11- to…

  12. Model developments for quantitative estimates of the benefits of the signals on nuclear power plant availability and economics

    International Nuclear Information System (INIS)

    Seong, Poong Hyun

    1993-01-01

    A novel framework for quantitative estimates of the benefits of signals on nuclear power plant availability and economics has been developed in this work. The models developed in this work quantify how the perfect signals affect the human operator's success in restoring the power plant to the desired state when it enters undesirable transients. Also, the models quantify the economic benefits of these perfect signals. The models have been applied to the condensate feedwater system of the nuclear power plant for demonstration. (Author)

  13. Ecological-network models link diversity, structure and function in the plankton food-web

    Science.gov (United States)

    D'Alelio, Domenico; Libralato, Simone; Wyatt, Timothy; Ribera D'Alcalà, Maurizio

    2016-02-01

    A planktonic food-web model including sixty-three functional nodes (representing auto-, mixo-, and heterotrophs) was developed to integrate most of the trophic diversity present in the plankton. The model was implemented in two variants - which we named ‘green’ and ‘blue’ - characterized by opposite amounts of phytoplankton biomass and representing, respectively, bloom and non-bloom states of the system. The taxonomically disaggregated food-webs described herein shed light on how components of the plankton community changed their trophic behavior under the two conditions and modified the overall functioning of the plankton food web. The green and blue food-webs showed distinct organizations in terms of the trophic roles of the nodes and the carbon fluxes between them. This re-organization stemmed from switches in selective grazing by both metazoan and protozoan consumers. Switches in food-web structure resulted in relatively small differences in the efficiency of material transfer towards higher trophic levels. For instance, from green to blue states, a seven-fold decrease in phytoplankton biomass translated into only a two-fold decrease in potential planktivorous fish biomass. By linking diversity, structure, and function in the plankton food-web, we discuss the role of internal mechanisms, relying on species-specific functionalities, in driving the ‘adaptive’ responses of plankton communities to perturbations.

  14. ON THE TRANSITIONAL DISK CLASS: LINKING OBSERVATIONS OF T TAURI STARS AND PHYSICAL DISK MODELS

    International Nuclear Information System (INIS)

    Espaillat, C.; Andrews, S.; Qi, C.; Wilner, D.; Ingleby, L.; Calvet, N.; Hernández, J.; Furlan, E.; D'Alessio, P.; Muzerolle, J.

    2012-01-01

    Two decades ago 'transitional disks' (TDs) described spectral energy distributions (SEDs) of T Tauri stars with small near-IR excesses, but significant mid- and far-IR excesses. Many inferred this indicated dust-free holes in disks possibly cleared by planets. Recently, this term has been applied disparately to objects whose Spitzer SEDs diverge from the expectations for a typical full disk (FD). Here, we use irradiated accretion disk models to fit the SEDs of 15 such disks in NGC 2068 and IC 348. One group has a 'dip' in infrared emission while the others' continuum emission decreases steadily at all wavelengths. We find that the former have an inner disk hole or gap at intermediate radii in the disk and we call these objects 'transitional disks' and 'pre-transitional disks' (PTDs), respectively. For the latter group, we can fit these SEDs with FD models and find that millimeter data are necessary to break the degeneracy between dust settling and disk mass. We suggest that the term 'transitional' only be applied to objects that display evidence for a radical change in the disk's radial structure. Using this definition, we find that TDs and PTDs tend to have lower mass accretion rates than FDs and that TDs have lower accretion rates than PTDs. These reduced accretion rates onto the star could be linked to forming planets. Future observations of TDs and PTDs will allow us to better quantify the signatures of planet formation in young disks.

  15. A model linking clinical workforce skill mix planning to health and health care dynamics

    Directory of Open Access Journals (Sweden)

    McDonnell Geoff

    2010-04-01

    Background: In an attempt to devise a simpler computable tool to assist workforce planners in determining what might be an appropriate mix of health service skills, our discussion led us to consider the implications of skill mixing and workforce composition beyond the 'stock and flow' approach of much workforce planning activity. Methods: Taking a dynamic systems approach, we were able to address the interactions, delays and feedbacks that influence the balance between the major components of health and health care. Results: We linked clinical workforce requirements to clinical workforce workload, taking into account the requisite facilities, technologies, other material resources and their funding to support clinical care microsystems; gave recognition to productivity and quality issues; took cognisance of policies, governance and power concerns in the establishment and operation of the health care system; and, going back to the individual, gave due attention to personal behaviour and biology within the socio-political family environment. Conclusion: We have produced the broad endogenous systems model of health and health care which will enable human resource planners to operate within real world variables. We are now considering the development of simple, computable national versions of this model.

  16. A model linking clinical workforce skill mix planning to health and health care dynamics.

    Science.gov (United States)

    Masnick, Keith; McDonnell, Geoff

    2010-04-30

    In an attempt to devise a simpler computable tool to assist workforce planners in determining what might be an appropriate mix of health service skills, our discussion led us to consider the implications of skill mixing and workforce composition beyond the 'stock and flow' approach of much workforce planning activity. Taking a dynamic systems approach, we were able to address the interactions, delays and feedbacks that influence the balance between the major components of health and health care. We linked clinical workforce requirements to clinical workforce workload, taking into account the requisite facilities, technologies, other material resources and their funding to support clinical care microsystems; gave recognition to productivity and quality issues; took cognisance of policies, governance and power concerns in the establishment and operation of the health care system; and, going back to the individual, gave due attention to personal behaviour and biology within the socio-political family environment. We have produced the broad endogenous systems model of health and health care which will enable human resource planners to operate within real world variables. We are now considering the development of simple, computable national versions of this model.

  17. Linking Fine-Scale Observations and Model Output with Imagery at Multiple Scales

    Science.gov (United States)

    Sadler, J.; Walthall, C. L.

    2014-12-01

    The development and implementation of a system for seasonal worldwide agricultural yield estimates is underway through the international Group on Earth Observations GeoGLAM project. GeoGLAM includes a research component to continually improve and validate its algorithms. There is a decades-long history of field measurement campaigns to draw upon for ways of linking surface measurements and model results with satellite observations. Ground-based, in-situ measurements collected by interdisciplinary teams include yields, model inputs, and factors affecting scene radiation. Data that are comparable across space and time, with careful attention to calibration, are essential for the development and validation of agricultural applications of remote sensing. Data management to ensure stewardship, availability, and accessibility of the data is best accomplished when considered an integral part of the research. Field measurement campaigns can be expensive and logistically challenging, and because of short funding cycles for research, access to consistent, stable study sites can be lost. Using dedicated staff for the baseline data needed by multiple investigators, and conducting measurement campaigns within existing measurement networks such as the USDA Long Term Agroecosystem Research network, can fulfill these needs and ensure long-term access to study sites.

  18. New chondrosarcoma cell lines and mouse models to study the link between chondrogenesis and chemoresistance.

    Science.gov (United States)

    Monderer, David; Luseau, Alexandrine; Bellec, Amélie; David, Emmanuelle; Ponsolle, Stéphanie; Saiagh, Soraya; Bercegeay, Sylvain; Piloquet, Philippe; Denis, Marc G; Lodé, Laurence; Rédini, Françoise; Biger, Marine; Heymann, Dominique; Heymann, Marie-Françoise; Le Bot, Ronan; Gouin, François; Blanchard, Frédéric

    2013-10-01

    Chondrosarcomas are cartilage-forming, poorly vascularized tumors. They represent the second most common malignant primary bone tumor in adults after osteosarcoma, but in contrast to osteosarcoma they are resistant to chemotherapy and radiotherapy, so surgical excision remains the only therapeutic option. Few cell lines and animal models are available, and the mechanisms behind their chemoresistance remain largely unknown. Our goal was to establish new cell lines and animal cancer models from human chondrosarcoma biopsies to study their chemoresistance. Between 2007 and 2012, 10 chondrosarcoma biopsies were collected and used for cell culture and transplantation into nude mice. Only one transplanted biopsy and one injected cell line engrafted successfully, leading to conventional central high-grade chondrosarcomas similar to the original biopsies. In culture, two new stable cell lines were obtained, one from a dedifferentiated and one from a grade III conventional central chondrosarcoma biopsy. Their genetic characterization revealed triploid karyotypes, mutations in IDH1, IDH2, and TP53, deletion of CDKN2A, and/or MDM2 amplification. These cell lines expressed mesenchymal membrane markers (CD44, CD73, CD90, CD105) and were able to produce a hyaline cartilaginous matrix when cultured in chondrogenic three-dimensional (3D) pellets. Using a high-throughput quantitative RT-PCR approach, we observed that cell lines cultured in monolayer lost expression of several genes implicated in cartilage development (COL2A1, COMP, ACAN) but restored their expression in 3D cultures. Chondrosarcoma cells in monolayer were sensitive to several conventional chemotherapeutic agents but became resistant to low doses of mafosfamide or doxorubicin when cultured in 3D pellets, in parallel with altered nuclear accumulation of the drug. Our results indicate that the cartilaginous matrix produced by chondrosarcoma cells may impair the diffusion of several drugs and thus contribute to chemoresistance.

  19. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    Science.gov (United States)

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk, as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw-milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed at the 95th percentile, the risk of exposure increased up to 160-fold. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that a lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
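
    The multinomial strain-allocation step can be illustrated in a few lines. In the sketch below, the strain prevalences and the flags marking enterotoxin-A producers are hypothetical placeholders for the field data used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical strain prevalences and SEA-producing flags
strain_freq = np.array([0.50, 0.30, 0.15, 0.05])
produces_sea = np.array([False, True, False, True])

def toxigenic_cells(total_cells):
    """Allocate the cells in a serving to strains with a multinomial draw and
    return how many belong to enterotoxin-A-producing strains."""
    counts = rng.multinomial(total_cells, strain_freq)
    return counts[produces_sea].sum()

print(toxigenic_cells(10_000))
```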

  20. Environmental determinants of tropical forest and savanna distribution: A quantitative model evaluation and its implication

    Science.gov (United States)

    Zeng, Zhenzhong; Chen, Anping; Piao, Shilong; Rabin, Sam; Shen, Zehao

    2014-07-01

    The distributions of tropical ecosystems are rapidly being altered by climate change and anthropogenic activities. One possible trend, the loss of tropical forests and their replacement by savannas, could result in significant shifts in ecosystem services and biodiversity loss. However, the influence and the relative importance of the environmental factors regulating the distribution of tropical forest and savanna biomes are still poorly understood, which makes it difficult to predict future tropical forest and savanna distributions in the context of climate change. Here we use boosted regression trees to quantitatively evaluate the importance of environmental predictors (mainly climatic, edaphic, and fire factors) for the tropical forest-savanna distribution at a mesoscale across the tropics (between 15°N and 35°S). Our results demonstrate that climate alone can explain most of the distribution of tropical forest and savanna at the scale considered; dry-season average precipitation is the single most important determinant across tropical Asia-Australia, Africa, and South America. Given the strong tendency toward increased seasonality and decreased dry-season precipitation predicted by global climate models, we estimate that about 28% of what is now tropical forest would likely be lost to savanna by the late 21st century under the future scenario considered. This study highlights the importance of climate seasonality and interannual variability in predicting the distribution of tropical forest and savanna, supporting climate as the primary driver of savanna biogeography.
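
    The classification step can be sketched with scikit-learn's gradient boosting, a close relative of the boosted regression trees used in the study. The synthetic data below simply encode the headline result that dry-season precipitation dominates; all predictor names and scales are illustrative.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Synthetic pixels: forest (1) vs savanna (0), driven mostly by dry-season rain
n = 2000
dry_precip = rng.uniform(0, 300, n)        # mm/month, hypothetical scale
fire_freq = rng.uniform(0, 1, n)           # fires per year, hypothetical
soil_sand = rng.uniform(0, 100, n)         # % sand, hypothetical
forest = (dry_precip + 30 * rng.standard_normal(n) > 120).astype(int)

X = np.column_stack([dry_precip, fire_freq, soil_sand])
brt = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05).fit(X, forest)
for name, imp in zip(["dry_precip", "fire_freq", "soil_sand"],
                     brt.feature_importances_):
    print(f"{name}: {imp:.2f}")            # relative predictor importance
```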

  1. Model-independent quantitative measurement of nanomechanical oscillator vibrations using electron-microscope linescans

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Huan; Fenton, J. C.; Chiatti, O. [London Centre for Nanotechnology, University College London, 17–19 Gordon Street, London WC1H 0AH (United Kingdom); Warburton, P. A. [London Centre for Nanotechnology, University College London, 17–19 Gordon Street, London WC1H 0AH (United Kingdom); Department of Electronic and Electrical Engineering, University College London, Torrington Place, London WC1E 7JE (United Kingdom)

    2013-07-15

    Nanoscale mechanical resonators are highly sensitive devices and, therefore, for application as highly sensitive mass balances, they are potentially superior to micromachined cantilevers. The absolute measurement of nanoscale displacements of such resonators remains a challenge, however, since the optical signal reflected from a cantilever whose dimensions are sub-wavelength is at best very weak. We describe a technique for quantitative analysis and fitting of scanning-electron-microscope (SEM) linescans across a cantilever resonator, involving deconvolution of the vibrating resonator profile using the stationary resonator profile. This enables determination of the absolute amplitude of nanomechanical cantilever oscillations even when the oscillation amplitude is much smaller than the cantilever width. The technique is independent of any model of secondary-electron emission from the resonator and is, therefore, applicable to resonators with arbitrary geometry and material inhomogeneity. We demonstrate the technique using focussed-ion-beam-deposited tungsten cantilevers of radius ∼60–170 nm inside a field-emission SEM, with excitation of the cantilever by a piezoelectric actuator allowing measurement of the full frequency response. Oscillation amplitudes approaching the size of the primary electron beam can be resolved. We further show that the optimum electron-beam scan speed is determined by a compromise between deflection of the cantilever at low scan speeds and limited spatial resolution at high scan speeds. Our technique will be an important tool for use in precise characterization of nanomechanical resonator devices.
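
    The model-independent step is the key idea: a linescan across the vibrating cantilever is the stationary linescan blurred by the time-averaged position density of a harmonic oscillation (an arcsine distribution). A minimal sketch, with toy profiles standing in for measured SEM linescans:

```python
import numpy as np

def arcsine_kernel(x, a):
    """Time-averaged position density of a sinusoidal oscillation of amplitude
    a (the classic arcsine distribution); a must exceed the grid spacing."""
    k = np.zeros_like(x)
    inside = np.abs(x) < a
    k[inside] = 1.0 / (np.pi * np.sqrt(a**2 - x[inside]**2))
    return k / k.sum()

def predicted_linescan(stationary, x, a):
    # The vibrating-cantilever linescan is the stationary profile blurred by
    # the oscillation kernel; no secondary-electron emission model is needed.
    return np.convolve(stationary, arcsine_kernel(x, a), mode="same")

def fit_amplitude(stationary, vibrating, x, trial_amps):
    resid = [np.sum((predicted_linescan(stationary, x, a) - vibrating) ** 2)
             for a in trial_amps]
    return trial_amps[int(np.argmin(resid))]

# Toy profiles: a 100 nm wide cantilever sampled on a 1 nm grid
x = np.arange(-256.0, 256.0)
stationary = np.where(np.abs(x) < 50, 1.0, 0.0)
vibrating = predicted_linescan(stationary, x, 20.0)
print(fit_amplitude(stationary, vibrating, x, np.arange(2.0, 40.0)))  # ~20
```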

  2. Quantitative analysis of aqueous phase composition of model dentin adhesives experiencing phase separation

    Science.gov (United States)

    Ye, Qiang; Park, Jonggu; Parthasarathy, Ranganathan; Pamatmat, Francis; Misra, Anil; Laurence, Jennifer S.; Marangos, Orestes; Spencer, Paulette

    2013-01-01

    There have been reports of the sensitivity of our current dentin adhesives to excess moisture, for example, water-blisters in adhesives placed on over-wet surfaces, and phase separation with concomitant limited infiltration of the critical dimethacrylate component into the demineralized dentin matrix. To determine quantitatively the hydrophobic/hydrophilic components in the aqueous phase when exposed to over-wet environments, model adhesives were mixed with 16, 33, and 50 wt % water to yield well-separated phases. Based upon high-performance liquid chromatography coupled with photodiode array detection, it was found that the amounts of hydrophobic BisGMA and hydrophobic initiators are less than 0.1 wt % in the aqueous phase. The amount of these compounds decreased with an increase in the initial water content. The major components of the aqueous phase were hydroxyethyl methacrylate (HEMA) and water, and the HEMA content ranged from 18.3 to 14.7 wt %. Different BisGMA homologues and the relative content of these homologues in the aqueous phase have been identified; however, the amount of crosslinkable BisGMA was minimal and, thus, could not help in the formation of a crosslinked polymer network in the aqueous phase. Without the protection afforded by a strong crosslinked network, the poorly photoreactive compounds of this aqueous phase could be leached easily. These results suggest that adhesive formulations should be designed to include hydrophilic multimethacrylate monomers and water compatible initiators. PMID:22331596

  3. Microsegregation in multicomponent alloy analysed by quantitative phase-field model

    International Nuclear Information System (INIS)

    Ohno, M; Takaki, T; Shibuta, Y

    2015-01-01

    Microsegregation behaviour in a ternary alloy system has been analysed by means of quantitative phase-field (Q-PF) simulations, with particular attention directed at the influence of the tie-line shift stemming from the different liquid diffusivities of the solute elements. The Q-PF model developed for non-isothermal solidification in multicomponent alloys with non-zero solid diffusivities was applied to the analysis of microsegregation in a ternary alloy consisting of fast- and slow-diffusing solute elements. The accuracy of the Q-PF simulation was first verified by performing a convergence test of the segregation ratio with respect to the interface thickness. From one-dimensional analysis, it was found that the microsegregation of the slow-diffusing element is reduced due to the tie-line shift. In two-dimensional simulations, refinement of the microstructure, viz. a decrease of the secondary arm spacing, occurs at low cooling rates due to the formation of a diffusion layer of the slow-diffusing element. This yields reductions in the degree of microsegregation for both the fast- and slow-diffusing elements. Importantly, over a wide range of cooling rates, the degree of microsegregation of the slow-diffusing element is always lower than that of the fast-diffusing element, which is entirely ascribable to the influence of the tie-line shift. (paper)

  4. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    International Nuclear Information System (INIS)

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-01-01

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of certain specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here results of the application of QSTRs to assess rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  5. Quantitative assessment of bone defect healing by multidetector CT in a pig model

    International Nuclear Information System (INIS)

    Riegger, Carolin; Kroepil, Patric; Lanzman, Rotem S.; Miese, Falk R.; Antoch, Gerald; Scherer, Axel; Jungbluth, Pascal; Hakimi, Mohssen; Wild, Michael; Hakimi, Ahmad R.

    2012-01-01

    To evaluate multidetector CT volumetry in the assessment of bone defect healing in comparison to histopathological findings in an animal model. In 16 mini-pigs, a circumscribed tibial bone defect was created. Multidetector CT (MDCT) of the tibia was performed on a 64-row scanner 42 days after the operation. The extent of bone healing was estimated quantitatively by MDCT volumetry using a commercially available software programme (syngo Volume, Siemens, Germany). The volume of the entire defect (including all pixels from -100 to 3,000 HU), the nonconsolidated areas (-100 to 500 HU), and areas of osseous consolidation (500 to 3,000 HU) were assessed and the extent of consolidation was calculated. Histomorphometry served as the reference standard. The extent of osseous consolidation in MDCT volumetry ranged from 19 to 92% (mean 65.4 ± 18.5%). There was a significant correlation between histologically visible newly formed bone and the extent of osseous consolidation on MDCT volumetry (r = 0.82, P < 0.0001). A significant negative correlation was detected between osseous consolidation on MDCT and histological areas of persisting defect (r = -0.9, P < 0.0001). MDCT volumetry is a promising tool for noninvasive monitoring of bone healing, showing excellent correlation with histomorphometry. (orig.)

  6. Quantitative assessment of bone defect healing by multidetector CT in a pig model

    Energy Technology Data Exchange (ETDEWEB)

    Riegger, Carolin; Kroepil, Patric; Lanzman, Rotem S.; Miese, Falk R.; Antoch, Gerald; Scherer, Axel [University Duesseldorf, Medical Faculty, Department of Diagnostic and Interventional Radiology, Duesseldorf (Germany); Jungbluth, Pascal; Hakimi, Mohssen; Wild, Michael [University Duesseldorf, Medical Faculty, Department of Traumatology and Hand Surgery, Duesseldorf (Germany); Hakimi, Ahmad R. [Universtity Duesseldorf, Medical Faculty, Department of Oral Surgery, Duesseldorf (Germany)

    2012-05-15

    To evaluate multidetector CT volumetry in the assessment of bone defect healing in comparison to histopathological findings in an animal model. In 16 mini-pigs, a circumscribed tibial bone defect was created. Multidetector CT (MDCT) of the tibia was performed on a 64-row scanner 42 days after the operation. The extent of bone healing was estimated quantitatively by MDCT volumetry using a commercially available software programme (syngo Volume, Siemens, Germany). The volume of the entire defect (including all pixels from -100 to 3,000 HU), the nonconsolidated areas (-100 to 500 HU), and areas of osseous consolidation (500 to 3,000 HU) were assessed and the extent of consolidation was calculated. Histomorphometry served as the reference standard. The extent of osseous consolidation in MDCT volumetry ranged from 19 to 92% (mean 65.4 ± 18.5%). There was a significant correlation between histologically visible newly formed bone and the extent of osseous consolidation on MDCT volumetry (r = 0.82, P < 0.0001). A significant negative correlation was detected between osseous consolidation on MDCT and histological areas of persisting defect (r = -0.9, P < 0.0001). MDCT volumetry is a promising tool for noninvasive monitoring of bone healing, showing excellent correlation with histomorphometry. (orig.)

  7. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention

    Science.gov (United States)

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2009-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through the importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through the legal importation of cats and dogs. The median risk that a rabid animal would penetrate the current border control measures and enter Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may expose Taiwan to a great risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may still be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
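
    The stochastic import-risk calculation can be caricatured in a few lines. In the sketch below, every probability (prevalence, vaccination failure, quarantine miss) is an illustrative placeholder, not a value from the published model; the point is only the structure of the binomial penetration chain.

```python
import numpy as np

rng = np.random.default_rng(3)

def p_introduction(n_imports=10_000, prevalence=1e-5, p_vacc_fail=0.1,
                   p_quarantine_miss=0.05, n_sim=100_000):
    """Probability that at least one rabid animal slips past all barriers in
    a year; all parameters are hypothetical."""
    infected = rng.binomial(n_imports, prevalence, size=n_sim)
    undetected = rng.binomial(infected, p_vacc_fail * p_quarantine_miss)
    return (undetected > 0).mean()

print(p_introduction())                       # baseline controls
print(p_introduction(p_quarantine_miss=1.0))  # quarantine abolished
```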

  8. Analysis of genetic effects of nuclear-cytoplasmic interaction on quantitative traits: genetic model for diploid plants.

    Science.gov (United States)

    Han, Lide; Yang, Jian; Zhu, Jun

    2007-06-01

    A genetic model was proposed for simultaneously analyzing genetic effects of nuclear, cytoplasm, and nuclear-cytoplasmic interaction (NCI) as well as their genotype by environment (GE) interaction for quantitative traits of diploid plants. In the model, the NCI effects were further partitioned into additive and dominance nuclear-cytoplasmic interaction components. Mixed linear model approaches were used for statistical analysis. On the basis of diallel cross designs, Monte Carlo simulations showed that the genetic model was robust for estimating variance components under several situations without specific effects. Random genetic effects were predicted by an adjusted unbiased prediction (AUP) method. Data on four quantitative traits (boll number, lint percentage, fiber length, and micronaire) in Upland cotton (Gossypium hirsutum L.) were analyzed as a worked example to show the effectiveness of the model.

  9. Forging a link between mentoring and collaboration: a new training model for implementation science.

    Science.gov (United States)

    Luke, Douglas A; Baumann, Ana A; Carothers, Bobbi J; Landsverk, John; Proctor, Enola K

    2016-10-13

    publishing peer-reviewed papers. Statistical network models demonstrated that mentoring was strongly and significantly related to subsequent scientific collaboration, which supported a core design principle of the IRI. Future work should establish the link between mentoring and scientific productivity. These results may be of interest to team science, as they suggest the importance of mentoring for future team collaborations, as well as illustrate the utility of network analysis for studying team characteristics and activities.

  10. Linking human diseases to animal models using ontology-based phenotype annotation.

    Directory of Open Access Journals (Sweden)

    Nicole L Washington

    2009-11-01

    Scientists and clinicians who study genetic alterations and disease have traditionally described phenotypes in natural language. The considerable variation in these free-text descriptions has hindered the important task of identifying candidate genes and models for human diseases, and it indicates the need for a computationally tractable method to mine data resources for mutant phenotypes. In this study, we tested the hypothesis that ontological annotation of disease phenotypes will facilitate the discovery of new genotype-phenotype relationships within and across species. To describe phenotypes using ontologies, we used an Entity-Quality (EQ) methodology, wherein the affected entity (E) and how it is affected (Q) are recorded using terms from a variety of ontologies. Using this EQ method, we annotated the phenotypes of 11 gene-linked human diseases described in Online Mendelian Inheritance in Man (OMIM). These human annotations were loaded into our Ontology-Based Database (OBD) along with other ontology-based phenotype descriptions of mutants from various model organism databases. Phenotypes recorded with this EQ method can be computationally compared based on the hierarchy of terms in the ontologies and the frequency of annotation. We utilized four similarity metrics to compare phenotypes and developed an ontology of homologous and analogous anatomical structures to compare phenotypes between species. Using these tools, we demonstrate that we can identify, through the similarity of the recorded phenotypes, other alleles of the same gene, other members of a signaling pathway, and orthologous genes and pathway members across species. We conclude that EQ-based annotation of phenotypes, in conjunction with a cross-species ontology and a variety of similarity metrics, can identify biologically meaningful similarities between genes by comparing phenotypes alone. This annotation and search method provides a novel and efficient means to identify
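
    One of the similarity metrics such a system might use can be sketched directly. The toy ontology and the Jaccard comparison over ancestor closures below are illustrative assumptions; the study evaluated four metrics over far larger ontologies.

```python
# Tiny hypothetical ontology: each phenotype term maps to itself plus its
# ancestors, so comparisons credit shared superclasses.
ancestors = {
    "small eye":   {"small eye", "abnormal eye", "abnormal head"},
    "absent eye":  {"absent eye", "abnormal eye", "abnormal head"},
    "kinked tail": {"kinked tail", "abnormal tail"},
}

def closure(profile):
    """Expand a phenotype profile to the union of its ancestor sets."""
    return set().union(*(ancestors[t] for t in profile))

def jaccard(p1, p2):
    a, b = closure(p1), closure(p2)
    return len(a & b) / len(a | b)

# Profiles sharing 'abnormal eye' score > 0 despite no identical terms
print(jaccard({"small eye"}, {"absent eye", "kinked tail"}))
```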

  11. Control model design to limit DC-link voltage during grid fault in a DFIG variable-speed wind turbine

    Science.gov (United States)

    Nwosu, Cajethan M.; Ogbuka, Cosmas U.; Oti, Stephen E.

    2017-08-01

    This paper presents a control model design capable of inhibiting the phenomenal rise of the DC-link voltage during grid-fault conditions in a variable-speed wind turbine. In contrast to power-circuit protection strategies, which have inherent limitations in fault ride-through capability, a control algorithm is proposed that limits the rise of the DC-link voltage, whose dynamics directly influence the characteristics of the rotor voltage, especially during grid faults. The model results compare favorably with simulation results obtained in a MATLAB/SIMULINK environment. The generated model may therefore be used to predict, with near accuracy, the nature of DC-link voltage variations during faults, given factors that include the speed and speed mode of operation and the value of the damping resistor relative to half the product of the inner-loop current-control bandwidth and the filter inductance.

  12. Quantitative Validation of the Integrated Medical Model (IMM) for ISS Missions

    Science.gov (United States)

    Young, Millennia; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    Lifetime Surveillance of Astronaut Health (LSAH) provided observed medical event data on 33 ISS and 111 STS person-missions for use in further improving and validating the Integrated Medical Model (IMM). Using only the crew characteristics from these observed missions, the newest development version, IMM v4.0, will simulate these missions to predict medical events and outcomes. Comparing IMM predictions to the actual observed medical event counts will provide external validation and identify areas for possible improvement. To improve the power of detecting differences in this validation study, the totals over each program (ISS and STS) will serve as the main quantitative comparison objectives, specifically the following parameters: total medical events (TME), probability of loss of crew life (LOCL), and probability of evacuation (EVAC). Scatter plots of observed versus median predicted TMEs (with error bars reflecting the simulation intervals) will graphically display comparisons, while linear regression will serve as the statistical test of agreement. Two scatter plots will be analyzed: (1) one where each point reflects a mission and (2) one where each point reflects a condition-specific total number of occurrences. The coefficient of determination (R2) resulting from a linear regression with no intercept bias (intercept fixed at zero) will serve as an overall metric of agreement between the IMM and the real-world system (RWS). In an effort to identify as many discrepancies as possible for further inspection, the α-level for all statistical tests comparing IMM predictions to observed data will be set to 0.1. This less stringent criterion, along with the multiple testing being conducted, should detect all perceived differences, including many false-positive signals resulting from random variation. The results of these analyses will reveal areas of the model requiring adjustment to improve overall IMM output, which will thereby provide better decision support for
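
    The planned agreement statistic is straightforward to compute. The sketch below fits a regression through the origin and reports the uncentered R², as appropriate when the intercept is fixed at zero; the observed and predicted totals are made-up stand-ins.

```python
import numpy as np

def origin_regression(observed, predicted):
    """Least-squares regression of observed on predicted totals with the
    intercept fixed at zero: slope and the (uncentered) R^2."""
    x = np.asarray(predicted, dtype=float)
    y = np.asarray(observed, dtype=float)
    slope = (x @ y) / (x @ x)
    ss_res = np.sum((y - slope * x) ** 2)
    ss_tot = np.sum(y ** 2)                  # uncentered total sum of squares
    return slope, 1.0 - ss_res / ss_tot

print(origin_regression(observed=[4, 7, 12, 3], predicted=[5, 6, 11, 4]))
```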

  13. MODIS volcanic ash retrievals vs FALL3D transport model: a quantitative comparison

    Science.gov (United States)

    Corradini, S.; Merucci, L.; Folch, A.

    2010-12-01

    Satellite retrievals and transport models represent the key tools for monitoring the evolution of volcanic clouds. Because of the harmful effects of fine ash particles on aircraft, real-time tracking and forecasting of volcanic clouds is key for aviation safety. Alongside safety, the economic consequences of airport disruptions must also be taken into account. The airport closures due to the recent Icelandic Eyjafjöll eruption left millions of passengers stranded not only in Europe but across the world. IATA (the International Air Transport Association) estimates that the worldwide airline industry lost a total of about 2.5 billion euros during the disruption. Both safety and economic issues require reliable and robust ash cloud retrievals and trajectory forecasting. Intercomparison between remote sensing and modeling is required to ensure precise and reliable volcanic ash products. In this work we perform a quantitative comparison between Moderate Resolution Imaging Spectroradiometer (MODIS) retrievals of volcanic ash cloud mass and Aerosol Optical Depth (AOD) and the FALL3D ash dispersal model. MODIS, aboard the NASA Terra and NASA Aqua polar satellites, is a multispectral instrument with 36 spectral bands operating in the VIS-TIR spectral range and a spatial resolution varying between 250 and 1000 m at nadir. The MODIS channels centered around 11 and 12 microns have been used for the ash retrievals through the brightness temperature difference algorithm and MODTRAN simulations. FALL3D is a 3-D time-dependent Eulerian model for the transport and deposition of volcanic particles that outputs, among other variables, cloud column mass and AOD. Three MODIS images collected on 28, 29, and 30 October over Mt. Etna during the 2002 eruption have been considered as test cases. The results show generally good agreement between the retrieved and modeled volcanic clouds in the first 300 km from the vents. Even if the

  14. A Quantitative Human Spacecraft Design Evaluation Model for Assessing Crew Accommodation and Utilization

    Science.gov (United States)

    Fanchiang, Christine

    Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission, from commercial space tourism to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle to cutting-edge rocketry, with the assumption that the astronauts could be trained and would adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process and ensure that the crewmembers are well integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here starts at a very fundamental level, identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. While the model is the primary tangible product of this research, the more interesting outcome of

  15. The hydrological calibration and validation of a complexly-linked watershed reservoir model for the Occoquan watershed, Virginia

    Science.gov (United States)

    Xu, Zhongyan; Godrej, Adil N.; Grizzard, Thomas J.

    2007-10-01

    Runoff models such as HSPF and reservoir models such as CE-QUAL-W2 are used to model water quality in watersheds. Most often, the models are independently calibrated to observed data. While this approach can achieve good calibration, it does not replicate the physically linked nature of the system. When models are linked by using the output from an upstream model as input to a downstream model, the physical reality of a continuous watershed, where the overland and waterbody portions are parts of the whole, is better represented. There are some additional challenges in the calibration of such linked models, because the aim is to simulate the entire system as a whole rather than piecemeal. When public entities are charged with model development, one of the driving forces is to use public-domain models. This paper describes the use of two such models, HSPF and CE-QUAL-W2, in the linked modeling of the Occoquan watershed located in northern Virginia, USA. A description of the process is provided, and results from the hydrological calibration and validation are shown. The Occoquan model consists of six HSPF and two CE-QUAL-W2 models, linked in a complex way, to simulate two major reservoirs and the associated drainage areas. The overall linked model was calibrated for a three-year period and validated for a two-year period. The results show that a successful calibration can be achieved using the linked approach, with moderate additional effort. Overall flow balances based on the three-year calibration period at four stream stations showed agreement ranging from -3.95% to +3.21%. Flow balances for the two reservoirs, compared via the daily water surface elevations, also showed good agreement (R2 values of 0.937 for Lake Manassas and 0.926 for Occoquan Reservoir) when missing (un-monitored) flows were included. Validation of the models ranged from poor to fair for the watershed models and excellent for the waterbody models, thus indicating that the

  16. Quantitative acid-base physiology using the Stewart model. Does it improve our understanding of what is really wrong?

    NARCIS (Netherlands)

    Derksen, R.; Scheffer, G.J.; Hoeven, J.G. van der

    2006-01-01

    Traditional theories of acid-base balance are based on the Henderson-Hasselbalch equation to calculate proton concentration. The recent revival of quantitative acid-base physiology using the Stewart model has increased our understanding of complicated acid-base disorders, but has also led to several
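
    One building block of the Stewart approach lends itself to a worked example. The sketch below computes the apparent strong ion difference from measured strong ions; the input values are a made-up, roughly normal example, and the full model (weak acids, pCO2) is omitted for brevity.

```python
def apparent_sid(na, k, ca, mg, cl, lactate):
    """Apparent strong ion difference (mEq/L) from measured strong ions,
    one quantity used in Stewart-style acid-base analysis."""
    return (na + k + ca + mg) - (cl + lactate)

print(apparent_sid(na=140, k=4, ca=2, mg=1, cl=105, lactate=1))  # 41 mEq/L
```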

  17. Physically based dynamic run-out modelling for quantitative debris flow risk assessment: a case study in Tresenda, northern Italy

    Czech Academy of Sciences Publication Activity Database

    Quan Luna, B.; Blahůt, Jan; Camera, C.; Van Westen, C.; Apuani, T.; Jetten, V.; Sterlacchini, S.

    2014-01-01

    Roč. 72, č. 3 (2014), s. 645-661 ISSN 1866-6280 Institutional support: RVO:67985891 Keywords : debris flow * FLO-2D * run-out * quantitative hazard and risk assessment * vulnerability * numerical modelling Subject RIV: DB - Geology ; Mineralogy Impact factor: 1.765, year: 2014

  18. A Quantitative Study of Faculty Perceptions and Attitudes on Asynchronous Virtual Teamwork Using the Technology Acceptance Model

    Science.gov (United States)

    Wolusky, G. Anthony

    2016-01-01

    This quantitative study used a web-based questionnaire to assess the attitudes and perceptions of online and hybrid faculty towards student-centered asynchronous virtual teamwork (AVT) using the technology acceptance model (TAM) of Davis (1989). AVT is online student participation in a team approach to problem-solving culminating in a written…

  19. Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships (MCBIOS)

    Science.gov (United States)

    Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships. Jie Liu, Richard Judson, Matthew T. Martin, Huixiao Hong, Imran Shah. National Center for Computational Toxicology (NCCT), US EPA, RTP, NC...

  20. Quantitative predictions from competition theory with incomplete information on model parameters tested against experiments across diverse taxa

    OpenAIRE

    Fort, Hugo

    2017-01-01

    We derive an analytical approximation for making quantitative predictions for ecological communities as a function of the mean intensity of inter-specific competition and the species richness. This method, using only a fraction of the model parameters (carrying capacities and competition coefficients), accurately predicts empirical measurements covering a wide variety of taxa (algae, plants, protozoa).
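
    The equilibrium prediction underlying this approach can be written down directly. The sketch below solves the Lotka-Volterra competition equilibrium A n* = K under a mean-field assumption of the kind the abstract describes; the carrying capacities and the mean competition intensity are hypothetical.

```python
import numpy as np

# Lotka-Volterra competition at equilibrium: A @ n_star = K, where A holds
# competition coefficients (1 on the diagonal) and K the carrying capacities.
# The mean-field step replaces all off-diagonal entries by the mean intensity.
K = np.array([100.0, 80.0, 60.0])       # hypothetical carrying capacities
mean_a = 0.3                            # mean inter-specific competition
S = len(K)
A = np.full((S, S), mean_a)
np.fill_diagonal(A, 1.0)

n_star = np.linalg.solve(A, K)          # predicted equilibrium abundances
print(n_star)
```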

  1. Using ISOS consensus test protocols for development of quantitative life test models in ageing of organic solar cells

    DEFF Research Database (Denmark)

    Kettle, J.; Stoichkov, V.; Kumar, D.

    2017-01-01

    As Organic Photovoltaic (OPV) development matures, the demand grows for rapid characterisation of degradation and for the application of Quantitative Accelerated Life Test (QALT) models to predict and improve reliability. To date, most accelerated testing on OPVs has been conducted using ISOS consensus...

  2. Quantitative trait loci affecting phenotypic variation in the vacuolated lens mouse mutant, a multigenic mouse model of neural tube defects

    NARCIS (Netherlands)

    Korstanje, Ron; Desai, Jigar; Lazar, Gloria; King, Benjamin; Rollins, Jarod; Spurr, Melissa; Joseph, Jamie; Kadambi, Sindhuja; Li, Yang; Cherry, Allison; Matteson, Paul G.; Paigen, Beverly; Millonig, James H.

    Korstanje R, Desai J, Lazar G, King B, Rollins J, Spurr M, Joseph J, Kadambi S, Li Y, Cherry A, Matteson PG, Paigen B, Millonig JH. Quantitative trait loci affecting phenotypic variation in the vacuolated lens mouse mutant, a multigenic mouse model of neural tube defects. Physiol Genomics 35:

  3. Model development for quantitative evaluation of nuclear fuel cycle alternatives and its application

    International Nuclear Information System (INIS)

    Ko, Won Il

    2000-02-01

    This study addresses the quantitative evaluation of proliferation resistance and economics, which are important factors of alternative nuclear fuel cycle systems. A model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles, and a fuel cycle cost analysis model was suggested to incorporate various uncertainties in the fuel cycle cost calculation. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of a future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, a proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. In this model, the proliferation resistance was described as the relative size of the barrier that must be overcome in order to acquire nuclear weapons. A larger barrier therefore means that the risk of failure is greater, the expenditure of resources is larger, and the time scale for implementation is longer. The electromotive force was expressed as the political motivation of potential proliferators, such as an unauthorized party or a national group, to acquire nuclear weapons. The electrical current was then defined as the proliferation resistance index. Two electrical circuit models were used in the evaluation of the proliferation resistance: the series and the parallel circuits. In the series circuit model of the proliferation resistance, a potential proliferator has to overcome all resistance barriers to manufacture nuclear weapons. This could be explained by the fact that the IAEA (International Atomic Energy Agency) safeguards philosophy relies on the defense-in-depth principle against nuclear proliferation at a specific facility. The parallel circuit model was also used to imitate the risk of proliferation for
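
    The circuit analogy translates directly into arithmetic, as sketched below; the barrier values and the normalization are illustrative, not figures from the study.

```python
# Circuit analogy in miniature: motivation as an electromotive force V,
# barriers as resistances, and the proliferation resistance index as the
# resulting "current" I = V / R_total. Series barriers must all be overcome
# (resistances add); parallel pathways offer alternatives (conductances add).
def series(resistances):
    return sum(resistances)

def parallel(resistances):
    return 1.0 / sum(1.0 / r for r in resistances)

V = 1.0                                  # normalized proliferator motivation
barriers = [2.0, 5.0, 3.0]               # e.g. safeguards, access, material quality
print("index (series):  ", V / series(barriers))
print("index (parallel):", V / parallel(barriers))
```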

  4. Quantitative modeling of clinical, cellular, and extracellular matrix variables suggest prognostic indicators in cancer: a model in neuroblastoma.

    Science.gov (United States)

    Tadeo, Irene; Piqueras, Marta; Montaner, David; Villamón, Eva; Berbegall, Ana P; Cañete, Adela; Navarro, Samuel; Noguera, Rosa

    2014-02-01

    Risk classification and treatment stratification for cancer patients is restricted by our incomplete picture of the complex and unknown interactions between the patient's organism and tumor tissues (transformed cells supported by tumor stroma). Moreover, all clinical factors and laboratory studies used to indicate treatment effectiveness and outcomes are by their nature a simplification of the biological system of cancer, and cannot yet incorporate all possible prognostic indicators. A multiparametric analysis on 184 tumor cylinders was performed. To highlight the benefit of integrating digitized medical imaging into this field, we present the results of computational studies carried out on quantitative measurements, taken from stromal and cancer cells and various extracellular matrix fibers interpenetrated by glycosaminoglycans, and eight current approaches to risk stratification systems in patients with primary and nonprimary neuroblastoma. New tumor tissue indicators from both fields, the cellular and the extracellular elements, emerge as reliable prognostic markers for risk stratification and could be used as molecular targets of specific therapies. The key to dealing with personalized therapy lies in the mathematical modeling. The use of bioinformatics in patient-tumor-microenvironment data management allows a predictive model in neuroblastoma.

  5. Linking land use change to recreational fishery valuation with a spatially explicit behavior model: A case study from Tampa Bay, FL USA

    Science.gov (United States)

    Drawing a link between habitat change and production and delivery of ecosystem services is a priority in coastal estuarine ecosystems. This link is needed to fully understand how human communities can influence ecosystem sustainability. Mechanistic modeling tools are highly fun...

  6. Vibro-acoustic modelling of aircraft double-walls with structural links using Statistical Energy Analysis

    Science.gov (United States)

    Campolina, Bruno L.

    The prediction of aircraft interior noise involves the vibroacoustic modelling of the fuselage with noise control treatments. This structure is composed of a stiffened metallic or composite panel, lined with a thermal and acoustic insulation layer (glass wool), and structurally connected via vibration isolators to a commercial lining panel (trim). This work aims at tailoring the noise control treatments, taking design constraints such as weight and space optimization into account. For this purpose, a representative aircraft double-wall is modelled using the Statistical Energy Analysis (SEA) method. Laboratory excitations such as diffuse acoustic field and point force are addressed and trends