WorldWideScience

Sample records for macromolecular damage inferred

  1. Radiation damage to nucleoprotein complexes in macromolecular crystallography

    International Nuclear Information System (INIS)

    Bury, Charles; Garman, Elspeth F.; Ginn, Helen Mary; Ravelli, Raimond B. G.; Carmichael, Ian; Kneale, Geoff; McGeehan, John E.

    2015-01-01

    Quantitative X-ray induced radiation damage studies employing a model protein–DNA complex revealed a striking partition of damage sites. The DNA component was observed to be far more resistant to specific damage than the protein. Significant progress has been made in macromolecular crystallography over recent years in both the understanding and mitigation of X-ray induced radiation damage when collecting diffraction data from crystalline proteins. In contrast, despite the large field that is productively engaged in the study of the radiation chemistry of nucleic acids, particularly of DNA, there are currently very few X-ray crystallographic studies on radiation damage mechanisms in nucleic acids. Quantitative comparison of damage to protein and DNA crystals separately is challenging, but many of the issues are circumvented by studying pre-formed biological nucleoprotein complexes, where a direct comparison of each component can be made under the same controlled conditions. Here the model protein–DNA complex C.Esp1396I is employed to investigate specific damage mechanisms for protein and DNA in a biologically relevant complex over a large dose range (2.07–44.63 MGy). In order to allow a quantitative analysis of radiation damage sites from a complex series of macromolecular diffraction data, a computational method has been developed that is generally applicable to the field. Typical specific damage was observed both for the protein on particular amino acids and for the DNA on, for example, the cleavage of base–sugar N1—C and sugar–phosphate C—O bonds. Strikingly, the DNA component was determined to be far more resistant to specific damage than the protein over the investigated dose range. At low doses the protein was observed to be susceptible to radiation damage, while the DNA was far more resistant, with damage only observed at significantly higher doses.
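The dose-series analysis described above can be illustrated with a deliberately simplified sketch: fitting a linear dose-response slope to a per-site damage indicator by ordinary least squares. Only the dose range is taken from the abstract; the indicator values and the fitting approach are invented for illustration and are not the computational method the authors developed.

```python
# Hypothetical sketch: fit a linear dose-response slope for a per-site
# damage indicator (e.g. fractional loss of electron density) over a dose
# series. Doses span the range quoted in the abstract (2.07-44.63 MGy);
# the damage-indicator values below are invented for illustration.

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return a, b

doses = [2.07, 10.0, 20.0, 30.0, 44.63]          # MGy
density_loss = [0.02, 0.10, 0.21, 0.29, 0.45]    # fractional, invented

intercept, slope = linear_fit(doses, density_loss)
print(f"damage rate ~ {slope:.4f} per MGy")
```

A per-site slope of this kind gives protein and DNA damage sites a common susceptibility scale for comparison across the dose range.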

  2. Practical macromolecular cryocrystallography

    Energy Technology Data Exchange (ETDEWEB)

    Pflugrath, J. W., E-mail: jim.pflugrath@gmail.com [Rigaku Americas Corp., 9009 New Trails Drive, The Woodlands, TX 77381 (United States)

    2015-05-27

    Current methods, reagents and experimental hardware for successfully and reproducibly flash-cooling macromolecular crystals to cryogenic temperatures for X-ray diffraction data collection are reviewed. Cryocrystallography is an indispensable technique that is routinely used for single-crystal X-ray diffraction data collection at temperatures near 100 K, where radiation damage is mitigated. Modern procedures and tools to cryoprotect and rapidly cool macromolecular crystals with a significant solvent fraction to below the glass-transition phase of water are reviewed. Reagents and methods to help prevent the stresses that damage crystals when flash-cooling are described. A method of using isopentane to assess whether cryogenic temperatures have been preserved when dismounting screened crystals is also presented.

  3. Detection of multiple damages employing best achievable eigenvectors under Bayesian inference

    Science.gov (United States)

    Prajapat, Kanta; Ray-Chaudhuri, Samit

    2018-05-01

    A novel approach is presented in this work to localize simultaneously multiple damaged elements in a structure, along with the estimation of damage severity for each of the damaged elements. For the detection of damaged elements, a best achievable eigenvector based formulation has been derived. To deal with noisy data, Bayesian inference is employed in the formulation, wherein the likelihood of the Bayesian algorithm is formed on the basis of errors between the best achievable eigenvectors and the measured modes. In this approach, the most probable damage locations are evaluated under Bayesian inference by generating combinations of various possible damaged elements. Once damage locations are identified, damage severities are estimated using Bayesian inference with Markov chain Monte Carlo simulation. The efficiency of the proposed approach has been demonstrated by carrying out a numerical study involving a 12-story shear building. It has been found from this study that damage scenarios involving as little as 10% loss of stiffness in multiple elements are accurately determined (localized and severities quantified) even when 2% noise-contaminated modal data are utilized. Further, this study introduces a term, 'parameter impact' (evaluated from the sensitivity of modal parameters to structural parameters), to decide the suitability of selecting a particular mode if some idea about the damaged elements is available. It has been demonstrated here that the accuracy and efficiency of the Bayesian quantification algorithm increase if damage localization is carried out a priori. An experimental study involving a laboratory-scale shear building and different stiffness modification scenarios shows that the proposed approach is efficient enough to localize the stories with stiffness modification.
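The severity-estimation step can be sketched with a heavily simplified, hypothetical example: Metropolis-Hastings sampling of a single fractional-stiffness-loss parameter for a one-degree-of-freedom oscillator from one noisy frequency measurement. The model, numbers, prior, and noise level are all assumptions for illustration; the paper's actual formulation works with best achievable eigenvectors over multiple elements.

```python
# Minimal sketch (not the authors' code): Bayesian estimation of a single
# damage-severity parameter d (fractional stiffness loss) from a noisy
# natural-frequency measurement, using Metropolis-Hastings sampling.
import math
import random

random.seed(0)

K0, M = 1.0e6, 1.0e3    # undamaged stiffness (N/m) and mass (kg), assumed
TRUE_D = 0.10           # 10% stiffness loss, matching the study's scenarios
SIGMA = 0.02            # measurement noise std (rad/s), assumed

def omega(d):
    """Natural frequency of a 1-DOF oscillator with stiffness K0*(1-d)."""
    return math.sqrt(K0 * (1.0 - d) / M)

observed = omega(TRUE_D) + 0.01  # one noisy measurement (noise fixed for demo)

def log_post(d):
    """Log-posterior: uniform prior on [0, 1), Gaussian likelihood."""
    if not 0.0 <= d < 1.0:
        return -math.inf
    return -0.5 * ((observed - omega(d)) / SIGMA) ** 2

samples, d = [], 0.5
for _ in range(20000):
    prop = d + random.gauss(0.0, 0.02)      # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(d):
        d = prop
    samples.append(d)

burned = samples[5000:]                     # discard burn-in
est = sum(burned) / len(burned)
print(f"posterior mean damage severity ~ {est:.3f}")
```

The posterior mean lands near the true 10% stiffness loss, mirroring the scale of damage the study reports as detectable.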

  4. Progress in rational methods of cryoprotection in macromolecular crystallography

    International Nuclear Information System (INIS)

    Alcorn, Thomas; Juers, Douglas H.

    2010-01-01

    Measurements of the average thermal contractions (294→72 K) of 26 different cryosolutions are presented and discussed in conjunction with other recent advances in the rational design of protocols for cryogenic cooling in macromolecular crystallography. Cryogenic cooling of macromolecular crystals is commonly used for X-ray data collection both to reduce crystal damage from radiation and to gather functional information by cryogenically trapping intermediates. However, the cooling process can damage the crystals. Limiting cooling-induced crystal damage often requires cryoprotection strategies, which can involve substantial screening of solution conditions and cooling protocols. Here, recent developments directed towards rational methods for cryoprotection are described. Crystal damage is described in the context of the temperature response of the crystal as a thermodynamic system. As such, the internal and external parts of the crystal typically have different cryoprotection requirements. A key physical parameter, the thermal contraction, of 26 different cryoprotective solutions was measured between 294 and 72 K. The range of contractions was 2–13%, with the more polar cryosolutions contracting less. The potential uses of these results in the development of cryocooling conditions, as well as recent developments in determining minimum cryosolution soaking times, are discussed.

  5. Bayesian inference method for stochastic damage accumulation modeling

    International Nuclear Information System (INIS)

    Jiang, Xiaomo; Yuan, Yong; Liu, Xian

    2013-01-01

    Damage accumulation based reliability models play an increasingly important role in the successful realization of condition based maintenance for complicated engineering systems. This paper develops a Bayesian framework to establish a stochastic damage accumulation model from historical inspection data, considering data uncertainty. A proportional hazards modeling technique is developed to model the nonlinear effect of multiple influencing factors on system reliability. Different from other hazard modeling techniques, such as the normal linear regression model, the approach does not require any distribution assumption for the hazard model and can be applied to a wide variety of distribution models. A Bayesian network is created to represent the nonlinear proportional hazards models and to estimate model parameters by Bayesian inference with Markov chain Monte Carlo simulation. Both qualitative and quantitative approaches are developed to assess the validity of the established damage accumulation model. The Anderson–Darling goodness-of-fit test is employed to perform the normality test, and the Box–Cox transformation approach is utilized to convert non-normal data into a normal distribution for hypothesis testing in quantitative model validation. The methodology is illustrated with seepage data collected from real-world subway tunnels.
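The Box–Cox step can be sketched with a toy example (not the paper's implementation): choose the transformation exponent λ by maximizing the profile log-likelihood over a grid, which should select λ near 0 (the log transform) for log-normally distributed inspection data. The data below are synthetic.

```python
# Illustrative sketch: Box-Cox transformation with lambda chosen by a
# grid search over the profile log-likelihood, as one might do before a
# normality-based hypothesis test on skewed inspection data.
import math
import random

random.seed(1)
data = [math.exp(random.gauss(0.0, 0.5)) for _ in range(200)]  # log-normal

def boxcox(x, lam):
    """Box-Cox transform of a single positive value."""
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def profile_loglik(xs, lam):
    """Profile log-likelihood of the Box-Cox model at a given lambda."""
    ys = [boxcox(x, lam) for x in xs]
    n = len(ys)
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / n
    return -0.5 * n * math.log(var) + (lam - 1.0) * sum(math.log(x) for x in xs)

grid = [i / 10.0 for i in range(-20, 21)]        # lambda in [-2, 2]
best = max(grid, key=lambda lam: profile_loglik(data, lam))
print(f"best lambda ~ {best:.1f}")               # near 0 for log-normal data
```

In practice a normality test (such as Anderson–Darling, as in the paper) would then be applied to the transformed values.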

  6. Inference Generation during Text Comprehension by Adults with Right Hemisphere Brain Damage: Activation Failure Versus Multiple Activation.

    Science.gov (United States)

    Tompkins, Connie A.; Fassbinder, Wiltrud; Blake, Margaret Lehman; Baumgaertner, Annette; Jayaram, Nandini

    2004-01-01

    Evidence conflicts as to whether adults with right hemisphere brain damage (RHD) generate inferences during text comprehension. M. Beeman (1993) reported that adults with RHD fail to activate the lexical-semantic bases of routine bridging inferences, which are necessary for comprehension. But other evidence indicates that adults…

  7. Macromolecular therapeutics.

    Science.gov (United States)

    Yang, Jiyuan; Kopeček, Jindřich

    2014-09-28

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone-degradable polymer-drug conjugates, as well as the development of a new paradigm in nanomedicines - (low molecular weight) drug-free macromolecular therapeutics - are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are considered. Finally, the future perspectives of the field are briefly outlined. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Effects of far-ultraviolet radiation and oxygen on macromolecular synthesis and protein induction in Bacteroides fragilis BF-2

    International Nuclear Information System (INIS)

    Schumann, J.P.

    1983-11-01

    The study deals with the effects of far-UV radiation, oxygen and hydrogen peroxide on macromolecular synthesis and viability in the obligate anaerobe Bacteroides fragilis, as well as the specific proteins induced in this organism by these different DNA damaging agents. Irradiation of Bacteroides fragilis cells with far-UV light (254 nm) under anaerobic conditions resulted in the immediate, rapid and extensive degradation of DNA, which continued for 40 to 60 min after irradiation. DNA degradation after irradiation was inhibited by chloramphenicol and caffeine. RNA and protein synthesis were decreased by UV irradiation and the degree of inhibition was proportional to the UV dose. Colony formation was not affected immediately by UV irradiation and continued for a dose-dependent period prior to inhibition. The relationships between the DNA damage-induced proteins, macromolecular synthesis in damaged B. fragilis cells and the observed physiological responses and inducible repair phenomena after the different DNA damaging treatments in this anaerobe are discussed.

  9. Indicators of Macromolecular Oxidative Damage and Antioxidant Defence in Examinees Exposed to Radar Frequencies 1.5–10.9 GHz

    International Nuclear Information System (INIS)

    Marjanovic, A.M.; Flajs, D.; Pavicic, I.; Domijan, A.

    2011-01-01

    Radar is an object-detection system that uses microwaves (Mw). With the increased use of radar there is rising concern regarding the health effects of Mw radiation on the human body. Living organisms are complex electrochemical systems that evolved within a relatively narrow range of well-defined environmental parameters. For life to be maintained these parameters must be kept within their normal range, since deviations can induce biochemical effects causing cell function impairment and disease. Some theories indicate a connection between Mw radiation, oxidative damage and the antioxidant defence of the organism. The aim of this study was to evaluate the level of damage to macromolecular structures - proteins and lipids - in the blood of men occupationally exposed to Mw radiation. The concentration of glutathione (GSH), a known indicator of the organism's antioxidant defence, was also determined. Blood samples were collected from 27 male workers occupationally exposed to radar frequencies of 1.5 to 10.9 GHz. A corresponding control group (N = 8) was part of the study. Concentrations of total and oxidised proteins, protein carbonyls, and GSH were measured spectrophotometrically, while malondialdehyde (MDA), a product of lipid peroxidation, was determined by high-performance liquid chromatography (HPLC). Measured concentrations of oxidised proteins, GSH and MDA were expressed relative to total proteins. The concentration of oxidised proteins did not differ significantly between the control and exposed groups. However, the concentration of GSH in the exposed group was considerably decreased, while the concentration of MDA was increased. The results indicate that Mw radiation from radar operating at frequencies of 1.5–10.9 GHz could cause damage to proteins and lipids, in addition to impairing the antioxidant defence of the organism. (author)

  10. Macromolecular crystallography using synchrotron radiation

    International Nuclear Information System (INIS)

    Bartunik, H.D.; Phillips, J.C.; Fourme, R.

    1982-01-01

    The use of synchrotron X-ray sources in macromolecular crystallography is described. The properties of synchrotron radiation relevant to macromolecular crystallography are examined. The applications discussed include anomalous dispersion techniques, the acquisition of normal and high resolution data, and kinetic studies of structural changes in macromolecules; protein data are presented illustrating these applications. The apparatus used is described including information on the electronic detectors, the monitoring of the incident beam and crystal cooling. (U.K.)

  11. Macromolecular crystallization in microgravity

    International Nuclear Information System (INIS)

    Snell, Edward H; Helliwell, John R

    2005-01-01

    Density difference fluid flows and sedimentation of growing crystals are greatly reduced when crystallization takes place in a reduced gravity environment. In the case of macromolecular crystallography a crystal of a biological macromolecule is used for diffraction experiments (X-ray or neutron) so as to determine the three-dimensional structure of the macromolecule. The better the internal order of the crystal, the greater the molecular structural detail that can be extracted. It is this structural information that enables an understanding of how the molecule functions. This knowledge is changing the biological and chemical sciences, with major potential in understanding disease pathologies. In this review, we examine the use of microgravity as an environment to grow macromolecular crystals. We describe the crystallization procedures used on the ground, how the resulting crystals are studied and the knowledge obtained from those crystals. We address the features desired in an ordered crystal and the techniques used to evaluate those features in detail. We then introduce the microgravity environment, the techniques to access that environment and the theory and evidence behind the use of microgravity for crystallization experiments. We describe how ground-based laboratory techniques have been adapted to microgravity flights and look at some of the methods used to analyse the resulting data. Several case studies illustrate the physical crystal quality improvements and the macromolecular structural advances. Finally, limitations and alternatives to microgravity and future directions for this research are covered. Macromolecular structural crystallography in general is a remarkable field where physics, biology, chemistry and mathematics meet to enable insight into the fundamentals of life. As the reader will see, there is a great deal of physics involved when the microgravity environment is applied to crystallization, some of it known, and undoubtedly much yet to be discovered.

  12. Sequential recovery of macromolecular components of the nucleolus.

    Science.gov (United States)

    Bai, Baoyan; Laiho, Marikki

    2015-01-01

    The nucleolus is involved in a number of cellular processes of importance to cell physiology and pathology, including cell stress responses and malignancies. Studies of macromolecular composition of the nucleolus depend critically on the efficient extraction and accurate quantification of all macromolecular components (e.g., DNA, RNA, and protein). We have developed a TRIzol-based method that efficiently and simultaneously isolates these three macromolecular constituents from the same sample of purified nucleoli. The recovered and solubilized protein can be accurately quantified by the bicinchoninic acid assay and assessed by polyacrylamide gel electrophoresis or by mass spectrometry. We have successfully applied this approach to extract and quantify the responses of all three macromolecular components in nucleoli after drug treatments of HeLa cells, and conducted RNA-Seq analysis of the nucleolar RNA.

  13. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    International Nuclear Information System (INIS)

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L.; Armour, Wes; Waterman, David G.; Iwata, So; Evans, Gwyndaf

    2013-01-01

    A systematic approach to the scaling and merging of data from multiple crystals in macromolecular crystallography is introduced and explained. The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein.
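The idea behind multi-crystal clustering can be sketched with a toy example: group data sets by the similarity of their unit-cell parameters, so that only compatible crystals are merged. This is a hypothetical illustration of the concept only; BLEND's actual algorithm, distance metric and linkage criteria are not reproduced here, and the cell parameters below are invented.

```python
# Toy sketch of unit-cell-based clustering of data sets from multiple
# crystals: data sets whose cells are similar are grouped for merging.
import math

# (a, b, c) cell edges in angstroms for six hypothetical data sets
cells = {
    "xtal1": (78.1, 78.1, 37.0),
    "xtal2": (78.3, 78.2, 37.1),
    "xtal3": (78.2, 78.0, 37.0),
    "xtal4": (80.5, 80.4, 38.2),   # a second, distinct cluster
    "xtal5": (80.6, 80.6, 38.3),
    "xtal6": (80.4, 80.5, 38.1),
}

def cluster(points, cutoff):
    """Greedy single-linkage grouping: join sets closer than cutoff."""
    groups = []
    for name, cell in points.items():
        for g in groups:
            if any(math.dist(cell, points[m]) < cutoff for m in g):
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

groups = cluster(cells, cutoff=1.0)
print(groups)   # two groups: xtal1-3 and xtal4-6
```

Only data sets within the same group would then be scaled and merged together, avoiding the non-isomorphism that degrades merged statistics.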

  14. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Foadi, James [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Imperial College, London SW7 2AZ (United Kingdom); Aller, Pierre [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Alguel, Yilmaz; Cameron, Alex [Imperial College, London SW7 2AZ (United Kingdom); Axford, Danny; Owen, Robin L. [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Armour, Wes [Oxford e-Research Centre (OeRC), Keble Road, Oxford OX1 3QG (United Kingdom); Waterman, David G. [Research Complex at Harwell (RCaH), Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0FA (United Kingdom); Iwata, So [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Imperial College, London SW7 2AZ (United Kingdom); Evans, Gwyndaf, E-mail: gwyndaf.evans@diamond.ac.uk [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom)

    2013-08-01

    A systematic approach to the scaling and merging of data from multiple crystals in macromolecular crystallography is introduced and explained. The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein.

  15. Repair of radiation damage in mammalian cells

    Energy Technology Data Exchange (ETDEWEB)

    Setlow, R.B.

    1981-01-01

    The responses, such as survival, mutation, and carcinogenesis, of mammalian cells and tissues to radiation are dependent not only on the magnitude of the damage to macromolecular structures - DNA, RNA, protein, and membranes - but on the rates of macromolecular syntheses of cells relative to the half-lives of the damages. Cells possess a number of mechanisms for repairing damage to DNA. If the repair systems are rapid and error free, cells can tolerate much larger doses than if repair is slow or error prone. It is important to understand the effects of radiation and the repair of radiation damage because there exist reasonable amounts of epidemiological data that permit the construction of dose-response curves for humans. The shapes of such curves or the magnitude of the response will depend on repair. Radiation damage is emphasized because: (a) radiation dosimetry, with all its uncertainties for populations, is excellent compared to chemical dosimetry; (b) a number of cancer-prone diseases are known in which there are defects in DNA repair and radiation results in more chromosomal damage in cells from such individuals than in cells from normal individuals; (c) in some cases, specific radiation products in DNA have been correlated with biological effects, and (d) many chemical effects seem to mimic radiation effects. A further reason for emphasizing damage to DNA is the wealth of experimental evidence indicating that damages to DNA can be initiating events in carcinogenesis.

  16. Repair of radiation damage in mammalian cells

    International Nuclear Information System (INIS)

    Setlow, R.B.

    1981-01-01

    The responses, such as survival, mutation, and carcinogenesis, of mammalian cells and tissues to radiation are dependent not only on the magnitude of the damage to macromolecular structures - DNA, RNA, protein, and membranes - but on the rates of macromolecular syntheses of cells relative to the half-lives of the damages. Cells possess a number of mechanisms for repairing damage to DNA. If the repair systems are rapid and error free, cells can tolerate much larger doses than if repair is slow or error prone. It is important to understand the effects of radiation and the repair of radiation damage because there exist reasonable amounts of epidemiological data that permit the construction of dose-response curves for humans. The shapes of such curves or the magnitude of the response will depend on repair. Radiation damage is emphasized because: (a) radiation dosimetry, with all its uncertainties for populations, is excellent compared to chemical dosimetry; (b) a number of cancer-prone diseases are known in which there are defects in DNA repair and radiation results in more chromosomal damage in cells from such individuals than in cells from normal individuals; (c) in some cases, specific radiation products in DNA have been correlated with biological effects, and (d) many chemical effects seem to mimic radiation effects. A further reason for emphasizing damage to DNA is the wealth of experimental evidence indicating that damages to DNA can be initiating events in carcinogenesis.

  17. The role of macromolecular stability in desiccation tolerance

    NARCIS (Netherlands)

    Wolkers, W.F.

    1998-01-01

    The work presented in this thesis concerns a study on the molecular interactions that play a role in the macromolecular stability of desiccation-tolerant higher plant organs. Fourier transform infrared microspectroscopy was used as the main experimental technique to assess macromolecular

  18. The design of macromolecular crystallography diffraction experiments

    International Nuclear Information System (INIS)

    Evans, Gwyndaf; Axford, Danny; Owen, Robin L.

    2011-01-01

    Thoughts about the decisions made in designing macromolecular X-ray crystallography experiments at synchrotron beamlines are presented. The measurement of X-ray diffraction data from macromolecular crystals for the purpose of structure determination is the convergence of two processes: the preparation of diffraction-quality crystal samples on the one hand and the construction and optimization of an X-ray beamline and end station on the other. Like sample preparation, a macromolecular crystallography beamline is geared to obtaining the best possible diffraction measurements from crystals provided by the synchrotron user. This paper describes the thoughts behind an experiment that fully exploits both the sample and the beamline and how these map into everyday decisions that users can and should make when visiting a beamline with their most precious crystals.

  19. Macromolecular crystallography beamline X25 at the NSLS

    Energy Technology Data Exchange (ETDEWEB)

    Héroux, Annie; Allaire, Marc; Buono, Richard; Cowan, Matthew L.; Dvorak, Joseph; Flaks, Leon; LaMarra, Steven; Myers, Stuart F.; Orville, Allen M.; Robinson, Howard H.; Roessler, Christian G.; Schneider, Dieter K.; Shea-McCarthy, Grace; Skinner, John M.; Skinner, Michael; Soares, Alexei S.; Sweet, Robert M.; Berman, Lonny E., E-mail: berman@bnl.gov [Brookhaven National Laboratory, PO Box 5000, Upton, NY 11973-5000 (United States)

    2014-04-08

    A description of the upgraded beamline X25 at the NSLS, operated by the PXRR and the Photon Sciences Directorate serving the Macromolecular Crystallography community, is presented. Beamline X25 at the NSLS is one of the five beamlines dedicated to macromolecular crystallography operated by the Brookhaven National Laboratory Macromolecular Crystallography Research Resource group. This mini-gap insertion-device beamline has seen constant upgrades for the last seven years in order to achieve mini-beam capability down to 20 µm × 20 µm. All major components beginning with the radiation source, and continuing along the beamline and its experimental hutch, have changed to produce a state-of-the-art facility for the scientific community.

  20. Macromolecular crystallography beamline X25 at the NSLS

    International Nuclear Information System (INIS)

    Héroux, Annie; Allaire, Marc; Buono, Richard; Cowan, Matthew L.; Dvorak, Joseph; Flaks, Leon; LaMarra, Steven; Myers, Stuart F.; Orville, Allen M.; Robinson, Howard H.; Roessler, Christian G.; Schneider, Dieter K.; Shea-McCarthy, Grace; Skinner, John M.; Skinner, Michael; Soares, Alexei S.; Sweet, Robert M.; Berman, Lonny E.

    2014-01-01

    A description of the upgraded beamline X25 at the NSLS, operated by the PXRR and the Photon Sciences Directorate serving the Macromolecular Crystallography community, is presented. Beamline X25 at the NSLS is one of the five beamlines dedicated to macromolecular crystallography operated by the Brookhaven National Laboratory Macromolecular Crystallography Research Resource group. This mini-gap insertion-device beamline has seen constant upgrades for the last seven years in order to achieve mini-beam capability down to 20 µm × 20 µm. All major components beginning with the radiation source, and continuing along the beamline and its experimental hutch, have changed to produce a state-of-the-art facility for the scientific community.

  1. Macromolecular crowding directs extracellular matrix organization and mesenchymal stem cell behavior.

    Directory of Open Access Journals (Sweden)

    Adam S Zeiger

    Microenvironments of biological cells are dominated in vivo by macromolecular crowding and resultant excluded volume effects. This feature is absent in dilute in vitro cell culture. Here, we induced macromolecular crowding in vitro by using synthetic macromolecular globules of nm-scale radius at physiological levels of fractional volume occupancy. We quantified the impact of induced crowding on the extracellular and intracellular protein organization of human mesenchymal stem cells (MSCs) via immunocytochemistry, atomic force microscopy (AFM), and AFM-enabled nanoindentation. Macromolecular crowding in extracellular culture media directly induced supramolecular assembly and alignment of extracellular matrix proteins deposited by cells, which in turn increased alignment of the intracellular actin cytoskeleton. The resulting cell-matrix reciprocity further affected adhesion, proliferation, and migration behavior of MSCs. Macromolecular crowding can thus aid the design of more physiologically relevant in vitro studies and devices for MSCs and other cells, by increasing the fidelity between materials synthesized by cells in vivo and in vitro.

  2. Macromolecular crowding directs extracellular matrix organization and mesenchymal stem cell behavior.

    Science.gov (United States)

    Zeiger, Adam S; Loe, Felicia C; Li, Ran; Raghunath, Michael; Van Vliet, Krystyn J

    2012-01-01

    Microenvironments of biological cells are dominated in vivo by macromolecular crowding and resultant excluded volume effects. This feature is absent in dilute in vitro cell culture. Here, we induced macromolecular crowding in vitro by using synthetic macromolecular globules of nm-scale radius at physiological levels of fractional volume occupancy. We quantified the impact of induced crowding on the extracellular and intracellular protein organization of human mesenchymal stem cells (MSCs) via immunocytochemistry, atomic force microscopy (AFM), and AFM-enabled nanoindentation. Macromolecular crowding in extracellular culture media directly induced supramolecular assembly and alignment of extracellular matrix proteins deposited by cells, which in turn increased alignment of the intracellular actin cytoskeleton. The resulting cell-matrix reciprocity further affected adhesion, proliferation, and migration behavior of MSCs. Macromolecular crowding can thus aid the design of more physiologically relevant in vitro studies and devices for MSCs and other cells, by increasing the fidelity between materials synthesized by cells in vivo and in vitro.

  3. Nitrogen isotopic composition of macromolecular organic matter in interplanetary dust particles

    Science.gov (United States)

    Aléon, Jérôme; Robert, François; Chaussidon, Marc; Marty, Bernard

    2003-10-01

    Nitrogen concentrations and isotopic compositions were measured by ion microprobe scanning imaging in two interplanetary dust particles, L2021 K1 and L2036 E22, in which imaging of D/H and C/H ratios has previously revealed the presence of D-rich macromolecular organic components. High nitrogen concentrations of 10-20 wt% and δ15N values up to +400‰ are observed in these D-rich macromolecular components. The previous study of D/H and C/H ratios revealed three different D-rich macromolecular phases. The one previously ascribed to macromolecular organic matter akin to the insoluble organic matter (IOM) from carbonaceous chondrites is enriched in nitrogen by one order of magnitude compared with carbonaceous chondrite IOM, although its isotopic composition is still similar to what is known from Renazzo (δ15N = +208‰). The correlation observed in macromolecular organic material between the D- and 15N-excesses suggests that the latter probably originate from chemical reactions typical of the cold interstellar medium. These interstellar materials, preserved to some extent in IDPs, are therefore macromolecular organic components with various aliphaticity and aromaticity. They are heavily N-heterosubstituted, as shown by their high nitrogen concentrations >10 wt%. They have high D/H ratios >10⁻³ and δ15N values ≥ +400‰. In L2021 K1 a mixture is observed at the micron scale between interstellar and chondritic-like organic phases. This indicates that some IDPs contain organic materials processed at various heliocentric distances in a turbulent nebula. Comparison with observations in comets suggests that these molecules may be cometary macromolecules. A correlation is observed between the D/H ratios and δ15N values of macromolecular organic matter from IDPs, meteorites and the Earth, and of major nebular reservoirs. This suggests that most macromolecular organic matter in the inner solar system probably derived from interstellar precursors and was further processed.

  4. Analytical model for macromolecular partitioning during yeast cell division

    International Nuclear Information System (INIS)

    Kinkhabwala, Ali; Khmelinskii, Anton; Knop, Michael

    2014-01-01

    Asymmetric cell division, whereby a parent cell generates two sibling cells with unequal content and thereby distinct fates, is central to cell differentiation, organism development and ageing. Unequal partitioning of the macromolecular content of the parent cell — which includes proteins, DNA, RNA, large proteinaceous assemblies and organelles — can be achieved by both passive (e.g. diffusion, localized retention sites) and active (e.g. motor-driven transport) processes operating in the presence of external polarity cues, internal asymmetries, spontaneous symmetry breaking, or stochastic effects. However, the quantitative contribution of different processes to the partitioning of macromolecular content is difficult to evaluate. Here we developed an analytical model that allows rapid quantitative assessment of partitioning as a function of various parameters in the budding yeast Saccharomyces cerevisiae. This model exposes quantitative degeneracies among the physical parameters that govern macromolecular partitioning, and reveals regions of the solution space where diffusion is sufficient to drive asymmetric partitioning and regions where asymmetric partitioning can only be achieved through additional processes such as motor-driven transport. Application of the model to different macromolecular assemblies suggests that partitioning of protein aggregates and episomes, but not prions, is diffusion-limited in yeast, consistent with previous reports. In contrast to computationally intensive stochastic simulations of particular scenarios, our analytical model provides an efficient and comprehensive overview of partitioning as a function of global and macromolecule-specific parameters. Identification of quantitative degeneracies among these parameters highlights the importance of their careful measurement for a given macromolecular species in order to understand the dominant processes responsible for its observed partitioning
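
    The diffusion-limited regime discussed in this abstract can be illustrated with a toy calculation (a hedged sketch, not the paper's analytical model; the function names, the numerical values and the L²/6D time-scale criterion below are illustrative assumptions):

```python
import math

def equilibrium_bud_fraction(v_bud, v_mother):
    """Fraction of a freely diffusing species expected in the bud if
    diffusion fully equilibrates concentrations before cytokinesis."""
    return v_bud / (v_bud + v_mother)

def is_diffusion_limited(d_um2_per_s, length_um, cycle_s):
    """Crude criterion: partitioning is diffusion-limited when the
    characteristic 3-D exploration time L^2 / (6 D) exceeds the
    duration of the cell cycle."""
    return length_um ** 2 / (6.0 * d_um2_per_s) > cycle_s

# A bud one third of the mother's volume captures 25% of a well-mixed species.
frac = equilibrium_bud_fraction(1.0, 3.0)
# A large aggregate (assumed D ~ 1e-4 um^2/s) cannot equilibrate across
# ~5 um within a ~90 min cycle, whereas a free protein (D ~ 1 um^2/s) can.
slow = is_diffusion_limited(1e-4, 5.0, 5400.0)
fast = is_diffusion_limited(1.0, 5.0, 5400.0)
```

    Under these assumed numbers the aggregate falls in the diffusion-limited regime and the free protein does not, mirroring the paper's distinction between diffusion-limited and actively partitioned species.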

  5. What Macromolecular Crowding Can Do to a Protein

    Science.gov (United States)

    Kuznetsova, Irina M.; Turoverov, Konstantin K.; Uversky, Vladimir N.

    2014-01-01

    The intracellular environment represents an extremely crowded milieu, with a limited amount of free water and an almost complete lack of unoccupied space. Obviously, slightly salted aqueous solutions containing low concentrations of a biomolecule of interest are too simplistic to mimic the “real life” situation, where the biomolecule of interest scrambles and wades through the tightly packed crowd. In laboratory practice, such macromolecular crowding is typically mimicked by concentrated solutions of various polymers that serve as model “crowding agents”. Studies under these conditions revealed that macromolecular crowding might affect protein structure, folding, shape, conformational stability, binding of small molecules, enzymatic activity, protein-protein interactions, protein-nucleic acid interactions, and pathological aggregation. The goal of this review is to systematically analyze currently available experimental data on the variety of effects of macromolecular crowding on a protein molecule. The review covers more than 320 papers and therefore represents one of the most comprehensive compendia of the current knowledge in this exciting area. PMID:25514413

  6. Macromolecular nanotheranostics for multimodal anticancer therapy

    Science.gov (United States)

    Huis in't Veld, Ruben; Storm, Gert; Hennink, Wim E.; Kiessling, Fabian; Lammers, Twan

    2011-10-01

    Macromolecular carrier materials based on N-(2-hydroxypropyl)methacrylamide (HPMA) are prototypic and well-characterized drug delivery systems that have been extensively evaluated in the past two decades, both at the preclinical and at the clinical level. Using several different imaging agents and techniques, HPMA copolymers have been shown to circulate for prolonged periods of time, and to accumulate in tumors both effectively and selectively by means of the Enhanced Permeability and Retention (EPR) effect. Because of this, HPMA-based macromolecular nanotheranostics, i.e. formulations containing both drug and imaging agents within a single formulation, have been shown to be highly effective in inducing tumor growth inhibition in animal models. In patients, however, like essentially all other tumor-targeted nanomedicines, they generally only improve the therapeutic index of the attached active agent by lowering its toxicity, and fail to improve the efficacy of the intervention. Bearing this in mind, we have recently reasoned that because of their biocompatibility and their beneficial biodistribution, nanomedicine formulations might be highly suitable systems for combination therapies. In the present manuscript, we briefly summarize several exemplary efforts undertaken in this regard in our labs in the past couple of years, and we show that long-circulating and passively tumor-targeted macromolecular nanotheranostics can be used to improve the efficacy of radiochemotherapy and of chemotherapy combinations.

  7. Recent advances in macromolecular prodrugs

    DEFF Research Database (Denmark)

    Riber, Camilla Frich; Zelikin, Alexander N.

    2017-01-01

    Macromolecular prodrugs (MP) are high-molar-mass conjugates, typically carrying several copies of a drug or a drug combination, designed to optimize delivery of the drug, that is, its pharmacokinetics. Since their advent several decades ago, the design of MP has undergone significant development and es...

  8. Status and prospects of macromolecular crystallography

    Indian Academy of Sciences (India)

    technique that could be completely automated in most cases. ... major challenge in macromolecular crystallography today is ... tial characterization of crystals in the home source and make a ... opportunities for a generation of structural biolo-.

  9. A decade of user operation on the macromolecular crystallography MAD beamline ID14-4 at the ESRF

    International Nuclear Information System (INIS)

    McCarthy, Andrew A.; Brockhauser, Sandor; Nurizzo, Didier; Theveneau, Pascal; Mairs, Trevor; Spruce, Darren; Guijarro, Matias; Lesourd, Marc; Ravelli, Raimond B. G.; McSweeney, Sean

    2009-01-01

    The improvement of the X-ray beam quality achieved on ID14-4 by the installation of new X-ray optical elements is described. ID14-4 at the ESRF is the first tunable undulator-based macromolecular crystallography beamline that can celebrate a decade of user service. During this time ID14-4 has not only been instrumental in the determination of the structures of biologically important molecules but has also contributed significantly to the development of various instruments, novel data collection schemes and pioneering radiation damage studies on biological samples. Here, the evolution of ID14-4 over the last decade is presented, and some of the major improvements that were carried out in order to maintain its status as one of the most productive macromolecular crystallography beamlines are highlighted. The experimental hutch has been upgraded to accommodate a high-precision diffractometer, a sample changer and a large CCD detector. More recently, the optical hutch has been refurbished in order to improve the X-ray beam quality on ID14-4 and to incorporate the most modern and robust optical elements used at other ESRF beamlines. These new optical elements will be described and their effect on beam stability discussed. These studies may be useful in the design, construction and maintenance of future X-ray beamlines for macromolecular crystallography and indeed other applications, such as those planned for the ESRF upgrade

  10. Automated data collection for macromolecular crystallography.

    Science.gov (United States)

    Winter, Graeme; McAuley, Katherine E

    2011-09-01

    An overview, together with some practical advice, is presented of the current status of the automation of macromolecular crystallography (MX) data collection, with a focus on MX beamlines at Diamond Light Source, UK. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Celebrating macromolecular crystallography: A personal perspective

    Directory of Open Access Journals (Sweden)

    Abad-Zapatero, Celerino

    2015-04-01

    The twentieth century has seen an enormous advance in the knowledge of the atomic structures that surround us. The discovery of the first crystal structures of simple inorganic salts by the Braggs in 1914, using the diffraction of X-rays by crystals, provided the critical elements to unveil the atomic structure of matter. Subsequent developments in the field leading to macromolecular crystallography are presented with a personal perspective, related to the cultural milieu of Spain in the late 1950s. The journey of discovery of the author, as he developed professionally, is interwoven with the expansion of macromolecular crystallography from the first proteins (myoglobin, hemoglobin) to the 'coming of age' of the field in 1971 and the discoveries that followed, culminating in the determination of the structure of the ribosome at the turn of the century. A perspective is presented exploring the future of the field and also a reflection about the future generations of Spanish scientists.

  12. Structure studies of macromolecular systems

    Czech Academy of Sciences Publication Activity Database

    Hašek, Jindřich; Dohnálek, Jan; Skálová, Tereza; Dušková, Jarmila; Kolenko, Petr

    2006-01-01

    Roč. 13, č. 3 (2006), s. 136 ISSN 1211-5894. [Czech and Slovak Crystallographic Colloquium. 22.06.2006-24.06.2006, Grenoble] R&D Projects: GA AV ČR IAA4050811; GA MŠk 1K05008 Keywords: structure * X-ray diffraction * synchrotron Subject RIV: CD - Macromolecular Chemistry http://www.xray.cz/ms/default.htm

  13. Control of Macromolecular Architectures for Renewable Polymers: Case Studies

    Science.gov (United States)

    Tang, Chuanbing

    The development of sustainable polymers from natural biomass is growing, but they face fierce competition from existing petrochemical-based counterparts. Controlling macromolecular architectures to maximize the properties of renewable polymers is a desirable approach to gaining an advantage. Given the complexity of biomass, special considerations beyond traditional design are needed. In this presentation, I will discuss a few case studies on how macromolecular architectures can tune the properties of sustainable bioplastics and elastomers derived from renewable biomass such as resin acids (natural rosin) and plant oils.

  14. Hypoxic tumor environments exhibit disrupted collagen I fibers and low macromolecular transport.

    Directory of Open Access Journals (Sweden)

    Samata M Kakkad

    Hypoxic tumor microenvironments result in an aggressive phenotype and resistance to therapy that lead to tumor progression, recurrence, and metastasis. While poor vascularization and the resultant inadequate drug delivery are known to contribute to drug resistance, the effect of hypoxia on molecular transport through the interstitium, and the role of the extracellular matrix (ECM) in mediating this transport, are unexplored. The dense mesh of fibers present in the ECM can especially influence the movement of macromolecules. Collagen 1 (Col1) fibers form a key component of the ECM in breast cancers. Here we characterized the influence of hypoxia on macromolecular transport in tumors, and the role of Col1 fibers in mediating this transport, using an MDA-MB-231 breast cancer xenograft model engineered to express red fluorescent protein under hypoxia. Magnetic resonance imaging of macromolecular transport was combined with second harmonic generation microscopy of Col1 fibers. Hypoxic tumor regions displayed significantly decreased Col1 fiber density and volume, as well as significantly lower macromolecular draining and pooling rates, than normoxic regions. Regions adjacent to severely hypoxic areas revealed higher deposition of Col1 fibers and increased macromolecular transport. These data suggest that Col1 fibers may facilitate macromolecular transport in tumors, and their reduction in hypoxic regions may reduce this transport. Decreased macromolecular transport in hypoxic regions may also contribute to poor drug delivery and tumor recurrence in hypoxic regions. High Col1 fiber density observed around hypoxic regions may facilitate the escape of aggressive cancer cells from hypoxic regions.

  15. Macromolecular target prediction by self-organizing feature maps.

    Science.gov (United States)

    Schneider, Gisbert; Schneider, Petra

    2017-03-01

    Rational drug discovery would greatly benefit from a more nuanced appreciation of the activity of pharmacologically active compounds against a diverse panel of macromolecular targets. Already, computational target-prediction models assist medicinal chemists in library screening, de novo molecular design, optimization of active chemical agents, drug re-purposing, in the spotting of potential undesired off-target activities, and in the 'de-orphaning' of phenotypic screening hits. The self-organizing map (SOM) algorithm has been employed successfully for these and other purposes. Areas covered: The authors recapitulate contemporary artificial neural network methods for macromolecular target prediction, and present the basic SOM algorithm at a conceptual level. Specifically, they highlight consensus target-scoring by the employment of multiple SOMs, and discuss the opportunities and limitations of this technique. Expert opinion: Self-organizing feature maps represent a straightforward approach to ligand clustering and classification. Some of the appeal lies in their conceptual simplicity and broad applicability domain. Despite known algorithmic shortcomings, this computational target prediction concept has been proven to work in prospective settings with high success rates. It represents a prototypic technique for future advances in the in silico identification of the modes of action and macromolecular targets of bioactive molecules.
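
    The SOM concept at the heart of this approach, a grid of weight vectors that self-organizes so similar ligands map to nearby nodes, can be sketched in a few lines of numpy (an illustrative toy, not the authors' implementation; the grid size, decay schedules and the two-cluster demo data are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, sigma0=1.5):
    """Minimal self-organizing map: for each sample, find the best-matching
    unit (BMU) and pull it and its grid neighbours toward the sample,
    with learning rate and neighbourhood radius shrinking over time."""
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = max(sigma0 * (1 - t / epochs), 0.5)
        for x in data:
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), (h, w))
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    """Grid position of the node whose weight vector best matches x."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

# Two hypothetical 'ligand classes' as 2-D point clouds; after training
# they should map to different best-matching units (clustering).
class_a = rng.normal(0.0, 0.05, size=(20, 2))
class_b = rng.normal(1.0, 0.05, size=(20, 2))
som = train_som(np.vstack([class_a, class_b]))
```

    Consensus target-scoring with multiple SOMs, as highlighted in the review, would simply repeat this training with different initializations and combine the per-map assignments.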

  16. Stochastic reaction-diffusion algorithms for macromolecular crowding

    Science.gov (United States)

    Sturrock, Marc

    2016-06-01

    Compartment-based (lattice-based) reaction-diffusion algorithms are often used for studying complex stochastic spatio-temporal processes inside cells. In this paper the influence of macromolecular crowding on stochastic reaction-diffusion simulations is investigated. Reaction-diffusion processes are considered on two different kinds of compartmental lattice, a cubic lattice and a hexagonal close packed lattice, and solved using two different algorithms, the stochastic simulation algorithm and the spatiocyte algorithm (Arjunan and Tomita 2010 Syst. Synth. Biol. 4, 35-53). Obstacles (modelling macromolecular crowding) are shown to have substantial effects on the mean squared displacement and average number of molecules in the domain but the nature of these effects is dependent on the choice of lattice, with the cubic lattice being more susceptible to the effects of the obstacles. Finally, improvements for both algorithms are presented.
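
    The qualitative effect of obstacles on mean squared displacement can be reproduced with a minimal lattice walk (a hedged sketch, not the SSA or Spatiocyte implementations studied in the paper; the lattice size, walker counts and the move-rejection rule are assumptions):

```python
import random

def msd_random_walk(obstacle_frac, steps=200, walkers=200, size=30, seed=1):
    """Random walk on a periodic cubic lattice in which a fraction of sites
    is occupied by immobile obstacles (crowders); moves into an obstacle
    are rejected.  Returns the mean squared displacement (in lattice
    units) after `steps` attempted moves, averaged over all walkers."""
    rng = random.Random(seed)
    obstacles = {
        (x, y, z)
        for x in range(size) for y in range(size) for z in range(size)
        if rng.random() < obstacle_frac
    }
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    total = 0.0
    for _ in range(walkers):
        while True:  # start each walker on an obstacle-free site
            cur = tuple(rng.randrange(size) for _ in range(3))
            if cur not in obstacles:
                break
        disp = [0, 0, 0]  # unwrapped displacement, unaffected by periodicity
        for _ in range(steps):
            dx = rng.choice(moves)
            nxt = tuple((c + d) % size for c, d in zip(cur, dx))
            if nxt not in obstacles:
                cur = nxt
                for i in range(3):
                    disp[i] += dx[i]
        total += sum(d * d for d in disp)
    return total / walkers
```

    With no obstacles the mean squared displacement approaches the free-diffusion value (equal to the number of steps); at 30% obstacle occupancy it drops markedly, the behaviour the paper quantifies for its two lattice geometries.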

  17. In situ macromolecular crystallography using microbeams.

    Science.gov (United States)

    Axford, Danny; Owen, Robin L; Aishima, Jun; Foadi, James; Morgan, Ann W; Robinson, James I; Nettleship, Joanne E; Owens, Raymond J; Moraes, Isabel; Fry, Elizabeth E; Grimes, Jonathan M; Harlos, Karl; Kotecha, Abhay; Ren, Jingshan; Sutton, Geoff; Walter, Thomas S; Stuart, David I; Evans, Gwyndaf

    2012-05-01

    Despite significant progress in high-throughput methods in macromolecular crystallography, the production of diffraction-quality crystals remains a major bottleneck. By recording diffraction in situ from crystals in their crystallization plates at room temperature, a number of problems associated with crystal handling and cryoprotection can be side-stepped. Using a dedicated goniometer installed on the microfocus macromolecular crystallography beamline I24 at Diamond Light Source, crystals have been studied in situ with an intense and flexible microfocus beam, allowing weakly diffracting samples to be assessed without a manual crystal-handling step but with good signal to noise, despite the background scatter from the plate. A number of case studies are reported: the structure solution of bovine enterovirus 2, crystallization screening of membrane proteins and complexes, and structure solution from crystallization hits produced via a high-throughput pipeline. These demonstrate the potential for in situ data collection and structure solution with microbeams. © 2012 International Union of Crystallography

  18. Crowding-facilitated macromolecular transport in attractive micropost arrays.

    Science.gov (United States)

    Chien, Fan-Tso; Lin, Po-Keng; Chien, Wei; Hung, Cheng-Hsiang; Yu, Ming-Hung; Chou, Chia-Fu; Chen, Yeng-Long

    2017-05-02

    Our study of DNA dynamics in weakly attractive nanofabricated post arrays revealed crowding enhances polymer transport, contrary to hindered transport in repulsive medium. The coupling of DNA diffusion and adsorption to the microposts results in more frequent cross-post hopping and increased long-term diffusivity with increased crowding density. We performed Langevin dynamics simulations and found maximum long-term diffusivity in post arrays with gap sizes comparable to the polymer radius of gyration. We found that macromolecular transport in weakly attractive post arrays is faster than in non-attractive dense medium. Furthermore, we employed hidden Markov analysis to determine the transition of macromolecular adsorption-desorption on posts and hopping between posts. The apparent free energy barriers are comparable to theoretical estimates determined from polymer conformational fluctuations.
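
    The hidden Markov step, classifying each frame of a trajectory as adsorbed or free, can be sketched with a minimal Viterbi decoder (illustrative only; the state names, the discretised 'slow'/'fast' observations and all probabilities below are invented assumptions, not the authors' parameters):

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Minimal Viterbi decoder: most likely hidden-state sequence
    (e.g. 'adsorbed' vs 'free') for a discretised observation track,
    computed in log space for numerical stability."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + math.log(trans_p[p][s]))
            col[s] = V[-1][prev] + math.log(trans_p[prev][s]) + math.log(emit_p[s][o])
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# Hypothetical parameters: 'sticky' states, with slow frames emitted
# mostly by the adsorbed state and fast frames by the free state.
states = ("adsorbed", "free")
start_p = {"adsorbed": 0.5, "free": 0.5}
trans_p = {"adsorbed": {"adsorbed": 0.9, "free": 0.1},
           "free": {"adsorbed": 0.1, "free": 0.9}}
emit_p = {"adsorbed": {"slow": 0.9, "fast": 0.1},
          "free": {"slow": 0.2, "fast": 0.8}}
path = viterbi(["slow", "slow", "fast", "fast"], states, start_p, trans_p, emit_p)
```

    Decoded state durations from such a model yield the dwell times from which apparent adsorption-desorption free energy barriers can be estimated.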

  19. In situ macromolecular crystallography using microbeams

    International Nuclear Information System (INIS)

    Axford, Danny; Owen, Robin L.; Aishima, Jun; Foadi, James; Morgan, Ann W.; Robinson, James I.; Nettleship, Joanne E.; Owens, Raymond J.; Moraes, Isabel; Fry, Elizabeth E.; Grimes, Jonathan M.; Harlos, Karl; Kotecha, Abhay; Ren, Jingshan; Sutton, Geoff; Walter, Thomas S.; Stuart, David I.; Evans, Gwyndaf

    2012-01-01

    A sample environment for mounting crystallization trays has been developed on the microfocus beamline I24 at Diamond Light Source. The technical developments and several case studies are described. Despite significant progress in high-throughput methods in macromolecular crystallography, the production of diffraction-quality crystals remains a major bottleneck. By recording diffraction in situ from crystals in their crystallization plates at room temperature, a number of problems associated with crystal handling and cryoprotection can be side-stepped. Using a dedicated goniometer installed on the microfocus macromolecular crystallography beamline I24 at Diamond Light Source, crystals have been studied in situ with an intense and flexible microfocus beam, allowing weakly diffracting samples to be assessed without a manual crystal-handling step but with good signal to noise, despite the background scatter from the plate. A number of case studies are reported: the structure solution of bovine enterovirus 2, crystallization screening of membrane proteins and complexes, and structure solution from crystallization hits produced via a high-throughput pipeline. These demonstrate the potential for in situ data collection and structure solution with microbeams

  20. In situ macromolecular crystallography using microbeams

    Energy Technology Data Exchange (ETDEWEB)

    Axford, Danny; Owen, Robin L.; Aishima, Jun [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Foadi, James [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Imperial College, London SW7 2AZ (United Kingdom); Morgan, Ann W.; Robinson, James I. [University of Leeds, Leeds LS9 7FT (United Kingdom); Nettleship, Joanne E.; Owens, Raymond J. [Research Complex at Harwell, Rutherford Appleton Laboratory R92, Didcot, Oxfordshire OX11 0DE (United Kingdom); Moraes, Isabel [Imperial College, London SW7 2AZ (United Kingdom); Fry, Elizabeth E.; Grimes, Jonathan M.; Harlos, Karl; Kotecha, Abhay; Ren, Jingshan; Sutton, Geoff; Walter, Thomas S. [University of Oxford, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Stuart, David I. [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); University of Oxford, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Evans, Gwyndaf, E-mail: gwyndaf.evans@diamond.ac.uk [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom)

    2012-04-17

    A sample environment for mounting crystallization trays has been developed on the microfocus beamline I24 at Diamond Light Source. The technical developments and several case studies are described. Despite significant progress in high-throughput methods in macromolecular crystallography, the production of diffraction-quality crystals remains a major bottleneck. By recording diffraction in situ from crystals in their crystallization plates at room temperature, a number of problems associated with crystal handling and cryoprotection can be side-stepped. Using a dedicated goniometer installed on the microfocus macromolecular crystallography beamline I24 at Diamond Light Source, crystals have been studied in situ with an intense and flexible microfocus beam, allowing weakly diffracting samples to be assessed without a manual crystal-handling step but with good signal to noise, despite the background scatter from the plate. A number of case studies are reported: the structure solution of bovine enterovirus 2, crystallization screening of membrane proteins and complexes, and structure solution from crystallization hits produced via a high-throughput pipeline. These demonstrate the potential for in situ data collection and structure solution with microbeams.

  1. An acoustic on-chip goniometer for room temperature macromolecular crystallography.

    Science.gov (United States)

    Burton, C G; Axford, D; Edwards, A M J; Gildea, R J; Morris, R H; Newton, M I; Orville, A M; Prince, M; Topham, P D; Docker, P T

    2017-12-05

    This paper describes the design, development and successful use of an on-chip goniometer for room-temperature macromolecular crystallography via acoustically induced rotations. We present for the first time a low cost, rate-tunable, acoustic actuator for gradual in-fluid sample reorientation about varying axes and its utilisation for protein structure determination on a synchrotron beamline. The device enables the efficient collection of diffraction data via a rotation method from a sample within a surface confined droplet. This method facilitates efficient macromolecular structural data acquisition in fluid environments for dynamical studies.

  2. Gaussian-Based Smooth Dielectric Function: A Surface-Free Approach for Modeling Macromolecular Binding in Solvents

    Directory of Open Access Journals (Sweden)

    Arghya Chakravorty

    2018-03-01

    Conventional techniques for modeling macromolecular solvation and its effect on binding within the framework of Poisson-Boltzmann-based implicit solvent models make use of a geometrically defined surface to depict the separation of the macromolecular interior (low dielectric constant) from the solvent phase (high dielectric constant). Though this simplification saves time and computational resources without significantly compromising the accuracy of free energy calculations, it bypasses some of the key physico-chemical properties of the solute-solvent interface, e.g., the altered flexibility of water molecules and of side chains at the interface, which results in dielectric properties different from both bulk water and the macromolecular interior. Here we present a Gaussian-based smooth dielectric model, an inhomogeneous dielectric distribution model that mimics the effect of macromolecular flexibility and captures the altered properties of surface-bound water molecules. The model thus delivers a smooth transition of dielectric properties from the macromolecular interior to the solvent phase, eliminating any unphysical surface separating the two phases. Using various examples of macromolecular binding, we demonstrate its utility and illustrate the comparison with the conventional 2-dielectric model. We also showcase some additional abilities of this model, viz. to account for the effect of electrolytes in the solution and to render the distribution profile of water across a lipid membrane.
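
    The core idea, a density-weighted interpolation between interior and solvent dielectric values with no sharp surface, can be expressed compactly (a simplified illustration of a Gaussian-based smooth dielectric, not the published implementation; the Gaussian width and the dielectric values are assumptions):

```python
import math

EPS_IN, EPS_OUT = 2.0, 80.0  # illustrative interior / bulk-solvent values

def gaussian_density(r, atoms, sigma=1.0):
    """Solute density at point r: per-atom Gaussians combined with a
    product rule, so density is 1 inside the solute and decays to 0."""
    prod = 1.0
    for a in atoms:
        d2 = sum((ri - ai) ** 2 for ri, ai in zip(r, a))
        prod *= 1.0 - math.exp(-d2 / (2.0 * sigma ** 2))
    return 1.0 - prod

def smooth_epsilon(r, atoms, sigma=1.0):
    """Dielectric varies smoothly from EPS_IN at atom centres to EPS_OUT
    in bulk solvent; no geometric surface separates the two phases."""
    rho = gaussian_density(r, atoms, sigma)
    return rho * EPS_IN + (1.0 - rho) * EPS_OUT
```

    For a single atom at the origin, the dielectric is EPS_IN at the centre, EPS_OUT far away, and takes intermediate values near the interface, which is precisely the smooth transition the abstract describes.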

  3. A public database of macromolecular diffraction experiments.

    Science.gov (United States)

    Grabowski, Marek; Langner, Karol M; Cymborowski, Marcin; Porebski, Przemyslaw J; Sroka, Piotr; Zheng, Heping; Cooper, David R; Zimmerman, Matthew D; Elsliger, Marc André; Burley, Stephen K; Minor, Wladek

    2016-11-01

    The low reproducibility of published experimental results in many scientific disciplines has recently garnered negative attention in scientific journals and the general media. Public transparency, including the availability of 'raw' experimental data, will help to address growing concerns regarding scientific integrity. Macromolecular X-ray crystallography has led the way in requiring the public dissemination of atomic coordinates and a wealth of experimental data, making the field one of the most reproducible in the biological sciences. However, there remains no mandate for public disclosure of the original diffraction data. The Integrated Resource for Reproducibility in Macromolecular Crystallography (IRRMC) has been developed to archive raw data from diffraction experiments and, equally importantly, to provide related metadata. Currently, the database of our resource contains data from 2920 macromolecular diffraction experiments (5767 data sets), accounting for around 3% of all depositions in the Protein Data Bank (PDB), with their corresponding partially curated metadata. IRRMC utilizes distributed storage implemented using a federated architecture of many independent storage servers, which provides both scalability and sustainability. The resource, which is accessible via the web portal at http://www.proteindiffraction.org, can be searched using various criteria. All data are available for unrestricted access and download. The resource serves as a proof of concept and demonstrates the feasibility of archiving raw diffraction data and associated metadata from X-ray crystallographic studies of biological macromolecules. The goal is to expand this resource and include data sets that failed to yield X-ray structures in order to facilitate collaborative efforts that will improve protein structure-determination methods and to ensure the availability of 'orphan' data left behind for various reasons by individual investigators and/or extinct structural genomics

  4. Development of an online UV–visible microspectrophotometer for a macromolecular crystallography beamline

    Energy Technology Data Exchange (ETDEWEB)

    Shimizu, Nobutaka, E-mail: nobutaka.shimizu@kek.jp [SPring-8/JASRI, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5198 (Japan); High Energy Accelerator Research Organization (KEK), 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Shimizu, Tetsuya [RIKEN SPring-8 Center, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5148 (Japan); Baba, Seiki; Hasegawa, Kazuya [SPring-8/JASRI, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5198 (Japan); Yamamoto, Masaki [RIKEN SPring-8 Center, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5148 (Japan); Kumasaka, Takashi [SPring-8/JASRI, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5198 (Japan)

    2013-11-01

    An online UV–visible microspectrophotometer has been developed for the macromolecular crystallography beamline at SPring-8. Details of this spectrophotometer are reported. Measurement of the UV–visible absorption spectrum is a convenient technique for detecting chemical changes of proteins, and it is therefore useful to combine spectroscopy and diffraction studies. An online microspectrophotometer for the UV–visible region was developed and installed on the macromolecular crystallography beamline, BL38B1, at SPring-8. This spectrophotometer is equipped with a difference dispersive double monochromator, a mercury–xenon lamp as the light source, and a photomultiplier as the detector. The optical path is mostly constructed using mirrors, in order to obtain high brightness in the UV region, and the confocal optics are assembled using a cross-slit diaphragm like an iris to eliminate stray light. This system can measure optical densities up to a maximum of 4.0. To study the effect of radiation damage, preliminary measurements of glucose isomerase and thaumatin crystals were conducted in the UV region. Spectral changes dependent on X-ray dose were observed at around 280 nm, suggesting that structural changes involving Trp or Tyr residues occurred in the protein crystal. In the case of the thaumatin crystal, a broad peak around 400 nm was also generated after X-ray irradiation, suggesting the cleavage of a disulfide bond. Dose-dependent spectral changes were also observed in cryo-solutions alone, and these changes differed with the composition of the cryo-solution. These responses in the UV region are informative regarding the state of the sample; consequently, this device might be useful for X-ray crystallography.
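
    The basic quantities involved, optical density from transmitted intensity and a dose-difference spectrum of the kind used to spot the 280 nm and 400 nm changes, can be written in a few lines (a generic sketch, not the beamline's software; the example spectral values in the test are invented for illustration):

```python
import math

def optical_density(transmitted, incident):
    """Absorbance (OD) from transmitted vs incident intensity;
    the instrument described saturates near OD 4.0."""
    return -math.log10(transmitted / incident)

def difference_spectrum(spec_after, spec_before):
    """Dose-dependent spectral change: OD(after X-ray dose) minus
    OD(before), per wavelength.  A positive peak near 280 nm would
    flag radiation chemistry at Trp/Tyr residues; growth near 400 nm
    would be consistent with disulfide-bond cleavage."""
    return {wl: spec_after[wl] - spec_before[wl] for wl in spec_before}
```

    An OD of 4.0 corresponds to only one part in ten thousand of the incident light being transmitted, which is why confocal optics and stray-light rejection matter at the quoted upper limit.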

  5. Macromolecular Networks Containing Fluorinated Cyclic Moieties

    Science.gov (United States)

    2015-12-12

    Briefing charts covering 17 November 2015 to 12 December 2015, presented 12 December 2015, by Andrew J. Guenthner, Scott T. Iacono, Cynthia A. Corley, Christopher M. Sahagun and Kevin R. Lamison. The charts report macromolecular networks with good flame, smoke and toxicity characteristics and low water uptake with a near-zero coefficient of hygroscopic expansion. Distribution A.

  6. Design and application of a C++ macromolecular class library.

    Science.gov (United States)

    Chang, W; Shindyalov, I N; Pu, C; Bourne, P E

    1994-01-01

    PDBlib is an extensible object-oriented class library written in C++ for representing the 3-dimensional structure of biological macromolecules. PDBlib forms the kernel of a larger software framework being developed for assisting in knowledge discovery from macromolecular structure data. The software design strategy used by PDBlib, how the library may be used, and several prototype applications that use the library are summarized. PDBlib represents the structural features of proteins, DNA, RNA, and complexes thereof at a level of detail on a par with that which can be parsed from a Protein Data Bank (PDB) entry. However, the memory-resident representation of the macromolecule is independent of the PDB entry and can be obtained from other back-end data sources, for example existing relational databases and our own object-oriented database (OOPDB) built on top of the commercial object-oriented database ObjectStore. At the front end are several prototype applications that use the library: Macromolecular Query Language (MMQL), based on a separate class library (MMQLlib) for building complex queries pertaining to macromolecular structure; PDBtool, an interactive structure verification tool; and PDBview, a structure rendering tool used either standalone or as part of another application. Each of these software components is described. All software is available via anonymous ftp from cuhhca.hhmi.columbia.edu.

  7. Principles and Overview of Sampling Methods for Modeling Macromolecular Structure and Dynamics.

    Science.gov (United States)

    Maximova, Tatiana; Moffatt, Ryan; Ma, Buyong; Nussinov, Ruth; Shehu, Amarda

    2016-04-01

    Investigation of macromolecular structure and dynamics is fundamental to understanding how macromolecules carry out their functions in the cell. Significant advances have been made toward this end in silico, with a growing number of computational methods proposed yearly to study and simulate various aspects of macromolecular structure and dynamics. This review aims to provide an overview of recent advances, focusing primarily on methods proposed for exploring the structure space of macromolecules in isolation and in assemblies for the purpose of characterizing equilibrium structure and dynamics. In addition to surveying recent applications that showcase current capabilities of computational methods, this review highlights state-of-the-art algorithmic techniques proposed to overcome challenges posed in silico by the disparate spatial and time scales accessed by dynamic macromolecules. This review is not meant to be exhaustive, as such an endeavor is impossible, but rather aims to balance breadth and depth of strategies for modeling macromolecular structure and dynamics for a broad audience of novices and experts.

  8. Outrunning free radicals in room-temperature macromolecular crystallography

    International Nuclear Information System (INIS)

    Owen, Robin L.; Axford, Danny; Nettleship, Joanne E.; Owens, Raymond J.; Robinson, James I.; Morgan, Ann W.; Doré, Andrew S.; Lebon, Guillaume; Tate, Christopher G.; Fry, Elizabeth E.; Ren, Jingshan; Stuart, David I.; Evans, Gwyndaf

    2012-01-01

    A systematic increase in lifetime is observed in room-temperature protein and virus crystals through the use of reduced exposure times and a fast detector. A significant increase in the lifetime of room-temperature macromolecular crystals is reported through the use of a high-brilliance X-ray beam, reduced exposure times and a fast-readout detector. This is attributed to the ability to collect diffraction data before hydroxyl radicals can propagate through the crystal, fatally disrupting the lattice. Hydroxyl radicals are shown to be trapped in amorphous solutions at 100 K. The trend in crystal lifetime was observed in crystals of a soluble protein (immunoglobulin γ Fc receptor IIIa), a virus (bovine enterovirus serotype 2) and a membrane protein (the human A2A adenosine G-protein-coupled receptor). The observation of a similar effect in all three systems provides clear evidence for a common optimal strategy for room-temperature data collection and will inform the design of future synchrotron beamlines and detectors for macromolecular crystallography

  9. Outrunning free radicals in room-temperature macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Owen, Robin L., E-mail: robin.owen@diamond.ac.uk; Axford, Danny [Diamond Light Source, Harwell Science and Innovation Campus, Didcot OX11 0DE (United Kingdom); Nettleship, Joanne E.; Owens, Raymond J. [Rutherford Appleton Laboratory, Didcot OX11 0FA (United Kingdom); The Henry Wellcome Building for Genomic Medicine, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Robinson, James I.; Morgan, Ann W. [University of Leeds, Leeds LS9 7FT (United Kingdom); Doré, Andrew S. [Heptares Therapeutics Ltd, BioPark, Welwyn Garden City AL7 3AX (United Kingdom); Lebon, Guillaume; Tate, Christopher G. [MRC Laboratory of Molecular Biology, Hills Road, Cambridge CB2 0QH (United Kingdom); Fry, Elizabeth E.; Ren, Jingshan [The Henry Wellcome Building for Genomic Medicine, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Stuart, David I. [Diamond Light Source, Harwell Science and Innovation Campus, Didcot OX11 0DE (United Kingdom); The Henry Wellcome Building for Genomic Medicine, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Evans, Gwyndaf [Diamond Light Source, Harwell Science and Innovation Campus, Didcot OX11 0DE (United Kingdom)

    2012-06-15

    A systematic increase in lifetime is observed in room-temperature protein and virus crystals through the use of reduced exposure times and a fast detector. A significant increase in the lifetime of room-temperature macromolecular crystals is reported through the use of a high-brilliance X-ray beam, reduced exposure times and a fast-readout detector. This is attributed to the ability to collect diffraction data before hydroxyl radicals can propagate through the crystal, fatally disrupting the lattice. Hydroxyl radicals are shown to be trapped in amorphous solutions at 100 K. The trend in crystal lifetime was observed in crystals of a soluble protein (immunoglobulin γ Fc receptor IIIa), a virus (bovine enterovirus serotype 2) and a membrane protein (the human A2A adenosine G-protein-coupled receptor). The observation of a similar effect in all three systems provides clear evidence for a common optimal strategy for room-temperature data collection and will inform the design of future synchrotron beamlines and detectors for macromolecular crystallography.

  10. Macromolecular diffusion in crowded media beyond the hard-sphere model.

    Science.gov (United States)

    Blanco, Pablo M; Garcés, Josep Lluís; Madurga, Sergio; Mas, Francesc

    2018-04-25

    The effect of macromolecular crowding on diffusion beyond the hard-core sphere model is studied. A new coarse-grained model is presented, the Chain Entanglement Softened Potential (CESP) model, which takes into account macromolecular flexibility and chain entanglement. The CESP model uses a shoulder-shaped interaction potential that is implemented in Brownian Dynamics (BD) computations. The interaction potential contains only one parameter, associated with the energetic cost of chain entanglement (Ur). Hydrodynamic interactions are included in the BD computations via Tokuyama mean-field equations. The model is used to analyze the diffusion of a streptavidin protein among dextran obstacles of different sizes. For this system, Ur is obtained by fitting the experimental long-time diffusion coefficient of streptavidin, Dlong, versus macromolecular concentration for D50 dextran obstacles (the label indicating their molecular weight in kg mol-1). The obtained Dlong values show better quantitative agreement with experiments than those obtained with hard-core spheres. Moreover, once parametrized, the CESP model is also able to quantitatively predict Dlong and the anomalous exponent (α) for streptavidin diffusion among D10, D400 and D700 dextran obstacles. Dlong, the short-time diffusion coefficient (Dshort) and α are obtained from the BD simulations by using a new empirical expression able to describe the full temporal evolution of the diffusion coefficient.
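
    The anomalous exponent α characterizes subdiffusive motion in crowded media, where the mean-square displacement grows as MSD ~ t^α with α < 1. The paper's own empirical expression for the time evolution of the diffusion coefficient is not reproduced here; the sketch below shows only the generic way α is extracted from simulation output, a least-squares fit in log-log space on synthetic data.

```python
# Extract the anomalous diffusion exponent alpha from MSD-vs-time data by
# ordinary least squares on log(MSD) vs log(t). Illustrative only; not the
# CESP paper's empirical expression for D(t).
import math

def anomalous_exponent(times, msd):
    """Slope of log(MSD) versus log(t), i.e. alpha in MSD ~ t**alpha."""
    xs = [math.log(t) for t in times]
    ys = [math.log(m) for m in msd]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

times = [1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0]
msd = [4 * 0.1 * t ** 0.8 for t in times]   # synthetic subdiffusion, alpha = 0.8
print(round(anomalous_exponent(times, msd), 3))  # -> 0.8
```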

  11. A smooth and differentiable bulk-solvent model for macromolecular diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Fenn, T. D. [Department of Molecular and Cellular Physiology and Howard Hughes Medical Institute, Stanford, California (United States); Schnieders, M. J. [Department of Chemistry, Stanford, California (United States); Brunger, A. T., E-mail: brunger@stanford.edu [Department of Molecular and Cellular Physiology and Howard Hughes Medical Institute, Stanford, California (United States); Departments of Neurology and Neurological Sciences, Structural Biology and Photon Science, Stanford, California (United States)

    2010-09-01

    A new method for modeling the bulk solvent in macromolecular diffraction data based on Babinet’s principle is presented. The proposed models offer the advantage of differentiability with respect to atomic coordinates. Inclusion of low-resolution data in macromolecular crystallography requires a model for the bulk solvent. Previous methods have used a binary mask to accomplish this, which has proven to be very effective, but the mask is discontinuous at the solute–solvent boundary (i.e. the mask value jumps from zero to one) and is not differentiable with respect to atomic parameters. Here, two algorithms are introduced for computing bulk-solvent models using either a polynomial switch or a smoothly thresholded product of Gaussians, and both models are shown to be efficient and differentiable with respect to atomic coordinates. These alternative bulk-solvent models offer algorithmic improvements, while showing similar agreement of the model with the observed amplitudes relative to the binary model as monitored using R, Rfree and differences between experimental and model phases. As with the standard solvent models, the alternative models improve the agreement primarily with lower resolution (>6 Å) data versus no bulk solvent. The models are easily implemented into crystallographic software packages and can be used as a general method for bulk-solvent correction in macromolecular crystallography.
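
    The key point is that a binary mask is non-differentiable while a polynomial switch is smooth. The exact polynomial used by Fenn et al. is not given in this abstract; the sketch below uses the common cubic "smoothstep" as an illustrative stand-in for such a switch, showing how it ramps continuously between the solute (0) and bulk-solvent (1) values.

```python
# A binary solvent mask jumps from 0 to 1 at the solute-solvent boundary and is
# not differentiable with respect to atomic coordinates. A cubic "smoothstep"
# polynomial is one common differentiable alternative; this is an illustrative
# switch function, not the exact polynomial of the paper.

def smoothstep(r, r_lo, r_hi):
    """0 inside the solute (r <= r_lo), 1 in bulk solvent (r >= r_hi),
    with a C1-continuous cubic ramp in between (zero slope at both ends)."""
    if r <= r_lo:
        return 0.0
    if r >= r_hi:
        return 1.0
    t = (r - r_lo) / (r_hi - r_lo)
    return t * t * (3.0 - 2.0 * t)

# Value ramps smoothly across the boundary region instead of jumping:
print(smoothstep(1.0, 1.0, 2.0), smoothstep(1.5, 1.0, 2.0), smoothstep(2.0, 1.0, 2.0))
# -> 0.0 0.5 1.0
```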

  12. A smooth and differentiable bulk-solvent model for macromolecular diffraction

    International Nuclear Information System (INIS)

    Fenn, T. D.; Schnieders, M. J.; Brunger, A. T.

    2010-01-01

    A new method for modeling the bulk solvent in macromolecular diffraction data based on Babinet’s principle is presented. The proposed models offer the advantage of differentiability with respect to atomic coordinates. Inclusion of low-resolution data in macromolecular crystallography requires a model for the bulk solvent. Previous methods have used a binary mask to accomplish this, which has proven to be very effective, but the mask is discontinuous at the solute–solvent boundary (i.e. the mask value jumps from zero to one) and is not differentiable with respect to atomic parameters. Here, two algorithms are introduced for computing bulk-solvent models using either a polynomial switch or a smoothly thresholded product of Gaussians, and both models are shown to be efficient and differentiable with respect to atomic coordinates. These alternative bulk-solvent models offer algorithmic improvements, while showing similar agreement of the model with the observed amplitudes relative to the binary model as monitored using R, Rfree and differences between experimental and model phases. As with the standard solvent models, the alternative models improve the agreement primarily with lower resolution (>6 Å) data versus no bulk solvent. The models are easily implemented into crystallographic software packages and can be used as a general method for bulk-solvent correction in macromolecular crystallography

  13. Atomic Scale Structural Studies of Macromolecular Assemblies by Solid-state Nuclear Magnetic Resonance Spectroscopy.

    Science.gov (United States)

    Loquet, Antoine; Tolchard, James; Berbon, Melanie; Martinez, Denis; Habenstein, Birgit

    2017-09-17

    Supramolecular protein assemblies play fundamental roles in biological processes ranging from host-pathogen interaction and viral infection to the propagation of neurodegenerative disorders. Such assemblies consist of multiple protein subunits organized in a non-covalent way to form large macromolecular objects that can execute a variety of cellular functions or cause detrimental consequences. Atomic insights into the assembly mechanisms and the functioning of these macromolecular assemblies often remain scarce, since their inherent insolubility and non-crystallinity frequently degrade the quality of the data obtained from most techniques used in structural biology, such as X-ray crystallography and solution nuclear magnetic resonance (NMR). Here we present magic-angle spinning solid-state NMR spectroscopy (SSNMR) as a powerful method to investigate the structures of macromolecular assemblies at atomic resolution. SSNMR can reveal atomic details of the assembled complex without size or solubility limitations. The protocol presented here describes the essential steps from the production of 13C/15N isotope-labeled macromolecular protein assemblies to the acquisition of standard SSNMR spectra and their analysis and interpretation. As an example, we show the pipeline of an SSNMR structural analysis of a filamentous protein assembly.

  14. A simple quantitative model of macromolecular crowding effects on protein folding: Application to the murine prion protein(121-231)

    Science.gov (United States)

    Bergasa-Caceres, Fernando; Rabitz, Herschel A.

    2013-06-01

    A model of protein folding kinetics is applied to study the effects of macromolecular crowding on protein folding rate and stability. Macromolecular crowding is found to promote a decrease of the entropic cost of folding of proteins that produces an increase of both the stability and the folding rate. The acceleration of the folding rate due to macromolecular crowding is shown to be a topology-dependent effect. The model is applied to the folding dynamics of the murine prion protein (121-231). The differential effect of macromolecular crowding as a function of protein topology suffices to make non-native configurations relatively more accessible.

  15. Electron damage in organic crystals

    International Nuclear Information System (INIS)

    Howitt, D.G.; Thomas, G.

    1977-01-01

    The effects of radiation damage in three crystalline organic materials (l-valine, cytosine, copper phthalocyanine) have been investigated by electron microscopy. The degradation of these materials has been found to be consistent with a gradual collapse of their crystal structures brought about by ionization damage to the constituent molecules. It is inferred that the crystallinity of these materials is destroyed by ionizing radiation because the damaged molecules cannot be incorporated into the framework of their original structures. (author)

  16. The Postgraduate Study of Macromolecular Sciences at the University of Zagreb (1971-1980)

    Directory of Open Access Journals (Sweden)

    Kunst, B.

    2008-07-01

    Full Text Available The postgraduate study of macromolecular sciences (PSMS) was established at the University of Zagreb in 1971 as a university-wide study, at a time of pronounced interdisciplinary convergence of the natural sciences - physics, chemistry and biology - and of the application of their achievements in technological disciplines. PSMS was established by a group of prominent university professors from the schools of Science, Chemical Technology, Pharmacy and Medicine, as well as from the Institute of Biology. The study comprised the basic fields of the macromolecular sciences: organic chemistry of synthetic macromolecules, physical chemistry of macromolecules, physics of macromolecules, biological macromolecules, and polymer engineering with polymer application and processing; teaching was delivered in 29 lecture courses led by 30 professors and their collaborators. PSMS ceased to exist with a change of legislation in Croatia in 1980, when the view prevailed that postgraduate studies should be returned to the individual university schools. During its nine years of existence, PSMS awarded the MSci degree to 37 macromolecular experts. Measured against international developments in postgraduate training, PSMS was, some thirty years ago, an important early example of modern postgraduate education. Given the recent introduction of similar interdisciplinary studies in the macromolecular sciences elsewhere in the world, the establishment of a modern interdisciplinary study in this field would be important for the further development of these sciences in Croatia.

  17. Synthesis and characterization of macromolecular rhodamine tethers and their interactions with P-glycoprotein.

    Science.gov (United States)

    Crawford, Lindsey; Putnam, David

    2014-08-20

    Rhodamine dyes are well-known P-glycoprotein (P-gp) substrates that have played an important role in the detection of inhibitors and other substrates of P-gp, as well as in the understanding of P-gp function. Macromolecular conjugates of rhodamines could prove useful as tethers for further probing of P-gp structure and function. Two macromolecular derivatives of rhodamine, methoxypolyethylene glycol-rhodamine6G and methoxypolyethylene glycol-rhodamine123, were synthesized through the 2'-position of rhodamine6G and rhodamine123, thoroughly characterized, and then evaluated by inhibition with verapamil for their ability to interact with P-gp and to act as efflux substrates. To put the results into context, the P-gp interactions of the new conjugates were compared with those of the commercially available methoxypolyethylene glycol-rhodamineB. FACS analysis confirmed that the macromolecular tethers of rhodamine6G, rhodamine123, and rhodamineB accumulated in P-gp-expressing cells at 5.2 ± 0.3%, 26.2 ± 4%, and 64.2 ± 6%, respectively, of the levels in a sensitive cell line that does not overexpress P-gp. Along with confocal imaging, the efflux analysis confirmed that the macromolecular rhodamine tethers remain P-gp substrates. These results open potential avenues for new ways to probe the function of P-gp both in vitro and in vivo.

  18. MMTF: An efficient file format for the transmission, visualization, and analysis of macromolecular structures.

    Directory of Open Access Journals (Sweden)

    Anthony R Bradley

    2017-06-01

    Full Text Available Recent advances in experimental techniques have led to rapid growth in the complexity, size, and number of macromolecular structures made available through the Protein Data Bank. This creates a challenge for macromolecular visualization and analysis. Macromolecular structure files, such as PDB or PDBx/mmCIF files, can be slow to transfer and parse, and hard to incorporate into third-party software tools. Here we present a new binary and compressed data representation, the MacroMolecular Transmission Format (MMTF), together with software implementations in several languages that have been developed around it, which address these issues. We describe the new format and its APIs and demonstrate that it is several times faster to parse, and about a quarter of the file size of, the current standard format, PDBx/mmCIF. As a consequence of the new data representation, it is now possible to visualize structures with millions of atoms in a web browser and to keep the whole PDB archive in memory or parse it within a few minutes on average computers, which opens up a new way of thinking about how to design and implement efficient algorithms in structural bioinformatics. The PDB archive is available in MMTF format through web services, with data updated on a weekly basis.
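
    Part of MMTF's compression comes from encoding strategies applied to integer columns, such as delta encoding followed by run-length encoding. The toy sketch below illustrates those two steps on a residue-number column; it is a simplified illustration, not the full MMTF codec, which also defines typed encoding headers and MessagePack serialization.

```python
# Toy illustration of two encoding strategies of the kind MMTF applies to
# integer arrays: delta encoding (store differences between consecutive
# values) followed by run-length encoding (store value, count pairs).
# Simplified for illustration; not the actual MMTF codec implementation.

def delta_encode(values):
    """First value verbatim, then successive differences."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def run_length_encode(values):
    """Flat list of (value, count) pairs."""
    out = []
    for v in values:
        if out and out[-2] == v:
            out[-1] += 1
        else:
            out.extend([v, 1])
    return out

residue_numbers = [1, 1, 1, 2, 2, 3, 4, 5, 6, 7]
deltas = delta_encode(residue_numbers)   # [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
packed = run_length_encode(deltas)       # [1, 1, 0, 2, 1, 1, 0, 1, 1, 5]
print(packed)
```

    Monotonically increasing columns turn into long runs of small deltas, which is why the combination compresses well.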

  19. Diffusion accessibility as a method for visualizing macromolecular surface geometry.

    Science.gov (United States)

    Tsai, Yingssu; Holton, Thomas; Yeates, Todd O

    2015-10-01

    Important three-dimensional spatial features such as depth and surface concavity can be difficult to convey clearly in the context of two-dimensional images. In the area of macromolecular visualization, the computer graphics technique of ray-tracing can be helpful, but further techniques for emphasizing surface concavity can give clearer perceptions of depth. The notion of diffusion accessibility is well-suited for emphasizing such features of macromolecular surfaces, but a method for calculating diffusion accessibility has not been made widely available. Here we make available a web-based platform that performs the necessary calculation by solving the Laplace equation for steady state diffusion, and produces scripts for visualization that emphasize surface depth by coloring according to diffusion accessibility. The URL is http://services.mbi.ucla.edu/DiffAcc/. © 2015 The Protein Society.
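
    The numerical core of the method above is a steady-state Laplace solve with fixed boundary values. The following toy Jacobi relaxation on a one-dimensional grid shows that core step; the actual service solves the same equation on a three-dimensional grid around the molecular surface.

```python
# Toy steady-state diffusion solve: Jacobi relaxation drives the discrete
# Laplacian to zero with fixed boundary values, the same numerical idea used
# (in 3-D) to compute diffusion accessibility. Illustrative 1-D version only.

def solve_laplace_1d(n, u_left, u_right, iters=20000):
    """Solve u'' = 0 on a grid of n+1 points with fixed endpoint values."""
    u = [0.0] * (n + 1)
    u[0], u[n] = u_left, u_right
    for _ in range(iters):
        new = u[:]
        for i in range(1, n):
            new[i] = 0.5 * (u[i - 1] + u[i + 1])  # average of neighbors
        u = new
    return u

# The steady state in 1-D is a linear profile between the boundary values:
u = solve_laplace_1d(10, 0.0, 1.0)
print(round(u[5], 3))  # midpoint -> 0.5
```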

  20. Local analysis of strains and rotations for macromolecular electron microscopy maps

    Energy Technology Data Exchange (ETDEWEB)

    Martin-Ramos, A.; Prieto, F.; Melero, R.; Martin-Benito, J.; Jonic, S.; Navas-Calvente, J.; Vargas, J.; Oton, J.; Abrishami, V.; Rosa-Trevin, J.L. de la; Gomez-Blanco, J.; Vilas, J.L.; Marabini, R.; Carazo, R.; Sorzano, C.O.S.

    2016-07-01

    Macromolecular complexes can be considered molecular nano-machines that must have mobile parts in order to perform their physiological functions; the reordering of their parts is essential to the execution of their tasks. These rearrangements induce local strains and rotations which, once analyzed, may provide relevant information about how the proteins perform their function. In this project these deformations of macromolecular complexes are characterized, translating into a “mathematical language” the conformational changes the complexes undergo when they perform their function. Electron microscopy (EM) volumes are analyzed using a method that uses B-splines as its basis functions. The results obtained are shown to be consistent with the conformational changes described in the corresponding reference publications. (Author)

  1. [Macromolecular aromatic network characteristics of Chinese power coal analyzed by synchronous fluorescence and X-ray diffraction].

    Science.gov (United States)

    Ye, Cui-Ping; Feng, Jie; Li, Wen-Ying

    2012-07-01

    Coal structure, especially the macromolecular aromatic skeleton, strongly influences coke reactivity and coal gasification, so understanding the macromolecular aromatic skeleton is key to the rational, high-efficiency utilization of coal. However, such structural information is difficult to obtain because of the complex composition and structure of coal. It has been found that the macromolecular aromatic network of coal is best isolated if the small molecules in coal are first extracted; the macromolecular aromatic skeleton can then be analyzed clearly by instruments such as X-ray diffraction (XRD), fluorescence spectroscopy in synchronous mode (Syn-F) and gel permeation chromatography (GPC). Based on previous results, and following stepwise fractional liquid extraction, two typical Chinese power coals, PS and HDG, were extracted using silica gel as the stationary phase and acetonitrile, tetrahydrofuran (THF), pyridine and 1-methyl-2-pyrrolidinone (NMP) as a solvent group for sequential elution. GPC, Syn-F and XRD were applied to investigate the molecular mass distribution, condensed aromatic structure and crystal characteristics. The results showed that the size of the aromatic layers (La) is small (3-3.95 nm) and the stacking heights (Lc) are 0.8-1.2 nm. The molecular mass distribution of the macromolecular aromatic network structure is between 400 and 1130 amu, with 3-7 condensed aromatic rings in the structural units.
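
    Crystallite parameters such as La and Lc are conventionally derived from XRD band widths via the Scherrer equation, with shape factors of about 0.89 for the stacking height Lc (002 band) and 1.84 for the layer size La (10 band). The sketch below applies this standard analysis with illustrative peak values; the peak widths and positions are assumptions for demonstration, not the paper's measured data.

```python
# Standard Scherrer analysis of XRD bands for carbon materials:
#   Lc = 0.89 * lambda / (beta_002 * cos(theta_002))   (stacking height)
#   La = 1.84 * lambda / (beta_10  * cos(theta_10))    (aromatic layer size)
# Peak widths/positions below are illustrative assumptions, not measured data.
import math

CU_K_ALPHA = 0.15406  # Cu K-alpha wavelength in nm

def scherrer(k, beta_deg, two_theta_deg, wavelength=CU_K_ALPHA):
    beta = math.radians(beta_deg)              # band FWHM in radians
    theta = math.radians(two_theta_deg / 2.0)  # Bragg angle
    return k * wavelength / (beta * math.cos(theta))

lc = scherrer(0.89, beta_deg=8.0, two_theta_deg=25.0)  # (002) band
la = scherrer(1.84, beta_deg=5.0, two_theta_deg=43.0)  # (10) band
print(round(lc, 2), round(la, 2))  # -> 1.01 3.49 (nm)
```

    With these illustrative inputs the results fall inside the ranges quoted above (Lc 0.8-1.2 nm, La 3-3.95 nm).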

  2. Flexibility damps macromolecular crowding effects on protein folding dynamics: Application to the murine prion protein (121-231)

    Science.gov (United States)

    Bergasa-Caceres, Fernando; Rabitz, Herschel A.

    2014-01-01

    A model of protein folding kinetics is applied to study the combined effects of protein flexibility and macromolecular crowding on protein folding rate and stability. It is found that the increase in stability and folding rate promoted by macromolecular crowding is damped for proteins with highly flexible native structures. The model is applied to the folding dynamics of the murine prion protein (121-231). It is found that the high flexibility of the native isoform of the murine prion protein (121-231) reduces the effects of macromolecular crowding on its folding dynamics. The relevance of these findings for the pathogenic mechanism is discussed.

  3. Macromolecular crystallography research at Trombay

    International Nuclear Information System (INIS)

    Kannan, K.K.; Chidambaram, R.

    1983-01-01

    Neutron diffraction studies of hydrogen positions in small molecules of biological interest at Trombay have provided valuable information that has been used in protein and enzyme structure model-building and in developing hydrogen-bond potential functions. The new R-5 reactor is expected to provide higher neutron fluxes and also to make possible small-angle neutron scattering studies of large biomolecules and bio-aggregates. In the last few years, infrastructure facilities have also been established for macromolecular X-ray crystallography research. Meanwhile, the refinement of carbonic anhydrase and lysozyme structures has been carried out, and interesting results have been obtained on protein dynamics and structure-function relationships. Some interesting presynaptic toxin phospholipases have also been taken up for study. (author)

  4. Dexamethasone attenuates grain sorghum dust extract-induced increase in macromolecular efflux in vivo.

    Science.gov (United States)

    Akhter, S R; Ikezaki, H; Gao, X P; Rubinstein, I

    1999-05-01

    The purpose of this study was to determine whether dexamethasone attenuates the grain sorghum dust extract-induced increase in macromolecular efflux from the in situ hamster cheek pouch and, if so, whether this response is specific. By using intravital microscopy, we found that an aqueous extract of grain sorghum dust elicited significant, concentration-dependent leaky site formation and an increase in clearance of FITC-labeled dextran (FITC-dextran; mol mass, 70 kDa) from the in situ hamster cheek pouch (P < 0.05). Dexamethasone attenuated the grain sorghum dust extract- and substance P-induced increases in macromolecular efflux from the in situ hamster cheek pouch in a specific fashion.

  5. The Joint Structural Biology Group beam lines at the ESRF: Modern macromolecular crystallography

    CERN Document Server

    Mitchell, E P

    2001-01-01

    Macromolecular crystallography has evolved considerably over the last decade. Complete data sets can now be collected in under an hour on high-throughput beam lines, with electron density and possibly initial models calculated on-site. Five beam lines are currently dedicated to macromolecular crystallography: the ID14 complex and BM-14 (soon to be superseded by ID-29). These lines handle over five hundred projects every six months, and demand is increasing. Automated sample handling, alignment and data-management protocols will be required to work efficiently under this demanding load. Projects developing these themes are underway within the JSBG.

  6. The effect of ancient DNA damage on inferences of demographic histories

    DEFF Research Database (Denmark)

    Axelsson, Erik; Willerslev, Eske; Gilbert, Marcus Thomas Pius

    2008-01-01

    The field of ancient DNA (aDNA) is casting new light on many evolutionary questions. However, problems associated with the postmortem instability of DNA may complicate the interpretation of aDNA data. For example, in population genetic studies, the inclusion of damaged DNA may inflate estimates o...... for a change in effective population size in this data set vanishes once the effects of putative damage are removed. Our results suggest that population genetic analyses of aDNA sequences, which do not accurately account for damage, should be interpreted with great caution....

  7. Isotope labeling for NMR studies of macromolecular structure and interactions

    International Nuclear Information System (INIS)

    Wright, P.E.

    1994-01-01

    Implementation of biosynthetic methods for uniform or specific isotope labeling of proteins, coupled with the recent development of powerful heteronuclear multidimensional NMR methods, has led to a dramatic increase in the size and complexity of macromolecular systems that are now amenable to NMR structural analysis. In recent years, a new technology has emerged that combines uniform 13C, 15N labeling with heteronuclear multidimensional NMR methods to allow NMR structural studies of systems approaching 25 to 30 kDa in molecular weight. In addition, with the introduction of specific 13C and 15N labels into ligands, meaningful NMR studies of complexes of even higher molecular weight have become feasible. These advances usher in a new era in which the earlier, rather stringent molecular weight limitations have been greatly surpassed and NMR can begin to address many central biological problems that involve macromolecular structure, dynamics, and interactions

  8. Isotope labeling for NMR studies of macromolecular structure and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Wright, P.E. [Scripps Research Institute, La Jolla, CA (United States)

    1994-12-01

    Implementation of biosynthetic methods for uniform or specific isotope labeling of proteins, coupled with the recent development of powerful heteronuclear multidimensional NMR methods, has led to a dramatic increase in the size and complexity of macromolecular systems that are now amenable to NMR structural analysis. In recent years, a new technology has emerged that combines uniform 13C, 15N labeling with heteronuclear multidimensional NMR methods to allow NMR structural studies of systems approaching 25 to 30 kDa in molecular weight. In addition, with the introduction of specific 13C and 15N labels into ligands, meaningful NMR studies of complexes of even higher molecular weight have become feasible. These advances usher in a new era in which the earlier, rather stringent molecular weight limitations have been greatly surpassed and NMR can begin to address many central biological problems that involve macromolecular structure, dynamics, and interactions.

  9. Macromolecular shape and interactions in layer-by-layer assemblies within cylindrical nanopores.

    Science.gov (United States)

    Lazzara, Thomas D; Lau, K H Aaron; Knoll, Wolfgang; Janshoff, Andreas; Steinem, Claudia

    2012-01-01

    Layer-by-layer (LbL) deposition of polyelectrolytes and proteins within the cylindrical nanopores of anodic aluminum oxide (AAO) membranes was studied by optical waveguide spectroscopy (OWS). AAO has aligned cylindrical, nonintersecting pores with a defined pore diameter d(0) and functions as a planar optical waveguide, allowing the LbL process to be monitored in situ by OWS. The LbL deposition of globular proteins, i.e., avidin and biotinylated bovine serum albumin, was compared with that of linear polyelectrolytes (linear-PEs), both species being of similar molecular weight. LbL deposition within the cylindrical AAO geometry for different pore diameters (d(0) = 25-80 nm) showed that multilayer film growth was inhibited at a different maximum number of LbL steps (n(max)) for each of the various macromolecular species. The value of n(max) was greatest for linear-PEs, while proteins had a lower value. The cylindrical pore geometry imposes a physical limit on LbL growth, such that n(max) is strongly dependent on the overall internal structure of the LbL film. For all macromolecular species, deposition was inhibited in native AAO, with pores of d(0) = 25-30 nm. Both OWS and scanning electron microscopy showed that LbL growth in larger AAO pores (d(0) > 25-30 nm) became inhibited when an effective pore diameter of d(eff,n_max) = 25-35 nm was approached, a size similar to that of native AAO pores (d(0) = 25-30 nm). For a reasonable estimation of d(eff,n_max), the actual volume occupied by a macromolecular assembly must be taken into consideration. The results clearly show that electrostatic LbL produced compact macromolecular layers, whereas proteins formed loosely packed multilayers.

  10. Variationally optimal selection of slow coordinates and reaction coordinates in macromolecular systems

    Science.gov (United States)

    Noe, Frank

    To efficiently simulate, and to generate understanding from simulations of, complex macromolecular systems, the concept of slow collective coordinates or reaction coordinates is of fundamental importance. Here we introduce variational approaches to approximate the slow coordinates and the reaction coordinates between selected end-states, given MD simulations of the macromolecular system and a (possibly large) basis set of candidate coordinates. We then discuss how to select physically intuitive order parameters that are good surrogates of this variationally optimal result. These results can be used to construct Markov state models or other models of the stationary and kinetic properties, and to parametrize low-dimensional / coarse-grained models of the dynamics. Deutsche Forschungsgemeinschaft, European Research Council.
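
    The variational idea can be illustrated in its simplest form: for a normalized candidate coordinate, the lag-time autocorrelation is a lower bound on the true slowest relaxation eigenvalue, so candidates can be ranked by that score. The sketch below scores individual candidates only, on synthetic data; the full variational machinery (e.g. TICA-like methods) additionally solves a generalized eigenvalue problem to mix basis functions optimally, which is not shown here.

```python
# Rank candidate collective coordinates by their lag-tau autocorrelation,
# the simplest form of the variational score for slow coordinates. This is
# an illustrative sketch on toy data, not the full variational method.

def autocorrelation(series, lag):
    """Normalized autocorrelation of a time series at a given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    cov = sum((series[i] - mean) * (series[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var

def rank_candidates(candidates, lag):
    """Return candidate names sorted from slowest (highest score) to fastest."""
    scores = {name: autocorrelation(ts, lag) for name, ts in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy data: a slowly relaxing coordinate versus a fast oscillation.
slow = [0.99 ** t for t in range(500)]
fast = [(-1) ** t for t in range(500)]
print(rank_candidates({"slow": slow, "fast": fast}, lag=5))  # ['slow', 'fast']
```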

  11. Superhydrophobic hybrid membranes by grafting arc-like macromolecular bridges on graphene sheets: Synthesis, characterization and properties

    Science.gov (United States)

    Mo, Zhao-Hua; Luo, Zheng; Huang, Qiang; Deng, Jian-Ping; Wu, Yi-Xian

    2018-05-01

    Grafting single end-tethered polymer chains onto the surface of graphene is a conventional way to modify the surface properties of graphene oxide. However, grafting arc-like macromolecular bridges onto graphene surfaces has barely been reported. Herein, novel arc-like polydimethylsiloxane (PDMS) macromolecular bridges grafted onto graphene sheets (GO-g-Arc PDMS) were successfully synthesized via a confined interface reaction at 90 °C. Both the hydrophilic α- and ω-amino groups of linear hydrophobic NH2-PDMS-NH2 macromolecular chains rapidly reacted with epoxy and carboxyl groups on the surfaces of graphene oxide in water suspension to form arc-like PDMS macromolecular bridges on graphene sheets. The grafting density of arc-like PDMS bridges on graphene sheets can reach up to 0.80 mmol g-1, or 1.32 arc-like bridges per nm2, by this confined interface reaction. The water contact angle (WCA) of the hybrid membrane increased with both the grafting density and the content of the covalent arc-like bridge architecture. A superhydrophobic hybrid membrane with a WCA of 153.4° was prepared by grinding the above arc-like PDMS bridge-grafted graphene hybrid, dispersing it in ethanol and filtering it through an organic filter membrane. This superhydrophobic hybrid membrane shows good self-cleaning and complete oil-water separation properties, which provides potential applications in anticontamination coatings and oil-water separation. To the best of our knowledge, this is the first report on the synthesis of functional hybrid membranes by grafting arc-like PDMS macromolecular bridges onto graphene sheets via a confined interface reaction.

  12. Tuning the properties of an anthracene-based PPE-PPV copolymer by fine variation of its macromolecular parameters

    Czech Academy of Sciences Publication Activity Database

    Tinti, F.; Sabir, F. K.; Gazzano, M.; Righi, S.; Ulbricht, C.; Usluer, Ö.; Pokorná, Veronika; Cimrová, Věra; Yohannes, T.; Egbe, D. A. M.; Camaioni, N.

    2013-01-01

    Vol. 3, No. 19 (2013), pp. 6972-6980 ISSN 2046-2069 R&D Projects: GA ČR GAP106/12/0827; GA ČR(CZ) GA13-26542S Institutional support: RVO:61389013 Keywords: anthracene-containing PPE-PPV copolymer * macromolecular parameters * structural and transport properties Subject RIV: CD - Macromolecular Chemistry Impact factor: 3.708, year: 2013

  13. Novel types of DNA-sugar damage in neocarzinostatin cytotoxicity and mutagenesis

    International Nuclear Information System (INIS)

    Goldberg, I.H.

    1986-01-01

    Although a number of antitumor antibiotics interact with DNA to form covalent adducts with the bases, relatively few damage DNA by interacting with the deoxyribose moiety. Neocarzinostatin (NCS), a member of a family of macromolecular antibiotics obtained from filtrates of Streptomyces, is such an agent. Many of the biochemical and cellular effects of NCS resemble those of ionizing radiation. Most, possibly all, of the DNA lesions caused by NCS appear to result from the direct attack of an activated form of the drug on the deoxyribose of DNA. This is to be contrasted with ionizing radiation or the antibiotic bleomycin, which damage DNA deoxyribose through the intervention of a reduced form of oxygen. This paper describes the nature of the interaction between the active component of NCS and DNA, the mechanism of the ensuing deoxyribose damage, and some of the biological consequences of these actions. 24 refs., 7 figs

  14. MxCuBE: a synchrotron beamline control environment customized for macromolecular crystallography experiments

    International Nuclear Information System (INIS)

    Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren

    2010-01-01

    MxCuBE is a beamline control environment optimized for the needs of macromolecular crystallography. This paper describes the design of the software and the features that MxCuBE currently provides. The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control, and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1

  15. Enzymes as Green Catalysts for Precision Macromolecular Synthesis.

    Science.gov (United States)

    Shoda, Shin-ichiro; Uyama, Hiroshi; Kadokawa, Jun-ichi; Kimura, Shunsaku; Kobayashi, Shiro

    2016-02-24

    The present article comprehensively reviews macromolecular synthesis using enzymes as catalysts. Among the six main classes of enzymes, three classes, the oxidoreductases, transferases, and hydrolases, have been employed as catalysts for in vitro macromolecular synthesis and modification reactions. Appropriate design of the reaction, including the monomer and enzyme catalyst, produces macromolecules with precisely controlled structure, much as in in vivo enzymatic reactions. The reaction controls the product structure with respect to substrate selectivity, chemo-selectivity, regio-selectivity, stereoselectivity, and choro-selectivity. Oxidoreductases catalyze various oxidation polymerizations of aromatic compounds as well as vinyl polymerizations. Transferases are effective catalysts for producing polysaccharides having a variety of structures, as well as polyesters. Hydrolases, which catalyze bond cleavage of macromolecules in vivo, catalyze the reverse bond-forming reaction in vitro to give various polysaccharides and functionalized polyesters. These enzymatic polymerizations allowed the first in vitro synthesis of natural polysaccharides having complicated structures, such as cellulose, amylose, xylan, chitin, hyaluronan, and chondroitin. The polymerizations are "green" in several respects: nontoxicity of the enzyme, high catalyst efficiency, selective reactions under mild conditions using green solvents and renewable starting materials, and minimal byproducts. Thus, enzymatic polymerization is desirable for the environment and contributes to "green polymer chemistry" for maintaining a sustainable society.

  16. Atomic force microscopy applied to study macromolecular content of embedded biological material

    Energy Technology Data Exchange (ETDEWEB)

    Matsko, Nadejda B. [Electron Microscopy Centre, Institute of Applied Physics, HPM C 15.1, ETH-Hoenggerberg, CH-8093, Zurich (Switzerland)]. E-mail: matsko@iap.phys.ethz.ch

    2007-02-15

    We demonstrate that atomic force microscopy represents a powerful tool for assessing the structural preservation of biological samples embedded in epoxy resin, in terms of their macromolecular distribution and architecture. The comparison of atomic force microscopy (AFM) and transmission electron microscopy (TEM) images of a biosample (Caenorhabditis elegans) prepared following different types of freeze-substitution protocols (conventional OsO₄ fixation, epoxy fixation) led to the conclusion that high TEM stainability of the sample results from a low macromolecular density of the cellular matrix. We propose a novel procedure aimed at obtaining AFM and TEM images of the same particular organelle, which strongly facilitates AFM image interpretation and reveals new ultrastructural aspects (mainly protein arrangement) of a biosample in addition to TEM data.

  17. Effect of macromolecular crowding on the rate of diffusion-limited ...

    Indian Academy of Sciences (India)

    The enzymatic reaction rate has been shown to be affected by the presence of such macromolecules. A simple numerical model is proposed here based on percolation and diffusion in disordered systems to study the effect of macromolecular crowding on the enzymatic reaction rates. The model qualitatively explains some ...
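
    The percolation-and-diffusion idea can be illustrated with a toy Monte Carlo model: random walkers move on a lattice in which a fraction phi of sites is blocked by immobile crowders, so the mean squared displacement (and hence a diffusion-limited reaction rate) drops as crowding increases. This is an illustrative sketch, not the paper's specific model:

```python
import numpy as np

def crowded_msd(phi, steps=200, walkers=500, size=64, seed=0):
    """Mean squared displacement of random walkers on a square lattice with a
    fraction phi of sites blocked (periodic obstacle field); illustrative only."""
    rng = np.random.default_rng(seed)
    blocked = rng.random((size, size)) < phi
    blocked[0, 0] = False                              # keep the starting site free
    pos = np.zeros((walkers, 2), dtype=int)            # all walkers start at the origin
    moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    for _ in range(steps):
        trial = pos + moves[rng.integers(4, size=walkers)]
        ok = ~blocked[trial[:, 0] % size, trial[:, 1] % size]  # reject moves onto obstacles
        pos[ok] = trial[ok]
    return float(np.mean(np.sum(pos.astype(float) ** 2, axis=1)))
```

    At phi = 0 the MSD grows freely with the number of steps; near the site-percolation threshold of the square lattice (obstacle fraction around 0.4) diffusion is strongly suppressed, qualitatively reproducing the crowding effect on diffusion-limited rates.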

  18. New Paradigm for Macromolecular Crystallography Experiments at SSRL: Automated Crystal Screening And Remote Data Collection

    International Nuclear Information System (INIS)

    Soltis, S.M.; Cohen, A.E.; Deacon, A.; Eriksson, T.; Gonzalez, A.; McPhillips, S.; Chui, H.; Dunten, P.; Hollenbeck, M.; Mathews, I.; Miller, M.; Moorhead, P.; Phizackerley, R.P.; Smith, C.; Song, J.; van den Bedem, H.; Ellis, P.; Kuhn, P.; McPhillips, T.; Sauter, N.; Sharp, K.

    2009-01-01

    Complete automation of the macromolecular crystallography experiment has been achieved at Stanford Synchrotron Radiation Lightsource (SSRL) through the combination of robust mechanized experimental hardware and a flexible control system with an intuitive user interface. These highly reliable systems have enabled crystallography experiments to be carried out from the researchers' home institutions and other remote locations while retaining complete control over even the most challenging systems. A breakthrough component of the system, the Stanford Auto-Mounter (SAM), has enabled the efficient mounting of cryocooled samples without human intervention. Taking advantage of this automation, researchers have successfully screened more than 200 000 samples to select the crystals with the best diffraction quality for data collection as well as to determine optimal crystallization and cryocooling conditions. These systems, which have been deployed on all SSRL macromolecular crystallography beamlines and several beamlines worldwide, are used by more than 80 research groups in remote locations, establishing a new paradigm for macromolecular crystallography experimentation.

  19. In Vitro and In Vivo Evaluation of Microparticulate Drug Delivery Systems Composed of Macromolecular Prodrugs

    Directory of Open Access Journals (Sweden)

    Yoshiharu Machida

    2008-08-01

    Macromolecular prodrugs are very useful systems for achieving controlled drug release and drug targeting. In particular, various macromolecule-antitumor drug conjugates enhance effectiveness and reduce toxic side effects. Polymeric micro- and nanoparticles have also been actively examined and their in vivo behaviors elucidated, and it has been realized that their particle characteristics are very useful for controlling drug behavior. Recently, studies based on the combination of the concepts of macromolecular prodrugs and micro- or nanoparticles have been reported, although they are still limited. Macromolecular prodrugs enable drugs to be released at a controlled rate determined by the features of the macromolecule-drug linkage. Micro- and nanoparticles can control in vivo behavior through their size, surface charge and surface structure. Both sets of merits are expected in systems that combine the two concepts. In this review, several micro- or nanoparticles composed of macromolecule-drug conjugates are described in terms of their preparation, in vitro properties and/or in vivo behavior.

  20. Inferring Gear Damage from Oil-Debris and Vibration Data

    Science.gov (United States)

    Dempsey, Paula

    2006-01-01

    A system for real-time detection of surface-fatigue-pitting damage to gears for use in a helicopter transmission is based on fuzzy logic, which is used to fuse data from sensors that measure oil-borne debris (referred to as "oil debris" in the article) and vibration signatures. A system to detect helicopter-transmission gear damage is beneficial because the power train of a helicopter is essential for propulsion, lift, and maneuvering; hence, the integrity of the transmission is critical to helicopter safety. To enable detection of an impending transmission failure, an ideal diagnostic system should provide real-time monitoring of the "health" of the transmission, be capable of a high level of reliable detection (with minimization of false alarms), and provide human users with clear information on the health of the system without making it necessary for them to interpret large amounts of sensor data.

  1. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    Science.gov (United States)

    Donovan, Preston; Chehreghanianzabi, Yasaman; Rathinam, Muruhan; Zustiak, Silviya Petrova

    2016-01-01

    The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.
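
    For comparison, the simplest closed-form result of this type is the classical Maxwell approximation for diffusion around impermeable spherical obstacles, D_eff/D_0 = 2(1 - phi)/(2 + phi), which depends only on the obstacle volume fraction phi. A sketch (this textbook formula stands in for, and is not, the paper's homogenization result):

```python
def maxwell_effective_diffusivity(d0, phi):
    """Maxwell approximation for the effective diffusivity of a solute around
    impermeable spherical obstacles at volume fraction phi.
    A textbook first-order estimate, not the paper's homogenization formula."""
    if not 0.0 <= phi < 1.0:
        raise ValueError("volume fraction must lie in [0, 1)")
    return d0 * 2.0 * (1.0 - phi) / (2.0 + phi)
```

    At phi = 0 the free diffusivity is recovered, and D_eff decreases monotonically as obstacles fill more of the volume, the same qualitative behavior the homogenization framework quantifies.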

  3. PRIGo: a new multi-axis goniometer for macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Waltersperger, Sandro; Olieric, Vincent, E-mail: vincent.olieric@psi.ch; Pradervand, Claude [Paul Scherrer Institute, Villigen PSI (Switzerland); Glettig, Wayne [Centre Suisse d’Electronique et Microtechnique SA, Neuchâtel 2002 (Switzerland); Salathe, Marco; Fuchs, Martin R.; Curtin, Adrian; Wang, Xiaoqiang; Ebner, Simon; Panepucci, Ezequiel; Weinert, Tobias [Paul Scherrer Institute, Villigen PSI (Switzerland); Schulze-Briese, Clemens [Dectris Ltd, Baden 5400 (Switzerland); Wang, Meitian, E-mail: vincent.olieric@psi.ch [Paul Scherrer Institute, Villigen PSI (Switzerland)

    2015-05-09

    The design and performance of the new multi-axis goniometer PRIGo developed at the Swiss Light Source at Paul Scherrer Institute is described. The Parallel Robotics Inspired Goniometer (PRIGo) is a novel compact and high-precision goniometer providing an alternative to (mini-)kappa, traditional three-circle goniometers and Eulerian cradles used for sample reorientation in macromolecular crystallography. Based on a combination of serial and parallel kinematics, PRIGo emulates an arc. It is mounted on an air-bearing stage for rotation around ω and consists of four linear positioners working synchronously to achieve x, y, z translations and χ rotation (0–90°), followed by a ϕ stage (0–360°) for rotation around the sample holder axis. Owing to the use of piezo linear positioners and active correction, PRIGo features spheres of confusion of <1 µm, <7 µm and <10 µm for ω, χ and ϕ, respectively, and is therefore very well suited for micro-crystallography. PRIGo enables optimal strategies for both native and experimental phasing crystallographic data collection. Herein, PRIGo hardware and software, its calibration, as well as applications in macromolecular crystallography are described.

  4. Dendrimer-based Macromolecular MRI Contrast Agents: Characteristics and Application

    Directory of Open Access Journals (Sweden)

    Hisataka Kobayashi

    2003-01-01

    Numerous macromolecular MRI contrast agents, prepared employing relatively simple chemistry, are readily available and can provide sufficient enhancement for multiple applications. These agents operate at a ~100-fold lower concentration of gadolinium ions compared with the concentration of iodine necessary in CT imaging. Herein, we describe some of the general potential directions of macromolecular MRI contrast agents, using our recently reported families of dendrimer-based agents as examples. Changes in molecular size altered the route of excretion. Smaller contrast agents of less than 60 kDa molecular weight were excreted through the kidney, making these agents potentially suitable as functional renal contrast agents. Hydrophilic and larger-sized contrast agents were found better suited for use as blood pool contrast agents. Hydrophobic variants formed with polypropylenimine diaminobutane dendrimer cores created liver contrast agents. Larger hydrophilic agents are useful for lymphatic imaging. Finally, contrast agents conjugated with either monoclonal antibodies or with avidin are able to function as tumor-specific contrast agents, which might also be employed as therapeutic drugs for either gadolinium neutron capture therapy or in conjunction with radioimmunotherapy.

  5. Collagen macromolecular drug delivery systems

    International Nuclear Information System (INIS)

    Gilbert, D.L.

    1988-01-01

    The objective of this study was to examine collagen for use as a macromolecular drug delivery system by determining the mechanism of release through a matrix. Collagen membranes varying in porosity, crosslinking density, structure and crosslinker were fabricated. Collagen characterized by infrared spectroscopy and solution viscosity was determined to be pure and native. The collagen membranes were determined by electron microscopy to possess native vs. non-native quaternary structure and porous vs. dense aggregate morphologies. Collagen monolithic devices containing a model macromolecule (inulin) were fabricated. In vitro release rates were found to be linear with respect to t^(1/2) and were affected by crosslinking density, crosslinker and structure. The biodegradation of the collagen matrix was also examined. In vivo biocompatibility, degradation and ¹⁴C-inulin release rates were evaluated subcutaneously in rats
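
    Release that is linear in t^(1/2) is the classic signature of diffusion-controlled release from a matrix (Higuchi kinetics); the release constant can be extracted by a simple least-squares fit of cumulative release against the square root of time. An illustrative sketch, not the study's own analysis:

```python
import numpy as np

def higuchi_rate(t, released):
    """Least-squares slope k of cumulative release Q = k * sqrt(t),
    the Higuchi release constant for diffusion-controlled matrix devices."""
    root_t = np.sqrt(np.asarray(t, dtype=float))
    k, _ = np.polyfit(root_t, np.asarray(released, dtype=float), 1)
    return k
```

    A straight line on a Q-versus-sqrt(t) plot (constant k) indicates matrix-diffusion control; systematic curvature would point to degradation- or swelling-controlled release instead.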

  6. Thermodynamics of Macromolecular Association in Heterogeneous Crowding Environments: Theoretical and Simulation Studies with a Simplified Model.

    Science.gov (United States)

    Ando, Tadashi; Yu, Isseki; Feig, Michael; Sugita, Yuji

    2016-11-23

    The cytoplasm of a cell is crowded with many different kinds of macromolecules. The macromolecular crowding affects the thermodynamics and kinetics of biological reactions in a living cell, such as protein folding, association, and diffusion. Theoretical and simulation studies using simplified models focus on the essential features of the crowding effects and provide a basis for analyzing experimental data. In most of the previous studies on the crowding effects, a uniform crowder size is assumed, which is in contrast to the inhomogeneous size distribution of macromolecules in a living cell. Here, we evaluate the free energy changes upon macromolecular association in a cell-like inhomogeneous crowding system via a theory of hard-sphere fluids and free energy calculations using Brownian dynamics trajectories. The inhomogeneous crowding model based on 41 different types of macromolecules represented by spheres with different radii mimics the physiological concentrations of macromolecules in the cytoplasm of Mycoplasma genitalium. The free energy changes of macromolecular association evaluated by the theory and simulations were in good agreement with each other. The crowder size distribution affects both specific and nonspecific molecular associations, suggesting that not only the volume fraction but also the size distribution of macromolecules are important factors for evaluating in vivo crowding effects. This study relates in vitro experiments on macromolecular crowding to in vivo crowding effects by using the theory of hard-sphere fluids with crowder-size heterogeneity.
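
    The excluded-volume origin of such free energy changes can be illustrated with a crude Widom-style insertion estimate: the work (in units of kT) to insert a test sphere into a random distribution of crowders grows with sphere size, so a fused complex pays less than its two separate components. The function and parameters below are illustrative assumptions, not the hard-sphere-fluid theory or Brownian dynamics calculations of the paper:

```python
import numpy as np

def insertion_free_energy(radius, crowder_radius, phi, box=10.0, trials=20000, seed=1):
    """Widom-style estimate of the free energy (in kT) to insert a test sphere
    into a random (ideal, possibly overlapping) crowder distribution at
    volume fraction phi; a crude illustrative sketch of crowding penalties."""
    rng = np.random.default_rng(seed)
    n = int(phi * box ** 3 / (4.0 / 3.0 * np.pi * crowder_radius ** 3))
    crowders = rng.random((n, 3)) * box                 # random crowder centers
    points = rng.random((trials, 3)) * box              # random trial insertion points
    d = np.linalg.norm(points[:, None, :] - crowders[None, :, :], axis=2)
    p_ins = np.mean(np.all(d > radius + crowder_radius, axis=1))  # no overlap with any crowder
    return -np.log(p_ins)                               # beta * Delta G of insertion
```

    With these illustrative parameters, inserting a fused sphere of radius 2^(1/3) (the volume of two unit spheres combined) costs less free energy than inserting the two unit spheres separately, which is the thermodynamic driving force for crowding-induced association.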

  7. Variable effects of soman on macromolecular secretion by ferret trachea

    International Nuclear Information System (INIS)

    McBride, R.K.; Zwierzynski, D.J.; Stone, K.K.; Culp, D.J.; Marin, M.G.

    1991-01-01

    The purpose of this study was to examine the effect of the anticholinesterase agent, soman, on macromolecular secretion by ferret trachea in vitro. We mounted pieces of ferret trachea in Ussing-type chambers. Secreted sulfated macromolecules were radiolabeled by adding 500 µCi of ³⁵SO₄ to the submucosal medium and incubating for 17 hr. Soman added to the submucosal side produced a concentration-dependent increase in radiolabeled macromolecular release, with a maximal secretory response (mean ± SD) of 202 ± 125% (n = 8) relative to the basal secretion rate at a concentration of 10⁻⁷ M. The addition of either 10⁻⁶ M pralidoxime (an acetylcholinesterase reactivator) or 10⁻⁶ M atropine blocked the response to 10⁻⁷ M soman. At soman concentrations greater than 10⁻⁷ M, the secretion rate decreased and was not significantly different from basal secretion. Additional experiments utilizing acetylcholine and the acetylcholinesterase inhibitor physostigmine suggest that inhibition of secretion by high concentrations of soman may be due to a secondary antagonistic effect of soman on muscarinic receptors

  8. Data Management System at the Photon Factory Macromolecular Crystallography Beamline

    International Nuclear Information System (INIS)

    Yamada, Y; Matsugaki, N; Chavas, L M G; Hiraki, M; Igarashi, N; Wakatsuki, S

    2013-01-01

    Macromolecular crystallography is a very powerful tool to investigate three-dimensional structures of macromolecules at the atomic level, and is widely used by structural biology researchers. Due to recent upgrades of the macromolecular crystallography beamlines at the Photon Factory, beamline throughput has improved, allowing more experiments to be conducted during a user's beam time. As the number of beamlines has increased, so has the number of beam time applications. Consequently, both the experimental data from users' experiments and the data derived from beamline operations have increased dramatically, causing difficulties for the beamline operation staff and users in organizing these diverse and large amounts of data. To overcome this problem, we have developed a data management system by introducing commercial middleware, which consists of a controller, database, and web servers. We have prepared several database projects using this system. Each project is dedicated to a certain aspect such as experimental results, beam time applications, the beam time schedule, or beamline operation reports. We then designed a scheme to link all the database projects.

  9. Modeling the multi-scale mechanisms of macromolecular resource allocation

    DEFF Research Database (Denmark)

    Yang, Laurence; Yurkovich, James T; King, Zachary A

    2018-01-01

    As microbes face changing environments, they dynamically allocate macromolecular resources to produce a particular phenotypic state. Broad 'omics' data sets have revealed several interesting phenomena regarding how the proteome is allocated under differing conditions, but the functional consequences ... and detail how mathematical models have aided in our understanding of these processes. Ultimately, such modeling efforts have helped elucidate the principles of proteome allocation and hold promise for further discovery.

  10. Pi sampling: a methodical and flexible approach to initial macromolecular crystallization screening

    International Nuclear Information System (INIS)

    Gorrec, Fabrice; Palmer, Colin M.; Lebon, Guillaume; Warne, Tony

    2011-01-01

    Pi sampling, derived from the incomplete factorial approach, is an effort to maximize the diversity of macromolecular crystallization conditions and to facilitate the preparation of 96-condition initial screens. The Pi sampling method is derived from the incomplete factorial approach to macromolecular crystallization screen design. The resulting ‘Pi screens’ have a modular distribution of a given set of up to 36 stock solutions. Maximally diverse conditions can be produced by taking into account the properties of the chemicals used in the formulation and the concentrations of the corresponding solutions. The Pi sampling method has been implemented in a web-based application that generates screen formulations and recipes. It is particularly adapted to screens consisting of 96 different conditions. The flexibility and efficiency of Pi sampling is demonstrated by the crystallization of soluble proteins and of an integral membrane-protein sample

  11. A fuzzy logic-based damage identification method for simply-supported bridge using modal shape ratios

    Directory of Open Access Journals (Sweden)

    Hanbing Liu

    2012-08-01

    A fuzzy logic system (FLS) is established for damage identification of a simply supported bridge. A novel damage indicator is developed based on the ratios of mode shape components before and after damage. A numerical simulation of a simply supported bridge is presented to demonstrate the memory, inference and anti-noise ability of the proposed method. The bridge is divided into eight elements and nine nodes, and the damage indicator vector at characteristic nodes is used as the input measurement of the FLS. Results reveal that the FLS can detect damage of training patterns with an accuracy of 100%. For other test patterns, the FLS also possesses favorable inference ability; the identification accuracy for a single damage location is up to 93.75%. Tests with noise-simulated data show that the FLS possesses favorable anti-noise ability.
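
    The indicator and fuzzification steps can be sketched generically: form ratios of corresponding mode-shape components after versus before damage, normalize the vector, and map each component through fuzzy membership functions before applying the rule base. A minimal illustration (the paper's exact indicator definition and rule base may differ):

```python
import numpy as np

def damage_indicator(phi_before, phi_after):
    """Ratio of corresponding mode-shape components after vs. before damage,
    normalized to its largest magnitude; an illustrative form of the indicator."""
    r = np.asarray(phi_after, dtype=float) / np.asarray(phi_before, dtype=float)
    return r / np.max(np.abs(r))

def triangular_membership(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak at b,
    the usual building block for fuzzifying an FLS input."""
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)
```

    For an undamaged structure the indicator vector is uniform; local damage perturbs the components near the damaged element, and the fuzzy rule base maps that pattern to a damage location.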

  12. Functionalization of Planet-Satellite Nanostructures Revealed by Nanoscopic Localization of Distinct Macromolecular Species

    KAUST Repository

    Rossner, Christian; Roddatis, Vladimir; Lopatin, Sergei; Vana, Philipp

    2016-01-01

    The development of a straightforward method is reported to form hybrid polymer/gold planet-satellite nanostructures (PlSNs) with functional polymer. Polyacrylate type polymer with benzyl chloride in its backbone as a macromolecular tracer

  13. Thiomers for oral delivery of hydrophilic macromolecular drugs.

    Science.gov (United States)

    Bernkop-Schnürch, Andreas; Hoffer, Martin H; Kafedjiiski, Krum

    2004-11-01

    In recent years thiolated polymers (thiomers) have emerged as a promising new tool in oral drug delivery. Thiomers are obtained by the immobilisation of thiol-bearing ligands on mucoadhesive polymeric excipients. Through the formation of disulfide bonds with mucus glycoproteins, the mucoadhesive properties of thiomers are improved up to 130-fold compared with the corresponding unmodified polymers. Owing to the formation of inter- and intramolecular disulfide bonds within the thiomer itself, matrix tablets and particulate delivery systems show strong cohesive properties, resulting in comparatively higher stability, prolonged disintegration times and more controlled drug release. The permeation of hydrophilic macromolecular drugs through the gastrointestinal (GI) mucosa can be improved by the use of thiomers. Furthermore, some thiomers exhibit improved inhibitory properties towards GI peptidases. The efficacy of thiomers in oral drug delivery has been demonstrated by various in vivo studies. A pharmacological efficacy of 1%, for example, was achieved in rats by oral administration of calcitonin tablets comprising a thiomer. Furthermore, tablets comprising a thiomer and pegylated insulin resulted in a pharmacological efficacy of 7% after oral application to diabetic mice. Low-molecular-weight heparin embedded in thiolated polycarbophil led to an absolute bioavailability of ≥20% after oral administration to rats. In these studies, formulations comprising the corresponding unmodified polymer had only a marginal effect or none at all. These results indicate that drug carrier systems based on thiomers are a promising tool for oral delivery of hydrophilic macromolecular drugs.

  14. Auto- and cross-power spectral analysis of dual trap optical tweezer experiments using Bayesian inference.

    Science.gov (United States)

    von Hansen, Yann; Mehlich, Alexander; Pelz, Benjamin; Rief, Matthias; Netz, Roland R

    2012-09-01

    The thermal fluctuations of micron-sized beads in dual trap optical tweezer experiments contain complete dynamic information about the viscoelastic properties of the embedding medium and, if present, macromolecular constructs connecting the two beads. To quantitatively interpret the spectral properties of the measured signals, a detailed understanding of the instrumental characteristics is required. To this end, we present a theoretical description of the signal processing in a typical dual trap optical tweezer experiment accounting for polarization crosstalk and instrumental noise and discuss the effect of finite statistics. To infer the unknown parameters from experimental data, a maximum likelihood method based on the statistical properties of the stochastic signals is derived. In a first step, the method can be used for calibration purposes: we propose a scheme involving three consecutive measurements (both traps empty, first one occupied and second empty, and vice versa), by which all instrumental and physical parameters of the setup are determined. We test our approach for a simple model system, namely a pair of unconnected but hydrodynamically interacting spheres. The comparison to theoretical predictions based on instantaneous as well as retarded hydrodynamics emphasizes the importance of hydrodynamic retardation effects due to vorticity diffusion in the fluid. For more complex experimental scenarios, where macromolecular constructs are tethered between the two beads, the same maximum likelihood method in conjunction with dynamic deconvolution theory will in a second step allow one to determine the viscoelastic properties of the tethered element connecting the two beads.
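
    The spectral maximum-likelihood idea can be sketched for the simplest case, a single trapped bead with a Lorentzian power spectrum: each periodogram point scatters exponentially around the true spectrum S(f) = D / (pi^2 (fc^2 + f^2)), which gives a closed-form likelihood. The sketch below profiles out the amplitude analytically and fits only the corner frequency; it is a stripped-down illustration, not the paper's full model with crosstalk, instrumental noise and hydrodynamic retardation:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fit_lorentzian_psd(freqs, periodogram):
    """Maximum-likelihood fit of S(f) = D / (pi^2 (fc^2 + f^2)) to periodogram
    data, using the exponential distribution of periodogram values around S(f).
    Returns (D, fc); a minimal sketch of spectral ML parameter inference."""
    freqs = np.asarray(freqs, dtype=float)
    P = np.asarray(periodogram, dtype=float)

    def profile_nll(log_fc):
        fc = np.exp(log_fc)
        shape = 1.0 / (np.pi ** 2 * (fc ** 2 + freqs ** 2))  # S = D * shape
        D = np.mean(P / shape)          # ML amplitude for this fc, in closed form
        S = D * shape
        return np.sum(np.log(S) + P / S)  # negative log-likelihood (exponential errors)

    res = minimize_scalar(profile_nll, bounds=(np.log(freqs[0]), np.log(freqs[-1])),
                          method="bounded")
    fc = np.exp(res.x)
    D = np.mean(P * np.pi ** 2 * (fc ** 2 + freqs ** 2))
    return D, fc
```

    Because the likelihood of exponentially distributed periodogram values is exact, no binning or Gaussian approximation is needed; the dual-trap case extends the same idea to the 2x2 cross-spectral matrix.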

  15. ISPyB: an information management system for synchrotron macromolecular crystallography.

    Science.gov (United States)

    Delagenière, Solange; Brenchereau, Patrice; Launer, Ludovic; Ashton, Alun W; Leal, Ricardo; Veyrier, Stéphanie; Gabadinho, José; Gordon, Elspeth J; Jones, Samuel D; Levik, Karl Erik; McSweeney, Seán M; Monaco, Stéphanie; Nanao, Max; Spruce, Darren; Svensson, Olof; Walsh, Martin A; Leonard, Gordon A

    2011-11-15

    Individual research groups now analyze thousands of samples per year at synchrotron macromolecular crystallography (MX) resources. The efficient management of experimental data is thus essential if the best possible experiments are to be performed and the best possible data used in downstream processes in structure determination pipelines. The Information System for Protein crystallography Beamlines (ISPyB), a Laboratory Information Management System (LIMS) with an underlying data model allowing for the integration of analyses downstream of the data collection experiment, was developed to facilitate such data management. ISPyB is now a multisite, generic LIMS for synchrotron-based MX experiments. Its initial functionality has been enhanced to include improved sample tracking and reporting of experimental protocols, the direct ranking of the diffraction characteristics of individual samples and the archiving of raw data and results from ancillary experiments and post-experiment data processing protocols. This latter feature paves the way for ISPyB to play a central role in future macromolecular structure solution pipelines and validates the application of the approach used in ISPyB to other experimental techniques, such as biological solution Small Angle X-ray Scattering and spectroscopy, which have similar sample tracking and data handling requirements.

  16. Macromolecular systems for vaccine delivery.

    Science.gov (United States)

    MuŽíková, G; Laga, R

    2016-10-20

    Vaccines have helped considerably in eliminating some life-threatening infectious diseases over the past two hundred years. Recently, human medicine has focused on vaccination against some of the world's most common infectious diseases (AIDS, malaria, tuberculosis, etc.), and vaccination is also gaining popularity in the treatment of cancer and autoimmune diseases. The major limitation of current vaccines lies in their poor ability to generate a sufficient level of protective antibodies and T cell responses against diseases such as HIV, malaria, tuberculosis and cancer. Macromolecular carriers (water-soluble polymers, polymer particles, micelles, gels, etc.) conjugated with antigens and immunostimulatory molecules are among the promising vaccination systems that could improve the potency of weakly immunogenic vaccines. The size, architecture and composition of the high-molecular-weight carrier can significantly improve vaccine efficiency. This review covers the most recently developed (bio)polymer-based vaccines reported in the literature.

  17. Maintaining Genome Stability in Defiance of Mitotic DNA Damage

    Science.gov (United States)

    Ferrari, Stefano; Gentili, Christian

    2016-01-01

    The implementation of decisions affecting cell viability and proliferation is based on prompt detection of the issue to be addressed, formulation and transmission of a correct set of instructions and fidelity in the execution of orders. While the first and the last are purely mechanical processes relying on the faithful functioning of single proteins or macromolecular complexes (sensors and effectors), information is the real cue, with signal amplitude, duration, and frequency ultimately determining the type of response. The cellular response to DNA damage is no exception to the rule. In this review article we focus on DNA damage responses in G2 and mitosis. First, we set the stage by describing mitosis and the machineries in charge of assembling the apparatus responsible for chromosome alignment and segregation, as well as the inputs that control its function (checkpoints). Next, we examine the type of issues that a cell approaching mitosis might face, presenting the impact of post-translational modifications (PTMs) on the correct and timely functioning of pathways correcting errors or damage before chromosome segregation. We conclude this essay with a perspective on the current status of mitotic signaling pathway inhibitors and their potential use in cancer therapy. PMID:27493659

  18. The contrasting effect of macromolecular crowding on amyloid fibril formation.

    Directory of Open Access Journals (Sweden)

    Qian Ma

    Amyloid fibrils associated with neurodegenerative diseases can be considered biologically relevant failures of cellular quality control mechanisms. It is known that in vivo human Tau protein, human prion protein, and human copper, zinc superoxide dismutase (SOD1) have the tendency to form fibril deposits in a variety of tissues and they are associated with different neurodegenerative diseases, while rabbit prion protein and hen egg white lysozyme do not readily form fibrils and are unlikely to cause neurodegenerative diseases. In this study, we have investigated the contrasting effect of macromolecular crowding on fibril formation of different proteins. As revealed by assays based on thioflavin T binding and turbidity, human Tau fragments, when phosphorylated by glycogen synthase kinase-3β, do not form filaments in the absence of a crowding agent but do form fibrils in the presence of a crowding agent, and the presence of a strong crowding agent dramatically promotes amyloid fibril formation of human prion protein and its two pathogenic mutants E196K and D178N. Such an enhancing effect of macromolecular crowding on fibril formation is also observed for a pathological human SOD1 mutant A4V. On the other hand, rabbit prion protein and hen lysozyme do not form amyloid fibrils when a crowding agent at 300 g/l is used but do form fibrils in the absence of a crowding agent. Furthermore, aggregation of these two proteins is remarkably inhibited by Ficoll 70 and dextran 70 at 200 g/l. We suggest that proteins associated with neurodegenerative diseases are more likely to form amyloid fibrils under crowded conditions than in dilute solutions. By contrast, some of the proteins that are not neurodegenerative disease-associated are unlikely to misfold in crowded physiological environments. A possible explanation for the contrasting effect of macromolecular crowding on these two sets of proteins (amyloidogenic proteins and non-amyloidogenic proteins) has been

  19. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
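    As a toy illustration of the Bayesian estimation step (not the authors' DRAM sampler, surrogate model, or weighted likelihood), a plain random-walk Metropolis sampler can recover a single hypothetical damage parameter from noisy measurements; all numbers and names below are invented for the sketch:

```python
import numpy as np

def metropolis(log_post, x0, n_steps=5000, step=0.1, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional posterior."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + step * rng.standard_normal()   # propose a nearby state
        lpp = log_post(xp)
        if np.log(rng.random()) < lpp - lp:     # Metropolis accept/reject
            x, lp = xp, lpp
        samples.append(x)
    return np.array(samples)

# Toy "damage size" inference: data = true_size + Gaussian noise,
# flat prior on [0, 10], Gaussian likelihood with known noise level.
true_size = 2.5
rng = np.random.default_rng(1)
data = true_size + 0.2 * rng.standard_normal(50)

def log_post(s):
    if not 0.0 <= s <= 10.0:                    # flat prior support
        return -np.inf
    return -0.5 * np.sum((data - s) ** 2) / 0.2 ** 2

samples = metropolis(log_post, x0=5.0)
estimate = samples[2000:].mean()                # discard burn-in
```

The posterior samples carry the uncertainty quantification directly: their spread is the credible interval on the damage parameter, which is what the MCMC machinery in the abstract provides for the full location/size/orientation problem.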

  20. Polydisulfide Manganese(II) Complexes as Non-Gadolinium Biodegradable Macromolecular MRI Contrast Agents

    Science.gov (United States)

    Ye, Zhen; Jeong, Eun-Kee; Wu, Xueming; Tan, Mingqian; Yin, Shouyu; Lu, Zheng-Rong

    2011-01-01

    Purpose To develop safe and effective manganese(II)-based biodegradable macromolecular MRI contrast agents. Materials and Methods In this study, we synthesized and characterized two polydisulfide manganese(II) complexes, Mn-DTPA cystamine copolymers and Mn-EDTA cystamine copolymers, as new biodegradable macromolecular MRI contrast agents. The contrast enhancement of the two manganese-based contrast agents was evaluated in mice bearing MDA-MB-231 human breast carcinoma xenografts, in comparison with MnCl2. Results The T1 and T2 relaxivities were 4.74 and 10.38 mM−1s−1 per manganese at 3T for Mn-DTPA cystamine copolymers (Mn = 30.50 kDa) and 6.41 and 9.72 mM−1s−1 for Mn-EDTA cystamine copolymers (Mn = 61.80 kDa). Both polydisulfide Mn(II) complexes showed significant liver, myocardium and tumor enhancement. Conclusion The manganese-based polydisulfide contrast agents have the potential to be developed as alternative non-gadolinium contrast agents for MR cancer and myocardium imaging. PMID:22031457
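    Relaxivity, as reported above, is by definition the slope of the relaxation rate 1/T1 (or 1/T2) against contrast-agent concentration. A hedged sketch of that linear fit on hypothetical calibration data (the concentrations and T1 values are invented for illustration, not measurements from the paper):

```python
import numpy as np

def relaxivity(conc_mM, T1_s):
    """Fit r1 (mM^-1 s^-1) as the slope of 1/T1 versus concentration."""
    rates = 1.0 / np.asarray(T1_s, dtype=float)          # relaxation rates R1 = 1/T1
    slope, _intercept = np.polyfit(np.asarray(conc_mM, dtype=float), rates, 1)
    return slope                                          # intercept is the diamagnetic baseline

# Synthetic phantom series consistent with r1 = 4.74 mM^-1 s^-1
# and a baseline rate of 0.3 s^-1 (illustrative values).
conc = [0.0, 0.5, 1.0, 2.0]
T1 = [1.0 / (0.3 + 4.74 * c) for c in conc]
r1 = relaxivity(conc, T1)
```

The same fit on the T2 series gives r2; comparing the fitted slopes is how agents such as the two copolymers above are ranked.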

  1. In-vacuum long-wavelength macromolecular crystallography.

    Science.gov (United States)

    Wagner, Armin; Duman, Ramona; Henderson, Keith; Mykhaylyk, Vitaliy

    2016-03-01

    Structure solution based on the weak anomalous signal from native (protein and DNA) crystals is increasingly being attempted as part of synchrotron experiments. Maximizing the measurable anomalous signal by collecting diffraction data at longer wavelengths presents a series of technical challenges caused by the increased absorption of X-rays and larger diffraction angles. A new beamline at Diamond Light Source has been built specifically for collecting data at wavelengths beyond the capability of other synchrotron macromolecular crystallography beamlines. Here, the theoretical considerations in support of the long-wavelength beamline are outlined and the in-vacuum design of the endstation is discussed, as well as other hardware features aimed at enhancing the accuracy of the diffraction data. The first commissioning results, representing the first in-vacuum protein structure solution, demonstrate the promising potential of the beamline.

  2. Workshop on algorithms for macromolecular modeling. Final project report, June 1, 1994--May 31, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Leimkuhler, B.; Hermans, J.; Skeel, R.D.

    1995-07-01

    A workshop was held on algorithms and parallel implementations for macromolecular dynamics, protein folding, and structural refinement. This document contains abstracts and brief reports from that workshop.

  3. MR lymphography with macromolecular Gd-DTPA compounds

    International Nuclear Information System (INIS)

    Hamm, B.; Wagner, S.; Branding, G.; Taupitz, M.; Wolf, K.J.

    1990-01-01

    This paper investigates the suitability of macromolecular Gd-DTPA compounds as signal-enhancing lymphographic agents in MR imaging. Two Gd-DTPA polylysine compounds and Gd-DTPA albumin, with molecular weights of 48,000, 170,000, and 87,000 daltons, respectively, were tested in rabbits at gadolinium doses of 5 and 15 μmol per animal. Three animals were examined at each dose with T1-weighted sequences. The iliac lymph nodes were imaged prior to and during unilateral endolymphatic infusion into a femoral lymph vessel as well as over a period of 2 hours thereafter. All contrast media showed a homogeneous and pronounced signal enhancement in the lymph nodes during infusion at both doses.

  4. Aging changes of macromolecular synthesis in the mitochondria of mouse hepatocytes as revealed by microscopic radioautography

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, Tetsuji [Shinshu University, Matsumoto (Japan). Dept. of Anatomy and Cell Biology

    2007-07-01

    This mini-review reports aging changes in macromolecular synthesis in the mitochondria of mouse hepatocytes. We have observed macromolecular synthesis (DNA, RNA and proteins) in the mitochondria of various mammalian cells by means of the electron microscopic radioautography technique developed in our laboratory. The number of mitochondria per cell and the number of mitochondria labeled with 3H-thymidine, 3H-uridine and 3H-leucine (precursors for DNA, RNA and proteins, respectively) were counted, and the labeling indices at various ages, from fetal and early postnatal days through several months to 1 and 2 years in senescence, were calculated, revealing variations due to aging. (author)

  5. Damage effects and mechanisms of proton irradiation on methyl silicone rubber

    International Nuclear Information System (INIS)

    Zhang, L.X.; He, Sh.Y.; Xu, Zh.; Wei, Q.

    2004-01-01

    A study was performed on the damage effects and mechanisms of 150 keV proton irradiation of space-grade methyl silicone rubber. Changes in surface morphology, mechanical properties, infrared attenuated total reflection (ATR) spectra, mass spectra and pyrolysis gas chromatography-mass spectrometry (PY-GC-MS) indicated that, at lower fluences, proton radiation induces a cross-linking effect, increasing the tensile strength and hardness of the methyl silicone rubber. At higher proton fluences, however, radiation-induced degradation, which decreases tensile strength and hardness, becomes the dominant effect. A macromolecular-network destruction model for silicone rubber irradiated with protons was proposed.

  6. Plasma membrane damage detected by nucleic acid leakage

    International Nuclear Information System (INIS)

    Fortunati, E.; Bianchi, V.

    1989-01-01

    Among the indicators of membrane damage, the leakage of intracellular components into the medium is the most directly related to the perturbations of the membrane molecular organization. The extent of the damage can be evaluated from the size of the released components. We have designed a protocol for the detection of membrane leakage based on the preincubation of cells with tritiated adenine for 24 h, followed by a 24-h chase in nonradioactive medium. The treatment takes place when the distribution of the precursor among its end products has reached the plateau, and thus the differences of radioactivity in the fractions obtained from the control and treated cultures (medium, nucleotide pool, RNA, DNA) correspond to actual quantitative variations induced by the test chemical. Aliquots of the medium are processed to determine which percentage of the released material is macromolecular, in order to distinguish between mild and severe membrane damage. The origin of the extracellular radioactivity can be recognized from the variations of RNA counts in the treated cells. DNA radioactivity is used to evaluate the number of cells that remain attached to the plates in the different conditions of treatment. By this means, generalized permeabilization of membranes to macromolecules is distinguished from complete solubilization of only a subpopulation of cells. We present some examples of application of the protocol with detergents (LAS, SDS, Triton X-100) and with Cr(VI), which damages cell membranes by a different mechanism of action

  7. Geometry of the Nojima fault at Nojima-Hirabayashi, Japan - I. A simple damage structure inferred from borehole core permeability

    Science.gov (United States)

    Lockner, David A.; Tanaka, Hidemi; Ito, Hisao; Ikeda, Ryuji; Omura, Kentaro; Naka, Hisanobu

    2009-01-01

    The 1995 Kobe (Hyogo-ken Nanbu) earthquake, M = 7.2, ruptured the Nojima fault in southwest Japan. We have studied core samples taken from two scientific drillholes that crossed the fault zone SW of the epicentral region on Awaji Island. The shallower hole, drilled by the Geological Survey of Japan (GSJ), was started 75 m to the SE of the surface trace of the Nojima fault and crossed the fault at a depth of 624 m. A deeper hole, drilled by the National Research Institute for Earth Science and Disaster Prevention (NIED) was started 302 m to the SE of the fault and crossed fault strands below a depth of 1140 m. We have measured strength and matrix permeability of core samples taken from these two drillholes. We find a strong correlation between permeability and proximity to the fault zone shear axes. The half-width of the high permeability zone (approximately 15 to 25 m) is in good agreement with the fault zone width inferred from trapped seismic wave analysis and other evidence. The fault zone core or shear axis contains clays with permeabilities of approximately 0.1 to 1 microdarcy at 50 MPa effective confining pressure (10 to 30 microdarcy at in situ pressures). Within a few meters of the fault zone core, the rock is highly fractured but has sustained little net shear. Matrix permeability of this zone is approximately 30 to 60 microdarcy at 50 MPa effective confining pressure (300 to 1000 microdarcy at in situ pressures). Outside this damage zone, matrix permeability drops below 0.01 microdarcy. The clay-rich core material has the lowest strength with a coefficient of friction of approximately 0.55. Shear strength increases with distance from the shear axis. These permeability and strength observations reveal a simple fault zone structure with a relatively weak fine-grained core surrounded by a damage zone of fractured rock. In this case, the damage zone will act as a high-permeability conduit for vertical and horizontal flow in the plane of the

  8. Detection and cellular localisation of the synthetic soluble macromolecular drug carrier pHPMA

    Czech Academy of Sciences Publication Activity Database

    Kissel, M.; Peschke, P.; Šubr, Vladimír; Ulbrich, Karel; Strunz, A. M.; Kühnlein, R.; Debus, J.; Friedrich, E.

    2002-01-01

    Roč. 29, č. 8 (2002), s. 1055-1062 ISSN 1619-7070 R&D Projects: GA ČR GV307/96/K226 Institutional research plan: CEZ:AV0Z4050913 Keywords : EPR effect * Radiolabelled macromolecules * Pharmacokinetic Subject RIV: CD - Macromolecular Chemistry Impact factor: 3.568, year: 2002

  9. UQlust: combining profile hashing with linear-time ranking for efficient clustering and analysis of big macromolecular data.

    Science.gov (United States)

    Adamczak, Rafal; Meller, Jarek

    2016-12-28

    Advances in computing have enabled current protein and RNA structure prediction and molecular simulation methods to dramatically increase their sampling of conformational spaces. The quickly growing number of experimentally resolved structures, and databases such as the Protein Data Bank, also implies large-scale structural similarity analyses to retrieve and classify macromolecular data. Consequently, the computational cost of structure comparison and clustering for large sets of macromolecular structures has become a bottleneck that necessitates further algorithmic improvements and development of efficient software solutions. uQlust is a versatile and easy-to-use tool for ultrafast ranking and clustering of macromolecular structures. uQlust makes use of structural profiles of proteins and nucleic acids, while combining a linear-time algorithm for implicit comparison of all pairs of models with profile hashing to enable efficient clustering of large data sets with a low memory footprint. In addition to ranking and clustering of large sets of models of the same protein or RNA molecule, uQlust can also be used in conjunction with fragment-based profiles in order to cluster structures of arbitrary length. For example, hierarchical clustering of the entire PDB using profile hashing can be performed on a typical laptop, thus opening an avenue for structural explorations previously limited to dedicated resources. The uQlust package is freely available under the GNU General Public License at https://github.com/uQlust . uQlust represents a drastic reduction in the computational complexity and memory requirements with respect to existing clustering and model quality assessment methods for macromolecular structure analysis, while yielding results on par with traditional approaches for both proteins and RNAs.
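    The core idea of profile hashing, grouping models by a hash of a discretized structural profile so that clustering needs only one linear pass rather than all-pairs comparison, can be sketched as follows. This is a deliberately simplified toy (single-character per-residue states, exact-match keys); uQlust's actual profiles and hashing are more elaborate:

```python
from collections import defaultdict

def cluster_by_profile(models):
    """Group models whose per-residue state profiles are identical.

    One dict insertion per model, so a pass over N models is O(N);
    no pairwise structural comparison is ever performed.
    """
    clusters = defaultdict(list)
    for name, profile in models.items():
        key = "".join(profile)            # the profile string acts as the hash key
        clusters[key].append(name)
    return {key: sorted(names) for key, names in clusters.items()}

# Toy models described by per-residue secondary-structure states (H/E/C).
models = {"m1": "HHEEC", "m2": "HHEEC", "m3": "CCEEH"}
clusters = cluster_by_profile(models)
```

Because each model touches only its own hash bucket, memory scales with the number of distinct profiles rather than the number of model pairs, which is the property that makes whole-PDB clustering feasible on modest hardware.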

  10. Predictive Mechanical Characterization of Macro-Molecular Material Chemistry Structures of Cement Paste at Nano Scale - Two-phase Macro-Molecular Structures of Calcium Silicate Hydrate, Tri-Calcium Silicate, Di-Calcium Silicate and Calcium Hydroxide

    Science.gov (United States)

    Padilla Espinosa, Ingrid Marcela

    Concrete is a hierarchical composite material with a random structure over a wide range of length scales. At the submicron length scale the main component of concrete is cement paste, formed by the reaction of Portland cement clinkers and water. Cement paste acts as a binding matrix for the other components and is responsible for the strength of concrete. The cement paste microstructure contains voids and hydrated and unhydrated cement phases. The main crystalline phases of unhydrated cement are tri-calcium silicate (C3S) and di-calcium silicate (C2S), and of hydrated cement are calcium silicate hydrate (CSH) and calcium hydroxide (CH). Although efforts have been made to comprehend the chemical and physical nature of cement paste, studies at the molecular level have primarily focused on individual components. The present research develops a method to model and analyze, at the molecular level, two-phase combinations of the hydrated and unhydrated phases of cement paste as macromolecular systems. Computational molecular modeling could help in understanding the influence of phase interactions on the material properties and mechanical performance of cement paste. The present work also strives to create a framework for molecular-level models suitable for better comparison with small-length-scale experimental methods, in which the sample sizes involve mixtures of different hydrated and unhydrated crystalline phases of cement paste. Two approaches based on two-phase cement paste macromolecular structures are investigated: one involving admixed molecular phases and a second involving a cluster of two molecular phases. The mechanical properties of two-phase macromolecular systems of cement paste consisting of the key hydrated phase CSH and the unhydrated phases C3S or C2S, as well as CSH with the second hydrated phase CH, were calculated. It was found that these cement paste two-phase macromolecular systems predicted an isotropic material behavior. 

  11. Coevolutionary constraints in the sequence-space of macromolecular complexes reflect their self-assembly pathways.

    Science.gov (United States)

    Mallik, Saurav; Kundu, Sudip

    2017-07-01

    Is the order in which biomolecular subunits self-assemble into functional macromolecular complexes imprinted in their sequence-space? Here, we demonstrate that the temporal order of macromolecular complex self-assembly can be efficiently captured using the landscape of residue-level coevolutionary constraints. This predictive power of coevolutionary constraints is irrespective of the structural, functional, and phylogenetic classification of the complex and of the stoichiometry and quaternary arrangement of the constituent monomers. Combining this result with a number of structural attributes estimated from the crystal structure data, we find indications that stronger coevolutionary constraints at interfaces formed early in the assembly hierarchy probably promote coordinated fixation of mutations that leads to high-affinity binding with higher surface area, increased surface complementarity and elevated number of molecular contacts, compared to those that form late in the assembly. Proteins 2017; 85:1183-1189. © 2017 Wiley Periodicals, Inc.

  12. Interplay between the bacterial nucleoid protein H-NS and macromolecular crowding in compacting DNA

    NARCIS (Netherlands)

    Wintraecken, C.H.J.M.

    2012-01-01

    In this dissertation we discuss H-NS and its connection to nucleoid compaction and organization. Nucleoid formation involves a dramatic reduction in coil volume of the genomic DNA. Four factors are thought to influence coil volume: supercoiling, DNA charge neutralization, macromolecular

  13. Development of an online UV-visible microspectrophotometer for a macromolecular crystallography beamline.

    Science.gov (United States)

    Shimizu, Nobutaka; Shimizu, Tetsuya; Baba, Seiki; Hasegawa, Kazuya; Yamamoto, Masaki; Kumasaka, Takashi

    2013-11-01

    Measurement of the UV-visible absorption spectrum is a convenient technique for detecting chemical changes of proteins, and it is therefore useful to combine spectroscopy and diffraction studies. An online microspectrophotometer for the UV-visible region was developed and installed on the macromolecular crystallography beamline, BL38B1, at SPring-8. This spectrophotometer is equipped with a difference dispersive double monochromator, a mercury-xenon lamp as the light source, and a photomultiplier as the detector. The optical path is mostly constructed using mirrors, in order to obtain high brightness in the UV region, and the confocal optics are assembled using a cross-slit diaphragm like an iris to eliminate stray light. This system can measure optical densities up to a maximum of 4.0. To study the effect of radiation damage, preliminary measurements of glucose isomerase and thaumatin crystals were conducted in the UV region. Spectral changes dependent on X-ray dose were observed at around 280 nm, suggesting that structural changes involving Trp or Tyr residues occurred in the protein crystal. In the case of the thaumatin crystal, a broad peak around 400 nm was also generated after X-ray irradiation, suggesting the cleavage of a disulfide bond. Dose-dependent spectral changes were also observed in cryo-solutions alone, and these changes differed with the composition of the cryo-solution. These responses in the UV region are informative regarding the state of the sample; consequently, this device might be useful for X-ray crystallography.

  14. Bringing macromolecular machinery to life using 3D animation.

    Science.gov (United States)

    Iwasa, Janet H

    2015-04-01

    Over the past decade, there has been a rapid rise in the use of three-dimensional (3D) animation to depict molecular and cellular processes. Much of the growth in molecular animation has been in the educational arena, but increasingly, 3D animation software is finding its way into research laboratories. In this review, I will discuss a number of ways in which 3D animation software can play a valuable role in visualizing and communicating macromolecular structures and dynamics. I will also consider the challenges of using animation tools within the research sphere. Copyright © 2015. Published by Elsevier Ltd.

  15. Macromolecular and dendrimer-based magnetic resonance contrast agents

    Energy Technology Data Exchange (ETDEWEB)

    Bumb, Ambika; Brechbiel, Martin W. (Radiation Oncology Branch, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States)), e-mail: pchoyke@mail.nih.gov; Choyke, Peter (Molecular Imaging Program, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States))

    2010-09-15

    Magnetic resonance imaging (MRI) is a powerful imaging modality that can provide an assessment of function or molecular expression in tandem with anatomic detail. Over the last 20-25 years, a number of gadolinium-based MR contrast agents have been developed to enhance signal by altering proton relaxation properties. This review explores a range of these agents from small molecule chelates, such as Gd-DTPA and Gd-DOTA, to macromolecular structures composed of albumin, polylysine, polysaccharides (dextran, inulin, starch), poly(ethylene glycol), copolymers of cystamine and cystine with Gd-DTPA, and various dendritic structures based on polyamidoamine and polylysine (Gadomers). The synthesis, structure, biodistribution, and targeting of dendrimer-based MR contrast agents are also discussed.

  16. Human brain lesion-deficit inference remapped.

    Science.gov (United States)

    Mah, Yee-Haur; Husain, Masud; Rees, Geraint; Nachev, Parashkev

    2014-09-01

    Our knowledge of the anatomical organization of the human brain in health and disease draws heavily on the study of patients with focal brain lesions. Historically the first method of mapping brain function, it is still potentially the most powerful, establishing the necessity of any putative neural substrate for a given function or deficit. Great inferential power, however, carries a crucial vulnerability: without stronger alternatives any consistent error cannot be easily detected. A hitherto unexamined source of such error is the structure of the high-dimensional distribution of patterns of focal damage, especially in ischaemic injury (the commonest aetiology in lesion-deficit studies), where the anatomy is naturally shaped by the architecture of the vascular tree. This distribution is so complex that analysis of lesion data sets of conventional size cannot illuminate its structure, leaving us in the dark about the presence or absence of such error. To examine this crucial question we assembled the largest known set of focal brain lesions (n = 581), derived from unselected patients with acute ischaemic injury (mean age = 62.3 years, standard deviation = 17.8, male:female ratio = 0.547), visualized with diffusion-weighted magnetic resonance imaging, and processed with validated automated lesion segmentation routines. High-dimensional analysis of this data revealed a hidden bias within the multivariate patterns of damage that will consistently distort lesion-deficit maps, displacing inferred critical regions from their true locations, in a manner opaque to replication. Quantifying the size of this mislocalization demonstrates that past lesion-deficit relationships estimated with conventional inferential methodology are likely to be significantly displaced, by a magnitude dependent on the unknown underlying lesion-deficit relationship itself. Past studies therefore cannot be retrospectively corrected, except by new knowledge that would render them redundant.

  17. Recent Major Improvements to the ALS Sector 5 Macromolecular Crystallography Beamlines

    International Nuclear Information System (INIS)

    Morton, Simon A.; Glossinger, James; Smith-Baumann, Alexis; McKean, John P.; Trame, Christine; Dickert, Jeff; Rozales, Anthony; Dauz, Azer; Taylor, John; Zwart, Petrus; Duarte, Robert; Padmore, Howard; McDermott, Gerry; Adams, Paul

    2007-01-01

    Although the Advanced Light Source (ALS) was initially conceived primarily as a low-energy (1.9 GeV) third-generation source of VUV and soft X-ray radiation, it was realized very early in the development of the facility that a multipole wiggler source coupled with high-quality (brightness-preserving) optics would result in a beamline whose performance across the optimal energy range (5-15 keV) for macromolecular crystallography (MX) would be comparable to, or even exceed, that of many existing crystallography beamlines at higher-energy facilities. Hence, starting in 1996, a suite of three beamlines, branching off a single wiggler source, was constructed, which together formed the ALS Macromolecular Crystallography Facility. From the outset this facility was designed to cater equally to the needs of both academic and industrial users, with a heavy emphasis placed on the development and introduction of high-throughput crystallographic tools, techniques, and facilities, such as large-area CCD detectors, robotic sample handling and automounting facilities, a service crystallography program, and a tightly integrated, centralized, and highly automated beamline control environment for users. This facility was immediately successful, with the primary Multiwavelength Anomalous Diffraction beamline (5.0.2) in particular rapidly becoming one of the foremost crystallographic facilities in the US, responsible for structures such as the 70S ribosome. This success in turn triggered enormous growth of the ALS macromolecular crystallography community and spurred the development of five additional ALS MX beamlines, all utilizing the newly developed superconducting bending magnets ('superbends') as sources. However, in the years since the original Sector 5.0 beamlines were built, the performance demands of macromolecular crystallography users have become ever more exacting, with growing emphasis placed on studying larger complexes, more difficult structures, weakly diffracting or smaller

  18. Macromolecularly crowded in vitro microenvironments accelerate the production of extracellular matrix-rich supramolecular assemblies.

    Science.gov (United States)

    Kumar, Pramod; Satyam, Abhigyan; Fan, Xingliang; Collin, Estelle; Rochev, Yury; Rodriguez, Brian J; Gorelov, Alexander; Dillon, Simon; Joshi, Lokesh; Raghunath, Michael; Pandit, Abhay; Zeugolis, Dimitrios I

    2015-03-04

    Therapeutic strategies based on the principles of tissue engineering by self-assembly put forward the notion that functional regeneration can be achieved by utilising the inherent capacity of cells to create highly sophisticated supramolecular assemblies. However, in dilute ex vivo microenvironments, prolonged culture time is required to develop an extracellular matrix-rich implantable device. Herein, we assessed the influence of macromolecular crowding, a biophysical phenomenon that regulates intra- and extra-cellular activities in multicellular organisms, in human corneal fibroblast culture. In the presence of macromolecules, abundant extracellular matrix deposition was evidenced as fast as 48 h in culture, even at low serum concentration. Temperature responsive copolymers allowed the detachment of dense and cohesive supramolecularly assembled living substitutes within 6 days in culture. Morphological, histological, gene and protein analysis assays demonstrated maintenance of tissue-specific function. Macromolecular crowding opens new avenues for a more rational design in engineering of clinically relevant tissue modules in vitro.

  19. Long-wavelength macromolecular crystallography - First successful native SAD experiment close to the sulfur edge

    Science.gov (United States)

    Aurelius, O.; Duman, R.; El Omari, K.; Mykhaylyk, V.; Wagner, A.

    2017-11-01

    Phasing of novel macromolecular crystal structures has been a challenge since the start of structural biology. Making use of the anomalous diffraction of natively present elements, such as sulfur and phosphorus, has been possible for some systems, but is hindered by the necessity of accessing longer X-ray wavelengths in order to make the most of the anomalous scattering contributions of these elements. Presented here are the results of the first successful experimental phasing study of a macromolecular crystal structure at a wavelength close to the sulfur K edge, made possible by the in-vacuum, long-wavelength-optimised experimental setup of the I23 beamline at Diamond Light Source. In these early commissioning experiments only standard data collection and processing procedures were applied; in particular, no dedicated absorption correction was used. Nevertheless, the success of the experiment demonstrates that the capability to extract phase information can be improved even further once data collection protocols and data processing have been optimised.

  20. The In-Situ One-Step Synthesis of a PDC Macromolecular Pro-Drug and the Fabrication of a Novel Core-Shell Micelle.

    Science.gov (United States)

    Yu, Cui-Yun; Yang, Sa; Li, Zhi-Ping; Huang, Can; Ning, Qian; Huang, Wen; Yang, Wen-Tong; He, Dongxiu; Sun, Lichun

    2016-01-01

    The development of a slow-release nano-sized carrier for efficient antineoplastic drug delivery, based on a biocompatible and biodegradable pectin-based macromolecular pro-drug for tumor therapy, is reported in this study. Pectin-doxorubicin conjugates (PDC), a macromolecular pro-drug, were prepared via an amide condensation reaction, and a novel amphiphilic core-shell micelle based on the PDC macromolecular pro-drug (PDC-M) was self-assembled in situ, with pectin as the hydrophilic shell and doxorubicin (DOX) as the hydrophobic core. The chemical structure of the PDC macromolecular pro-drug was confirmed by both Fourier transform infrared spectroscopy (FTIR) and nuclear magnetic resonance spectroscopy (¹H-NMR), which proved that the doxorubicin combined well with the pectin to form the macromolecular pro-drug. The PDC-M were observed by scanning electron microscopy (SEM) to have an irregularly spherical shape and to be uniform in size. The average particle size of PDC-M, further measured by a Zetasizer nanoparticle analyzer (Nano ZS, Malvern Instruments), was about 140 nm. The encapsulation efficiency and drug loading were 57.82% ± 3.7% (n = 3) and 23.852% ± 2.3% (n = 3), respectively. The in vitro drug release behaviors of the resulting PDC-M were studied in a simulated tumor environment (pH 5.0), blood (pH 7.4) and a lysosome medium (pH 6.8), and showed a prolonged slow-release profile. Assays for antiproliferative effects and flow cytometry of the resulting PDC-M in HepG2 cell lines demonstrated greater delayed- and slow-release properties as compared to free DOX. A cell viability study against endothelial cells further revealed that the resulting PDC-M possesses excellent cell compatibility and low cytotoxicity in comparison with free DOX. Hemolysis activity was investigated in rabbits, and the results also demonstrated that the PDC-M has greater compatibility in comparison with free DOX. This shows that the resulting PDC-M can ameliorate the

  1. MX1: a bending-magnet crystallography beamline serving both chemical and macromolecular crystallography communities at the Australian Synchrotron

    International Nuclear Information System (INIS)

    Cowieson, Nathan Philip; Aragao, David; Clift, Mark; Ericsson, Daniel J.; Gee, Christine; Harrop, Stephen J.; Mudie, Nathan; Panjikar, Santosh; Price, Jason R.; Riboldi-Tunnicliffe, Alan; Williamson, Rachel; Caradoc-Davies, Tom

    2015-01-01

    The macromolecular crystallography beamline MX1 at the Australian Synchrotron is described. MX1 is a bending-magnet crystallography beamline at the 3 GeV Australian Synchrotron. The beamline delivers hard X-rays in the energy range from 8 to 18 keV to a focal spot at the sample position of 120 µm FWHM. The beamline endstation and ancillary equipment facilitate local and remote access for both chemical and biological macromolecular crystallography. Here, the design of the beamline and endstation is discussed. The beamline has enjoyed a full user program for the last seven years, and scientific highlights from the user program are also presented.

  2. Synthesis of branched polymers under continuous-flow microprocess: an improvement of the control of macromolecular architectures.

    Science.gov (United States)

    Bally, Florence; Serra, Christophe A; Brochon, Cyril; Hadziioannou, Georges

    2011-11-15

    Polymerization reactions can benefit from continuous-flow microprocesses in terms of kinetics control, reactant mixing, or simply efficiency when high-throughput screening experiments are carried out. In this work, we perform for the first time the synthesis of a branched macromolecular architecture through a controlled/'living' polymerization technique in a tubular microreactor. Simply by tuning process parameters, such as the flow rates of the reactants, we manage to generate a library of polymers with various macromolecular characteristics. Compared to a conventional batch process, polymerization kinetics shows a faster initiation step and, more interestingly, an improved branching efficiency. Owing to the reduced diffusion pathway characteristic of microsystems, it is thus possible to obtain branched polymers exhibiting a denser architecture, and potentially a higher functionality for later applications. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Structural analysis of nanoparticulate carriers for encapsulation of macromolecular drugs

    Czech Academy of Sciences Publication Activity Database

    Angelov, Borislav; Garamus, V.M.; Drechsler, M.; Angelova, A.

    2017-01-01

    Roč. 235, Jun (2017), s. 83-89 ISSN 0167-7322 R&D Projects: GA MŠk EF15_003/0000447; GA MŠk EF15_008/0000162 Grant - others:OP VVV - ELIBIO(XE) CZ.02.1.01/0.0/0.0/15_003/0000447; ELI Beamlines(XE) CZ.02.1.01/0.0/0.0/15_008/0000162 Institutional support: RVO:68378271 Keywords : self-assembled nanocarriers * liquid crystalline phase transitions * cationic lipids * macromolecular drugs Subject RIV: BO - Biophysics OBOR OECD: Biophysics Impact factor: 3.648, year: 2016

  4. Protein crystal growth studies at the Center for Macromolecular Crystallography

    International Nuclear Information System (INIS)

    DeLucas, Lawrence J.; Long, Marianna M.; Moore, Karen M.; Harrington, Michael; McDonald, William T.; Smith, Craig D.; Bray, Terry; Lewis, Johanna; Crysel, William B.; Weise, Lance D.

    2000-01-01

    The Center for Macromolecular Crystallography (CMC) has been involved in fundamental studies of protein crystal growth (PCG) in microgravity and in our earth-based laboratories. A large group of co-investigators from academia and industry participated in these experiments by providing protein samples and by performing the x-ray crystallographic analysis. These studies have clearly demonstrated the usefulness of a microgravity environment for enhancing the quality and size of protein crystals. A review of the vapor diffusion apparatus (VDA) PCG results from nineteen space shuttle missions is given in this paper.

  5. Extraction of cobalt ion from textile using a complexing macromolecular surfactant in supercritical carbon dioxide

    International Nuclear Information System (INIS)

    Chirat, Mathieu; Ribaut, Tiphaine; Clerc, Sebastien; Lacroix-Desmazes, Patrick; Charton, Frederic; Fournel, Bruno

    2013-01-01

    Cobalt ion in the form of cobalt nitrate is removed from a textile lab coat using supercritical carbon dioxide extraction. The process involves a macromolecular additive of well-defined architecture, acting both as a surfactant and as a complexing agent. The extraction efficiency of cobalt reaches 66% when using a poly(1,1,2,2-tetrahydroperfluoro-decyl-acrylate-co-vinyl-benzylphosphonic diacid) gradient copolymer in the presence of water at 160 bar and 40 °C. The synergy of the two additives, namely the copolymer and water, which are useless if used separately, is pointed out. The potential of the supercritical carbon dioxide process using a complexing macromolecular surfactant lies in the ability to modulate the complexing unit as a function of the metal, as well as the architecture of the surface-active agent, for applications ranging from nuclear decontamination to the recovery of strategic metals. (authors)

  6. E-MSD: the European Bioinformatics Institute Macromolecular Structure Database.

    Science.gov (United States)

    Boutselakis, H; Dimitropoulos, D; Fillon, J; Golovin, A; Henrick, K; Hussain, A; Ionides, J; John, M; Keller, P A; Krissinel, E; McNeil, P; Naim, A; Newman, R; Oldfield, T; Pineda, J; Rachedi, A; Copeland, J; Sitnov, A; Sobhany, S; Suarez-Uruena, A; Swaminathan, J; Tagari, M; Tate, J; Tromm, S; Velankar, S; Vranken, W

    2003-01-01

    The E-MSD macromolecular structure relational database (http://www.ebi.ac.uk/msd) is designed to be a single access point for protein and nucleic acid structures and related information. The database is derived from Protein Data Bank (PDB) entries. Relational database technologies are used in a comprehensive cleaning procedure to ensure data uniformity across the whole archive. The search database contains an extensive set of derived properties, goodness-of-fit indicators, and links to other EBI databases including InterPro, GO, and SWISS-PROT, together with links to SCOP, CATH, PFAM and PROSITE. A generic search interface is available, coupled with a fast secondary structure domain search tool.

  7. Macromolecular Engineering: New Routes Towards the Synthesis of Well-Defined Polyethers/Polyesters Co/Terpolymers with Different Architectures

    KAUST Repository

    Alamri, Haleema

    2016-01-01

    Macromolecular engineering (as discussed in the first chapter) of homo/copolymers refers to the specific tailoring of these materials for achieving an easy and reproducible synthesis that results in precise molecular

  8. Estimation of insurance premiums for coverage against natural disaster risk: an application of Bayesian Inference

    NARCIS (Netherlands)

    Paudel, Y.; Botzen, W.J.W.; Aerts, J.C.J.H.

    2013-01-01

    This study applies Bayesian Inference to estimate flood risk for 53 dyke ring areas in the Netherlands, and focuses particularly on the data scarcity and extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on
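The combination of parameter uncertainty and Monte Carlo damage simulation described above can be sketched in a few lines. This is a hedged illustration, not the authors' model: the function name, the 25% premium loading, the toy posterior draws, and the lognormal damage distribution are all invented assumptions.

```python
import random

def premium_monte_carlo(prob_samples, damage_sampler, n=100_000, loading=0.25, seed=1):
    """Loaded annual premium = (1 + loading) * Monte Carlo estimate of expected
    annual flood damage, integrating over posterior draws of flood probability.

    prob_samples   -- posterior draws of the annual flood probability
    damage_sampler -- draws a damage amount given that a flood occurs
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        p = rng.choice(prob_samples)   # integrate over parameter uncertainty
        if rng.random() < p:           # does a flood occur in this simulated year?
            total += damage_sampler(rng)
    return (1.0 + loading) * total / n

# Toy posterior: flood probabilities near a 1/1250-per-year dyke-ring safety norm
posterior = [1 / 1000, 1 / 1250, 1 / 1500]
# Heavy-tailed damage draw (in millions of euros, hypothetical)
damage = lambda rng: rng.lognormvariate(3.0, 1.0)
premium = premium_monte_carlo(posterior, damage)
```

Because catastrophe losses are rare and heavy-tailed, the estimate converges slowly; in practice variance-reduction or importance sampling over the tail would be needed.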

  9. Time-efficient, high-resolution, whole brain three-dimensional macromolecular proton fraction mapping.

    Science.gov (United States)

    Yarnykh, Vasily L

    2016-05-01

    Macromolecular proton fraction (MPF) mapping is a quantitative MRI method that reconstructs parametric maps of the relative amount of macromolecular protons causing the magnetization transfer (MT) effect, and provides a biomarker of myelination in neural tissues. This study aimed to develop a high-resolution whole-brain MPF mapping technique using a minimal number of source images to reduce scan time. The described technique is based on replacing an actually acquired reference image without MT saturation with a synthetic one reconstructed from R1 and proton density maps, thus requiring only three source images. This approach enabled whole-brain three-dimensional MPF mapping with an isotropic 1.25 × 1.25 × 1.25 mm³ voxel size and a scan time of 20 min. The synthetic reference method was validated against standard MPF mapping with acquired reference images based on data from eight healthy subjects. Mean MPF values in segmented white and gray matter were in close agreement, with no significant bias and small within-subject coefficients of variation. MPF maps demonstrated sharp white-gray matter contrast and clear visualization of anatomical details, including gray matter structures with high iron content. The proposed synthetic reference method improves the resolution of MPF mapping and combines accurate MPF measurements with unique neuroanatomical contrast features. © 2015 Wiley Periodicals, Inc.
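The synthetic-reference idea can be sketched with the standard spoiled gradient-echo (Ernst) signal equation: given fitted R1 and proton density maps, a no-MT reference image is computed voxel-wise rather than acquired. This is a minimal sketch under assumed conditions; the sequence parameters and the white-matter R1 value below are illustrative, not taken from the study.

```python
import math

def spgr_signal(pd, r1, tr_s, flip_deg):
    """Spoiled gradient-echo signal without MT saturation (Ernst equation):
    S = PD * sin(a) * (1 - E1) / (1 - cos(a) * E1), with E1 = exp(-TR * R1)."""
    a = math.radians(flip_deg)
    e1 = math.exp(-tr_s * r1)
    return pd * math.sin(a) * (1.0 - e1) / (1.0 - math.cos(a) * e1)

# One voxel of a synthetic reference image from fitted maps (hypothetical values:
# normalized PD, white-matter-like R1 of 1.0 1/s, TR 21 ms, 10 degree flip)
ref = spgr_signal(pd=1.0, r1=1.0, tr_s=0.021, flip_deg=10.0)
```

Applying this per voxel to the R1 and PD maps yields the reference volume that would otherwise require a fourth acquisition.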

  10. Generalized Born Models of Macromolecular Solvation Effects

    Science.gov (United States)

    Bashford, Donald; Case, David A.

    2000-10-01

    It would often be useful in computer simulations to use a simple description of solvation effects, instead of explicitly representing the individual solvent molecules. Continuum dielectric models often work well in describing the thermodynamic aspects of aqueous solvation, and approximations to such models that avoid the need to solve the Poisson equation are attractive because of their computational efficiency. Here we give an overview of one such approximation, the generalized Born model, which is simple and fast enough to be used for molecular dynamics simulations of proteins and nucleic acids. We discuss its strengths and weaknesses, both for its fidelity to the underlying continuum model and for its ability to replace explicit consideration of solvent molecules in macromolecular simulations. We focus particularly on versions of the generalized Born model that have a pair-wise analytical form, and therefore fit most naturally into conventional molecular mechanics calculations.
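The pair-wise analytical form mentioned above is typified by Still's interpolation formula for the generalized Born energy. A minimal sketch follows; the charges, effective Born radii, and dielectric constants are arbitrary illustrative values, and units are left abstract.

```python
import math

def gb_term(qi, qj, rij, ai, aj, eps_in=1.0, eps_out=78.5):
    """One pair term of the generalized Born solvation energy (Still's form):
    -0.5 * (1/eps_in - 1/eps_out) * qi * qj / f_GB, where
    f_GB = sqrt(r^2 + ai*aj*exp(-r^2 / (4*ai*aj))) interpolates between the
    Coulomb limit (r much larger than the radii) and the Born limit (r -> 0)."""
    f_gb = math.sqrt(rij ** 2 + ai * aj * math.exp(-rij ** 2 / (4.0 * ai * aj)))
    return -0.5 * (1.0 / eps_in - 1.0 / eps_out) * qi * qj / f_gb

# Self term (i == j, r = 0) reduces to the classical Born formula -tau/2 * q^2 / a
self_energy = gb_term(1.0, 1.0, 0.0, 2.0, 2.0)
```

The smooth, analytical form of f_GB is what makes the model cheap enough for molecular dynamics: every pair term and its derivatives are closed-form, with no Poisson equation to solve.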

  11. Probing the hydration water diffusion of macromolecular surfaces and interfaces

    International Nuclear Information System (INIS)

    Ortony, Julia H; Cheng, Chi-Yuan; Franck, John M; Pavlova, Anna; Hunt, Jasmine; Han, Songi; Kausik, Ravinath

    2011-01-01

    We probe the translational dynamics of the hydration water surrounding the macromolecular surfaces of selected polyelectrolytes, lipid vesicles and intrinsically disordered proteins with site specificity in aqueous solutions. These measurements are made possible by the recent development of a new instrumental and methodological approach based on Overhauser dynamic nuclear polarization (DNP)-enhanced nuclear magnetic resonance (NMR) spectroscopy. This technique selectively amplifies ¹H NMR signals of hydration water around a spin label that is attached to a molecular site of interest. The selective ¹H NMR amplification within molecular length scales of a spin label is achieved by utilizing short-distance-range (∼r⁻³) magnetic dipolar interactions between the ¹H spin of water and the electron spin of a nitroxide radical-based label. Key features include the fact that only minute quantities (<10 μl) and dilute (≥100 μM) sample concentrations are needed. There is no size limit on the macromolecule or molecular assembly to be analyzed. Hydration water with translational correlation times between 10 and 800 ps is measured within ∼10 Å distance of the spin label, encompassing the typical thickness of a hydration layer with three water molecules across. The hydration water moving within this time scale has significant implications, as this is what is modulated whenever macromolecules or molecular assemblies undergo interactions, binding or conformational changes. We demonstrate, with the examples of polymer complexation, protein aggregation and lipid-polymer interaction, that the measurements of interfacial hydration dynamics can sensitively and site specifically probe macromolecular interactions.

  12. Measurement and Interpretation of Diffuse Scattering in X-Ray Diffraction for Macromolecular Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Michael E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-16

    X-ray diffraction from macromolecular crystals includes both sharply peaked Bragg reflections and diffuse intensity between the peaks. The information in Bragg scattering reflects the mean electron density in the unit cells of the crystal. The diffuse scattering arises from correlations in the variations of electron density that may occur from one unit cell to another, and therefore contains information about collective motions in proteins.

  13. Proceedings of a one-week course on exploiting anomalous scattering in macromolecular structure determination (EMBO'07)

    International Nuclear Information System (INIS)

    Weiss, M.S.; Shepard, W.; Dauter, Z.; Leslie, A.; Diederichs, K.; Evans, G.; Svensson, O.; Schneider, T.; Bricogne, G.; Dauter, Z.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Leslie, A.; Kabsch, W.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Read, R.; Panjikar, S.; Pannu, N.S.; Dauter, Z.; Weiss, M.S.; McSweeney, S.

    2007-01-01

    This course, which was directed to young scientists, illustrated both theoretical and practical aspects of macromolecular crystal structure solution using synchrotron radiation. Some software dedicated to data collection, processing and analysis were presented. This document gathers only the slides of the presentations

  14. Proceedings of a one-week course on exploiting anomalous scattering in macromolecular structure determination (EMBO'07)

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, M S; Shepard, W; Dauter, Z; Leslie, A; Diederichs, K; Evans, G; Svensson, O; Schneider, T; Bricogne, G; Dauter, Z; Flensburg, C; Terwilliger, T; Lamzin, V; Leslie, A; Kabsch, W; Flensburg, C; Terwilliger, T; Lamzin, V; Read, R; Panjikar, S; Pannu, N S; Dauter, Z; Weiss, M S; McSweeney, S

    2007-07-01

    This course, which was directed to young scientists, illustrated both theoretical and practical aspects of macromolecular crystal structure solution using synchrotron radiation. Some software dedicated to data collection, processing and analysis were presented. This document gathers only the slides of the presentations.

  15. Polycapillary x-ray optics for macromolecular crystallography

    International Nuclear Information System (INIS)

    Owens, S.M.; Gibson, W.M.; Carter, D.C.; Sisk, R.C.; Ho, J.X.

    1996-01-01

    Polycapillary x-ray optics have found potential application in many different fields, including antiscatter and magnification in mammography, radiography, x-ray fluorescence, x-ray lithography, and x-ray diffraction techniques. In x-ray diffraction, an optic is used to collect divergent x-rays from a point source and redirect them into a quasi-parallel, or slightly focused, beam. Monolithic polycapillary optics have been developed recently for macromolecular crystallography and have already shown considerable gains in diffracted beam intensity over pinhole collimation. Development is being pursued through a series of simulations and prototype optics. Many improvements have been made over the stage 1 prototype reported previously, including better control over the manufacturing process, a reduced output-beam diameter, and the addition of slight focusing at the output of the optic to further increase x-ray flux at the sample. The authors report the characteristics and performance of the stage 1 and stage 2 optics.

  16. The Postgraduate Study of Macromolecular Sciences at the University of Zagreb (1971-1980)

    OpenAIRE

    Kunst, B.; Dezelic, D.; Veksli, Z.

    2008-01-01

    The postgraduate study of macromolecular sciences (PSMS) was established at the University of Zagreb in 1971 as a university study, at a time of pronounced interdisciplinary permeation of the natural sciences (physics, chemistry and biology) and the application of their achievements in technological disciplines. PSMS was established by a group of prominent university professors from the schools of Science, Chemical Technology, Pharmacy and Medicine, as well as from the Institute of Biology. The study...

  17. Distribution and enzymatic activity of heterotrophic bacteria decomposing selected macromolecular compounds in a Baltic Sea sandy beach

    Science.gov (United States)

    Podgórska, B.; Mudryk, Z. J.

    2003-03-01

    The potential capability to decompose macromolecular compounds and the levels of extracellular enzyme activity were determined in heterotrophic bacteria isolated from a sandy beach in Sopot on the southern Baltic Sea coast. Individual isolates were capable of hydrolysing a wide spectrum of macromolecular organic compounds. Lipids, gelatine, and DNA were hydrolysed most efficiently. Only a very small percentage of strains were able to decompose cellulose, and no pectinolytic bacteria were found. Except for starch hydrolysis, no significant differences in the intensity of organic compound decomposition were recorded between the horizontal and vertical profiles of the studied beach. Of all the studied extracellular enzymes, alkaline phosphatase, esterase lipase, and leucine arylamidase were the most active; in contrast, the activity of α-fucosidase, α-galactosidase and β-glucuronidase was the weakest. The level of extracellular enzyme activity was similar in both sand layers.

  18. Macromolecular contrast agents for MR mammography: current status

    International Nuclear Information System (INIS)

    Daldrup-Link, Heike E.; Brasch, Robert C.

    2003-01-01

    Macromolecular contrast media (MMCM) encompass a new class of diagnostic drugs that can be applied with dynamic MRI to extract both physiologic and morphologic information in breast lesions. Kinetic analysis of dynamic MMCM-enhanced MR data in breast tumor patients provides useful estimates of tumor blood volume and microvascular permeability, typically increased in cancer. These tumor characteristics can be applied to differentiate benign from malignant lesions, to define the angiogenesis status of cancers, and to monitor tumor response to therapy. The most immediate challenge to the development of MMCM-enhanced mammography is the identification of those candidate compounds that demonstrate the requisite long intravascular distribution and have the high tolerance necessary for clinical use. Potential mammographic applications and limitations of various MMCM, defined by either experimental animal testing or clinical testing in patients, are reviewed in this article. (orig.)

  19. Irreversible entropy model for damage diagnosis in resistors

    Energy Technology Data Exchange (ETDEWEB)

    Cuadras, Angel, E-mail: angel.cuadras@upc.edu; Crisóstomo, Javier; Ovejas, Victoria J.; Quilez, Marcos [Instrumentation, Sensor and Interfaces Group, Electronic Engineering Department, Escola d' Enginyeria de Telecomunicació i Aeronàutica de Castelldefels EETAC, Universitat Politècnica de Catalunya, Barcelona Tech (UPC), Castelldefels-Barcelona (Spain)

    2015-10-28

    We propose a method to characterize electrical resistor damage based on entropy measurements. Irreversible entropy and the rate at which it is generated are more convenient parameters than resistance for describing damage because they are essentially positive by virtue of the second law of thermodynamics, whereas resistance may increase or decrease depending on the degradation mechanism. Commercial resistors were tested in order to characterize the damage induced by power surges. Resistors were biased with constant and pulsed voltage signals, leading to power dissipation in the range of 4–8 W, which is well above the 0.25 W nominal power, to initiate failure. Entropy was inferred from the added power and temperature evolution. A model is proposed to understand the relationship among resistance, entropy, and damage. The power surge dissipates into heat (Joule effect) and damages the resistor. The results show a correlation between entropy generation rate and resistor failure. We conclude that damage can be conveniently assessed from irreversible entropy generation. Our results for resistors can be easily extrapolated to other systems or machines that can be modeled based on their resistance.
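The entropy bookkeeping described above can be sketched as the discretized integral S = Σ (P/T)·Δt over sampled dissipated power and absolute temperature. This is a hedged illustration of the inference step, assuming uniformly sampled measurements; the function name and the data below are invented.

```python
def entropy_generated(power_w, temp_k, dt_s):
    """Irreversible entropy S = sum(P/T * dt) accumulated from sampled
    dissipated power (W) and absolute temperature (K); result in J/K.
    Non-negative whenever the dissipated power is non-negative."""
    if len(power_w) != len(temp_k):
        raise ValueError("power and temperature series must be the same length")
    return sum(p / t * dt_s for p, t in zip(power_w, temp_k))

# A resistor dissipating a constant 6 W while heating from 300 K to 390 K
# over 10 one-second samples (hypothetical surge data)
power = [6.0] * 10
temps = [300.0 + 10.0 * k for k in range(10)]
s_irr = entropy_generated(power, temps, dt_s=1.0)
```

Tracking the rate S/Δt over successive surges is then the damage indicator: unlike resistance, it can only grow.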

  20. Irreversible entropy model for damage diagnosis in resistors

    International Nuclear Information System (INIS)

    Cuadras, Angel; Crisóstomo, Javier; Ovejas, Victoria J.; Quilez, Marcos

    2015-01-01

    We propose a method to characterize electrical resistor damage based on entropy measurements. Irreversible entropy and the rate at which it is generated are more convenient parameters than resistance for describing damage because they are essentially positive by virtue of the second law of thermodynamics, whereas resistance may increase or decrease depending on the degradation mechanism. Commercial resistors were tested in order to characterize the damage induced by power surges. Resistors were biased with constant and pulsed voltage signals, leading to power dissipation in the range of 4–8 W, which is well above the 0.25 W nominal power, to initiate failure. Entropy was inferred from the added power and temperature evolution. A model is proposed to understand the relationship among resistance, entropy, and damage. The power surge dissipates into heat (Joule effect) and damages the resistor. The results show a correlation between entropy generation rate and resistor failure. We conclude that damage can be conveniently assessed from irreversible entropy generation. Our results for resistors can be easily extrapolated to other systems or machines that can be modeled based on their resistance.

  1. Entropic Inference

    Science.gov (United States)

    Caticha, Ariel

    2011-03-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme.
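For a discrete prior and a single expectation constraint, the ME update has a simple closed form: the posterior is an exponential tilting of the prior, with the Lagrange multiplier fixed by the constraint. A small self-contained sketch, using the classic fair-die example (which is a standard MaxEnt illustration, not taken from this tutorial):

```python
import math

def me_update(prior, f, target, lo=-50.0, hi=50.0, iters=200):
    """Minimise relative entropy to `prior` subject to E_p[f] = target.
    The solution is p_i proportional to prior_i * exp(lam * f_i); the
    multiplier lam is found by bisection, since E[f] increases with lam."""
    def moment(lam):
        w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
        z = sum(w)
        return sum(wi * fi for wi, fi in zip(w, f)) / z
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if moment(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
    z = sum(w)
    return [wi / z for wi in w]

# Uniform die prior updated so that the mean face value is 4.5
p = me_update([1.0 / 6] * 6, [1, 2, 3, 4, 5, 6], 4.5)
```

With a uniform prior this reduces to ordinary MaxEnt; a non-uniform prior gives the general ME update the tutorial describes.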

  2. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Shane Ó Conchúir

    Full Text Available The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.

  3. Macromolecular crowding compacts unfolded apoflavodoxin and causes severe aggregation of the off-pathway intermediate during apoflavodoxin folding

    NARCIS (Netherlands)

    Engel, R.; Westphal, A.H.; Huberts, D.; Nabuurs, S.M.; Lindhoud, S.; Visser, A.J.W.G.; Mierlo, van C.P.M.

    2008-01-01

    To understand how proteins fold in vivo, it is important to investigate the effects of macromolecular crowding on protein folding. Here, the influence of crowding on in vitro apoflavodoxin folding, which involves a relatively stable off-pathway intermediate with molten globule characteristics, is

  4. Proceedings of a one-week course on exploiting anomalous scattering in macromolecular structure determination (EMBO'07)

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, M.S.; Shepard, W.; Dauter, Z.; Leslie, A.; Diederichs, K.; Evans, G.; Svensson, O.; Schneider, T.; Bricogne, G.; Dauter, Z.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Leslie, A.; Kabsch, W.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Read, R.; Panjikar, S.; Pannu, N.S.; Dauter, Z.; Weiss, M.S.; McSweeney, S

    2007-07-01

    This course, which was directed to young scientists, illustrated both theoretical and practical aspects of macromolecular crystal structure solution using synchrotron radiation. Some software dedicated to data collection, processing and analysis were presented. This document gathers only the slides of the presentations.

  5. Probing the Interplay of Size, Shape, and Solution Environment in Macromolecular Diffusion Using a Simple Refraction Experiment

    Science.gov (United States)

    Mankidy, Bijith D.; Coutinho, Cecil A.; Gupta, Vinay K.

    2010-01-01

    The diffusion coefficient of polymers is a critical parameter in biomedicine, catalysis, chemical separations, nanotechnology, and other industrial applications. Here, the measurement of macromolecular diffusion in solutions is described using a visually instructive, undergraduate-level optical refraction experiment based on Wiener's method. To…

  6. C,N-2-[(Dimethylamino)methyl]phenylplatinum Complexes Functionalized with C60 as Macromolecular Building Blocks

    NARCIS (Netherlands)

    Koten, G. van; Meijer, M.D.; Wolf, E. de; Lutz, M.H.; Spek, A.L.; Klink, G.P.M. van

    2001-01-01

    The application of platinum(II) complexes based on the N,N-dimethylbenzylamine ligand (abbreviated as H-C,N) in macromolecular synthesis was demonstrated. Two cationic C,N-platinum moieties were linked with a 4,4'-bipyridine bridge, giving [{C6H4(CH2NMe2)-2-Pt(PPh3)}2(4,4'-bpy)](BF4)2 (2), the

  7. Grain sorghum dust increases macromolecular efflux from the in situ nasal mucosa.

    Science.gov (United States)

    Gao, X P

    1998-04-01

    The purpose of this study was to determine whether an aqueous extract of grain sorghum dust increases macromolecular efflux from the nasal mucosa in vivo and, if so, whether this response is mediated, in part, by substance P. Suffusion of grain sorghum dust extract on the in situ nasal mucosa of anesthetized hamsters elicits a significant increase in clearance of fluorescein isothiocyanate-labeled dextran (FITC-dextran; mol mass, 70 kDa; P grain sorghum dust elicits neurogenic plasma exudation from the in situ nasal mucosa.

  8. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    Energy Technology Data Exchange (ETDEWEB)

    A Soares; D Schneider; J Skinner; M Cowan; R Buono; H Robinson; A Heroux; M Carlucci-Dayton; A Saxena; R Sweet

    2011-12-31

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from a heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  9. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    International Nuclear Information System (INIS)

    Soares, A.; Schneider, D.; Skinner, J.; Cowan, M.; Buono, R.; Robinson, H.; Heroux, A.; Carlucci-Dayton, M.; Saxena, A.; Sweet, R.

    2008-01-01

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from a heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  10. Exercise-Induced Muscle Damage and Hypertrophy: A Closer Look Reveals the Jury is Still Out

    OpenAIRE

    Schoenfeld, Brad; Contreras, Bret

    2018-01-01

    This letter is a response to the paper by Damas et al (2017) titled, “The development of skeletal muscle hypertrophy through resistance training: the role of muscle damage and muscle protein synthesis,” which, in part, endeavored to review the role of exercise-induced muscle damage on muscle hypertrophy. We feel there are a number of issues in interpretation of research and extrapolation that preclude drawing the inference expressed in the paper that muscle damage neither explains nor potenti...

  11. More than one kind of inference: re-examining what's learned in feature inference and classification.

    Science.gov (United States)

    Sweller, Naomi; Hayes, Brett K

    2010-08-01

    Three studies examined how task demands that impact on attention to typical or atypical category features shape the category representations formed through classification learning and inference learning. During training categories were learned via exemplar classification or by inferring missing exemplar features. In the latter condition inferences were made about missing typical features alone (typical feature inference) or about both missing typical and atypical features (mixed feature inference). Classification and mixed feature inference led to the incorporation of typical and atypical features into category representations, with both kinds of features influencing inferences about familiar (Experiments 1 and 2) and novel (Experiment 3) test items. Those in the typical inference condition focused primarily on typical features. Together with formal modelling, these results challenge previous accounts that have characterized inference learning as producing a focus on typical category features. The results show that two different kinds of inference learning are possible and that these are subserved by different kinds of category representations.

  12. Inference of segmented color and texture description by tensor voting.

    Science.gov (United States)

    Jia, Jiaya; Tang, Chi-Keung

    2004-06-01

    A robust synthesis method is proposed to automatically infer missing color and texture information from a damaged 2D image by (N)D tensor voting (N > 3). The same approach is generalized to range and 3D data in the presence of occlusion, missing data and noise. Our method translates texture information into an adaptive (N)D tensor, followed by a voting process that infers noniteratively the optimal color values in the (N)D texture space. A two-step method is proposed. First, we perform segmentation based on insufficient geometry, color, and texture information in the input, and extrapolate partitioning boundaries by either 2D or 3D tensor voting to generate a complete segmentation for the input. Missing colors are synthesized using (N)D tensor voting in each segment. Different feature scales in the input are automatically adapted by our tensor scale analysis. Results on a variety of difficult inputs demonstrate the effectiveness of our tensor voting approach.

  13. Perceptual inference.

    Science.gov (United States)

    Aggelopoulos, Nikolaos C

    2015-08-01

    Perceptual inference refers to the ability to infer sensory stimuli from predictions that result from internal neural representations built through prior experience. Methods of Bayesian statistical inference and decision theory model cognition adequately by using error sensing either in guiding action or in "generative" models that predict the sensory information. In this framework, perception can be seen as a process qualitatively distinct from sensation, a process of information evaluation using previously acquired and stored representations (memories) that is guided by sensory feedback. The stored representations can be utilised as internal models of sensory stimuli enabling long term associations, for example in operant conditioning. Evidence for perceptual inference is contributed by such phenomena as the cortical co-localisation of object perception with object memory, the response invariance in the responses of some neurons to variations in the stimulus, as well as from situations in which perception can be dissociated from sensation. In the context of perceptual inference, sensory areas of the cerebral cortex that have been facilitated by a priming signal may be regarded as comparators in a closed feedback loop, similar to the better known motor reflexes in the sensorimotor system. The adult cerebral cortex can be regarded as similar to a servomechanism, in using sensory feedback to correct internal models, producing predictions of the outside world on the basis of past experience. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. SEMANTIC PATCH INFERENCE

    DEFF Research Database (Denmark)

    Andersen, Jesper

    2009-01-01

    Collateral evolution is the problem of updating several library-using programs in response to API changes in the used library. In this dissertation we address the issue of understanding collateral evolutions by automatically inferring a high-level specification of the changes evident in a given set ...... specifications inferred by spdiff in Linux are shown. We find that the inferred specifications concisely capture the actual collateral evolution performed in the examples....

  15. Efficient analysis of macromolecular rotational diffusion from heteronuclear relaxation data

    International Nuclear Information System (INIS)

    Dosset, Patrice; Hus, Jean-Christophe; Blackledge, Martin; Marion, Dominique

    2000-01-01

    A novel program has been developed for the interpretation of 15N relaxation rates in terms of macromolecular anisotropic rotational diffusion. The program is based on a highly efficient simulated annealing/minimization algorithm, designed specifically to search the parametric space described by the isotropic, axially symmetric and fully anisotropic rotational diffusion tensor models. The high efficiency of this algorithm allows extensive noise-based Monte Carlo error analysis. Relevant statistical tests are systematically applied to provide confidence limits for the proposed tensorial models. The program is illustrated here using the example of the cytochrome c' from Rhodobacter capsulatus, a four-helix bundle heme protein, for which data at three different field strengths were independently analysed and compared.
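As a generic illustration of the simulated-annealing minimization strategy described above, the sketch below minimizes a toy one-parameter χ² with two basins; the target function, cooling schedule and parameters are stand-ins invented for illustration, not the program's actual diffusion-tensor models:

```python
import math
import random

def anneal(chi2, x0, step=0.5, t0=1.0, cooling=0.995, n_iter=5000, seed=1):
    """Generic simulated annealing minimizer for a one-parameter chi^2."""
    rng = random.Random(seed)
    x, best = x0, x0
    t = t0
    for _ in range(n_iter):
        cand = x + rng.uniform(-step, step)
        delta = chi2(cand) - chi2(x)
        # Accept downhill moves always, uphill moves with p = exp(-delta/T).
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if chi2(x) < chi2(best):
            best = x
        t *= cooling  # geometric cooling schedule
    return best

# Toy chi^2 with two basins; the walk may hop between them before cooling.
chi2 = lambda x: x**4 - 4 * x**2 + x
best = anneal(chi2, x0=1.4)
```

The same accept/reject loop generalizes to the multi-parameter tensor case by proposing moves in each tensor parameter in turn.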

  16. 129Xe NMR Relaxation-Based Macromolecular Sensing

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Muller D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Dao, Phuong [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Jeong, Keunhong [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Slack, Clancy C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Vassiliou, Christophoros C. [Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Finbloom, Joel A. [Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Francis, Matthew B. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Wemmer, David E. [Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Physical Biosciences Division; Pines, Alexander [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry

    2016-07-29

    A 129Xe NMR relaxation-based sensing approach is reported that exploits changes in the bulk xenon relaxation rate induced by slowed tumbling of a cryptophane-based sensor upon target binding. The amplification afforded by detection of the bulk dissolved xenon allows sensitive detection of targets. The sensor comprises a xenon-binding cryptophane cage, a target interaction element, and a metal chelating agent. Xenon associated with the target-bound cryptophane cage is rapidly relaxed and then detected after exchange with the bulk. Here we show that large macromolecular targets increase the rotational correlation time of xenon, increasing its relaxation rate. Upon binding of a biotin-containing sensor to avidin at 1.5 μM concentration, the free xenon T2 is reduced by a factor of 4.

  17. Multimodel inference and adaptive management

    Science.gov (United States)

    Rehme, S.E.; Powell, L.A.; Allen, Craig R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
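The likelihood-weighting step at the heart of multimodel inference is often implemented with Akaike weights, w_i = exp(-Δ_i/2) / Σ_j exp(-Δ_j/2) with Δ_i = AIC_i - AIC_min. A minimal sketch (the AIC values below are hypothetical):

```python
import math

def akaike_weights(aics):
    """Convert a list of AIC scores into normalized Akaike weights."""
    best = min(aics)
    deltas = [a - best for a in aics]           # delta_i = AIC_i - AIC_min
    rel = [math.exp(-d / 2.0) for d in deltas]  # relative likelihoods
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three competing hypotheses.
weights = akaike_weights([100.0, 101.2, 106.3])
# A weight near 1 for one model indicates strong inference; similar
# weights across models indicate weak (equivocal) inference.
```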

  18. DNA damage caused by UV- and near UV-irradiation

    International Nuclear Information System (INIS)

    Ohnishi, Takeo

    1986-01-01

    Much work with mutants deficient in DNA repair has been performed on UV-induced DNA damage under conditions without artificial stimulation. In an attempt to infer the effects of solar wavelengths, the outcome of this work is discussed in terms of cellular radiation sensitivity, unscheduled DNA synthesis, and mutation induction, leading to the conclusion that some DNA damage occurs even on irradiation with the shorter-wavelength light (270 - 315 nm) and is repaired by excision repair. It has been thought to date that the pyrimidine dimer (PD) plays the most important role in UV-induced DNA damage, followed by (6-4) photoproducts. As for DNA damage induced by near-UV irradiation, the yield of DNA single-strand breaks and of DNA-protein crosslinking, other than PD, is considered. DNA-protein crosslinking has proved to be induced by irradiation at any UV wavelength from 260 to 425 nm. Near-UV irradiation causes inhibition of cell proliferation. (Namekawa, K.)

  19. Localization of protein aggregation in Escherichia coli is governed by diffusion and nucleoid macromolecular crowding effect.

    Directory of Open Access Journals (Sweden)

    Anne-Sophie Coquel

    2013-04-01

    Full Text Available Aggregates of misfolded proteins are a hallmark of many age-related diseases. Recently, they have been linked to aging of Escherichia coli (E. coli where protein aggregates accumulate at the old pole region of the aging bacterium. Because of the potential of E. coli as a model organism, elucidating aging and protein aggregation in this bacterium may pave the way to significant advances in our global understanding of aging. A first obstacle along this path is to decipher the mechanisms by which protein aggregates are targeted to specific intracellular locations. Here, using an integrated approach based on individual-based modeling, time-lapse fluorescence microscopy and automated image analysis, we show that the movement of aging-related protein aggregates in E. coli is purely diffusive (Brownian. Using single-particle tracking of protein aggregates in live E. coli cells, we estimated the average size and diffusion constant of the aggregates. Our results provide evidence that the aggregates passively diffuse within the cell, with diffusion constants that depend on their size in agreement with the Stokes-Einstein law. However, the aggregate displacements along the cell long axis are confined to a region that roughly corresponds to the nucleoid-free space in the cell pole, thus confirming the importance of increased macromolecular crowding in the nucleoids. We thus used 3D individual-based modeling to show that these three ingredients (diffusion, aggregation and diffusion hindrance in the nucleoids are sufficient and necessary to reproduce the available experimental data on aggregate localization in the cells. Taken together, our results strongly support the hypothesis that the localization of aging-related protein aggregates in the poles of E. coli results from the coupling of passive diffusion-aggregation with spatially non-homogeneous macromolecular crowding. They further support the importance of "soft" intracellular structuring (based on
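The Stokes-Einstein relation invoked above, D = kB·T / (6π·η·r), can be evaluated directly; in this sketch the temperature, viscosity and aggregate radius are illustrative assumptions, not values from the study:

```python
import math

def stokes_einstein_D(T_kelvin, viscosity_pa_s, radius_m):
    """Diffusion constant D = kB*T / (6*pi*eta*r) for a sphere."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    return kB * T_kelvin / (6.0 * math.pi * viscosity_pa_s * radius_m)

# Illustrative numbers (all hypothetical): 310 K, a cytoplasm-like
# viscosity of 0.01 Pa*s, and an aggregate radius of 100 nm.
D = stokes_einstein_D(310.0, 1.0e-2, 100e-9)
# D is in m^2/s; larger aggregates diffuse more slowly (D ~ 1/r).
```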

  20. Radiation damage in room-temperature data acquisition with the PILATUS 6M pixel detector.

    Science.gov (United States)

    Rajendran, Chitra; Dworkowski, Florian S N; Wang, Meitian; Schulze-Briese, Clemens

    2011-05-01

    The first study of room-temperature macromolecular crystallography data acquisition with a silicon pixel detector is presented, where the data are collected in continuous sample rotation mode, with millisecond read-out time and no read-out noise. Several successive datasets were collected sequentially from single test crystals of thaumatin and insulin. The dose rate ranged between ∼ 1320 Gy s(-1) and ∼ 8420 Gy s(-1) with corresponding frame rates between 1.565 Hz and 12.5 Hz. The data were analysed for global radiation damage. A previously unreported negative dose-rate effect is observed in the indicators of global radiation damage, which showed an approximately 75% decrease in D(1/2) at sixfold higher dose rate. The integrated intensity decreases in an exponential manner. Sample heating that could give rise to the enhanced radiation sensitivity at higher dose rate is investigated by collecting data between crystal temperatures of 298 K and 353 K. UV-Vis spectroscopy is used to demonstrate that disulfide radicals and trapped electrons do not accumulate at high dose rates in continuous data collection.
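If the integrated intensity decays exponentially with dose, I(D) = I0·exp(-k·D), then the half-dose indicator is D1/2 = ln 2 / k, so a higher dose rate that shrinks D1/2 corresponds to a larger effective decay constant. A minimal sketch with made-up decay constants, not the measured ones:

```python
import math

def half_dose(k_per_MGy):
    """Dose at which integrated intensity falls to half: D_1/2 = ln2 / k."""
    return math.log(2.0) / k_per_MGy

# Hypothetical decay constants at a low and a sixfold-higher dose rate.
d_half_low = half_dose(0.05)   # MGy
d_half_high = half_dose(0.20)  # MGy; a negative dose-rate effect means
                               # faster decay (smaller D_1/2) at higher rate
```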

  1. Macromolecular synthesis in algal cells

    International Nuclear Information System (INIS)

    Ishida, M.R.; Kikuchi, Tadatoshi

    1980-01-01

    The present paper reviews our previous experimental results on macromolecular biosynthesis in the cells of the blue-green alga Anacystis nidulans, as a representative prokaryote, and in three species of eukaryotic algae, i.e. Euglena gracilis strain Z, Chlamydomonas reinhardi, and Cyanidium caldarium. In these algal cells, combined methods of pulse-labelling with 32P-, 3H- and 14C-labelled precursors of macromolecules, of chasing, and of inhibitors that specifically block the synthesis of macromolecules such as proteins, RNA and DNA in living cells were applied very effectively to analyse the regulatory mechanisms of macromolecule biosynthesis and the mode of macromolecule assembly into cell structures, especially organelle constituents. Based on the results obtained, the following conclusions are reached: (1) the metabolic pool for macromolecule synthesis in the cells of the prokaryotic blue-green alga is of limited extent, and synthetic activity is largely coupled to the photosynthetic mechanism; (2) 70 S ribosomes in the blue-green algal cells are assembled on the surface of thylakoid membranes widely distributed in the cytoplasm; and (3) the cells of the eukaryotic unicellular algae used here possess differentiated enzyme systems for transcription and translation, as in higher organisms, but the control mechanisms governing macromolecule synthesis differ among species. (author)

  2. Evaluation of macromolecular electron-density map quality using the correlation of local r.m.s. density

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    The correlation of local r.m.s. density is shown to be a good measure of the presence of distinct solvent and macromolecule regions in macromolecular electron-density maps. It has recently been shown that the standard deviation of local r.m.s. electron density is a good indicator of the presence of distinct regions of solvent and protein in macromolecular electron-density maps [Terwilliger & Berendzen (1999). Acta Cryst. D55, 501–505]. Here, it is demonstrated that a complementary measure, the correlation of local r.m.s. density in adjacent regions of the unit cell, is also a good measure of the presence of distinct solvent and protein regions. The correlation of local r.m.s. density is essentially a measure of how contiguous the solvent (and protein) regions are in the electron-density map. This statistic can be calculated in real space or in reciprocal space and has potential uses in the evaluation of heavy-atom solutions in the MIR and MAD methods as well as in the evaluation of trial phase sets in ab initio phasing procedures
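As a rough sketch of the statistic (not the authors' implementation), one can compute the r.m.s. density in small non-overlapping boxes of a map and correlate each box's value with that of its neighbour; the random test map below is purely illustrative:

```python
import numpy as np

def local_rms(density, box=4):
    """R.m.s. density in non-overlapping box x box x box regions."""
    nx, ny, nz = (s // box * box for s in density.shape)
    d = density[:nx, :ny, :nz]
    blocks = d.reshape(nx // box, box, ny // box, box, nz // box, box)
    return np.sqrt((blocks ** 2).mean(axis=(1, 3, 5)))

def adjacent_correlation(rms):
    """Correlate local r.m.s. values with their +x neighbours."""
    a, b = rms[:-1].ravel(), rms[1:].ravel()
    return np.corrcoef(a, b)[0, 1]

rng = np.random.default_rng(0)
flat_map = rng.normal(size=(32, 32, 32))  # featureless "map"
score = adjacent_correlation(local_rms(flat_map))
# A featureless map gives a score near zero; a map with contiguous
# solvent and protein regions would give a high score.
```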

  3. A brief history of macromolecular crystallography, illustrated by a family tree and its Nobel fruits.

    Science.gov (United States)

    Jaskolski, Mariusz; Dauter, Zbigniew; Wlodawer, Alexander

    2014-09-01

    As a contribution to the celebration of the year 2014, declared by the United Nations to be 'The International Year of Crystallography', the FEBS Journal is dedicating this issue to papers showcasing the intimate union between macromolecular crystallography and structural biology, both in historical perspective and in current research. Instead of a formal editorial piece, by way of introduction, this review discusses the most important, often iconic, achievements of crystallographers that led to major advances in our understanding of the structure and function of biological macromolecules. We identified at least 42 scientists who received Nobel Prizes in Physics, Chemistry or Medicine for their contributions that included the use of X-rays or neutrons and crystallography, including 24 who made seminal discoveries in macromolecular sciences. Our spotlight is mostly, but not only, on the recipients of this most prestigious scientific honor, presented in approximately chronological order. As a summary of the review, we attempt to construct a genealogy tree of the principal lineages of protein crystallography, leading from the founding members to the present generation. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  4. Evaluation of quantum-chemical methods of radiolysis stability for macromolecular structures

    International Nuclear Information System (INIS)

    Postolache, Cristian; Matei, Lidia

    2005-01-01

    The behavior of macromolecular structures in ionising fields was analyzed by quantum-chemical methods. In this study the primary radiolytic effect was analyzed using a two-step radiolytic mechanism: (a) ionisation of the molecule and spatial redistribution of atoms in order to reach the minimum energy value characteristic of the quantum state; (b) neutralisation of the molecule by electron capture and its rapid dissociation into free radicals. Chemical bonds suspected to break are located in the distribution region of the LUMO orbital and have minimal homolytic dissociation energies. Representative polymer structures (polyethylene, polypropylene, polystyrene, poly α and β polystyrene, polyisobutylene, polytetrafluoroethylene, polymethylsiloxanes) were analyzed. (authors)

  5. The Postgraduate Study of Macromolecular Sciences at the University of Zagreb (1971– 1980)

    OpenAIRE

    Deželić, D.; Kunst, B.; Veksli, Zorica

    2008-01-01

    The postgraduate study of macromolecular sciences (PSMS) was established at the University of Zagreb in 1971 as a university study in the time of expressed interdisciplinary permeation of natural sciences - physics, chemistry and biology, and application of their achievements in technological disciplines. PSMS was established by a group of prominent university professors from the schools of Science, Chemical Technology, Pharmacy and Medicine, as well as from the Institute of Biology. The s...

  6. Optimal inference with suboptimal models: Addiction and active Bayesian inference

    Science.gov (United States)

    Schwartenbeck, Philipp; FitzGerald, Thomas H.B.; Mathys, Christoph; Dolan, Ray; Wurst, Friedrich; Kronbichler, Martin; Friston, Karl

    2015-01-01

    When casting behaviour as active (Bayesian) inference, optimal inference is defined with respect to an agent’s beliefs – based on its generative model of the world. This contrasts with normative accounts of choice behaviour, in which optimal actions are considered in relation to the true structure of the environment – as opposed to the agent’s beliefs about worldly states (or the task). This distinction shifts an understanding of suboptimal or pathological behaviour away from aberrant inference as such, to understanding the prior beliefs of a subject that cause them to behave less ‘optimally’ than our prior beliefs suggest they should behave. Put simply, suboptimal or pathological behaviour does not speak against understanding behaviour in terms of (Bayes optimal) inference, but rather calls for a more refined understanding of the subject’s generative model upon which their (optimal) Bayesian inference is based. Here, we discuss this fundamental distinction and its implications for understanding optimality, bounded rationality and pathological (choice) behaviour. We illustrate our argument using addictive choice behaviour in a recently described ‘limited offer’ task. Our simulations of pathological choices and addictive behaviour also generate some clear hypotheses, which we hope to pursue in ongoing empirical work. PMID:25561321

  7. Macromolecular Crystallization in Microfluidics for the International Space Station

    Science.gov (United States)

    Monaco, Lisa A.; Spearing, Scott

    2003-01-01

    At NASA's Marshall Space Flight Center, the Iterative Biological Crystallization (IBC) project has begun development on scientific hardware for macromolecular crystallization on the International Space Station (ISS). Currently ISS crystallization research is limited to solution recipes that were prepared on the ground prior to launch. The proposed hardware will conduct solution mixing and dispensing on board the ISS, be fully automated, and have imaging functions via remote commanding from the ground. Utilizing microfluidic technology, IBC will allow for on orbit iterations. The microfluidics LabChip(R) devices that have been developed, along with Caliper Technologies, will greatly benefit researchers by allowing for precise fluid handling of nano/pico liter sized volumes. IBC will maximize the amount of science return by utilizing the microfluidic approach and be a valuable tool to structural biologists investigating medically relevant projects.

  8. The monitoring system for macromolecular crystallography beamlines at BSRF

    International Nuclear Information System (INIS)

    Guo Xian; Chang Guangcai; Gan Quan; Shi Hong; Liu Peng; Sun Gongxing

    2012-01-01

    The monitoring system for the macromolecular crystallography beamlines at BSRF (Beijing Synchrotron Radiation Facility), based on LabVIEW, is introduced. To guarantee safe, stable and reliable running of the beamline devices, the system monitors the state of the vacuum, cooling water, optical components, beam and liquid nitrogen in the beamlines in real time, detects faults and raises alarms promptly. The lower layer of the system uses drivers developed for the field devices to acquire data; the collected data are uploaded to a data-sharing platform that makes them accessible over the network. The upper layer is divided into modules according to function and provides the main interface of the beamline monitoring system. To facilitate data storage, management and querying, the system uses the LabSQL toolkit to interconnect with a MySQL database to which the collected data are sent. (authors)

  9. Inference rule and problem solving

    Energy Technology Data Exchange (ETDEWEB)

    Goto, S

    1982-04-01

    Intelligent information processing signifies the opportunity to have man's intellectual activity executed on a computer, in which inference, in place of ordinary calculation, is used as the basic operational mechanism. Many inference rules are derived from syllogisms in formal logic. The problem of programming this inference function is referred to as problem solving. Although inference and problem solving are logically closely related, the calculation ability of current computers is at a low level for inference. For clarifying the relation between inference and computers, nonmonotonic logic has been considered. The paper deals with the above topics. 16 references.
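In their simplest form, the syllogism-derived inference rules mentioned above amount to repeated application of modus ponens; a minimal forward-chaining sketch over an invented fact base:

```python
def forward_chain(facts, rules):
    """Apply modus ponens (premises -> conclusion) until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and set(premises) <= facts:
                facts.add(conclusion)  # fire the rule
                changed = True
    return facts

# Hypothetical knowledge base: a Socrates-style syllogism chain.
rules = [({"human(socrates)"}, "mortal(socrates)"),
         ({"mortal(socrates)"}, "dies(socrates)")]
derived = forward_chain({"human(socrates)"}, rules)
```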

  10. The 2D Structure of the T. brucei Preedited RPS12 mRNA Is Not Affected by Macromolecular Crowding

    Directory of Open Access Journals (Sweden)

    W.-Matthias Leeder

    2017-01-01

    Full Text Available Mitochondrial transcript maturation in African trypanosomes requires RNA editing to convert sequence-deficient pre-mRNAs into translatable mRNAs. The different pre-mRNAs have been shown to adopt highly stable 2D folds; however, it is not known whether these structures resemble the in vivo folds given the extreme “crowding” conditions within the mitochondrion. Here, we analyze the effects of macromolecular crowding on the structure of the mitochondrial RPS12 pre-mRNA. We use high molecular mass polyethylene glycol as a macromolecular cosolute and monitor the structure of the RNA globally and with nucleotide resolution. We demonstrate that crowding has no impact on the 2D fold and we conclude that the MFE structure in dilute solvent conditions represents a good proxy for the folding of the pre-mRNA in its mitochondrial solvent context.

  11. Macromolecular Engineering: New Routes Towards the Synthesis of Well-Defined Polyethers/Polyesters Co/Terpolymers with Different Architectures

    KAUST Repository

    Alamri, Haleema

    2016-05-18

    The primary objective of this research was to develop a new and efficient pathway for well-defined multicomponent homo/co/terpolymers of cyclic esters/ethers using an organocatalytic approach, with an emphasis on the macromolecular engineering aspects of the overall synthesis. Macromolecular engineering (as discussed in the first chapter) of homo/copolymers refers to the specific tailoring of these materials for achieving an easy and reproducible synthesis that results in precise molecular characteristics, i.e. molecular weight and polydispersity, as well as specific structure and end-group choices. Precise control of these molecular characteristics will provide access to new materials that can be used for pre-targeted purposes such as biomedical applications. Among the most commonly used engineering materials are polyesters (biocompatible and biodegradable) and polyethers (biocompatible), either as homopolymers or as copolymers with linear structures. The ability to create non-linear structures, for example stars, will open new horizons in the applications of these important polymeric materials. The second part of this thesis describes the synthesis of aliphatic polyesters, particularly polycaprolactone and polylactide, using a metal-free initiator/catalyst system. A phosphazene base (t-BuP2) was used as the catalyst for the ring-opening copolymerization of ε-caprolactone (ε-CL) and L-lactide (LLA) at room temperature with a variety of protic initiators in different solvents. These studies provided important information for the design of a metal-free route toward the synthesis of polyester-based (bio)materials. The third part of the thesis describes a novel route for the one-pot synthesis of polyether-b-polyester block copolymers with either a linear or a specific macromolecular architecture. Poly(styrene oxide)-b-poly(caprolactone)-b-poly(L-lactide) was prepared using this method with the goal of synthesizing poly(styrene oxide)-based materials since this

  12. Knowledge and inference

    CERN Document Server

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in the artificial intellig

  13. Geometric statistical inference

    International Nuclear Information System (INIS)

    Periwal, Vipul

    1999-01-01

    A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined

  14. Molecular Mechanisms Responsible for Increased Vulnerability of the Ageing Oocyte to Oxidative Damage

    Science.gov (United States)

    Redgrove, Kate A.; McLaughlin, Eileen A.

    2017-01-01

    In their mid-thirties, women experience a decline in fertility, coupled with a pronounced increase in the risk of aneuploidy, miscarriage, and birth defects. Although the aetiology of such pathologies is complex, a causative relationship between the age-related decline in oocyte quality and oxidative stress (OS) is now well established. What remains less certain are the molecular mechanisms governing the increased vulnerability of the aged oocyte to oxidative damage. In this review, we explore the reduced capacity of the ageing oocyte to mitigate macromolecular damage arising from oxidative insults and highlight the dramatic consequences for oocyte quality and female fertility. Indeed, while oocytes are typically endowed with a comprehensive suite of molecular mechanisms to moderate oxidative damage and thus ensure the fidelity of the germline, there is increasing recognition that the efficacy of such protective mechanisms undergoes an age-related decline. For instance, impaired reactive oxygen species metabolism, decreased DNA repair, reduced sensitivity of the spindle assembly checkpoint, and decreased capacity for protein repair and degradation collectively render the aged oocyte acutely vulnerable to OS and limit its capacity to recover from exposure to such insults. We also highlight the inadequacies of our current armoury of assisted reproductive technologies to combat age-related female infertility, emphasising the need for further research into mechanisms underpinning the functional deterioration of the ageing oocyte. PMID:29312475

  15. Can visco-elastic phase separation, macromolecular crowding and colloidal physics explain nuclear organisation?

    Directory of Open Access Journals (Sweden)

    Iborra Francisco J

    2007-04-01

    Full Text Available Abstract Background The cell nucleus is highly compartmentalized into well-defined domains, yet it is not well understood how this nuclear order is maintained. Many scientists, fascinated by the distinct structures observed in the nucleus, have sought to attribute functions to them. In order to distinguish functional compartments from non-functional aggregates, I believe it is important to investigate the biophysical nature of nuclear organisation. Results The various nuclear compartments can be divided broadly into chromatin-based and protein- and/or RNA-based, and they have very different dynamic properties. The chromatin compartment displays slow, constrained diffusional motion. The protein/RNA compartment, on the other hand, is very dynamic. Physical systems with such dynamical asymmetry undergo viscoelastic phase separation. This phase-separation phenomenon leads to the formation of a long-lived interaction network of slow components (chromatin) scattered within domains rich in fast components (protein/RNA). Moreover, the nucleus is packed with macromolecules at concentrations on the order of 300 mg/ml. This high concentration of macromolecules produces volume-exclusion effects that enhance attractive interactions between macromolecules, known as macromolecular crowding, which favours the formation of compartments. In this paper I hypothesise that nuclear compartmentalization can be explained by viscoelastic phase separation of the dynamically different nuclear components, in combination with macromolecular crowding and the properties of colloidal particles. Conclusion I demonstrate that nuclear structure can satisfy the predictions of this hypothesis, and I discuss the functional implications of this phenomenon.

  16. Glycogen-graft-poly(2-alkyl-2-oxazolines) - the new versatile biopolymer-based thermoresponsive macromolecular toolbox

    Czech Academy of Sciences Publication Activity Database

    Pospíšilová, Aneta; Filippov, Sergey K.; Bogomolova, Anna; Turner, S.; Sedláček, Ondřej; Matushkin, Nikolai; Černochová, Zulfiya; Štěpánek, Petr; Hrubý, Martin

    2014-01-01

    Roč. 4, č. 106 (2014), s. 61580-61588 ISSN 2046-2069 R&D Projects: GA ČR GA13-08336S; GA MŠk(CZ) LH14079 Grant - others:AV ČR(CZ) M200501201; AV ČR(CZ) ASCR/CONICET 2012CZ006 Program:M Institutional support: RVO:61389013 Keywords : glycogen * poly(2-alkyl-2-oxazoline) * hybrid copolymer Subject RIV: CD - Macromolecular Chemistry Impact factor: 3.840, year: 2014

  17. Time reversed Lamb wave for damage detection in a stiffened aluminum plate

    International Nuclear Information System (INIS)

    Bijudas, C R; Mitra, M; Mujumdar, P M

    2013-01-01

    According to the concept of time reversibility of Lamb waves, in the absence of damage a Lamb wave signal can be reconstructed at the transmitter location if a time-reversed signal is sent back from the receiver location. This property is used for baseline-free damage detection, where the presence of damage breaks down time reversibility, and the mismatch between the reconstructed and the input signal is inferred as indicating damage. This paper presents an experimental and a simulation study of baseline-free damage detection in a stiffened aluminum plate by the time-reversed Lamb wave (TRLW) method. In this study, a single Lamb wave mode (A0) is generated and sensed using piezoelectric (PZT) transducers through specific transducer placement and amplitude tuning. Different stiffening configurations, such as plane and T-stiffeners, are considered. Damage cases of disbonding of stiffeners from the base plate, and vertical and embedded cracks in the stiffened plate, are studied. The results show that TRLW-based schemes can efficiently identify the presence of damage in a stiffened plate. (paper)
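
    The reconstruction-mismatch idea can be sketched numerically. The toy model below is an illustration, not the authors' code: the plate impulse response, the clipping nonlinearity standing in for contact-type damage, and the correlation-based damage index are all assumptions. A windowed toneburst is propagated through a linear "plate", the received signal is time-reversed and re-emitted, and the reconstruction is compared against the (reversed) input; the clipping breaks time reversibility and raises the index.

```python
import numpy as np

def toneburst(f0=100e3, cycles=5, fs=10e6):
    """Hanning-windowed sine toneburst, a common Lamb-wave excitation."""
    n = int(cycles / f0 * fs)
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * f0 * t) * np.hanning(n)

def propagate(sig, h, damaged=False):
    """Toy plate: linear convolution with an impulse response; 'damage' is
    mimicked by amplitude clipping, which breaks time reversibility."""
    y = np.convolve(sig, h)
    if damaged:
        lim = 0.3 * np.abs(y).max()
        y = np.clip(y, -lim, lim)
    return y

def mismatch(reference, reconstructed):
    """1 - peak normalized cross-correlation: near 0 for a good reconstruction."""
    xc = np.correlate(reconstructed, reference, mode="full").max()
    return 1.0 - xc / (np.linalg.norm(reference) * np.linalg.norm(reconstructed))

rng = np.random.default_rng(0)
h = rng.standard_normal(64) * np.exp(-np.arange(64) / 20.0)  # assumed plate response
s = toneburst()

DI = {}
for damaged in (False, True):
    received = propagate(s, h, damaged)
    reconstructed = propagate(received[::-1], h, damaged)  # re-emit reversed signal
    DI[damaged] = mismatch(s[::-1], reconstructed)         # compare with reversed input
    print(f"damaged={damaged}: damage index = {DI[damaged]:.3f}")
```

    For a linear, reciprocal path the reconstruction resembles the time-reversed input and the index stays low; the nonlinearity degrades the reconstruction and the index grows, with no undamaged baseline recording needed.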

  18. Goal inferences about robot behavior : goal inferences and human response behaviors

    NARCIS (Netherlands)

    Broers, H.A.T.; Ham, J.R.C.; Broeders, R.; De Silva, P.; Okada, M.

    2014-01-01

    This explorative research focused on the goal inferences human observers draw based on a robot's behavior, and the extent to which those inferences predict people's behavior in response to that robot. Results show that different robot behaviors cause different response behavior from people.

  19. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    Science.gov (United States)

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.
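
    The core idea of down-weighting variable regions can be illustrated apart from THESEUS itself. The sketch below (an illustrative assumption, not the program's algorithm: it uses a per-atom inverse-variance-weighted Kabsch/SVD fit, here with uniform weights on synthetic coordinates) shows a rigid superposition in which unreliable atoms could be given smaller weights:

```python
import numpy as np

def weighted_superpose(mobile, target, w):
    """Least-squares rigid superposition of `mobile` onto `target`,
    down-weighting variable atoms via per-atom weights `w` (Kabsch/SVD)."""
    w = w[:, None] / w.sum()
    mc, tc = (w * mobile).sum(0), (w * target).sum(0)  # weighted centroids
    P, Q = mobile - mc, target - tc
    H = (w * P).T @ Q                                  # weighted covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return P @ R.T + tc

rng = np.random.default_rng(1)
target = rng.standard_normal((50, 3))                  # synthetic "structure"
theta = 0.7
R0 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
mobile = target @ R0.T + np.array([5.0, -2.0, 1.0])    # rotated and shifted copy
w = np.ones(50)                                        # inverse-variance weights (uniform here)

fitted = weighted_superpose(mobile, target, w)
rmsd = np.sqrt(((fitted - target) ** 2).sum(1).mean())
print(f"RMSD after superposition: {rmsd:.6f}")
```

    With non-uniform weights (e.g. the reciprocal of each atom's positional variance across an ensemble), flexible loops contribute little to the fit, which is the qualitative behaviour the abstract attributes to ML superpositioning.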

  20. Macromolecular organization of xyloglucan and cellulose in pea epicotyls

    International Nuclear Information System (INIS)

    Hayashi, T.; Maclachlan, G.

    1984-01-01

    Xyloglucan is known to occur widely in the primary cell walls of higher plants. This polysaccharide in most dicots possesses a cellulose-like main chain with three of every four consecutive residues substituted with xylose and minor addition of other sugars. Xyloglucan and cellulose metabolism is regulated by different processes; since different enzyme systems are probably required for the synthesis of their 1,4-β-linkages. A macromolecular complex composed of xyloglucan and cellulose only was obtained from elongating regions of etiolated pea stems. It was examined by light microscopy using iodine staining, by radioautography after labeling with [ 3 H]fructose, by fluorescence microscopy using a fluorescein-lectin (fructose-binding) as probe, and by electron microscopy after shadowing. The techniques all demonstrated that the macromolecule was present in files of cell shapes, referred to here as cell-wall ghosts, in which xyloglucan was localized both on and between the cellulose microfibrils

  1. AR-NE3A, a New Macromolecular Crystallography Beamline for Pharmaceutical Applications at the Photon Factory

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi

    2010-01-01

    Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than that at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample-exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are carried out automatically based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.

  2. Organ specific acute toxicity of the carcinogen trans-4-acetylaminostilbene is not correlated with macromolecular binding.

    Science.gov (United States)

    Pfeifer, A; Neumann, H G

    1986-09-01

    trans-4-Acetylaminostilbene (trans-AAS) is acutely toxic in rats and lesions are produced specifically in the glandular stomach. Toxicity is slightly increased by pretreating the animals with phenobarbital (PB) and is completely prevented by pretreatment with methylcholanthrene (MC). The prostaglandin inhibitors, indomethacin and acetyl salicylic acid, do not reduce toxicity. The high efficiency of MC suggested that toxicity is caused by reactive metabolites. trans-[3H]-AAS was administered orally to untreated and to PB- or MC-pretreated female Wistar rats and target doses in different tissues were measured by means of covalent binding to proteins, RNA and DNA. Macromolecular binding in the target tissue of poisoned animals was significantly lower than in liver and kidney and comparable to other non-target tissues. Pretreatment with MC lowered macromolecular binding in all extrahepatic tissues but not in liver. These findings are not in line with tissue specific metabolic activation. The only unique property of the target tissue, glandular stomach, that we observed was a particular affinity for the systemically available parent compound. In the early phase of poisoning, tissue concentrations were exceedingly high and the stomach function was impaired.

  3. Entropic Inference

    OpenAIRE

    Caticha, Ariel

    2010-01-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEn...
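
    The ME update can be illustrated with the classic constrained-die example (an illustrative sketch, not the paper's formalism: the uniform prior, the target mean of 4.5, and the bisection solver are assumptions). Minimizing relative entropy to the prior subject to an expectation constraint yields an exponentially tilted posterior proportional to prior·exp(λx), with λ chosen so the constraint holds:

```python
import numpy as np

def maxent_update(prior, x, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum relative Entropy update of `prior` subject to E[x] = target_mean.
    The constrained optimum is the exponential tilt prior*exp(lam*x); lam is
    found by bisection, since the tilted mean is monotone in lam."""
    def tilted_mean(lam):
        p = prior * np.exp(lam * x)
        p /= p.sum()
        return (p * x).sum()
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if tilted_mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    p = prior * np.exp(lam * x)
    return p / p.sum()

faces = np.arange(1, 7, dtype=float)
prior = np.full(6, 1.0 / 6.0)          # uniform prior over die faces
post = maxent_update(prior, faces, 4.5)
print(post.round(4), (post * faces).sum())
```

    With a uniform prior this reduces to ordinary MaxEnt; with an informative prior the same machinery performs the relative-entropy update the tutorial describes.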

  4. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operations. Data processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiment via a Web browser. (author)

  5. A facile metal-free "grafting-from" route from acrylamide-based substrate toward complex macromolecular combs

    KAUST Repository

    Zhao, Junpeng

    2013-01-01

    High-molecular-weight poly(N,N-dimethylacrylamide-co-acrylamide) was used as a model functional substrate to investigate phosphazene base (t-BuP4)-promoted metal-free anionic graft polymerization utilizing primary amide moieties as initiating sites. The (co)polymerization of epoxides was proven to be effective, leading to macromolecular combs with side chains being single- or double-graft homopolymer, block copolymer and statistical copolymer. © 2013 The Royal Society of Chemistry.

  6. Macromolecular pHPMA-based nanoparticles with cholesterol for solid tumor targeting: behavior in HSA protein environment

    Czech Academy of Sciences Publication Activity Database

    Zhang, X.; Niebuur, B.-J.; Chytil, Petr; Etrych, Tomáš; Filippov, Sergey K.; Kikhney, A.; Wieland, D. C. F.; Svergun, D. I.; Papadakis, C. M.

    2018-01-01

    Roč. 19, č. 2 (2018), s. 470-480 ISSN 1525-7797 R&D Projects: GA ČR(CZ) GC15-10527J; GA MZd(CZ) NV16-28594A; GA MŠk(CZ) LO1507 Institutional support: RVO:61389013 Keywords : polymer carriers * N-(2-hydroxypropyl)methacrylamide * tumor targeting Subject RIV: CD - Macromolecular Chemistry OBOR OECD: Polymer science Impact factor: 5.246, year: 2016

  7. Learning Convex Inference of Marginals

    OpenAIRE

    Domke, Justin

    2012-01-01

    Graphical models trained using maximum likelihood are a common tool for probabilistic inference of marginal distributions. However, this approach suffers difficulties when either the inference process or the model is approximate. In this paper, the inference process is first defined to be the minimization of a convex function, inspired by free energy approximations. Learning is then done directly in terms of the performance of the inference process at univariate marginal prediction. The main ...

  8. Theory of Mind in Adults with Right Hemisphere Damage: What's the Story?

    Science.gov (United States)

    Weed, Ethan; McGregor, William; Nielsen, Jorgen Feldbaek; Roepstorff, Andreas; Frith, Uta

    2010-01-01

    Why do people with right hemisphere damage (RHD) have difficulty with pragmatics and communication? One hypothesis has been that pragmatic impairment in RHD is the result of an underlying impairment in Theory of Mind (ToM): the ability to infer the mental states of others. In previous studies evaluating ToM abilities in people with RHD,…

  9. Analysis of elastic nonlinearity for impact damage detection in composite laminates

    International Nuclear Information System (INIS)

    Frau, A; Porcu, M C; Aymerich, F; Pieczonka, L; Staszewski, W J

    2015-01-01

    This paper concerns the experimental analysis of nonlinear response features of a composite laminate plate for impact damage detection. The measurement procedure is based on the Scaling Subtraction Method (SSM) and consists in exciting the damaged specimen with two sinusoidal signals of different amplitudes. The linearly rescaled response to the low-amplitude excitation is subtracted from the response to the large-amplitude excitation to extract the nonlinear signatures. The latter are analysed in the time domain to infer the presence of damage. Results are compared with frequency-domain analyses using the nonlinear vibro-acoustic modulation technique (NWMS). Changes in amplitude and phase, as well as modulation effects, of the acquired responses are also monitored. Surface-bonded, low-profile piezoceramic transducers are used for excitation and sensing. Both measurement techniques are applied to detect barely visible impact damage in a composite laminate plate. Non-destructive penetrant-enhanced X-ray inspections are carried out to characterize the extent of internal damage. The behavior of the nonlinear features and the sensitivity of each technique are also investigated in the paper. (paper)
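
    The SSM extraction step is simple enough to sketch. In this toy version (the quadratic specimen response, drive frequency and amplitudes are illustrative assumptions, not the paper's setup), the low-amplitude response is linearly rescaled and subtracted from the high-amplitude response; for a purely linear specimen the residual vanishes, while damage-induced nonlinearity leaves a nonzero signature:

```python
import numpy as np

def respond(excitation, beta=0.0):
    """Toy specimen: linear response plus a quadratic term of strength beta,
    standing in for damage-induced elastic nonlinearity (an assumption)."""
    return excitation + beta * excitation**2

t = np.linspace(0.0, 1e-3, 2000)
drive = np.sin(2 * np.pi * 50e3 * t)   # arbitrary 50 kHz sinusoidal excitation
a_low, a_high = 0.1, 1.0               # the two excitation amplitudes

theta = {}
for beta, label in ((0.0, "intact"), (0.05, "damaged")):
    y_low = respond(a_low * drive, beta)
    y_high = respond(a_high * drive, beta)
    residual = y_high - (a_high / a_low) * y_low   # SSM: scaled subtraction
    theta[label] = np.linalg.norm(residual) / np.linalg.norm(y_high)
    print(f"{label}: nonlinearity indicator = {theta[label]:.4f}")
```

    The residual is exactly the part of the response that fails to scale linearly with excitation amplitude, which is why SSM needs no undamaged baseline signal.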

  10. Probabilistic inductive inference: a survey

    OpenAIRE

    Ambainis, Andris

    2001-01-01

    Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.

  11. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptibility loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these packages require specifically formatted input files and generate output files of various types, which is inconvenient in practice. We developed a tool set, the Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input formats as well as standardize and summarize inference results for four popular local ancestry inference packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that it makes running multiple local ancestry inference packages convenient. In addition, we evaluated the performance of the supported packages, focusing mainly on inference accuracy and the computational resources used. LAIT thus facilitates the use of local ancestry inference software, especially for users with a limited bioinformatics background.

  12. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  13. Fluid Physics and Macromolecular Crystal Growth in Microgravity

    Science.gov (United States)

    Helliwell, John R.; Snell, Edward H.; Chayen, Naomi E.; Judge, Russell A.; Boggon, Titus J.; Pusey, M. L.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The first protein crystallization experiment in microgravity was launched in April 1981 and used Germany's Technologische Experimente unter Schwerelosigkeit (TEXUS 3) sounding rocket. The protein β-galactosidase (molecular weight 465 kDa) was chosen as the sample, with a liquid-liquid diffusion growth method. A sliding device brought the protein, buffer and salt solution into contact when microgravity was reached. The sounding rocket gave six minutes of microgravity time, with a cine camera and schlieren optics used to monitor the experiment, a single growth cell. In microgravity a strictly laminar diffusion process was observed, in contrast to the turbulent convection seen on the ground. Several single crystals, approximately 100 µm in length, were formed in the flight, which were of inferior but comparable visual quality to those grown on the ground over several days. A second experiment using the same protocol but with solutions cooled to -8°C (kept liquid with glycerol antifreeze) again showed laminar diffusion. The science of macromolecular structural crystallography involves crystallization of the macromolecule followed by use of the crystal for X-ray diffraction experiments to determine the three-dimensional structure of the macromolecule. Neutron protein crystallography is employed for elucidation of H/D exchange and for improved definition of the bound solvent (D2O). The structural information enables an understanding of how the molecule functions, with important potential for rational drug design, improved efficiency of industrial enzymes and agricultural chemical development. The removal of turbulent convection and sedimentation in microgravity, and the assumption that higher-quality crystals will be produced, has given rise to the growing number of crystallization experiments now flown. Many experiments can be flown in a small volume with simple, largely automated, equipment - an ideal combination for a microgravity experiment.
The term "protein crystal growth

  14. Extracting trends from two decades of microgravity macromolecular crystallization history.

    Science.gov (United States)

    Judge, Russell A; Snell, Edward H; van der Woerd, Mark J

    2005-06-01

    Since the 1980s, hundreds of macromolecular crystal growth experiments have been performed in the reduced-acceleration environment of an orbiting spacecraft. Significant enhancements in structural knowledge have resulted from X-ray diffraction of the crystals grown. Equally, many samples have shown no improvement, or even degradation, in comparison with those grown on the ground. A complex series of interrelated factors affects these experiments, and by building a comprehensive archive of the results the aim was to identify factors that result in success and those that result in failure. Specifically, it was found that dedicated microgravity missions increase the chance of success compared with missions where crystallization took place as a parasitic aspect. It was also found that the chance of success could not be predicted from any discernible property of the macromolecule available to us.

  15. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Science.gov (United States)

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

    People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences - from intentionality and desire to belief to personality - that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research. (c) 2012 APA, all rights reserved.

  16. A new paradigm for macromolecular crystallography beamlines derived from high-pressure methodology and results

    Energy Technology Data Exchange (ETDEWEB)

    Fourme, Roger, E-mail: roger.fourme@synchrotron-soleil.fr [Synchrotron SOLEIL, BP 48, Saint Aubin, 91192 Gif-sur-Yvette (France); Girard, Eric [IBS (UMR 5075 CEA-CNRS-UJF-PSB), 41 rue Jules Horowitz, 38027 Grenoble Cedex (France); Dhaussy, Anne-Claire [CRISMAT, ENSICAEN, 6 Boulevard du Maréchal Juin, 14000 Caen (France); Medjoubi, Kadda [Synchrotron SOLEIL, BP 48, Saint Aubin, 91192 Gif-sur-Yvette (France); Prangé, Thierry [LCRB (UMR 8015 CNRS), Université Paris Descartes, Faculté de Pharmacie, 4 avenue de l’Observatoire, 75270 Paris (France); Ascone, Isabella [ENSCP (UMR CNRS 7223), 11 rue Pierre et Marie Curie, 75231 Paris Cedex 05 (France); Mezouar, Mohamed [ESRF, BP 220, 38043 Grenoble (France); Kahn, Richard [IBS (UMR 5075 CEA-CNRS-UJF-PSB), 41 rue Jules Horowitz, 38027 Grenoble Cedex (France)

    2011-01-01

    Macromolecular crystallography at high pressure (HPMX) is a mature technique. Shorter X-ray wavelengths increase data collection efficiency on cryocooled crystals. Extending applications and exploiting spin-off of HPMX will require dedicated synchrotron radiation beamlines based on a new paradigm. Biological structures can now be investigated at high resolution by high-pressure X-ray macromolecular crystallography (HPMX). The number of HPMX studies is growing, with applications to polynucleotides, monomeric and multimeric proteins, complex assemblies and even a virus capsid. Investigations of the effects of pressure perturbation have encompassed elastic compression of the native state, study of proteins from extremophiles and trapping of higher-energy conformers that are often of biological interest; measurements of the compressibility of crystals and macromolecules were also performed. HPMX results were an incentive to investigate short and ultra-short wavelengths for standard biocrystallography. On cryocooled lysozyme crystals it was found that the data collection efficiency using 33 keV photons is increased with respect to 18 keV photons. This conclusion was extended from 33 keV down to 6.5 keV by exploiting previously published data. To be fully exploited, the potential of higher-energy photons requires detectors with a good efficiency. Accordingly, a new paradigm for MX beamlines was suggested, using conventional short and ultra-short wavelengths, aiming at the collection of very high accuracy data on crystals under standard conditions or under high pressure. The main elements of such beamlines are outlined.

  17. INFERENCE BUILDING BLOCKS

    Science.gov (United States)

    2018-02-15

    expressed a variety of inference techniques on discrete and continuous distributions: exact inference, importance sampling, Metropolis-Hastings (MH...without redoing any math or rewriting any code. And although our main goal is composable reuse, our performance is also good because we can use...control paths. • The Hakaru language can express mixtures of discrete and continuous distributions, but the current disintegration transformation

  18. Practical Bayesian Inference

    Science.gov (United States)

    Bailer-Jones, Coryn A. L.

    2017-04-01

    Preface; 1. Probability basics; 2. Estimation and uncertainty; 3. Statistical models and inference; 4. Linear models, least squares, and maximum likelihood; 5. Parameter estimation: single parameter; 6. Parameter estimation: multiple parameters; 7. Approximating distributions; 8. Monte Carlo methods for inference; 9. Parameter estimation: Markov chain Monte Carlo; 10. Frequentist hypothesis testing; 11. Model comparison; 12. Dealing with more complicated problems; References; Index.

  19. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography.

    Science.gov (United States)

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L; Armour, Wes; Waterman, David G; Iwata, So; Evans, Gwyndaf

    2013-08-01

    The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the structure solution of a novel membrane protein.

  20. Logical inference and evaluation

    International Nuclear Information System (INIS)

    Perey, F.G.

    1981-01-01

    Most methodologies of evaluation currently used are based upon the theory of statistical inference. It is generally perceived that this theory is not capable of dealing satisfactorily with what are called systematic errors. Theories of logical inference should be capable of treating all of the information available, including that not involving frequency data. A theory of logical inference is presented as an extension of deductive logic via the concept of plausibility and the application of group theory. Some conclusions, based upon the application of this theory to evaluation of data, are also given

  1. An integrated native mass spectrometry and top-down proteomics method that connects sequence to structure and function of macromolecular complexes

    Science.gov (United States)

    Li, Huilin; Nguyen, Hong Hanh; Ogorzalek Loo, Rachel R.; Campuzano, Iain D. G.; Loo, Joseph A.

    2018-02-01

    Mass spectrometry (MS) has become a crucial technique for the analysis of protein complexes. Native MS has traditionally examined protein subunit arrangements, while proteomics MS has focused on sequence identification. These two techniques are usually performed separately without taking advantage of the synergies between them. Here we describe the development of an integrated native MS and top-down proteomics method using Fourier-transform ion cyclotron resonance (FTICR) to analyse macromolecular protein complexes in a single experiment. We address previous concerns of employing FTICR MS to measure large macromolecular complexes by demonstrating the detection of complexes up to 1.8 MDa, and we demonstrate the efficacy of this technique for direct acquirement of sequence to higher-order structural information with several large complexes. We then summarize the unique functionalities of different activation/dissociation techniques. The platform expands the ability of MS to integrate proteomics and structural biology to provide insights into protein structure, function and regulation.

  2. NATO Advanced Study Institute on Evolving Methods for Macromolecular Crystallography

    CERN Document Server

    Read, Randy J

    2007-01-01

    X-ray crystallography is the pre-eminent technique for visualizing the structures of macromolecules at atomic resolution. These structures are central to understanding the detailed mechanisms of biological processes, and to discovering novel therapeutics using a structure-based approach. As yet, structures are known for only a small fraction of the proteins encoded by human and pathogenic genomes. To counter the myriad modern threats of disease, there is an urgent need to determine the structures of the thousands of proteins whose structure and function remain unknown. This volume draws on the expertise of leaders in the field of macromolecular crystallography to illuminate the dramatic developments that are accelerating progress in structural biology. Their contributions span the range of techniques from crystallization through data collection, structure solution and analysis, and show how modern high-throughput methods are contributing to a deeper understanding of medical problems.

  3. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text written by Jesper Møller, Aalborg University, is submitted for the collection ‘Stochastic Geometry: Highlights, Interactions and New Perspectives', edited by Wilfrid S. Kendall and Ilya Molchanov, to be published by Clarendon Press, Oxford, and planned to appear as Section 4.1 with the title ‘Inference'.) This contribution concerns statistical inference for parametric models used in stochastic geometry, based on quick and simple simulation-free procedures as well as more comprehensive methods using Markov chain Monte Carlo (MCMC) simulations. Due to space limitations the focus...

  4. System Would Detect Foreign-Object Damage in Turbofan Engine

    Science.gov (United States)

    Torso, James A.; Litt, Jonathan S.

    2006-01-01

    A proposed data-fusion system, to be implemented mostly in software, would further process the digitized and preprocessed outputs of sensors in a turbofan engine to detect foreign-object damage (FOD) [more precisely, damage caused by impingement of such foreign objects as birds, pieces of ice, and runway debris]. The proposed system could help a flight crew to decide what, if any, response is necessary to complete a flight safely, and could aid mechanics in deciding what post-flight maintenance action might be needed. The sensory information to be utilized by the proposed system would consist of (1) the output of an accelerometer in an engine-vibration-monitoring subsystem and (2) features extracted from a gas path analysis. ["Gas path analysis" (GPA) is a term of art that denotes comprehensive analysis of engine performance derived from readings of fuel-flow meters, shaft-speed sensors, temperature sensors, and the like.] The acceleration signal would first be processed by a wavelet-transform-based algorithm, using a wavelet created for the specific purpose of finding abrupt FOD-induced changes in noisy accelerometer signals. Two additional features extracted would be the amplitude of vibration (determined via a single-frequency Fourier transform calculated at the rotational speed of the engine) and the rate of change in amplitude due to an FOD-induced rotor imbalance. The system would utilize two GPA features: the fan efficiency and the rate of change of fan efficiency with time. The selected GPA and vibrational features would be assessed by two fuzzy-logic inference engines, denoted the "Gas Path Expert" and the "Vibration Expert," respectively (see Figure 1). Each of these inference engines would generate a "possibility" distribution for occurrence of an FOD event: each would assign, to its input information, degrees of membership, which would subsequently be transformed into basic probability assignments for the gas path and vibration
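
The two-expert fusion described above can be sketched in miniature. The membership ramps, thresholds and the minimum t-norm below are illustrative assumptions for the sketch, not the values or fusion rule of the proposed NASA system:

```python
def vibration_expert(amp_jump):
    """Possibility of FOD given an abrupt jump in vibration amplitude (g).
    Ramp thresholds (0.2 g to 1.0 g) are illustrative assumptions."""
    return min(1.0, max(0.0, (amp_jump - 0.2) / 0.8))

def gas_path_expert(eff_drop_rate):
    """Possibility of FOD given the rate of fan-efficiency loss (%/s).
    The 0.5 %/s saturation point is an assumption."""
    return min(1.0, max(0.0, eff_drop_rate / 0.5))

def fuse(p_vib, p_gp):
    """Conservative fusion: declare FOD possible only to the degree both
    experts agree (fuzzy AND via the minimum t-norm)."""
    return min(p_vib, p_gp)

possibility = fuse(vibration_expert(0.9), gas_path_expert(0.4))  # → 0.8
```

Requiring agreement between the vibration and gas-path experts, as in `fuse`, suppresses false alarms from a transient in either channel alone.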

  5. Lower complexity bounds for lifted inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2015-01-01

    instances of the model. Numerous approaches for such “lifted inference” techniques have been proposed. While it has been demonstrated that these techniques will lead to significantly more efficient inference on some specific models, there are only very recent and still quite restricted results that show the feasibility of lifted inference on certain syntactically defined classes of models. Lower complexity bounds that imply some limitations for the feasibility of lifted inference on more expressive model classes were established earlier in Jaeger (2000). It is shown that under the assumption that NETIME ≠ ETIME, there is no polynomial lifted inference algorithm for knowledge bases of weighted, quantifier- and function-free formulas. Further strengthening earlier results, this is also shown to hold for approximate inference and for knowledge bases not containing...

  6. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

    2.2.1 Background There are a number of statistical inference problems that are not generally formulated via a full probability model... For the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle

  7. Updated Simulation Studies of Damage Limit of LHC Tertiary Collimators

    CERN Document Server

    AUTHOR|(CDS)2085459; Bertarelli, Alessandro; Bruce, Roderik; Carra, Federico; Cerutti, Francesco; Gradassi, Paolo; Lechner, Anton; Redaelli, Stefano; Skordis, Eleftherios

    2015-01-01

    The tertiary collimators (TCTs) in the LHC, installed in front of the experiments, intercept in standard operation a fraction of about 10⁻³ of the halo particles. However, they risk being hit by high-intensity primary beams in the case of an asynchronous beam dump. TCT damage thresholds were initially inferred from the results of destructive tests on a TCT jaw, supported by numerical simulations, assuming simplified impact scenarios with one single bunch hitting the jaw with a given impact parameter. In this paper, more realistic failure conditions, including a train of bunches and taking into account the full collimation hierarchy, are used to derive updated damage limits. The results are used to update the margins in the collimation hierarchy and could thus potentially have an influence on LHC performance.

  8. Permeability to macromolecular contrast media quantified by dynamic MRI correlates with tumor tissue assays of vascular endothelial growth factor (VEGF)

    International Nuclear Information System (INIS)

    Cyran, Clemens C.; Sennino, Barbara; Fu, Yanjun; Rogut, Victor; Shames, David M.; Chaopathomkul, Bundit; Wendland, Michael F.; McDonald, Donald M.; Brasch, Robert C.; Raatschen, Hans-Juergen

    2012-01-01

    Purpose: To correlate dynamic MRI assays of macromolecular endothelial permeability with microscopic area–density measurements of vascular endothelial growth factor (VEGF) in tumors. Methods and material: This study compared tumor xenografts from two different human cancer cell lines, MDA-MB-231 (n = 5) and MDA-MB-435 (n = 8), reported to express higher and lower levels of VEGF, respectively. Dynamic MRI was enhanced by a prototype macromolecular contrast medium (MMCM), albumin-(Gd-DTPA)₃₅. Quantitative estimates of tumor microvascular permeability (K^PS; µl/min × 100 cm³), obtained using a two-compartment kinetic model, were correlated with immunohistochemical measurements of VEGF in each tumor. Results: Mean K^PS was 2.4 times greater in MDA-MB-231 tumors (K^PS = 58 ± 30.9 µl/min × 100 cm³) than in MDA-MB-435 tumors (K^PS = 24 ± 8.4 µl/min × 100 cm³) (p < 0.05). Correspondingly, the area–density of VEGF in MDA-MB-231 tumors was 2.6 times greater (27.3 ± 2.2%, p < 0.05) than in MDA-MB-435 cancers (10.5 ± 0.5%, p < 0.05). Considering all tumors without regard to cell type, a significant positive correlation (r = 0.67, p < 0.05) was observed between MRI-estimated endothelial permeability and VEGF immunoreactivity. Conclusion: The correlation between MRI assays of endothelial permeability to an MMCM and VEGF immunoreactivity of tumors supports the hypothesis that VEGF is a major contributor to increased macromolecular permeability in cancers. When applied clinically, the MMCM-enhanced MRI approach could help to optimize the appropriate application of VEGF-inhibiting therapy on an individual patient basis.
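
As a rough illustration of how a two-compartment kinetic model yields a permeability estimate, the sketch below fits a plasma-volume term and a transfer constant by linear least squares from plasma and tissue enhancement curves (a unidirectional, Patlak-type simplification). The model form, variable names and synthetic data are assumptions for illustration, not the study's actual analysis:

```python
def patlak_fit(t, cp, ct):
    """Least-squares fit of ct ≈ vp*cp + kps*∫cp dτ (unidirectional
    two-compartment model); returns (vp, kps).
    t: time points, cp: plasma curve, ct: tissue curve."""
    # cumulative trapezoid integral of the plasma curve
    icp = [0.0]
    for i in range(1, len(t)):
        icp.append(icp[-1] + 0.5 * (cp[i] + cp[i - 1]) * (t[i] - t[i - 1]))
    # normal equations for the two regressors cp and icp
    sxx = sum(c * c for c in cp)
    sxy = sum(c * i for c, i in zip(cp, icp))
    syy = sum(i * i for i in icp)
    bx = sum(c * y for c, y in zip(cp, ct))
    by = sum(i * y for i, y in zip(icp, ct))
    det = sxx * syy - sxy * sxy
    vp = (bx * syy - by * sxy) / det
    kps = (by * sxx - bx * sxy) / det
    return vp, kps
```

With noise-free synthetic curves the fit recovers the generating parameters exactly; with measured enhancement data the same linear system gives the K^PS-type estimate that is then correlated with histology.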

  9. Macromolecular crowding-assisted fabrication of liquid-crystalline imprinted polymers.

    Science.gov (United States)

    Zhang, Chen; Zhang, Jing; Huang, Yan-Ping; Liu, Zhao-Sheng

    2015-04-01

    A macromolecular crowding-assisted liquid-crystalline molecularly imprinted monolith (LC-MIM) was prepared successfully for the first time. The imprinted stationary phase was synthesized with polymethyl methacrylate (PMMA) or polystyrene (PS) as the crowding agent, 4-cyanophenyl dicyclohexyl propylene (CPCE) as the liquid-crystal monomer, and hydroquinidine as the pseudo-template for the chiral separation of cinchona alkaloids in HPLC. A low level of cross-linker (26%) has been found to be sufficient to achieve molecular recognition on the crowding-assisted LC-MIM due to the physical cross-linking of mesogenic groups in place of chemical cross-linking, and baseline separation of quinidine and quinine could be achieved with good resolution (R(s) = 2.96), selectivity factor (α = 2.16), and column efficiency (N = 2650 plates/m). In contrast, the LC-MIM prepared without crowding agents displayed the smallest diastereoselectivity (α = 1.90), while the crowding-assisted MIM with a high level of cross-linker (80%) gave the greatest selectivity factor (α = 7.65) but the lowest column efficiency (N = 177 plates/m).
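
The reported figures of merit follow from the standard chromatographic definitions; a minimal sketch, assuming the usual textbook formulas rather than anything specific to this study:

```python
def selectivity(t0, t1, t2):
    """Selectivity factor α = k2/k1 = (t2 - t0)/(t1 - t0), from the
    dead time t0 and the retention times t1 < t2 of the two peaks."""
    return (t2 - t0) / (t1 - t0)

def resolution(t1, w1, t2, w2):
    """Resolution Rs = 2(t2 - t1)/(w1 + w2), from retention times and
    baseline peak widths in the same time units."""
    return 2.0 * (t2 - t1) / (w1 + w2)
```

By these definitions, values such as α ≈ 2.16 and Rs ≈ 2.96 (as reported for quinidine/quinine) correspond to well-resolved, baseline-separated peaks, since Rs ≥ 1.5 is the usual baseline-separation criterion.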

  10. On macromolecular refinement at subatomic resolution with interatomic scatterers

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V., E-mail: pafonine@lbl.gov; Grosse-Kunstleve, Ralf W.; Adams, Paul D. [Lawrence Berkeley National Laboratory, One Cyclotron Road, BLDG 64R0121, Berkeley, CA 94720 (United States); Lunin, Vladimir Y. [Institute of Mathematical Problems of Biology, Russian Academy of Sciences, Pushchino 142290 (Russian Federation); Urzhumtsev, Alexandre [IGMBC, 1 Rue L. Fries, 67404 Illkirch and IBMC, 15 Rue R. Descartes, 67084 Strasbourg (France); Faculty of Sciences, Nancy University, 54506 Vandoeuvre-lès-Nancy (France); Lawrence Berkeley National Laboratory, One Cyclotron Road, BLDG 64R0121, Berkeley, CA 94720 (United States)

    2007-11-01

    Modelling deformation electron density using interatomic scatterers is simpler than multipolar methods, produces comparable results at subatomic resolution and can easily be applied to macromolecules. A study of the accurate electron-density distribution in molecular crystals at subatomic resolution (better than ∼1.0 Å) requires more detailed models than those based on independent spherical atoms. A tool that is conventionally used in small-molecule crystallography is the multipolar model. Even at upper resolution limits of 0.8–1.0 Å, the number of experimental data is insufficient for full multipolar model refinement. As an alternative, a simpler model composed of conventional independent spherical atoms augmented by additional scatterers to model bonding effects has been proposed. Refinement of these mixed models for several benchmark data sets gave results that were comparable in quality with the results of multipolar refinement and superior to those for conventional models. Applications to several data sets of both small molecules and macromolecules are shown. These refinements were performed using the general-purpose macromolecular refinement module phenix.refine of the PHENIX package.

  11. Adaptive Inference on General Graphical Models

    OpenAIRE

    Acar, Umut A.; Ihler, Alexander T.; Mettu, Ramgopal; Sumer, Ozgur

    2012-01-01

    Many algorithms and applications involve repeatedly solving variations of the same inference problem; for example we may want to introduce new evidence to the model or perform updates to conditional dependencies. The goal of adaptive inference is to take advantage of what is preserved in the model and perform inference more rapidly than from scratch. In this paper, we describe techniques for adaptive inference on general graphs that support marginal computation and updates to the conditional ...

  12. Timely deposition of macromolecular structures is necessary for peer review

    International Nuclear Information System (INIS)

    Joosten, Robbie P.; Soueidan, Hayssam; Wessels, Lodewyk F. A.; Perrakis, Anastassis

    2013-01-01

    Deposition of crystallographic structures should be concurrent with or prior to manuscript submission for peer review, enabling validation and increasing reliability of the PDB. Most of the macromolecular structures in the Protein Data Bank (PDB), which are used daily by thousands of educators and scientists alike, are determined by X-ray crystallography. It was examined whether the crystallographic models and data were deposited to the PDB at the same time as the publications that describe them were submitted for peer review. This condition is necessary to ensure pre-publication validation and the quality of the PDB public archive. It was found that a significant proportion of PDB entries were submitted to the PDB after peer review of the corresponding publication started, and many were only submitted after peer review had ended. It is argued that clear description of journal policies and effective policing is important for pre-publication validation, which is key in ensuring the quality of the PDB and of peer-reviewed literature

  14. A beamline for macromolecular crystallography at the Advanced Light Source

    International Nuclear Information System (INIS)

    Padmore, H.A.; Earnest, T.; Kim, S.H.; Thompson, A.C.; Robinson, A.L.

    1994-08-01

    A beamline for macromolecular crystallography has been designed for the ALS. The source will be a 37-pole wiggler with a 2-T on-axis peak field. The wiggler will illuminate three beamlines, each accepting 3 mrad of horizontal aperture. The central beamline will primarily be used for multiple-wavelength anomalous dispersion measurements in the wavelength range from 4 to 0.9 angstrom. The beamline optics will comprise a double-crystal monochromator with a collimating pre-mirror and a double-focusing mirror after the monochromator. The two side stations will be used for fixed-wavelength experiments within the wavelength range from 1.5 to 0.95 angstrom. The optics will consist of a conventional vertically focusing cylindrical mirror followed by an asymmetrically cut curved-crystal monochromator. This paper presents details of the optimization of the wiggler source for crystallography, gives a description of the beamline configuration, and discusses the reasons for the choices made.

  15. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    Science.gov (United States)

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L.; Armour, Wes; Waterman, David G.; Iwata, So; Evans, Gwyndaf

    2013-01-01

    The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals, often requiring the merging of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein. PMID:23897484
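
To illustrate the multi-crystal grouping idea, the toy sketch below clusters data sets by similarity of their unit-cell parameters before merging. The greedy single-linkage grouping and 2% tolerance are assumptions chosen for illustration, not BLEND's actual algorithm:

```python
def cell_distance(a, b):
    """Largest relative difference between two unit-cell parameter tuples."""
    return max(abs(x - y) / y for x, y in zip(a, b))

def cluster(cells, tol=0.02):
    """Greedy grouping: a data set joins the first group containing a
    sufficiently similar cell, otherwise it starts a new group."""
    groups = []
    for c in cells:
        for g in groups:
            if any(cell_distance(c, m) <= tol for m in g):
                g.append(c)
                break
        else:
            groups.append([c])
    return groups
```

Merging only data sets within one group avoids degrading statistics by combining non-isomorphous crystals.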

  16. Comparison of two self-assembled macromolecular prodrug micelles with different conjugate positions of SN38 for enhancing antitumor activity

    Directory of Open Access Journals (Sweden)

    Liu Y

    2015-03-01

    Full Text Available Yi Liu,1 Hongyu Piao,1 Ying Gao,1 Caihong Xu,2 Ye Tian,1 Lihong Wang,1 Jinwen Liu,1 Bo Tang,1 Meijuan Zou,1 Gang Cheng1 1Department of Pharmaceutics, Shenyang Pharmaceutical University, Shenyang, Liaoning Province, People’s Republic of China; 2Department of Food Science, Shenyang Normal University, Shenyang, Liaoning Province, People’s Republic of China Abstract: 7-Ethyl-10-hydroxycamptothecin (SN38), an active metabolite of irinotecan (CPT-11), is a remarkably potent antitumor agent. The clinical application of SN38 has been extremely restricted by its insolubility in water. In this study, we successfully synthesized two macromolecular prodrugs of SN38 with different conjugate positions (chitosan-(C10-OH)SN38 and chitosan-(C20-OH)SN38) to improve the water solubility and antitumor activity of SN38. These prodrugs can self-assemble into micelles in aqueous medium. The particle size, morphology, zeta potential, and in vitro drug release of SN38 and its derivatives, as well as their cytotoxicity, pharmacokinetics, and in vivo antitumor activity in a xenograft BALB/c mouse model were studied. In vitro, chitosan-(C10-OH)SN38 (CS-(10s)SN38) and chitosan-(C20-OH)SN38 (CS-(20s)SN38) were 13.3- and 25.9-fold more potent than CPT-11 in the murine colon adenocarcinoma cell line CT26, respectively. The area under the curve (AUC0–24) of SN38 after intravenously administering CS-(10s)SN38 and CS-(20s)SN38 to Sprague Dawley rats was greatly improved when compared with CPT-11 (both P<0.01). A larger AUC0–24 of CS-(20s)SN38 was observed when compared to CS-(10s)SN38 (P<0.05). Both of the novel self-assembled chitosan–SN38 prodrugs demonstrated superior anticancer activity to CPT-11 in the CT26 xenograft BALB/c mouse model. We have also investigated the differences between these macromolecular prodrug micelles with regards to enhancing the antitumor activity of SN38. CS-(20s)SN38 exhibited better in vivo antitumor activity than CS-(10s)SN38 at a dose of 2.5 mg/kg (P<0

  17. An analysis of the development of high temperature cavitation damage

    International Nuclear Information System (INIS)

    Tinivella, R.

    1986-07-01

    The objective of this paper is the investigation of creep cavitation damage in copper. Radii distribution curves obtained from small-angle neutron scattering experiments conducted on crept specimens were analyzed and compared with calculated curves. The latter were derived from cavity nucleation and growth models. From the comparison the appropriateness of particular models can be inferred. Valuable information is obtained about the nucleation behaviour. In crept and fatigued specimens, already after very short loading times, cavities appear with remarkably different radii, an observation which contradicts the concept of a critical radius. The analysis of the nucleation behaviour emphasizes the influence of the stress dependence of the nucleation rate upon the stress dependence of damage and hence upon the stress dependence of the lifetime. In most damage theories the latter is attributed to the stress dependence of cavity growth. A strong argument is derived in this paper in favour of the idea that both mechanisms - growth and nucleation - contribute to the stress dependence of the lifetime. The damage development in Cu (as well as in alpha-Fe, AISI 304 and AISI 347) is compared with the prediction of the phenomenological A-model, which assumes that the damage rate is proportional to the damage itself. The experiments show that the damage increases in time more slowly (Cu, alpha-Fe, AISI 304) or faster (AISI 347) than predicted by the model. In copper the damage rate turns out to be constant, independent of time. Accordingly the A-model is modified and the respective consequences are briefly discussed. (orig./GSCH) [de
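
The phenomenological A-model referred to above assumes the damage rate is proportional to the damage itself, dA/dt = k·A, which integrates to exponential growth, while the modified behaviour reported for copper corresponds to a constant damage rate. A minimal numeric sketch (the parameters a0, k and r are illustrative placeholders, not fitted values):

```python
import math

def a_model(a0, k, t):
    """A-model: damage rate proportional to damage, so A(t) = A0 * exp(k*t)."""
    return a0 * math.exp(k * t)

def constant_rate(a0, r, t):
    """Modified model suggested for copper: constant damage rate, A(t) = A0 + r*t."""
    return a0 + r * t
```

At long times the exponential A-model grows much faster than the linear form, which is consistent with the observation that damage in Cu, alpha-Fe and AISI 304 accumulates more slowly than the A-model predicts.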

  18. The inference from a single case: moral versus scientific inferences in implementing new biotechnologies.

    Science.gov (United States)

    Hofmann, B

    2008-06-01

    Are there similarities between scientific and moral inference? This is the key question in this article. It takes as its point of departure an instance of one person's story in the media changing both Norwegian public opinion and a brand-new Norwegian law prohibiting the use of saviour siblings. The case appears to falsify existing norms and to establish new ones. The analysis of this case reveals similarities in the modes of inference in science and morals, inasmuch as (a) a single case functions as a counter-example to an existing rule; (b) there is a common presupposition of stability, similarity and order, which makes it possible to reason from a few cases to a general rule; and (c) this makes it possible to hold things together and retain order. In science, these modes of inference are referred to as falsification, induction and consistency. In morals, they have a variety of other names. Hence, even without abandoning the fact-value divide, there appear to be similarities between inference in science and inference in morals, which may encourage communication across the boundaries between "the two cultures" and which are relevant to medical humanities.

  19. Errors in macromolecular synthesis after stress. A study of the possible protective role of the small heat shock proteins

    NARCIS (Netherlands)

    Marin Vinader, L.

    2006-01-01

    The general goal of this thesis was to gain insight in what small heat shock proteins (sHsps) do with respect to macromolecular synthesis during a stressful situation in the cell. It is known that after a non-lethal heat shock, cells are better protected against a subsequent more severe heat shock,

  20. Radiation damage in room-temperature data acquisition with the PILATUS 6M pixel detector

    Energy Technology Data Exchange (ETDEWEB)

    Rajendran, Chitra, E-mail: chitra.rajendran@psi.ch; Dworkowski, Florian S. N.; Wang, Meitian; Schulze-Briese, Clemens [Swiss Light Source at Paul Scherrer Institute, CH-5232 Villigen (Switzerland)

    2011-05-01

    Observations of the dose-rate effect in continuous X-ray diffraction data acquisition at room temperature are presented. The first study of room-temperature macromolecular crystallography data acquisition with a silicon pixel detector is presented, where the data are collected in continuous sample rotation mode, with millisecond read-out time and no read-out noise. Several successive datasets were collected sequentially from single test crystals of thaumatin and insulin. The dose rate ranged between ∼1320 Gy s⁻¹ and ∼8420 Gy s⁻¹ with corresponding frame rates between 1.565 Hz and 12.5 Hz. The data were analysed for global radiation damage. A previously unreported negative dose-rate effect is observed in the indicators of global radiation damage, which showed an approximately 75% decrease in D1/2 at sixfold higher dose rate. The integrated intensity decreases in an exponential manner. Sample heating that could give rise to the enhanced radiation sensitivity at higher dose rate is investigated by collecting data between crystal temperatures of 298 K and 353 K. UV-Vis spectroscopy is used to demonstrate that disulfide radicals and trapped electrons do not accumulate at high dose rates in continuous data collection.
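
The exponential decay of integrated intensity with dose can be written as I(D) = I₀ · 2^(−D/D1/2), where D1/2 is the dose at which intensity halves. The sketch below encodes this relation; the D1/2 values are placeholders chosen only to illustrate the reported ~75% reduction at sixfold higher dose rate, not the measured numbers:

```python
def intensity(dose, d_half):
    """Relative integrated intensity I/I0 = 2**(-dose/d_half)."""
    return 2.0 ** (-dose / d_half)

# Placeholder D1/2 values illustrating the negative dose-rate effect:
# ~75% lower D1/2 at sixfold higher dose rate (assumed, not measured).
d_half_low_rate = 40.0
d_half_high_rate = 0.25 * d_half_low_rate
```

For the same accumulated dose, the high-dose-rate crystal in this sketch retains far less diffracting power, which is the signature of a negative dose-rate effect.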

  1. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  2. The target theory applied to the analysis of irradiation damages in organic detectors

    International Nuclear Information System (INIS)

    Mesquita, Carlos Henrique de

    2005-01-01

    The Target Theory was used to explain the radiation damage in samples containing 1% (g/L) of 2,5-diphenyloxazole (PPO) diluted in toluene and irradiated with ⁶⁰Co (1.8 Gy/s). The fraction of surviving molecules of irradiated PPO obeys the bi-exponential mathematical model [74.3 × exp(-D/104.3) + 25.7 × exp(-D/800.0)]. It indicates that 74.3% of the molecules decay with D37 = 104.3 kGy and 25.7% decay with D37 = 800 kGy. From the Target Theory the energies involved in the irradiation damage were inferred to be 0.239 ± 0.031 eV (G = 418.4 ± 54.1 damages/100 eV) and 1.83 ± 0.30 eV (G = 54.5 ± 8.9 damages/100 eV). The diameter of the PPO molecule estimated from the Target Theory is in the interval of 45.5 to 64.9 angstrom. (author)
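
The bi-exponential survival model quoted above can be evaluated directly; the sketch uses the record's own coefficients (74.3% and 25.7%) and D37 values (104.3 and 800 kGy):

```python
import math

def surviving_fraction(dose_kgy):
    """Fraction of intact PPO molecules after dose D (in kGy), per the
    reported bi-exponential model: 74.3% of molecules with D37 = 104.3 kGy
    and 25.7% with D37 = 800 kGy."""
    return 0.743 * math.exp(-dose_kgy / 104.3) + 0.257 * math.exp(-dose_kgy / 800.0)
```

At zero dose the fraction is 1 (0.743 + 0.257), and each D37 is the dose that reduces its sub-population to 1/e ≈ 37% of its initial amount.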

  3. Probabilistic inference of fatigue damage propagation with limited and partial information

    Directory of Open Access Journals (Sweden)

    Huang Min

    2015-08-01

    Full Text Available A general method of probabilistic fatigue damage prognosis using limited and partial information is developed. Limited and partial information refers to measurable data that are not enough or cannot directly be used to statistically identify model parameters using traditional regression analysis. In the proposed method, the prior probability distribution of model parameters is derived based on the principle of maximum entropy (MaxEnt), using the limited and partial information as constraints. The posterior distribution is formulated using the principle of maximum relative entropy (MRE) to perform probability updating when new information is available, which reduces uncertainty in prognosis results. It is shown that the posterior distribution is equivalent to a Bayesian posterior when the new information used for updating consists of point measurements. A numerical quadrature interpolating method is used to calculate the asymptotic approximation of the prior distribution. Once the prior is obtained, subsequent measurement data are used to perform updating using Markov chain Monte Carlo (MCMC) simulations. Fatigue crack prognosis problems with experimental data are presented for demonstration and validation.
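
The updating step can be illustrated with a minimal Metropolis sampler: a Gaussian prior (the MaxEnt distribution for a known mean and variance) on a growth-rate parameter is updated with point measurements, for which the MRE posterior reduces to a Bayesian posterior. The linear growth model, prior hyperparameters and data below are assumptions for demonstration, not the paper's crack-growth model:

```python
import math
import random

def log_post(m, data, prior_mu=3.0, prior_sd=1.0, noise_sd=0.1):
    """Log-posterior of growth-rate m: Gaussian (MaxEnt) prior plus a Gaussian
    likelihood for a simple linear damage-growth model y = m*t."""
    lp = -0.5 * ((m - prior_mu) / prior_sd) ** 2
    for t, y in data:
        lp += -0.5 * ((y - m * t) / noise_sd) ** 2
    return lp

def metropolis(data, n=5000, step=0.1, seed=1):
    """Random-walk Metropolis sampling of the posterior over m."""
    random.seed(seed)
    m = 3.0                      # start at the prior mean
    lp = log_post(m, data)
    samples = []
    for _ in range(n):
        cand = m + random.gauss(0.0, step)
        lp_cand = log_post(cand, data)
        # accept with probability min(1, exp(lp_cand - lp))
        if lp_cand >= lp or random.random() < math.exp(lp_cand - lp):
            m, lp = cand, lp_cand
        samples.append(m)
    return samples

# hypothetical point measurements (time, observed damage)
data = [(1.0, 2.5), (2.0, 5.1), (3.0, 7.4)]
post_mean = sum(metropolis(data)[1000:]) / 4000.0
```

With these data the likelihood dominates the broad prior, so the posterior mean lands near the least-squares slope (about 2.5) rather than the prior mean of 3, showing how new measurements reduce prognosis uncertainty.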

  4. Active inference, communication and hermeneutics.

    Science.gov (United States)

    Friston, Karl J; Frith, Christopher D

    2015-07-01

    Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  5. Site-selective electroless nickel plating on patterned thin films of macromolecular metal complexes.

    Science.gov (United States)

    Kimura, Mutsumi; Yamagiwa, Hiroki; Asakawa, Daisuke; Noguchi, Makoto; Kurashina, Tadashi; Fukawa, Tadashi; Shirai, Hirofusa

    2010-12-01

    We demonstrate a simple route to depositing patterned nickel layers using photocross-linked polymer thin films containing palladium catalysts, which can be used as adhesive interlayers for the fabrication of nickel patterns on glass and plastic substrates. Electroless nickel patterns can be obtained in three steps: (i) pattern formation in partially quaternized poly(vinyl pyridine) by UV irradiation, (ii) formation of a macromolecular metal complex with palladium, and (iii) nickel metallization using an electroless plating bath. Metallization is site-selective and allows for high resolution, and the resulting nickel layered structure shows good adhesion to glass and plastic substrates. The direct patterning of metallic layers onto insulating substrates indicates a great potential for fabricating micro/nano devices.

  6. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  7. Optimization methods for logical inference

    CERN Document Server

    Chandru, Vijay

    2011-01-01

    Merging logic and mathematics in deductive inference: an innovative, cutting-edge approach. Optimization methods for logical inference? Absolutely, say Vijay Chandru and John Hooker, two major contributors to this rapidly expanding field. And even though "solving logical inference problems with optimization methods may seem a bit like eating sauerkraut with chopsticks... it is the mathematical structure of a problem that determines whether an optimization model can help solve it, not the context in which the problem occurs." Presenting powerful, proven optimization techniques for logic in
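
    The encoding at the heart of this area can be sketched in a few lines: a propositional clause such as (x1 ∨ ¬x2) becomes the 0-1 inequality x1 + (1 − x2) ≥ 1, so logical inference reduces to the feasibility of an integer program. The sketch below illustrates that standard encoding (it is not code from the book, and it checks feasibility by brute force rather than with an optimization solver):

    ```python
    from itertools import product

    # Each clause is a list of signed literals: +i means x_i, -i means NOT x_i.
    # Clause (x1 OR NOT x2) maps to the inequality x1 + (1 - x2) >= 1
    # over 0-1 variables.
    def clause_lhs(clause, assignment):
        return sum(assignment[abs(l)] if l > 0 else 1 - assignment[abs(l)]
                   for l in clause)

    def satisfiable(clauses, n_vars):
        """Feasibility of the 0-1 system: does some x make every clause LHS >= 1?"""
        for bits in product((0, 1), repeat=n_vars):
            assignment = {i + 1: b for i, b in enumerate(bits)}
            if all(clause_lhs(c, assignment) >= 1 for c in clauses):
                return True
        return False

    # (x1 or x2) and (not x1 or x2) and (not x2) -> infeasible (unsatisfiable)
    print(satisfiable([[1, 2], [-1, 2], [-2]], 2))   # False
    # dropping the last clause makes the system feasible
    print(satisfiable([[1, 2], [-1, 2]], 2))         # True
    ```

    An ILP solver replaces the brute-force loop in practice; the point is only that the constraint system and the logical problem are the same object.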

  8. Inference in `poor` languages

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules (`poor` languages) are considered. The problem of the existence of a finite, complete and consistent inference rule system for a `poor` language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  9. Macromolecular Crystal Growth by Means of Microfluidics

    Science.gov (United States)

    vanderWoerd, Mark; Ferree, Darren; Spearing, Scott; Monaco, Lisa; Molho, Josh; Spaid, Michael; Brasseur, Mike; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    We have performed a feasibility study in which we show that chip-based, microfluidic (LabChip(TM)) technology is suitable for protein crystal growth. This technology allows for accurate and reliable dispensing and mixing of very small volumes while minimizing bubble formation in the crystallization mixture. The amount of (protein) solution remaining after completion of an experiment is minimal, which makes this technique efficient and attractive for use with proteins that are difficult or expensive to obtain. The nature of LabChip(TM) technology renders it highly amenable to automation. Protein crystals obtained in our initial feasibility studies were of excellent quality as determined by X-ray diffraction. Subsequent to the feasibility study, we designed and produced the first LabChip(TM) device specifically for protein crystallization in batch mode. It can reliably dispense and mix a range of solution constituents into two independent growth wells. We are currently testing this design to prove its efficacy for protein crystallization optimization experiments. In the near future we will expand our design to incorporate up to 10 growth wells per LabChip(TM) device. Upon completion, additional crystallization techniques such as vapor diffusion and liquid-liquid diffusion will be accommodated. Macromolecular crystallization using microfluidic technology is envisioned as a fully automated system, which will use the 'tele-science' concept of remote operation and will be developed into a research facility for the International Space Station as well as on the ground.

  10. Atomic force microscopy imaging of macromolecular complexes.

    Science.gov (United States)

    Santos, Sergio; Billingsley, Daniel; Thomson, Neil

    2013-01-01

    This chapter reviews amplitude modulation (AM) AFM in air and its applications to high-resolution imaging and interpretation of macromolecular complexes. We discuss single DNA molecular imaging and DNA-protein interactions, such as those with topoisomerases and RNA polymerase. We show how relative humidity can have a major influence on resolution and contrast and how it can also affect conformational switching of supercoiled DNA. Four regimes of AFM tip-sample interaction in air are defined and described; they relate to water perturbation and/or intermittent mechanical contact of the tip with either the molecular sample or the surface. Precise control and understanding of the AFM operational parameters are shown to allow the user to switch between these different regimes: an interpretation of the origins of topographical contrast is given for each regime. Perpetual water contact is shown to lead to a high-resolution mode of operation, which we term SASS (small amplitude small set-point) imaging, and which maximizes resolution while greatly decreasing tip and sample wear and any noise due to perturbation of the surface water. Thus, this chapter provides sufficient information to reliably control the AFM in the AM AFM mode of operation in order to image both heterogeneous samples and single macromolecules including complexes, with high resolution and with reproducibility. A brief introduction to AFM, its versatility and applications to biology is also given while providing references to key work and general reviews in the field.

  11. Mix and Inject: Reaction Initiation by Diffusion for Time-Resolved Macromolecular Crystallography

    Directory of Open Access Journals (Sweden)

    Marius Schmidt

    2013-01-01

    Full Text Available Time-resolved macromolecular crystallography unifies structure determination with chemical kinetics, since the structures of transient states and chemical and kinetic mechanisms can be determined simultaneously from the same data. To start a reaction in an enzyme, typically, an initially inactive substrate present in the crystal is activated. This has particular disadvantages that are circumvented when active substrate is directly provided by diffusion. However, then it is prohibitive to use macroscopic crystals because diffusion times become too long. With small micro- and nanocrystals diffusion times are adequately short for most enzymes and the reaction can be swiftly initiated. We demonstrate here that a time-resolved crystallographic experiment becomes feasible by mixing substrate with enzyme nanocrystals which are subsequently injected into the X-ray beam of a pulsed X-ray source.
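
    The practical constraint described above follows from the diffusive time scale t ≈ L²/D, which grows quadratically with crystal size. A rough order-of-magnitude sketch (the effective diffusion coefficient D below is an assumed illustrative value for a small substrate in crystal solvent channels, not a figure from the article):

    ```python
    # Characteristic diffusion time t ~ L^2 / D (order-of-magnitude estimate).
    # D is an assumed effective diffusion coefficient inside solvent channels;
    # real values vary widely with substrate and crystal.

    def diffusion_time(L_m, D=5e-10):
        """Return the characteristic time (s) to diffuse a distance L (m)."""
        return L_m ** 2 / D

    macro = diffusion_time(100e-6)   # 100 um macroscopic crystal edge
    nano = diffusion_time(200e-9)    # 200 nm nanocrystal edge

    print(f"100 um crystal: ~{macro:.0f} s")
    print(f"200 nm crystal: ~{nano * 1e6:.0f} us")
    ```

    With these assumed numbers the macroscopic crystal equilibrates on the scale of tens of seconds, while the nanocrystal does so in tens of microseconds, which is why mixing-based reaction initiation becomes feasible only for micro- and nanocrystals.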

  12. Structure analysis of molecular systems in the Institute of Macromolecular Chemistry of the Czech Academy of Sciences

    Czech Academy of Sciences Publication Activity Database

    Hašek, Jindřich

    2010-01-01

    Roč. 17, 2a (2010), k32-k34 ISSN 1211-5894. [Struktura 2010. Soláň, 14.06.2010-17.06.2010] R&D Projects: GA AV ČR IAA500500701; GA ČR GA305/07/1073 Institutional research plan: CEZ:AV0Z40500505 Keywords: Academy of Sciences of the Czech Republic * X-ray structure analysis * crystallography Subject RIV: CD - Macromolecular Chemistry http://xray.cz/ms/bul2010-2a/hasek.pdf

  13. EI: A Program for Ecological Inference

    Directory of Open Access Journals (Sweden)

    Gary King

    2004-09-01

    Full Text Available The program EI provides a method of inferring individual behavior from aggregate data. It implements the statistical procedures, diagnostics, and graphics from the book A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data (King 1997). Ecological inference, as traditionally defined, is the process of using aggregate (i.e., "ecological") data to infer discrete individual-level relationships of interest when individual-level data are not available. Ecological inferences are required in political science research when individual-level surveys are unavailable (e.g., local or comparative electoral politics), unreliable (racial politics), insufficient (political geography), or infeasible (political history). They are also required in numerous areas of major significance in public policy (e.g., for applying the Voting Rights Act) and in other academic disciplines ranging from epidemiology and marketing to sociology and quantitative history.
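
    The deterministic starting point of ecological inference, which King's statistical method refines, is the method of bounds: the accounting identity T = x·β_b + (1 − x)·β_w confines the unknown group-level rate β_b to an interval computable from aggregate data alone. A minimal sketch of those bounds (illustrative only; this is not the EI program's algorithm):

    ```python
    def bounds_on_group_rate(T, x):
        """Deterministic (method-of-bounds) interval for beta_b, the unknown
        rate in group b, given the accounting identity
            T = x * beta_b + (1 - x) * beta_w,
        where T is the observed aggregate rate and x the group's share."""
        lo = max(0.0, (T - (1.0 - x)) / x)
        hi = min(1.0, T / x)
        return lo, hi

    # Precinct where group b is 60% of the population and the aggregate
    # rate is 0.5: beta_b is pinned to a strict sub-interval of [0, 1].
    print(bounds_on_group_rate(0.5, 0.6))   # informative interval

    # With a 30% share the same aggregate rate tells us nothing: [0, 1].
    print(bounds_on_group_rate(0.5, 0.3))
    ```

    EI's contribution is to combine such precinct-level bounds with a statistical model to produce point estimates and uncertainty within them.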

  14. Stretchable All-Gel-State Fiber-Shaped Supercapacitors Enabled by Macromolecularly Interconnected 3D Graphene/Nanostructured Conductive Polymer Hydrogels.

    Science.gov (United States)

    Li, Panpan; Jin, Zhaoyu; Peng, Lele; Zhao, Fei; Xiao, Dan; Jin, Yong; Yu, Guihua

    2018-05-01

    Nanostructured conductive polymer hydrogels (CPHs) have been extensively applied in energy storage owing to their advantageous features, such as excellent electrochemical activity and relatively high electrical conductivity, yet the fabrication of self-standing and flexible electrode-based CPHs is still hampered by their limited mechanical properties. Herein, macromolecularly interconnected 3D graphene/nanostructured CPH is synthesized via self-assembly of CPHs and graphene oxide macrostructures. The 3D hybrid hydrogel shows uniform interconnectivity and enhanced mechanical properties due to the strong macromolecular interaction between the CPHs and graphene, thus greatly reducing aggregation in the fiber-shaping process. A proof-of-concept all-gel-state fibrous supercapacitor based on the 3D polyaniline/graphene hydrogel is fabricated to demonstrate the outstanding flexibility and mouldability, as well as superior electrochemical properties enabled by this 3D hybrid hydrogel design. The proposed device can achieve a large strain (up to ≈40%), and deliver a remarkable volumetric energy density of 8.80 mWh cm⁻³ (at a power density of 30.77 mW cm⁻³), outperforming many fiber-shaped supercapacitors reported previously. The all-hydrogel design opens up opportunities in the fabrication of next-generation wearable and portable electronics. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Implementation of fast macromolecular proton fraction mapping on 1.5 and 3 Tesla clinical MRI scanners: preliminary experience

    Science.gov (United States)

    Yarnykh, V.; Korostyshevskaya, A.

    2017-08-01

    Macromolecular proton fraction (MPF) is a biophysical parameter describing the amount of macromolecular protons involved in magnetization exchange with water protons in tissues. MPF represents a significant interest as a magnetic resonance imaging (MRI) biomarker of myelin for clinical applications. A recent fast MPF mapping method enabled clinical translation of MPF measurements due to time-efficient acquisition based on the single-point constrained fit algorithm. However, previous MPF mapping applications utilized only 3 Tesla MRI scanners and modified pulse sequences, which are not commonly available. This study aimed to test the feasibility of MPF mapping implementation on a 1.5 Tesla clinical scanner using standard manufacturer’s sequences and to compare the performance of this method between 1.5 and 3 Tesla scanners. MPF mapping was implemented on 1.5 and 3 Tesla MRI units of one manufacturer with either optimized custom-written or standard product pulse sequences. Whole-brain three-dimensional MPF maps obtained from a single volunteer were compared between field strengths and implementation options. MPF maps demonstrated similar quality at both field strengths. MPF values in segmented brain tissues and specific anatomic regions appeared in close agreement. This experiment demonstrates the feasibility of fast MPF mapping using standard sequences on 1.5 T and 3 T clinical scanners.

  16. Macromolecular refinement by model morphing using non-atomic parameterizations.

    Science.gov (United States)

    Cowtan, Kevin; Agirre, Jon

    2018-02-01

    Refinement is a critical step in the determination of a model which explains the crystallographic observations and thus best accounts for the missing phase components. The scattering density is usually described in terms of atomic parameters; however, in macromolecular crystallography the resolution of the data is generally insufficient to determine the values of these parameters for individual atoms. Stereochemical and geometric restraints are used to provide additional information, but produce interrelationships between parameters which slow convergence, resulting in longer refinement times. An alternative approach is proposed in which parameters are not attached to atoms, but to regions of the electron-density map. These parameters can move the density or change the local temperature factor to better explain the structure factors. Varying the size of the region which determines the parameters at a particular position in the map allows the method to be applied at different resolutions without the use of restraints. Potential applications include initial refinement of molecular-replacement models with domain motions, and potentially the use of electron density from other sources such as electron cryo-microscopy (cryo-EM) as the refinement model.

  17. Structural changes in the ordering processes of macromolecular compounds

    International Nuclear Information System (INIS)

    Kobayashi, M.; Tashiro, K.

    1998-01-01

    In order to clarify the microscopically viewed relationship between the conformational ordering process and the aggregation process of macromolecular chains in the phase transitions from melt to solid or from solution to gel, time-resolved Fourier-transform infrared spectra and small-angle X-ray or neutron scattering data have been analyzed in an organized manner. Two concrete examples are presented. (1) In the gelation of the syndiotactic polystyrene-organic solvent system, the ordered TTGG conformation is formed and develops with time. This conformational ordering is accelerated by the aggregation of the chain segments, resulting in the formation of a macroscopic gel network. (2) In the isothermal crystallization of polyethylene from the melt, the following ordering mechanism was revealed. Conformationally disordered short trans conformers appear first in the random coils of the melt. These disordered trans sequences grow into longer and more regular trans sequences of the orthorhombic-type crystal, and then isolated lamellae are formed. Afterwards, the stacked lamellar structure develops without change in lamellar thickness but with a small decrease in the long period, indicating the insertion of new lamellae between the already formed lamellar layers.

  18. On the criticality of inferred models

    Science.gov (United States)

    Mastromatteo, Iacopo; Marsili, Matteo

    2011-10-01

    Advanced inference techniques allow one to reconstruct a pattern of interaction from high dimensional data sets, from probing simultaneously thousands of units of extended systems—such as cells, neural tissues and financial markets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to singular values of parameters, akin to critical points in physics where phase transitions occur. These are points where the response of physical systems to external perturbations, as measured by the susceptibility, is very large and diverges in the limit of infinite size. We show that the reparameterization invariant metrics in the space of probability distributions of these models (the Fisher information) are directly related to the susceptibility of the inferred model. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. This region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time scales naturally yield models which are close to criticality.
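
    The identity at the core of the argument, that the Fisher information of an exponential-family model equals a susceptibility, can be checked on the simplest possible case: a single spin s ∈ {−1, +1} with p(s|h) ∝ exp(hs), where the mean is ⟨s⟩ = tanh(h) and the susceptibility is χ = d⟨s⟩/dh = 1 − tanh²(h) = Var(s) = I(h). A toy numerical verification (not code from the paper):

    ```python
    import math

    # Single spin s in {-1, +1} with p(s|h) proportional to exp(h*s):
    # a one-parameter exponential family. Its Fisher information
    # I(h) = Var(s) = 1 - tanh(h)^2 coincides with the susceptibility
    # chi = d<s>/dh, illustrating the identity the abstract invokes.

    def mean_spin(h):
        return math.tanh(h)

    def fisher_info(h):
        return 1.0 - math.tanh(h) ** 2

    h, eps = 0.5, 1e-6
    chi_numeric = (mean_spin(h + eps) - mean_spin(h - eps)) / (2 * eps)
    print(fisher_info(h), chi_numeric)   # the two values agree
    ```

    In the many-unit models the paper considers, the same identity holds component-wise for the Fisher information matrix, which is why inferred parameters accumulate where the susceptibility (and hence the metric density) is largest.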

  19. On the criticality of inferred models

    International Nuclear Information System (INIS)

    Mastromatteo, Iacopo; Marsili, Matteo

    2011-01-01

    Advanced inference techniques allow one to reconstruct a pattern of interaction from high dimensional data sets, from probing simultaneously thousands of units of extended systems—such as cells, neural tissues and financial markets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to singular values of parameters, akin to critical points in physics where phase transitions occur. These are points where the response of physical systems to external perturbations, as measured by the susceptibility, is very large and diverges in the limit of infinite size. We show that the reparameterization invariant metrics in the space of probability distributions of these models (the Fisher information) are directly related to the susceptibility of the inferred model. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. This region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time scales naturally yield models which are close to criticality.

  20. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framewor...

  1. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    2010-01-01

    Chapter 9: This contribution concerns statistical inference for parametric models used in stochastic geometry, based on quick and simple simulation-free procedures as well as more comprehensive methods based on a maximum likelihood or Bayesian approach combined with Markov chain Monte Carlo (MCMC) techniques. Due to space limitations the focus is on spatial point processes.

  2. Signs of long-term adaptation to permanent brain damage as revealed by prehension studies of children with spastic hemiparesis

    NARCIS (Netherlands)

    Steenbergen, B.; Meulenbroek, R.G.J.; Latash, M.L.; Levin, M.

    2003-01-01

    This chapter focuses on signs of long-term adaptation to permanent brain damage in children with spastic hemiparesis. First, we recognize that adaptation processes may occur at various time scales. Then, we formulate a tentative strategy to infer signs of adaptation from behavioral data.

  3. Quantification of change in vocal fold tissue stiffness relative to depth of artificial damage.

    Science.gov (United States)

    Rohlfs, Anna-Katharina; Schmolke, Sebastian; Clauditz, Till; Hess, Markus; Müller, Frank; Püschel, Klaus; Roemer, Frank W; Schumacher, Udo; Goodyer, Eric

    2017-10-01

    To quantify changes in the biomechanical properties of human excised vocal folds with defined artificial damage. The linear skin rheometer (LSR) was used to obtain a series of rheological measurements of shear modulus from the surface of 30 human cadaver vocal folds. The tissue samples were initially measured in a native condition and then following varying intensities of thermal damage. Histological examination of each vocal fold was used to determine the depth of artificial alteration. The measured changes in stiffness were correlated with the depth of cell damage. For vocal folds in a pre-damage state, the shear modulus values ranged from 537 Pa to 1,651 Pa (female) and from 583 Pa to 1,193 Pa (male). With increasing depth of damage from the intermediate layer of the lamina propria (LP), tissue stiffness increased consistently compared with native values following application of thermal damage to the vocal folds. The measurements showed an increase in tissue stiffness when the depth of tissue damage extended from the intermediate LP layer downwards. Changes in the elastic characteristics of human vocal fold tissue following damage at defined depths were demonstrated in an in vitro experiment. In future, reproducible in vivo measurements of elastic vocal fold tissue alterations may enable phonosurgeons to infer the extent of subepithelial damage from changes in surface elasticity.

  4. Feature Inference Learning and Eyetracking

    Science.gov (United States)

    Rehder, Bob; Colner, Robert M.; Hoffman, Aaron B.

    2009-01-01

    Besides traditional supervised classification learning, people can learn categories by inferring the missing features of category members. It has been proposed that feature inference learning promotes learning a category's internal structure (e.g., its typical features and interfeature correlations) whereas classification promotes the learning of…

  5. Chromosomal damages and mutagenesis in mammalian and human cells induced by ionizing radiations with different LET

    International Nuclear Information System (INIS)

    Govorun, R.D.

    1997-01-01

    On the basis of the literature and our own data, the inference was made that structural chromosomal (and gene) damage plays an essential role in spontaneous and radiation-induced mutagenesis of mammalian and human cells at the HPRT locus. Evidence is presented that the role of such damage in mutagenesis increases after exposure to ionizing radiations with high LET. The consequences of HPRT-gene damage have been examined hypothetically. The heterogeneity of mutant subclones in their cytogenetic properties was revealed experimentally. The data reflect a phenomenon of reproductive chromosomal instability over many generations of mutant cells. Mutagenesis of mammalian cells is also accompanied, with high probability, by impairment of chromosome integrity as a stage of genome reorganization in response to changed vital conditions.

  6. Forward and backward inference in spatial cognition.

    Directory of Open Access Journals (Sweden)

    Will D Penny

    Full Text Available This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
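
    The "forward inference over time" used for localization above is, in its simplest form, Bayesian filtering: predict the next location with a motion (path-integration) model, then reweight by the sensory likelihood. A toy sketch on a hypothetical four-location track (all numbers are invented for illustration; this is not the paper's model):

    ```python
    import numpy as np

    # Toy forward (filtering) inference on a 1-D track of 4 locations.
    # T encodes the motion model P(next location | current location).
    T = np.array([[0.8, 0.2, 0.0, 0.0],
                  [0.1, 0.7, 0.2, 0.0],
                  [0.0, 0.1, 0.7, 0.2],
                  [0.0, 0.0, 0.2, 0.8]])

    def forward_step(belief, likelihood):
        """One step of forward inference: predict via the motion model,
        then weight by the sensory likelihood and renormalize."""
        predicted = belief @ T
        posterior = predicted * likelihood
        return posterior / posterior.sum()

    belief = np.full(4, 0.25)                      # start fully uncertain
    for obs_lik in ([0.1, 0.6, 0.2, 0.1],          # sensory evidence, step 1
                    [0.1, 0.2, 0.6, 0.1]):         # sensory evidence, step 2
        belief = forward_step(belief, np.array(obs_lik))

    print(belief.argmax())   # most probable location after two observations
    ```

    Backward inference, in this picture, runs the same machinery in reverse over a candidate state trajectory, which is how route refinement and control estimation enter the framework.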

  7. C1 Polymerization: a unique tool towards polyethylene-based complex macromolecular architectures

    KAUST Repository

    Wang, De

    2017-05-09

    The recent developments in organoborane-initiated C1 polymerization (the chain grows by one atom at a time) of ylides open unique horizons towards well-defined/perfectly linear polymethylenes (equivalent to polyethylenes, PE) and PE-based complex macromolecular architectures. The general mechanism of C1 polymerization (polyhomologation) involves the formation of a Lewis complex between a methylide (monomer) and a borane (initiator), followed by migration/insertion of a methylene into the initiator and, after oxidation/hydrolysis, affording OH-terminated polyethylenes. This review summarizes efforts towards conventional and newly discovered borane initiators and ylides (monomers), as well as the combination of polyhomologation with other polymerization methods. Initial efforts dealing with C3 polymerization and the synthesis of the first C1/C3 copolymers are also given. Finally, some thoughts for the future of these polymerizations are presented.

  8. C1 Polymerization: a unique tool towards polyethylene-based complex macromolecular architectures

    KAUST Repository

    Wang, De; Zhang, Zhen; Hadjichristidis, Nikolaos

    2017-01-01

    The recent developments in organoborane-initiated C1 polymerization (the chain grows by one atom at a time) of ylides open unique horizons towards well-defined/perfectly linear polymethylenes (equivalent to polyethylenes, PE) and PE-based complex macromolecular architectures. The general mechanism of C1 polymerization (polyhomologation) involves the formation of a Lewis complex between a methylide (monomer) and a borane (initiator), followed by migration/insertion of a methylene into the initiator and, after oxidation/hydrolysis, affording OH-terminated polyethylenes. This review summarizes efforts towards conventional and newly discovered borane initiators and ylides (monomers), as well as the combination of polyhomologation with other polymerization methods. Initial efforts dealing with C3 polymerization and the synthesis of the first C1/C3 copolymers are also given. Finally, some thoughts for the future of these polymerizations are presented.

  9. Proteome-wide dataset supporting the study of ancient metazoan macromolecular complexes

    Directory of Open Access Journals (Sweden)

    Sadhna Phanse

    2016-03-01

    Full Text Available Our analysis examines the conservation of multiprotein complexes among metazoa through use of high-resolution biochemical fractionation and precision mass spectrometry applied to soluble cell extracts from 5 representative model organisms: Caenorhabditis elegans, Drosophila melanogaster, Mus musculus, Strongylocentrotus purpuratus, and Homo sapiens. The interaction network obtained from the data was validated globally in 4 distant species (Xenopus laevis, Nematostella vectensis, Dictyostelium discoideum, Saccharomyces cerevisiae) and locally by targeted affinity-purification experiments. Here we provide details of our massive set of supporting biochemical fractionation data available via ProteomeXchange (http://www.ebi.ac.uk/pride/archive/projects/PXD002319-http://www.ebi.ac.uk/pride/archive/projects/PXD002328), PPIs via BioGRID (185267), and interaction network projections via http://metazoa.med.utoronto.ca, made fully accessible to allow further exploration. The datasets here are related to the research article on metazoan macromolecular complexes in Nature [1]. Keywords: Proteomics, Metazoa, Protein complexes, Biochemical fractionation

  10. Macromolecular crystallography with a large format CMOS detector

    Energy Technology Data Exchange (ETDEWEB)

    Nix, Jay C., E-mail: jcnix@lbl.gov [Molecular Biology Consortium 12003 S. Pulaski Rd. #166 Alsip, IL 60803 U.S.A (United States)

    2016-07-27

    Recent advances in CMOS technology have allowed the production of large surface area detectors suitable for macromolecular crystallography experiments [1]. The Molecular Biology Consortium (MBC) Beamline 4.2.2 at the Advanced Light Source in Berkeley, CA, has installed a 2952 x 2820 pixel RDI CMOS-8M detector with funds from NIH grant S10OD012073. The detector has a 20 ns dead pixel time and performs well with shutterless data-collection strategies. The sensor achieves a sharp point response and minimal optical distortion through use of a thin fiber-optic plate between the phosphor and the sensor module. Shutterless data collections produce high-quality redundant datasets that can be obtained in minutes. The fine-sliced data are suitable for processing in standard crystallographic software packages (XDS, HKL2000, D*TREK, MOSFLM). Faster collection times relative to the previous CCD detector have resulted in a record number of datasets collected in a calendar year, and de novo phasing experiments have resulted in publications in both Science and Nature [2,3]. The faster collections are due to a combination of the decreased overhead of shutterless collection and exposure times that have decreased by over a factor of 2 for images with signal to noise comparable to that of the previous NOIR-1 detector. The overall increased productivity has allowed the development of new beamline capabilities and data-collection strategies.

  11. A formal model of interpersonal inference

    Directory of Open Access Journals (Sweden)

    Michael eMoutoussis

    2014-03-01

    Full Text Available Introduction: We propose that active Bayesian inference – a general framework for decision-making – can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regard to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: 1. Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to 'mentalising' in the psychological literature, is based upon the outcomes of interpersonal exchanges. 2. We show how some well-known social-psychological phenomena (e.g. self-serving biases) can be explained in terms of active interpersonal inference. 3. Mentalising naturally entails Bayesian updating of how people value social outcomes. Crucially, this includes inference about one’s own qualities and preferences. Conclusion: We inaugurate a Bayes optimal framework for modelling intersubject variability in mentalising during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalising is distorted.
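
    One minimal way to make "actively inferred interpersonal beliefs" concrete, offered here purely as an illustration and not as the authors' formulation, is a conjugate Beta-Bernoulli update of the belief that a partner will cooperate, refined after each observed exchange:

    ```python
    # Minimal sketch (NOT the paper's model): a Beta-Bernoulli update of
    # the belief that a partner will cooperate in an exchange.

    def update_belief(alpha, beta, cooperated):
        """Bayesian update of a Beta(alpha, beta) belief after one exchange."""
        return (alpha + 1, beta) if cooperated else (alpha, beta + 1)

    alpha, beta = 1.0, 1.0            # uniform prior over cooperativeness
    for outcome in (True, True, False, True):
        alpha, beta = update_belief(alpha, beta, outcome)

    print(alpha / (alpha + beta))     # posterior mean: 4/6, about 0.667
    ```

    The paper's model is far richer (utilities absorbed into priors, beliefs about self as well as others), but the mechanics of belief revision from exchange outcomes follow this same Bayesian template.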

  12. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  13. Continuous Integrated Invariant Inference, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project will develop a new technique for invariant inference and embed this and other current invariant inference and checking techniques in an...

  14. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10¹³-10¹⁴ neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  15. A 3D Image Filter for Parameter-Free Segmentation of Macromolecular Structures from Electron Tomograms

    Science.gov (United States)

    Ali, Rubbiya A.; Landsberg, Michael J.; Knauth, Emily; Morgan, Garry P.; Marsh, Brad J.; Hankamer, Ben

    2012-01-01

    3D image reconstruction of large cellular volumes by electron tomography (ET) at high (≤5 nm) resolution can now routinely resolve organellar and compartmental membrane structures, protein coats, cytoskeletal filaments, and macromolecules. However, current image analysis methods for identifying in situ macromolecular structures within the crowded 3D ultrastructural landscape of a cell remain labor-intensive, time-consuming, and prone to user-bias and/or error. This paper demonstrates the development and application of a parameter-free, 3D implementation of the bilateral edge-detection (BLE) algorithm for the rapid and accurate segmentation of cellular tomograms. The performance of the 3D BLE filter has been tested on a range of synthetic and real biological data sets and validated against current leading filters—the pseudo 3D recursive and Canny filters. The performance of the 3D BLE filter was found to be comparable to or better than that of both the 3D recursive and Canny filters while offering the significant advantage that it requires no parameter input or optimisation. Edge widths as little as 2 pixels are reproducibly detected with signal intensity and grey scale values as low as 0.72% above the mean of the background noise. The 3D BLE thus provides an efficient method for the automated segmentation of complex cellular structures across multiple scales for further downstream processing, such as cellular annotation and sub-tomogram averaging, and provides a valuable tool for the accurate and high-throughput identification and annotation of 3D structural complexity at the subcellular level, as well as for mapping the spatial and temporal rearrangement of macromolecular assemblies in situ within cellular tomograms. PMID:22479430
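
    The record above concerns automated edge detection in 3D tomograms. As an illustrative stand-in only, a generic parameter-free 3D edge map can be computed from the gradient magnitude; this is NOT the bilateral edge-detection (BLE) filter of the paper, whose exact formulation is not reproduced here, and the synthetic volume is invented.

```python
import numpy as np

def edge_magnitude_3d(volume):
    """Gradient-magnitude edge map of a 3D volume (central differences)."""
    gx, gy, gz = np.gradient(volume.astype(float))
    return np.sqrt(gx**2 + gy**2 + gz**2)

# Synthetic test volume: a bright cube in a noisy background.
rng = np.random.default_rng(0)
vol = rng.normal(0.0, 0.01, size=(32, 32, 32))
vol[10:22, 10:22, 10:22] += 1.0

edges = edge_magnitude_3d(vol)
# The edge map should peak at the cube faces, not inside the cube.
assert edges[10, 16, 16] > edges[16, 16, 16]
```

Like the BLE filter described above, this sketch needs no tuned threshold to locate the strong edges, but it lacks the noise robustness that motivates the bilateral approach.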

  16. Metabolic growth rate control in Escherichia coli may be a consequence of subsaturation of the macromolecular biosynthetic apparatus with substrates and catalytic components

    DEFF Research Database (Denmark)

    Jensen, Kaj Frank; Pedersen, Steen

    1990-01-01

    In this paper, the Escherichia coli cell is considered as a system designed for rapid growth, but limited by the medium. We propose that this very design causes the cell to become subsaturated with precursors and catalytic components at all levels of macromolecular biosynthesis and leads to a mol...

  17. Quantum-Like Representation of Non-Bayesian Inference

    Science.gov (United States)

    Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.

    2013-01-01

    This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are experimental studies whose statistical data cannot be described by classical probability theory, and the process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented the classical Bayesian inference in a natural way within the framework of quantum mechanics. Using this representation, in this paper we discuss the non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.
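
    For contrast with the quantum-like models discussed above, the classical Bayesian update they generalize can be sketched in a few lines. The hypotheses and probabilities below are hypothetical illustration, not data from the study.

```python
# Classical Bayesian belief update: posterior ∝ prior × likelihood.

def bayes_update(prior, likelihood):
    """Return posterior P(h|e) from prior P(h) and likelihood P(e|h)."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    evidence = sum(joint.values())          # P(e), the normalising constant
    return {h: joint[h] / evidence for h in joint}

prior = {"H1": 0.7, "H2": 0.3}             # prior beliefs over two hypotheses
likelihood = {"H1": 0.2, "H2": 0.9}        # P(evidence | hypothesis)
posterior = bayes_update(prior, likelihood)
print(posterior)                           # H1 ≈ 0.341, H2 ≈ 0.659
```

The quantum-like models replace this multiplicative update with one that admits interference terms, which is how they accommodate the "irrational" data.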

  18. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  19. Statistical inference: an integrated Bayesian/likelihood approach

    CERN Document Server

    Aitkin, Murray

    2010-01-01

    Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to difficult model comparison problems and offers direct Bayesian counterparts of frequentist t-tests and other standard statistical methods for hypothesis testing.After an overview of the competing theories of statistical inference, the book introduces the Bayes/likelihood approach used throughout. It pre

  20. Functionalization of Planet-Satellite Nanostructures Revealed by Nanoscopic Localization of Distinct Macromolecular Species

    KAUST Repository

    Rossner, Christian

    2016-09-26

    The development of a straightforward method is reported to form hybrid polymer/gold planet-satellite nanostructures (PlSNs) with functional polymer. A polyacrylate-type polymer with benzyl chloride in its backbone is synthesized as a macromolecular tracer, to study its localization within PlSNs by analyzing the elemental distribution of chlorine. The functionalized nanohybrid structures are analyzed by scanning transmission electron microscopy, electron energy loss spectroscopy, and spectrum imaging. The results show that the sulfur-containing end groups of the RAFT (reversible addition-fragmentation chain transfer) polymers are colocalized at the gold cores, both within nanohybrids of simple core-shell morphology and within higher-order PlSNs, providing microscopic evidence for the affinity of the RAFT group toward gold surfaces. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. OCTOPUS: an innovative multimodal diffractometer for neutron macromolecular crystallography across the length scales

    International Nuclear Information System (INIS)

    Blakeley, M.P.; Andersen, K.; Kreuz, M.; Giroud, B.; McSweeney, S.; Mitchell, E.; Teixeira, S.C.M.; Forsyth, V.T.

    2011-01-01

    We propose to construct a novel protein diffractometer at position H112B. The new instrument will deliver major efficiency gains, as well as offering greatly extended flexibility through the option of several easily interchangeable modes of operation. This proposal builds on the demonstrable need to extend ILL's capacity for high resolution structural studies of protein systems, as well as a need to widen the scope of biological crystallography - in particular for monochromatic studies at both high and low resolution. The development will be carried out in close collaboration with structural biologists at the ESRF, and engineered in such a way that the user interface of the instrument (from sample to software) will be transparently identifiable to a large, dynamic, and driven community of European synchrotron X-ray macromolecular crystallographers. (authors)

  2. Reliable and efficient solution of genome-scale models of Metabolism and macromolecular Expression

    DEFF Research Database (Denmark)

    Ma, Ding; Yang, Laurence; Fleming, Ronan M. T.

    2017-01-01

    Constraint-Based Reconstruction and Analysis (COBRA) is currently the only methodology that permits integrated modeling of Metabolism and macromolecular Expression (ME) at genome-scale. Linear optimization computes steady-state flux solutions to ME models, but flux values are spread over many orders of magnitude. Data values also have greatly varying magnitudes. Standard double-precision solvers may return inaccurate solutions or report that no solution exists. Exact simplex solvers based on rational arithmetic require a near-optimal warm start to be practical on large problems (current ME models have 70,000 constraints and variables and will grow larger). We have developed a quadruple-precision version of our linear and nonlinear optimizer MINOS, and a solution procedure (DQQ) involving Double and Quad MINOS that achieves reliability and efficiency for ME models and other challenging...

  3. A neutral polydisulfide containing Gd(III) DOTA monoamide as a redox-sensitive biodegradable macromolecular MRI contrast agent.

    Science.gov (United States)

    Ye, Zhen; Zhou, Zhuxian; Ayat, Nadia; Wu, Xueming; Jin, Erlei; Shi, Xiaoyue; Lu, Zheng-Rong

    2016-01-01

    This work aims to develop safe and effective gadolinium (III)-based biodegradable macromolecular MRI contrast agents for blood pool and cancer imaging. A neutral polydisulfide containing macrocyclic Gd-DOTA monoamide (GOLS) was synthesized and characterized. In addition to studying the in vitro degradation of GOLS, its kinetic stability was also investigated in an in vivo model. The efficacy of GOLS for contrast-enhanced MRI was examined with female BALB/c mice bearing 4T1 breast cancer xenografts. The pharmacokinetics, biodistribution, and metabolism of GOLS were also determined in mice. GOLS has an apparent molecular weight of 23.0 kDa with T1 relaxivities of 7.20 mM(-1) s(-1) per Gd at 1.5 T, and 6.62 mM(-1) s(-1) at 7.0 T. GOLS had high kinetic inertness against transmetallation with Zn(2+) ions, and its polymer backbone was readily cleaved by L-cysteine. The agent showed improved efficacy for blood pool and tumor MR imaging. The structural effect on biodistribution and in vivo chelation stability was assessed by comparing GOLS with Gd(HP-DO3A), a negatively charged polydisulfide containing Gd-DOTA monoamide GODC, and a polydisulfide containing Gd-DTPA-bisamide (GDCC). GOLS showed high in vivo chelation stability and minimal tissue deposition of gadolinium. The biodegradable macromolecular contrast agent GOLS is a promising polymeric contrast agent for clinical MR cardiovascular imaging and cancer imaging. Copyright © 2015 John Wiley & Sons, Ltd.
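
    The relaxivity reported above can be turned into an expected T1 shortening via the standard relation R1_obs = R1_0 + r1·[Gd]. The r1 value is taken from the abstract; the baseline tissue T1 and the local Gd concentration are assumed illustrative numbers.

```python
# Relaxivity arithmetic for a Gd-based agent such as GOLS.

r1 = 7.20          # relaxivity per Gd, mM^-1 s^-1 (1.5 T, from the abstract)
T1_0 = 1.2         # assumed baseline tissue T1 in seconds
conc_mM = 0.1      # assumed local Gd concentration, mM

R1_obs = 1.0 / T1_0 + r1 * conc_mM   # observed relaxation rate, s^-1
T1_obs = 1.0 / R1_obs                # shortened T1 in seconds
print(f"T1 drops from {T1_0:.2f} s to {T1_obs:.2f} s")   # → 1.20 s to 0.64 s
```

The stronger the relaxivity, the lower the dose needed for the same signal enhancement, which is part of the appeal of macromolecular agents.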

  4. Inference Attacks and Control on Database Structures

    Directory of Open Access Journals (Sweden)

    Muhamed Turkanovic

    2015-02-01

    Today’s databases store information with sensitivity levels that range from public to highly sensitive; ensuring confidentiality can therefore be highly important, but it also requires costly control. This paper focuses on the inference problem on different database structures. It presents possible threats to privacy related to inference, and control methods for mitigating these threats. The paper shows that using only access control, without any inference control, is inadequate, since such models are unable to protect against indirect data access. Furthermore, it covers new inference problems which arise from the dimensions of new technologies like XML, semantics, etc.
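
    A minimal illustration of the indirect data access the paper warns about: two individually "safe" aggregate queries combine to reveal one person's sensitive value, which access control alone cannot prevent. The data are hypothetical.

```python
salaries = {"alice": 52_000, "bob": 61_000, "carol": 58_000}

# Query 1: total over everyone (allowed; no individual named).
q1 = sum(salaries.values())
# Query 2: total over everyone except Carol (also allowed).
q2 = sum(v for k, v in salaries.items() if k != "carol")

# The attacker subtracts the two aggregates to infer Carol's salary.
inferred_carol = q1 - q2
assert inferred_carol == salaries["carol"]
```

Inference control must therefore reason about what combinations of query results disclose, not just about which rows a user may read.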

  5. Type Inference with Inequalities

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    1991-01-01

    Type inference can be phrased as constraint-solving over types. We consider an implicitly typed language equipped with recursive types, multiple inheritance, 1st order parametric polymorphism, and assignments. Type correctness is expressed as satisfiability of a possibly infinite collection of (monotonic) inequalities on the types of variables and expressions. A general result about systems of inequalities over semilattices yields a solvable form. We distinguish between deciding typability (the existence of solutions) and type inference (the computation of a minimal solution). In our case, both...
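
    The record above casts type inference as finding the least solution of monotone inequalities over a semilattice. A toy version of that idea: a chain lattice and Kleene iteration from bottom. The lattice, the example program and its constraints are invented for illustration and are not the paper's own language.

```python
# Chain lattice: bot <= int <= float <= top; join is max by rank.
RANK = {"bot": 0, "int": 1, "float": 2, "top": 3}

def join(a, b):                          # least upper bound on the chain
    return a if RANK[a] >= RANK[b] else b

# Constraints read as type(var) >= bound, where the bound is either a
# constant type or another variable. E.g. from `x = 1; y = 2.0; z = x; z = y`:
constraints = [("x", "int"), ("y", "float"), ("z", "x"), ("z", "y")]
variables = {"x", "y", "z"}

# Kleene iteration from bottom computes the minimal solution.
types = {v: "bot" for v in sorted(variables)}
changed = True
while changed:
    changed = False
    for var, bound in constraints:
        low = types.get(bound, bound)    # variable bound or constant bound
        new = join(types[var], low)
        if new != types[var]:
            types[var], changed = new, True

print(types)   # → {'x': 'int', 'y': 'float', 'z': 'float'}
```

Because `join` is monotone and the lattice is finite, the iteration terminates at the least fixed point, which is exactly the "minimal solution" the abstract distinguishes from mere typability.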

  6. The influence of oxygen exposure time on the composition of macromolecular organic matter as revealed by surface sediments on the Murray Ridge (Arabian Sea)

    Science.gov (United States)

    Nierop, Klaas G. J.; Reichart, Gert-Jan; Veld, Harry; Sinninghe Damsté, Jaap S.

    2017-06-01

    The Arabian Sea represents a prime example of an open-ocean extended oxygen minimum zone (OMZ), with low oxygen concentrations (down to less than 2 μM) between 200 and 1000 m water depth. The OMZ impinges on the ocean floor, affecting organic matter (OM) mineralization. We investigated the impact of oxygen depletion on the composition of macromolecular OM (MOM) along a transect through the OMZ on the slopes of the Murray Ridge. This submarine high in the northern Arabian Sea, with its top at approximately 500 m below sea surface (mbss), intersects the OMZ. We analyzed sediments deposited in the core of the OMZ (suboxic conditions), directly below the OMZ (dysoxic conditions) and well below the OMZ (fully oxic conditions). The upper 18 cm of sediments from three stations recovered at different depths were studied. MOM was investigated by Rock Eval and flash pyrolysis techniques. The MOM was of predominantly marine origin and, as inferred from their pyrolysis products, most biomolecules (tetra-alkylpyrrole pigments, polysaccharides, proteins and their transformation products, and polyphenols including phlorotannins) showed a progressive relative degradation with increasing exposure to oxygen. Alkylbenzenes and, in particular, aliphatic macromolecules increased relatively. The observed differences in MOM composition between sediments deposited under various bottom-water oxygen conditions (i.e. in terms of concentration and exposure time) were much larger than those within sediment cores, implying that early diagenetic alteration of organic matter depends largely on bottom-water oxygenation rather than on subsequent anaerobic degradation within the sediments, even at longer time scales.

  7. Estimation of insurance premiums for coverage against natural disaster risk: an application of Bayesian Inference

    Directory of Open Access Journals (Sweden)

    Y. Paudel

    2013-03-01

    This study applies Bayesian Inference to estimate flood risk for 53 dyke ring areas in the Netherlands, and focuses particularly on the data scarcity and extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on these results, flood insurance premiums are estimated using two different practical methods that each account in different ways for an insurer's risk aversion and the dispersion rate of loss data. This study is of practical relevance because insurers have been considering the introduction of flood insurance in the Netherlands, which is currently not generally available.

  8. Estimation of insurance premiums for coverage against natural disaster risk: an application of Bayesian Inference

    Science.gov (United States)

    Paudel, Y.; Botzen, W. J. W.; Aerts, J. C. J. H.

    2013-03-01

    This study applies Bayesian Inference to estimate flood risk for 53 dyke ring areas in the Netherlands, and focuses particularly on the data scarcity and extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on these results, flood insurance premiums are estimated using two different practical methods that each account in different ways for an insurer's risk aversion and the dispersion rate of loss data. This study is of practical relevance because insurers have been considering the introduction of flood insurance in the Netherlands, which is currently not generally available.
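
    The premium calculation described in the two records above can be sketched as Monte Carlo simulation of annual losses followed by a risk-loaded premium. Every number below (flood frequency, damage distribution, loading factor) is invented for illustration; the paper's actual model is fitted with Bayesian inference to Dutch dyke-ring data.

```python
import random

random.seed(42)

def annual_damage():
    """One simulated year: a rare flood event with lognormal damage."""
    if random.random() < 0.01:                          # assumed 1/100-yr flood
        return random.lognormvariate(mu=18.0, sigma=1.0)  # damage in EUR
    return 0.0

n = 200_000
losses = [annual_damage() for _ in range(n)]
expected_loss = sum(losses) / n

risk_aversion = 0.25                                    # assumed loading factor
premium = expected_loss * (1.0 + risk_aversion)
print(f"expected annual loss ≈ EUR {expected_loss:,.0f}, "
      f"premium ≈ EUR {premium:,.0f}")
```

The loading factor plays the role of the insurer's risk aversion in the abstract: the heavier the loss tail, the more the priced premium exceeds the pure expected loss.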

  9. Inference in models with adaptive learning

    NARCIS (Netherlands)

    Chevillon, G.; Massmann, M.; Mavroeidis, S.

    2010-01-01

    Identification of structural parameters in models with adaptive learning can be weak, causing standard inference procedures to become unreliable. Learning also induces persistent dynamics, and this makes the distribution of estimators and test statistics non-standard. Valid inference can be

  10. JBluIce-EPICS control system for macromolecular crystallography

    International Nuclear Information System (INIS)

    Stepanov, S.; Makarov, O.; Hilgart, M.; Pothineni, S.; Urakhchin, A.; Devarapalli, S.; Yoder, D.; Becker, M.; Ogata, C.; Sanishvili, R.; Nagarajan, V.; Smith, J.L.; Fischetti, R.F.

    2011-01-01

    The trio of macromolecular crystallography beamlines constructed by the General Medicine and Cancer Institutes Collaborative Access Team (GM/CA-CAT) in Sector 23 of the Advanced Photon Source (APS) have been in growing demand owing to their outstanding beam quality and capacity to measure data from crystals of only a few micrometres in size. To take full advantage of the state-of-the-art mechanical and optical design of these beamlines, a significant effort has been devoted to designing fast, convenient, intuitive and robust beamline controls that could easily accommodate new beamline developments. The GM/CA-CAT beamline controls are based on the power of EPICS for distributed hardware control, the rich Java graphical user interface of Eclipse RCP and the task-oriented philosophy as well as the look and feel of the successful SSRL BluIce graphical user interface for crystallography. These beamline controls feature a minimum number of software layers, the wide use of plug-ins that can be written in any language and unified motion controls that allow on-the-fly scanning and optimization of any beamline component. This paper describes the ways in which BluIce was combined with EPICS and converted into the Java-based JBluIce, discusses the solutions aimed at streamlining and speeding up operations and gives an overview of the tools that are provided by this new open-source control system for facilitating crystallographic experiments, especially in the field of microcrystallography.

  11. Making microenvironments: A look into incorporating macromolecular crowding into in vitro experiments, to generate biomimetic microenvironments which are capable of directing cell function for tissue engineering applications.

    Science.gov (United States)

    Benny, Paula; Raghunath, Michael

    2017-01-01

    Biomimetic microenvironments are key components to successful cell culture and tissue engineering in vitro. One of the most accurate biomimetic microenvironments is that made by the cells themselves. Cell-made microenvironments are most similar to the in vivo state as they are cell-specific and produced by the actual cells which reside in that specific microenvironment. However, cell-made microenvironments have been challenging to re-create in vitro due to the lack of extracellular matrix composition, volume and complexity which are required. By applying macromolecular crowding to current cell culture protocols, cell-made microenvironments, or cell-derived matrices, can be generated at significant rates in vitro. In this review, we will examine the causes and effects of macromolecular crowding and how it has been applied in several in vitro systems including tissue engineering.

  12. Positron annihilation lifetime study of radiation-damaged natural zircons

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, J. [Centre for Antimatter-Matter Studies, Research School of Physics and Engineering, The Australian National University, Canberra (Australia); Gaugliardo, P. [Centre for Antimatter-Matter Studies, School of Physics, University of Western Australia (Australia); Farnan, I.; Zhang, M. [Department of Earth Sciences, University of Cambridge (United Kingdom); Vance, E.R.; Davis, J.; Karatchevtseva, I.; Knott, R.B. [Australian Nuclear Science and Technology Organisation (Australia); Mudie, S. [The Australian Synchrotron, Victoria (Australia); Buckman, S.J. [Centre for Antimatter-Matter Studies, Research School of Physics and Engineering, The Australian National University, Canberra (Australia); Institute for Mathematical Sciences, University of Malaya, Kuala Lumpur (Malaysia); Sullivan, J.P., E-mail: james.sullivan@anu.edu.au [Centre for Antimatter-Matter Studies, Research School of Physics and Engineering, The Australian National University, Canberra (Australia)

    2016-04-01

    Zircons are a well-known candidate waste form for actinides and their radiation damage behaviour has been widely studied by a range of techniques. In this study, well-characterised natural single crystal zircons have been studied using Positron Annihilation Lifetime Spectroscopy (PALS). In some, but not all, of the crystals that had incurred at least half of the alpha-event damage of ∼10{sup 19} α/g required to render them structurally amorphous, PALS spectra displayed long lifetimes corresponding to voids of ∼0.5 nm in diameter. The long lifetimes corresponded to expectations from published Small-Angle X-ray Scattering data on similar samples. However, the non-observation by PALS of such voids in some of the heavily damaged samples may reflect large size variations among the voids, such that no single size can be distinguished. Characterisation of a range of samples was also performed using scanning electron microscopy, optical absorption spectroscopy, Raman scattering and X-ray scattering/diffraction, with the degree of alpha damage being inferred mainly from the Raman technique and X-ray diffraction. The observed void diameters and intensities of the long lifetime components were changed somewhat by annealing at 700 °C; annealing at 1200 °C removed the voids entirely. The voids themselves may derive from He gas bubbles or voids created by the inclusion of small quantities of organic and hydrous matter, notwithstanding the observation that no voidage was evidenced by PALS in two samples containing hydrous and organic matter. - Highlights: • Study of a range of naturally occurring zircons damaged by alpha radiation. • Characterised using a range of techniques, including PALS spectroscopy. • Effects on hydrous material appear important, rather than direct radiation damage. • Annealing is shown to remove the observed voids.

  13. Manifestations of damage from ionizing radiation in mammalian cells in the postirradiation generations

    International Nuclear Information System (INIS)

    Hopwood, L.E.; Tolmach, L.J.

    1979-01-01

    The lethally irradiated cell does not immediately cease metabolism. It may undergo one or many divisions before physiological death and disintegration ensue. Attention has been focused mainly on cell behavior during the generation in which the radiation is delivered, and cell behavior is in several respects very different in the irradiated generation from what it is in the succeeding generations. In particular, some of the responses in the irradiated generation are transient, and certain responses seem unrelated to lethal damage. The behavior of lethally irradiated cells during the postirradiation generations before they die is intrinsically of interest, and should provide information concerning the nature of the lethal damage they harbor. Accordingly, in reviewing the behavior of irradiated cells, the discussion is confined for the most part, though not entirely, to the postirradiation generations, even though the distinction between the irradiated and the postirradiation generations is in several respects arbitrary. Effects at the cellular level are considered first; the final part of this section is devoted to a discussion of in vivo measurements of cellular effects, and includes responses that are properly classified as effects at the tissue level. Effects at the subcellular level are considered next, with the discussion restricted to chromosomal effects. Biochemical effects are then considered; these are concerned entirely with macromolecular synthesis. In the final section, possible interpretations of the experimental findings in terms of cell lethality, rather than directly in terms of loss of colony-forming ability ('clonogenicity'), are discussed.

  14. Fiducial inference - A Neyman-Pearson interpretation

    NARCIS (Netherlands)

    Salome, D; VonderLinden, W; Dose,; Fischer, R; Preuss, R

    1999-01-01

    Fisher's fiducial argument is a tool for deriving inferences in the form of a probability distribution on the parameter space, not based on Bayes's Theorem. Lindley established that in exceptional situations fiducial inferences coincide with posterior distributions; in the other situations fiducial

  15. Uncertainty in prediction and in inference

    NARCIS (Netherlands)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in

  16. Polynomial Chaos Surrogates for Bayesian Inference

    KAUST Repository

    Le Maitre, Olivier

    2016-01-06

    Bayesian inference is a popular probabilistic method to solve inverse problems, such as the identification of a field parameter in a PDE model. The inference relies on Bayes' rule to update the prior density of the sought field from observations and to derive its posterior distribution. In most cases the posterior distribution has no explicit form and has to be sampled, for instance using a Markov-chain Monte Carlo method. In practice the prior field parameter is decomposed and truncated (e.g. by means of a Karhunen-Loève decomposition) to recast the inference problem into the inference of a finite number of coordinates. Although proved effective in many situations, Bayesian inference as sketched above faces several difficulties requiring improvements. First, sampling the posterior can be an extremely costly task, as it requires multiple resolutions of the PDE model for different values of the field parameter. Second, when the observations are not very informative, the inferred parameter field can depend strongly on its prior, which can be somewhat arbitrary. These issues have motivated the introduction of reduced models or surrogates for the (approximate) determination of the parametrized PDE solution and of hyperparameters in the description of the prior field. Our contribution focuses on recent developments in these two directions: the acceleration of posterior sampling by means of Polynomial Chaos expansions and the efficient treatment of parametrized covariance functions for the prior field. We also discuss the possibility of making such approaches adaptive to further improve their efficiency.
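
    The surrogate-accelerated sampling described above can be sketched in one dimension: a cheap polynomial surrogate (standing in for a Polynomial Chaos expansion of a PDE solve) is built from a few "expensive" model evaluations, and MCMC then runs entirely on the surrogate. The forward model, noise level and prior are invented toy choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(theta):                 # stands in for an expensive PDE solve
    return np.sin(theta) + 0.5 * theta

# Build the surrogate from a handful of "expensive" evaluations.
nodes = np.linspace(-2.0, 2.0, 9)
surrogate = np.polynomial.Polynomial.fit(nodes, forward(nodes), deg=6)

theta_true, sigma = 0.8, 0.05
y_obs = forward(theta_true) + rng.normal(0.0, sigma)

def log_post(theta):                # Gaussian prior N(0,1) + Gaussian likelihood
    return -0.5 * theta**2 - 0.5 * ((y_obs - surrogate(theta)) / sigma) ** 2

# Random-walk Metropolis sampling of the surrogate posterior.
samples, theta = [], 0.0
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 0.3)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

post_mean = float(np.mean(samples[5_000:]))
print(f"posterior mean ≈ {post_mean:.2f} (true value {theta_true})")
```

Every posterior evaluation here costs a polynomial evaluation instead of a model solve, which is the source of the acceleration the abstract refers to.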

  17. Macromolecular contrast media. A new approach for characterising breast tumors with MR-mammography

    International Nuclear Information System (INIS)

    Daldrup, H.E.; Gossmann, A.; Koeln Univ.; Wendland, M.; Brasch, R.C.; Rosenau, W.

    1997-01-01

    The value of macromolecular contrast agents (MMCM) for the characterization of benign and malignant breast tumors is demonstrated in this review. Animal studies suggest a high potential of MMCM to increase the specificity of MR-mammography. The concept of tumor differentiation is based on the pathological hyperpermeability of microvessels in malignant tumors: MMCM leak into the interstitium of carcinomas, whereas they are confined to the intravascular space in benign tumors. Capabilities and limitations of the MMCM prototype, albumin-Gd-DTPA, for breast tumor characterization are summarized and compared with the standard low molecular weight contrast agent Gd-DTPA. Initial experience with new MMCM, such as dendrimers, Gd-DTPA-polylysine and MS-325, is outlined. The potential of 'blood-pool' iron oxides, such as AMI-227, for the evaluation of tumor microvascular permeabilities is discussed. (orig.)

  18. Interactive Instruction in Bayesian Inference

    DEFF Research Database (Denmark)

    Khan, Azam; Breslav, Simon; Hornbæk, Kasper

    2018-01-01

    An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer’s principles of instruction. These principles concern coherence, personalization, signaling, segmenting, multimedia, spatial contiguity, and pretraining. Principles of self-explanation and interactivity are also applied. Four experiments on the Mammography Problem showed that these principles help participants answer the questions... that an instructional approach to improving human performance in Bayesian inference is a promising direction....
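
    The Mammography Problem around which the study is built has a direct Bayes'-rule solution. The numbers below are the standard textbook values for this problem; the study's exact figures may differ.

```python
p_cancer = 0.01              # base rate of breast cancer
p_pos_given_cancer = 0.80    # sensitivity of the mammogram
p_pos_given_healthy = 0.096  # false-positive rate

# Total probability of a positive test, then Bayes' rule.
p_pos = (p_cancer * p_pos_given_cancer
         + (1 - p_cancer) * p_pos_given_healthy)
p_cancer_given_pos = p_cancer * p_pos_given_cancer / p_pos

print(f"P(cancer | positive) = {p_cancer_given_pos:.3f}")   # → 0.078
```

Most untrained participants guess a far higher posterior, which is precisely the failure of intuition the instructional redesign targets.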

  19. Rapid automated superposition of shapes and macromolecular models using spherical harmonics.

    Science.gov (United States)

    Konarev, Petr V; Petoukhov, Maxim V; Svergun, Dmitri I

    2016-06-01

    A rapid algorithm to superimpose macromolecular models in Fourier space is proposed and implemented (SUPALM). The method uses a normalized integrated cross-term of the scattering amplitudes as a proximity measure between two three-dimensional objects. The reciprocal-space algorithm allows for direct matching of heterogeneous objects including high- and low-resolution models represented by atomic coordinates, beads or dummy residue chains, as well as electron microscopy density maps and inhomogeneous multi-phase models (e.g. of protein-nucleic acid complexes). Using spherical harmonics for the computation of the amplitudes, the method is up to an order of magnitude faster than the real-space algorithm implemented in SUPCOMB by Kozin & Svergun [J. Appl. Cryst. (2001), 34, 33-41]. The utility of the new method is demonstrated in a number of test cases and compared with the results of SUPCOMB. The spherical harmonics algorithm is best suited for low-resolution shape models, e.g. those provided by solution scattering experiments, but also facilitates a rapid cross-validation against structural models obtained by other methods.
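
    The proximity measure described above can be sketched as a normalized cross-term of Fourier amplitudes between two density maps. SUPALM additionally expands the amplitudes in spherical harmonics and optimizes over rotations and translations; this sketch only evaluates the overlap score at fixed orientations, on invented toy volumes.

```python
import numpy as np

def proximity(rho1, rho2):
    """Normalized cross-term of the Fourier amplitudes of two 3D maps."""
    a1, a2 = np.fft.fftn(rho1), np.fft.fftn(rho2)
    cross = np.sum(a1 * np.conj(a2)).real
    norm = np.sqrt(np.sum(np.abs(a1) ** 2) * np.sum(np.abs(a2) ** 2))
    return cross / norm

# Toy density maps: spheres on a small voxel grid.
grid = np.indices((24, 24, 24)).astype(float)
def sphere(center, r):
    d2 = sum((g - c) ** 2 for g, c in zip(grid, center))
    return (d2 <= r * r).astype(float)

a = sphere((12, 12, 12), 6)
b = sphere((14, 12, 12), 6)   # slightly shifted copy: high proximity
c = sphere((4, 4, 4), 3)      # distant small sphere: low proximity
assert proximity(a, a) > proximity(a, b) > proximity(a, c)
```

By Parseval's theorem this reciprocal-space score equals a normalized real-space overlap, which is why it works as a superposition objective.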

  20. Inferring Phylogenetic Networks Using PhyloNet.

    Science.gov (United States)

    Wen, Dingqiao; Yu, Yun; Zhu, Jiafan; Nakhleh, Luay

    2018-07-01

    PhyloNet was released in 2008 as a software package for representing and analyzing phylogenetic networks. At the time of its release, the main functionalities in PhyloNet consisted of measures for comparing network topologies and a single heuristic for reconciling gene trees with a species tree. Since then, PhyloNet has grown significantly. The software package now includes a wide array of methods for inferring phylogenetic networks from data sets of unlinked loci while accounting for both reticulation (e.g., hybridization) and incomplete lineage sorting. In particular, PhyloNet now allows for maximum parsimony, maximum likelihood, and Bayesian inference of phylogenetic networks from gene tree estimates. Furthermore, Bayesian inference directly from sequence data (sequence alignments or biallelic markers) is implemented. Maximum parsimony is based on an extension of the "minimizing deep coalescences" criterion to phylogenetic networks, whereas maximum likelihood and Bayesian inference are based on the multispecies network coalescent. All methods allow for multiple individuals per species. As computing the likelihood of a phylogenetic network is computationally hard, PhyloNet allows for evaluation and inference of networks using a pseudolikelihood measure. PhyloNet summarizes the results of the various analyses and generates phylogenetic networks in the extended Newick format that is readily viewable by existing visualization software.

  1. Active inference and learning.

    Science.gov (United States)

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; O'Doherty, John; Pezzulo, Giovanni

    2016-09-01

    This paper offers an active inference account of choice behaviour and learning. It focuses on the distinction between goal-directed and habitual behaviour and how they contextualise each other. We show that habits emerge naturally (and autodidactically) from sequential policy optimisation when agents are equipped with state-action policies. In active inference, behaviour has explorative (epistemic) and exploitative (pragmatic) aspects that are sensitive to ambiguity and risk respectively, where epistemic (ambiguity-resolving) behaviour enables pragmatic (reward-seeking) behaviour and the subsequent emergence of habits. Although goal-directed and habitual policies are usually associated with model-based and model-free schemes, we find the more important distinction is between belief-free and belief-based schemes. The underlying (variational) belief updating provides a comprehensive (if metaphorical) process theory for several phenomena, including the transfer of dopamine responses, reversal learning, habit formation and devaluation. Finally, we show that active inference reduces to a classical (Bellman) scheme in the absence of ambiguity. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. A Test of Macromolecular Crystallization in Microgravity: Large, Well-Ordered Insulin Crystals

    Science.gov (United States)

    Borgstahl, Gloria E. O.; Vahedi-Faridi, Ardeschir; Lovelace, Jeff; Bellamy, Henry D.; Snell, Edward H.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Crystals of insulin grown in microgravity on space shuttle mission STS-95 were extremely well-ordered and unusually large (many > 2 mm). The physical characteristics of six microgravity and six earth-grown crystals were examined by X-ray analysis employing superfine φ slicing and unfocused synchrotron radiation. This experimental setup allowed hundreds of reflections to be precisely examined for each crystal in a short period of time. The microgravity crystals were on average 34 times larger, had 7 times lower mosaicity, had 54 times higher reflection peak heights and diffracted to significantly higher resolution than their earth-grown counterparts. A single mosaic domain model could account for reflections in microgravity crystals, whereas reflections from earth crystals required a model with multiple mosaic domains. This statistically significant and unbiased characterization indicates that the microgravity environment was useful for the improvement of crystal growth and resultant diffraction quality in insulin crystals and may be similarly useful for macromolecular crystals in general.

  3. A vibrating membrane bioreactor (VMBR): Macromolecular transmission-influence of extracellular polymeric substances

    DEFF Research Database (Denmark)

    Beier, Søren; Jonsson, Gunnar Eigil

    2009-01-01

    The vibrating membrane bioreactor (VMBR) system facilitates the separation of macromolecules (BSA) from larger biological components (yeast cells) with a relatively high and stable macromolecular transmission at sub-critical flux. This is not possible to achieve … for a static non-vibrating membrane module. A BSA transmission of 74% has been measured in the separation of 4 g/L BSA from 8 g/L dry weight yeast cells in suspension at sub-critical flux (20 L/(m² h)). However, this transmission is lower than the 85% BSA transmission measured for a pure 4 g/L BSA solution. … This can be ascribed to the presence of extracellular polymeric substances (EPS) from the yeast cells. The initial fouling rate for constant sub-critical flux filtration of unwashed yeast cells is 3-4 times larger than for washed yeast cells (18 mbar/h vs 5 mbar/h). At sub-critical flux, an EPS transmission …
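
    The reported transmission figures follow from the usual observed-transmission definition (permeate over feed concentration); a minimal sketch, where the 2.96 g/L permeate concentration is an illustrative value chosen to reproduce the reported 74%, not a measurement from the study:

```python
def transmission_percent(c_permeate, c_feed):
    """Observed macromolecular transmission through the membrane, in %."""
    return 100.0 * c_permeate / c_feed

# Illustrative: 2.96 g/L BSA in the permeate from a 4 g/L feed -> 74%.
print(transmission_percent(2.96, 4.0))

# Ratio of initial fouling rates, unwashed vs washed yeast cells.
print(18 / 5)  # consistent with the reported 3-4x difference
```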

  4. Synthesis and Self-Assembly of Amphiphilic Triblock Terpolymers with Complex Macromolecular Architecture

    KAUST Repository

    Polymeropoulos, George; Zapsas, George; Hadjichristidis, Nikolaos; Avgeropoulos, Apostolos

    2015-01-01

    Two star triblock terpolymers (PS-b-P2VP-b-PEO)3 and one dendritic-like terpolymer [PS-b-P2VP-b-(PEO)2]3 of PS (polystyrene), P2VP (poly(2-vinylpyridine)), and PEO (poly(ethylene oxide)), never reported before, were synthesized by combining atom transfer radical and anionic polymerizations. The synthesis involves the transformation of the -Br groups of the previously reported Br-terminated 3-arm star diblock copolymers to one or two -OH groups, followed by anionic polymerization of ethylene oxide to afford the star or dendritic structure, respectively. The well-defined structure of the terpolymers was confirmed by static light scattering, size exclusion chromatography, and NMR spectroscopy. The self-assembly in solution and the morphology in bulk of the terpolymers, studied by dynamic light scattering and transmission electron microscopy, respectively, reveal new insights in the phase separation of these materials with complex macromolecular architecture. © 2015 American Chemical Society.

  5. Control and data acquisition system for the macromolecular crystallography beamline of SSRF

    International Nuclear Information System (INIS)

    Wang Qisheng; Huang Sheng; Sun Bo; Tang Lin; He Jianhua

    2012-01-01

    The macromolecular crystallography beamline BL17U1 of Shanghai Synchrotron Radiation Facility (SSRF) is an important platform for structural biology. High performance of the beamline greatly benefits users in their experiments and data acquisition. To take full advantage of the state-of-the-art mechanical and physical design of the beamline, we have made a series of efforts to develop a robust control and data acquisition system with a user-friendly GUI. This was done by adopting the EPICS and Blu-Ice systems on the BL17U1 beamline, with consideration for easy accommodation of new beamline components. In this paper, we report the integration of EPICS and Blu-Ice. By using the EPICS gateway interface and several new DHS components, Blu-Ice was successfully established for the BL17U1 beamline. As a result, the experiment control and data acquisition system is reliable and functional for users. (authors)

  6. Synthesis and Self-Assembly of Amphiphilic Triblock Terpolymers with Complex Macromolecular Architecture

    KAUST Repository

    Polymeropoulos, George

    2015-11-25

    Two star triblock terpolymers (PS-b-P2VP-b-PEO)3 and one dendritic-like terpolymer [PS-b-P2VP-b-(PEO)2]3 of PS (polystyrene), P2VP (poly(2-vinylpyridine)), and PEO (poly(ethylene oxide)), never reported before, were synthesized by combining atom transfer radical and anionic polymerizations. The synthesis involves the transformation of the -Br groups of the previously reported Br-terminated 3-arm star diblock copolymers to one or two -OH groups, followed by anionic polymerization of ethylene oxide to afford the star or dendritic structure, respectively. The well-defined structure of the terpolymers was confirmed by static light scattering, size exclusion chromatography, and NMR spectroscopy. The self-assembly in solution and the morphology in bulk of the terpolymers, studied by dynamic light scattering and transmission electron microscopy, respectively, reveal new insights in the phase separation of these materials with complex macromolecular architecture. © 2015 American Chemical Society.

  7. Constructing irregular surfaces to enclose macromolecular complexes for mesoscale modeling using the discrete surface charge optimization (DISCO) algorithm.

    Science.gov (United States)

    Zhang, Qing; Beard, Daniel A; Schlick, Tamar

    2003-12-01

    Salt-mediated electrostatic interactions play an essential role in biomolecular structures and dynamics. Because macromolecular systems modeled at atomic resolution contain thousands of solute atoms, the electrostatic computations constitute an expensive part of the force and energy calculations. Implicit solvent models are one way to simplify the model and associated calculations, but they are generally used in combination with standard atomic models for the solute. To approximate electrostatic interactions in models on the polymer level (e.g., supercoiled DNA) that are simulated over long times (e.g., milliseconds) using Brownian dynamics, Beard and Schlick have developed the DiSCO (Discrete Surface Charge Optimization) algorithm. DiSCO represents a macromolecular complex by a few hundred discrete charges on a surface enclosing the system, modeled by the Debye-Hückel (screened Coulombic) approximation to the Poisson-Boltzmann equation, and treats the salt solution as a continuum. DiSCO can represent the nucleosome core particle (>12,000 atoms), for example, by 353 discrete surface charges distributed on the surfaces of a large disk for the nucleosome core particle and a slender cylinder for the histone tail; the charges are optimized with respect to the Poisson-Boltzmann solution for the electric field, yielding a residual of approximately 5.5%. Because regular surfaces enclosing macromolecules are not sufficiently general and may be suboptimal for certain systems, we develop a general method to construct irregular models tailored to the geometry of macromolecules. We also compare charge optimization based on both the electric field and electrostatic potential refinement. Results indicate that irregular surfaces can lead to a more accurate approximation (lower residuals), and the refinement in terms of the electric field is more robust. We also show that surface smoothing for irregular models is important, that the charge optimization (by the TNPACK
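
    The charge-optimization idea can be sketched in miniature: place a few discrete charges on an enclosing surface and fit their magnitudes by least squares so that their screened-Coulomb potentials reproduce the field of the enclosed atomic charges at exterior sample points. Everything below is a toy (one buried charge, two surface sites, hand-picked points); DiSCO itself optimizes hundreds of charges against Poisson-Boltzmann solutions:

```python
import math

KAPPA = 1.0  # inverse Debye length, toy units

def dh(q, src, pt):
    """Debye-Hückel (screened Coulombic) potential of a point charge."""
    r = math.dist(src, pt)
    return q * math.exp(-KAPPA * r) / r

# One buried "atomic" charge, two discrete surface-charge sites and a few
# exterior sample points where the fields are matched (all values toy).
atom_q, atom = 1.0, (0.0, 0.0, 0.0)
sites = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0)]
pts = [(2.0, 0.0, 0.0), (0.0, 2.0, 0.0), (-2.0, 0.0, 0.0),
       (0.0, 0.0, 2.5), (3.0, 1.0, 0.0)]

# Least squares for the two unknown surface charges via normal equations.
A = [[dh(1.0, s, p) for s in sites] for p in pts]
b = [dh(atom_q, atom, p) for p in pts]
ata = [[sum(r[i] * r[j] for r in A) for j in range(2)] for i in range(2)]
atb = [sum(r[i] * t for r, t in zip(A, b)) for i in range(2)]
det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
q0 = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
q1 = (ata[0][0] * atb[1] - ata[1][0] * atb[0]) / det

# Residual of the fitted surface-charge field against the target field.
residual = sum((q0 * r[0] + q1 * r[1] - t) ** 2 for r, t in zip(A, b))
print(q0, q1, residual)
```

    By the rough symmetry of the toy geometry the two fitted charges come out nearly equal, and the residual is a fraction of the target field's own magnitude.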

  8. Active Inference, homeostatic regulation and adaptive behavioural control.

    Science.gov (United States)

    Pezzulo, Giovanni; Rigoli, Francesco; Friston, Karl

    2015-11-01

    We review a theory of homeostatic regulation and adaptive behavioural control within the Active Inference framework. Our aim is to connect two research streams that are usually considered independently; namely, Active Inference and associative learning theories of animal behaviour. The former uses a probabilistic (Bayesian) formulation of perception and action, while the latter calls on multiple (Pavlovian, habitual, goal-directed) processes for homeostatic and behavioural control. We offer a synthesis of these classical processes and cast them as successive hierarchical contextualisations of sensorimotor constructs, using the generative models that underpin Active Inference. This dissolves any apparent mechanistic distinction between the optimization processes that mediate classical control or learning. Furthermore, we generalize the scope of Active Inference by emphasizing interoceptive inference and homeostatic regulation. The ensuing homeostatic (or allostatic) perspective provides an intuitive explanation for how priors act as drives or goals to enslave action, and emphasises the embodied nature of inference. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Generative Inferences Based on Learned Relations

    Science.gov (United States)

    Chen, Dawn; Lu, Hongjing; Holyoak, Keith J.

    2017-01-01

    A key property of relational representations is their "generativity": From partial descriptions of relations between entities, additional inferences can be drawn about other entities. A major theoretical challenge is to demonstrate how the capacity to make generative inferences could arise as a result of learning relations from…

  10. Stably engineered nanobubbles and ultrasound - An effective platform for enhanced macromolecular delivery to representative cells of the retina.

    Directory of Open Access Journals (Sweden)

    Sachin S Thakur

    Herein we showcase the potential of ultrasound-responsive nanobubbles in enhancing macromolecular permeation through layers of the retina, ultimately leading to significant and direct intracellular delivery; this being effectively demonstrated across three relevant and distinct retinal cell lines. Stably engineered nanobubbles of a highly homogenous and echogenic nature were fully characterised using dynamic light scattering, B-scan ultrasound and transmission electron microscopy (TEM). The nanobubbles appeared as spherical liposome-like structures under TEM, accompanied by an opaque luminal core and darkened corona around their periphery, with both features indicative of efficient gas entrapment and adsorption, respectively. A nanobubble +/- ultrasound sweeping study was conducted next, which determined the maximum tolerated dose for each cell line. Detection of underlying cellular stress was verified using the biomarker heat shock protein 70, measured before and after treatment with optimised ultrasound. Next, with safety to nanobubbles and optimised ultrasound demonstrated, each human or mouse-derived cell population was incubated with biotinylated rabbit-IgG in the presence and absence of ultrasound +/- nanobubbles. Intracellular delivery of antibody in each cell type was then quantified using Cy3-streptavidin. Nanobubbles and optimised ultrasound were found to be negligibly toxic across all cell lines tested. Macromolecular internalisation was achieved to significant, yet varying degrees in all three cell lines. The results of this study pave the way towards better understanding mechanisms underlying cellular responsiveness to ultrasound-triggered drug delivery in future ex vivo and in vivo models of the posterior eye.

  11. Prevention of DNA damage and anticarcinogenic activity of Activia® in a preclinical model.

    Science.gov (United States)

    Limeiras, S M A; Ogo, F M; Genez, L A L; Carreira, C M; Oliveira, E J T; Pessatto, L R; Neves, S C; Pesarini, J R; Schweich, L C; Silva, R A; Cantero, W B; Antoniolli-Silva, A C M B; Oliveira, R J

    2017-03-22

    Colorectal cancer is a global public health issue. Studies have pointed to the protective effect of probiotics on colorectal carcinogenesis. Activia® is a lacto-probiotic product that is widely consumed all over the world; its beneficial properties are related mainly to the lineage of traditional yoghurt bacteria combined with a specific bacillus, DanRegularis, which gives the product a proven capacity for intestinal regulation in humans. The aim of this study was to evaluate the antigenotoxic, antimutagenic, and anticarcinogenic properties of Activia in response to damage caused by 1,2-dimethylhydrazine (DMH) in Swiss mice. Activia did not show antigenotoxic activity. However, the percentage of DNA damage reduction, evaluated by the antimutagenicity assay, ranged from 69.23 to 96.15%, indicating effective chemopreventive action. Activia reduced the induction of aberrant crypt foci by DMH by up to 79.82%. In view of these results, it is inferred that Activia facilitates weight loss and prevents DNA damage and pre-cancerous lesions in the intestinal mucosa.
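
    The percent DNA damage reduction reported by antimutagenicity assays of this kind is conventionally computed from the damage-inducer group (A), the co-treatment group (B) and the negative control (C); a sketch with illustrative counts (not data from the study):

```python
def percent_damage_reduction(inducer, cotreated, control):
    """Standard antimutagenicity index: 100 * (A - B) / (A - C), where
    A = damage with the inducer (e.g. DMH) alone, B = damage with inducer
    plus the candidate protective agent, C = untreated negative control."""
    return 100.0 * (inducer - cotreated) / (inducer - control)

# Illustrative mean damage counts (not data from the study):
print(percent_damage_reduction(26.0, 6.0, 2.0))
```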

  12. Macromolecular crystallization in microgravity generated by a superconducting magnet.

    Science.gov (United States)

    Wakayama, N I; Yin, D C; Harata, K; Kiyoshi, T; Fujiwara, M; Tanimoto, Y

    2006-09-01

    About 30% of the protein crystals grown in space yield better X-ray diffraction data than the best crystals grown on earth. The microgravity environments provided by the application of an upward magnetic force are excellent candidates for simulating the microgravity conditions of space. Here, we describe a method to control effective gravity and to form protein crystals at various levels of effective gravity. Since 2002, the stable, long-duration microgravity generated by a convenient type of superconducting magnet has been available for protein crystal growth. For the first time, protein crystals (orthorhombic lysozyme) were grown at microgravity on earth, and it was shown that this microgravity improved crystal quality effectively and reproducibly. The present method is always accompanied by a strong magnetic field, and the magnetic field itself seems to improve crystal quality. Microgravity is not always effective for improving crystal quality: when we applied this microgravity to the formation of cubic porcine insulin and tetragonal lysozyme crystals, we observed no dependence of crystal quality on effective gravity. Thus, this kind of test will be useful for selecting promising proteins prior to space experiments. Finally, the microgravity generated by the magnet is compared with that in space, considering the cost, the quality of microgravity, experimental convenience, etc., and the future use of this microgravity for macromolecular crystal growth is discussed.

  13. Parametric statistical inference basic theory and modern approaches

    CERN Document Server

    Zacks, Shelemyahu; Tsokos, C P

    1981-01-01

    Parametric Statistical Inference: Basic Theory and Modern Approaches presents the developments and modern trends in statistical inference to students who do not have advanced mathematical and statistical preparation. The topics discussed in the book are basic and common to many fields of statistical inference and thus serve as a jumping board for in-depth study. The book is organized into eight chapters. Chapter 1 provides an overview of how the theory of statistical inference is presented in subsequent chapters. Chapter 2 briefly discusses statistical distributions and their properties. Chapt

  14. Acoustic methods for high-throughput protein crystal mounting at next-generation macromolecular crystallographic beamlines.

    Science.gov (United States)

    Roessler, Christian G; Kuczewski, Anthony; Stearns, Richard; Ellson, Richard; Olechno, Joseph; Orville, Allen M; Allaire, Marc; Soares, Alexei S; Héroux, Annie

    2013-09-01

    To take full advantage of advanced data collection techniques and high beam flux at next-generation macromolecular crystallography beamlines, rapid and reliable methods will be needed to mount and align many samples per second. One approach is to use an acoustic ejector to eject crystal-containing droplets onto a solid X-ray transparent surface, which can then be positioned and rotated for data collection. Proof-of-concept experiments were conducted at the National Synchrotron Light Source on thermolysin crystals acoustically ejected onto a polyimide 'conveyor belt'. Small wedges of data were collected on each crystal, and a complete dataset was assembled from a well diffracting subset of these crystals. Future developments and implementation will focus on achieving ejection and translation of single droplets at a rate of over one hundred per second.

  15. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines.

    Science.gov (United States)

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

  16. Radiation damage prediction system using damage function

    International Nuclear Information System (INIS)

    Tanaka, Yoshihisa; Mori, Seiji

    1979-01-01

    The irradiation damage analysis system using a damage function was investigated. The system consists of three processes: the unfolding of a damage function, the calculation of the neutron flux spectrum for the object of damage analysis, and the estimation of the irradiation effect on that object. The damage function is calculated by applying the SAND-2 code. The ANISN and DOT3.5 codes are used to calculate neutron flux. The irradiation effect and the allowable time of reactor operation can be estimated from these calculations of the damage function and neutron flux. The flow diagram of the process of analyzing irradiation damage by a damage function and the flow diagram of the SAND-2 code are presented, and the analytical code for estimating damage, which is determined from a damage function and a neutron spectrum, is explained. The irradiation damage analysis system was applied to the core support structure of a fast breeder reactor for damage estimation and uncertainty evaluation. The fundamental analytical conditions and the analytical model are presented; then the irradiation data for SUS304, the initial estimates of the damage function, the error analysis for the damage function and the analytical results are explained concerning the computation of a damage function for 10% total elongation. Concerning the damage estimation of the FBR core support structure, the standard and lower limiting values of damage, the permissible neutron flux and the allowable years of reactor operation are presented and evaluated. (Nakai, Y.)
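
    In a group-wise discretization, the damage estimate underlying such a system is the damage function folded with the neutron flux spectrum; a toy sketch with illustrative numbers (not values from the paper):

```python
# Group-wise fold of a damage function with a neutron flux spectrum:
# damage rate = sum_g d_g * phi_g; allowable operating time = limit / rate.
# All numbers below are illustrative, not values from the paper.
damage_fn = [2.0e-22, 5.0e-22, 1.5e-21]  # damage per unit fluence, per energy group
flux = [1.0e13, 4.0e12, 8.0e11]          # group fluxes, n/(cm^2 s)

damage_rate = sum(d * f for d, f in zip(damage_fn, flux))  # damage per second

SECONDS_PER_YEAR = 3.156e7
damage_limit = 0.1  # e.g. a criterion tied to 10% total elongation (toy)
allowable_years = damage_limit / (damage_rate * SECONDS_PER_YEAR)
print(damage_rate, allowable_years)
```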

  17. Variational inference & deep learning: A new synthesis

    OpenAIRE

    Kingma, D.P.

    2017-01-01

    In this thesis, Variational Inference and Deep Learning: A New Synthesis, we propose novel solutions to the problems of variational (Bayesian) inference, generative modeling, representation learning, semi-supervised learning, and stochastic optimization.

  18. Variational inference & deep learning : A new synthesis

    NARCIS (Netherlands)

    Kingma, D.P.

    2017-01-01

    In this thesis, Variational Inference and Deep Learning: A New Synthesis, we propose novel solutions to the problems of variational (Bayesian) inference, generative modeling, representation learning, semi-supervised learning, and stochastic optimization.
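
    The variational inference at the core of this line of work maximizes an evidence lower bound (ELBO); a minimal sketch for a conjugate Gaussian toy model, using the reparameterization trick associated with this thesis (the model and all numbers are illustrative):

```python
import math, random

random.seed(0)

def log_normal(x, mu, var):
    """Log density of N(mu, var) at x."""
    return -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)

# Toy conjugate model: z ~ N(0,1), x|z ~ N(z,1); exact evidence p(x)=N(x;0,2).
x = 1.3
log_evidence = log_normal(x, 0.0, 2.0)

def elbo(m, s, n=50_000):
    """Monte Carlo ELBO with the reparameterization trick, q(z)=N(m, s^2)."""
    total = 0.0
    for _ in range(n):
        z = m + s * random.gauss(0.0, 1.0)     # reparameterized sample
        total += (log_normal(x, z, 1.0)        # log p(x|z)
                  + log_normal(z, 0.0, 1.0)    # + log p(z)
                  - log_normal(z, m, s * s))   # - log q(z)
    return total / n

# The exact posterior here is N(x/2, 1/2): the ELBO is tight.
tight = elbo(x / 2.0, math.sqrt(0.5))
# A mismatched q leaves a KL gap: the ELBO falls below the evidence.
loose = elbo(0.0, 1.0)
print(log_evidence, tight, loose)
```

    With the exact posterior as q, each Monte Carlo sample evaluates to the log evidence itself (zero variance), which is why the bound is tight there.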

  19. Macromolecular weight specificity in covalent binding of bromobenzene

    International Nuclear Information System (INIS)

    Sun, J.D.; Dent, J.G.

    1984-01-01

    Bromobenzene is a hepatotoxicant that causes centrilobular necrosis. Pretreatment of animals with 3-methylcholanthrene decreases, and phenobarbital pretreatment enhances, the hepatotoxic action of this compound. We have investigated the macromolecular weight specificity of the covalent interactions of bromobenzene with liver macromolecules following incubation of [14C]bromobenzene in isolated hepatocytes. Hepatocytes were prepared from Fischer-344 rats treated for 3 days with 3-methylcholanthrene, phenobarbital, or normal saline. After a 1-hr incubation, total covalent binding, as measured by sodium dodecyl sulfate-equilibrium dialysis, was twofold less in hepatocytes from 3-methylcholanthrene-treated rats and sixfold greater in hepatocytes from phenobarbital-treated rats, as compared to hepatocytes from control animals. Analysis of the arylated macromolecules by electrophoresis on 15% sodium dodecyl sulfate-polyacrylamide disc gels indicated that in the first 1 to 3 min of incubation substantial amounts of covalently bound radiolabel were associated with macromolecules of molecular weight between 20,000 and 40,000. The amount of radioactivity associated with these macromolecules rapidly diminished in hepatocytes from control and 3-methylcholanthrene-treated animals. In hepatocytes from phenobarbital-treated animals, the amount of radioactivity associated with macromolecules greater than 20,000 increased throughout the incubation. The amount of radiolabel associated with macromolecules smaller than 20,000 increased in all incubations. When nontoxic doses of phenylmethylsulfonyl fluoride, a specific inhibitor of serine proteases, were added to control hepatocytes incubated with [14C]bromobenzene, the decrease in radioactivity associated with larger (greater than 20,000) macromolecules was inhibited and a corresponding lack of increase in radioactivity associated with smaller macromolecules was observed.

  20. Ensemble stacking mitigates biases in inference of synaptic connectivity

    Directory of Open Access Journals (Sweden)

    Brendan Chambers

    2018-03-01

    A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches. Mapping the routing of spikes through local circuitry is crucial for understanding neocortical computation. Under appropriate experimental conditions, these maps can be used to infer likely patterns of synaptic recruitment, linking activity to underlying anatomical connections. Such inferences help to reveal the synaptic implementation of population dynamics and computation. We compare a number of standard functional measures to infer underlying connectivity. We find that regularization impacts measures
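
    The ensemble idea, a linear combination of individual inference scores, can be sketched on synthetic data (all gains, noise levels and weights below are illustrative; the paper derives its weightings from simulated ground-truth networks):

```python
import random

random.seed(1)

# Synthetic ground truth: 1000 candidate connections, 20% real.
truth = [1 if random.random() < 0.2 else 0 for _ in range(1000)]

def method(label, gain, noise):
    """A toy 'inference method': true-label signal plus method-specific noise."""
    return gain * label + random.gauss(0.0, noise)

# Three methods with different signal-to-noise characteristics.
m1 = [method(t, 1.0, 0.8) for t in truth]
m2 = [method(t, 0.7, 0.6) for t in truth]
m3 = [method(t, 0.5, 0.5) for t in truth]

def accuracy(scores, thresh):
    return sum((s > thresh) == bool(t) for s, t in zip(scores, truth)) / len(truth)

# Fixed linear weights stand in for weights learned on held-out simulations.
ens = [0.4 * a + 0.35 * b + 0.25 * c for a, b, c in zip(m1, m2, m3)]

best_single = max(accuracy(m1, 0.5), accuracy(m2, 0.35), accuracy(m3, 0.25))
acc_ens = accuracy(ens, 0.385)  # threshold at half the ensemble's signal gain
print(best_single, acc_ens)     # ensemble vs best individual method
```

    Because the three methods' errors are independent, the weighted sum has a higher signal-to-noise ratio than any single score, which is the statistical core of the stacking argument.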

  1. Constraint Satisfaction Inference : Non-probabilistic Global Inference for Sequence Labelling

    NARCIS (Netherlands)

    Canisius, S.V.M.; van den Bosch, A.; Daelemans, W.; Basili, R.; Moschitti, A.

    2006-01-01

    We present a new method for performing sequence labelling based on the idea of using a machine-learning classifier to generate several possible output sequences, and then applying an inference procedure to select the best sequence among those. Most sequence labelling methods following a similar

  2. Structure, function and folding of phosphoglycerate kinase are strongly perturbed by macromolecular crowding.

    Science.gov (United States)

    Samiotakis, Antonios; Dhar, Apratim; Ebbinghaus, Simon; Nienhaus, Lea; Homouz, Dirar; Gruebele, Martin; Cheung, Margaret

    2010-10-01

    We combine experiment and computer simulation to show how macromolecular crowding dramatically affects the structure, function and folding landscape of phosphoglycerate kinase (PGK). Fluorescence labeling shows that compact states of yeast PGK are populated as the amount of crowding agents (Ficoll 70) increases. Coarse-grained molecular simulations reveal three compact ensembles: C (crystal structure), CC (collapsed crystal) and Sph (spherical compact). With an adjustment for viscosity, crowded wild type PGK and fluorescent PGK are about 15 times or more active in 200 mg/ml Ficoll than in aqueous solution. Our results suggest a new solution to the classic problem of how the ADP and diphosphoglycerate binding sites of PGK come together to make ATP: rather than undergoing a hinge motion, the ADP and substrate sites are already located in proximity under crowded conditions that mimic the in vivo conditions under which the enzyme actually operates.

  3. Reasoning about Informal Statistical Inference: One Statistician's View

    Science.gov (United States)

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  4. Meta-learning framework applied in bioinformatics inference system design.

    Science.gov (United States)

    Arredondo, Tomás; Ormazábal, Wladimir

    2015-01-01

    This paper describes a meta-learner inference system development framework which is applied and tested in the implementation of bioinformatic inference systems. These inference systems are used for the systematic classification of the best candidates for inclusion in bacterial metabolic pathway maps. This meta-learner-based approach utilises a workflow in which the user provides feedback on final classification decisions, which are stored together with the analysed genetic sequences for periodic inference system training. The inference systems were trained and tested with three different data sets related to the bacterial degradation of aromatic compounds. The analysis of the meta-learner-based framework involved contrasting several optimisation methods with various parameters. The resulting inference systems were also compared with other standard classification methods and showed accurate prediction capabilities.

  5. Characterization of matrix damage in ion-irradiated reactor vessel steel

    International Nuclear Information System (INIS)

    Fujii, Katsuhiko; Fukuya, Koji

    2004-01-01

    The exact nature of matrix damage, one of the radiation-induced nano-scale microstructural features causing radiation embrittlement of reactor vessels, in irradiated commercial steels has not yet been clarified by direct characterization using transmission electron microscopy (TEM). We designed a new preparation method for TEM observation samples and applied it to direct TEM observation of matrix damage in commercial steel samples irradiated by ions. The simulation irradiation was carried out with 3 MeV Ni2+ ions to a dose of 1 dpa at 290°C. Thin foil specimens for TEM observation were prepared using a modified focused ion beam method. A weak-beam TEM study was carried out to observe the matrix damage in the samples. The results of this first detailed observation of matrix damage in irradiated commercial steel show that it consists of small dislocation loops. The observed and analyzed dislocation loops have Burgers vectors b = a, and their mean image size and number density are 2.5 nm and about 1 × 10²² m⁻³, respectively. In this experiment, all of the observed dislocation loops were too small to determine their vacancy or interstitial nature directly. Although it is an indirect method, post-irradiation annealing was used to infer the loop nature. Most of the dislocation loops were stable after annealing at 400°C for 30 min. This result suggests that their nature is interstitial. (author)

  6. Statistical inference and Aristotle's Rhetoric.

    Science.gov (United States)

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  7. Children's and adults' judgments of the certainty of deductive inferences, inductive inferences, and guesses.

    Science.gov (United States)

    Pillow, Bradford H; Pearson, Raeanne M; Hecht, Mary; Bremer, Amanda

    2010-01-01

    Children and adults rated their own certainty following inductive inferences, deductive inferences, and guesses. Beginning in kindergarten, participants rated deductions as more certain than weak inductions or guesses. Deductions were rated as more certain than strong inductions beginning in Grade 3, and fourth-grade children and adults differentiated strong inductions, weak inductions, and informed guesses from pure guesses. By Grade 3, participants also gave different types of explanations for their deductions and inductions. These results are discussed in relation to children's concepts of cognitive processes, logical reasoning, and epistemological development.

  8. Optimization of selective inversion recovery magnetization transfer imaging for macromolecular content mapping in the human brain.

    Science.gov (United States)

    Dortch, Richard D; Bagnato, Francesca; Gochberg, Daniel F; Gore, John C; Smith, Seth A

    2018-03-24

    To optimize a selective inversion recovery (SIR) sequence for macromolecular content mapping in the human brain at 3.0T. SIR is a quantitative magnetization transfer (qMT) method that uses a low-power, on-resonance inversion pulse. This results in a biexponential recovery of free-water signal that can be sampled at various inversion/predelay times (tI/tD) to estimate a subset of qMT parameters, including the macromolecular-to-free pool-size ratio (PSR), the R1 of free water (R1f), and the rate of MT exchange (kmf). The adoption of SIR has been limited by long acquisition times (≈4 min/slice). Here, we use Cramér-Rao lower bound theory and data reduction strategies to select optimal tI/tD combinations to reduce imaging times. The schemes were experimentally validated in phantoms, and tested in healthy volunteers (N = 4) and a multiple sclerosis patient. Two optimal sampling schemes were determined: (i) a 5-point scheme (kmf estimated) and (ii) a 4-point scheme (kmf assumed). In phantoms, the 5-/4-point schemes yielded parameter estimates with SNRs similar to our previous 16-point scheme, but with 4.1-/6.1-fold shorter scan times. Pair-wise comparisons between schemes did not detect significant differences for any scheme/parameter. In humans, parameter values were consistent with published values, and similar levels of precision were obtained from all schemes. Furthermore, fixing kmf reduced the sensitivity of PSR to partial-volume averaging, yielding more consistent estimates throughout the brain. qMT parameters can be robustly estimated in ≤1 min/slice (without independent measures of ΔB0, B1+, and T1) when optimized tI/tD combinations are selected. © 2018 International Society for Magnetic Resonance in Medicine.
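The Cramér-Rao step can be sketched numerically: for Gaussian noise, the Fisher information of a candidate sampling scheme follows from the signal's parameter gradients, and the diagonal of its inverse lower-bounds the parameter variances. The generic biexponential model and all numbers below are illustrative stand-ins, not the paper's actual SIR signal equation or protocol values.

```python
import numpy as np

def signal(t, p):
    # Generic biexponential recovery, a stand-in for the SIR signal model:
    # S(t) = c + A1*exp(-r1*t) + A2*exp(-r2*t)
    c, A1, A2, r1, r2 = p
    return c + A1 * np.exp(-r1 * t) + A2 * np.exp(-r2 * t)

def crlb_std(t, p, sigma=1e-3, h=1e-6):
    # Cramer-Rao bound for Gaussian noise: J = G^T G / sigma^2, where
    # G[i, j] = dS(t_i)/dp_j (central differences); bound = sqrt(diag(J^-1)).
    p = np.asarray(p, float)
    G = np.empty((len(t), len(p)))
    for j in range(len(p)):
        dp = np.zeros_like(p)
        dp[j] = h * max(abs(p[j]), 1.0)
        G[:, j] = (signal(t, p + dp) - signal(t, p - dp)) / (2 * dp[j])
    J = G.T @ G / sigma**2
    return np.sqrt(np.diag(np.linalg.inv(J)))

p = (1.0, -1.0, -0.9, 1.0, 10.0)                        # illustrative values only
t5 = np.array([0.02, 0.1, 0.3, 1.0, 4.0])               # candidate 5-point scheme
t16 = np.concatenate([t5, np.linspace(0.05, 3.0, 11)])  # denser superset scheme
std5, std16 = crlb_std(t5, p), crlb_std(t16, p)
```

Comparing `std5` against `std16` shows the precision cost of dropping sample points, which is the trade the optimized 5-/4-point schemes are designed to minimize.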

  9. Macromolecular composition of terrestrial and marine organic matter in sediments across the East Siberian Arctic Shelf

    Science.gov (United States)

    Sparkes, Robert B.; Doğrul Selver, Ayça; Gustafsson, Örjan; Semiletov, Igor P.; Haghipour, Negar; Wacker, Lukas; Eglinton, Timothy I.; Talbot, Helen M.; van Dongen, Bart E.

    2016-10-01

    Mobilisation of terrestrial organic carbon (terrOC) from permafrost environments in eastern Siberia has the potential to deliver significant amounts of carbon to the Arctic Ocean, via both fluvial transport and coastal erosion. Eroded terrOC can be degraded during offshore transport or deposited across the wide East Siberian Arctic Shelf (ESAS). Most studies of terrOC on the ESAS have concentrated on solvent-extractable organic matter, but this represents only a small proportion of the total terrOC load. In this study we have used pyrolysis-gas chromatography-mass spectrometry (py-GCMS) to study all major groups of macromolecular components of the terrOC; this is the first time that this technique has been applied to the ESAS. It reveals a strong offshore trend from terrestrial phenols, aromatics and cyclopentenones to marine pyridines. There is good agreement between the proportion of phenols measured using py-GCMS and independent quantification of lignin phenol concentrations (r² = 0.67). We also present radiocarbon data for bulk OC (¹⁴C-OC) which, when coupled with previous measurements, allow us to produce the most comprehensive ¹⁴C-OC map of the ESAS to date. Combining the ¹⁴C-OC and py-GCMS data suggests that the aromatics group of compounds is likely sourced from old, aged terrOC, in contrast to the phenols group, which is likely sourced from modern woody material. We propose that an index of the relative proportions of phenols and pyridines can be used as a novel terrestrial vs. marine proxy for macromolecular organic matter. Principal component analysis found that various terrestrial vs. marine proxies show different patterns across the ESAS, and that multiple river-ocean transects of surface sediments transition from river-dominated to coastal-erosion-dominated to marine-dominated signatures.

  10. Size-related variation in arm damage frequency in the crown-of-thorns sea star, Acanthaster planci

    Directory of Open Access Journals (Sweden)

    Jairo Rivera-Posada

    2014-03-01

    Full Text Available Objective: To examine variation in the frequency of arm damage in different sizes of Acanthaster planci (A. planci), assess how this damage is inflicted by fish predators, and infer the potential role of predation in population regulation. Methods: Diameters of A. planci collected from three sites in the Philippines were measured, and arm damage frequency and severity were assessed. The frequency of arm damage was compared between size classes. Feeding behavior of fish predators was also observed in the laboratory. Results: This study demonstrates that sublethal predation by triggerfishes on A. planci results in extensive arm damage. Overall, 60% of A. planci sampled across all sites had sublethal injuries. The frequency of individuals with missing or regenerating arms was highest in medium-sized young adults (11-20 cm), which coincides with the phase in which A. planci shift from cryptic to exposed daytime feeding. Conclusions: The high incidence of arm damage in intermediate-sized sea stars indicates that predators exercise some level of regulation on A. planci populations at a local scale. Identification and protection of putative predators that target the most vulnerable life history stages of A. planci are essential for developing population control strategies and reversing sustained declines in coral cover.

  11. Towards a Fuzzy Bayesian Network Based Approach for Safety Risk Analysis of Tunnel-Induced Pipeline Damage.

    Science.gov (United States)

    Zhang, Limao; Wu, Xianguo; Qin, Yawei; Skibniewski, Miroslaw J; Liu, Wenli

    2016-02-01

    Tunneling excavation is bound to produce significant disturbances to surrounding environments, and the tunnel-induced damage to adjacent underground buried pipelines is of considerable importance for geotechnical practice. A fuzzy Bayesian network (FBN)-based approach for safety risk analysis is developed in this article with detailed step-by-step procedures, consisting of risk mechanism analysis, FBN model establishment, fuzzification, FBN-based inference, defuzzification, and decision making. In accordance with the failure mechanism analysis, a tunnel-induced pipeline damage model is proposed to reveal the cause-effect relationships between pipeline damage and its influential variables. For the fuzzification process, an expert confidence indicator is proposed to reflect the reliability of the data when determining the fuzzy probability of occurrence of basic events, taking into account both the expert's judgment ability and subjective reliability. By means of fuzzy Bayesian inference, the proposed approach is capable of calculating the probability distribution of potential safety risks and identifying the most likely causes of accidents under both prior knowledge and given evidence. A case concerning the safety analysis of underground buried pipelines adjacent to the construction of the Wuhan Yangtze River Tunnel is presented. The results demonstrate the feasibility of the proposed FBN approach and its application potential. The approach can be used as a decision tool to support safety assurance and management in tunnel construction, increasing the likelihood of a successful project in a complex project environment. © 2015 Society for Risk Analysis.
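The fuzzification-inference-defuzzification pipeline can be illustrated on a toy two-node network; the node names, triangular fuzzy numbers, and probabilities below are invented for illustration, and a real FBN would have many nodes with full conditional-probability tables.

```python
# Toy two-node net: D = pipeline damage, E = observed ground settlement.
# Triangular fuzzy probabilities (l, m, u), as elicited from experts,
# are defuzzified by a simple centroid before ordinary Bayesian updating.
# All node names and numbers here are invented for illustration.
def centroid(tfn):
    l, m, u = tfn
    return (l + m + u) / 3.0

p_D      = centroid((0.05, 0.10, 0.20))   # prior P(D=1)
p_E_D    = centroid((0.70, 0.80, 0.95))   # P(E=1 | D=1)
p_E_notD = centroid((0.05, 0.10, 0.20))   # P(E=1 | D=0)

# Posterior P(D=1 | E=1) by Bayes' rule
num = p_E_D * p_D
post = num / (num + p_E_notD * (1.0 - p_D))   # ≈ 0.48
```

Observing settlement roughly quadruples the damage probability here; in the full FBN the same update is propagated through every node given the evidence.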

  12. Macromolecular HPMA-based nanoparticles with cholesterol for solid-tumor targeting: detailed study of the inner structure of a highly efficient drug delivery system

    Czech Academy of Sciences Publication Activity Database

    Filippov, Sergey K.; Chytil, Petr; Konarev, P. V.; Dyakonova, M.; Papadakis, C. M.; Zhigunov, Alexander; Pleštil, Josef; Štěpánek, Petr; Etrych, Tomáš; Ulbrich, Karel; Svergun, D. I.

    2012-01-01

    Roč. 13, č. 8 (2012), s. 2594-2604 ISSN 1525-7797 R&D Projects: GA MŠk ME09059; GA AV ČR IAAX00500803; GA ČR GAP108/12/0640 Institutional research plan: CEZ:AV0Z40500505 Institutional support: RVO:61389013 Keywords : HPMA * cholesterol * SAXS Subject RIV: CD - Macromolecular Chemistry Impact factor: 5.371, year: 2012

  13. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  14. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    Science.gov (United States)

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  15. Deep Learning for Population Genetic Inference.

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.
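The likelihood-free idea, simulating under known parameters and then learning a regression from summary statistics back to parameters, can be sketched with a tiny network. The exponential "simulator", the three summary statistics, and the network size below are placeholders for the coalescent simulations and deep architecture used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in simulator: draws a sample under parameter theta and returns
# summary statistics (in the paper, a population-genetic simulator and
# hundreds of correlated summary statistics).
def simulate(theta, n=200):
    x = rng.exponential(scale=theta, size=n)
    return np.array([x.mean(), x.std(), np.median(x)])

thetas = rng.uniform(0.5, 2.0, size=2000)       # parameters to recover
X = np.stack([simulate(t) for t in thetas])     # summary statistics
y = thetas

# One-hidden-layer regression network, plain batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr, losses = 1e-2, []
for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)                    # hidden activations
    pred = (H @ W2 + b2).ravel()                # predicted theta
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    g = 2 * err[:, None] / len(y)               # dLoss/dpred
    gW2, gb2 = H.T @ g, g.sum(0)
    gH = (g @ W2.T) * (1 - H ** 2)              # backprop through tanh
    gW1, gb1 = X.T @ gH, gH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

Once trained, the network maps observed summary statistics directly to a parameter estimate, with no likelihood evaluation anywhere in the loop.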

  16. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  17. On Maximum Entropy and Inference

    Directory of Open Access Journals (Sweden)

    Luigi Gresele

    2017-11-01

    Full Text Available Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate a model from data that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is set not by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method by showing its ability to recover the correct model in a few prototype cases and discuss its application to a real dataset.

  18. Identification of transcriptional macromolecular associations in human bone using browser based in silico analysis in a giant correlation matrix.

    Science.gov (United States)

    Reppe, Sjur; Sachse, Daniel; Olstad, Ole K; Gautvik, Vigdis T; Sanderson, Paul; Datta, Harish K; Berg, Jens P; Gautvik, Kaare M

    2013-03-01

    Intracellular signaling is critically dependent on gene regulatory networks comprising physical molecular interactions. Presently, there is a lack of comprehensive databases for most human tissue types to verify such macromolecular interactions. We present a user friendly browser which helps to identify functional macromolecular interactions in human bone as significant correlations at the transcriptional level. The molecular skeletal phenotype has been characterized by transcriptome analysis of iliac crest bone biopsies from 84 postmenopausal women through quantifications of ~23,000 mRNA species. When the signal levels were inter-correlated, an array containing >260 million correlations was generated, thus recognizing the human bone interactome at the RNA level. The matrix correlation and p values were made easily accessible by a freely available online browser. We show that significant correlations within the giant matrix are reproduced in a replica set of 13 male vertebral biopsies. The identified correlations differ somewhat from transcriptional interactions identified in cell culture experiments and transgenic mice, thus demonstrating that care should be taken in extrapolating such results to the in vivo situation in human bone. The current giant matrix and web browser are a valuable tool for easy access to the human bone transcriptome and molecular interactions represented as significant correlations at the RNA-level. The browser and matrix should be a valuable hypothesis generating tool for identification of regulatory mechanisms and serve as a library of transcript relationships in human bone, a relatively inaccessible tissue. Copyright © 2012 Elsevier Inc. All rights reserved.
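The core of such a browser, an all-against-all transcript correlation matrix with significance flags, is straightforward to sketch. The toy expression matrix below is synthetic, and the |t| > 2 cutoff is a crude stand-in for proper p-values with multiple-testing control across the >260 million entries.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "biopsy" expression matrix: 84 samples x 6 transcripts,
# with transcripts 0 and 1 deliberately co-regulated.
n, g = 84, 6
E = rng.normal(size=(n, g))
E[:, 1] = 0.8 * E[:, 0] + 0.6 * rng.normal(size=n)

R = np.corrcoef(E, rowvar=False)                        # g x g correlation matrix
# t-statistic for H0: r = 0, with n - 2 degrees of freedom
T = R * np.sqrt((n - 2) / (1.0 - R ** 2 + np.eye(g)))   # eye: avoid 0/0 on diagonal
np.fill_diagonal(T, 0.0)
significant = np.abs(T) > 2.0                           # crude ~p < 0.05 threshold
```

Scaling the same computation to ~23,000 transcripts yields the giant matrix the browser queries.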

  19. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...

  20. Causal inference in economics and marketing.

    Science.gov (United States)

    Varian, Hal R

    2016-07-05

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual-a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.
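The counterfactual step can be sketched minimally: fit an outcome model on untreated units only, predict what the treated units would have done without treatment, and average the gap. The linear model and the synthetic data (with a known +2 effect) are assumptions for illustration, standing in for the richer machine-learning predictors the article discusses.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: outcome depends linearly on a covariate x;
# the true treatment effect (known here, for checking) is +2.
n = 1000
x = rng.normal(size=n)
treated = rng.random(n) < 0.5
y = 1.0 + 3.0 * x + 2.0 * treated + rng.normal(scale=0.5, size=n)

# Counterfactual model: fit on controls only, predict the untreated
# outcome each treated unit would have had, then average the gap.
Xc = np.column_stack([np.ones((~treated).sum()), x[~treated]])
beta, *_ = np.linalg.lstsq(Xc, y[~treated], rcond=None)
counterfactual = beta[0] + beta[1] * x[treated]
ate = float(np.mean(y[treated] - counterfactual))   # estimated treatment effect
```

Replacing the linear fit with any better predictor of the counterfactual is exactly where the machine-learning methods enter.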

  1. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
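One concrete statistical distance between probability distributions is the Hellinger distance, whose associated Bhattacharyya overlap plays the role of the quantum overlap |⟨ψ|φ⟩| noted above; the example distributions below are arbitrary.

```python
import numpy as np

def hellinger(p, q):
    # Hellinger distance between discrete probability distributions.
    # The related Bhattacharyya coefficient sum(sqrt(p*q)) is the classical
    # analogue of the quantum state overlap |<psi|phi>| mentioned above.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

p, q = [0.5, 0.5], [0.9, 0.1]
d = hellinger(p, q)
```

The distance is 0 for identical distributions and grows toward 1 as they become perfectly distinguishable, mirroring resolving power in inference.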

  2. Nitrogen limitation in natural populations of cyanobacteria (Spirulina and Oscillatoria spp.) and its effect on macromolecular synthesis

    International Nuclear Information System (INIS)

    van Rijn, J.; Shilo, M.

    1986-01-01

    Natural populations of the cyanobacteria Spirulina species and Oscillatoria species obtained from Israeli fish ponds were limited in growth by nitrogen availability in summer. Physiological indicators of nitrogen limitation, such as phycocyanin, chlorophyll a, and carbohydrate content, did not show clear evidence of nitrogen-limited growth, since these organisms are capable of vertical migration to and from the nitrogen-rich bottom. By means of ¹⁴C labeling of the cells under simulated pond conditions followed by cell fractionation into macromolecular compounds, it was found that carbohydrates synthesized at the lighted surface were partially utilized for dark protein synthesis at the bottom of these ponds

  3. About Small Streams and Shiny Rocks: Macromolecular Crystal Growth in Microfluidics

    Science.gov (United States)

    vanderWoerd, Mark; Ferree, Darren; Spearing, Scott; Monaco, Lisa; Molho, Josh; Spaid, Michael; Brasseur, Mike; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    We are developing a novel technique with which we have grown diffraction quality protein crystals in very small volumes, utilizing chip-based, microfluidic ("LabChip") technology. With this technology volumes smaller than achievable with any laboratory pipette can be dispensed with high accuracy. We have performed a feasibility study in which we crystallized several proteins with the aid of a LabChip device. The protein crystals are of excellent quality as shown by X-ray diffraction. The advantages of this new technology include improved accuracy of dispensing for small volumes, complete mixing of solution constituents without bubble formation, highly repeatable recipe and growth condition replication, and easy automation of the method. We have designed a first LabChip device specifically for protein crystallization in batch mode and can reliably dispense and mix from a range of solution constituents. We are currently testing this design. Upon completion additional crystallization techniques, such as vapor diffusion and liquid-liquid diffusion will be accommodated. Macromolecular crystallization using microfluidic technology is envisioned as a fully automated system, which will use the 'tele-science' concept of remote operation and will be developed into a research facility aboard the International Space Station.

  4. A new on-axis micro-spectrophotometer for combining Raman, fluorescence and UV/Vis absorption spectroscopy with macromolecular crystallography at the Swiss Light Source

    International Nuclear Information System (INIS)

    Pompidor, Guillaume; Dworkowski, Florian S. N.; Thominet, Vincent; Schulze-Briese, Clemens; Fuchs, Martin R.

    2013-01-01

    The new version MS2 of the in situ on-axis micro-spectrophotometer at the macromolecular crystallography beamline X10SA of the Swiss Light Source supports the concurrent acquisition of Raman, resonance Raman, fluorescence and UV/Vis absorption spectra along with diffraction data. The combination of X-ray diffraction experiments with optical methods such as Raman, UV/Vis absorption and fluorescence spectroscopy greatly enhances and complements the specificity of the obtained information. The upgraded version of the in situ on-axis micro-spectrophotometer, MS2, at the macromolecular crystallography beamline X10SA of the Swiss Light Source is presented. The instrument newly supports Raman and resonance Raman spectroscopy, in addition to the previously available UV/Vis absorption and fluorescence modes. With the recent upgrades of the spectral bandwidth, instrument stability, detection efficiency and control software, the application range of the instrument and its ease of operation were greatly improved. Its on-axis geometry with collinear X-ray and optical axes to ensure optimal control of the overlap of sample volumes probed by each technique is still unique amongst comparable facilities worldwide and the instrument has now been in general user operation for over two years

  5. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  6. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  7. Making inference from wildlife collision data: inferring predator absence from prey strikes

    Directory of Open Access Journals (Sweden)

    Peter Caley

    2017-02-01

    Full Text Available Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.

  8. Making inference from wildlife collision data: inferring predator absence from prey strikes.

    Science.gov (United States)

    Caley, Peter; Hosack, Geoffrey R; Barry, Simon C

    2017-01-01

    Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.
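Under a deliberately simplified Poisson model (an assumption; the paper's numerical-response machinery is more elaborate), the predictive probability of zero predator strikes given n prey strikes reduces to one line. The rate ratio rho below is chosen only so the result lands near the quoted 0.001.

```python
import math

# Assumed model: fox strikes accrue as a Poisson process with rate
# rho per observed lagomorph strike. rho = 0.46 is picked here purely
# so the predictive probability lands near the 0.001 quoted above.
def p_zero_strikes(rho, n_lagomorph):
    return math.exp(-rho * n_lagomorph)

p0 = p_zero_strikes(0.46, 15)   # probability of seeing no fox strikes
```

A small `p0` is exactly the evidence used to reject a widespread fox population consistent with the observed prey availability.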

  9. Causal inference in biology networks with integrated belief propagation.

    Science.gov (United States)

    Chang, Rui; Karr, Jonathan R; Schadt, Eric E

    2015-01-01

    Inferring causal relationships among molecular and higher-order phenotypes is a critical step in elucidating the complexity of living systems. Here we propose a novel method for inferring causality that is no longer constrained by the conditional dependency arguments that limit the ability of statistical causal inference methods to resolve causal relationships within sets of graphical models that are Markov equivalent. Our method utilizes Bayesian belief propagation to infer the responses of perturbation events on molecular traits given a hypothesized graph structure. A distance measure between the inferred response distribution and the observed data is defined to assess the 'fitness' of the hypothesized causal relationships. To test our algorithm, we infer causal relationships within equivalence classes of gene networks, in which the possible functional interactions are assumed to be nonlinear, given synthetic microarray and RNA sequencing data. We also apply our method to infer causality in a real metabolic network containing a v-structure and a feedback loop. We show that our method can recapitulate the causal structure and recover the feedback loop from steady-state data alone, which conventional methods cannot.

  10. Efficient Bayesian inference for ARFIMA processes

    Science.gov (United States)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
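A minimal frequentist sketch of LRD inference (not the paper's Bayesian ARFIMA machinery): simulate ARFIMA(0, d, 0) through its MA(∞) expansion, then recover the memory parameter d with the classic Geweke-Porter-Hudak log-periodogram regression. The series length, truncation, and d value are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate ARFIMA(0, d, 0) via its truncated MA(infinity) expansion:
# psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k.
d_true, N, trunc = 0.3, 4096, 4096
psi = np.empty(trunc)
psi[0] = 1.0
for k in range(1, trunc):
    psi[k] = psi[k - 1] * (k - 1 + d_true) / k
eps = rng.normal(size=N + trunc)
x = np.convolve(eps, psi, mode="valid")[:N]        # long-memory series

# GPH log-periodogram regression at the m lowest Fourier frequencies:
# the slope of log I(lam_j) on log(4 sin^2(lam_j / 2)) estimates -d.
m = int(N ** 0.5)
j = np.arange(1, m + 1)
lam = 2 * np.pi * j / N
I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * N)
slope = np.polyfit(np.log(4 * np.sin(lam / 2) ** 2), np.log(I), 1)[0]
d_hat = float(-slope)                              # memory-parameter estimate
```

The Bayesian treatment in the paper replaces this point estimate with a full posterior over d while integrating out short-memory nuisance parameters.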

  11. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    Science.gov (United States)

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.

  12. Damage pattern as a function of radiation quality and other factors.

    Science.gov (United States)

    Burkart, W; Jung, T; Frasch, G

    1999-01-01

    An understanding of damage pattern in critical cellular structures such as DNA is an important prerequisite for a mechanistic assessment of primary radiation damage, its possible repair, and the propagation of residual changes in somatic and germ cells as potential contributors to disease or ageing. Important quantitative insights have been made recently on the distribution in time and space of critical lesions from direct and indirect action of ionizing radiation on mammalian cells. When compared to damage from chemicals or from spontaneous degradation, e.g. depurination or base deamination in DNA, the potential of even low-LET radiation to create local hot spots of damage from single particle tracks is of utmost importance. This has important repercussions on inferences from critical biological effects at high dose and dose rate exposure situations to health risks at chronic, low-level exposures as experienced in environmental and controlled occupational settings. About 10,000 DNA lesions per human cell nucleus and day from spontaneous degradation and chemical attack cause no apparent effect, but a dose of 4 Gy translating into a similar number of direct and indirect DNA breaks induces acute lethality. Therefore, single lesions cannot explain the high efficiency of ionizing radiation in the induction of mutation, transformation and loss of proliferative capacity. Clustered damage leading to poorly repairable double-strand breaks or even more complex local DNA degradation, correlates better with fixed damage and critical biological endpoints. A comparison with other physical, chemical and biological agents indicates that ionizing radiation is indeed set apart from these by its unique micro- and nano-dosimetric traits. Only a few other agents such as bleomycin have a similar potential to cause complex damage from single events. However, in view of the multi-stage mechanism of carcinogenesis, it is still an open question whether dose-effect linearity for complex

  13. Metabolite Damage and Metabolite Damage Control in Plants

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, Andrew D. [Horticultural Sciences Department]; Henry, Christopher S. [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, Illinois 60439; Computation Institute, University of Chicago, Chicago, Illinois 60637]; Fiehn, Oliver [Genome Center, University of California, Davis, California 95616]; de Crécy-Lagard, Valérie [Microbiology and Cell Science Department, University of Florida, Gainesville, Florida 32611]

    2016-04-29

    It is increasingly clear that (a) many metabolites undergo spontaneous or enzyme-catalyzed side reactions in vivo, (b) the damaged metabolites formed by these reactions can be harmful, and (c) organisms have biochemical systems that limit the buildup of damaged metabolites. These damage-control systems either return a damaged molecule to its pristine state (metabolite repair) or convert harmful molecules to harmless ones (damage preemption). Because all organisms share a core set of metabolites that suffer the same chemical and enzymatic damage reactions, certain damage-control systems are widely conserved across the kingdoms of life. Relatively few damage reactions and damage-control systems are well known. Uncovering new damage reactions and identifying the corresponding damaged metabolites, damage-control genes, and enzymes demands a coordinated mix of chemistry, metabolomics, cheminformatics, biochemistry, and comparative genomics. This review illustrates the above points using examples from plants, which are at least as prone to metabolite damage as other organisms.

  14. Deep Learning for Population Genetic Inference.

    Directory of Open Access Journals (Sweden)

    Sara Sheehan

    2016-03-01

    Full Text Available Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  15. Deep Learning for Population Genetic Inference

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908
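The core mechanism of this likelihood-free framework — train a neural network to map simulated summary statistics to the parameters that generated them, then apply it to observed data — can be sketched at toy scale. The task below (inferring a Gaussian scale parameter from three hand-picked summaries with a one-hidden-layer numpy network) is a hypothetical miniature, not the paper's architecture or its population-genetic statistics:

```python
import numpy as np

rng = np.random.default_rng(2)

def summaries(x):
    # Hand-chosen summary statistics standing in for the hundreds used in practice.
    return np.array([x.mean(), x.std(), np.abs(x).mean()])

# Simulate training pairs (summaries, parameter) from the generative model.
thetas = rng.uniform(0.5, 2.0, 4000)
S = np.array([summaries(rng.normal(0, t, 50)) for t in thetas])

# One-hidden-layer network trained by full-batch gradient descent: a minimal
# stand-in for the deeper architectures used in the paper.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, 16);      b2 = 0.0
lr = 0.01
for epoch in range(1000):
    h = np.tanh(S @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - thetas                              # squared-error gradient
    gW2 = h.T @ err / len(S); gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h ** 2)            # backprop through tanh
    gW1 = S.T @ gh / len(S); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# "Observed" data with known sigma = 1.2; the net acts as a likelihood-free estimator.
x_obs = rng.normal(0, 1.2, 50)
theta_hat = np.tanh(summaries(x_obs) @ W1 + b1) @ W2 + b2
print(f"estimated sigma: {theta_hat:.2f}")
```

No likelihood is ever evaluated: the simulator plus the regression network replace it, which is what makes the approach viable when the likelihood of the full model is infeasible.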

  16. A Bayesian Network Schema for Lessening Database Inference

    National Research Council Canada - National Science Library

    Chang, LiWu; Moskowitz, Ira S

    2001-01-01

    .... The authors introduce a formal schema for database inference analysis, based upon a Bayesian network structure, which identifies critical parameters involved in the inference problem and represents...

  17. Type Inference for Session Types in the Pi-Calculus

    DEFF Research Database (Denmark)

    Graversen, Eva Fajstrup; Harbo, Jacob Buchreitz; Huttel, Hans

    2014-01-01

    In this paper we present a direct algorithm for session type inference for the π-calculus. Type inference for session types has previously been achieved by either imposing limitations and restriction on the π-calculus, or by reducing the type inference problem to that for linear types. Our approach...

  18. Explanatory Preferences Shape Learning and Inference.

    Science.gov (United States)

    Lombrozo, Tania

    2016-10-01

    Explanations play an important role in learning and inference. People often learn by seeking explanations, and they assess the viability of hypotheses by considering how well they explain the data. An emerging body of work reveals that both children and adults have strong and systematic intuitions about what constitutes a good explanation, and that these explanatory preferences have a systematic impact on explanation-based processes. In particular, people favor explanations that are simple and broad, with the consequence that engaging in explanation can shape learning and inference by leading people to seek patterns and favor hypotheses that support broad and simple explanations. Given the prevalence of explanation in everyday cognition, understanding explanation is therefore crucial to understanding learning and inference. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Grammatical inference algorithms, routines and applications

    CERN Document Server

    Wieczorek, Wojciech

    2017-01-01

    This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.

  20. BagReg: Protein inference through machine learning.

    Science.gov (United States)

    Zhao, Can; Liu, Dao; Teng, Ben; He, Zengyou

    2015-08-01

    Protein inference from the identified peptides is of primary importance in shotgun proteomics. The target of protein inference is to identify whether each candidate protein is truly present in the sample. To date, many computational methods have been proposed to solve this problem. However, there is still no method that can fully utilize the information hidden in the input data. In this article, we propose a learning-based method named BagReg for protein inference. The method first extracts five features from the input data, and then chooses each feature as the class feature to separately build models to predict the presence probabilities of proteins. Finally, the weak results from five prediction models are aggregated to obtain the final result. We test our method on six publicly available data sets. The experimental results show that our method is superior to the state-of-the-art protein inference algorithms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Ensemble stacking mitigates biases in inference of synaptic connectivity.

    Science.gov (United States)

    Chambers, Brendan; Levy, Maayan; Dechery, Joseph B; MacLean, Jason N

    2018-01-01

    A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches.
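The stacking step described above — combine the scores of several imperfect inference methods as a learned linear combination — can be sketched without a full spiking simulation. The two score vectors below are synthetic stand-ins (truth plus differently biased noise) for the correlation- and information-based connectivity measures; the stacking itself is plain least squares on a training split:

```python
import numpy as np

rng = np.random.default_rng(3)

# Ground-truth connectivity for 2000 candidate edges (hypothetical toy setup).
truth = (rng.random(2000) < 0.2).astype(float)

# Two imperfect inference methods: each sees the truth through different noise
# and a different bias, standing in for distinct spike-timing-based scores.
s1 = truth + rng.normal(0.0, 1.0, 2000) + 0.3 * rng.random(2000)
s2 = truth + rng.normal(0.0, 1.0, 2000) - 0.2 * rng.random(2000)

# Stack on a training split: learn linear combination weights by least squares.
train, test = np.arange(1000), np.arange(1000, 2000)
X = np.column_stack([np.ones(2000), s1, s2])
w, *_ = np.linalg.lstsq(X[train], truth[train], rcond=None)
ens = X[test] @ w

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print("method 1:", round(corr(s1[test], truth[test]), 3))
print("method 2:", round(corr(s2[test], truth[test]), 3))
print("ensemble:", round(corr(ens, truth[test]), 3))
```

Because the two methods' errors are only partially shared, the weighted combination tracks the ground truth more closely than either score alone, which is the variance-reduction argument behind ensemble stacking.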

  2. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  3. Russell and Humean Inferences

    Directory of Open Access Journals (Sweden)

    João Paulo Monteiro

    2001-12-01

    Full Text Available Russell's The Problems of Philosophy tries to establish a new theory of induction, at the same time that Hume is there accused of an "irrational scepticism about induction". But a careful analysis of the theory of knowledge explicitly acknowledged by Hume reveals that, contrary to the standard interpretation in the XXth century, possibly influenced by Russell, Hume deals exclusively with causal inference (which he never classifies as "causal induction", although now we are entitled to do so), never with inductive inference in general, mainly generalizations about sensible qualities of objects (whether, e.g., "all crows are black" or not is not among Hume's concerns). Russell's theories are thus only false alternatives to Hume's, in his (1912) or in his (1948).

  4. Efficient algorithms for conditional independence inference

    Czech Academy of Sciences Publication Activity Database

    Bouckaert, R.; Hemmecke, R.; Lindner, S.; Studený, Milan

    2010-01-01

    Roč. 11, č. 1 (2010), s. 3453-3479 ISSN 1532-4435 R&D Projects: GA ČR GA201/08/0539; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional independence inference * linear programming approach Subject RIV: BA - General Mathematics Impact factor: 2.949, year: 2010 http://library.utia.cas.cz/separaty/2010/MTR/studeny-efficient algorithms for conditional independence inference.pdf

  5. State-Space Inference and Learning with Gaussian Processes

    OpenAIRE

    Turner, R; Deisenroth, MP; Rasmussen, CE

    2010-01-01

    State-space inference and learning with Gaussian processes (GPs) is an unsolved problem. We propose a new, general methodology for inference and learning in nonlinear state-space models that are described probabilistically by non-parametric GP models. We apply the expectation maximization algorithm to iterate between inference in the latent state-space and learning the parameters of the underlying GP dynamics model. C...

  6. Enhancing Transparency and Control When Drawing Data-Driven Inferences About Individuals.

    Science.gov (United States)

    Chen, Daizhuo; Fraiberger, Samuel P; Moakler, Robert; Provost, Foster

    2017-09-01

    Recent studies show the remarkable power of fine-grained information disclosed by users on social network sites to infer users' personal characteristics via predictive modeling. Similar fine-grained data are being used successfully in other commercial applications. In response, attention is turning increasingly to the transparency that organizations provide to users as to what inferences are drawn and why, as well as to what sort of control users can be given over inferences that are drawn about them. In this article, we focus on inferences about personal characteristics based on information disclosed by users' online actions. As a use case, we explore personal inferences that are made possible from "Likes" on Facebook. We first present a means for providing transparency into the information responsible for inferences drawn by data-driven models. We then introduce the "cloaking device"-a mechanism for users to inhibit the use of particular pieces of information in inference. Using these analytical tools we ask two main questions: (1) How much information must users cloak to significantly affect inferences about their personal traits? We find that usually users must cloak only a small portion of their actions to inhibit inference. We also find that, encouragingly, false-positive inferences are significantly easier to cloak than true-positive inferences. (2) Can firms change their modeling behavior to make cloaking more difficult? The answer is a definitive yes. We demonstrate a simple modeling change that requires users to cloak substantially more information to affect the inferences drawn. The upshot is that organizations can provide transparency and control even into complicated, predictive model-driven inferences, but they also can make control easier or harder for their users.
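The "cloaking device" idea — let a user hide the specific actions that drive an inference about them — can be sketched for the simplest model family. The linear scorer over binary "Like" features, the weights, and the greedy hide-the-largest-contributor loop below are all hypothetical illustrations, not the authors' Facebook models:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical linear trait model: weights over 200 binary "Like" features.
w = rng.normal(0, 1, 200)
b = -2.0

# One user's action vector, skewed toward trait-predictive Likes so the
# model initially draws a positive inference.
likes = ((w > 0.8) | (rng.random(200) < 0.1)).astype(float)

def predicted(likes):
    return likes @ w + b > 0          # positive inference about the trait

# Cloak greedily: hide the active Likes that push the score up the most,
# stopping as soon as the inference is no longer drawn.
cloaked = likes.copy()
n_cloaked = 0
for j in np.argsort(-w):              # largest positive weights first
    if not predicted(cloaked):
        break
    if cloaked[j] == 1 and w[j] > 0:
        cloaked[j] = 0
        n_cloaked += 1

print("inference before cloaking:", predicted(likes))
print("inference after cloaking :", predicted(cloaked), "| Likes hidden:", n_cloaked)
```

The count of hidden Likes is the transparency quantity the article studies: how much of a user's history must be cloaked before the model-driven inference changes, and how a firm's modeling choices move that number up or down.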

  7. Fused Regression for Multi-source Gene Regulatory Network Inference.

    Directory of Open Access Journals (Sweden)

    Kari Y Lam

    2016-12-01

    Full Text Available Understanding gene regulatory networks is critical to understanding cellular differentiation and response to external stimuli. Methods for global network inference have been developed and applied to a variety of species. Most approaches consider the problem of network inference independently in each species, despite evidence that gene regulation can be conserved even in distantly related species. Further, network inference is often confined to single data-types (single platforms and single cell types). We introduce a method for multi-source network inference that allows simultaneous estimation of gene regulatory networks in multiple species or biological processes through the introduction of priors based on known gene relationships, such as orthology, incorporated using fused regression. This approach improves network inference performance even when orthology mapping and conservation are incomplete. We refine this method by presenting an algorithm that extracts the true conserved subnetwork from a larger set of potentially conserved interactions and demonstrate the utility of our method in cross species network inference. Last, we demonstrate our method's utility in learning from data collected on different experimental platforms.

  8. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    International Nuclear Information System (INIS)

    Schneider, Michael D.; Dawson, William A.; Hogg, David W.; Marshall, Philip J.; Bard, Deborah J.; Meyers, Joshua; Lang, Dustin

    2015-01-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.

  9. Inverse Ising inference with correlated samples

    International Nuclear Information System (INIS)

    Obermayer, Benedikt; Levine, Erel

    2014-01-01

    Correlations between two variables of a high-dimensional system can be indicative of an underlying interaction, but can also result from indirect effects. Inverse Ising inference is a method to distinguish one from the other. Essentially, the parameters of the least constrained statistical model are learned from the observed correlations such that direct interactions can be separated from indirect correlations. Among many other applications, this approach has been helpful for protein structure prediction, because residues which interact in the 3D structure often show correlated substitutions in a multiple sequence alignment. In this context, samples used for inference are not independent but share an evolutionary history on a phylogenetic tree. Here, we discuss the effects of correlations between samples on global inference. Such correlations could arise due to phylogeny but also via other slow dynamical processes. We present a simple analytical model to address the resulting inference biases, and develop an exact method accounting for background correlations in alignment data by combining phylogenetic modeling with an adaptive cluster expansion algorithm. We find that popular reweighting schemes are only marginally effective at removing phylogenetic bias, suggest a rescaling strategy that yields better results, and provide evidence that our conclusions carry over to the frequently used mean-field approach to the inverse Ising problem. (paper)
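The separation of direct interactions from indirect correlations that the abstract describes can be shown on the smallest interesting case: a three-spin Ising chain, where spins 1 and 3 are correlated only through spin 2. The sketch below uses the standard naive mean-field inversion J ≈ -C⁻¹ on independent samples; it illustrates the baseline method only, not the paper's correction for phylogenetically correlated samples, and the inferred coupling magnitudes carry the usual mean-field bias:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(5)

# Chain Ising model: bonds 1-2 and 2-3 only; 1-3 is correlated but not coupled.
J = np.zeros((3, 3))
J[0, 1] = J[1, 0] = 1.0
J[1, 2] = J[2, 1] = 1.0

# Exact Boltzmann distribution over the 8 spin configurations, then sampling.
states = np.array(list(product([-1, 1], repeat=3)), dtype=float)
energy = -0.5 * np.einsum('si,ij,sj->s', states, J, states)
p = np.exp(-energy); p /= p.sum()
X = states[rng.choice(8, size=20000, p=p)]

C = np.cov(X.T)
print("raw correlation 1-3 :", round(C[0, 2], 2))   # sizeable, though indirect
J_mf = -np.linalg.inv(C)                            # mean-field inverse Ising
print("inferred J for 1-2  :", round(J_mf[0, 1], 2))
print("inferred J for 1-3  :", round(J_mf[0, 2], 2))  # near zero: no direct bond
```

The inversion zeroes out the indirect 1-3 entry because conditional independence of the chain makes the inverse covariance tridiagonal; the phylogenetic corrections in the paper address the further problem that the samples themselves are not independent.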

  10. Bayesian structural inference for hidden processes

    Science.gov (United States)

    Strelioff, Christopher C.; Crutchfield, James P.

    2014-04-01

    We introduce a Bayesian approach to discovering patterns in structurally complex processes. The proposed method of Bayesian structural inference (BSI) relies on a set of candidate unifilar hidden Markov model (uHMM) topologies for inference of process structure from a data series. We employ a recently developed exact enumeration of topological ɛ-machines. (A sequel then removes the topological restriction.) This subset of the uHMM topologies has the added benefit that inferred models are guaranteed to be ɛ-machines, irrespective of estimated transition probabilities. Properties of ɛ-machines and uHMMs allow for the derivation of analytic expressions for estimating transition probabilities, inferring start states, and comparing the posterior probability of candidate model topologies, despite process internal structure being only indirectly present in data. We demonstrate BSI's effectiveness in estimating a process's randomness, as reflected by the Shannon entropy rate, and its structure, as quantified by the statistical complexity. We also compare using the posterior distribution over candidate models and the single, maximum a posteriori model for point estimation and show that the former more accurately reflects uncertainty in estimated values. We apply BSI to in-class examples of finite- and infinite-order Markov processes, as well as to an out-of-class, infinite-state hidden process.
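Comparing the posterior probability of candidate model topologies, as BSI does for uHMMs, reduces in the simplest case to comparing closed-form marginal likelihoods. The sketch below contrasts two candidate topologies for a binary series — an i.i.d. coin versus a first-order Markov chain — with conjugate Beta(1,1) priors; it is a two-topology miniature of the idea, not the ɛ-machine enumeration the paper uses:

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(7)

# Data from a two-state process with strong memory (hypothetical example).
n = 300
x = np.zeros(n, dtype=int)
for t in range(1, n):
    p1 = 0.9 if x[t - 1] == 0 else 0.2
    x[t] = rng.random() < p1

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

# Topology A: i.i.d. coin. Marginal likelihood with a Beta(1,1) prior on the bias.
n1 = x.sum(); n0 = n - n1
logZ_A = log_beta(n1 + 1, n0 + 1) - log_beta(1, 1)

# Topology B: first-order Markov chain, Beta(1,1) prior on each transition row.
logZ_B = np.log(0.5)                  # uniform prior over the initial state
for s in (0, 1):
    trans = x[1:][x[:-1] == s]        # symbols emitted after state s
    logZ_B += log_beta(trans.sum() + 1, len(trans) - trans.sum() + 1) - log_beta(1, 1)

print("log evidence, i.i.d. topology :", round(float(logZ_A), 1))
print("log evidence, Markov topology:", round(float(logZ_B), 1))
```

With a flat prior over the two topologies, the posterior odds equal the evidence ratio, so the memoryful topology wins decisively here; BSI performs the analogous comparison over an enumerated family of uHMM topologies.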

  11. The Impact of Disablers on Predictive Inference

    Science.gov (United States)

    Cummins, Denise Dellarosa

    2014-01-01

    People consider alternative causes when deciding whether a cause is responsible for an effect (diagnostic inference) but appear to neglect them when deciding whether an effect will occur (predictive inference). Five experiments were conducted to test a 2-part explanation of this phenomenon: namely, (a) that people interpret standard predictive…

  12. Automatic physical inference with information maximizing neural networks

    Science.gov (United States)

    Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-04-01

    Compressing large data sets to a manageable number of summaries that are informative about the underlying parameters vastly simplifies both frequentist and Bayesian inference. When only simulations are available, these summaries are typically chosen heuristically, so they may inadvertently miss important information. We introduce a simulation-based machine learning technique that trains artificial neural networks to find nonlinear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). In test cases where the posterior can be derived exactly, likelihood-free inference based on automatically derived IMNN summaries produces nearly exact posteriors, showing that these summaries are good approximations to sufficient statistics. In a series of numerical examples of increasing complexity and astrophysical relevance we show that IMNNs are robustly capable of automatically finding optimal, nonlinear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima. We anticipate that the automatic physical inference method described in this paper will be essential to obtain both accurate and precise cosmological parameter estimates from complex and large astronomical data sets, including those from LSST and Euclid.

  13. Complex Macromolecular Architectures by Living Cationic Polymerization

    KAUST Repository

    Alghamdi, Reem D.

    2015-05-01

    Poly (vinyl ether)-based graft polymers have been synthesized by the combination of living cationic polymerization of vinyl ethers with other living or controlled/ living polymerization techniques (anionic and ATRP). The process involves the synthesis of well-defined homopolymers (PnBVE) and co/terpolymers [PnBVE-b-PCEVE-b-PSiDEGVE (ABC type) and PSiDEGVE-b-PnBVE-b-PSiDEGVE (CAC type)] by sequential living cationic polymerization of n-butyl vinyl ether (nBVE), 2-chloroethyl vinyl ether (CEVE) and tert-butyldimethylsilyl ethylene glycol vinyl ether (SiDEGVE), using mono-functional {[n-butoxyethyl acetate (nBEA)], [1-(2-chloroethoxy) ethyl acetate (CEEA)], [1-(2-(2-(t-butyldimethylsilyloxy)ethoxy) ethoxy) ethyl acetate (SiDEGEA)]} or di-functional [1,4-cyclohexanedimethanol di(1-ethyl acetate) (cHMDEA), (VEMOA)] initiators. The living cationic polymerizations of those monomers were conducted in hexane at -20 °C using Et3Al2Cl3 (catalyst) in the presence of 1 M AcOEt base.[1] The PCEVE segments of the synthesized block terpolymers were then used to react with living macroanions (PS-DPE-Li; poly styrene diphenyl ethylene lithium) to afford graft polymers. The quantitative desilylation of PSiDEGVE segments by n-Bu4N+F- in THF at 0 °C led to graft co- and terpolymers in which the polyalcohol is the outer block. These co-/terpolymers were subsequently subjected to “grafting-from” reactions by atom transfer radical polymerization (ATRP) of styrene to afford more complex macromolecular architectures. The base assisted living cationic polymerization of vinyl ethers were also used to synthesize well-defined α-hydroxyl polyvinylether (PnBVE-OH). The resulting polymers were then modified into an ATRP macro-initiator for the synthesis of well-defined block copolymers (PnBVE-b-PS). Bifunctional PnBVE with terminal malonate groups was also synthesized and used as a precursor for more complex architectures such as H-shaped block copolymer by “grafting-from” or

  14. Accurate macromolecular crystallographic refinement: incorporation of the linear scaling, semiempirical quantum-mechanics program DivCon into the PHENIX refinement package

    Energy Technology Data Exchange (ETDEWEB)

    Borbulevych, Oleg Y.; Plumley, Joshua A.; Martin, Roger I. [QuantumBio Inc., 2790 West College Avenue, State College, PA 16801 (United States); Merz, Kenneth M. Jr [University of Florida, Gainesville, Florida (United States); Westerhoff, Lance M., E-mail: lance@quantumbioinc.com [QuantumBio Inc., 2790 West College Avenue, State College, PA 16801 (United States)

    2014-05-01

    Semiempirical quantum-chemical X-ray macromolecular refinement using the program DivCon integrated with PHENIX is described. Macromolecular crystallographic refinement relies on sometimes dubious stereochemical restraints and rudimentary energy functionals to ensure the correct geometry of the model of the macromolecule and any covalently bound ligand(s). The ligand stereochemical restraint file (CIF) requires a priori understanding of the ligand geometry within the active site, and creation of the CIF is often an error-prone process owing to the great variety of potential ligand chemistry and structure. Stereochemical restraints have been replaced with more robust functionals through the integration of the linear-scaling, semiempirical quantum-mechanics (SE-QM) program DivCon with the PHENIX X-ray refinement engine. The PHENIX/DivCon package has been thoroughly validated on a population of 50 protein–ligand Protein Data Bank (PDB) structures with a range of resolutions and chemistry. The PDB structures used for the validation were originally refined utilizing various refinement packages and were published within the past five years. PHENIX/DivCon does not utilize CIF(s), link restraints and other parameters for refinement and hence it does not make as many a priori assumptions about the model. Across the entire population, the method results in reasonable ligand geometries and low ligand strains, even when the original refinement exhibited difficulties, indicating that PHENIX/DivCon is applicable to both single-structure and high-throughput crystallography.

  15. Inference as Prediction

    Science.gov (United States)

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  16. Problem solving and inference mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, K; Nakajima, R; Yonezawa, A; Goto, S; Aoyama, A

    1982-01-01

    The heart of the fifth generation computer will be powerful mechanisms for problem solving and inference. A deduction-oriented language is to be designed, which will form the core of the whole computing system. The language is based on predicate logic with the extended features of structuring facilities, meta structures and relational data base interfaces. Parallel computation mechanisms and specialized hardware architectures are being investigated to make possible efficient realization of the language features. The project includes research into an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core. 30 references.

  17. Agglomerative concentric hypersphere clustering applied to structural damage detection

    Science.gov (United States)

    Silva, Moisés; Santos, Adam; Santos, Reginaldo; Figueiredo, Eloi; Sales, Claudomiro; Costa, João C. W. A.

    2017-08-01

The present paper proposes a novel cluster-based method, named agglomerative concentric hypersphere (ACH), to detect structural damage in engineering structures. Continuous structural monitoring systems often require unsupervised approaches to automatically infer the health condition of a structure. However, when a structure is under linear and nonlinear effects caused by environmental and operational variability, data normalization procedures are also required to overcome these effects. The proposed approach aims, through a straightforward clustering procedure, to discover automatically the optimal number of clusters, representing the main state conditions of a structural system. Three initialization procedures are introduced to evaluate the impact of deterministic and stochastic initializations on the performance of this approach. The ACH is compared to state-of-the-art approaches, based on Gaussian mixture models and Mahalanobis squared distance, on standard data sets from a post-tensioned bridge located in Switzerland: the Z-24 Bridge. The proposed approach is more efficient in modeling the normal condition of the structure and its corresponding main clusters. Furthermore, it achieves a better classification performance than the alternatives in terms of false-positive and false-negative indications of damage, demonstrating a promising applicability in real-world structural health monitoring scenarios.
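The Mahalanobis squared distance baseline that ACH is compared against reduces, in one dimension, to an outlier test on damage-sensitive features: fit the normal condition, then flag observations whose distance exceeds a threshold. A minimal sketch with synthetic data (features and threshold are illustrative, not from the paper):

```python
import random

def fit_baseline(samples):
    """Mean and variance of 1-D damage-sensitive features recorded
    under the undamaged (normal) condition."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / (n - 1)
    return mu, var

def msd(x, mu, var):
    """Squared Mahalanobis distance of a new observation (1-D case)."""
    return (x - mu) ** 2 / var

random.seed(0)
normal = [random.gauss(0.0, 1.0) for _ in range(500)]  # synthetic baseline
mu, var = fit_baseline(normal)
threshold = 9.0  # 3-sigma equivalent for a single feature

flags = [msd(x, mu, var) > threshold for x in (0.2, -1.1, 6.0)]
```

A clustering method such as ACH generalizes this single-Gaussian baseline by modeling several normal-condition clusters before flagging outliers.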

  18. Elements of Causal Inference: Foundations and Learning Algorithms

    DEFF Research Database (Denmark)

    Peters, Jonas Martin; Janzing, Dominik; Schölkopf, Bernhard

A concise and self-contained introduction to causal inference, increasingly important in data science and machine learning.

  19. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  20. Causal inference in econometrics

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak

    2016-01-01

    This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.

  1. Assessment of network inference methods: how to cope with an underdetermined problem.

    Directory of Open Access Journals (Sweden)

    Caroline Siegenthaler

Full Text Available The inference of biological networks is an active research area in the field of systems biology. The number of network inference algorithms has grown tremendously in the last decade, underlining the importance of a fair assessment and comparison among these methods. Current assessments of the performance of an inference method typically involve the application of the algorithm to benchmark datasets and the comparison of the network predictions against the gold standard or reference networks. While the network inference problem is often deemed underdetermined, implying that the inference problem does not have a (unique) solution, the consequences of such an attribute have not been rigorously taken into consideration. Here, we propose a new procedure for assessing the performance of gene regulatory network (GRN) inference methods. The procedure takes into account the underdetermined nature of the inference problem, in which gene regulatory interactions that are inferable or non-inferable are determined based on causal inference. The assessment relies on a new definition of the confusion matrix, which excludes errors associated with non-inferable gene regulations. For demonstration purposes, the proposed assessment procedure is applied to the DREAM 4 In Silico Network Challenge. The results show a marked change in the ranking of participating methods when taking network inferability into account.
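The modified confusion matrix can be illustrated with a toy example: edges judged non-inferable by causal analysis are excluded from the error counts before computing precision and recall. The gene sets and edges below are hypothetical:

```python
def assess(predicted, reference, inferable, candidates):
    """Precision/recall from a confusion matrix that excludes errors on
    non-inferable regulatory interactions (toy rendering of the
    proposed assessment)."""
    tp = fp = fn = 0
    for edge in candidates:
        if edge not in inferable:
            continue  # exclude non-inferable edges from the counts
        in_ref, in_pred = edge in reference, edge in predicted
        if in_pred and in_ref:
            tp += 1
        elif in_pred:
            fp += 1
        elif in_ref:
            fn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

genes = ["A", "B", "C"]
candidates = {(s, t) for s in genes for t in genes if s != t}
reference = {("A", "B"), ("B", "C"), ("C", "A")}   # gold standard
predicted = {("A", "B"), ("A", "C")}               # algorithm output
inferable = candidates - {("C", "A")}  # suppose C->A is non-inferable
prec, rec = assess(predicted, reference, inferable, candidates)
```

Without the exclusion, the missed C->A edge would count as a false negative and recall would drop from 1/2 to 1/3, which is exactly the kind of ranking distortion the procedure is designed to remove.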

  2. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  3. Fuzzy logic controller using different inference methods

    International Nuclear Information System (INIS)

    Liu, Z.; De Keyser, R.

    1994-01-01

In this paper the design of fuzzy controllers by using different inference methods is introduced. Configuration of the fuzzy controllers includes a general rule-base which is a collection of fuzzy PI or PD rules, the triangular fuzzy data model and a centre of gravity defuzzification algorithm. The generalized modus ponens (GMP) is used with the minimum operator of the triangular norm. Under the sup-min inference rule, six fuzzy implication operators are employed to calculate the fuzzy look-up tables for each rule base. The performance is tested in simulated systems with MATLAB/SIMULINK. Results show the effects of using fuzzy controllers with different inference methods when applied to different test processes.
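A minimal sketch of the ingredients named above: triangular membership functions, rule activation via the minimum operator (as in sup-min inference), and centre-of-gravity defuzzification. The rule base and singleton outputs are illustrative, not the paper's look-up tables:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

NEG, ZERO, POS = (-2, -1, 0), (-1, 0, 1), (0, 1, 2)

def fuzzy_pd(error, derror):
    """Tiny fuzzy PD controller: rule activation by the minimum operator
    (as in sup-min inference), centre-of-gravity defuzzification over
    singleton rule outputs. Rules and scaling are illustrative."""
    rules = [  # ((error set, derror set), output singleton)
        ((NEG, NEG), -1.0), ((NEG, ZERO), -0.5), ((ZERO, NEG), -0.5),
        ((ZERO, ZERO), 0.0),
        ((POS, ZERO), 0.5), ((ZERO, POS), 0.5), ((POS, POS), 1.0),
    ]
    acts = [(min(tri(error, *e), tri(derror, *d)), u) for (e, d), u in rules]
    den = sum(w for w, _ in acts)
    return sum(w * u for w, u in acts) / den if den else 0.0
```

The six implication operators studied in the paper would replace the simple min activation here; the overall rule-base-to-crisp-output pipeline is the same.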

  4. An algebra-based method for inferring gene regulatory networks.

    Science.gov (United States)

    Vera-Licona, Paola; Jarrah, Abdul; Garcia-Puente, Luis David; McGee, John; Laubenbacher, Reinhard

    2014-03-26

    The inference of gene regulatory networks (GRNs) from experimental observations is at the heart of systems biology. This includes the inference of both the network topology and its dynamics. While there are many algorithms available to infer the network topology from experimental data, less emphasis has been placed on methods that infer network dynamics. Furthermore, since the network inference problem is typically underdetermined, it is essential to have the option of incorporating into the inference process, prior knowledge about the network, along with an effective description of the search space of dynamic models. Finally, it is also important to have an understanding of how a given inference method is affected by experimental and other noise in the data used. This paper contains a novel inference algorithm using the algebraic framework of Boolean polynomial dynamical systems (BPDS), meeting all these requirements. The algorithm takes as input time series data, including those from network perturbations, such as knock-out mutant strains and RNAi experiments. It allows for the incorporation of prior biological knowledge while being robust to significant levels of noise in the data used for inference. It uses an evolutionary algorithm for local optimization with an encoding of the mathematical models as BPDS. The BPDS framework allows an effective representation of the search space for algebraic dynamic models that improves computational performance. The algorithm is validated with both simulated and experimental microarray expression profile data. Robustness to noise is tested using a published mathematical model of the segment polarity gene network in Drosophila melanogaster. 
Benchmarking of the algorithm is done by comparison with a spectrum of state-of-the-art network inference methods on data from the synthetic IRMA network to demonstrate that our method has good precision and recall for the network reconstruction task, while also predicting several of the
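The input/output contract of such an algorithm can be sketched by brute force: score every Boolean truth table for a gene against the observed state transitions, which the evolutionary BPDS search does far more efficiently over algebraic models. Everything below is a hypothetical toy, not the published algorithm:

```python
from itertools import product

def infer_rule(series, target, regulators):
    """Exhaustively score Boolean update rules (truth tables) for one
    gene against time-series data; a brute-force stand-in for the
    evolutionary BPDS search described in the abstract."""
    k = len(regulators)
    best_rule, best_err = None, None
    for table in product([0, 1], repeat=2 ** k):
        err = 0
        for t in range(len(series) - 1):
            idx = 0
            for g in regulators:
                idx = (idx << 1) | series[t][g]
            err += table[idx] != series[t + 1][target]
        if best_err is None or err < best_err:
            best_rule, best_err = table, err
    return best_rule, best_err

# toy time series over two genes: gene 1 copies gene 0 one step later
series = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
rule, err = infer_rule(series, target=1, regulators=[0])
```

The error count here is also where noise tolerance enters: a rule can be accepted even when a few transitions disagree with it.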

  5. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
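Phi-divergence test statistics generalize the classical ones; for instance, the Cressie-Read power-divergence family recovers Pearson's chi-square at lambda = 1 and the likelihood-ratio G statistic in the limit lambda -> 0. A sketch with toy counts:

```python
import math

def power_divergence(observed, expected, lam):
    """Cressie-Read power-divergence statistic, a phi-divergence family:
    lam = 1 gives Pearson's chi-square, lam -> 0 the likelihood-ratio
    G statistic."""
    if abs(lam) < 1e-12:  # limiting case: G statistic
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected) if o > 0)
    s = sum(o * ((o / e) ** lam - 1.0)
            for o, e in zip(observed, expected))
    return 2.0 * s / (lam * (lam + 1.0))

obs = [30, 14, 34, 45, 27]   # toy contingency counts
exp = [30.0] * 5             # uniform null expectation
pearson = power_divergence(obs, exp, 1.0)
g_stat = power_divergence(obs, exp, 0.0)
```

Both statistics are referred to the same chi-square null distribution; the choice of lambda trades off sensitivity to different departure patterns.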

  6. Active inference, sensory attenuation and illusions.

    Science.gov (United States)

    Brown, Harriet; Adams, Rick A; Parees, Isabel; Edwards, Mark; Friston, Karl

    2013-11-01

Active inference provides a simple and neurobiologically plausible account of how action and perception are coupled in producing (Bayes) optimal behaviour. This can be seen most easily as minimising prediction error: we can either change our predictions to explain sensory input through perception, or actively change sensory input to fulfil our predictions. In active inference, this action is mediated by classical reflex arcs that minimise proprioceptive prediction error created by descending proprioceptive predictions. However, this creates a conflict between action and perception, in that self-generated movements require predictions to override the sensory evidence that one is not actually moving. However, ignoring sensory evidence means that externally generated sensations will not be perceived. Conversely, attending to (proprioceptive and somatosensory) sensations enables the detection of externally generated events but precludes generation of actions. This conflict can be resolved by attenuating the precision of sensory evidence during movement or, equivalently, attending away from the consequences of self-made acts. We propose that this Bayes optimal withdrawal of precise sensory evidence during movement is the cause of psychophysical sensory attenuation. Furthermore, it explains the force-matching illusion and reproduces empirical results almost exactly. Finally, if attenuation is removed, the force-matching illusion disappears and false (delusional) inferences about agency emerge. This is important, given the negative correlation between sensory attenuation and delusional beliefs in normal subjects, and the reduction in the magnitude of the illusion in schizophrenia. Active inference therefore links the neuromodulatory optimisation of precision to sensory attenuation and illusory phenomena during the attribution of agency in normal subjects. It also provides a functional account of deficits in syndromes characterised by false inference.

  7. Bayesian Inference and Online Learning in Poisson Neuronal Networks.

    Science.gov (United States)

    Huang, Yanping; Rao, Rajesh P N

    2016-08-01

    Motivated by the growing evidence for Bayesian computation in the brain, we show how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model. The lower-layer sensory neurons receive noisy measurements of hidden world states. The higher-layer neurons infer a posterior distribution over world states via Bayesian inference from inputs generated by sensory neurons. We demonstrate how such a neuronal network with synaptic plasticity can implement a form of Bayesian inference similar to Monte Carlo methods such as particle filtering. Each spike in a higher-layer neuron represents a sample of a particular hidden world state. The spiking activity across the neural population approximates the posterior distribution over hidden states. In this model, variability in spiking is regarded not as a nuisance but as an integral feature that provides the variability necessary for sampling during inference. We demonstrate how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule. We present results illustrating the ability of the network to perform inference and learning for arbitrary hidden Markov models.
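The particle-filtering analogy can be made concrete with a bootstrap particle filter for a two-state hidden Markov model with binary observations; each particle plays the role a spike plays in the network model, one sampled hypothesis about the hidden state. Parameters and data are illustrative:

```python
import random

def particle_filter(obs, n_particles=2000, p_stay=0.9, p_emit=(0.2, 0.8)):
    """Bootstrap particle filter for a two-state hidden Markov model
    with binary observations; each particle is one sampled hypothesis
    about the hidden state, analogous to a spike in the network model.
    Parameters are illustrative."""
    particles = [random.random() < 0.5 for _ in range(n_particles)]
    posteriors = []
    for y in obs:
        # propagate each particle through the transition model
        particles = [s if random.random() < p_stay else not s
                     for s in particles]
        # weight by the observation likelihood, then resample
        weights = [p_emit[s] if y else 1.0 - p_emit[s] for s in particles]
        particles = random.choices(particles, weights=weights,
                                   k=n_particles)
        posteriors.append(sum(particles) / n_particles)
    return posteriors

random.seed(1)
post = particle_filter([1, 1, 1, 1, 0, 0, 0, 0])
```

The fraction of particles in each state approximates the posterior over hidden states, just as the spiking activity across the population does in the neuronal model.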

  8. Contingency inferences driven by base rates: Valid by sampling

    Directory of Open Access Journals (Sweden)

    Florian Kutzner

    2011-04-01

Full Text Available Fiedler et al. (2009) reviewed evidence for the utilization of a contingency inference strategy termed pseudocontingencies (PCs). In PCs, the more frequent levels (and, by implication, the less frequent levels) are assumed to be associated. PCs have been obtained using a wide range of task settings and dependent measures. Yet, the readiness with which decision makers rely on PCs is poorly understood. A computer simulation explored two potential sources of subjective validity of PCs. First, PCs are shown to perform above chance level when the task is to infer the sign of moderate to strong population contingencies from a sample of observations. Second, contingency inferences based on PCs and inferences based on cell frequencies are shown to partially agree across samples. Intriguingly, this criterion and convergent validity are by-products of random sampling error, highlighting the inductive nature of contingency inferences.
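The first source of validity, above-chance sign inference from base rates alone, can be checked with a small simulation (the cell probabilities are illustrative):

```python
import random

def pc_sign(sample):
    """Pseudocontingency: infer the contingency sign from marginal base
    rates alone; if both attributes are skewed the same way, infer a
    positive contingency."""
    skew_a = sum(a for a, b in sample) / len(sample) - 0.5
    skew_b = sum(b for a, b in sample) / len(sample) - 0.5
    return 1 if skew_a * skew_b > 0 else -1

def draw(n, p11, p10, p01, p00):
    """Sample n joint observations from a 2x2 population."""
    cells = [(1, 1), (1, 0), (0, 1), (0, 0)]
    return random.choices(cells, weights=[p11, p10, p01, p00], k=n)

random.seed(2)
# population with a positive contingency and skewed base rates
trials = [pc_sign(draw(50, 0.55, 0.15, 0.15, 0.15)) for _ in range(400)]
accuracy = trials.count(1) / len(trials)
```

When the population contingency aligns with the base-rate skew, as here, the PC heuristic recovers the correct sign far above chance despite never looking at the joint cell frequencies.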

  9. Damage prognosis of adhesively-bonded joints in laminated composite structural components of unmanned aerial vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles R [Los Alamos National Laboratory; Gobbato, Maurizio [UCSD; Conte, Joel [UCSD; Kosmatke, John [UCSD; Oliver, Joseph A [UCSD

    2009-01-01

The extensive use of lightweight advanced composite materials in unmanned aerial vehicles (UAVs) drastically increases the sensitivity to both fatigue- and impact-induced damage of their critical structural components (e.g., wings and tail stabilizers) during service life. The spar-to-skin adhesive joints are considered one of the most fatigue sensitive subcomponents of a lightweight UAV composite wing with damage progressively evolving from the wing root. This paper presents a comprehensive probabilistic methodology for predicting the remaining service life of adhesively-bonded joints in laminated composite structural components of UAVs. Non-destructive evaluation techniques and Bayesian inference are used to (i) assess the current state of damage of the system and (ii) update the probability distribution of the damage extent at various locations. A probabilistic model for future loads and a mechanics-based damage model are then used to stochastically propagate damage through the joint. Combined local (e.g., exceedance of a critical damage size) and global (e.g., flutter instability) failure criteria are finally used to compute the probability of component failure at future times. The applicability and the partial validation of the proposed methodology are then briefly discussed by analyzing the debonding propagation, along a pre-defined adhesive interface, in a simply supported laminated composite beam with solid rectangular cross section, subjected to a concentrated load applied at mid-span. A specially developed Euler-Bernoulli beam finite element with interlaminar slip along the damageable interface is used in combination with a cohesive zone model to study the fatigue-induced degradation in the adhesive material. The preliminary numerical results presented are promising for the future validation of the methodology.
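The final step, computing failure probability at future times, can be sketched with a Monte Carlo toy model in which a debond of uncertain initial size grows deterministically per load cycle; this stands in for, and greatly simplifies, the paper's cohesive-zone and stochastic-load machinery:

```python
import random

def prob_failure(a0_mean, a0_sd, a_crit, cycles, growth=2e-3, n_sim=1000):
    """Monte Carlo probability that a debond of uncertain initial size
    exceeds the critical size within a number of load cycles (toy
    exponential growth law, not the paper's cohesive-zone model)."""
    failures = 0
    for _ in range(n_sim):
        a = max(random.gauss(a0_mean, a0_sd), 1e-6)  # sampled initial size
        for _ in range(cycles):
            a += growth * a  # growth proportional to current size
            if a >= a_crit:
                failures += 1
                break
    return failures / n_sim

random.seed(3)
p_now = prob_failure(a0_mean=1.0, a0_sd=0.2, a_crit=5.0, cycles=500)
p_later = prob_failure(a0_mean=1.0, a0_sd=0.2, a_crit=5.0, cycles=1500)
```

The failure probability grows monotonically with the prediction horizon; in the full methodology, non-destructive evaluation data would periodically tighten the initial-size distribution via Bayesian updating.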

  10. Reinforcement and inference in cross-situational word learning.

    Science.gov (United States)

    Tilles, Paulo F C; Fontanari, José F

    2013-01-01

    Cross-situational word learning is based on the notion that a learner can determine the referent of a word by finding something in common across many observed uses of that word. Here we propose an adaptive learning algorithm that contains a parameter that controls the strength of the reinforcement applied to associations between concurrent words and referents, and a parameter that regulates inference, which includes built-in biases, such as mutual exclusivity, and information of past learning events. By adjusting these parameters so that the model predictions agree with data from representative experiments on cross-situational word learning, we were able to explain the learning strategies adopted by the participants of those experiments in terms of a trade-off between reinforcement and inference. These strategies can vary wildly depending on the conditions of the experiments. For instance, for fast mapping experiments (i.e., the correct referent could, in principle, be inferred in a single observation) inference is prevalent, whereas for segregated contextual diversity experiments (i.e., the referents are separated in groups and are exhibited with members of their groups only) reinforcement is predominant. Other experiments are explained with more balanced doses of reinforcement and inference.
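The trade-off can be caricatured in a few lines: associations are strengthened by co-occurrence (reinforcement) unless a referent is already claimed by a better-established competitor word (mutual exclusivity, an inference bias). All names and parameters below are hypothetical:

```python
def cross_situational(events, words, referents, beta=0.5):
    """Strengthen word-referent associations across ambiguous learning
    events (reinforcement), skipping referents already claimed by a
    better-established competitor (a mutual-exclusivity inference
    bias). Toy rendering; beta and the 0.5 margin are arbitrary."""
    strength = {(w, r): 0.0 for w in words for r in referents}
    for ws, rs in events:  # each event: co-occurring words and referents
        for w in ws:
            for r in rs:
                claimed = any(strength[(v, r)] > strength[(w, r)] + 0.5
                              for v in words if v != w)
                if not claimed:
                    strength[(w, r)] += beta
    return strength

events = [(["dax"], ["dog"]),
          (["dax", "blick"], ["dog", "cat"]),
          (["blick"], ["cat"])]
s = cross_situational(events, ["dax", "blick"], ["dog", "cat"])
best_dax = max(["dog", "cat"], key=lambda r: s[("dax", r)])
```

Raising the reinforcement weight relative to the exclusivity margin shifts the learner toward the reinforcement-dominated regime; lowering it makes inference dominate, mirroring the trade-off fitted to the experimental data.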

  11. Data-driven inference for the spatial scan statistic.

    Science.gov (United States)

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is done, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
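The modified inference question can be sketched with a crude 1-D scan: compare the observed cluster only against null replicates whose most likely cluster has the same size k. Window-count scoring stands in for Kulldorff's likelihood ratio, so this is an illustration of the conditioning idea only:

```python
import random

def scan(cases, max_size):
    """Most likely cluster: the contiguous window (size <= max_size)
    with the highest case count; a crude stand-in for Kulldorff's
    likelihood-ratio scoring."""
    best = (0, 0, 1)  # (count, start, size)
    for size in range(1, max_size + 1):
        for start in range(len(cases) - size + 1):
            c = sum(cases[start:start + size])
            if c > best[0]:
                best = (c, start, size)
    return best

def conditional_p(cases, max_size, n_rep=300):
    """Monte Carlo p-value conditioned on the observed cluster size k:
    only null replicates whose most likely cluster also has size k are
    used for comparison."""
    obs_c, _, obs_k = scan(cases, max_size)
    null, pool = [], list(cases)
    while len(null) < n_rep:
        random.shuffle(pool)
        c, _, k = scan(pool, max_size)
        if k == obs_k:  # keep same-size most likely clusters only
            null.append(c)
    return sum(c >= obs_c for c in null) / len(null)

random.seed(4)
cases = [0, 1, 0, 0, 5, 6, 4, 0, 1, 0, 0, 0]
p = conditional_p(cases, max_size=3)
```

The unconditional p-value would pool replicates of all cluster sizes; conditioning on k removes the uneven multiple-testing adjustment the abstract describes.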

  12. Eight challenges in phylodynamic inference

    Directory of Open Access Journals (Sweden)

    Simon D.W. Frost

    2015-03-01

    Full Text Available The field of phylodynamics, which attempts to enhance our understanding of infectious disease dynamics using pathogen phylogenies, has made great strides in the past decade. Basic epidemiological and evolutionary models are now well characterized with inferential frameworks in place. However, significant challenges remain in extending phylodynamic inference to more complex systems. These challenges include accounting for evolutionary complexities such as changing mutation rates, selection, reassortment, and recombination, as well as epidemiological complexities such as stochastic population dynamics, host population structure, and different patterns at the within-host and between-host scales. An additional challenge exists in making efficient inferences from an ever increasing corpus of sequence data.

  13. Macromolecular composition of terrestrial and marine organic matter in sediments across the East Siberian Arctic Shelf

    Directory of Open Access Journals (Sweden)

    R. B. Sparkes

    2016-10-01

Full Text Available Mobilisation of terrestrial organic carbon (terrOC) from permafrost environments in eastern Siberia has the potential to deliver significant amounts of carbon to the Arctic Ocean, via both fluvial and coastal erosion. Eroded terrOC can be degraded during offshore transport or deposited across the wide East Siberian Arctic Shelf (ESAS). Most studies of terrOC on the ESAS have concentrated on solvent-extractable organic matter, but this represents only a small proportion of the total terrOC load. In this study we have used pyrolysis–gas chromatography–mass spectrometry (py-GCMS) to study all major groups of macromolecular components of the terrOC; this is the first time that this technique has been applied to the ESAS. This has shown that there is a strong offshore trend from terrestrial phenols, aromatics and cyclopentenones to marine pyridines. There is good agreement between the proportion of phenols measured using py-GCMS and independent quantification of lignin phenol concentrations (r2 = 0.67, p < 0.01, n = 24). Furfurals, thought to represent carbohydrates, show no offshore trend and are likely found in both marine and terrestrial organic matter. We have also collected new radiocarbon data for bulk OC (14COC) which, when coupled with previous measurements, allows us to produce the most comprehensive 14COC map of the ESAS to date. Combining the 14COC and py-GCMS data suggests that the aromatics group of compounds is likely sourced from old, aged terrOC, in contrast to the phenols group, which is likely sourced from modern woody material. We propose that an index of the relative proportions of phenols and pyridines can be used as a novel terrestrial vs. marine proxy measurement for macromolecular organic matter. Principal component analysis found that various terrestrial vs. marine proxies show different patterns across the ESAS, and it shows that multiple river–ocean transects of surface sediments transition from river-dominated to

  14. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

Full Text Available The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations includes explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
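A common way to render such a model computable is leaky transition counts, where a single decay parameter sets the time scale over which evidence is discounted. This is a hedged sketch of that general idea, not the authors' exact model:

```python
def transition_estimates(seq, omega=0.9):
    """Estimate the 2x2 transition-probability matrix with leaky counts;
    the decay omega is the single free parameter setting how quickly
    old evidence is forgotten (a sketch of the idea, not the authors'
    exact model)."""
    counts = [[1.0, 1.0], [1.0, 1.0]]  # uniform prior pseudo-counts
    for prev, nxt in zip(seq, seq[1:]):
        for i in (0, 1):
            for j in (0, 1):
                counts[i][j] *= omega  # discount old observations
        counts[prev][nxt] += 1.0
    return [[counts[i][j] / (counts[i][0] + counts[i][1]) for j in (0, 1)]
            for i in (0, 1)]

# an alternating sequence: after a 0 expect a 1, and vice versa
p = transition_estimates([0, 1, 0, 1, 0, 1, 0, 1])
```

Because each row conditions on the previous stimulus, such an estimator naturally produces the repetition/alternation asymmetries that simple item-frequency models cannot.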

  15. Lactoferricin B inhibits bacterial macromolecular synthesis in Escherichia coli and Bacillus subtilis.

    Science.gov (United States)

    Ulvatne, Hilde; Samuelsen, Ørjan; Haukland, Hanne H; Krämer, Manuela; Vorland, Lars H

    2004-08-15

Most antimicrobial peptides have an amphipathic, cationic structure, and an effect on the cytoplasmic membrane of susceptible bacteria has been postulated as the main mode of action. Other mechanisms have been reported, including inhibition of cellular functions by binding to DNA, RNA and proteins, and the inhibition of DNA and/or protein synthesis. Lactoferricin B (Lfcin B), a cationic peptide derived from bovine lactoferrin, exerts slow inhibitory and bactericidal activity and does not lyse susceptible bacteria, indicating a possible intracellular target. In the present study incorporation of radioactive precursors into DNA, RNA and proteins was used to demonstrate effects of Lfcin B on macromolecular synthesis in bacteria. In Escherichia coli UC 6782, Lfcin B induces an initial increase in protein and RNA synthesis and a decrease in DNA synthesis. After 10 min, DNA synthesis increases while protein and RNA synthesis decrease significantly. In Bacillus subtilis, however, all synthesis of macromolecules is inhibited for at least 20 min. After 20 min, RNA synthesis increases. The results presented here show that Lfcin B at concentrations not sufficient to kill bacterial cells inhibits incorporation of radioactive precursors into macromolecules in both Gram-positive and Gram-negative bacteria.

  16. Making Type Inference Practical

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Oxhøj, Nicholas; Palsberg, Jens

    1992-01-01

We present the implementation of a type inference algorithm for untyped object-oriented programs with inheritance, assignments, and late binding. The algorithm significantly improves our previous one, presented at OOPSLA'91, since it can handle collection classes, such as List, in a useful way. Also, the complexity has been dramatically improved, from exponential time to low polynomial time. The implementation uses the techniques of incremental graph construction and constraint template instantiation to avoid representing intermediate results, doing superfluous work, and recomputing type information. Experiments indicate that the implementation type checks as much as 100 lines per second. This results in a mature product, on which a number of tools can be based, for example a safety tool, an image compression tool, a code optimization tool, and an annotation tool. This may make type inference for object…

  17. Examples in parametric inference with R

    CERN Document Server

    Dixit, Ulhas Jayram

    2016-01-01

    This book discusses examples in parametric inference with R. Combining basic theory with modern approaches, it presents the latest developments and trends in statistical inference for students who do not have an advanced mathematical and statistical background. The topics discussed in the book are fundamental and common to many fields of statistical inference and thus serve as a point of departure for in-depth study. The book is divided into eight chapters: Chapter 1 provides an overview of topics on sufficiency and completeness, while Chapter 2 briefly discusses unbiased estimation. Chapter 3 focuses on the study of moments and maximum likelihood estimators, and Chapter 4 presents bounds for the variance. In Chapter 5, topics on consistent estimator are discussed. Chapter 6 discusses Bayes, while Chapter 7 studies some more powerful tests. Lastly, Chapter 8 examines unbiased and other tests. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory cou...

  18. Causal Effect Inference with Deep Latent-Variable Models

    NARCIS (Netherlands)

    Louizos, C; Shalit, U.; Mooij, J.; Sontag, D.; Zemel, R.; Welling, M.

    2017-01-01

    Learning individual-level causal effects from observational data, such as inferring the most effective medication for a specific patient, is a problem of growing importance for policy makers. The most important aspect of inferring causal effects from observational data is the handling of

  19. Causal inference in survival analysis using pseudo-observations

    DEFF Research Database (Denmark)

    Andersen, Per K; Syriopoulou, Elisavet; Parner, Erik T

    2017-01-01

    Causal inference for non-censored response variables, such as binary or quantitative outcomes, is often based on either (1) direct standardization ('G-formula') or (2) inverse probability of treatment assignment weights ('propensity score'). To do causal inference in survival analysis, one needs ...

  20. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
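The prototypical SPC inference is the Shewhart chart: estimate the in-control mean and spread, then treat any point beyond three estimated standard deviations as a signal rather than noise. A minimal individuals-chart sketch (data illustrative):

```python
def control_limits(baseline):
    """Shewhart individuals-chart limits: in-control mean +/- 3
    estimated standard deviations (sample SD here; practice often
    estimates spread from the moving range instead)."""
    n = len(baseline)
    mean = sum(baseline) / n
    sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.0, 10.1, 9.9]  # in control
lcl, ucl = control_limits(baseline)
# points outside the limits are treated as signals, not common-cause noise
signals = [x for x in (10.05, 9.5, 11.2) if not lcl <= x <= ucl]
```

Unlike a one-shot hypothesis test, this inference is repeated on every new observation, which is exactly the workplace contrast the paper draws.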

  1. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have

  2. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  3. The Impact of Contextual Clue Selection on Inference

    Directory of Open Access Journals (Sweden)

    Leila Barati

    2010-05-01

    Full Text Available Linguistic information can be conveyed in the form of speech and written text, but it is the content of the message that is ultimately essential for higher-level processes in language comprehension, such as making inferences and associations between text information and knowledge about the world. Linguistically, inference is the shovel that allows receivers to dig meaning out of the text by selecting different embedded contextual clues. Naturally, people with different world experiences infer similar contextual situations differently. Lack of contextual knowledge of the target language can present an obstacle to comprehension (Anderson & Lynch, 2003). This paper investigates how contextual clue selection from the text can influence the listener's inference. In the present study 60 male and female teenagers (13-19) and 60 male and female young adults (20-26) were selected randomly based on the Oxford Placement Test (OPT). During the study two fiction and two non-fiction passages were read to the participants in the experimental and control groups respectively, and they were given scores according to the Lexile Score (LS)[1] based on their correct inference and logical thinking ability. In general the results show that participants' clue selection, based on their personal schematic references and background knowledge, differs between teenagers and young adults and influences inference and listening comprehension. [1] The Lexile framework for reading and listening matches an appropriate score to each text based on its degree of difficulty; here each text was given a Lexile score from zero to four.

  4. Inferring Demographic History Using Two-Locus Statistics.

    Science.gov (United States)

    Ragsdale, Aaron P; Gutenkunst, Ryan N

    2017-06-01

    Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rather than mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.

  5. Statistical Inference on the Canadian Middle Class

    Directory of Open Access Journals (Sweden)

    Russell Davidson

    2018-03-01

    Full Text Available Conventional wisdom says that the middle classes in many developed countries have recently suffered losses, in terms of both the share of the total population belonging to the middle class and their share in total income. Here, distribution-free methods are developed for inference on these shares, by deriving expressions for the asymptotic variances of the sample estimates and the covariance of the estimates. Asymptotic inference can be undertaken based on asymptotic normality. Bootstrap inference can be expected to be more reliable, and appropriate bootstrap procedures are proposed. As an illustration, samples of individual earnings drawn from Canadian census data are used to test various hypotheses about the middle-class shares, and confidence intervals for them are computed. It is found that, for the earlier censuses, sample sizes are large enough for asymptotic and bootstrap inference to be almost identical, but that, in the twenty-first century, the bootstrap fails on account of a strange phenomenon whereby many presumably different incomes in the data are rounded to one and the same value. Another difference between the centuries is the appearance of heavy right-hand tails in the income distributions of both men and women.
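
A toy version of the bootstrap inference described in this abstract, under an assumed "middle class" definition (earnings between 0.75 and 1.5 times the median); the synthetic data and this definition are illustrative, and the paper's distribution-free asymptotic variance expressions are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic earnings sample: lognormal, i.e. right-skewed with a heavy tail.
incomes = rng.lognormal(mean=10.5, sigma=0.6, size=5_000)

def middle_share(x):
    """Share of individuals earning between 0.75 and 1.5 times the median."""
    med = np.median(x)
    return np.mean((x >= 0.75 * med) & (x <= 1.5 * med))

est = middle_share(incomes)

# Percentile bootstrap: resample individuals with replacement and re-estimate.
B = 1_000
boot = np.array([middle_share(rng.choice(incomes, size=incomes.size))
                 for _ in range(B)])
ci_low, ci_high = np.quantile(boot, [0.025, 0.975])
```

The heavy-rounding failure mode mentioned in the abstract would show up here as many identical resampled medians, collapsing the bootstrap distribution.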

  6. The importance of learning when making inferences

    Directory of Open Access Journals (Sweden)

    Jorg Rieskamp

    2008-03-01

    Full Text Available The assumption that people possess a repertoire of strategies to solve the inference problems they face has been made repeatedly. The experimental findings of two previous studies on strategy selection are reexamined from a learning perspective, which argues that people learn to select strategies for making probabilistic inferences. This learning process is modeled with the strategy selection learning (SSL) theory, which assumes that people develop subjective expectancies for the strategies they have. They select strategies proportional to their expectancies, which are updated on the basis of experience. For the study by Newell, Weston, and Shanks (2003) it can be shown that people did not anticipate the success of a strategy from the beginning of the experiment. Instead, the behavior observed at the end of the experiment was the result of a learning process that can be described by the SSL theory. For the second study, by Bröder and Schiffer (2006), the SSL theory is able to provide an explanation for why participants only slowly adapted to new environments in a dynamic inference situation. The reanalysis of the previous studies illustrates the importance of learning for probabilistic inferences.
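
The SSL mechanism described here (strategies selected proportional to subjective expectancies, expectancies updated from experienced payoffs) can be sketched as follows; the two strategies, their success rates, and the initial expectancies are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical strategies; strategy 0 succeeds far more often (assumed rates).
success_prob = np.array([0.9, 0.3])
q = np.array([1.0, 1.0])              # initial subjective expectancies

choices = []
for trial in range(1000):
    p = q / q.sum()                   # select proportional to current expectancies
    s = rng.choice(2, p=p)
    payoff = 1.0 if rng.random() < success_prob[s] else 0.0
    q[s] += payoff                    # expectancy updated on the basis of experience
    choices.append(s)

late_share = np.mean(np.array(choices[-200:]) == 0)   # dominance late in learning
```

Early in the run both strategies are tried; as expectancies accumulate, selection drifts toward the more successful strategy, matching the abstract's point that end-of-experiment behavior reflects a learning process rather than anticipation.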

  7. Bayesian inference of substrate properties from film behavior

    International Nuclear Information System (INIS)

    Aggarwal, R; Demkowicz, M J; Marzouk, Y M

    2015-01-01

    We demonstrate that by observing the behavior of a film deposited on a substrate, certain features of the substrate may be inferred with quantified uncertainty using Bayesian methods. We carry out this demonstration on an illustrative film/substrate model where the substrate is a Gaussian random field and the film is a two-component mixture that obeys the Cahn–Hilliard equation. We construct a stochastic reduced order model to describe the film/substrate interaction and use it to infer substrate properties from film behavior. This quantitative inference strategy may be adapted to other film/substrate systems. (paper)

  8. Brain Imaging, Forward Inference, and Theories of Reasoning

    Science.gov (United States)

    Heit, Evan

    2015-01-01

    This review focuses on the issue of how neuroimaging studies address theoretical accounts of reasoning, through the lens of the method of forward inference (Henson, 2005, 2006). After theories of deductive and inductive reasoning are briefly presented, the method of forward inference for distinguishing between psychological theories based on brain imaging evidence is critically reviewed. Brain imaging studies of reasoning, comparing deductive and inductive arguments, comparing meaningful versus non-meaningful material, investigating hemispheric localization, and comparing conditional and relational arguments, are assessed in light of the method of forward inference. Finally, conclusions are drawn with regard to future research opportunities. PMID:25620926

  9. Brain imaging, forward inference, and theories of reasoning.

    Science.gov (United States)

    Heit, Evan

    2014-01-01

    This review focuses on the issue of how neuroimaging studies address theoretical accounts of reasoning, through the lens of the method of forward inference (Henson, 2005, 2006). After theories of deductive and inductive reasoning are briefly presented, the method of forward inference for distinguishing between psychological theories based on brain imaging evidence is critically reviewed. Brain imaging studies of reasoning, comparing deductive and inductive arguments, comparing meaningful versus non-meaningful material, investigating hemispheric localization, and comparing conditional and relational arguments, are assessed in light of the method of forward inference. Finally, conclusions are drawn with regard to future research opportunities.

  10. Data-driven inference for the spatial scan statistic

    Directory of Open Access Journals (Sweden)

    Duczmal Luiz H

    2011-08-01

    Full Text Available Abstract. Background: Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. The statistical significance of a cluster is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. Results: A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. This yields a new interpretation of the results of the spatial scan statistic, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, as it bears on the correctness of the decision based on this inference. Conclusions: A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
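
The modified inference can be illustrated with a toy one-dimensional scan. Everything below is a simplification under stated assumptions: 20 areas on a line, contiguous windows only, a Kulldorff-style Poisson likelihood ratio, and null maps drawn conditional on the observed total; the standard Monte Carlo p-value is then compared with a p-value conditioned on the size of the most likely cluster, as the abstract proposes.

```python
import numpy as np

rng = np.random.default_rng(3)
n_areas, mu = 20, 10.0

def llr(c, e, C):
    """Kulldorff-style Poisson log-likelihood ratio for one window (0 unless c > e)."""
    if c <= e or c == 0 or c == C:
        return 0.0
    return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

def most_likely_cluster(counts):
    """Scan contiguous windows of size 1..n/2; return (max LLR, its window size)."""
    C = counts.sum()
    best, best_k = 0.0, 0
    for k in range(1, len(counts) // 2 + 1):
        e = C * k / len(counts)          # expected in-window count under the null
        for i in range(len(counts) - k + 1):
            v = llr(counts[i:i + k].sum(), e, C)
            if v > best:
                best, best_k = v, k
    return best, best_k

# Observed map with an injected cluster in areas 5..8.
obs = rng.poisson(mu, n_areas)
obs[5:9] += rng.poisson(15.0, 4)
t_obs, k_obs = most_likely_cluster(obs)

# Null replicates: redistribute the observed total uniformly (conditioning on C).
C = obs.sum()
sims = [most_likely_cluster(rng.multinomial(C, np.full(n_areas, 1 / n_areas)))
        for _ in range(999)]
t_null = np.array([t for t, _ in sims])
k_null = np.array([k for _, k in sims])

p_std = (1 + np.sum(t_null >= t_obs)) / (1 + len(sims))
same_k = k_null == k_obs                 # data-driven: condition on cluster size
p_cond = (1 + np.sum(t_null[same_k] >= t_obs)) / (1 + same_k.sum())
```

The conditional p-value compares the observed statistic only against null replicates whose most likely cluster has the same size, which is the essence of the proposed modification.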

  11. An 'attachment kinetics-based' volume of fraction method for organic crystallization: a fluid-dynamic approach to macromolecular-crystal engineering

    International Nuclear Information System (INIS)

    Lappa, Marcello

    2003-01-01

    This analysis exhibits a strong interdisciplinary nature and deals with advances in protein (crystal) engineering models and computational methods as well as with novel results on the relative importance of 'controlling forces' in macromolecular crystal growth. The attention is focused in particular on microgravity fluid-dynamic aspects. From a numerical point of view, the growing crystal gives rise to a moving boundary problem. A 'kinetic-coefficient-based' volume tracking method is specifically and carefully developed according to the complex properties and mechanisms of macromolecular protein crystal growth taking into account the possibility of anisotropic (faceted) surface-orientation-dependent growth. The method is used to shed some light on the interplay of surface attachment kinetics and mass transport (diffusive or convective) in liquid phase and on several mechanisms still poorly understood. It is shown that the size of a growing crystal plays a 'critical role' in the relative importance of surface effects and in determining the intensity of convection. Convective effects, in turn, are found to impact growth rates, macroscopic structures of precipitates, particle size and morphology as well as the mechanisms driving growth. The paper introduces a novel computational method (that simulates the growth due to the slow addition of solute molecules to a lattice and can handle the shape of organic growing crystals under the influence of natural convection) and, at the same time, represents a quite exhaustive attempt to help organic crystal growers to discern the complex interrelations among the various parameters under one's control (that are not independent of one another) and to elaborate rational guidelines relating to physical factors that can influence the probability of success in crystallizing protein substances

  12. Statistical inference an integrated approach

    CERN Document Server

    Migon, Helio S; Louzada, Francisco

    2014-01-01

    Introduction: Information; The concept of probability; Assessing subjective probabilities; An example; Linear algebra and probability; Notation; Outline of the book. Elements of Inference: Common statistical models; Likelihood-based functions; Bayes theorem; Exchangeability; Sufficiency and exponential family; Parameter elimination. Prior Distribution: Entirely subjective specification; Specification through functional forms; Conjugacy with the exponential family; Non-informative priors; Hierarchical priors. Estimation: Introduction to decision theory; Bayesian point estimation; Classical point estimation; Empirical Bayes estimation; Comparison of estimators; Interval estimation; Estimation in the Normal model. Approximating Methods: The general problem of inference; Optimization techniques; Asymptotic theory; Other analytical approximations; Numerical integration methods; Simulation methods. Hypothesis Testing: Introduction; Classical hypothesis testing; Bayesian hypothesis testing; Hypothesis testing and confidence intervals; Asymptotic tests. Prediction...

  13. Statistical learning and selective inference.

    Science.gov (United States)

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
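
A small simulation makes the "cherry-picking" problem concrete. The correction shown here is plain Bonferroni, the bluntest fix; the selective-inference methods described in the paper instead condition on the selection event and are less conservative. All numbers are illustrative, and a normal approximation stands in for the exact t distribution.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(4)
n, m, reps = 100, 50, 400

naive_hits = adj_hits = 0
for _ in range(reps):
    X = rng.normal(size=(n, m))
    y = rng.normal(size=n)                 # pure noise: every null hypothesis is true
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    t = r * np.sqrt((n - 2) / (1 - r ** 2))
    # Two-sided p-values (normal approximation to the t distribution).
    p = np.array([erfc(abs(v) / sqrt(2)) for v in t])
    naive_hits += p.min() < 0.05           # cherry-pick the strongest association
    adj_hits += p.min() < 0.05 / m         # Bonferroni: pay for the search

naive_rate = naive_hits / reps             # far above the nominal 5% level
adj_rate = adj_hits / reps                 # back near the nominal level
```

With 50 null features, testing only the strongest association at the nominal 5% level rejects in the large majority of repetitions; raising the bar for the selected hypothesis restores roughly the advertised error rate.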

  14. EFFECTS OF MEDU AND COASTAL TOPOGRAPHY ON THE DAMAGE PATTERN DURING THE RECENT INDIAN OCEAN TSUNAMI ALONG THE COAST OF TAMILNADU

    Directory of Open Access Journals (Sweden)

    J.P. Narayan

    2005-01-01

    Full Text Available The effects of Medu (a naturally elevated landmass very close to the seashore and elongated parallel to the coast) and coastal topography on the damage pattern during the deadliest Indian Ocean tsunami of December 26, 2004 are reported. The tsunami caused severe damage and claimed many victims in the coastal areas of eleven countries bordering the Indian Ocean. The damage survey revealed large variation in damage along the coastal region of Tamilnadu (India). The most severe damage was observed in the Nagapattinam district on the east coast and the west coast of Kanyakumari district. A decrease in damage from Nagapattinam to Kanchipuram district was observed. Intense damage again appeared to the north of the Adyar River (from Srinivaspuri to Anna Samadhi Park). Almost no damage was observed along the coast of the Thanjavur, Puddukkotai and Ramnathpuram districts in Palk Strait, situated in the shadow zone of Sri Lanka. It was concluded that the width of the continental shelf played a major role in the pattern of tsunami damage. It was inferred that the width of the continental shelf and the interference of waves reflected from Sri Lanka and the Maldives Islands with direct and receding waves were responsible for the intense damage in the Nagapattinam and Kanyakumari districts, respectively. During the damage survey the authors also noted that there was almost no damage, or much less damage, to houses situated on or behind a Medu. Many people observed the first arrival. The largest tsunami amplitude occurred in the first arrival on the eastern coast and in the second arrival on the western coast.

  15. Mapping causal functional contributions derived from the clinical assessment of brain damage after stroke

    Directory of Open Access Journals (Sweden)

    Melissa Zavaglia

    2015-01-01

    Full Text Available Lesion analysis reveals causal contributions of brain regions to mental functions, aiding the understanding of normal brain function as well as rehabilitation of brain-damaged patients. We applied a novel lesion inference technique based on game theory, Multi-perturbation Shapley value Analysis (MSA, to a large clinical lesion dataset. We used MSA to analyze the lesion patterns of 148 acute stroke patients together with their neurological deficits, as assessed by the National Institutes of Health Stroke Scale (NIHSS. The results revealed regional functional contributions to essential behavioral and cognitive functions as reflected in the NIHSS, particularly by subcortical structures. There were also side specific differences of functional contributions between the right and left hemispheric brain regions which may reflect the dominance of the left hemispheric syndrome aphasia in the NIHSS. Comparison of MSA to established lesion inference methods demonstrated the feasibility of the approach for analyzing clinical data and indicated its capability for objectively inferring functional contributions from multiple injured, potentially interacting sites, at the cost of having to predict the outcome of unknown lesion configurations. The analysis of regional functional contributions to neurological symptoms measured by the NIHSS contributes to the interpretation of this widely used standardized stroke scale in clinical practice as well as clinical trials and provides a first approximation of a ‘map of stroke’.

  16. Mapping causal functional contributions derived from the clinical assessment of brain damage after stroke.

    Science.gov (United States)

    Zavaglia, Melissa; Forkert, Nils D; Cheng, Bastian; Gerloff, Christian; Thomalla, Götz; Hilgetag, Claus C

    2015-01-01

    Lesion analysis reveals causal contributions of brain regions to mental functions, aiding the understanding of normal brain function as well as rehabilitation of brain-damaged patients. We applied a novel lesion inference technique based on game theory, Multi-perturbation Shapley value Analysis (MSA), to a large clinical lesion dataset. We used MSA to analyze the lesion patterns of 148 acute stroke patients together with their neurological deficits, as assessed by the National Institutes of Health Stroke Scale (NIHSS). The results revealed regional functional contributions to essential behavioral and cognitive functions as reflected in the NIHSS, particularly by subcortical structures. There were also side specific differences of functional contributions between the right and left hemispheric brain regions which may reflect the dominance of the left hemispheric syndrome aphasia in the NIHSS. Comparison of MSA to established lesion inference methods demonstrated the feasibility of the approach for analyzing clinical data and indicated its capability for objectively inferring functional contributions from multiple injured, potentially interacting sites, at the cost of having to predict the outcome of unknown lesion configurations. The analysis of regional functional contributions to neurological symptoms measured by the NIHSS contributes to the interpretation of this widely used standardized stroke scale in clinical practice as well as clinical trials and provides a first approximation of a 'map of stroke'.

  17. Mapping causal functional contributions derived from the clinical assessment of brain damage after stroke

    Science.gov (United States)

    Zavaglia, Melissa; Forkert, Nils D.; Cheng, Bastian; Gerloff, Christian; Thomalla, Götz; Hilgetag, Claus C.

    2015-01-01

    Lesion analysis reveals causal contributions of brain regions to mental functions, aiding the understanding of normal brain function as well as rehabilitation of brain-damaged patients. We applied a novel lesion inference technique based on game theory, Multi-perturbation Shapley value Analysis (MSA), to a large clinical lesion dataset. We used MSA to analyze the lesion patterns of 148 acute stroke patients together with their neurological deficits, as assessed by the National Institutes of Health Stroke Scale (NIHSS). The results revealed regional functional contributions to essential behavioral and cognitive functions as reflected in the NIHSS, particularly by subcortical structures. There were also side specific differences of functional contributions between the right and left hemispheric brain regions which may reflect the dominance of the left hemispheric syndrome aphasia in the NIHSS. Comparison of MSA to established lesion inference methods demonstrated the feasibility of the approach for analyzing clinical data and indicated its capability for objectively inferring functional contributions from multiple injured, potentially interacting sites, at the cost of having to predict the outcome of unknown lesion configurations. The analysis of regional functional contributions to neurological symptoms measured by the NIHSS contributes to the interpretation of this widely used standardized stroke scale in clinical practice as well as clinical trials and provides a first approximation of a ‘map of stroke’. PMID:26448908
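
The Shapley-value machinery behind MSA can be illustrated on a toy three-region example: a region's contribution is its marginal effect on performance, averaged over all orders in which regions could be "lesioned". The performance scores below are invented; MSA additionally has to estimate such scores from observed and predicted lesion configurations, which this sketch omits.

```python
from itertools import permutations

# Hypothetical performance for each subset of INTACT regions (1.0 = full function).
# Region "A" carries most of the function; "B" and "C" contribute similarly.
perf = {
    frozenset(): 0.0,
    frozenset("A"): 0.6,
    frozenset("B"): 0.1,
    frozenset("C"): 0.1,
    frozenset("AB"): 0.8,
    frozenset("AC"): 0.8,
    frozenset("BC"): 0.4,
    frozenset("ABC"): 1.0,
}

regions = ["A", "B", "C"]

def shapley(regions, perf):
    """Exact Shapley values: average marginal contribution over all orderings."""
    vals = {r: 0.0 for r in regions}
    orders = list(permutations(regions))
    for order in orders:
        coalition = frozenset()
        for r in order:
            vals[r] += perf[coalition | {r}] - perf[coalition]
            coalition = coalition | {r}
    return {r: v / len(orders) for r, v in vals.items()}

phi = shapley(regions, perf)
```

The values are "fair" in the game-theoretic sense: they sum exactly to the performance difference between the fully intact and fully lesioned brain, and symmetric regions ("B" and "C") receive identical contributions.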

  18. MolProbity: all-atom structure validation for macromolecular crystallography

    International Nuclear Information System (INIS)

    Chen, Vincent B.; Arendall, W. Bryan III; Headd, Jeffrey J.; Keedy, Daniel A.; Immormino, Robert M.; Kapral, Gary J.; Murray, Laura W.; Richardson, Jane S.; Richardson, David C.

    2010-01-01

    MolProbity structure validation will diagnose most local errors in macromolecular crystal structures and help to guide their correction. MolProbity is a structure-validation web service that provides broad-spectrum solidly based evaluation of model quality at both the global and local levels for both proteins and nucleic acids. It relies heavily on the power and sensitivity provided by optimized hydrogen placement and all-atom contact analysis, complemented by updated versions of covalent-geometry and torsion-angle criteria. Some of the local corrections can be performed automatically in MolProbity and all of the diagnostics are presented in chart and graphical forms that help guide manual rebuilding. X-ray crystallography provides a wealth of biologically important molecular data in the form of atomic three-dimensional structures of proteins, nucleic acids and increasingly large complexes in multiple forms and states. Advances in automation, in everything from crystallization to data collection to phasing to model building to refinement, have made solving a structure using crystallography easier than ever. However, despite these improvements, local errors that can affect biological interpretation are widespread at low resolution and even high-resolution structures nearly all contain at least a few local errors such as Ramachandran outliers, flipped branched protein side chains and incorrect sugar puckers. It is critical both for the crystallographer and for the end user that there are easy and reliable methods to diagnose and correct these sorts of errors in structures. MolProbity is the authors’ contribution to helping solve this problem and this article reviews its general capabilities, reports on recent enhancements and usage, and presents evidence that the resulting improvements are now beneficially affecting the global database

  19. Damage analysis: damage function development and application

    International Nuclear Information System (INIS)

    Simons, R.L.; Odette, G.R.

    1975-01-01

    The derivation and application of damage functions, including recent developments for the U.S. LMFBR and CTR programs, is reviewed. A primary application of damage functions is in predicting component life expectancies; i.e., the fluence required in a service spectrum to attain a specified design property change. An important part of the analysis is the estimation of the uncertainty in such fluence limit predictions. The status of standardizing the procedures for the derivation and application of damage functions is discussed. Improvements in several areas of damage function development are needed before standardization can be completed. These include increasing the quantity and quality of the data used in the analysis, determining the limitations of the analysis due to the presence of multiple damage mechanisms, and finally, testing of damage function predictions against data obtained from material surveillance programs in operating thermal and fast reactors. 23 references. (auth)

  20. Object-Oriented Type Inference

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

    1991-01-01

    We present a new approach to inferring types in untyped object-oriented programs with inheritance, assignments, and late binding. It guarantees that all messages are understood, annotates the program with type information, allows polymorphic methods, and can be used as the basis of an op...

  1. The Probabilistic Convolution Tree: Efficient Exact Bayesian Inference for Faster LC-MS/MS Protein Inference

    Science.gov (United States)

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called “causal independence”). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustrative example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we reduce the runtime to O(k log(k)²) and the space to O(k log(k)), where k is the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions. PMID:24626234
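
The core data structure can be sketched for the simplest case, the exact distribution of a sum of independent count variables: pairwise convolution of probability mass functions arranged in a balanced tree. This is only the combinatorial skeleton (the paper's tree also propagates backward messages for full Bayesian inference and can use FFT-based convolution); the Bernoulli inputs below are illustrative.

```python
import numpy as np

def conv_tree(pmfs):
    """PMF of the sum of independent count variables, combining the input
    distributions by pairwise convolution in a balanced tree."""
    layer = [np.asarray(p, dtype=float) for p in pmfs]
    while len(layer) > 1:
        nxt = []
        for i in range(0, len(layer) - 1, 2):
            nxt.append(np.convolve(layer[i], layer[i + 1]))
        if len(layer) % 2:
            nxt.append(layer[-1])    # odd one out is carried up unchanged
        layer = nxt
    return layer[0]

# Eight Bernoulli "indicator" variables with p = 0.3; their sum is Binomial(8, 0.3).
pmfs = [[0.7, 0.3]] * 8
dist = conv_tree(pmfs)
```

Because the tree combines distributions pairwise, large intermediate convolutions occur only near the root, which is what makes the overall cost subquadratic when each convolution is done with an FFT.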

  2. Reward inference by primate prefrontal and striatal neurons.

    Science.gov (United States)

    Pan, Xiaochuan; Fan, Hongwei; Sawa, Kosuke; Tsuda, Ichiro; Tsukada, Minoru; Sakagami, Masamichi

    2014-01-22

    The brain contains multiple yet distinct systems involved in reward prediction. To understand the nature of these processes, we recorded single-unit activity from the lateral prefrontal cortex (LPFC) and the striatum in monkeys performing a reward inference task using an asymmetric reward schedule. We found that neurons both in the LPFC and in the striatum predicted reward values for stimuli that had been previously well experienced with set reward quantities in the asymmetric reward task. Importantly, these LPFC neurons could predict the reward value of a stimulus using transitive inference even when the monkeys had not yet learned the stimulus-reward association directly; whereas these striatal neurons did not show such an ability. Nevertheless, because there were two set amounts of reward (large and small), the selected striatal neurons were able to exclusively infer the reward value (e.g., large) of one novel stimulus from a pair after directly experiencing the alternative stimulus with the other reward value (e.g., small). Our results suggest that although neurons that predict reward value for old stimuli in the LPFC could also do so for new stimuli via transitive inference, those in the striatum could only predict reward for new stimuli via exclusive inference. Moreover, the striatum showed more complex functions than was surmised previously for model-free learning.

  3. REPLACEMENT SPARE PART INVENTORY MONITORING USING ADAPTIVE NEURO FUZZY INFERENCE SYSTEM

    Directory of Open Access Journals (Sweden)

    Hartono Hartono

    2016-01-01

    Full Text Available Abstract. The amount of inventory is determined by demand, so demand must be forecast. This study uses data on the replacement of electronic modules in production equipment for telecommunications transmission, switching, access and power systems: a faulty module in a malfunctioning system is replaced with a good module from the spare-part inventory, while the faulty module is shipped to the Repair Center to be repaired; the repaired modules then replenish the spare-part inventory. The speed of this repair process, expressed as the average repair time at the repair centers, determines how quickly repaired modules become available again as spare parts and hence whether the warehouse can maintain a safe inventory level. This research applies the Adaptive Neuro-Fuzzy Inference System (ANFIS) method to develop a decision support system for controlling the spare-part inventory in the warehouse, taking into account several supporting parameters, namely demand, repair and fulfillment of spare parts, and repair time. Because faulty modules returned by customers are repaired and recycled into the warehouse stock, the speed of restoration strongly influences the spare-part supply, and this factor is modeled as an input to the ANFIS method.   Keywords: ANFIS, inventory control, replacement

  4. Bootstrap inference when using multiple imputation.

    Science.gov (United States)

    Schomaker, Michael; Heumann, Christian

    2018-04-16

    Many modern estimators require bootstrapping to calculate confidence intervals because either no analytic standard error is available or the distribution of the parameter of interest is nonsymmetric. It remains however unclear how to obtain valid bootstrap inference when dealing with multiple imputation to address missing data. We present 4 methods that are intuitively appealing, easy to implement, and combine bootstrap estimation with multiple imputation. We show that 3 of the 4 approaches yield valid inference, but that the performance of the methods varies with respect to the number of imputed data sets and the extent of missingness. Simulation studies reveal the behavior of our approaches in finite samples. A topical analysis from HIV treatment research, which determines the optimal timing of antiretroviral treatment initiation in young children, demonstrates the practical implications of the 4 methods in a sophisticated and realistic setting. This analysis suffers from missing data and uses the g-formula for inference, a method for which no standard errors are available. Copyright © 2018 John Wiley & Sons, Ltd.

  5. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.

  6. A new on-axis multimode spectrometer for the macromolecular crystallography beamlines of the Swiss Light Source

    International Nuclear Information System (INIS)

    Owen, Robin L.; Pearson, Arwen R.; Meents, Alke; Boehler, Pirmin; Thominet, Vincent; Schulze-Briese, Clemens

    2009-01-01

    Complementary techniques greatly aid the interpretation of macromolecule structures to yield functional information, and can also help to track radiation-induced changes. A new on-axis spectrometer being integrated into the macromolecular crystallography beamlines of the Swiss Light Source is presented. X-ray crystallography at third-generation synchrotron sources permits tremendous insight into the three-dimensional structure of macromolecules. Additional information is, however, often required to aid the transition from structure to function. In situ spectroscopic methods such as UV–Vis absorption and (resonance) Raman can provide this, and can also provide a means of detecting X-ray-induced changes. Here, preliminary results are introduced from an on-axis UV–Vis absorption and Raman multimode spectrometer currently being integrated into the beamline environment at X10SA of the Swiss Light Source. The continuing development of the spectrometer is also outlined

  7. A fast band–Krylov eigensolver for macromolecular functional motion simulation on multicore architectures and graphics processors

    Energy Technology Data Exchange (ETDEWEB)

    Aliaga, José I., E-mail: aliaga@uji.es [Depto. Ingeniería y Ciencia de Computadores, Universitat Jaume I, Castellón (Spain); Alonso, Pedro [Departamento de Sistemas Informáticos y Computación, Universitat Politècnica de València (Spain); Badía, José M. [Depto. Ingeniería y Ciencia de Computadores, Universitat Jaume I, Castellón (Spain); Chacón, Pablo [Dept. Biological Chemical Physics, Rocasolano Physics and Chemistry Institute, CSIC, Madrid (Spain); Davidović, Davor [Rudjer Bošković Institute, Centar za Informatiku i Računarstvo – CIR, Zagreb (Croatia); López-Blanco, José R. [Dept. Biological Chemical Physics, Rocasolano Physics and Chemistry Institute, CSIC, Madrid (Spain); Quintana-Ortí, Enrique S. [Depto. Ingeniería y Ciencia de Computadores, Universitat Jaume I, Castellón (Spain)

    2016-03-15

    We introduce a new iterative Krylov subspace-based eigensolver for the simulation of macromolecular motions on desktop multithreaded platforms equipped with multicore processors and, possibly, a graphics accelerator (GPU). The method consists of two stages, with the original problem first reduced into a simpler band-structured form by means of a high-performance compute-intensive procedure. This is followed by a memory-intensive but low-cost Krylov iteration, which is off-loaded to be computed on the GPU by means of an efficient data-parallel kernel. The experimental results reveal the performance of the new eigensolver. Concretely, when applied to the simulation of macromolecules with a few thousand degrees of freedom, and when the number of eigenpairs to be computed is small to moderate, the new solver outperforms other methods implemented as part of high-performance numerical linear algebra packages for multithreaded architectures.
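
    The two-stage structure of such solvers can be sketched in miniature with standard routines: a symmetric Hessenberg (tridiagonal) reduction stands in for the band reduction, and a Lanczos-based Krylov solver cross-checks the result. The matrix size and the specific SciPy routines below are illustrative choices, not those of the paper:

```python
import numpy as np
from scipy.linalg import hessenberg, eigh_tridiagonal
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(0)
n = 300
A = rng.standard_normal((n, n))
A = A + A.T                       # symmetric "Hessian-like" matrix

# Stage 1 (compute-bound): orthogonal reduction to a condensed form.
# For a symmetric matrix the Hessenberg form is tridiagonal (bandwidth 1),
# standing in for the band reduction of the paper.
T, Q = hessenberg(A, calc_q=True)
d, e = np.diag(T), np.diag(T, -1)

# Stage 2 (memory-bound, cheap): eigensolve of the condensed problem;
# in the paper this is where the GPU-offloaded Krylov iteration runs.
w_small = eigh_tridiagonal(d, e, select="i", select_range=(0, 4))[0]

# cross-check the 5 smallest eigenvalues with a Lanczos (Krylov) solver
w_krylov = np.sort(eigsh(A, k=5, which="SA", return_eigenvectors=False))
print(np.allclose(w_small, w_krylov, atol=1e-8))
```

    The point of the split is that the expensive orthogonal reduction is done once on a compact representation, after which repeated Krylov iterations touch only cheap band/tridiagonal data.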

  8. A fast band–Krylov eigensolver for macromolecular functional motion simulation on multicore architectures and graphics processors

    International Nuclear Information System (INIS)

    Aliaga, José I.; Alonso, Pedro; Badía, José M.; Chacón, Pablo; Davidović, Davor; López-Blanco, José R.; Quintana-Ortí, Enrique S.

    2016-01-01

    We introduce a new iterative Krylov subspace-based eigensolver for the simulation of macromolecular motions on desktop multithreaded platforms equipped with multicore processors and, possibly, a graphics accelerator (GPU). The method consists of two stages, with the original problem first reduced into a simpler band-structured form by means of a high-performance compute-intensive procedure. This is followed by a memory-intensive but low-cost Krylov iteration, which is off-loaded to be computed on the GPU by means of an efficient data-parallel kernel. The experimental results reveal the performance of the new eigensolver. Concretely, when applied to the simulation of macromolecules with a few thousand degrees of freedom, and when the number of eigenpairs to be computed is small to moderate, the new solver outperforms other methods implemented as part of high-performance numerical linear algebra packages for multithreaded architectures.

  9. System Support for Forensic Inference

    Science.gov (United States)

    Gehani, Ashish; Kirchner, Florent; Shankar, Natarajan

    Digital evidence is playing an increasingly important role in prosecuting crimes. The reasons are manifold: financially lucrative targets are now connected online, systems are so complex that vulnerabilities abound and strong digital identities are being adopted, making audit trails more useful. If the discoveries of forensic analysts are to hold up to scrutiny in court, they must meet the standard for scientific evidence. Software systems are currently developed without consideration of this fact. This paper argues for the development of a formal framework for constructing “digital artifacts” that can serve as proxies for physical evidence; a system so imbued would facilitate sound digital forensic inference. A case study involving a filesystem augmentation that provides transparent support for forensic inference is described.

  10. Geostatistical inference using crosshole ground-penetrating radar

    DEFF Research Database (Denmark)

    Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou

    2010-01-01

    of the subsurface are used to evaluate the uncertainty of the inversion estimate. We have explored the full potential of the geostatistical inference method using several synthetic models of varying correlation structures and have tested the influence of different assumptions concerning the choice of covariance...... reflection profile. Furthermore, the inferred values of the subsurface global variance and the mean velocity have been corroborated with moisture-content measurements, obtained gravimetrically from samples collected at the field site....

  11. Bayesian Inference for Functional Dynamics Exploring in fMRI Data

    Directory of Open Access Journals (Sweden)

    Xuan Guo

    2016-01-01

    Full Text Available This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, the Bayesian Magnitude Change Point Model (BMCPM), the Bayesian Connectivity Change Point Model (BCCPM), and the Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will emerge and play increasingly important roles in modeling brain functions in the years to come.
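
    The magnitude-change-point idea underlying models such as BMCPM can be illustrated with a minimal sketch: a posterior over a single change point in a Gaussian series, with the segment means profiled at their maximum-likelihood values rather than fully marginalized (a simplification; the data and noise level are hypothetical):

```python
import numpy as np

def changepoint_posterior(y, sigma=1.0):
    """Posterior over a single magnitude change point tau in an i.i.d.
    Gaussian series with known sigma, a flat prior on tau, and segment
    means profiled at their MLEs (a simplification of full marginalization)."""
    n = len(y)
    logp = np.full(n, -np.inf)
    for tau in range(1, n):            # tau = length of the first segment
        left, right = y[:tau], y[tau:]
        rss = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        logp[tau] = -rss / (2 * sigma**2)
    logp -= logp.max()                 # normalize in a numerically safe way
    p = np.exp(logp)
    return p / p.sum()

rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(0, 1, 60), rng.normal(2, 1, 40)])  # jump at t=60
post = changepoint_posterior(y)
print(int(post.argmax()))
```

    The fMRI models reviewed in the record extend this idea to multivariate signals and to changes in connectivity rather than mean magnitude.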

  12. Gd-DTPA L-cystine bisamide copolymers as novel biodegradable macromolecular contrast agents for MR blood pool imaging.

    Science.gov (United States)

    Kaneshiro, Todd L; Ke, Tianyi; Jeong, Eun-Kee; Parker, Dennis L; Lu, Zheng-Rong

    2006-06-01

    The purpose of this study was to synthesize biodegradable Gd-DTPA L-cystine bisamide copolymers (GCAC) as safe and effective, macromolecular contrast agents for magnetic resonance imaging (MRI) and to evaluate their biodegradability and efficacy in MR blood pool imaging in an animal model. Three new biodegradable GCAC with different substituents at the cystine bisamide [R = H (GCAC), CH2CH2CH3 (Gd-DTPA L-cystine bispropyl amide copolymers, GCPC), and CH(CH3)2 (Gd-DTPA cystine bisisopropyl copolymers, GCIC)] were prepared by the condensation copolymerization of diethylenetriamine pentaacetic acid (DTPA) dianhydride with cystine bisamide or bisalkyl amides, followed by complexation with gadolinium triacetate. The degradability of the agents was studied in vitro by incubation in 15 microM cysteine and in vivo with Sprague-Dawley rats. The kinetics of in vivo contrast enhancement was investigated in Sprague-Dawley rats on a Siemens Trio 3 T scanner. The apparent molecular weight of the polydisulfide Gd(III) chelates ranged from 22 to 25 kDa. The longitudinal (T1) relaxivities of GCAC, GCPC, and GCIC were 4.37, 5.28, and 5.56 mM(-1) s(-1) at 3 T, respectively. The polymeric ligands and polymeric Gd(III) chelates readily degraded into smaller molecules in incubation with 15 microM cysteine via disulfide-thiol exchange reactions. The in vitro degradation rates of both the polymeric ligands and macromolecular Gd(III) chelates decreased as the steric effect around the disulfide bonds increased. The agents readily degraded in vivo, and the catabolic degradation products were detected in rat urine samples collected after intravenous injection. The agents showed strong contrast enhancement in the blood pool, major organs, and tissues at a dose of 0.1 mmol Gd/kg. The difference of their in vitro degradability did not significantly alter the kinetics of in vivo contrast enhancement of the agents. These novel GCAC are promising contrast agents for cardiovascular and tumor MRI

  13. Working memory supports inference learning just like classification learning.

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan

    2013-08-01

    Recent research has found a positive relationship between people's working memory capacity (WMC) and their speed of category learning. To date, only classification-learning tasks have been considered, in which people learn to assign category labels to objects. It is unknown whether learning to make inferences about category features might also be related to WMC. We report data from a study in which 119 participants undertook classification learning and inference learning, and completed a series of WMC tasks. Working memory capacity was positively related to people's classification and inference learning performance.

  14. Statistical inference for stochastic processes

    National Research Council Canada - National Science Library

    Basawa, Ishwar V; Prakasa Rao, B. L. S

    1980-01-01

    The aim of this monograph is to attempt to reduce the gap between theory and applications in the area of stochastic modelling, by directing the interest of future researchers to the inference aspects...

  15. Inference of Large Phylogenies Using Neighbour-Joining

    DEFF Research Database (Denmark)

    Simonsen, Martin; Mailund, Thomas; Pedersen, Christian Nørgaard Storm

    2011-01-01

    The neighbour-joining method is a widely used method for phylogenetic reconstruction which scales to thousands of taxa. However, advances in sequencing technology have made data sets with more than 10,000 related taxa widely available. Inference of such large phylogenies takes hours or days using...... the Neighbour-Joining method on a normal desktop computer because of the O(n^3) running time. RapidNJ is a search heuristic which reduces the running time of the Neighbour-Joining method significantly, but at the cost of an increased memory consumption, making inference of large phylogenies infeasible. We present...... two extensions for RapidNJ which reduce the memory requirements and allow phylogenies with more than 50,000 taxa to be inferred efficiently on a desktop computer. Furthermore, an improved version of the search heuristic is presented which reduces the running time of RapidNJ on many data...
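
    For reference, the plain O(n^3) neighbour-joining iteration that RapidNJ accelerates can be written compactly: at each step, join the pair minimising the Q-criterion and collapse the distance matrix. This sketch returns only the unrooted topology as nested tuples (branch lengths omitted for brevity):

```python
import numpy as np

def neighbour_joining(D, names):
    """Plain O(n^3) neighbour-joining on a distance matrix.
    RapidNJ speeds up the argmin over the Q-matrix with sorted row caches;
    the extensions in the record bound its memory use."""
    D = np.asarray(D, dtype=float)
    nodes = list(names)
    while len(nodes) > 2:
        n = len(nodes)
        r = D.sum(axis=1)
        # Q[i,j] = (n-2) d(i,j) - r_i - r_j ; join the pair minimising Q
        Q = (n - 2) * D - r[:, None] - r[None, :]
        np.fill_diagonal(Q, np.inf)
        i, j = np.unravel_index(np.argmin(Q), Q.shape)
        # distances from the new internal node u to every remaining node
        du = 0.5 * (D[i] + D[j] - D[i, j])
        keep = [k for k in range(n) if k not in (i, j)]
        newD = np.empty((len(keep) + 1, len(keep) + 1))
        newD[:-1, :-1] = D[np.ix_(keep, keep)]
        newD[-1, :-1] = newD[:-1, -1] = du[keep]
        newD[-1, -1] = 0.0
        nodes = [nodes[k] for k in keep] + [(nodes[i], nodes[j])]
        D = newD
    return (nodes[0], nodes[1])

# additive toy distances consistent with the ((A,B),(C,D)) topology
names = ["A", "B", "C", "D"]
D = [[0, 2, 7, 7],
     [2, 0, 7, 7],
     [7, 7, 0, 2],
     [7, 7, 2, 0]]
print(neighbour_joining(D, names))
```

    The full Q-matrix scan at each of the n-2 joins is what gives the cubic running time; RapidNJ avoids scanning most of it.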

  16. Effects of macromolecular crowding on protein conformational changes.

    Directory of Open Access Journals (Sweden)

    Hao Dong

    2010-07-01

    Full Text Available Many protein functions can be directly linked to conformational changes. Inside cells, the equilibria and transition rates between different conformations may be affected by macromolecular crowding. We have recently developed a new approach for modeling crowding effects, which enables an atomistic representation of "test" proteins. Here this approach is applied to study how crowding affects the equilibria and transition rates between open and closed conformations of seven proteins: yeast protein disulfide isomerase (yPDI), adenylate kinase (AdK), orotidine phosphate decarboxylase (ODCase), Trp repressor (TrpR), hemoglobin, DNA beta-glucosyltransferase, and Ap(4)A hydrolase. For each protein, molecular dynamics simulations of the open and closed states are separately run. Representative open and closed conformations are then used to calculate the crowding-induced changes in chemical potential for the two states. The difference in chemical-potential change between the two states finally predicts the effects of crowding on the population ratio of the two states. Crowding is found to reduce the open population to various extents. In the presence of crowders with a 15 Å radius and occupying 35% of volume, the open-to-closed population ratios of yPDI, AdK, ODCase and TrpR are reduced by 79%, 78%, 62% and 55%, respectively. The reductions for the remaining three proteins are 20-44%. As expected, the four proteins experiencing the stronger crowding effects are those with larger conformational changes between open and closed states (e.g., as measured by the change in radius of gyration). Larger proteins also tend to experience stronger crowding effects than smaller ones [e.g., comparing yPDI (480 residues) and TrpR (98 residues)]. The potentials of mean force along the open-closed reaction coordinate of apo and ligand-bound ODCase are altered by crowding, suggesting that transition rates are also affected. These quantitative results and qualitative trends will
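
    The reported population shifts follow directly from the difference in crowding-induced chemical-potential change between the two states. A back-of-the-envelope sketch; the 1.56 kT figure below is back-calculated from the quoted 79% reduction for yPDI, not taken from the record:

```python
import math

def ratio_change(dmu_open, dmu_closed, kT=1.0):
    """Crowding multiplies the open:closed population ratio by
    exp(-(dmu_open - dmu_closed)/kT), where dmu is the crowding-induced
    chemical-potential change of each state."""
    return math.exp(-(dmu_open - dmu_closed) / kT)

# hypothetical numbers: the open state pays ~1.56 kT more than the closed
factor = ratio_change(dmu_open=1.56, dmu_closed=0.0)
print(round(1 - factor, 2))   # fractional reduction of the open:closed ratio
```

    Because only the difference in chemical-potential change enters, the sign and size of the shift track how much more volume the open state excludes.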

  17. Statistical causal inferences and their applications in public health research

    CERN Document Server

    Wu, Pan; Chen, Ding-Geng

    2016-01-01

    This book compiles and presents new developments in statistical causal inference. The accompanying data and computer programs are publicly available so readers may replicate the model development and data analysis presented in each chapter. In this way, methodology is taught so that readers may implement it directly. The book brings together experts engaged in causal inference research to present and discuss recent issues in causal inference methodological development. This is also a timely look at causal inference applied to scenarios that range from clinical trials to mediation and public health research more broadly. In an academic setting, this book will serve as a reference and guide to a course in causal inference at the graduate level (Master's or Doctorate). It is particularly relevant for students pursuing degrees in Statistics, Biostatistics and Computational Biology. Researchers and data analysts in public health and biomedical research will also find this book to be an important reference.

  18. The anatomy of choice: active inference and agency

    Directory of Open Access Journals (Sweden)

    Karl eFriston

    2013-09-01

    Full Text Available This paper considers agency in the setting of embodied or active inference. In brief, we associate a sense of agency with prior beliefs about action and ask what sorts of beliefs underlie optimal behaviour. In particular, we consider prior beliefs that action minimises the Kullback-Leibler divergence between desired states and attainable states in the future. This allows one to formulate bounded rationality as approximate Bayesian inference that optimises a free energy bound on model evidence. We show that constructs like expected utility, exploration bonuses, softmax choice rules and optimism bias emerge as natural consequences of this formulation. Previous accounts of active inference have focused on predictive coding and Bayesian filtering schemes for minimising free energy. Here, we consider variational Bayes as an alternative scheme that provides formal constraints on the computational anatomy of inference and action – constraints that are remarkably consistent with neuroanatomy. Furthermore, this scheme contextualises optimal decision theory and economic (utilitarian) formulations as pure inference problems. For example, expected utility theory emerges as a special case of free energy minimisation, where the sensitivity or inverse temperature (of softmax functions and quantal response equilibria) has a unique and Bayes-optimal solution – that minimises free energy. This sensitivity corresponds to the precision of beliefs about behaviour, such that attainable goals are afforded a higher precision or confidence. In turn, this means that optimal behaviour entails a representation of confidence about outcomes that are under an agent's control.

  19. The anatomy of choice: active inference and agency.

    Science.gov (United States)

    Friston, Karl; Schwartenbeck, Philipp; Fitzgerald, Thomas; Moutoussis, Michael; Behrens, Timothy; Dolan, Raymond J

    2013-01-01

    This paper considers agency in the setting of embodied or active inference. In brief, we associate a sense of agency with prior beliefs about action and ask what sorts of beliefs underlie optimal behavior. In particular, we consider prior beliefs that action minimizes the Kullback-Leibler (KL) divergence between desired states and attainable states in the future. This allows one to formulate bounded rationality as approximate Bayesian inference that optimizes a free energy bound on model evidence. We show that constructs like expected utility, exploration bonuses, softmax choice rules and optimism bias emerge as natural consequences of this formulation. Previous accounts of active inference have focused on predictive coding and Bayesian filtering schemes for minimizing free energy. Here, we consider variational Bayes as an alternative scheme that provides formal constraints on the computational anatomy of inference and action, constraints that are remarkably consistent with neuroanatomy. Furthermore, this scheme contextualizes optimal decision theory and economic (utilitarian) formulations as pure inference problems. For example, expected utility theory emerges as a special case of free energy minimization, where the sensitivity or inverse temperature (of softmax functions and quantal response equilibria) has a unique and Bayes-optimal solution that minimizes free energy. This sensitivity corresponds to the precision of beliefs about behavior, such that attainable goals are afforded a higher precision or confidence. In turn, this means that optimal behavior entails a representation of confidence about outcomes that are under an agent's control.
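
    The role of precision as an inverse temperature in the softmax choice rule is easy to make concrete; the utilities below are hypothetical:

```python
import numpy as np

def softmax_choice(utilities, precision):
    """Softmax choice rule: P(a) is proportional to exp(precision * U(a)).
    The precision (inverse temperature) encodes confidence in beliefs
    about behavior: high precision concentrates choice on the best action."""
    z = precision * np.asarray(utilities, dtype=float)
    z -= z.max()                      # numerical stability
    p = np.exp(z)
    return p / p.sum()

U = [1.0, 0.5, 0.0]
for gamma in (0.5, 4.0):              # low vs high precision
    print(gamma, np.round(softmax_choice(U, gamma), 3))
```

    At low precision the three actions are chosen nearly at random; at high precision the highest-utility action dominates, which is the sense in which attainable goals are "afforded a higher precision".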

  20. Universal Darwinism As a Process of Bayesian Inference.

    Science.gov (United States)

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counterexample to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of the generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
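
    The equivalence invoked above is the observation that a discrete-time selection (replicator) step is formally a Bayesian update with fitness playing the role of the likelihood. A three-type toy example:

```python
import numpy as np

prior = np.array([0.5, 0.3, 0.2])     # population frequencies of 3 types
fitness = np.array([1.0, 2.0, 4.0])   # plays the role of the likelihood

# discrete-time replicator / selection step: p_i' = p_i f_i / mean fitness
selection = prior * fitness / (prior * fitness).sum()

# Bayes' rule with "likelihood" = fitness gives the identical update
bayes = prior * fitness
bayes /= bayes.sum()

print(np.allclose(selection, bayes), np.round(selection, 3))
```

    The normalizer (mean fitness) corresponds to the model evidence, which is why free-energy formulations carry over.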

  1. sick: The Spectroscopic Inference Crank

    Science.gov (United States)

    Casey, Andrew R.

    2016-03-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal
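
    The grid-interpolation step at the heart of such tools can be sketched in one dimension: model intensities are linearly interpolated between neighbouring grid spectra and compared with data under a Gaussian noise model. The parameter, wavelength grid, and toy line profile below are all hypothetical stand-ins, not sick's actual models:

```python
import numpy as np

# hypothetical 1-D grid of "synthetic spectra": flux depends on a single
# parameter theta (e.g. temperature) at fixed wavelengths
theta_grid = np.linspace(4000.0, 7000.0, 31)
wave = np.linspace(500.0, 510.0, 200)

def model_flux(theta):                # stand-in for a real synthetic grid
    return 1.0 - 0.5 * np.exp(-0.5 * ((wave - 505.0) / (theta / 2e4))**2)

grid = np.array([model_flux(t) for t in theta_grid])

def interp_flux(theta):
    """Linear interpolation between neighbouring grid spectra (sick does
    this in several dimensions)."""
    i = np.searchsorted(theta_grid, theta) - 1
    w = (theta - theta_grid[i]) / (theta_grid[i + 1] - theta_grid[i])
    return (1 - w) * grid[i] + w * grid[i + 1]

def loglike(theta, data, sigma=0.02):
    r = data - interp_flux(theta)
    return -0.5 * np.sum((r / sigma)**2)

rng = np.random.default_rng(7)
data = model_flux(5600.0) + rng.normal(0, 0.02, wave.size)
thetas = np.linspace(4100.0, 6900.0, 281)
best = thetas[np.argmax([loglike(t, data) for t in thetas])]
print(best)
```

    A full tool would add nuisance parameters (redshift, broadening, continuum), an outlier mixture, and MCMC sampling of the posterior rather than a grid scan.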

  2. SICK: THE SPECTROSCOPIC INFERENCE CRANK

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Andrew R., E-mail: arc@ast.cam.ac.uk [Institute of Astronomy, University of Cambridge, Madingley Road, Cambdridge, CB3 0HA (United Kingdom)

    2016-03-15

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  3. SICK: THE SPECTROSCOPIC INFERENCE CRANK

    International Nuclear Information System (INIS)

    Casey, Andrew R.

    2016-01-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  4. The effect of macromolecular crowding on the electrostatic component of barnase-barstar binding: a computational, implicit solvent-based study.

    Directory of Open Access Journals (Sweden)

    Helena W Qi

    Full Text Available Macromolecular crowding within the cell can impact both protein folding and binding. Earlier models of cellular crowding focused on the excluded volume, entropic effect of crowding agents, which generally favors compact protein states. Recently, other effects of crowding have been explored, including enthalpically-related crowder-protein interactions and changes in solvation properties. In this work, we explore the effects of macromolecular crowding on the electrostatic desolvation and solvent-screened interaction components of protein-protein binding. Our simple model enables us to focus exclusively on the electrostatic effects of water depletion on protein binding due to crowding, providing us with the ability to systematically analyze and quantify these potentially intuitive effects. We use the barnase-barstar complex as a model system and randomly placed, uncharged spheres within implicit solvent to model crowding in an aqueous environment. On average, we find that the desolvation free energy penalties incurred by partners upon binding are lowered in a crowded environment and solvent-screened interactions are amplified. At a constant crowder density (fraction of total available volume occupied by crowders), this effect generally increases as the radius of model crowders decreases, but the strength and nature of this trend can depend on the water probe radius used to generate the molecular surface in the continuum model. In general, there is huge variation in desolvation penalties as a function of the random crowder positions. Results with explicit model crowders can be qualitatively similar to those using a lowered "effective" solvent dielectric to account for crowding, although the "best" effective dielectric constant will likely depend on multiple system properties. Taken together, this work systematically demonstrates, quantifies, and analyzes qualitative intuition-based insights into the effects of water depletion due to crowding on the

  5. On principles of inductive inference

    OpenAIRE

    Kostecki, Ryszard Paweł

    2011-01-01

    We propose an intersubjective epistemic approach to foundations of probability theory and statistical inference, based on relative entropy and category theory, and aimed to bypass the mathematical and conceptual problems of existing foundational approaches.

  6. Endocytic Uptake, Transport and Macromolecular Interactions of Anionic PAMAM Dendrimers within Lung Tissue.

    Science.gov (United States)

    Morris, Christopher J; Aljayyoussi, Ghaith; Mansour, Omar; Griffiths, Peter; Gumbleton, Mark

    2017-12-01

    Polyamidoamine (PAMAM) dendrimers are a promising class of nanocarrier with applications in both small and large molecule drug delivery. Here we report a comprehensive evaluation of the uptake and transport pathways that contribute to the lung disposition of dendrimers. Anionic PAMAM dendrimers and control dextran probes were applied to an isolated perfused rat lung (IPRL) model and lung epithelial monolayers. Endocytosis pathways were examined in primary alveolar epithelial cultures by confocal microscopy. Molecular interactions of dendrimers with protein and lipid lung fluid components were studied using small angle neutron scattering (SANS). Dendrimers were absorbed across the intact lung via a passive, size-dependent transport pathway at rates slower than dextrans of similar molecular sizes. SANS investigations of concentration-dependent PAMAM transport in the IPRL confirmed no aggregation of PAMAMs with either albumin or dipalmitoylphosphatidylcholine lung lining fluid components. Distinct endocytic compartments were identified within primary alveolar epithelial cells and their functionality in the rapid uptake of fluorescent dendrimers and model macromolecular probes was confirmed by co-localisation studies. PAMAM dendrimers display favourable lung biocompatibility but modest lung to blood absorption kinetics. These data support the investigation of dendrimer-based carriers for controlled-release drug delivery to the deep lung.

  7. Model averaging, optimal inference and habit formation

    Directory of Open Access Journals (Sweden)

    Thomas H B FitzGerald

    2014-06-01

    Full Text Available Postulating that the brain performs approximate Bayesian inference generates principled and empirically testable models of neuronal function – the subject of much current interest in neuroscience and related disciplines. Current formulations address inference and learning under a particular assumed model. In reality, organisms are often faced with an additional challenge – that of determining which model or models of their environment are the best for guiding behaviour. Bayesian model averaging – which says that an agent should weight the predictions of different models according to their evidence – provides a principled way to solve this problem. Importantly, because model evidence is determined by both the accuracy and complexity of the model, optimal inference requires that these be traded off against one another. This means an agent’s behaviour should show an equivalent balance. We hypothesise that Bayesian model averaging plays an important role in cognition, given that it is both optimal and realisable within a plausible neuronal architecture. We outline model averaging and how it might be implemented, and then explore a number of implications for brain and behaviour. In particular, we propose that model averaging can explain a number of apparently suboptimal phenomena within the framework of approximate (bounded) Bayesian inference, focussing particularly upon the relationship between goal-directed and habitual behaviour.
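The averaging rule described above can be sketched in a few lines: posterior model weights are proportional to model evidence, and predictions are mixed under those weights. The two models and all numbers below are purely hypothetical illustrations, not anything from the paper.

```python
# Bayesian model averaging: weight each model's prediction by its
# posterior model probability (proportional to model evidence).
def model_average(evidences, predictions):
    """evidences: list of p(data | model_i); predictions: list of
    p(outcome | model_i). Returns the averaged predictive probability."""
    total = sum(evidences)
    weights = [e / total for e in evidences]          # posterior model probs
    return sum(w * p for w, p in zip(weights, predictions))

# Two toy models: model A fits the data better (higher evidence),
# so its prediction dominates the average.
evidences = [0.8, 0.2]     # p(data | A), p(data | B)
predictions = [0.9, 0.1]   # each model's predicted p(outcome)
print(model_average(evidences, predictions))  # 0.8*0.9 + 0.2*0.1 = 0.74
```

Because the weights depend on evidence rather than fit alone, an overly complex model that fits well but has low evidence contributes little to the average.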

  8. Bootstrapping phylogenies inferred from rearrangement data

    Directory of Open Access Journals (Sweden)

    Lin Yu

    2012-08-01

    Full Text Available Abstract Background Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well-established probabilistic models. Results We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences.

  9. Bootstrapping phylogenies inferred from rearrangement data.

    Science.gov (United States)

    Lin, Yu; Rajan, Vaibhav; Moret, Bernard Me

    2012-08-29

    Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well-established probabilistic models. We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its support values follow a similar scale.
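The classic sequence-based bootstrap that this work uses as its benchmark can be sketched simply: resample homologous characters (alignment columns) with replacement, re-run the inference on each replicate, and report the fraction of replicates agreeing with the original estimate. This is the baseline procedure only, not the authors' rearrangement-data method; the one-site toy alignment and the majority-state "inference" below are hypothetical stand-ins for a real tree-building step.

```python
import random

# Classic bootstrap over homologous characters: resample columns with
# replacement, recompute an estimate, and report the fraction of
# replicates that agree with the estimate from the original data.
def bootstrap_support(columns, estimate, replicates=200, seed=0):
    rng = random.Random(seed)
    original = estimate(columns)
    hits = 0
    for _ in range(replicates):
        resample = [rng.choice(columns) for _ in columns]
        if estimate(resample) == original:
            hits += 1
    return hits / replicates

# Toy "inference": the majority state across columns of a 1-site alignment.
columns = ["A", "A", "A", "G", "A", "G", "A", "A"]
majority = lambda cols: max(set(cols), key=cols.count)
print(bootstrap_support(columns, majority))
```

The point the paper makes is that this resampling unit (the character) does not exist for rearrangement data, where the whole genome is one character, which is why a jackknife/bootstrap hybrid is needed.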

  10. Classification versus inference learning contrasted with real-world categories.

    Science.gov (United States)

    Jones, Erin L; Ross, Brian H

    2011-07-01

    Categories are learned and used in a variety of ways, but the research focus has been on classification learning. Recent work contrasting classification with inference learning of categories found important later differences in category performance. However, theoretical accounts differ on whether this is due to an inherent difference between the tasks or to the implementation decisions. The inherent-difference explanation argues that inference learners focus on the internal structure of the categories--what each category is like--while classification learners focus on diagnostic information to predict category membership. In two experiments, using real-world categories and controlling for earlier methodological differences, inference learners learned more about what each category was like than did classification learners, as evidenced by higher performance on a novel classification test. These results suggest that there is an inherent difference between learning new categories by classifying an item versus inferring a feature.

  11. Statistical inference via fiducial methods

    OpenAIRE

    Salomé, Diemer

    1998-01-01

    In this thesis the attention is restricted to inductive reasoning using a mathematical probability model. A statistical procedure prescribes, for every theoretically possible set of data, the inference about the unknown of interest. ... See: Summary

  12. Macromolecular query language (MMQL): prototype data model and implementation.

    Science.gov (United States)

    Shindyalov, I N; Chang, W; Pu, C; Bourne, P E

    1994-11-01

    Macromolecular query language (MMQL) is an extensible interpretive language in which to pose questions concerning the experimental or derived features of the 3-D structure of biological macromolecules. MMQL is intended to be intuitive with a simple syntax, so that from a user's perspective complex queries are easily written. A number of basic queries and a more complex query--determination of structures containing a five-strand Greek key motif--are presented to illustrate the strengths and weaknesses of the language. The predominant features of MMQL are a filter and pattern grammar which are combined to express a wide range of interesting biological queries. Filters permit the selection of object attributes, for example, compound name and resolution, whereas the patterns currently implemented query primary sequence, close contacts, hydrogen bonding, secondary structure, conformation and amino acid properties (volume, polarity, isoelectric point, hydrophobicity and different forms of exposure). MMQL queries are processed by MMQLlib, a C++ class library, to which new query methods and pattern types are easily added. The prototype implementation described uses PDBlib, another C++-based class library for representing the features of biological macromolecules at the level of detail parsable from a PDB file. Since PDBlib can represent data stored in relational and object-oriented databases, as well as PDB files, once these data are loaded they too can be queried by MMQL. Performance metrics are given for queries of PDB files for which all derived data are calculated at run time and compared to a preliminary version of OOPDB, a prototype object-oriented database with a schema based on a persistent version of PDBlib which offers more efficient data access and the potential to maintain derived information. MMQLlib, PDBlib and associated software are available via anonymous ftp from cuhhca.hhmi.columbia.edu.

  13. Information-Theoretic Inference of Large Transcriptional Regulatory Networks

    Directory of Open Access Journals (Sweden)

    Meyer Patrick

    2007-01-01

    Full Text Available The paper presents MRNET, an original method for inferring genetic networks from microarray data. The method is based on maximum relevance/minimum redundancy (MRMR), an effective information-theoretic technique for feature selection in supervised learning. The MRMR principle consists in selecting among the least redundant variables the ones that have the highest mutual information with the target. MRNET extends this feature selection principle to networks in order to infer gene-dependence relationships from microarray data. The paper assesses MRNET by benchmarking it against RELNET, CLR, and ARACNE, three state-of-the-art information-theoretic methods for large (up to several thousands of genes) network inference. Experimental results on thirty synthetically generated microarray datasets show that MRNET is competitive with these methods.
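The MRMR principle behind MRNET can be illustrated on discrete toy data: greedily pick the variable whose mutual information with the target is highest after subtracting its mean redundancy with variables already selected. This is a sketch of the selection principle only, not the MRNET implementation; the gene names and expression patterns below are hypothetical.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Mutual information (bits) between two discrete variables."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def mrmr_select(features, target, k):
    """Greedy MRMR: relevance to the target minus mean redundancy
    with the already-selected features."""
    selected, remaining = [], dict(features)
    while remaining and len(selected) < k:
        def score(name):
            relevance = mutual_information(remaining[name], target)
            if not selected:
                return relevance
            redundancy = sum(mutual_information(remaining[name], features[s])
                             for s in selected) / len(selected)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        del remaining[best]
    return selected

# Toy data: g1 tracks the target, g2 duplicates g1 (redundant),
# g3 is weakly relevant but nearly independent of g1.
target = [0, 0, 1, 1, 0, 1, 0, 1]
features = {"g1": [0, 0, 1, 1, 0, 1, 0, 0],
            "g2": [0, 0, 1, 1, 0, 1, 0, 0],
            "g3": [0, 1, 0, 1, 0, 1, 0, 1]}
print(mrmr_select(features, target, 2))  # ['g1', 'g3']
```

The redundant copy g2 is skipped in favour of g3, even though g3 is individually less informative: exactly the trade-off the MRMR criterion encodes.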

  14. Information-Theoretic Inference of Large Transcriptional Regulatory Networks

    Directory of Open Access Journals (Sweden)

    Patrick E. Meyer

    2007-06-01

    Full Text Available The paper presents MRNET, an original method for inferring genetic networks from microarray data. The method is based on maximum relevance/minimum redundancy (MRMR), an effective information-theoretic technique for feature selection in supervised learning. The MRMR principle consists in selecting among the least redundant variables the ones that have the highest mutual information with the target. MRNET extends this feature selection principle to networks in order to infer gene-dependence relationships from microarray data. The paper assesses MRNET by benchmarking it against RELNET, CLR, and ARACNE, three state-of-the-art information-theoretic methods for large (up to several thousands of genes) network inference. Experimental results on thirty synthetically generated microarray datasets show that MRNET is competitive with these methods.

  15. IMAGINE: Interstellar MAGnetic field INference Engine

    Science.gov (United States)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.

  16. Damage detection in high-rise buildings using damage-induced rotations

    International Nuclear Information System (INIS)

    Sung, Seung Hun; Jung, Ho Youn; Lee, Jung Hoon; Jung, Hyung Jo

    2016-01-01

    In this paper, a new damage-detection method based on structural vibration is proposed. The essence of the proposed method is the detection of abrupt changes in rotation. Damage-induced rotation (DIR), which is determined from the modal flexibility of the structure, initially occurs only at a specific damaged location. Therefore, damage can be localized by evaluating abrupt changes in rotation. We conducted numerical simulations of two damage scenarios using a 10-story cantilever-type building model. Measurement noise was also considered in the simulation. We compared the sensitivity of the proposed method to localize damage to that of two conventional modal-flexibility-based damage-detection methods, i.e., uniform load surface (ULS) and ULS curvature. The proposed method was able to localize damage in both damage scenarios for cantilever structures, but the conventional methods could not.
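The core idea can be sketched numerically: rotations are story-to-story slopes (first differences) of a flexibility-based deflection shape, and a local stiffness loss puts a kink in that shape, so the change in rotation between the intact and damaged states peaks at the damaged story. The six-story deflection profiles below are hypothetical illustrative numbers, not the paper's 10-story model.

```python
# Rotations are slopes (first differences) of a deflection shape derived
# from modal flexibility; damage is localized where the rotation changes
# abruptly between the intact and damaged states.
def rotations(deflections):
    return [b - a for a, b in zip(deflections, deflections[1:])]

def locate_damage(intact, damaged):
    """Index of the largest change in rotation between the two states."""
    delta = [abs(d - i) for i, d in zip(rotations(intact), rotations(damaged))]
    return max(range(len(delta)), key=delta.__getitem__)

# Hypothetical 6-story deflection shapes; the stiffness loss between
# stories 3 and 4 adds extra slope over that segment only.
intact = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
damaged = [0.0, 1.0, 2.0, 3.0, 4.6, 5.6]
print(locate_damage(intact, damaged))  # 3 -> kink between stories 3 and 4
```

Because the kink is local, the rotation change is (ideally) zero everywhere except at the damaged segment, which is what gives the method its localization power relative to smoother ULS-based indicators.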

  17. Damage detection in high-rise buildings using damage-induced rotations

    International Nuclear Information System (INIS)

    Sung, Seung Hoon; Jung, Ho Youn; Lee, Jung Hoon; Jung, Hyung Jo

    2014-01-01

    In this paper, a new damage-detection method based on structural vibration is proposed. The essence of the proposed method is the detection of abrupt changes in rotation. Damage-induced rotation (DIR), which is determined from the modal flexibility of the structure, initially occurs only at a specific damaged location. Therefore, damage can be localized by evaluating abrupt changes in rotation. We conducted numerical simulations of two damage scenarios using a 10-story cantilever-type building model. Measurement noise was also considered in the simulation. We compared the sensitivity of the proposed method to localize damage to that of two conventional modal-flexibility-based damage-detection methods, i.e., uniform load surface (ULS) and ULS curvature. The proposed method was able to localize damage in both damage scenarios for cantilever structures, but the conventional methods could not.

  18. Inferring epidemic network topology from surveillance data.

    Directory of Open Access Journals (Sweden)

    Xiang Wan

    Full Text Available The transmission of infectious diseases can be affected by many or even hidden factors, making it difficult to accurately predict when and where outbreaks may emerge. One current approach is to develop and deploy surveillance systems in an effort to detect outbreaks as early as possible. This enables policy makers to modify and implement strategies for the control of transmission. The accumulated surveillance data, including temporal, spatial, clinical, and demographic information, can provide valuable information with which to infer the underlying epidemic networks. Such networks can be quite informative and insightful as they characterize how infectious diseases transmit from one location to another. The aim of this work is to develop a computational model that allows inferences to be made regarding epidemic network topology in heterogeneous populations. We apply our model on the surveillance data from the 2009 H1N1 pandemic in Hong Kong. The inferred epidemic network displays significant effect on the propagation of infectious diseases.

  19. A Learning Algorithm for Multimodal Grammar Inference.

    Science.gov (United States)

    D'Ulizia, A; Ferri, F; Grifoni, P

    2011-12-01

    The high costs of development and maintenance of multimodal grammars in integrating and understanding input in multimodal interfaces lead to the investigation of novel algorithmic solutions in automating grammar generation and in updating processes. Many algorithms for context-free grammar inference have been developed in the natural language processing literature. An extension of these algorithms toward the inference of multimodal grammars is necessary for multimodal input processing. In this paper, we propose a novel grammar inference mechanism that allows us to learn a multimodal grammar from its positive samples of multimodal sentences. The algorithm first generates the multimodal grammar that is able to parse the positive samples of sentences and, afterward, makes use of two learning operators and the minimum description length metrics in improving the grammar description and in avoiding the over-generalization problem. The experimental results highlight the acceptable performance of the proposed algorithm, which has a very high probability of parsing valid sentences.

  20. Measurement of damage in systemic vasculitis: a comparison of the Vasculitis Damage Index with the Combined Damage Assessment Index

    DEFF Research Database (Denmark)

    Suppiah, Ravi; Flossman, Oliver; Mukhtyar, Chetan

    2011-01-01

    To compare the Vasculitis Damage Index (VDI) with the Combined Damage Assessment Index (CDA) as measures of damage from vasculitis.

  1. Bayesian Inference of High-Dimensional Dynamical Ocean Models

    Science.gov (United States)

    Lin, J.; Lermusiaux, P. F. J.; Lolla, S. V. T.; Gupta, A.; Haley, P. J., Jr.

    2015-12-01

    This presentation addresses a holistic set of challenges in high-dimension ocean Bayesian nonlinear estimation: i) predict the probability distribution functions (pdfs) of large nonlinear dynamical systems using stochastic partial differential equations (PDEs); ii) assimilate data using Bayes' law with these pdfs; iii) predict the future data that optimally reduce uncertainties; and iv) rank the known and learn the new model formulations themselves. Overall, we allow the joint inference of the state, equations, geometry, boundary conditions and initial conditions of dynamical models. Examples are provided for time-dependent fluid and ocean flows, including cavity, double-gyre and Strait flows with jets and eddies. The Bayesian model inference, based on limited observations, is illustrated first by the estimation of obstacle shapes and positions in fluid flows. Next, the Bayesian inference of biogeochemical reaction equations and of their states and parameters is presented, illustrating how PDE-based machine learning can rigorously guide the selection and discovery of complex ecosystem models. Finally, the inference of multiscale bottom gravity current dynamics is illustrated, motivated in part by classic overflows and dense water formation sites and their relevance to climate monitoring and dynamics. This is joint work with our MSEAS group at MIT.

  2. Hybrid Optical Inference Machines

    Science.gov (United States)

    1991-09-27


  3. A Network Inference Workflow Applied to Virulence-Related Processes in Salmonella typhimurium

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Ronald C.; Singhal, Mudita; Weller, Jennifer B.; Khoshnevis, Saeed; Shi, Liang; McDermott, Jason E.

    2009-04-20

    Inference of the structure of mRNA transcriptional regulatory networks, protein regulatory or interaction networks, and protein activation/inactivation-based signal transduction networks are critical tasks in systems biology. In this article we discuss a workflow for the reconstruction of parts of the transcriptional regulatory network of the pathogenic bacterium Salmonella typhimurium based on the information contained in sets of microarray gene expression data now available for that organism, and describe our results obtained by following this workflow. The primary tool is one of the network inference algorithms deployed in the Software Environment for BIological Network Inference (SEBINI). Specifically, we selected the algorithm called Context Likelihood of Relatedness (CLR), which uses the mutual information contained in the gene expression data to infer regulatory connections. The associated analysis pipeline automatically stores the inferred edges from the CLR runs within SEBINI and, upon request, transfers the inferred edges into either Cytoscape or the plug-in Collective Analysis of Biological Interaction Networks (CABIN) tool for further post-analysis of the inferred regulatory edges. The following article presents the outcome of this workflow, as well as the protocols followed for microarray data collection, data cleansing, and network inference. Our analysis revealed several interesting interactions, functional groups, metabolic pathways, and regulons in S. typhimurium.
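The CLR scoring step mentioned above can be sketched as follows: each pairwise mutual information value is compared against the background distribution of MI values involving each of the two genes, and the two (clipped) z-scores are combined. This is a hedged illustration of the published CLR idea, not the SEBINI implementation; the small symmetric MI matrix is hypothetical precomputed data.

```python
from math import sqrt

# CLR-style scoring: z-score each MI value against the background of MI
# values involving each gene, clip negative z-scores to zero, and combine.
def clr_scores(mi):
    n = len(mi)
    def stats(i):
        row = [mi[i][j] for j in range(n) if j != i]
        mean = sum(row) / len(row)
        var = sum((v - mean) ** 2 for v in row) / len(row)
        return mean, sqrt(var)
    scores = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            mi_mean, mi_sd = stats(i)
            mj_mean, mj_sd = stats(j)
            zi = max(0.0, (mi[i][j] - mi_mean) / mi_sd) if mi_sd else 0.0
            zj = max(0.0, (mi[i][j] - mj_mean) / mj_sd) if mj_sd else 0.0
            scores[i][j] = sqrt(zi * zi + zj * zj)
    return scores

# Hypothetical MI matrix: genes 0 and 1 share high MI, gene 2 is background.
mi = [[0.0, 0.9, 0.1],
      [0.9, 0.0, 0.1],
      [0.1, 0.1, 0.0]]
scores = clr_scores(mi)
print(scores[0][1] > scores[0][2])  # True: the strong pair stands out
```

The background normalization is what distinguishes CLR from thresholding raw MI: a moderately high MI value scores low if the gene involved has high MI with everything.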

  4. Training Inference Making Skills Using a Situation Model Approach Improves Reading Comprehension

    Directory of Open Access Journals (Sweden)

    Lisanne eBos

    2016-02-01

    Full Text Available This study aimed to enhance third and fourth graders’ text comprehension at the situation model level. Therefore, we tested a reading strategy training developed to target inference making skills, which are widely considered to be pivotal to situation model construction. The training was grounded in contemporary literature on situation model-based inference making and addressed the source (text-based versus knowledge-based), type (necessary versus unnecessary for (re-)establishing coherence), and depth of an inference (making single lexical inferences versus combining multiple lexical inferences), as well as the type of searching strategy (forward versus backward). Results indicated that, compared to a control group (n = 51), children who followed the experimental training (n = 67) improved their inference making skills supportive to situation model construction. Importantly, our training also resulted in increased levels of general reading comprehension and motivation. In sum, this study showed that a ‘level of text representation’ approach can provide a useful framework to teach inference making skills to third and fourth graders.

  5. Robust Demographic Inference from Genomic and SNP Data

    Science.gov (United States)

    Excoffier, Laurent; Dupanloup, Isabelle; Huerta-Sánchez, Emilia; Sousa, Vitor C.; Foll, Matthieu

    2013-01-01

    We introduce a flexible and robust simulation-based framework to infer demographic parameters from the site frequency spectrum (SFS) computed on large genomic datasets. We show that our composite-likelihood approach allows one to study evolutionary models of arbitrary complexity, which cannot be tackled by other current likelihood-based methods. For simple scenarios, our approach compares favorably in terms of accuracy and speed with the current reference in the field, while showing better convergence properties for complex models. We first apply our methodology to non-coding genomic SNP data from four human populations. To infer their demographic history, we compare neutral evolutionary models of increasing complexity, including unsampled populations. We further show the versatility of our framework by extending it to the inference of demographic parameters from SNP chips with known ascertainment, such as that recently released by Affymetrix to study human origins. Whereas previous ways of handling ascertained SNPs were either restricted to a single population or only allowed the inference of divergence time between a pair of populations, our framework can correctly infer parameters of more complex models including the divergence of several populations, bottlenecks and migration. We apply this approach to the reconstruction of African demography using two distinct ascertained human SNP panels studied under two evolutionary models. The two SNP panels lead to globally very similar estimates and confidence intervals, and suggest an ancient divergence (>110 Ky) between Yoruba and San populations. Our methodology appears well suited to the study of complex scenarios from large genomic data sets. PMID:24204310
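The inference machinery itself is complex, but the summary statistic it fits, the site frequency spectrum, is simple to compute: for each site, count how many sampled chromosomes carry the derived allele. A minimal sketch on a hypothetical 0/1 haplotype matrix (rows = chromosomes, columns = sites):

```python
# SFS[k] = number of sites at which exactly k of the n sampled
# chromosomes carry the derived (1) allele.
def site_frequency_spectrum(haplotypes):
    n = len(haplotypes)
    n_sites = len(haplotypes[0])
    sfs = [0] * (n + 1)
    for site in range(n_sites):
        count = sum(row[site] for row in haplotypes)
        sfs[count] += 1
    return sfs

# Hypothetical data: 4 chromosomes, 4 sites.
haplotypes = [
    [0, 1, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
]
print(site_frequency_spectrum(haplotypes))  # [0, 3, 0, 0, 1]
```

Demographic events leave characteristic signatures in this vector (e.g. an excess of singletons after expansion), which is what makes it a useful fitting target for simulation-based composite likelihood.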

  6. Universal Darwinism as a process of Bayesian inference

    Directory of Open Access Journals (Sweden)

    John Oberon Campbell

    2016-06-01

    Full Text Available Many of the mathematical frameworks describing natural selection are equivalent to Bayes’ Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counterexample to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an ‘experiment’ in the external world environment, and the results of that ‘experiment’ or the ‘surprise’ entailed by predicted and actual outcomes of the ‘experiment’. Minimization of free energy implies that the implicit measure of ‘surprise’ experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
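The equivalence claimed above rests on one algebraic fact: the discrete replicator update (new frequency ∝ old frequency × fitness) has the same form as Bayes' rule (posterior ∝ prior × likelihood), with mean fitness playing the role of the evidence. A minimal sketch with hypothetical numbers:

```python
# One discrete replicator step == one Bayesian update:
# new frequency ∝ old frequency × fitness (≡ posterior ∝ prior × likelihood).
def update(frequencies, fitness):
    weighted = [f * w for f, w in zip(frequencies, fitness)]
    total = sum(weighted)  # mean fitness, playing the role of the evidence
    return [w / total for w in weighted]

prior = [0.5, 0.3, 0.2]    # variant frequencies / prior beliefs
fitness = [1.0, 2.0, 0.5]  # relative fitnesses / likelihoods
posterior = update(prior, fitness)
print(posterior)  # ~[0.4167, 0.5, 0.0833]: the fitter variant spreads
```

Reading the same three lines as population genetics or as statistics is precisely the identification the article builds on.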

  7. Behavior Intention Derivation of Android Malware Using Ontology Inference

    Directory of Open Access Journals (Sweden)

    Jian Jiao

    2018-01-01

    Full Text Available Previous research on Android malware has mainly focused on malware detection, and the evolution of malware means that detection inevitably lags behind. The information presented by these detection results (malice judgment, family classification, and behavior characterization) is limited for analysts. Therefore, a method is needed to restore the intention of malware, which reflects the relation between the multiple behaviors of complex malware and its ultimate purpose. This paper proposes a novel description and derivation model of Android malware intention based on the theory of intention and malware reverse engineering. This approach creates an ontology for malware intention to model the semantic relation between behaviors and their objects and automates the process of intention derivation by using SWRL rules transformed from the intention model and the Jess inference engine. Experiments on 75 typical samples show that the inference system can perform derivation of malware intention effectively, and 89.3% of the inference results are consistent with manual analysis, which proves the feasibility and effectiveness of our theory and inference system.

  8. Genealogical and evolutionary inference with the human Y chromosome.

    Science.gov (United States)

    Stumpf, M P; Goldstein, D B

    2001-03-02

    Population genetics has emerged as a powerful tool for unraveling human history. In addition to the study of mitochondrial and autosomal DNA, attention has recently focused on Y-chromosome variation. Ambiguities and inaccuracies in data analysis, however, pose an important obstacle to further development of the field. Here we review the methods available for genealogical inference using Y-chromosome data. Approaches can be divided into those that do and those that do not use an explicit population model in genealogical inference. We describe the strengths and weaknesses of these model-based and model-free approaches, as well as difficulties associated with the mutation process that affect both methods. In the case of genealogical inference using microsatellite loci, we use coalescent simulations to show that relatively simple generalizations of the mutation process can greatly increase the accuracy of genealogical inference. Because model-free and model-based approaches have different biases and limitations, we conclude that there is considerable benefit in the continued use of both types of approaches.

  9. SDG multiple fault diagnosis by real-time inverse inference

    International Nuclear Information System (INIS)

    Zhang Zhaoqian; Wu Chongguang; Zhang Beike; Xia Tao; Li Anfeng

    2005-01-01

    In the past 20 years, the signed directed graph (SDG), a qualitative simulation technique, has been widely applied in the field of chemical fault diagnosis. However, most former researchers assumed a single fault origin, which leads to combinatorial explosion and has limited the application of SDG to real processes. This is mainly because most former researchers used the forward inference engine of commercial expert-system software to carry out inverse diagnostic inference on the SDG model, which violates the internal principle of the diagnosis mechanism. In this paper, we present a new SDG multiple-fault diagnosis method based on real-time inverse inference, in which the inference engine is genuinely inverse and diagnoses multiple faults in the true sense. Finally, we give an example of a 65 t/h furnace diagnosis system to demonstrate its applicability and efficiency.

  10. SDG multiple fault diagnosis by real-time inverse inference

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Zhaoqian; Wu Chongguang; Zhang Beike; Xia Tao; Li Anfeng

    2005-02-01

    In the past 20 years, the signed directed graph (SDG), a qualitative simulation technique, has been widely applied in the field of chemical fault diagnosis. However, most former researchers assumed a single fault origin, which leads to combinatorial explosion and has limited the application of SDG to real processes. This is mainly because most former researchers used the forward inference engine of commercial expert-system software to carry out inverse diagnostic inference on the SDG model, which violates the internal principle of the diagnosis mechanism. In this paper, we present a new SDG multiple-fault diagnosis method based on real-time inverse inference, in which the inference engine is genuinely inverse and diagnoses multiple faults in the true sense. Finally, we give an example of a 65 t/h furnace diagnosis system to demonstrate its applicability and efficiency.
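The inverse-inference idea can be sketched generically: starting from an observed deviation, walk the signed arcs backwards, multiplying signs along the way, to collect the root-cause deviations consistent with the symptom. This is an illustrative sketch of backward traversal on a signed directed graph, not the authors' diagnosis engine; the three-node process graph is hypothetical.

```python
# Inverse inference on a signed directed graph: from an observed
# deviation, traverse arcs backwards, propagating signs, to collect
# candidate root causes consistent with the symptom.
def backward_candidates(edges, node, sign, visited=None):
    """edges: {(cause, effect): +1 or -1}. Returns {cause: required sign}."""
    visited = visited or set()
    causes = {}
    for (src, dst), arc_sign in edges.items():
        if dst == node and src not in visited:
            required = sign * arc_sign
            causes[src] = required
            causes.update(backward_candidates(edges, src, required,
                                              visited | {node}))
    return causes

# Hypothetical SDG: opening the valve (+) raises flow, flow (+) raises
# the tank level, a leak (-) lowers the level. Observed: level high (+1).
edges = {("valve", "flow"): +1, ("flow", "level"): +1, ("leak", "level"): -1}
print(backward_candidates(edges, "level", +1))
```

The result reads: a high flow, an opened valve, or a reduced leak could each explain the high level; a multiple-fault diagnoser would then test which combination of these candidates is consistent with all observed deviations at once.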

  11. Functional networks inference from rule-based machine learning models.

    Science.gov (United States)

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables, that often are different/complementary to the similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify the samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process.
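The FuNeL premise, genes that appear together in the same classification rule may be functionally related, translates directly into network construction: each rule contributes an edge between every pair of genes it uses, and repeated co-usage strengthens the edge. The rules and gene names below are hypothetical illustrations, not FuNeL output.

```python
from itertools import combinations

# Build a weighted network from rule-based model structure: every pair of
# genes co-occurring in a rule gets an edge; weights count co-occurrences.
def rules_to_network(rules):
    """rules: list of gene-name lists. Returns {(gene_a, gene_b): count}."""
    edges = {}
    for genes in rules:
        for pair in combinations(sorted(set(genes)), 2):
            edges[pair] = edges.get(pair, 0) + 1
    return edges

# Hypothetical rules extracted from a trained rule-based classifier.
rules = [["TP53", "BRCA1"], ["TP53", "BRCA1", "PTEN"], ["PTEN", "AKT1"]]
print(rules_to_network(rules))
```

Note how this differs from co-expression: two genes need never be expressed at similar levels to be linked; they only need to be jointly useful for classification.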

  12. Causal inference based on counterfactuals

    Directory of Open Access Journals (Sweden)

    Höfler M

    2005-09-01

    Full Text Available Abstract Background The counterfactual or potential outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in health sciences and relates to many statistical procedures. Summary Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept.
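The potential-outcome logic can be made concrete with a small simulation: every simulated unit carries both counterfactual outcomes, so the true average treatment effect is computable exactly, and one can see how confounded observational assignment biases the naive contrast while randomization does not. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Potential outcomes: Y0 (untreated), Y1 (treated); true effect = 2.
confounder = rng.normal(size=n)
y0 = confounder + rng.normal(size=n)
y1 = y0 + 2.0

# True average treatment effect, computable only because the simulation
# gives us both counterfactuals for every unit.
ate = np.mean(y1 - y0)

# Confounded observational assignment: units with a high confounder
# value are more likely to be treated.
t_obs = (confounder + rng.normal(size=n)) > 0
naive = y1[t_obs].mean() - y0[~t_obs].mean()

# Randomized assignment breaks the confounding.
t_rct = rng.random(n) < 0.5
rct = y1[t_rct].mean() - y0[~t_rct].mean()

print(round(ate, 2), round(naive, 2), round(rct, 2))
```

The naive observational contrast overstates the effect because treatment status carries information about the confounder; the randomized contrast recovers the true effect, which is the "fundamental barrier" the abstract refers to when only observational data are available.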

  13. Nondestructive damage detection and evaluation technique for seismically damaged structures

    Science.gov (United States)

    Adachi, Yukio; Unjoh, Shigeki; Kondoh, Masuo; Ohsumi, Michio

    1999-02-01

    The development of quantitative damage detection and evaluation techniques, including techniques that can detect invisible damage in structures, is required according to the lessons learned from the 1995 Hyogo-ken Nanbu earthquake. In this study, two quantitative damage-sensing techniques for highway bridge structures are proposed. The first is to measure changes in the vibration characteristics of the bridge structure. In a shaking-table damage detection test on a damaged bridge column, this method successfully detected the changes in vibration characteristics caused by damage progressing under incremental excitations. The second is to use self-diagnosing intelligent materials. In a reinforced concrete beam specimen test, this method detected damage through the rupture of embedded intelligent sensors, such as optical fibers or carbon fiber reinforced plastic rods.
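The first, vibration-based method rests on the fact that damage reduces stiffness and therefore lowers a structure's natural frequencies. A minimal sketch of detecting such a shift from accelerometer records via the FFT peak (the signals and the 12 Hz and 10.5 Hz frequencies are synthetic stand-ins, not values from the study):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the dominant frequency (Hz) of a vibration record,
    taken as the peak of its FFT magnitude spectrum."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[0] = 0.0                        # ignore the DC component
    return freqs[np.argmax(spec)]

fs = 1000.0                              # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
intact  = np.sin(2 * np.pi * 12.0 * t)   # intact natural frequency: 12 Hz
damaged = np.sin(2 * np.pi * 10.5 * t)   # stiffness loss lowers it

f_intact = dominant_frequency(intact, fs)
f_damaged = dominant_frequency(damaged, fs)
print(f_intact, f_damaged)   # the downward shift flags damage
```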

  14. Vertical Distributions of Macromolecular Composition of Particulate Organic Matter in the Water Column of the Amundsen Sea Polynya During the Summer in 2014

    Science.gov (United States)

    Kim, Bo Kyung; Lee, SangHoon; Ha, Sun-Yong; Jung, Jinyoung; Kim, Tae Wan; Yang, Eun Jin; Jo, Naeun; Lim, Yu Jeong; Park, Jisoo; Lee, Sang Heon

    2018-02-01

    Macromolecular compositions (carbohydrates, proteins, and lipids) of particulate organic matter (POM) are crucial as a measure of basic marine food quality. To date, however, only one such investigation has been carried out in the Amundsen Sea. Water samples for macromolecular composition were obtained at seven selected stations in the Amundsen Sea Polynya (AP) during the austral summer of 2014 to investigate the vertical characteristics of POM. We found a high proportion of carbohydrates (45.9 ± 11.4%) in the photic layer, significantly different from the previous result (27.9 ± 6.9%) obtained in the AP in 2012. A plausible explanation is that the carbohydrate content is strongly associated with the biomass of the dominant species (Phaeocystis antarctica). The calorific content of food material (FM) in the photic layer obtained in this study is similar to that of the Ross Sea, one of the regions of highest primary productivity in the Southern Ocean. Total concentrations, calorific values, and calorific contents of FM were higher in the photic layer than in the aphotic layer, which implies that a significant fraction of organic matter underwent degradation. A decreasing protein/carbohydrate (PRT/CHO) ratio with depth could be caused by preferential nitrogen loss during sinking. Since the biochemical compositions of POM, mostly fixed in the photic layer, could play an important role in transporting organic carbon into the deep sea, further detailed studies on the variations in biochemical composition and the main controlling factors are needed to understand the sinking mechanisms of POM.

  15. The Effect of Attractive Interactions and Macromolecular Crowding on Crystallins Association.

    Directory of Open Access Journals (Sweden)

    Jiachen Wei

    Full Text Available In living systems proteins are typically found in crowded environments where their effective interactions strongly depend on the surrounding medium. Yet, their association and dissociation needs to be robustly controlled in order to enable biological function. Uncontrolled protein aggregation often causes disease. For instance, cataract is caused by the clustering of lens proteins, i.e., crystallins, resulting in enhanced light scattering and impaired vision or blindness. To investigate the molecular origins of cataract formation and to design efficient treatments, a better understanding of crystallin association in macromolecularly crowded environments is needed. Here we present a theoretical study of simple coarse-grained colloidal models to characterize the general features of how the association equilibrium of proteins depends on the magnitude of intermolecular attraction. By comparing the analytic results to the available experimental data on the osmotic pressure in crystallin solutions, we identify the effective parameter regimes applicable to crystallins. Moreover, the combination of two models allows us to predict that the number of binding sites on crystallin is small, i.e. one to three per protein, which is different from previous estimates. We further observe that the crowding factor is sensitive to the size asymmetry between the reactants and crowding agents, the shape of the protein clusters, and to small variations of intermolecular attraction. Our work may provide general guidelines on how to steer the protein interactions in order to control their association.

  16. Implementing and analyzing the multi-threaded LP-inference

    Science.gov (United States)

    Bolotova, S. Yu; Trofimenko, E. V.; Leschinskaya, M. V.

    2018-03-01

    Logical production equations provide new possibilities for backward inference optimization in intelligent production-type systems. The strategy of relevant backward inference aims to minimize the number of queries to an external information source (either a database or an interactive user). The idea of the method is based on computing the set of initial preimages and searching for the true preimage. Each stage can be organized independently and in parallel, and the actual work at a given stage can also be distributed between parallel computers. This paper is devoted to parallel algorithms of relevant inference based on an advanced “pipeline” scheme of parallel computation, which makes it possible to increase the degree of parallelism. The authors also provide some details of the LP-structures implementation.

  17. International Conference on Trends and Perspectives in Linear Statistical Inference

    CERN Document Server

    Rosen, Dietrich

    2018-01-01

    This volume features selected contributions on a variety of topics related to linear statistical inference. The peer-reviewed papers from the International Conference on Trends and Perspectives in Linear Statistical Inference (LinStat 2016) held in Istanbul, Turkey, 22-25 August 2016, cover topics in both theoretical and applied statistics, such as linear models, high-dimensional statistics, computational statistics, the design of experiments, and multivariate analysis. The book is intended for statisticians, Ph.D. students, and professionals who are interested in statistical inference.

  18. Packaging design as communicator of product attributes: Effects on consumers’ attribute inferences

    NARCIS (Netherlands)

    van Ooijen, I.

    2016-01-01

    This dissertation will focus on two types of attribute inferences that result from packaging design cues. First, the effects of product packaging design on quality related inferences are investigated. Second, the effects of product packaging design on healthiness related inferences are examined (See

  19. Surrogate based approaches to parameter inference in ocean models

    KAUST Repository

    Knio, Omar

    2016-01-06

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.
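The MCMC branch of the approach can be sketched as follows: a cheap surrogate replaces the expensive ocean model inside a random-walk Metropolis loop. The quadratic surrogate, observation value, noise level, and prior bounds below are invented stand-ins (a real surrogate would be, e.g., a polynomial chaos expansion fitted by non-intrusive spectral projection), not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical quadratic surrogate of a model response as a function of
# a drag coefficient theta.
def surrogate(theta):
    return 1.0 + 0.5 * theta + 0.1 * theta**2

obs, sigma = 2.1, 0.05           # synthetic observation and noise level

def log_post(theta):             # flat prior on [0, 4]
    if not 0.0 <= theta <= 4.0:
        return -np.inf
    return -0.5 * ((surrogate(theta) - obs) / sigma) ** 2

# Random-walk Metropolis on the cheap surrogate instead of the full model.
theta, samples = 1.0, []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + 0.2 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

posterior_mean = np.mean(samples[5000:])   # discard burn-in
print(posterior_mean)
```

Every posterior evaluation here costs one polynomial evaluation rather than one ocean-model run, which is the point of the surrogate-based formulation.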

  20. Fast and scalable inference of multi-sample cancer lineages.

    KAUST Repository

    Popic, Victoria; Salari, Raheleh; Hajirasouliha, Iman; Kashef-Haghighi, Dorna; West, Robert B; Batzoglou, Serafim

    2015-01-01

    Somatic variants can be used as lineage markers for the phylogenetic reconstruction of cancer evolution. Since somatic phylogenetics is complicated by sample heterogeneity, novel specialized tree-building methods are required for cancer phylogeny reconstruction. We present LICHeE (Lineage Inference for Cancer Heterogeneity and Evolution), a novel method that automates the phylogenetic inference of cancer progression from multiple somatic samples. LICHeE uses variant allele frequencies of somatic single nucleotide variants obtained by deep sequencing to reconstruct multi-sample cell lineage trees and infer the subclonal composition of the samples. LICHeE is open source and available at http://viq854.github.io/lichee .

  1. Fast and scalable inference of multi-sample cancer lineages.

    KAUST Repository

    Popic, Victoria

    2015-05-06

    Somatic variants can be used as lineage markers for the phylogenetic reconstruction of cancer evolution. Since somatic phylogenetics is complicated by sample heterogeneity, novel specialized tree-building methods are required for cancer phylogeny reconstruction. We present LICHeE (Lineage Inference for Cancer Heterogeneity and Evolution), a novel method that automates the phylogenetic inference of cancer progression from multiple somatic samples. LICHeE uses variant allele frequencies of somatic single nucleotide variants obtained by deep sequencing to reconstruct multi-sample cell lineage trees and infer the subclonal composition of the samples. LICHeE is open source and available at http://viq854.github.io/lichee .

  2. Surrogate based approaches to parameter inference in ocean models

    KAUST Repository

    Knio, Omar

    2016-01-01

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.

  3. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength...... and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise...
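The benchmark system referred to above, a pair of unidirectionally (or bidirectionally) coupled logistic maps, can be generated as follows. The parameter values are typical choices for CCM experiments, not necessarily those used in the study.

```python
import numpy as np

def coupled_logistic(n, beta_xy=0.0, beta_yx=0.3, rx=3.8, ry=3.5, seed=0):
    """Generate two logistic maps where X drives Y with strength beta_yx
    (and optionally Y drives X with beta_xy), a standard CCM benchmark."""
    rng = np.random.default_rng(seed)
    x, y = rng.random(2)
    xs, ys = [], []
    for _ in range(n):
        # Simultaneous update of both maps.
        x, y = (x * (rx - rx * x - beta_xy * y),
                y * (ry - ry * y - beta_yx * x))
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

xs, ys = coupled_logistic(1000)
print(xs[:3], ys[:3])
```

Sweeping `beta_yx` and adding observation noise to `xs`/`ys` reproduces the kind of coupling-strength and noise-level grid on which CCM's cross-mapping fidelity is evaluated.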

  4. A Foreign Object Damage Event Detector Data Fusion System for Turbofan Engines

    Science.gov (United States)

    Turso, James A.; Litt, Jonathan S.

    2004-01-01

    A Data Fusion System designed to provide a reliable assessment of the occurrence of Foreign Object Damage (FOD) in a turbofan engine is presented. The FOD-event feature level fusion scheme combines knowledge of shifts in engine gas path performance obtained using a Kalman filter, with bearing accelerometer signal features extracted via wavelet analysis, to positively identify a FOD event. A fuzzy inference system provides basic probability assignments (bpa) based on features extracted from the gas path analysis and bearing accelerometers to a fusion algorithm based on the Dempster-Shafer-Yager Theory of Evidence. Details are provided on the wavelet transforms used to extract the foreign object strike features from the noisy data and on the Kalman filter-based gas path analysis. The system is demonstrated using a turbofan engine combined-effects model (CEM), providing both gas path and rotor dynamic structural response, and is suitable for rapid-prototyping of control and diagnostic systems. The fusion of the disparate data can provide significantly more reliable detection of a FOD event than the use of either method alone. The use of fuzzy inference techniques combined with Dempster-Shafer-Yager Theory of Evidence provides a theoretical justification for drawing conclusions based on imprecise or incomplete data.
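The evidence-combination step can be illustrated with Dempster's rule applied to two basic probability assignments over the frame {FOD, no FOD}. The masses below are invented for illustration; in the paper the bpas come from the fuzzy inference system operating on gas-path and accelerometer features.

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping
    frozenset hypotheses to mass) with Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb       # mass on contradictory pairs
    k = 1.0 - conflict                    # normalise out the conflict
    return {h: m / k for h, m in combined.items()}

FOD, NO_FOD = frozenset({"fod"}), frozenset({"no_fod"})
EITHER = FOD | NO_FOD                     # ignorance: either hypothesis

# Hypothetical bpas: each source lends partial support to a FOD event,
# with leftover mass assigned to ignorance.
gas_path = {FOD: 0.6, EITHER: 0.4}
accel    = {FOD: 0.7, NO_FOD: 0.1, EITHER: 0.2}

fused = dempster_combine(gas_path, accel)
print(fused)
```

The fused belief in FOD exceeds either source's alone, which is how agreement between two imprecise sensors yields a more reliable detection than either method in isolation.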

  5. Making Inferences in Adulthood: Falling Leaves Mean It's Fall.

    Science.gov (United States)

    Zandi, Taher; Gregory, Monica E.

    1988-01-01

    Assessed age differences in making inferences from prose. Older adults correctly answered mean of 10 questions related to implicit information and 8 related to explicit information. Young adults answered mean of 7 implicit and 12 explicit information questions. In spite of poorer recall of factual details, older subjects made inferences to greater…

  6. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the

  7. Baselines and test data for cross-lingual inference

    DEFF Research Database (Denmark)

    Agic, Zeljko; Schluter, Natalie

    2018-01-01

    The recent years have seen a revival of interest in textual entailment, sparked by i) the emergence of powerful deep neural network learners for natural language processing and ii) the timely development of large-scale evaluation datasets such as SNLI. Recast as natural language inference......, the problem now amounts to detecting the relation between pairs of statements: they either contradict or entail one another, or they are mutually neutral. Current research in natural language inference is effectively exclusive to English. In this paper, we propose to advance the research in SNLI-style natural...... language inference toward multilingual evaluation. To that end, we provide test data for four major languages: Arabic, French, Spanish, and Russian. We experiment with a set of baselines. Our systems are based on cross-lingual word embeddings and machine translation. While our best system scores an average...

  8. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject Examples drawn from ecology and wildlife research An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference Companion website with analyt...

  9. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  10. Intracranial EEG correlates of implicit relational inference within the hippocampus.

    Science.gov (United States)

    Reber, T P; Do Lam, A T A; Axmacher, N; Elger, C E; Helmstaedter, C; Henke, K; Fell, J

    2016-01-01

    Drawing inferences from past experiences enables adaptive behavior in future situations. Inference has been shown to depend on hippocampal processes. Usually, inference is considered a deliberate and effortful mental act which happens during retrieval, and requires the focus of our awareness. Recent fMRI studies hint at the possibility that some forms of hippocampus-dependent inference can also occur during encoding and possibly also outside of awareness. Here, we sought to further explore the feasibility of hippocampal implicit inference, and specifically address the temporal evolution of implicit inference using intracranial EEG. Presurgical epilepsy patients with hippocampal depth electrodes viewed a sequence of word pairs, and judged the semantic fit between two words in each pair. Some of the word pairs entailed a common word (e.g., "winter-red," "red-cat") such that an indirect relation was established in following word pairs (e.g., "winter-cat"). The behavioral results suggested that drawing inferences implicitly from past experience is feasible, because indirect relations seemed to foster "fit" judgments while the absence of indirect relations fostered "do not fit" judgments, even though the participants were unaware of the indirect relations. An event-related potential (ERP) difference emerging 400 ms post-stimulus was evident in the hippocampus during encoding, suggesting that indirect relations were already established automatically during encoding of the overlapping word pairs. Further ERP differences emerged later post-stimulus (1,500 ms), were modulated by the participants' responses and were evident during encoding and test. Furthermore, response-locked ERP effects were evident at test. These ERP effects could hence be a correlate of the interaction of implicit memory with decision-making.
Together, the data map out a time-course in which the hippocampus automatically integrates memories from discrete but related episodes to implicitly influence future

  11. Estimating mountain basin-mean precipitation from streamflow using Bayesian inference

    Science.gov (United States)

    Henn, Brian; Clark, Martyn P.; Kavetski, Dmitri; Lundquist, Jessica D.

    2015-10-01

    Estimating basin-mean precipitation in complex terrain is difficult due to uncertainty in the topographical representativeness of precipitation gauges relative to the basin. To address this issue, we use Bayesian methodology coupled with a multimodel framework to infer basin-mean precipitation from streamflow observations, and we apply this approach to snow-dominated basins in the Sierra Nevada of California. Using streamflow observations, forcing data from lower-elevation stations, the Bayesian Total Error Analysis (BATEA) methodology and the Framework for Understanding Structural Errors (FUSE), we infer basin-mean precipitation, and compare it to basin-mean precipitation estimated using topographically informed interpolation from gauges (PRISM, the Parameter-elevation Regression on Independent Slopes Model). The BATEA-inferred spatial patterns of precipitation show agreement with PRISM in terms of the rank of basins from wet to dry but differ in absolute values. In some of the basins, these differences may reflect biases in PRISM, because some implied PRISM runoff ratios may be inconsistent with the regional climate. We also infer annual time series of basin precipitation using a two-step calibration approach. Assessment of the precision and robustness of the BATEA approach suggests that uncertainty in the BATEA-inferred precipitation is primarily related to uncertainties in hydrologic model structure. Despite these limitations, time series of inferred annual precipitation under different model and parameter assumptions are strongly correlated with one another, suggesting that this approach is capable of resolving year-to-year variability in basin-mean precipitation.

  12. Feature inference with uncertain categorization: Re-assessing Anderson's rational model.

    Science.gov (United States)

    Konovalova, Elizaveta; Le Mens, Gaël

    2017-09-18

    A key function of categories is to help predictions about unobserved features of objects. At the same time, humans are often in situations where the categories of the objects they perceive are uncertain. In an influential paper, Anderson (Psychological Review, 98(3), 409-429, 1991) proposed a rational model for feature inferences with uncertain categorization. A crucial feature of this model is the conditional independence assumption: it assumes that the within-category feature correlation is zero. In prior research, this model has been found to provide a poor fit to participants' inferences. This evidence is restricted to task environments inconsistent with the conditional independence assumption. Currently available evidence thus provides little information about how this model would fit participants' inferences in a setting with conditional independence. In four experiments based on a novel paradigm and one experiment based on an existing paradigm, we assess the performance of Anderson's model under conditional independence. We find that this model predicts participants' inferences better than competing models. One model assumes that inferences are based on just the most likely category. The second model is insensitive to categories but sensitive to overall feature correlation. The performance of Anderson's model is evidence that inferences were influenced not only by the more likely category but also by the other candidate category. Our findings suggest that a version of Anderson's model which relaxes the conditional independence assumption will likely perform well in environments characterized by within-category feature correlation.
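The contrast between Anderson's model and the most-likely-category rule reduces to one line of arithmetic: the feature prediction averages the per-category feature probabilities, weighted by the category posteriors, rather than using only the winning category. A minimal sketch with invented numbers:

```python
def infer_feature(p_cat_given_obj, p_feat_given_cat):
    """Anderson-style feature inference under uncertain categorization:
    average per-category feature probabilities weighted by category
    posteriors (conditional independence within categories assumed)."""
    return sum(p_cat_given_obj[c] * p_feat_given_cat[c]
               for c in p_cat_given_obj)

# Hypothetical numbers: category A is more likely, but B still
# contributes to the prediction.
p_cat = {"A": 0.7, "B": 0.3}
p_feat = {"A": 0.9, "B": 0.2}

full = infer_feature(p_cat, p_feat)          # 0.7*0.9 + 0.3*0.2 = 0.69
single = p_feat[max(p_cat, key=p_cat.get)]   # most-likely-category rule
print(full, single)
```

The gap between the two predictions (0.69 vs. 0.9 here) is exactly what lets the experiments discriminate whether participants' inferences are influenced by the less likely candidate category.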

  13. Integrating distributed Bayesian inference and reinforcement learning for sensor management

    NARCIS (Netherlands)

    Grappiolo, C.; Whiteson, S.; Pavlin, G.; Bakker, B.

    2009-01-01

    This paper introduces a sensor management approach that integrates distributed Bayesian inference (DBI) and reinforcement learning (RL). DBI is implemented using distributed perception networks (DPNs), a multiagent approach to performing efficient inference, while RL is used to automatically

  14. Reliability of dose volume constraint inference from clinical data

    Science.gov (United States)

    Lutz, C. M.; Møller, D. S.; Hoffmann, L.; Knap, M. M.; Alber, M.

    2017-04-01

    Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an ‘ideal’ cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The Bootstrap was repeated for a ‘non-ideal’ cohort, where the most predictive model did not coincide with the postulated model. The Bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort size 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85% were achieved only by cohorts with more than 500 patients. Both Bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than Bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
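The core of the experiment, refitting a logistic dose-response model on resampled cohorts to see how variable the inferred parameters are, can be sketched as follows. The dose metric, true coefficients, cohort size, and fitting routine are illustrative; the study used 102 real NSCLC dose distributions and a full DVHP model search.

```python
import numpy as np

def fit_logistic(x, y, iters=1000, lr=1.0):
    """Fit P(complication) = sigmoid(b0 + b1*x) by batch gradient
    ascent on the log-likelihood (no regularisation)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
        b0 += lr * np.mean(y - p)
        b1 += lr * np.mean((y - p) * x)
    return b0, b1

rng = np.random.default_rng(3)
n = 200
dose = rng.uniform(0, 1, n)               # normalised DVH dose metric
p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 4.0 * dose)))
tox = (rng.random(n) < p_true).astype(float)

# Refit on bootstrap-resampled cohorts to gauge the stability of the
# inferred dose-response slope at this cohort size.
slopes = []
for _ in range(100):
    idx = rng.integers(0, n, n)
    slopes.append(fit_logistic(dose[idx], tox[idx])[1])

lo, hi = np.percentile(slopes, [2.5, 97.5])
print(lo, hi)
```

A wide bootstrap interval for the slope is the single-parameter analogue of the "chaotic" DVHP inference the paper reports at small cohort sizes.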

  15. Inference in "poor" languages

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, S. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    Languages with a solvable implication problem but without complete and consistent systems of inference rules ("poor" languages) are considered. The problem of existence of a finite, complete, and consistent inference rule system for a "poor" language is stated independently of the language or the rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  16. Inference of beliefs and emotions in patients with Alzheimer's disease.

    Science.gov (United States)

    Zaitchik, Deborah; Koff, Elissa; Brownell, Hiram; Winner, Ellen; Albert, Marilyn

    2006-01-01

    The present study compared 20 patients with mild to moderate Alzheimer's disease with 20 older controls (ages 69-94 years) on their ability to make inferences about emotions and beliefs in others. Six tasks tested their ability to make 1st-order and 2nd-order inferences as well as to offer explanations and moral evaluations of human action by appeal to emotions and beliefs. Results showed that the ability to infer emotions and beliefs in 1st-order tasks remains largely intact in patients with mild to moderate Alzheimer's. Patients were able to use mental states in the prediction, explanation, and moral evaluation of behavior. Impairment on 2nd-order tasks involving inference of mental states was equivalent to impairment on control tasks, suggesting that patients' difficulty is secondary to their cognitive impairments. ((c) 2006 APA, all rights reserved).

  17. Inference method using bayesian network for diagnosis of pulmonary nodules

    International Nuclear Information System (INIS)

    Kawagishi, Masami; Iizuka, Yoshio; Yamamoto, Hiroyuki; Yakami, Masahiro; Kubo, Takeshi; Fujimoto, Koji; Togashi, Kaori

    2010-01-01

    This report describes the improvements of a naive Bayes model that infers the diagnosis of pulmonary nodules in chest CT images based on the findings obtained when a radiologist interprets the CT images. We have previously introduced an inference model using a naive Bayes classifier and have reported its clinical value based on evaluation using clinical data. In the present report, we introduce the following improvements to the original inference model: the selection of findings based on correlations and the generation of a model using only these findings, and the introduction of classifiers that integrate several simple classifiers each of which is specialized for specific diagnosis. These improvements were found to increase the inference accuracy by 10.4% (p<.01) as compared to the original model in 100 cases (222 nodules) based on leave-one-out evaluation. (author)
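The underlying naive Bayes computation, a posterior over diagnoses from binary findings under a conditional independence assumption, can be sketched as follows. The findings, priors, and likelihoods are invented for illustration and are not values from the paper.

```python
import math

def naive_bayes_posterior(findings, priors, likelihoods):
    """Posterior over diagnoses given binary radiological findings,
    assuming findings are conditionally independent given the
    diagnosis (naive Bayes)."""
    log_post = {}
    for d, prior in priors.items():
        lp = math.log(prior)
        for f, present in findings.items():
            p = likelihoods[d][f]                 # P(finding | diagnosis)
            lp += math.log(p if present else 1.0 - p)
        log_post[d] = lp
    z = sum(math.exp(v) for v in log_post.values())
    return {d: math.exp(v) / z for d, v in log_post.items()}

# Hypothetical example: two findings a radiologist might record.
priors = {"malignant": 0.3, "benign": 0.7}
likelihoods = {
    "malignant": {"spiculation": 0.8, "calcification": 0.2},
    "benign":    {"spiculation": 0.1, "calcification": 0.6},
}
post = naive_bayes_posterior({"spiculation": True, "calcification": False},
                             priors, likelihoods)
print(post)
```

The paper's improvements amount to choosing which findings enter this product (by correlation-based selection) and combining several such simple classifiers, each specialised for one diagnosis.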

  18. Generative inference for cultural evolution.

    Science.gov (United States)

    Kandler, Anne; Powell, Adam

    2018-04-05

    One of the major challenges in cultural evolution is to understand why and how various forms of social learning are used in human populations, both now and in the past. To date, much of the theoretical work on social learning has been done in isolation of data, and consequently many insights focus on revealing the learning processes or the distributions of cultural variants that are expected to have evolved in human populations. In population genetics, recent methodological advances have allowed a greater understanding of the explicit demographic and/or selection mechanisms that underlie observed allele frequency distributions across the globe, and their change through time. In particular, generative frameworks-often using coalescent-based simulation coupled with approximate Bayesian computation (ABC)-have provided robust inferences on the human past, with no reliance on a priori assumptions of equilibrium. Here, we demonstrate the applicability and utility of generative inference approaches to the field of cultural evolution. The framework advocated here uses observed population-level frequency data directly to establish the likely presence or absence of particular hypothesized learning strategies. In this context, we discuss the problem of equifinality and argue that, in the light of sparse cultural data and the multiplicity of possible social learning processes, the exclusion of those processes inconsistent with the observed data might be the most instructive outcome. Finally, we summarize the findings of generative inference approaches applied to a number of case studies.This article is part of the theme issue 'Bridging cultural gaps: interdisciplinary studies in human cultural evolution'. © 2018 The Author(s).
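The generative inference idea can be sketched with the simplest possible ABC rejection scheme: draw parameters from the prior, simulate data, and keep draws whose simulated summary statistic lies close to the observed one. The binomial adoption model, counts, and tolerance below are illustrative, far simpler than the simulators discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Observed data: counts of a cultural variant in a sample; the "model"
# is a binomial adoption process with unknown adoption probability.
n_obs, k_obs = 100, 63

def simulate(theta):
    return rng.binomial(n_obs, theta)

# ABC rejection: accept parameter draws whose simulated summary
# statistic falls within a tolerance of the observed one.
accepted = []
while len(accepted) < 2000:
    theta = rng.random()                  # uniform prior on [0, 1]
    if abs(simulate(theta) - k_obs) <= 3:
        accepted.append(theta)

theta_mean = float(np.mean(accepted))
print(theta_mean)
```

Replacing the binomial simulator with a model of a particular social learning strategy, and comparing the acceptance rates of competing simulators, is the model-choice use of ABC the abstract advocates.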

  19. Inferring Domain Plans in Question-Answering

    National Research Council Canada - National Science Library

    Pollack, Martha E

    1986-01-01

    The importance of plan inference in models of conversation has been widely noted in the computational-linguistics literature, and its incorporation in question-answering systems has enabled a range...

  20. Bayesian inference for hybrid discrete-continuous stochastic kinetic models

    International Nuclear Information System (INIS)

    Sherlock, Chris; Golightly, Andrew; Gillespie, Colin S

    2014-01-01

    We consider the problem of efficiently performing simulation and inference for stochastic kinetic models. Whilst it is possible to work directly with the resulting Markov jump process (MJP), computational cost can be prohibitive for networks of realistic size and complexity. In this paper, we consider an inference scheme based on a novel hybrid simulator that classifies reactions as either ‘fast’ or ‘slow’ with fast reactions evolving as a continuous Markov process whilst the remaining slow reaction occurrences are modelled through an MJP with time-dependent hazards. A linear noise approximation (LNA) of fast reaction dynamics is employed and slow reaction events are captured by exploiting the ability to solve the stochastic differential equation driving the LNA. This simulation procedure is used as a proposal mechanism inside a particle MCMC scheme, thus allowing Bayesian inference for the model parameters. We apply the scheme to a simple application and compare the output with an existing hybrid approach and also a scheme for performing inference for the underlying discrete stochastic model. (paper)
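
    The Markov jump process underlying such models can be simulated exactly with Gillespie's algorithm; the sketch below does so for a toy birth-death network (the article's hybrid LNA/MJP simulator and particle MCMC scheme are not reproduced here, and all rates are illustrative):

```python
import math
import random

def gillespie_birth_death(x0, birth, death, t_max, seed=0):
    """Exact stochastic simulation (Gillespie) of a birth-death Markov
    jump process: X -> X+1 at rate birth*X, X -> X-1 at rate death*X."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]
    while t < t_max and x > 0:
        a_birth, a_death = birth * x, death * x
        a_total = a_birth + a_death
        # Exponential waiting time until the next reaction event
        t += -math.log(1.0 - rng.random()) / a_total
        # Choose which reaction fires, proportional to its hazard
        if rng.random() < a_birth / a_total:
            x += 1
        else:
            x -= 1
        path.append((t, x))
    return path

path = gillespie_birth_death(x0=20, birth=1.0, death=1.0, t_max=5.0)
```

The hybrid scheme in the paper replaces exactly this kind of event-by-event simulation for the fast reactions, which is what makes it tractable for larger networks.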

  1. Comparative Study of Inference Methods for Bayesian Nonnegative Matrix Factorisation

    DEFF Research Database (Denmark)

    Brouwer, Thomas; Frellsen, Jes; Liò, Pietro

    2017-01-01

    In this paper, we study the trade-offs of different inference approaches for Bayesian matrix factorisation methods, which are commonly used for predicting missing values, and for finding patterns in the data. In particular, we consider Bayesian nonnegative variants of matrix factorisation and tri-factorisation, and compare non-probabilistic inference, Gibbs sampling, variational Bayesian inference, and a maximum-a-posteriori approach. The variational approach is new for the Bayesian nonnegative models. We compare their convergence, and robustness to noise and sparsity of the data, on both synthetic and real...
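
    The non-probabilistic baseline of the kind compared in the paper is classical NMF via Lee-Seung multiplicative updates; a pure-Python sketch on a toy matrix follows (the Bayesian variants, Gibbs sampling and variational inference, are beyond this snippet, and the data are illustrative):

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def nmf(V, k, iters=200, seed=0):
    """Nonnegative matrix factorisation V ~ W H via Lee-Seung
    multiplicative updates, minimising squared reconstruction error.
    Updates preserve nonnegativity and never increase the error."""
    rng = random.Random(seed)
    m, n = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(k)]
    eps = 1e-9
    for _ in range(iters):
        Wt = transpose(W)
        num, den = matmul(Wt, V), matmul(matmul(Wt, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)]
             for i in range(k)]
        Ht = transpose(H)
        num, den = matmul(V, Ht), matmul(W, matmul(H, Ht))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(m)]
    return W, H

def sq_error(V, W, H):
    WH = matmul(W, H)
    return sum((V[i][j] - WH[i][j]) ** 2
               for i in range(len(V)) for j in range(len(V[0])))

V = [[1.0, 2.0, 0.0], [2.0, 4.0, 0.0], [0.0, 0.0, 3.0]]  # rank-2, nonnegative
W, H = nmf(V, k=2)
```

The Bayesian versions studied in the paper place priors on W and H and replace these point updates with posterior inference, trading speed for robustness to noise and sparsity.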

  2. Multi-Agent Inference in Social Networks: A Finite Population Learning Approach.

    Science.gov (United States)

    Fan, Jianqing; Tong, Xin; Zeng, Yao

    When people in a society want to make inference about some parameter, each person may want to use data collected by other people. Information (data) exchange in social networks is usually costly, so to make reliable statistical decisions, people need to trade off the benefits and costs of information acquisition. Conflicts of interests and coordination problems will arise in the process. Classical statistics does not consider people's incentives and interactions in the data collection process. To address this imperfection, this work explores multi-agent Bayesian inference problems with a game theoretic social network model. Motivated by our interest in aggregate inference at the societal level, we propose a new concept, finite population learning, to address whether, with high probability, a large fraction of people in a given finite population network can make "good" inference. Serving as a foundation, this concept enables us to study the long run trend of aggregate inference quality as population grows.

  3. Role of Utility and Inference in the Evolution of Functional Information

    Science.gov (United States)

    Sharov, Alexei A.

    2009-01-01

    Functional information means an encoded network of functions in living organisms from molecular signaling pathways to an organism’s behavior. It is represented by two components: code and an interpretation system, which together form a self-sustaining semantic closure. Semantic closure allows some freedom between components because small variations of the code are still interpretable. The interpretation system consists of inference rules that control the correspondence between the code and the function (phenotype) and determines the shape of the fitness landscape. The utility factor operates at multiple time scales: short-term selection drives evolution towards higher survival and reproduction rate within a given fitness landscape, and long-term selection favors those fitness landscapes that support adaptability and lead to evolutionary expansion of certain lineages. Inference rules make short-term selection possible by shaping the fitness landscape and defining possible directions of evolution, but they are under control of the long-term selection of lineages. Communication normally occurs within a set of agents with compatible interpretation systems, which I call communication system. Functional information cannot be directly transferred between communication systems with incompatible inference rules. Each biological species is a genetic communication system that carries unique functional information together with inference rules that determine evolutionary directions and constraints. This view of the relation between utility and inference can resolve the conflict between realism/positivism and pragmatism. Realism overemphasizes the role of inference in evolution of human knowledge because it assumes that logic is embedded in reality. Pragmatism substitutes usefulness for truth and therefore ignores the advantage of inference. The proposed concept of evolutionary pragmatism rejects the idea that logic is embedded in reality; instead, inference rules are

  4. Inference for shared-frailty survival models with left-truncated data

    NARCIS (Netherlands)

    van den Berg, G.J.; Drepper, B.

    2016-01-01

    Shared-frailty survival models specify that systematic unobserved determinants of duration outcomes are identical within groups of individuals. We consider random-effects likelihood-based statistical inference if the duration data are subject to left-truncation. Such inference with left-truncated

  5. Tsunami vulnerability and damage assessment in the coastal area of Rabat and Salé, Morocco

    Directory of Open Access Journals (Sweden)

    A. Atillah

    2011-12-01

    This study, a companion paper to Renou et al. (2011), focuses on the application of a GIS-based method to assess building vulnerability and damage in the event of a tsunami affecting the coastal area of Rabat and Salé, Morocco. This approach, designed within the framework of the European SCHEMA project (www.schemaproject.org), is based on the combination of hazard results from numerical modelling of the worst-case tsunami scenario (inundation depth based on the historical Lisbon earthquake of 1755 and the Portugal earthquake of 1969) together with vulnerability building types derived from Earth Observation data, field surveys and GIS data. The risk is then evaluated for this densely populated area, characterized by a vast project of residential and touristic buildings within the flat area of the Bouregreg Valley separating the cities of Rabat and Salé. A GIS tool is used to derive building damage maps by crossing layers of inundation levels and building vulnerability. The inferred damage maps serve as a base for elaborating evacuation plans with appropriate rescue and relief processes, and for preparing appropriate measures to mitigate the induced tsunami risk.

  6. Parametric inference for biological sequence analysis.

    Science.gov (United States)

    Pachter, Lior; Sturmfels, Bernd

    2004-11-16

    One of the major successes in computational biology has been the unification, by using the graphical model formalism, of a multitude of algorithms for annotating and comparing biological sequences. Graphical models that have been applied to these problems include hidden Markov models for annotation, tree models for phylogenetics, and pair hidden Markov models for alignment. A single algorithm, the sum-product algorithm, solves many of the inference problems that are associated with different statistical models. This article introduces the polytope propagation algorithm for computing the Newton polytope of an observation from a graphical model. This algorithm is a geometric version of the sum-product algorithm and is used to analyze the parametric behavior of maximum a posteriori inference calculations for graphical models.
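
    On an HMM chain, the sum-product algorithm mentioned above reduces to the familiar forward algorithm for the marginal likelihood; a toy sketch (the model and numbers are illustrative, and the article's polytope propagation variant is not shown):

```python
def forward(obs, states, start, trans, emit):
    """Sum-product on an HMM chain (forward algorithm): computes the
    marginal likelihood P(obs) by summing over all hidden state paths."""
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit[s][o] * sum(alpha[r] * trans[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

# Toy two-state model: a fair ('F') versus loaded ('L') coin
states = ('F', 'L')
start = {'F': 0.6, 'L': 0.4}
trans = {'F': {'F': 0.8, 'L': 0.2}, 'L': {'F': 0.3, 'L': 0.7}}
emit = {'F': {'H': 0.5, 'T': 0.5}, 'L': {'H': 0.9, 'T': 0.1}}
likelihood = forward(('H', 'H', 'T'), states, start, trans, emit)
```

Polytope propagation replaces the sum and product in this recursion with operations on Newton polytopes, which is what lets it characterise how the MAP answer varies with the parameters.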

  7. A technique for determining the deuterium/hydrogen contrast map in neutron macromolecular crystallography.

    Science.gov (United States)

    Chatake, Toshiyuki; Fujiwara, Satoru

    2016-01-01

    A difference in the neutron scattering length between hydrogen and deuterium leads to a high density contrast in neutron Fourier maps. In this study, a technique for determining the deuterium/hydrogen (D/H) contrast map in neutron macromolecular crystallography is developed and evaluated using ribonuclease A. The contrast map between the D2O-solvent and H2O-solvent crystals is calculated in real space, rather than in reciprocal space as performed in previous neutron D/H contrast crystallography. The present technique can thus utilize all of the amplitudes of the neutron structure factors for both D2O-solvent and H2O-solvent crystals. The neutron D/H contrast maps clearly demonstrate the powerful detectability of H/D exchange in proteins. In fact, alternative protonation states and alternative conformations of hydroxyl groups are observed at medium resolution (1.8 Å). Moreover, water molecules can be categorized into three types according to their tendency towards rotational disorder. These results directly indicate improvement in the neutron crystal structure analysis. This technique is suitable for incorporation into the standard structure-determination process used in neutron protein crystallography; consequently, more precise and efficient determination of the D-atom positions is possible using a combination of this D/H contrast technique and standard neutron structure-determination protocols.

  8. Improved Inference of Heteroscedastic Fixed Effects Models

    Directory of Open Access Journals (Sweden)

    Afshan Saeed

    2016-12-01

    Heteroscedasticity is a severe problem that distorts estimation and testing in panel data models (PDM). Arellano (1987) proposed the White (1980) estimator for PDMs with heteroscedastic errors, but it provides erroneous inference for data sets that include high leverage points. In this paper, our attempt is to improve the heteroscedasticity-consistent covariance matrix estimator (HCCME) for panel data sets with high leverage points. To draw robust inference for the PDM, our focus is on improving the kernel bootstrap estimators proposed by Racine and MacKinnon (2007). A Monte Carlo scheme is used for assessment of the results.
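
    For intuition, White's (1980) heteroscedasticity-consistent "sandwich" estimator can be sketched for a simple regression y = b0 + b1*x (a plain HC0 version; the leverage corrections and kernel bootstrap refinements discussed in the paper are not shown, and the data below are illustrative):

```python
def hc0_se(x, y):
    """OLS for y = b0 + b1*x with White's heteroscedasticity-consistent
    (HC0) standard errors: cov(b) = (X'X)^{-1} X' diag(e^2) X (X'X)^{-1}."""
    n = len(x)
    X = [[1.0, xi] for xi in x]
    # X'X and its 2x2 inverse
    sxx = [[sum(r[i] * r[j] for r in X) for j in range(2)] for i in range(2)]
    det = sxx[0][0] * sxx[1][1] - sxx[0][1] * sxx[1][0]
    inv = [[sxx[1][1] / det, -sxx[0][1] / det],
           [-sxx[1][0] / det, sxx[0][0] / det]]
    # OLS coefficients b = (X'X)^{-1} X'y
    xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(2)]
    b = [sum(inv[i][j] * xty[j] for j in range(2)) for i in range(2)]
    resid = [y[k] - b[0] - b[1] * x[k] for k in range(n)]
    # 'Meat' of the sandwich: X' diag(e^2) X
    meat = [[sum(resid[k] ** 2 * X[k][i] * X[k][j] for k in range(n))
             for j in range(2)] for i in range(2)]
    # Sandwich covariance and its diagonal standard errors
    cov = [[sum(inv[i][a] * meat[a][c] * inv[c][j]
                for a in range(2) for c in range(2))
            for j in range(2)] for i in range(2)]
    return b, [cov[0][0] ** 0.5, cov[1][1] ** 0.5]

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 4.9, 8.3, 10.8, 14.5, 16.9]   # roughly linear, unequal noise
coef, robust_se = hc0_se(x, y)
```

High leverage points inflate the influence of individual e^2 terms in the "meat", which is exactly the failure mode the paper's improved estimators target.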

  9. A Bayesian Framework That Integrates Heterogeneous Data for Inferring Gene Regulatory Networks

    Energy Technology Data Exchange (ETDEWEB)

    Santra, Tapesh, E-mail: tapesh.santra@ucd.ie [Systems Biology Ireland, University College Dublin, Dublin (Ireland)

    2014-05-20

    Reconstruction of gene regulatory networks (GRNs) from experimental data is a fundamental challenge in systems biology. A number of computational approaches have been developed to infer GRNs from mRNA expression profiles. However, expression profiles alone are proving to be insufficient for inferring GRN topologies with reasonable accuracy. Recently, it has been shown that integration of external data sources (such as gene and protein sequence information, gene ontology data, protein–protein interactions) with mRNA expression profiles may increase the reliability of the inference process. Here, I propose a new approach that incorporates transcription factor binding sites (TFBS) and physical protein interactions (PPI) among transcription factors (TFs) in a Bayesian variable selection (BVS) algorithm which can infer GRNs from mRNA expression profiles subjected to genetic perturbations. Using real experimental data, I show that the integration of TFBS and PPI data with mRNA expression profiles leads to significantly more accurate networks than those inferred from expression profiles alone. Additionally, the performance of the proposed algorithm is compared with a series of least absolute shrinkage and selection operator (LASSO) regression-based network inference methods that can also incorporate prior knowledge in the inference framework. The results of this comparison suggest that BVS can outperform LASSO regression-based method in some circumstances.

  10. Inference of neuronal network spike dynamics and topology from calcium imaging data

    Directory of Open Access Journals (Sweden)

    Henry Lütcke

    2013-12-01

    Two-photon calcium imaging enables functional analysis of neuronal circuits by inferring action potential (AP) occurrence ('spike trains') from cellular fluorescence signals. It remains unclear how experimental parameters such as signal-to-noise ratio (SNR) and acquisition rate affect spike inference and whether additional information about network structure can be extracted. Here we present a simulation framework for quantitatively assessing how well spike dynamics and network topology can be inferred from noisy calcium imaging data. For simulated AP-evoked calcium transients in neocortical pyramidal cells, we analyzed the quality of spike inference as a function of SNR and data acquisition rate using a recently introduced peeling algorithm. Given experimentally attainable values of SNR and acquisition rate, neural spike trains could be reconstructed accurately and with up to millisecond precision. We then applied statistical neuronal network models to explore how remaining uncertainties in spike inference affect estimates of network connectivity and topological features of network organization. We define the experimental conditions suitable for inferring whether the network has a scale-free structure and determine how well hub neurons can be identified. Our findings provide a benchmark for future calcium imaging studies that aim to reliably infer neuronal network properties.
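
    A greatly simplified, noiseless sketch of the peeling idea follows: repeatedly find the largest residual fluorescence peak, record a spike there, and subtract a single-AP transient template. The published algorithm handles noise, indicator kinetics and calibration that are omitted here, and all numbers are illustrative:

```python
import math

def peel_spikes(trace, tau, amp, threshold):
    """Greedy 'peeling'-style spike inference: while the residual trace
    still contains a peak above threshold, place a spike at the peak and
    subtract an exponentially decaying single-AP calcium template."""
    residual = list(trace)
    spikes = []
    while True:
        t = max(range(len(residual)), key=residual.__getitem__)
        if residual[t] < threshold:
            break
        spikes.append(t)
        for i in range(t, len(residual)):        # subtract the template
            residual[i] -= amp * math.exp(-(i - t) / tau)
    return sorted(spikes)

# Noiseless synthetic trace: two APs at frames 10 and 40, decay tau = 5
tau, amp = 5.0, 1.0
true_spikes = [10, 40]
trace = [sum(amp * math.exp(-(t - s) / tau) for s in true_spikes if t >= s)
         for t in range(60)]
inferred = peel_spikes(trace, tau, amp, threshold=0.5)
```

With noise and finite acquisition rates, the quality of this reconstruction degrades, which is precisely the dependence the simulation framework quantifies.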

  11. A Bayesian Framework That Integrates Heterogeneous Data for Inferring Gene Regulatory Networks

    International Nuclear Information System (INIS)

    Santra, Tapesh

    2014-01-01

    Reconstruction of gene regulatory networks (GRNs) from experimental data is a fundamental challenge in systems biology. A number of computational approaches have been developed to infer GRNs from mRNA expression profiles. However, expression profiles alone are proving to be insufficient for inferring GRN topologies with reasonable accuracy. Recently, it has been shown that integration of external data sources (such as gene and protein sequence information, gene ontology data, protein–protein interactions) with mRNA expression profiles may increase the reliability of the inference process. Here, I propose a new approach that incorporates transcription factor binding sites (TFBS) and physical protein interactions (PPI) among transcription factors (TFs) in a Bayesian variable selection (BVS) algorithm which can infer GRNs from mRNA expression profiles subjected to genetic perturbations. Using real experimental data, I show that the integration of TFBS and PPI data with mRNA expression profiles leads to significantly more accurate networks than those inferred from expression profiles alone. Additionally, the performance of the proposed algorithm is compared with a series of least absolute shrinkage and selection operator (LASSO) regression-based network inference methods that can also incorporate prior knowledge in the inference framework. The results of this comparison suggest that BVS can outperform LASSO regression-based method in some circumstances.

  12. Cortical hierarchies perform Bayesian causal inference in multisensory perception.

    Directory of Open Access Journals (Sweden)

    Tim Rohe

    2015-02-01

    To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the "causal inference problem." Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.
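
    The behavioural model referred to above can be sketched numerically: the posterior over a common versus independent cause follows standard Gaussian formulas (after Kording et al. 2007, which this line of work builds on); all parameter values here are illustrative:

```python
import math

def posterior_common_cause(x_a, x_v, sigma_a, sigma_v, sigma_p, p_common):
    """Bayesian Causal Inference for two cues: posterior probability that
    auditory and visual measurements x_a, x_v arose from a common source,
    assuming Gaussian sensory noise and a zero-mean Gaussian spatial prior."""
    va, vv, vp = sigma_a ** 2, sigma_v ** 2, sigma_p ** 2
    # Likelihood of (x_a, x_v) under a single shared source (C = 1)
    z1 = va * vv + va * vp + vv * vp
    like_c1 = (math.exp(-0.5 * ((x_a - x_v) ** 2 * vp
                                + x_a ** 2 * vv + x_v ** 2 * va) / z1)
               / (2 * math.pi * math.sqrt(z1)))
    # Likelihood under two independent sources (C = 2)
    like_c2 = (math.exp(-0.5 * (x_a ** 2 / (va + vp)
                                + x_v ** 2 / (vv + vp)))
               / (2 * math.pi * math.sqrt((va + vp) * (vv + vp))))
    return like_c1 * p_common / (like_c1 * p_common
                                 + like_c2 * (1 - p_common))

p_close = posterior_common_cause(0.1, -0.1, 1.0, 1.0, 10.0, 0.5)  # nearby cues
p_far = posterior_common_cause(5.0, -5.0, 1.0, 1.0, 10.0, 0.5)    # discrepant cues
```

Nearby cues yield a high posterior for a common cause (favouring fusion), while widely discrepant cues favour segregation, mirroring the segregation/fusion stages of the cortical hierarchy described in the abstract.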

  13. Memory-Based Simple Heuristics as Attribute Substitution: Competitive Tests of Binary Choice Inference Models

    Science.gov (United States)

    Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro

    2017-01-01

    Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in…

  14. Hierarchical Active Inference: A Theory of Motivated Control.

    Science.gov (United States)

    Pezzulo, Giovanni; Rigoli, Francesco; Friston, Karl J

    2018-04-01

    Motivated control refers to the coordination of behaviour to achieve affectively valenced outcomes or goals. The study of motivated control traditionally assumes a distinction between control and motivational processes, which map to distinct (dorsolateral versus ventromedial) brain systems. However, the respective roles and interactions between these processes remain controversial. We offer a novel perspective that casts control and motivational processes as complementary aspects - goal propagation and prioritization, respectively - of active inference and hierarchical goal processing under deep generative models. We propose that the control hierarchy propagates prior preferences or goals, but their precision is informed by the motivational context, inferred at different levels of the motivational hierarchy. The ensuing integration of control and motivational processes underwrites action and policy selection and, ultimately, motivated behaviour, by enabling deep inference to prioritize goals in a context-sensitive way. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  15. SPEEDY: An Eclipse-based IDE for invariant inference

    Directory of Open Access Journals (Sweden)

    David R. Cok

    2014-04-01

    SPEEDY is an Eclipse-based IDE for exploring techniques that assist users in generating correct specifications, particularly including invariant inference algorithms and tools. It integrates with several back-end tools that propose invariants and will incorporate published algorithms for inferring object and loop invariants. Though the architecture is language-neutral, current SPEEDY targets C programs. Building and using SPEEDY has confirmed earlier experience demonstrating the importance of showing and editing specifications in the IDEs that developers customarily use, automating as much of the production and checking of specifications as possible, and showing counterexample information directly in the source code editing environment. As in previous work, automation of specification checking is provided by back-end SMT solvers. However, reducing the effort demanded of software developers using formal methods also requires a GUI design that guides users in writing, reviewing, and correcting specifications and automates specification inference.

  16. Non-robust dynamic inferences from macroeconometric models: Bifurcation stratification of confidence regions

    Science.gov (United States)

    Barnett, William A.; Duzhak, Evgeniya Aleksandrovna

    2008-06-01

    Grandmont [J.M. Grandmont, On endogenous competitive business cycles, Econometrica 53 (1985) 995-1045] found that the parameter space of the most classical dynamic models is stratified into an infinite number of subsets supporting an infinite number of different kinds of dynamics, from monotonic stability at one extreme to chaos at the other extreme, and with many forms of multiperiodic dynamics in between. The econometric implications of Grandmont’s findings are particularly important, if bifurcation boundaries cross the confidence regions surrounding parameter estimates in policy-relevant models. Stratification of a confidence region into bifurcated subsets seriously damages robustness of dynamical inferences. Recently, interest in policy in some circles has moved to New-Keynesian models. As a result, in this paper we explore bifurcation within the class of New-Keynesian models. We develop the econometric theory needed to locate bifurcation boundaries in log-linearized New-Keynesian models with Taylor policy rules or inflation-targeting policy rules. Central results needed in this research are our theorems on the existence and location of Hopf bifurcation boundaries in each of the cases that we consider.

  17. Efficient Exact Inference With Loss Augmented Objective in Structured Learning.

    Science.gov (United States)

    Bauer, Alexander; Nakajima, Shinichi; Muller, Klaus-Robert

    2016-08-19

    Structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives arising from a special type of high-order potential with a decomposable internal structure. As an important application, our method covers loss augmented inference, which enables the slack and margin scaling formulations of structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.

  18. A general Bayes Weibull inference model for accelerated life testing

    International Nuclear Information System (INIS)

    Dorp, J. Rene van; Mazzuchi, Thomas A.

    2005-01-01

    This article presents the development of a general Bayes inference model for accelerated life testing. The failure times at a constant stress level are assumed to belong to a Weibull distribution, but the specification of strict adherence to a parametric time-transformation function is not required. Rather, prior information is used to indirectly define a multivariate prior distribution for the scale parameters at the various stress levels and the common shape parameter. Using the approach, Bayes point estimates as well as probability statements for use-stress (and accelerated) life parameters may be inferred from a host of testing scenarios. The inference procedure accommodates both the interval data sampling strategy and type I censored sampling strategy for the collection of ALT test data. The inference procedure uses the well-known MCMC (Markov Chain Monte Carlo) methods to derive posterior approximations. The approach is illustrated with an example.

  19. A linear programming model for protein inference problem in shotgun proteomics.

    Science.gov (United States)

    Huang, Ting; He, Zengyou

    2012-11-15

    Assembling peptides identified from tandem mass spectra into a list of proteins, referred to as protein inference, is an important issue in shotgun proteomics. The objective of protein inference is to find a subset of proteins that are truly present in the sample. Although many methods have been proposed for protein inference, several issues such as peptide degeneracy still remain unsolved. In this article, we present a linear programming model for protein inference. In this model, we use a transformation of the joint probability that each peptide/protein pair is present in the sample as the variable. Then, both the peptide probability and protein probability can be expressed as a formula in terms of the linear combination of these variables. Based on this simple fact, the protein inference problem is formulated as an optimization problem: minimize the number of proteins with non-zero probabilities under the constraint that the difference between the calculated peptide probability and the peptide probability generated from peptide identification algorithms should be less than some threshold. This model addresses the peptide degeneracy issue by forcing some joint probability variables involving degenerate peptides to be zero in a rigorous manner. The corresponding inference algorithm is named as ProteinLP. We test the performance of ProteinLP on six datasets. Experimental results show that our method is competitive with the state-of-the-art protein inference algorithms. The source code of our algorithm is available at: https://sourceforge.net/projects/prolp/. zyhe@dlut.edu.cn. Supplementary data are available at Bioinformatics Online.
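
    To see what the optimization is trying to do, here is a toy, brute-force "parsimony" version of protein inference: find the smallest protein set explaining every identified peptide. The article instead relaxes the problem to a linear program over joint peptide/protein probabilities, which scales to realistic data; this exhaustive sketch (with invented peptide and protein names) is only feasible for tiny instances:

```python
from itertools import combinations

def minimal_protein_cover(peptides, protein_map):
    """Smallest subset of proteins such that every identified peptide
    maps to at least one protein in the subset (exhaustive search)."""
    proteins = sorted({p for ps in protein_map.values() for p in ps})
    for size in range(1, len(proteins) + 1):
        for subset in combinations(proteins, size):
            if all(set(protein_map[pep]) & set(subset) for pep in peptides):
                return set(subset)
    return set()

# Degenerate peptide 'pep2' maps to two proteins; parsimony resolves it
protein_map = {'pep1': ['A'], 'pep2': ['A', 'B'], 'pep3': ['C']}
cover = minimal_protein_cover(['pep1', 'pep2', 'pep3'], protein_map)
```

Note how the degenerate peptide 'pep2' is explained by protein 'A' alone, so 'B' is excluded; the LP model achieves the analogous effect by forcing the relevant joint probability variables to zero.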

  20. Statistical inference on residual life

    CERN Document Server

    Jeong, Jong-Hyeon

    2014-01-01

    This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.
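
    The mean residual life the book builds on has a simple empirical estimator for uncensored data; a sketch with illustrative numbers follows (censoring, quantile residual life and competing risks, which the book covers, require the more elaborate methods it reviews):

```python
def mean_residual_life(times, t):
    """Empirical mean residual life for uncensored survival times:
    E[T - t | T > t], the expected remaining lifetime given survival
    to time t. Returns None when no observation exceeds t."""
    survivors = [x for x in times if x > t]
    if not survivors:
        return None
    return sum(x - t for x in survivors) / len(survivors)

times = [2.0, 3.0, 5.0, 7.0, 11.0]
mrl0 = mean_residual_life(times, 0.0)   # equals the ordinary mean lifetime
mrl4 = mean_residual_life(times, 4.0)   # remaining life for survivors past 4
```

At t = 0 this is simply life expectancy, which is why the residual life scale is often easier to communicate to patients than a hazard ratio.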