WorldWideScience

Sample records for macromolecular damage inferred

  1. Radiation damage to nucleoprotein complexes in macromolecular crystallography

    International Nuclear Information System (INIS)

    Bury, Charles; Garman, Elspeth F.; Ginn, Helen Mary; Ravelli, Raimond B. G.; Carmichael, Ian; Kneale, Geoff; McGeehan, John E.

    2015-01-01

    Quantitative X-ray induced radiation damage studies employing a model protein–DNA complex revealed a striking partition of damage sites. The DNA component was observed to be far more resistant to specific damage compared with the protein. Significant progress has been made in macromolecular crystallography over recent years in both the understanding and mitigation of X-ray induced radiation damage when collecting diffraction data from crystalline proteins. In contrast, despite the large field that is productively engaged in the study of the radiation chemistry of nucleic acids, particularly of DNA, there are currently very few X-ray crystallographic studies on radiation damage mechanisms in nucleic acids. Quantitative comparison of damage to protein and DNA crystals separately is challenging, but many of the issues are circumvented by studying pre-formed biological nucleoprotein complexes, where direct comparison of each component can be made under the same controlled conditions. Here the model protein–DNA complex C.Esp1396I is employed to investigate specific damage mechanisms for protein and DNA in a biologically relevant complex over a large dose range (2.07–44.63 MGy). In order to allow a quantitative analysis of radiation damage sites from a complex series of macromolecular diffraction data, a computational method has been developed that is generally applicable to the field. Typical specific damage was observed both for the protein on particular amino acids and for the DNA on, for example, the cleavage of base–sugar N1—C and sugar–phosphate C—O bonds. Strikingly, the DNA component was determined to be far more resistant to specific damage than the protein over the investigated dose range. At low doses the protein was observed to be susceptible to radiation damage, while the DNA was far more resistant, damage only being observed at significantly higher doses.
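
    The dose-dependent loss of site density described above can be quantified with a simple per-site decay-rate comparison. The sketch below uses hypothetical density ratios (the doses match the 2.07–44.63 MGy range in the abstract, but the density values and the exponential-decay model are illustrative assumptions, not the paper's actual computational method):

```python
import numpy as np

# Hypothetical per-site electron-density ratios (damaged/undamaged) over the
# dose range reported in the abstract (2.07-44.63 MGy). Real values would come
# from difference-density analysis of the diffraction series.
doses = np.array([2.07, 10.0, 20.0, 30.0, 44.63])           # MGy
protein_density = np.array([1.00, 0.72, 0.55, 0.40, 0.28])  # e.g. a side-chain site
dna_density = np.array([1.00, 0.97, 0.93, 0.90, 0.85])      # e.g. a phosphate site

def decay_rate(dose, density):
    """Fit density ~ exp(-k * dose) by log-linear least squares; return k (per MGy)."""
    slope, _ = np.polyfit(dose, np.log(density), 1)
    return -slope

k_protein = decay_rate(doses, protein_density)
k_dna = decay_rate(doses, dna_density)
print(f"protein decay rate: {k_protein:.4f} /MGy")
print(f"DNA decay rate:     {k_dna:.4f} /MGy")
print(f"protein/DNA damage-rate ratio: {k_protein / k_dna:.1f}")
```

    A larger decay rate for the protein sites would reproduce, in this toy setting, the partition of damage the study reports.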

  2. Bayesian inference method for stochastic damage accumulation modeling

    International Nuclear Information System (INIS)

    Jiang, Xiaomo; Yuan, Yong; Liu, Xian

    2013-01-01

    Damage-accumulation-based reliability models play an increasingly important role in realizing condition-based maintenance for complicated engineering systems. This paper develops a Bayesian framework to establish a stochastic damage accumulation model from historical inspection data while accounting for data uncertainty. A proportional hazards modeling technique is developed to model the nonlinear effect of multiple influencing factors on system reliability. Unlike other hazard modeling techniques such as the normal linear regression model, the approach does not require any distributional assumption for the hazard model and can be applied to a wide variety of distribution models. A Bayesian network is created to represent the nonlinear proportional hazards model and to estimate model parameters by Bayesian inference with Markov chain Monte Carlo simulation. Both qualitative and quantitative approaches are developed to assess the validity of the established damage accumulation model. The Anderson–Darling goodness-of-fit test is employed for normality testing, and the Box–Cox transformation is used to convert non-normal data to a normal distribution for hypothesis testing in quantitative model validation. The methodology is illustrated with seepage data collected from real-world subway tunnels.
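
    A minimal sketch of the kind of Bayesian updating described above, assuming an exponential baseline hazard, a single covariate and synthetic data (the model form, the data and the flat priors are all illustrative stand-ins, not the paper's network-based formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic inspection data (illustrative only): covariate x (e.g. a seepage
# level) and failure times under a proportional-hazards model
# h(t|x) = h0 * exp(beta * x) with a constant (exponential) baseline h0.
true_h0, true_beta = 0.05, 0.8
x = rng.uniform(0, 2, size=200)
t = rng.exponential(1.0 / (true_h0 * np.exp(true_beta * x)))

def log_post(h0, beta):
    """Log-likelihood of the exponential PH model plus flat priors (h0 > 0)."""
    if h0 <= 0:
        return -np.inf
    rate = h0 * np.exp(beta * x)
    return np.sum(np.log(rate) - rate * t)

# Random-walk Metropolis over (h0, beta).
samples = []
h0, beta = 0.1, 0.0
lp = log_post(h0, beta)
for _ in range(5000):
    h0p, betap = h0 + rng.normal(0, 0.01), beta + rng.normal(0, 0.1)
    lpp = log_post(h0p, betap)
    if np.log(rng.uniform()) < lpp - lp:   # Metropolis accept/reject
        h0, beta, lp = h0p, betap, lpp
    samples.append((h0, beta))
post = np.array(samples[1000:])            # discard burn-in
print("posterior mean (h0, beta):", post.mean(axis=0))
```

    The posterior means should land near the generating values, illustrating how MCMC recovers hazard-model parameters from inspection data.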

  3. Macromolecular therapeutics.

    Science.gov (United States)

    Yang, Jiyuan; Kopeček, Jindřich

    2014-09-28

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines - (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Practical macromolecular cryocrystallography

    Energy Technology Data Exchange (ETDEWEB)

    Pflugrath, J. W., E-mail: jim.pflugrath@gmail.com [Rigaku Americas Corp., 9009 New Trails Drive, The Woodlands, TX 77381 (United States)

    2015-05-27

    Current methods, reagents and experimental hardware for successfully and reproducibly flash-cooling macromolecular crystals to cryogenic temperatures for X-ray diffraction data collection are reviewed. Cryocrystallography is an indispensable technique that is routinely used for single-crystal X-ray diffraction data collection at temperatures near 100 K, where radiation damage is mitigated. Modern procedures and tools to cryoprotect and rapidly cool macromolecular crystals with a significant solvent fraction to below the glass-transition temperature of water are reviewed. Reagents and methods to help prevent the stresses that damage crystals during flash-cooling are described. A method of using isopentane to assess whether cryogenic temperatures have been preserved when dismounting screened crystals is also presented.

  5. Inferring Gear Damage from Oil-Debris and Vibration Data

    Science.gov (United States)

    Dempsey, Paula

    2006-01-01

    A system for real-time detection of surface-fatigue-pitting damage to gears in a helicopter transmission is based on fuzzy logic, which is used to fuse data from sensors that measure oil-borne debris (referred to as "oil debris" in the article) and vibration signatures. A system to detect helicopter-transmission gear damage is beneficial because the power train of a helicopter is essential for propulsion, lift, and maneuvering; hence, the integrity of the transmission is critical to helicopter safety. To enable detection of an impending transmission failure, an ideal diagnostic system should provide real-time monitoring of the "health" of the transmission, be capable of a high level of reliable detection (with minimization of false alarms), and provide human users with clear information on the health of the system without requiring them to interpret large amounts of sensor data.
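
    The fuzzy-logic fusion idea can be sketched as follows; the membership thresholds, rule base and units are hypothetical, not those of the actual diagnostic system:

```python
def ramp(value, low, high):
    """Membership function rising linearly from 0 at `low` to 1 at `high`."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def gear_damage_level(oil_debris_mg, vibration_index):
    """Fuse two indicators with a min/max fuzzy rule base (illustrative thresholds).

    Rule: damage is indicated when BOTH the oil-debris mass and the vibration
    feature are high (min = fuzzy AND); either alone gives only a weak warning
    (max = fuzzy OR), which helps suppress false alarms from a single sensor.
    """
    debris_high = ramp(oil_debris_mg, 20.0, 100.0)   # hypothetical mg thresholds
    vib_high = ramp(vibration_index, 0.2, 0.8)       # hypothetical unitless index
    return {"damage": min(debris_high, vib_high),    # fuzzy AND
            "warning": max(debris_high, vib_high)}   # fuzzy OR

print(gear_damage_level(60.0, 0.9))   # both indicators high
print(gear_damage_level(60.0, 0.1))   # debris only
```

    Requiring agreement between the two sensor channels before declaring damage is the design motivation the abstract attributes to sensor fusion.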

  6. Detection of multiple damages employing best achievable eigenvectors under Bayesian inference

    Science.gov (United States)

    Prajapat, Kanta; Ray-Chaudhuri, Samit

    2018-05-01

    A novel approach is presented in this work to simultaneously localize multiple damaged elements in a structure and to estimate the damage severity of each damaged element. For detection of damaged elements, a formulation based on best achievable eigenvectors has been derived. To deal with noisy data, Bayesian inference is employed in the formulation, wherein the likelihood of the Bayesian algorithm is formed on the basis of the errors between the best achievable eigenvectors and the measured modes. In this approach, the most probable damage locations are evaluated under Bayesian inference by generating combinations of various possible damaged elements. Once damage locations are identified, damage severities are estimated using Bayesian inference via Markov chain Monte Carlo simulation. The efficiency of the proposed approach has been demonstrated through a numerical study involving a 12-story shear building. This study found that damage scenarios involving as little as a 10% loss of stiffness in multiple elements are accurately determined (localized and severities quantified) even when modal data contaminated with 2% noise are utilized. Further, this study introduces a term, 'parameter impact' (evaluated based on the sensitivity of modal parameters to structural parameters), to decide the suitability of selecting a particular mode if some idea about the damaged elements is available. It is demonstrated that the accuracy and efficiency of the Bayesian quantification algorithm increase if damage localization is carried out a priori. An experimental study involving a laboratory-scale shear building and different stiffness modification scenarios shows that the proposed approach is efficient enough to localize the stories with stiffness modification.
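
    A toy version of the severity-estimation step, assuming the damage location is already identified: a 3-story shear building, a Gaussian likelihood on the mode-shape error and a grid posterior. The 12-story model, the best-achievable-eigenvector formulation and the MCMC sampler of the paper are replaced here by deliberately simpler stand-ins:

```python
import numpy as np

def first_mode(k):
    """Return the sign-normalized first mode of a 3-story shear building
    with unit floor masses and story stiffnesses k[0..2]."""
    K = np.array([[k[0] + k[1], -k[1], 0.0],
                  [-k[1], k[1] + k[2], -k[2]],
                  [0.0, -k[2], k[2]]])
    w, v = np.linalg.eigh(K)       # unit mass matrix -> ordinary eigenproblem
    mode = v[:, 0]                 # lowest eigenvalue = first mode
    mode = mode / np.linalg.norm(mode)
    return mode if mode[0] > 0 else -mode

k_intact = [1000.0, 1000.0, 1000.0]
k_true = [1000.0, 900.0, 1000.0]   # 10% stiffness loss in story 2

rng = np.random.default_rng(1)
sigma = 0.0005                     # assumed measurement-noise level
measured = first_mode(k_true) + rng.normal(0.0, sigma, 3)

# Grid posterior over the severity factor alpha of story 2 (location assumed
# identified beforehand), Gaussian likelihood on mode-shape error, flat prior.
alphas = np.linspace(0.7, 1.0, 301)
log_like = np.array([
    -np.sum((first_mode([1000.0, a * 1000.0, 1000.0]) - measured) ** 2) / (2 * sigma ** 2)
    for a in alphas])
post = np.exp(log_like - log_like.max())
post /= post.sum()
print("posterior-mean severity factor:", np.dot(alphas, post))
```

    With low noise the posterior mean sits close to the true factor of 0.9, mirroring the paper's finding that modest stiffness losses are quantifiable from noisy modal data.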

  7. Indicators of Macromolecular Oxidative Damage and Antioxidant Defence in Examinees Exposed to Radar Frequencies of 1.5 - 10.9 GHz

    International Nuclear Information System (INIS)

    Marjanovic, A.M.; Flajs, D.; Pavicic, I.; Domijan, A.

    2011-01-01

    Radar is an object-detection system that uses microwaves (Mw). As a result of the increased use of radar, there is rising concern regarding the health effects of Mw radiation on the human body. Living organisms are complex electrochemical systems that evolved in a relatively narrow range of well-defined environmental parameters. For life to be maintained, these parameters must be kept within their normal range, since deviations can induce biochemical effects causing cell function impairment and disease. Some theories indicate a connection between Mw radiation, oxidative damage and the antioxidant defence of the organism. The aim of this study was to evaluate the level of damage to macromolecular structures (proteins and lipids) in the blood of men occupationally exposed to Mw radiation. The concentration of glutathione (GSH), a known indicator of antioxidant defence, was also determined. Blood samples were collected from 27 male workers occupationally exposed to radar frequencies of 1.5 to 10.9 GHz. A corresponding control group (N = 8) was also included in the study. Concentrations of total and oxidised proteins, protein carbonyls and GSH were measured spectrophotometrically, while malondialdehyde (MDA), a product of lipid peroxidation, was determined by high-performance liquid chromatography (HPLC). Measured concentrations of oxidised proteins, GSH and MDA were expressed relative to total proteins. The concentration of oxidised proteins did not differ significantly between the control and exposed groups. However, the concentration of GSH in the exposed group was considerably decreased, while the concentration of MDA was increased. The results indicate that Mw radiation from radar operating at frequencies of 1.5 - 10.9 GHz could damage proteins and lipids and impair the antioxidant defence of the organism. (author)

  8. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    2010-01-01

    Chapter 9: This contribution concerns statistical inference for parametric models used in stochastic geometry, based on quick and simple simulation-free procedures as well as more comprehensive methods based on a maximum likelihood or Bayesian approach combined with Markov chain Monte Carlo (MCMC) techniques. Due to space limitations the focus is on spatial point processes.

  9. Inference Generation during Text Comprehension by Adults with Right Hemisphere Brain Damage: Activation Failure Versus Multiple Activation.

    Science.gov (United States)

    Tompkins, Connie A.; Fassbinder, Wiltrud; Blake, Margaret Lehman; Baumgaertner, Annette; Jayaram, Nandini

    2004-01-01

    Evidence conflicts as to whether adults with right hemisphere brain damage (RHD) generate inferences during text comprehension. M. Beeman (1993) reported that adults with RHD fail to activate the lexical-semantic bases of routine bridging inferences, which are necessary for comprehension. But other evidence indicates that adults…

  10. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text, written by Jesper Møller, Aalborg University, is submitted for the collection 'Stochastic Geometry: Highlights, Interactions and New Perspectives', edited by Wilfrid S. Kendall and Ilya Molchanov, to be published by Clarendon Press, Oxford, and planned to appear as Section 4.1 with the title 'Inference'.) This contribution concerns statistical inference for parametric models used in stochastic geometry, based on quick and simple simulation-free procedures as well as more comprehensive methods using Markov chain Monte Carlo (MCMC) simulations. Due to space limitations the focus is on spatial point processes.

  11. The effect of ancient DNA damage on inferences of demographic histories

    DEFF Research Database (Denmark)

    Axelsson, Erik; Willerslev, Eske; Gilbert, Marcus Thomas Pius

    2008-01-01

    The field of ancient DNA (aDNA) is casting new light on many evolutionary questions. However, problems associated with the post-mortem instability of DNA may complicate the interpretation of aDNA data. For example, in population genetic studies, the inclusion of damaged DNA may inflate estimates of… the support for a change in effective population size in this data set vanishes once the effects of putative damage are removed. Our results suggest that population genetic analyses of aDNA sequences which do not accurately account for damage should be interpreted with great caution.
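
    The inflation effect can be illustrated with a toy simulation: random C→T deamination-type changes added to otherwise identical sequences create apparent pairwise differences. The sequences, damage rate and diversity measure below are illustrative assumptions, not the study's data or method:

```python
import random

random.seed(42)

# Illustrative simulation: post-mortem cytosine deamination (C->T) adds
# apparent substitutions, inflating pairwise-diversity estimates from aDNA.
true_seq = "ACGT" * 250                      # 1000-bp "population" consensus

def damaged_copy(seq, damage_rate):
    """Return a copy of seq with each C deaminated to T at `damage_rate`."""
    return "".join("T" if (c == "C" and random.random() < damage_rate) else c
                   for c in seq)

def mean_pairwise(seqs):
    """Mean number of pairwise differences across all sequence pairs."""
    pairs = [(i, j) for i in range(len(seqs)) for j in range(i + 1, len(seqs))]
    return sum(sum(a != b for a, b in zip(seqs[i], seqs[j]))
               for i, j in pairs) / len(pairs)

clean = [damaged_copy(true_seq, 0.00) for _ in range(10)]
aDNA = [damaged_copy(true_seq, 0.02) for _ in range(10)]  # 2% of Cs deaminated

print("mean pairwise differences, undamaged:", mean_pairwise(clean))
print("mean pairwise differences, damaged:  ", mean_pairwise(aDNA))
```

    The undamaged copies are identical, while the damaged copies show nonzero apparent diversity that a naive analysis would mistake for real variation.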

  12. Macromolecular crystallization in microgravity

    International Nuclear Information System (INIS)

    Snell, Edward H; Helliwell, John R

    2005-01-01

    Density-difference fluid flows and sedimentation of growing crystals are greatly reduced when crystallization takes place in a reduced-gravity environment. In macromolecular crystallography, a crystal of a biological macromolecule is used for diffraction experiments (X-ray or neutron) so as to determine the three-dimensional structure of the macromolecule. The better the internal order of the crystal, the greater the molecular structural detail that can be extracted. It is this structural information that enables an understanding of how the molecule functions. This knowledge is changing the biological and chemical sciences, with major potential in understanding disease pathologies. In this review, we examine the use of microgravity as an environment to grow macromolecular crystals. We describe the crystallization procedures used on the ground, how the resulting crystals are studied and the knowledge obtained from those crystals. We address the features desired in an ordered crystal and the techniques used to evaluate those features in detail. We then introduce the microgravity environment, the techniques to access that environment and the theory and evidence behind the use of microgravity for crystallization experiments. We describe how ground-based laboratory techniques have been adapted to microgravity flights and look at some of the methods used to analyse the resulting data. Several case studies illustrate the physical crystal quality improvements and the macromolecular structural advances. Finally, limitations and alternatives to microgravity and future directions for this research are covered. Macromolecular structural crystallography in general is a remarkable field where physics, biology, chemistry and mathematics meet to enable insight into the fundamentals of life. As the reader will see, there is a great deal of physics involved when the microgravity environment is applied to crystallization, some of it known, and undoubtedly much yet to be discovered.

  13. Probabilistic inference of fatigue damage propagation with limited and partial information

    Directory of Open Access Journals (Sweden)

    Huang Min

    2015-08-01

    A general method of probabilistic fatigue damage prognosis using limited and partial information is developed. Limited and partial information refers to measurable data that are not sufficient, or cannot directly be used, to statistically identify model parameters using traditional regression analysis. In the proposed method, the prior probability distribution of the model parameters is derived based on the principle of maximum entropy (MaxEnt), using the limited and partial information as constraints. The posterior distribution is formulated using the principle of maximum relative entropy (MRE) to perform probability updating when new information is available, reducing uncertainty in the prognosis results. It is shown that the posterior distribution is equivalent to a Bayesian posterior when the new information used for updating consists of point measurements. A numerical quadrature interpolating method is used to calculate the asymptotic approximation for the prior distribution. Once the prior is obtained, subsequent measurement data are used to perform updating using Markov chain Monte Carlo (MCMC) simulations. Fatigue crack prognosis problems with experimental data are presented for demonstration and validation.
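
    The MaxEnt prior construction can be sketched for the simplest case, a single mean constraint on a discretized parameter: the entropy-maximizing distribution has exponential-family form, and the Lagrange multiplier can be found by bisection. The grid, target mean and variable names below are hypothetical:

```python
import numpy as np

# Discrete MaxEnt sketch: the maximum-entropy distribution on a grid of
# parameter values, subject only to a known mean (the "partial information"),
# has the form p_i ~ exp(-lam * x_i). Solve for the multiplier by bisection.
x = np.linspace(0.0, 10.0, 1001)   # hypothetical parameter grid
target_mean = 2.0                  # the single constraint

def mean_for(lam):
    """Mean of the exponential-family distribution with multiplier lam."""
    w = np.exp(-lam * x)
    return np.dot(w / w.sum(), x)

# mean_for(lam) decreases monotonically in lam, so bisection applies.
lo, hi = 1e-6, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = np.exp(-lam * x)
p /= p.sum()
print(f"lambda = {lam:.3f}, achieved mean = {np.dot(p, x):.3f}")
```

    With more constraints (e.g. higher moments) the same scheme generalizes to several multipliers, which is the setting the paper's MaxEnt prior addresses.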

  14. Macromolecular crystallography using synchrotron radiation

    International Nuclear Information System (INIS)

    Bartunik, H.D.; Phillips, J.C.; Fourme, R.

    1982-01-01

    The use of synchrotron X-ray sources in macromolecular crystallography is described. The properties of synchrotron radiation relevant to macromolecular crystallography are examined. The applications discussed include anomalous dispersion techniques, the acquisition of normal and high resolution data, and kinetic studies of structural changes in macromolecules; protein data are presented illustrating these applications. The apparatus used is described including information on the electronic detectors, the monitoring of the incident beam and crystal cooling. (U.K.)

  15. Geometry of the Nojima fault at Nojima-Hirabayashi, Japan - I. A simple damage structure inferred from borehole core permeability

    Science.gov (United States)

    Lockner, David A.; Tanaka, Hidemi; Ito, Hisao; Ikeda, Ryuji; Omura, Kentaro; Naka, Hisanobu

    2009-01-01

    The 1995 Kobe (Hyogo-ken Nanbu) earthquake, M = 7.2, ruptured the Nojima fault in southwest Japan. We have studied core samples taken from two scientific drillholes that crossed the fault zone SW of the epicentral region on Awaji Island. The shallower hole, drilled by the Geological Survey of Japan (GSJ), was started 75 m to the SE of the surface trace of the Nojima fault and crossed the fault at a depth of 624 m. A deeper hole, drilled by the National Research Institute for Earth Science and Disaster Prevention (NIED), was started 302 m to the SE of the fault and crossed fault strands below a depth of 1140 m. We have measured the strength and matrix permeability of core samples taken from these two drillholes. We find a strong correlation between permeability and proximity to the fault-zone shear axes. The half-width of the high-permeability zone (approximately 15 to 25 m) is in good agreement with the fault-zone width inferred from trapped seismic wave analysis and other evidence. The fault-zone core or shear axis contains clays with permeabilities of approximately 0.1 to 1 microdarcy at 50 MPa effective confining pressure (10 to 30 microdarcy at in situ pressures). Within a few meters of the fault-zone core, the rock is highly fractured but has sustained little net shear. The matrix permeability of this zone is approximately 30 to 60 microdarcy at 50 MPa effective confining pressure (300 to 1000 microdarcy at in situ pressures). Outside this damage zone, matrix permeability drops below 0.01 microdarcy. The clay-rich core material has the lowest strength, with a coefficient of friction of approximately 0.55. Shear strength increases with distance from the shear axis. These permeability and strength observations reveal a simple fault-zone structure with a relatively weak fine-grained core surrounded by a damage zone of fractured rock. In this case, the damage zone will act as a high-permeability conduit for vertical and horizontal flow in the plane of the fault.
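
    The three-zone permeability structure reported above can be summarized as a piecewise lookup. The zone boundaries below are rough readings of the abstract (the ~2 m core width in particular is an assumption), and the values are the quoted ranges at 50 MPa effective confining pressure:

```python
def matrix_permeability_microdarcy(distance_m):
    """Piecewise sketch of the Nojima fault-zone permeability structure.

    Returns the (low, high) matrix-permeability range in microdarcy at 50 MPa
    effective confining pressure, for a given distance in meters from the
    fault-zone shear axis. Zone widths are illustrative readings of the text.
    """
    if distance_m < 2.0:       # clay-rich core (width assumed ~2 m here)
        return (0.1, 1.0)
    if distance_m < 20.0:      # fractured damage zone (half-width ~15-25 m)
        return (30.0, 60.0)
    return (0.0, 0.01)         # intact host rock

for d in (0.5, 10.0, 50.0):
    lo, hi = matrix_permeability_microdarcy(d)
    print(f"{d:5.1f} m from shear axis: {lo}-{hi} microdarcy")
```

    The low-permeability core flanked by a permeable damage zone is what makes the zone a conduit for flow parallel to the fault plane.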

  16. Recent advances in macromolecular prodrugs

    DEFF Research Database (Denmark)

    Riber, Camilla Frich; Zelikin, Alexander N.

    2017-01-01

    Macromolecular prodrugs (MP) are high molar mass conjugates, typically carrying several copies of a drug or a drug combination, designed to optimize delivery of the drug, that is — its pharmacokinetics. From its advent several decades ago, design of MP has undergone significant development and es...

  17. Structure studies of macromolecular systems

    Czech Academy of Sciences Publication Activity Database

    Hašek, Jindřich; Dohnálek, Jan; Skálová, Tereza; Dušková, Jarmila; Kolenko, Petr

    2006-01-01

    Roč. 13, č. 3 (2006), s. 136 ISSN 1211-5894. [Czech and Slovak Crystallographic Colloquium. 22.06.2006-24.06.2006, Grenoble] R&D Projects: GA AV ČR IAA4050811; GA MŠk 1K05008 Keywords: structure * X-ray diffraction * synchrotron Subject RIV: CD - Macromolecular Chemistry http://www.xray.cz/ms/default.htm

  18. Progress in rational methods of cryoprotection in macromolecular crystallography

    International Nuclear Information System (INIS)

    Alcorn, Thomas; Juers, Douglas H.

    2010-01-01

    Measurements of the average thermal contractions (294→72 K) of 26 different cryosolutions are presented and discussed in conjunction with other recent advances in the rational design of protocols for cryogenic cooling in macromolecular crystallography. Cryogenic cooling of macromolecular crystals is commonly used for X-ray data collection both to reduce crystal damage from radiation and to gather functional information by cryogenically trapping intermediates. However, the cooling process can damage the crystals. Limiting cooling-induced crystal damage often requires cryoprotection strategies, which can involve substantial screening of solution conditions and cooling protocols. Here, recent developments directed towards rational methods for cryoprotection are described. Crystal damage is described in the context of the temperature response of the crystal as a thermodynamic system. As such, the internal and external parts of the crystal typically have different cryoprotection requirements. A key physical parameter, the thermal contraction, of 26 different cryoprotective solutions was measured between 294 and 72 K. The range of contractions was 2–13%, with the more polar cryosolutions contracting less. The potential uses of these results in the development of cryocooling conditions, as well as recent developments in determining minimum cryosolution soaking times, are discussed

  19. Macromolecular systems for vaccine delivery.

    Science.gov (United States)

    MuŽíková, G; Laga, R

    2016-10-20

    Vaccines have helped considerably in eliminating some life-threatening infectious diseases in the past two hundred years. Recently, human medicine has focused on vaccination against some of the world's most common infectious diseases (AIDS, malaria, tuberculosis, etc.), and vaccination is also gaining popularity in the treatment of cancer and autoimmune diseases. The major limitation of current vaccines lies in their poor ability to generate a sufficient level of protective antibodies and T cell responses against diseases such as HIV, malaria, tuberculosis and cancers. Among the promising vaccination systems that could improve the potency of weakly immunogenic vaccines are macromolecular carriers (water-soluble polymers, polymer particles, micelles, gels, etc.) conjugated with antigens and immunostimulatory molecules. The size, architecture and composition of the high-molecular-weight carrier can significantly improve vaccine efficiency. This review covers the most recently developed (bio)polymer-based vaccines reported in the literature.

  20. Collagen macromolecular drug delivery systems

    International Nuclear Information System (INIS)

    Gilbert, D.L.

    1988-01-01

    The objective of this study was to examine collagen for use as a macromolecular drug delivery system by determining the mechanism of release through a matrix. Collagen membranes varying in porosity, crosslinking density, structure and crosslinker were fabricated. Collagen, characterized by infrared spectroscopy and solution viscosity, was determined to be pure and native. Electron microscopy distinguished membranes with native versus non-native quaternary structure and porous versus dense aggregate morphology. Collagen monolithic devices containing a model macromolecule (inulin) were fabricated. In vitro release rates were found to be linear with respect to t^(1/2) and were affected by crosslinking density, crosslinker and structure. The biodegradation of the collagen matrix was also examined. In vivo biocompatibility, degradation and 14C-inulin release rates were evaluated subcutaneously in rats.
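
    The t^(1/2) (Higuchi-type) release behavior mentioned above can be checked with a one-parameter least-squares fit through the origin; the time points and release percentages below are hypothetical:

```python
import math

# Matrix-diffusion release is linear in sqrt(t): fit Q = k * sqrt(t) to
# hypothetical cumulative-release data and inspect the residuals.
t = [1.0, 4.0, 9.0, 16.0, 25.0]   # hours (hypothetical)
Q = [2.1, 3.9, 6.2, 8.0, 10.1]    # % inulin released (hypothetical)

# Least squares through the origin: k = sum(Q_i * sqrt(t_i)) / sum(t_i),
# since sqrt(t_i)^2 = t_i.
k = sum(q * math.sqrt(ti) for q, ti in zip(Q, t)) / sum(t)
residuals = [q - k * math.sqrt(ti) for q, ti in zip(Q, t)]
print(f"k = {k:.3f} %/sqrt(h), max |residual| = {max(abs(r) for r in residuals):.3f}")
```

    Small residuals against the sqrt(t) model are the signature of diffusion-controlled release from a monolithic matrix.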

  1. Macromolecular crystallography research at Trombay

    International Nuclear Information System (INIS)

    Kannan, K.K.; Chidambaram, R.

    1983-01-01

    Neutron diffraction studies of hydrogen positions in small molecules of biological interest at Trombay have provided valuable information that has been used in protein and enzyme structure model-building and in developing hydrogen-bond potential functions. The new R-5 reactor is expected to provide higher neutron fluxes and also to make possible small-angle neutron scattering studies of large biomolecules and bio-aggregates. In the last few years, infrastructure facilities have also been established for macromolecular X-ray crystallography research. Meanwhile, the refinement of carbonic anhydrase and lysozyme structures has been carried out, and interesting results have been obtained on protein dynamics and structure-function relationships. Some interesting presynaptic toxin phospholipases have also been taken up for study. (author)

  2. Status and prospects of macromolecular crystallography

    Indian Academy of Sciences (India)

    …technique that could be completely automated in most cases… major challenge in macromolecular crystallography today is… …tial characterization of crystals in the home source and make a… opportunities for a generation of structural biolo…

  3. Macromolecular synthesis in algal cells

    International Nuclear Information System (INIS)

    Ishida, M.R.; Kikuchi, Tadatoshi

    1980-01-01

    The present paper reviews our previous experimental results on macromolecular biosynthesis in cells of the blue-green alga Anacystis nidulans, a representative prokaryote, and in three species of eukaryotic algae: Euglena gracilis strain Z, Chlamydomonas reinhardi and Cyanidium caldarium. In these algal cells, combined methods (pulse-labelling with 32P-, 3H- and 14C-labelled precursors of macromolecules, chase experiments, and the use of inhibitors that specifically block the synthesis of macromolecules such as proteins, RNA and DNA in living cells) were applied effectively to analyse the regulatory mechanisms of macromolecular biosynthesis and the mode of assembly of macromolecules into cell structures, especially organelle constituents. Based on the results obtained, the following conclusions are reached: (1) the metabolic pool for macromolecule synthesis in the cells of the prokaryotic blue-green alga is limited in extent, and such activities couple largely with the photosynthetic mechanism; (2) 70 S ribosomes in blue-green algal cells are assembled on the surface of the thylakoid membranes widely distributed in their cytoplasm; and (3) the cells of the eukaryotic unicellular algae used here show biochemical characteristics of already differentiated enzyme systems involved in transcription and translation machinery, as in higher organisms, but the control mechanisms governing such macromolecule syntheses differ among species. (author)

  4. The design of macromolecular crystallography diffraction experiments

    International Nuclear Information System (INIS)

    Evans, Gwyndaf; Axford, Danny; Owen, Robin L.

    2011-01-01

    Thoughts about the decisions made in designing macromolecular X-ray crystallography experiments at synchrotron beamlines are presented. The measurement of X-ray diffraction data from macromolecular crystals for the purpose of structure determination is the convergence of two processes: the preparation of diffraction-quality crystal samples on the one hand and the construction and optimization of an X-ray beamline and end station on the other. Like sample preparation, a macromolecular crystallography beamline is geared to obtaining the best possible diffraction measurements from crystals provided by the synchrotron user. This paper describes the thoughts behind an experiment that fully exploits both the sample and the beamline and how these map into everyday decisions that users can and should make when visiting a beamline with their most precious crystals

  5. Automated data collection for macromolecular crystallography.

    Science.gov (United States)

    Winter, Graeme; McAuley, Katherine E

    2011-09-01

    An overview, together with some practical advice, is presented of the current status of the automation of macromolecular crystallography (MX) data collection, with a focus on MX beamlines at Diamond Light Source, UK. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. In situ macromolecular crystallography using microbeams.

    Science.gov (United States)

    Axford, Danny; Owen, Robin L; Aishima, Jun; Foadi, James; Morgan, Ann W; Robinson, James I; Nettleship, Joanne E; Owens, Raymond J; Moraes, Isabel; Fry, Elizabeth E; Grimes, Jonathan M; Harlos, Karl; Kotecha, Abhay; Ren, Jingshan; Sutton, Geoff; Walter, Thomas S; Stuart, David I; Evans, Gwyndaf

    2012-05-01

    Despite significant progress in high-throughput methods in macromolecular crystallography, the production of diffraction-quality crystals remains a major bottleneck. By recording diffraction in situ from crystals in their crystallization plates at room temperature, a number of problems associated with crystal handling and cryoprotection can be side-stepped. Using a dedicated goniometer installed on the microfocus macromolecular crystallography beamline I24 at Diamond Light Source, crystals have been studied in situ with an intense and flexible microfocus beam, allowing weakly diffracting samples to be assessed without a manual crystal-handling step but with good signal to noise, despite the background scatter from the plate. A number of case studies are reported: the structure solution of bovine enterovirus 2, crystallization screening of membrane proteins and complexes, and structure solution from crystallization hits produced via a high-throughput pipeline. These demonstrate the potential for in situ data collection and structure solution with microbeams. © 2012 International Union of Crystallography

  7. Macromolecular Networks Containing Fluorinated Cyclic Moieties

    Science.gov (United States)

    2015-12-12

    Briefing charts (17 Nov 2015 - 12 Dec 2015): Macromolecular Networks Containing Fluorinated Cyclic Moieties, by Andrew J. Guenthner, Scott T. Iacono, Cynthia A. Corley, Christopher M. Sahagun and Kevin R. Lamison. Highlights recoverable from the charts: good flame, smoke and toxicity characteristics; low water uptake with near-zero coefficient of hygroscopic expansion. Distribution A.

  8. Macromolecular nanotheranostics for multimodal anticancer therapy

    Science.gov (United States)

    Huis in't Veld, Ruben; Storm, Gert; Hennink, Wim E.; Kiessling, Fabian; Lammers, Twan

    2011-10-01

    Macromolecular carrier materials based on N-(2-hydroxypropyl)methacrylamide (HPMA) are prototypic and well-characterized drug delivery systems that have been extensively evaluated in the past two decades, both at the preclinical and at the clinical level. Using several different imaging agents and techniques, HPMA copolymers have been shown to circulate for prolonged periods of time, and to accumulate in tumors both effectively and selectively by means of the Enhanced Permeability and Retention (EPR) effect. Because of this, HPMA-based macromolecular nanotheranostics, i.e. formulations containing both drug and imaging agents within a single formulation, have been shown to be highly effective in inducing tumor growth inhibition in animal models. In patients, however, as essentially all other tumor-targeted nanomedicines, they are generally only able to improve the therapeutic index of the attached active agent by lowering its toxicity, and they fail to improve the efficacy of the intervention. Bearing this in mind, we have recently reasoned that because of their biocompatibility and their beneficial biodistribution, nanomedicine formulations might be highly suitable systems for combination therapies. In the present manuscript, we briefly summarize several exemplary efforts undertaken in this regard in our labs in the past couple of years, and we show that long-circulating and passively tumor-targeted macromolecular nanotheranostics can be used to improve the efficacy of radiochemotherapy and of chemotherapy combinations.

  9. Effects of far-ultraviolet radiation and oxygen on macromolecular synthesis and protein induction in Bacteroides fragilis BF-2

    International Nuclear Information System (INIS)

    Schumann, J.P.

    1983-11-01

    The study deals with the effects of far-UV radiation, oxygen and hydrogen peroxide on macromolecular synthesis and viability in the obligate anaerobe, Bacteroides fragilis, as well as the specific proteins induced in this organism by these different DNA damaging agents. Irradiation of Bacteroides fragilis cells with far-UV light (254 nm) under anaerobic conditions resulted in the immediate, rapid and extensive degradation of DNA which continued for 40 to 60 min after irradiation. DNA degradation after irradiation was inhibited by chloramphenicol and caffeine. RNA and protein synthesis were decreased by UV irradiation and the degree of inhibition was proportional to the UV dose. Colony formation was not affected immediately by UV irradiation and continued for a dose-dependent period prior to inhibition. The relationship between the DNA damage-induced proteins, macromolecular synthesis in damaged B. fragilis cells and the observed physiological responses and inducible repair phenomena after the different DNA damaging treatments in this anaerobe are discussed

  10. A public database of macromolecular diffraction experiments.

    Science.gov (United States)

    Grabowski, Marek; Langner, Karol M; Cymborowski, Marcin; Porebski, Przemyslaw J; Sroka, Piotr; Zheng, Heping; Cooper, David R; Zimmerman, Matthew D; Elsliger, Marc André; Burley, Stephen K; Minor, Wladek

    2016-11-01

    The low reproducibility of published experimental results in many scientific disciplines has recently garnered negative attention in scientific journals and the general media. Public transparency, including the availability of 'raw' experimental data, will help to address growing concerns regarding scientific integrity. Macromolecular X-ray crystallography has led the way in requiring the public dissemination of atomic coordinates and a wealth of experimental data, making the field one of the most reproducible in the biological sciences. However, there remains no mandate for public disclosure of the original diffraction data. The Integrated Resource for Reproducibility in Macromolecular Crystallography (IRRMC) has been developed to archive raw data from diffraction experiments and, equally importantly, to provide related metadata. Currently, the database of our resource contains data from 2920 macromolecular diffraction experiments (5767 data sets), accounting for around 3% of all depositions in the Protein Data Bank (PDB), with their corresponding partially curated metadata. IRRMC utilizes distributed storage implemented using a federated architecture of many independent storage servers, which provides both scalability and sustainability. The resource, which is accessible via the web portal at http://www.proteindiffraction.org, can be searched using various criteria. All data are available for unrestricted access and download. The resource serves as a proof of concept and demonstrates the feasibility of archiving raw diffraction data and associated metadata from X-ray crystallographic studies of biological macromolecules. The goal is to expand this resource and include data sets that failed to yield X-ray structures in order to facilitate collaborative efforts that will improve protein structure-determination methods and to ensure the availability of 'orphan' data left behind for various reasons by individual investigators and/or extinct structural genomics projects.

  11. Celebrating macromolecular crystallography: A personal perspective

    Directory of Open Access Journals (Sweden)

    Abad-Zapatero, Celerino

    2015-04-01

    The twentieth century has seen an enormous advance in the knowledge of the atomic structures that surround us. The discovery of the first crystal structures of simple inorganic salts by the Braggs in 1914, using the diffraction of X-rays by crystals, provided the critical elements to unveil the atomic structure of matter. Subsequent developments in the field leading to macromolecular crystallography are presented with a personal perspective, related to the cultural milieu of Spain in the late 1950s. The journey of discovery of the author, as he developed professionally, is interwoven with the expansion of macromolecular crystallography from the first proteins (myoglobin, hemoglobin) to the 'coming of age' of the field in 1971 and the discoveries that followed, culminating in the determination of the structure of the ribosomes at the turn of the century. A perspective is presented exploring the future of the field, together with a reflection on the future generations of Spanish scientists.

  12. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    International Nuclear Information System (INIS)

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L.; Armour, Wes; Waterman, David G.; Iwata, So; Evans, Gwyndaf

    2013-01-01

    A systematic approach to the scaling and merging of data from multiple crystals in macromolecular crystallography is introduced and explained. The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein
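    The grouping step that BLEND automates can be illustrated with a toy sketch. This is not BLEND's actual algorithm (it builds a dendrogram using its own "linear cell variation" metric); the illustrative code below, with made-up cell values, simply groups data sets by single-linkage clustering on their unit-cell edges so that only near-isomorphous crystals would be merged together:

```python
import math

# Hypothetical unit-cell edges (a, b, c in angstroms) for five crystals.
cells = [
    (78.1, 78.1, 37.0),   # crystals 0-2: one near-isomorphous group
    (78.3, 78.3, 37.1),
    (77.9, 78.0, 36.9),
    (81.0, 81.2, 39.5),   # crystals 3-4: a distinct group
    (81.3, 81.1, 39.6),
]

def cell_distance(u, v):
    """Euclidean distance between two sets of cell edges."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def cluster(cells, cutoff):
    """Single-linkage grouping: connected components under the cutoff."""
    labels = list(range(len(cells)))
    for i in range(len(cells)):
        for j in range(i + 1, len(cells)):
            if cell_distance(cells[i], cells[j]) < cutoff:
                old, new = labels[j], labels[i]
                labels = [new if lbl == old else lbl for lbl in labels]
    return labels

# Merge only crystals whose cells differ by less than ~1 angstrom.
labels = cluster(cells, cutoff=1.0)
```

In this toy run the first three crystals share one label and the last two share another, mirroring the kind of dendrogram cut a BLEND user would make before scaling and merging.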

  13. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Foadi, James [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Imperial College, London SW7 2AZ (United Kingdom); Aller, Pierre [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Alguel, Yilmaz; Cameron, Alex [Imperial College, London SW7 2AZ (United Kingdom); Axford, Danny; Owen, Robin L. [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Armour, Wes [Oxford e-Research Centre (OeRC), Keble Road, Oxford OX1 3QG (United Kingdom); Waterman, David G. [Research Complex at Harwell (RCaH), Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0FA (United Kingdom); Iwata, So [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Imperial College, London SW7 2AZ (United Kingdom); Evans, Gwyndaf, E-mail: gwyndaf.evans@diamond.ac.uk [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom)

    2013-08-01

    A systematic approach to the scaling and merging of data from multiple crystals in macromolecular crystallography is introduced and explained. The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein.

  14. Entropic Inference

    Science.gov (United States)

    Caticha, Ariel

    2011-03-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops, the Maximum Entropy and the Bayesian methods, into a single general inference scheme.
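    The ME update described above can be made concrete for the simplest case. The sketch below (illustrative, not from the tutorial) maximizes the relative entropy subject to one linear constraint, whose well-known solution is an exponential tilt of the prior; the Lagrange multiplier is found by bisection:

```python
import math

# Maximum relative Entropy with a single constraint <f> = F:
# maximize S[p, q] = -sum_i p_i * log(p_i / q_i), giving the tilt
# p_i proportional to q_i * exp(lam * f_i).

def me_update(prior, f, F, lo=-50.0, hi=50.0, iters=200):
    """Posterior from ME with one linear constraint, via bisection on lam."""
    def mean_for(lam):
        w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
        z = sum(w)
        return sum(wi * fi for wi, fi in zip(w, f)) / z
    for _ in range(iters):  # mean_for is monotone in lam, so bisect
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < F:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' die: uniform prior over faces 1..6, constrain the mean to 4.5.
p = me_update([1 / 6] * 6, [1, 2, 3, 4, 5, 6], 4.5)
```

With a uniform prior this reduces to ordinary MaxEnt, consistent with the claim that MaxEnt is a special case of ME.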

  15. In situ macromolecular crystallography using microbeams

    Energy Technology Data Exchange (ETDEWEB)

    Axford, Danny; Owen, Robin L.; Aishima, Jun [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Foadi, James [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Imperial College, London SW7 2AZ (United Kingdom); Morgan, Ann W.; Robinson, James I. [University of Leeds, Leeds LS9 7FT (United Kingdom); Nettleship, Joanne E.; Owens, Raymond J. [Research Complex at Harwell, Rutherford Appleton Laboratory R92, Didcot, Oxfordshire OX11 0DE (United Kingdom); Moraes, Isabel [Imperial College, London SW7 2AZ (United Kingdom); Fry, Elizabeth E.; Grimes, Jonathan M.; Harlos, Karl; Kotecha, Abhay; Ren, Jingshan; Sutton, Geoff; Walter, Thomas S. [University of Oxford, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Stuart, David I. [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); University of Oxford, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Evans, Gwyndaf, E-mail: gwyndaf.evans@diamond.ac.uk [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom)

    2012-04-17

    A sample environment for mounting crystallization trays has been developed on the microfocus beamline I24 at Diamond Light Source. The technical developments and several case studies are described. Despite significant progress in high-throughput methods in macromolecular crystallography, the production of diffraction-quality crystals remains a major bottleneck. By recording diffraction in situ from crystals in their crystallization plates at room temperature, a number of problems associated with crystal handling and cryoprotection can be side-stepped. Using a dedicated goniometer installed on the microfocus macromolecular crystallography beamline I24 at Diamond Light Source, crystals have been studied in situ with an intense and flexible microfocus beam, allowing weakly diffracting samples to be assessed without a manual crystal-handling step but with good signal to noise, despite the background scatter from the plate. A number of case studies are reported: the structure solution of bovine enterovirus 2, crystallization screening of membrane proteins and complexes, and structure solution from crystallization hits produced via a high-throughput pipeline. These demonstrate the potential for in situ data collection and structure solution with microbeams.

  16. In situ macromolecular crystallography using microbeams

    International Nuclear Information System (INIS)

    Axford, Danny; Owen, Robin L.; Aishima, Jun; Foadi, James; Morgan, Ann W.; Robinson, James I.; Nettleship, Joanne E.; Owens, Raymond J.; Moraes, Isabel; Fry, Elizabeth E.; Grimes, Jonathan M.; Harlos, Karl; Kotecha, Abhay; Ren, Jingshan; Sutton, Geoff; Walter, Thomas S.; Stuart, David I.; Evans, Gwyndaf

    2012-01-01

    A sample environment for mounting crystallization trays has been developed on the microfocus beamline I24 at Diamond Light Source. The technical developments and several case studies are described. Despite significant progress in high-throughput methods in macromolecular crystallography, the production of diffraction-quality crystals remains a major bottleneck. By recording diffraction in situ from crystals in their crystallization plates at room temperature, a number of problems associated with crystal handling and cryoprotection can be side-stepped. Using a dedicated goniometer installed on the microfocus macromolecular crystallography beamline I24 at Diamond Light Source, crystals have been studied in situ with an intense and flexible microfocus beam, allowing weakly diffracting samples to be assessed without a manual crystal-handling step but with good signal to noise, despite the background scatter from the plate. A number of case studies are reported: the structure solution of bovine enterovirus 2, crystallization screening of membrane proteins and complexes, and structure solution from crystallization hits produced via a high-throughput pipeline. These demonstrate the potential for in situ data collection and structure solution with microbeams

  17. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  18. Entropic Inference

    OpenAIRE

    Caticha, Ariel

    2010-01-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEn...

  19. The role of macromolecular stability in desiccation tolerance

    NARCIS (Netherlands)

    Wolkers, W.F.

    1998-01-01

    The work presented in this thesis concerns a study on the molecular interactions that play a role in the macromolecular stability of desiccation-tolerant higher plant organs. Fourier transform infrared microspectroscopy was used as the main experimental technique to assess macromolecular

  20. Perceptual inference.

    Science.gov (United States)

    Aggelopoulos, Nikolaos C

    2015-08-01

    Perceptual inference refers to the ability to infer sensory stimuli from predictions that result from internal neural representations built through prior experience. Methods of Bayesian statistical inference and decision theory model cognition adequately by using error sensing either in guiding action or in "generative" models that predict the sensory information. In this framework, perception can be seen as a process qualitatively distinct from sensation, a process of information evaluation using previously acquired and stored representations (memories) that is guided by sensory feedback. The stored representations can be utilised as internal models of sensory stimuli enabling long term associations, for example in operant conditioning. Evidence for perceptual inference is contributed by such phenomena as the cortical co-localisation of object perception with object memory, the response invariance in the responses of some neurons to variations in the stimulus, as well as from situations in which perception can be dissociated from sensation. In the context of perceptual inference, sensory areas of the cerebral cortex that have been facilitated by a priming signal may be regarded as comparators in a closed feedback loop, similar to the better known motor reflexes in the sensorimotor system. The adult cerebral cortex can be regarded as similar to a servomechanism, in using sensory feedback to correct internal models, producing predictions of the outside world on the basis of past experience.
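    A minimal toy model of this closed-loop correction (illustrative only, not from the paper) is Gaussian prior-likelihood fusion, where an internal prediction is corrected by sensory feedback in proportion to the feedback's reliability:

```python
# Perception as Bayesian inference, in the simplest Gaussian setting:
# the posterior mean is a precision-weighted average of the internal
# model's prediction and the sensory observation, so sharp sensory
# evidence pulls the percept harder than vague evidence does.

def gaussian_posterior(prior_mean, prior_var, obs, obs_var):
    """Combine a Gaussian prior with a Gaussian observation."""
    w_prior = 1.0 / prior_var   # precision of the internal model
    w_obs = 1.0 / obs_var       # precision of the sensory evidence
    mean = (w_prior * prior_mean + w_obs * obs) / (w_prior + w_obs)
    var = 1.0 / (w_prior + w_obs)
    return mean, var

# The prediction says the stimulus is at 0.0; a sharp sensory cue says 1.0.
mean, var = gaussian_posterior(0.0, 1.0, 1.0, 0.25)
```

Because the cue is four times more precise than the prediction, the percept lands much closer to the sensory value, and the posterior variance is smaller than either input's.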

  1. Generalized Born Models of Macromolecular Solvation Effects

    Science.gov (United States)

    Bashford, Donald; Case, David A.

    2000-10-01

    It would often be useful in computer simulations to use a simple description of solvation effects, instead of explicitly representing the individual solvent molecules. Continuum dielectric models often work well in describing the thermodynamic aspects of aqueous solvation, and approximations to such models that avoid the need to solve the Poisson equation are attractive because of their computational efficiency. Here we give an overview of one such approximation, the generalized Born model, which is simple and fast enough to be used for molecular dynamics simulations of proteins and nucleic acids. We discuss its strengths and weaknesses, both for its fidelity to the underlying continuum model and for its ability to replace explicit consideration of solvent molecules in macromolecular simulations. We focus particularly on versions of the generalized Born model that have a pair-wise analytical form, and therefore fit most naturally into conventional molecular mechanics calculations.
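    The pair-wise analytical form mentioned above can be sketched in a few lines. The following illustrative implementation uses the Still et al. interpolation formula and assumes the effective Born radii have already been computed by some other method (the names and example values are for illustration only):

```python
import math

# Pairwise generalized Born polar solvation energy (Still et al. form).
COULOMB_KCAL = 332.0  # approximate kcal*angstrom/(mol*e^2) conversion

def gb_energy(coords, charges, born_radii, eps_solvent=78.5, eps_solute=1.0):
    """Polar solvation energy (kcal/mol) from the pairwise GB formula."""
    pref = -0.5 * COULOMB_KCAL * (1.0 / eps_solute - 1.0 / eps_solvent)
    e = 0.0
    n = len(charges)
    for i in range(n):        # full double sum, including self terms
        for j in range(n):
            xi, yi, zi = coords[i]
            xj, yj, zj = coords[j]
            r2 = (xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2
            rirj = born_radii[i] * born_radii[j]
            # Still's smooth interpolation between the Coulomb limit
            # (large separation) and the Born limit (i == j).
            f_gb = math.sqrt(r2 + rirj * math.exp(-r2 / (4.0 * rirj)))
            e += pref * charges[i] * charges[j] / f_gb
    return e

# A single unit charge with a 2 A Born radius reduces to the Born ion formula.
dg = gb_energy([(0.0, 0.0, 0.0)], [1.0], [2.0])
```

The pairwise form avoids solving the Poisson equation while recovering the correct one-atom (Born) and well-separated (Coulomb) limits, which is why it fits naturally into molecular mechanics energy functions.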

  2. Sequential recovery of macromolecular components of the nucleolus.

    Science.gov (United States)

    Bai, Baoyan; Laiho, Marikki

    2015-01-01

    The nucleolus is involved in a number of cellular processes of importance to cell physiology and pathology, including cell stress responses and malignancies. Studies of macromolecular composition of the nucleolus depend critically on the efficient extraction and accurate quantification of all macromolecular components (e.g., DNA, RNA, and protein). We have developed a TRIzol-based method that efficiently and simultaneously isolates these three macromolecular constituents from the same sample of purified nucleoli. The recovered and solubilized protein can be accurately quantified by the bicinchoninic acid assay and assessed by polyacrylamide gel electrophoresis or by mass spectrometry. We have successfully applied this approach to extract and quantify the responses of all three macromolecular components in nucleoli after drug treatments of HeLa cells, and conducted RNA-Seq analysis of the nucleolar RNA.

  3. Macromolecular Crystal Growth by Means of Microfluidics

    Science.gov (United States)

    vanderWoerd, Mark; Ferree, Darren; Spearing, Scott; Monaco, Lisa; Molho, Josh; Spaid, Michael; Brasseur, Mike; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    We have performed a feasibility study in which we show that chip-based, microfluidic (LabChip(TM)) technology is suitable for protein crystal growth. This technology allows for accurate and reliable dispensing and mixing of very small volumes while minimizing bubble formation in the crystallization mixture. The amount of (protein) solution remaining after completion of an experiment is minimal, which makes this technique efficient and attractive for use with proteins, which are difficult or expensive to obtain. The nature of LabChip(TM) technology renders it highly amenable to automation. Protein crystals obtained in our initial feasibility studies were of excellent quality as determined by X-ray diffraction. Subsequent to the feasibility study, we designed and produced the first LabChip(TM) device specifically for protein crystallization in batch mode. It can reliably dispense and mix from a range of solution constituents into two independent growth wells. We are currently testing this design to prove its efficacy for protein crystallization optimization experiments. In the near future we will expand our design to incorporate up to 10 growth wells per LabChip(TM) device. Upon completion, additional crystallization techniques such as vapor diffusion and liquid-liquid diffusion will be accommodated. Macromolecular crystallization using microfluidic technology is envisioned as a fully automated system, which will use the 'tele-science' concept of remote operation and will be developed into a research facility for the International Space Station as well as on the ground.

  4. Atomic force microscopy imaging of macromolecular complexes.

    Science.gov (United States)

    Santos, Sergio; Billingsley, Daniel; Thomson, Neil

    2013-01-01

    This chapter reviews amplitude modulation (AM) AFM in air and its applications to high-resolution imaging and interpretation of macromolecular complexes. We discuss single DNA molecular imaging and DNA-protein interactions, such as those with topoisomerases and RNA polymerase. We show how relative humidity can have a major influence on resolution and contrast and how it can also affect conformational switching of supercoiled DNA. Four regimes of AFM tip-sample interaction in air are defined and described, and relate to water perturbation and/or intermittent mechanical contact of the tip with either the molecular sample or the surface. Precise control and understanding of the AFM operational parameters is shown to allow the user to switch between these different regimes: an interpretation of the origins of topographical contrast is given for each regime. Perpetual water contact is shown to lead to a high-resolution mode of operation, which we term SASS (small amplitude small set-point) imaging, and which maximizes resolution while greatly decreasing tip and sample wear and any noise due to perturbation of the surface water. Thus, this chapter provides sufficient information to reliably control the AFM in the AM AFM mode of operation in order to image both heterogeneous samples and single macromolecules including complexes, with high resolution and with reproducibility. A brief introduction to AFM, its versatility and applications to biology is also given while providing references to key work and general reviews in the field.

  5. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  6. Complex Macromolecular Architectures by Living Cationic Polymerization

    KAUST Repository

    Alghamdi, Reem D.

    2015-05-01

    Poly(vinyl ether)-based graft polymers have been synthesized by combining the living cationic polymerization of vinyl ethers with other living or controlled/living polymerization techniques (anionic and ATRP). The process involves the synthesis of well-defined homopolymers (PnBVE) and co/terpolymers [PnBVE-b-PCEVE-b-PSiDEGVE (ABC type) and PSiDEGVE-b-PnBVE-b-PSiDEGVE (CAC type)] by sequential living cationic polymerization of n-butyl vinyl ether (nBVE), 2-chloroethyl vinyl ether (CEVE) and tert-butyldimethylsilyl ethylene glycol vinyl ether (SiDEGVE), using mono-functional {[n-butoxyethyl acetate (nBEA)], [1-(2-chloroethoxy) ethyl acetate (CEEA)], [1-(2-(2-(t-butyldimethylsilyloxy)ethoxy) ethoxy) ethyl acetate (SiDEGEA)]} or di-functional [1,4-cyclohexanedimethanol di(1-ethyl acetate) (cHMDEA), (VEMOA)] initiators. The living cationic polymerizations of these monomers were conducted in hexane at -20 °C using Et3Al2Cl3 (catalyst) in the presence of 1 M AcOEt base.[1] The PCEVE segments of the synthesized block terpolymers were then reacted with living macroanions (PS-DPE-Li; polystyrene diphenylethylene lithium) to afford graft polymers. The quantitative desilylation of PSiDEGVE segments by n-Bu4N+F- in THF at 0 °C led to graft co- and terpolymers in which the polyalcohol is the outer block. These co-/terpolymers were subsequently subjected to "grafting-from" reactions by atom transfer radical polymerization (ATRP) of styrene to afford more complex macromolecular architectures. The base-assisted living cationic polymerization of vinyl ethers was also used to synthesize well-defined α-hydroxyl poly(vinyl ether) (PnBVE-OH). The resulting polymers were then modified into an ATRP macro-initiator for the synthesis of well-defined block copolymers (PnBVE-b-PS). Bifunctional PnBVE with terminal malonate groups was also synthesized and used as a precursor for more complex architectures such as H-shaped block copolymers by "grafting-from" or

  7. Macromolecular crystallography beamline X25 at the NSLS

    Energy Technology Data Exchange (ETDEWEB)

    Héroux, Annie; Allaire, Marc; Buono, Richard; Cowan, Matthew L.; Dvorak, Joseph; Flaks, Leon; LaMarra, Steven; Myers, Stuart F.; Orville, Allen M.; Robinson, Howard H.; Roessler, Christian G.; Schneider, Dieter K.; Shea-McCarthy, Grace; Skinner, John M.; Skinner, Michael; Soares, Alexei S.; Sweet, Robert M.; Berman, Lonny E., E-mail: berman@bnl.gov [Brookhaven National Laboratory, PO Box 5000, Upton, NY 11973-5000 (United States)

    2014-04-08

    A description of the upgraded beamline X25 at the NSLS, operated by the PXRR and the Photon Sciences Directorate serving the Macromolecular Crystallography community, is presented. Beamline X25 at the NSLS is one of the five beamlines dedicated to macromolecular crystallography operated by the Brookhaven National Laboratory Macromolecular Crystallography Research Resource group. This mini-gap insertion-device beamline has seen constant upgrades for the last seven years in order to achieve mini-beam capability down to 20 µm × 20 µm. All major components beginning with the radiation source, and continuing along the beamline and its experimental hutch, have changed to produce a state-of-the-art facility for the scientific community.

  8. Macromolecular crystallography beamline X25 at the NSLS

    International Nuclear Information System (INIS)

    Héroux, Annie; Allaire, Marc; Buono, Richard; Cowan, Matthew L.; Dvorak, Joseph; Flaks, Leon; LaMarra, Steven; Myers, Stuart F.; Orville, Allen M.; Robinson, Howard H.; Roessler, Christian G.; Schneider, Dieter K.; Shea-McCarthy, Grace; Skinner, John M.; Skinner, Michael; Soares, Alexei S.; Sweet, Robert M.; Berman, Lonny E.

    2014-01-01

    A description of the upgraded beamline X25 at the NSLS, operated by the PXRR and the Photon Sciences Directorate serving the Macromolecular Crystallography community, is presented. Beamline X25 at the NSLS is one of the five beamlines dedicated to macromolecular crystallography operated by the Brookhaven National Laboratory Macromolecular Crystallography Research Resource group. This mini-gap insertion-device beamline has seen constant upgrades for the last seven years in order to achieve mini-beam capability down to 20 µm × 20 µm. All major components beginning with the radiation source, and continuing along the beamline and its experimental hutch, have changed to produce a state-of-the-art facility for the scientific community

  9. Control of Macromolecular Architectures for Renewable Polymers: Case Studies

    Science.gov (United States)

    Tang, Chuanbing

    The development of sustainable polymers from natural biomass is growing but faces fierce competition from existing petrochemical-based counterparts. Controlling macromolecular architectures to maximize the properties of renewable polymers is a desirable approach to gain advantages. Given the complexity of biomass, special considerations beyond traditional design are needed. In this presentation, I discuss a few case studies on how macromolecular architectures can tune the properties of sustainable bioplastics and elastomers derived from renewable biomass such as resin acids (natural rosin) and plant oils.

  10. A decade of user operation on the macromolecular crystallography MAD beamline ID14-4 at the ESRF

    International Nuclear Information System (INIS)

    McCarthy, Andrew A.; Brockhauser, Sandor; Nurizzo, Didier; Theveneau, Pascal; Mairs, Trevor; Spruce, Darren; Guijarro, Matias; Lesourd, Marc; Ravelli, Raimond B. G.; McSweeney, Sean

    2009-01-01

    The improvement of the X-ray beam quality achieved on ID14-4 by the installation of new X-ray optical elements is described. ID14-4 at the ESRF is the first tunable undulator-based macromolecular crystallography beamline that can celebrate a decade of user service. During this time ID14-4 has not only been instrumental in the determination of the structures of biologically important molecules but has also contributed significantly to the development of various instruments, novel data collection schemes and pioneering radiation damage studies on biological samples. Here, the evolution of ID14-4 over the last decade is presented, and some of the major improvements that were carried out in order to maintain its status as one of the most productive macromolecular crystallography beamlines are highlighted. The experimental hutch has been upgraded to accommodate a high-precision diffractometer, a sample changer and a large CCD detector. More recently, the optical hutch has been refurbished in order to improve the X-ray beam quality on ID14-4 and to incorporate the most modern and robust optical elements used at other ESRF beamlines. These new optical elements will be described and their effect on beam stability discussed. These studies may be useful in the design, construction and maintenance of future X-ray beamlines for macromolecular crystallography and indeed other applications, such as those planned for the ESRF upgrade

  11. Analytical model for macromolecular partitioning during yeast cell division

    International Nuclear Information System (INIS)

    Kinkhabwala, Ali; Khmelinskii, Anton; Knop, Michael

    2014-01-01

    Asymmetric cell division, whereby a parent cell generates two sibling cells with unequal content and thereby distinct fates, is central to cell differentiation, organism development and ageing. Unequal partitioning of the macromolecular content of the parent cell — which includes proteins, DNA, RNA, large proteinaceous assemblies and organelles — can be achieved by both passive (e.g. diffusion, localized retention sites) and active (e.g. motor-driven transport) processes operating in the presence of external polarity cues, internal asymmetries, spontaneous symmetry breaking, or stochastic effects. However, the quantitative contribution of different processes to the partitioning of macromolecular content is difficult to evaluate. Here we developed an analytical model that allows rapid quantitative assessment of partitioning as a function of various parameters in the budding yeast Saccharomyces cerevisiae. This model exposes quantitative degeneracies among the physical parameters that govern macromolecular partitioning, and reveals regions of the solution space where diffusion is sufficient to drive asymmetric partitioning and regions where asymmetric partitioning can only be achieved through additional processes such as motor-driven transport. Application of the model to different macromolecular assemblies suggests that partitioning of protein aggregates and episomes, but not prions, is diffusion-limited in yeast, consistent with previous reports. In contrast to computationally intensive stochastic simulations of particular scenarios, our analytical model provides an efficient and comprehensive overview of partitioning as a function of global and macromolecule-specific parameters. Identification of quantitative degeneracies among these parameters highlights the importance of their careful measurement for a given macromolecular species in order to understand the dominant processes responsible for its observed partitioning
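The analytical flavour of such a model can be illustrated with a toy two-compartment version (this is a minimal sketch, not the authors' actual model; the rate constants `k_mb` and `k_bm` are illustrative assumptions): a particle hops between mother and bud with first-order rates, and the bud occupancy follows in closed form from the two-state master equation.

```python
import math

def bud_fraction(k_mb, k_bm, t):
    """Probability that a particle starting in the mother cell is in
    the bud at time t, for mother->bud rate k_mb and bud->mother rate
    k_bm, from dp/dt = k_mb * (1 - p) - k_bm * p with p(0) = 0."""
    k = k_mb + k_bm
    p_eq = k_mb / k                       # equilibrium bud occupancy
    return p_eq * (1.0 - math.exp(-k * t))

# symmetric exchange relaxes to 50:50; strong retention in the mother
# (k_bm >> k_mb) yields asymmetric partitioning without active transport
symmetric = bud_fraction(1.0, 1.0, 100.0)
retained = bud_fraction(0.1, 1.0, 100.0)
```

A closed-form expression like this is what makes scanning large parameter ranges cheap compared with stochastic simulation of each scenario.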

  12. Isotope labeling for NMR studies of macromolecular structure and interactions

    International Nuclear Information System (INIS)

    Wright, P.E.

    1994-01-01

    Implementation of biosynthetic methods for uniform or specific isotope labeling of proteins, coupled with the recent development of powerful heteronuclear multidimensional NMR methods, has led to a dramatic increase in the size and complexity of macromolecular systems that are now amenable to NMR structural analysis. In recent years, a new technology has emerged that combines uniform ¹³C, ¹⁵N labeling with heteronuclear multidimensional NMR methods to allow NMR structural studies of systems approaching 25 to 30 kDa in molecular weight. In addition, with the introduction of specific ¹³C and ¹⁵N labels into ligands, meaningful NMR studies of complexes of even higher molecular weight have become feasible. These advances usher in a new era in which the earlier, rather stringent molecular weight limitations have been greatly surpassed and NMR can begin to address many central biological problems that involve macromolecular structure, dynamics, and interactions

  13. Crowding-facilitated macromolecular transport in attractive micropost arrays.

    Science.gov (United States)

    Chien, Fan-Tso; Lin, Po-Keng; Chien, Wei; Hung, Cheng-Hsiang; Yu, Ming-Hung; Chou, Chia-Fu; Chen, Yeng-Long

    2017-05-02

    Our study of DNA dynamics in weakly attractive nanofabricated post arrays revealed that crowding enhances polymer transport, in contrast to the hindered transport observed in repulsive media. The coupling of DNA diffusion and adsorption to the microposts results in more frequent cross-post hopping and increased long-term diffusivity with increased crowding density. We performed Langevin dynamics simulations and found maximum long-term diffusivity in post arrays with gap sizes comparable to the polymer radius of gyration. We found that macromolecular transport in weakly attractive post arrays is faster than in non-attractive dense media. Furthermore, we employed hidden Markov analysis to determine the transitions of macromolecular adsorption-desorption on posts and hopping between posts. The apparent free energy barriers are comparable to theoretical estimates determined from polymer conformational fluctuations.

  14. Isotope labeling for NMR studies of macromolecular structure and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Wright, P.E. [Scripps Research Institute, La Jolla, CA (United States)

    1994-12-01

    Implementation of biosynthetic methods for uniform or specific isotope labeling of proteins, coupled with the recent development of powerful heteronuclear multidimensional NMR methods, has led to a dramatic increase in the size and complexity of macromolecular systems that are now amenable to NMR structural analysis. In recent years, a new technology has emerged that combines uniform ¹³C, ¹⁵N labeling with heteronuclear multidimensional NMR methods to allow NMR structural studies of systems approaching 25 to 30 kDa in molecular weight. In addition, with the introduction of specific ¹³C and ¹⁵N labels into ligands, meaningful NMR studies of complexes of even higher molecular weight have become feasible. These advances usher in a new era in which the earlier, rather stringent molecular weight limitations have been greatly surpassed and NMR can begin to address many central biological problems that involve macromolecular structure, dynamics, and interactions.

  15. Stochastic reaction-diffusion algorithms for macromolecular crowding

    Science.gov (United States)

    Sturrock, Marc

    2016-06-01

    Compartment-based (lattice-based) reaction-diffusion algorithms are often used for studying complex stochastic spatio-temporal processes inside cells. In this paper the influence of macromolecular crowding on stochastic reaction-diffusion simulations is investigated. Reaction-diffusion processes are considered on two different kinds of compartmental lattice, a cubic lattice and a hexagonal close-packed lattice, and solved using two different algorithms, the stochastic simulation algorithm and the Spatiocyte algorithm (Arjunan and Tomita 2010 Syst. Synth. Biol. 4, 35-53). Obstacles (modelling macromolecular crowding) are shown to have substantial effects on the mean squared displacement and average number of molecules in the domain, but the nature of these effects is dependent on the choice of lattice, with the cubic lattice being more susceptible to the effects of the obstacles. Finally, improvements for both algorithms are presented.
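The obstacle effect on the mean squared displacement can be sketched with a toy 2D lattice random walk (a deliberately simplified analogue, not the paper's SSA or Spatiocyte implementations; grid size and obstacle fraction are illustrative assumptions): immobile obstacles reject moves, suppressing the MSD.

```python
import random

def simulate_msd(grid=50, crowding=0.0, steps=200, walkers=100, seed=1):
    """Mean squared displacement of random walkers on a 2D periodic
    lattice in which a fraction `crowding` of sites is blocked by
    immobile obstacles; moves onto obstacle sites are rejected."""
    rng = random.Random(seed)
    obstacles = {(x, y) for x in range(grid) for y in range(grid)
                 if rng.random() < crowding}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total = 0.0
    for _ in range(walkers):
        while True:                      # start each walker on a free site
            pos = (rng.randrange(grid), rng.randrange(grid))
            if pos not in obstacles:
                break
        dx = dy = 0                      # unwrapped displacement
        for _ in range(steps):
            mx, my = rng.choice(moves)
            nxt = ((pos[0] + mx) % grid, (pos[1] + my) % grid)
            if nxt not in obstacles:
                pos = nxt
                dx += mx
                dy += my
        total += dx * dx + dy * dy
    return total / walkers

free = simulate_msd(crowding=0.0)     # unobstructed diffusion
crowded = simulate_msd(crowding=0.4)  # heavily crowded lattice
```

At 40% obstacle coverage the free sites sit close to the site-percolation threshold, so the suppression of long-term diffusivity is pronounced.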

  16. Diffusion accessibility as a method for visualizing macromolecular surface geometry.

    Science.gov (United States)

    Tsai, Yingssu; Holton, Thomas; Yeates, Todd O

    2015-10-01

    Important three-dimensional spatial features such as depth and surface concavity can be difficult to convey clearly in the context of two-dimensional images. In the area of macromolecular visualization, the computer graphics technique of ray-tracing can be helpful, but further techniques for emphasizing surface concavity can give clearer perceptions of depth. The notion of diffusion accessibility is well-suited for emphasizing such features of macromolecular surfaces, but a method for calculating diffusion accessibility has not been made widely available. Here we make available a web-based platform that performs the necessary calculation by solving the Laplace equation for steady state diffusion, and produces scripts for visualization that emphasize surface depth by coloring according to diffusion accessibility. The URL is http://services.mbi.ucla.edu/DiffAcc/. © 2015 The Protein Society.
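The underlying calculation can be sketched with a small finite-difference relaxation (a 2D toy under stated assumptions, not the web service's implementation): hold the far boundary at 1, the molecular surface at 0, and iterate the Laplace stencil to steady state; the field next to concave surface points comes out lower than next to exposed ones, which is what the coloring exploits.

```python
def diffusion_accessibility(mask, iters=2000):
    """Jacobi relaxation of the Laplace equation on an n x n grid.
    mask[i][j] is True where the molecule occupies the grid (held at
    0.0, i.e. absorbing); the outer boundary is held at 1.0."""
    n = len(mask)
    u = [[1.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if mask[i][j]:
                u[i][j] = 0.0
    for _ in range(iters):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                if not mask[i][j]:
                    u[i][j] = 0.25 * (u[i-1][j] + u[i+1][j] +
                                      u[i][j-1] + u[i][j+1])
    return u

# a square "molecule" with a small notch carved into one face
n = 21
mask = [[False] * n for _ in range(n)]
for i in range(8, 15):
    for j in range(8, 15):
        mask[i][j] = True
mask[8][11] = False        # the concave site
field = diffusion_accessibility(mask)
```

The notch cell `field[8][11]` ends up with a lower steady-state value than the exposed cell just outside it, so concavities are easy to color darker.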

  17. Modeling the multi-scale mechanisms of macromolecular resource allocation

    DEFF Research Database (Denmark)

    Yang, Laurence; Yurkovich, James T; King, Zachary A

    2018-01-01

    As microbes face changing environments, they dynamically allocate macromolecular resources to produce a particular phenotypic state. Broad 'omics' data sets have revealed several interesting phenomena regarding how the proteome is allocated under differing conditions, but the functional consequences […] and detail how mathematical models have aided in our understanding of these processes. Ultimately, such modeling efforts have helped elucidate the principles of proteome allocation and hold promise for further discovery.

  18. What Macromolecular Crowding Can Do to a Protein

    Science.gov (United States)

    Kuznetsova, Irina M.; Turoverov, Konstantin K.; Uversky, Vladimir N.

    2014-01-01

    The intracellular environment represents an extremely crowded milieu, with a limited amount of free water and an almost complete lack of unoccupied space. Obviously, slightly salted aqueous solutions containing low concentrations of a biomolecule of interest are too simplistic to mimic the “real life” situation, where the biomolecule of interest scrambles and wades through the tightly packed crowd. In laboratory practice, such macromolecular crowding is typically mimicked by concentrated solutions of various polymers that serve as model “crowding agents”. Studies under these conditions revealed that macromolecular crowding might affect protein structure, folding, shape, conformational stability, binding of small molecules, enzymatic activity, protein-protein interactions, protein-nucleic acid interactions, and pathological aggregation. The goal of this review is to systematically analyze currently available experimental data on the variety of effects of macromolecular crowding on a protein molecule. The review covers more than 320 papers and therefore represents one of the most comprehensive compendia of the current knowledge in this exciting area. PMID:25514413

  19. Macromolecular target prediction by self-organizing feature maps.

    Science.gov (United States)

    Schneider, Gisbert; Schneider, Petra

    2017-03-01

    Rational drug discovery would greatly benefit from a more nuanced appreciation of the activity of pharmacologically active compounds against a diverse panel of macromolecular targets. Already, computational target-prediction models assist medicinal chemists in library screening, de novo molecular design, optimization of active chemical agents, drug re-purposing, in the spotting of potential undesired off-target activities, and in the 'de-orphaning' of phenotypic screening hits. The self-organizing map (SOM) algorithm has been employed successfully for these and other purposes. Areas covered: The authors recapitulate contemporary artificial neural network methods for macromolecular target prediction, and present the basic SOM algorithm at a conceptual level. Specifically, they highlight consensus target-scoring by the employment of multiple SOMs, and discuss the opportunities and limitations of this technique. Expert opinion: Self-organizing feature maps represent a straightforward approach to ligand clustering and classification. Some of the appeal lies in their conceptual simplicity and broad applicability domain. Despite known algorithmic shortcomings, this computational target prediction concept has been proven to work in prospective settings with high success rates. It represents a prototypic technique for future advances in the in silico identification of the modes of action and macromolecular targets of bioactive molecules.
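The SOM concept described above can be illustrated with a minimal training loop (a sketch only: the map size, learning schedule, and synthetic "descriptor" vectors are made-up assumptions, and production target-prediction tools use far richer ligand descriptors): each input pulls its best-matching unit and that unit's grid neighbours towards itself, so similar inputs cluster in nearby map cells.

```python
import math
import random

def train_som(data, rows=4, cols=4, epochs=30, seed=0):
    """Train a tiny 2D self-organizing map on a list of equal-length
    feature vectors; returns the grid of node weight vectors."""
    rng = random.Random(seed)
    dim = len(data[0])
    w = [[[rng.random() for _ in range(dim)] for _ in range(cols)]
         for _ in range(rows)]
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)               # decaying learning rate
        radius = max(1.0, (rows / 2) * (1 - epoch / epochs))
        for x in data:
            # best-matching unit = node with smallest squared distance
            bmu = min(((r, c) for r in range(rows) for c in range(cols)),
                      key=lambda rc: sum((a - b) ** 2
                                         for a, b in zip(w[rc[0]][rc[1]], x)))
            for r in range(rows):
                for c in range(cols):
                    d2 = (r - bmu[0]) ** 2 + (c - bmu[1]) ** 2
                    if d2 <= radius ** 2:             # Gaussian neighbourhood
                        h = lr * math.exp(-d2 / (2 * radius ** 2))
                        w[r][c] = [a + h * (b - a)
                                   for a, b in zip(w[r][c], x)]
    return w

def best_cell(w, x):
    """Map cell whose weight vector is closest to input x."""
    return min(((r, c) for r in range(len(w)) for c in range(len(w[0]))),
               key=lambda rc: sum((a - b) ** 2
                                  for a, b in zip(w[rc[0]][rc[1]], x)))

# two well-separated synthetic descriptor clusters (made-up data)
rng = random.Random(42)
dim = 5
data = ([[0.1 + 0.02 * rng.random() for _ in range(dim)] for _ in range(10)]
        + [[0.9 + 0.02 * rng.random() for _ in range(dim)] for _ in range(10)])
som = train_som(data)
```

After training, vectors from the two clusters land in different map cells; consensus scoring as discussed in the review would combine such assignments across several independently trained maps.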

  20. Design and application of a C++ macromolecular class library.

    Science.gov (United States)

    Chang, W; Shindyalov, I N; Pu, C; Bourne, P E

    1994-01-01

    PDBlib is an extensible object oriented class library written in C++ for representing the 3-dimensional structure of biological macromolecules. PDBlib forms the kernel of a larger software framework being developed for assisting in knowledge discovery from macromolecular structure data. The software design strategy used by PDBlib, how the library may be used and several prototype applications that use the library are summarized. PDBlib represents the structural features of proteins, DNA, RNA, and complexes thereof, at a level of detail on a par with that which can be parsed from a Protein Data Bank (PDB) entry. However, the memory resident representation of the macromolecule is independent of the PDB entry and can be obtained from other back-end data sources, for example, existing relational databases and our own object oriented database (OOPDB) built on top of the commercial object oriented database, ObjectStore. At the front-end are several prototype applications that use the library: Macromolecular Query Language (MMQL) is based on a separate class library (MMQLlib) for building complex queries pertaining to macromolecular structure; PDBtool is an interactive structure verification tool; and PDBview is a structure-rendering tool used either as a standalone tool or as part of another application. Each of these software components is described. All software is available via anonymous ftp from cuhhca.hhmi.columbia.edu.
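The fixed-column PDB coordinate format that such a library must parse can be illustrated with a minimal reader (a sketch of the format, not PDBlib's actual API; only a handful of the record's fields are extracted):

```python
def parse_atoms(pdb_text):
    """Extract a few fields from ATOM/HETATM records of PDB-format
    text, using the format's fixed column positions."""
    atoms = []
    for line in pdb_text.splitlines():
        if line.startswith(("ATOM  ", "HETATM")):
            atoms.append({
                "serial":  int(line[6:11]),      # columns 7-11
                "name":    line[12:16].strip(),  # columns 13-16
                "resName": line[17:20].strip(),  # columns 18-20
                "chainID": line[21],             # column  22
                "resSeq":  int(line[22:26]),     # columns 23-26
                "x": float(line[30:38]),         # columns 31-38
                "y": float(line[38:46]),         # columns 39-46
                "z": float(line[46:54]),         # columns 47-54
            })
    return atoms

# a single illustrative ATOM record (coordinates are made up)
sample = ("ATOM      1  N   MET A   1      "
          "11.104  13.207   2.100  1.00 20.00           N")
atoms = parse_atoms(sample)
```

A class library like PDBlib wraps such raw records in chain/residue/atom objects so that the in-memory model no longer depends on the flat-file layout.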

  1. Development of an online UV–visible microspectrophotometer for a macromolecular crystallography beamline

    Energy Technology Data Exchange (ETDEWEB)

    Shimizu, Nobutaka, E-mail: nobutaka.shimizu@kek.jp [SPring-8/JASRI, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5198 (Japan); High Energy Accelerator Research Organization (KEK), 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Shimizu, Tetsuya [RIKEN SPring-8 Center, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5148 (Japan); Baba, Seiki; Hasegawa, Kazuya [SPring-8/JASRI, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5198 (Japan); Yamamoto, Masaki [RIKEN SPring-8 Center, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5148 (Japan); Kumasaka, Takashi [SPring-8/JASRI, 1-1-1 Koto, Sayo-cho, Sayo-gun, Hyogo 679-5198 (Japan)

    2013-11-01

    An online UV–visible microspectrophotometer has been developed for the macromolecular crystallography beamline at SPring-8. Details of this spectrophotometer are reported. Measurement of the UV–visible absorption spectrum is a convenient technique for detecting chemical changes of proteins, and it is therefore useful to combine spectroscopy and diffraction studies. An online microspectrophotometer for the UV–visible region was developed and installed on the macromolecular crystallography beamline, BL38B1, at SPring-8. This spectrophotometer is equipped with a difference dispersive double monochromator, a mercury–xenon lamp as the light source, and a photomultiplier as the detector. The optical path is mostly constructed using mirrors, in order to obtain high brightness in the UV region, and the confocal optics are assembled using a cross-slit diaphragm like an iris to eliminate stray light. This system can measure optical densities up to a maximum of 4.0. To study the effect of radiation damage, preliminary measurements of glucose isomerase and thaumatin crystals were conducted in the UV region. Spectral changes dependent on X-ray dose were observed at around 280 nm, suggesting that structural changes involving Trp or Tyr residues occurred in the protein crystal. In the case of the thaumatin crystal, a broad peak around 400 nm was also generated after X-ray irradiation, suggesting the cleavage of a disulfide bond. Dose-dependent spectral changes were also observed in cryo-solutions alone, and these changes differed with the composition of the cryo-solution. These responses in the UV region are informative regarding the state of the sample; consequently, this device might be useful for X-ray crystallography.
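The optical-density scale mentioned above is logarithmic in transmitted intensity; the conversion is the generic Beer-Lambert relation (shown here as a formula sketch, not the instrument's calibration procedure):

```python
import math

def optical_density(transmitted, incident):
    """Absorbance (optical density) from transmitted vs. incident
    light intensity: OD = -log10(I / I0)."""
    return -math.log10(transmitted / incident)

# an OD of 4.0 (the spectrophotometer's stated maximum) corresponds to
# only 0.01% of the incident light reaching the detector, which is why
# suppressing stray light with confocal optics matters at high OD
od_max = optical_density(1e-4, 1.0)
```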

  2. Repair of radiation damage in mammalian cells

    Energy Technology Data Exchange (ETDEWEB)

    Setlow, R.B.

    1981-01-01

    The responses, such as survival, mutation, and carcinogenesis, of mammalian cells and tissues to radiation are dependent not only on the magnitude of the damage to macromolecular structures - DNA, RNA, protein, and membranes - but also on the rates of macromolecular syntheses of cells relative to the half-lives of the damages. Cells possess a number of mechanisms for repairing damage to DNA. If the repair systems are rapid and error free, cells can tolerate much larger doses than if repair is slow or error prone. It is important to understand the effects of radiation and the repair of radiation damage because there exists a reasonable body of epidemiological data that permits the construction of dose-response curves for humans. The shapes of such curves or the magnitude of the response will depend on repair. Radiation damage is emphasized because: (a) radiation dosimetry, with all its uncertainties for populations, is excellent compared to chemical dosimetry; (b) a number of cancer-prone diseases are known in which there are defects in DNA repair and radiation results in more chromosomal damage in cells from such individuals than in cells from normal individuals; (c) in some cases, specific radiation products in DNA have been correlated with biological effects, and (d) many chemical effects seem to mimic radiation effects. A further reason for emphasizing damage to DNA is the wealth of experimental evidence indicating that damages to DNA can be initiating events in carcinogenesis.

  3. Repair of radiation damage in mammalian cells

    International Nuclear Information System (INIS)

    Setlow, R.B.

    1981-01-01

    The responses, such as survival, mutation, and carcinogenesis, of mammalian cells and tissues to radiation are dependent not only on the magnitude of the damage to macromolecular structures - DNA, RNA, protein, and membranes - but also on the rates of macromolecular syntheses of cells relative to the half-lives of the damages. Cells possess a number of mechanisms for repairing damage to DNA. If the repair systems are rapid and error free, cells can tolerate much larger doses than if repair is slow or error prone. It is important to understand the effects of radiation and the repair of radiation damage because there exists a reasonable body of epidemiological data that permits the construction of dose-response curves for humans. The shapes of such curves or the magnitude of the response will depend on repair. Radiation damage is emphasized because: (a) radiation dosimetry, with all its uncertainties for populations, is excellent compared to chemical dosimetry; (b) a number of cancer-prone diseases are known in which there are defects in DNA repair and radiation results in more chromosomal damage in cells from such individuals than in cells from normal individuals; (c) in some cases, specific radiation products in DNA have been correlated with biological effects, and (d) many chemical effects seem to mimic radiation effects. A further reason for emphasizing damage to DNA is the wealth of experimental evidence indicating that damages to DNA can be initiating events in carcinogenesis

  4. Workshop on algorithms for macromolecular modeling. Final project report, June 1, 1994--May 31, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Leimkuhler, B.; Hermans, J.; Skeel, R.D.

    1995-07-01

    A workshop was held on algorithms and parallel implementations for macromolecular dynamics, protein folding, and structural refinement. This document contains abstracts and brief reports from that workshop.

  5. The contrasting effect of macromolecular crowding on amyloid fibril formation.

    Directory of Open Access Journals (Sweden)

    Qian Ma

    Full Text Available Amyloid fibrils associated with neurodegenerative diseases can be considered biologically relevant failures of cellular quality control mechanisms. It is known that in vivo human Tau protein, human prion protein, and human copper, zinc superoxide dismutase (SOD1 have the tendency to form fibril deposits in a variety of tissues and they are associated with different neurodegenerative diseases, while rabbit prion protein and hen egg white lysozyme do not readily form fibrils and are unlikely to cause neurodegenerative diseases. In this study, we have investigated the contrasting effect of macromolecular crowding on fibril formation of different proteins.As revealed by assays based on thioflavin T binding and turbidity, human Tau fragments, when phosphorylated by glycogen synthase kinase-3β, do not form filaments in the absence of a crowding agent but do form fibrils in the presence of a crowding agent, and the presence of a strong crowding agent dramatically promotes amyloid fibril formation of human prion protein and its two pathogenic mutants E196K and D178N. Such an enhancing effect of macromolecular crowding on fibril formation is also observed for a pathological human SOD1 mutant A4V. On the other hand, rabbit prion protein and hen lysozyme do not form amyloid fibrils when a crowding agent at 300 g/l is used but do form fibrils in the absence of a crowding agent. Furthermore, aggregation of these two proteins is remarkably inhibited by Ficoll 70 and dextran 70 at 200 g/l.We suggest that proteins associated with neurodegenerative diseases are more likely to form amyloid fibrils under crowded conditions than in dilute solutions. By contrast, some of the proteins that are not neurodegenerative disease-associated are unlikely to misfold in crowded physiological environments. A possible explanation for the contrasting effect of macromolecular crowding on these two sets of proteins (amyloidogenic proteins and non-amyloidogenic proteins has been

  6. Structural analysis of nanoparticulate carriers for encapsulation of macromolecular drugs

    Czech Academy of Sciences Publication Activity Database

    Angelov, Borislav; Garamus, V.M.; Drechsler, M.; Angelova, A.

    2017-01-01

    Roč. 235, Jun (2017), s. 83-89 ISSN 0167-7322 R&D Projects: GA MŠk EF15_003/0000447; GA MŠk EF15_008/0000162 Grant - others:OP VVV - ELIBIO(XE) CZ.02.1.01/0.0/0.0/15_003/0000447; ELI Beamlines(XE) CZ.02.1.01/0.0/0.0/15_008/0000162 Institutional support: RVO:68378271 Keywords : self-assembled nanocarriers * liquid crystalline phase transitions * cationic lipids * macromolecular drugs Subject RIV: BO - Biophysics OBOR OECD: Biophysics Impact factor: 3.648, year: 2016

  7. Bringing macromolecular machinery to life using 3D animation.

    Science.gov (United States)

    Iwasa, Janet H

    2015-04-01

    Over the past decade, there has been a rapid rise in the use of three-dimensional (3D) animation to depict molecular and cellular processes. Much of the growth in molecular animation has been in the educational arena, but increasingly, 3D animation software is finding its way into research laboratories. In this review, I will discuss a number of ways in which 3D animation software can play a valuable role in visualizing and communicating macromolecular structures and dynamics. I will also consider the challenges of using animation tools within the research sphere. Copyright © 2015. Published by Elsevier Ltd.

  8. Protein crystal growth studies at the Center for Macromolecular Crystallography

    International Nuclear Information System (INIS)

    DeLucas, Lawrence J.; Long, Marianna M.; Moore, Karen M.; Harrington, Michael; McDonald, William T.; Smith, Craig D.; Bray, Terry; Lewis, Johanna; Crysel, William B.; Weise, Lance D.

    2000-01-01

    The Center for Macromolecular Crystallography (CMC) has been involved in fundamental studies of protein crystal growth (PCG) in microgravity and in our earth-based laboratories. A large group of co-investigators from academia and industry participated in these experiments by providing protein samples and by performing the x-ray crystallographic analysis. These studies have clearly demonstrated the usefulness of a microgravity environment for enhancing the quality and size of protein crystals. A review of the vapor diffusion apparatus (VDA) PCG results from nineteen space shuttle missions is given in this paper

  9. Thiomers for oral delivery of hydrophilic macromolecular drugs.

    Science.gov (United States)

    Bernkop-Schnürch, Andreas; Hoffer, Martin H; Kafedjiiski, Krum

    2004-11-01

    In recent years thiolated polymers (thiomers) have appeared as a promising new tool in oral drug delivery. Thiomers are obtained by the immobilisation of thiol-bearing ligands to mucoadhesive polymeric excipients. By the formation of disulfide bonds with mucus glycoproteins, the mucoadhesive properties of thiomers are improved up to 130-fold compared with the corresponding unmodified polymers. Owing to the formation of inter- and intramolecular disulfide bonds within the thiomer itself, matrix tablets and particulate delivery systems show strong cohesive properties, resulting in comparatively higher stability, prolonged disintegration times and a more controlled drug release. The permeation of hydrophilic macromolecular drugs through the gastrointestinal (GI) mucosa can be improved by the use of thiomers. Furthermore, some thiomers exhibit improved inhibitory properties towards GI peptidases. The efficacy of thiomers in oral drug delivery has been demonstrated by various in vivo studies. A pharmacological efficacy of 1%, for example, was achieved in rats by oral administration of calcitonin tablets comprising a thiomer. Furthermore, tablets comprising a thiomer and pegylated insulin resulted in a pharmacological efficacy of 7% after oral application to diabetic mice. Low-molecular-weight heparin embedded in thiolated polycarbophil led to an absolute bioavailability of > or = 20% after oral administration to rats. In these studies, formulations comprising the corresponding unmodified polymer had only a marginal or no effect. These results indicate that drug carrier systems based on thiomers are a promising tool for oral delivery of hydrophilic macromolecular drugs.

  10. Outrunning free radicals in room-temperature macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Owen, Robin L., E-mail: robin.owen@diamond.ac.uk; Axford, Danny [Diamond Light Source, Harwell Science and Innovation Campus, Didcot OX11 0DE (United Kingdom); Nettleship, Joanne E.; Owens, Raymond J. [Rutherford Appleton Laboratory, Didcot OX11 0FA (United Kingdom); The Henry Wellcome Building for Genomic Medicine, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Robinson, James I.; Morgan, Ann W. [University of Leeds, Leeds LS9 7FT (United Kingdom); Doré, Andrew S. [Heptares Therapeutics Ltd, BioPark, Welwyn Garden City AL7 3AX (United Kingdom); Lebon, Guillaume; Tate, Christopher G. [MRC Laboratory of Molecular Biology, Hills Road, Cambridge CB2 0QH (United Kingdom); Fry, Elizabeth E.; Ren, Jingshan [The Henry Wellcome Building for Genomic Medicine, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Stuart, David I. [Diamond Light Source, Harwell Science and Innovation Campus, Didcot OX11 0DE (United Kingdom); The Henry Wellcome Building for Genomic Medicine, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Evans, Gwyndaf [Diamond Light Source, Harwell Science and Innovation Campus, Didcot OX11 0DE (United Kingdom)

    2012-06-15

    A systematic increase in lifetime is observed in room-temperature protein and virus crystals through the use of reduced exposure times and a fast detector. A significant increase in the lifetime of room-temperature macromolecular crystals is reported through the use of a high-brilliance X-ray beam, reduced exposure times and a fast-readout detector. This is attributed to the ability to collect diffraction data before hydroxyl radicals can propagate through the crystal, fatally disrupting the lattice. Hydroxyl radicals are shown to be trapped in amorphous solutions at 100 K. The trend in crystal lifetime was observed in crystals of a soluble protein (immunoglobulin γ Fc receptor IIIa), a virus (bovine enterovirus serotype 2) and a membrane protein (human A{sub 2A} adenosine G-protein coupled receptor). The observation of a similar effect in all three systems provides clear evidence for a common optimal strategy for room-temperature data collection and will inform the design of future synchrotron beamlines and detectors for macromolecular crystallography.

  11. Outrunning free radicals in room-temperature macromolecular crystallography

    International Nuclear Information System (INIS)

    Owen, Robin L.; Axford, Danny; Nettleship, Joanne E.; Owens, Raymond J.; Robinson, James I.; Morgan, Ann W.; Doré, Andrew S.; Lebon, Guillaume; Tate, Christopher G.; Fry, Elizabeth E.; Ren, Jingshan; Stuart, David I.; Evans, Gwyndaf

    2012-01-01

    A systematic increase in lifetime is observed in room-temperature protein and virus crystals through the use of reduced exposure times and a fast detector. A significant increase in the lifetime of room-temperature macromolecular crystals is reported through the use of a high-brilliance X-ray beam, reduced exposure times and a fast-readout detector. This is attributed to the ability to collect diffraction data before hydroxyl radicals can propagate through the crystal, fatally disrupting the lattice. Hydroxyl radicals are shown to be trapped in amorphous solutions at 100 K. The trend in crystal lifetime was observed in crystals of a soluble protein (immunoglobulin γ Fc receptor IIIa), a virus (bovine enterovirus serotype 2) and a membrane protein (human A₂A adenosine G-protein-coupled receptor). The observation of a similar effect in all three systems provides clear evidence for a common optimal strategy for room-temperature data collection and will inform the design of future synchrotron beamlines and detectors for macromolecular crystallography

  12. Variable effects of soman on macromolecular secretion by ferret trachea

    International Nuclear Information System (INIS)

    McBride, R.K.; Zwierzynski, D.J.; Stone, K.K.; Culp, D.J.; Marin, M.G.

    1991-01-01

    The purpose of this study was to examine the effect of the anticholinesterase agent, soman, on macromolecular secretion by ferret trachea, in vitro. We mounted pieces of ferret trachea in Ussing-type chambers. Secreted sulfated macromolecules were radiolabeled by adding 500 microCi of ³⁵SO₄ to the submucosal medium and incubating for 17 hr. Soman added to the submucosal side produced a concentration-dependent increase in radiolabeled macromolecular release with a maximal secretory response (mean +/- SD) of 202 +/- 125% (n = 8) relative to the basal secretion rate at a concentration of 10⁻⁷ M. The addition of either 10⁻⁶ M pralidoxime (acetylcholinesterase reactivator) or 10⁻⁶ M atropine blocked the response to 10⁻⁷ M soman. At soman concentrations greater than 10⁻⁷ M, secretion rate decreased and was not significantly different from basal secretion. Additional experiments utilizing acetylcholine and the acetylcholinesterase inhibitor, physostigmine, suggest that inhibition of secretion by high concentrations of soman may be due to a secondary antagonistic effect of soman on muscarinic receptors

  13. Dendrimer-based Macromolecular MRI Contrast Agents: Characteristics and Application

    Directory of Open Access Journals (Sweden)

    Hisataka Kobayashi

    2003-01-01

    Full Text Available Numerous macromolecular MRI contrast agents prepared employing relatively simple chemistry may be readily available that can provide sufficient enhancement for multiple applications. These agents operate using a ~100-fold lower concentration of gadolinium ions in comparison to the necessary concentration of iodine employed in CT imaging. Herein, we describe some of the general potential directions of macromolecular MRI contrast agents using our recently reported families of dendrimer-based agents as examples. Changes in molecular size altered the route of excretion. Smaller-sized contrast agents less than 60 kDa molecular weight were excreted through the kidney resulting in these agents being potentially suitable as functional renal contrast agents. Hydrophilic and larger-sized contrast agents were found better suited for use as blood pool contrast agents. Hydrophobic variants formed with polypropylenimine diaminobutane dendrimer cores created liver contrast agents. Larger hydrophilic agents are useful for lymphatic imaging. Finally, contrast agents conjugated with either monoclonal antibodies or with avidin are able to function as tumor-specific contrast agents, which also might be employed as therapeutic drugs for either gadolinium neutron capture therapy or in conjunction with radioimmunotherapy.

  14. PRIGo: a new multi-axis goniometer for macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Waltersperger, Sandro; Olieric, Vincent, E-mail: vincent.olieric@psi.ch; Pradervand, Claude [Paul Scherrer Institute, Villigen PSI (Switzerland); Glettig, Wayne [Centre Suisse d’Electronique et Microtechnique SA, Neuchâtel 2002 (Switzerland); Salathe, Marco; Fuchs, Martin R.; Curtin, Adrian; Wang, Xiaoqiang; Ebner, Simon; Panepucci, Ezequiel; Weinert, Tobias [Paul Scherrer Institute, Villigen PSI (Switzerland); Schulze-Briese, Clemens [Dectris Ltd, Baden 5400 (Switzerland); Wang, Meitian, E-mail: vincent.olieric@psi.ch [Paul Scherrer Institute, Villigen PSI (Switzerland)

    2015-05-09

    The design and performance of the new multi-axis goniometer PRIGo developed at the Swiss Light Source at Paul Scherrer Institute is described. The Parallel Robotics Inspired Goniometer (PRIGo) is a novel compact and high-precision goniometer providing an alternative to (mini-)kappa, traditional three-circle goniometers and Eulerian cradles used for sample reorientation in macromolecular crystallography. Based on a combination of serial and parallel kinematics, PRIGo emulates an arc. It is mounted on an air-bearing stage for rotation around ω and consists of four linear positioners working synchronously to achieve x, y, z translations and χ rotation (0–90°), followed by a ϕ stage (0–360°) for rotation around the sample holder axis. Owing to the use of piezo linear positioners and active correction, PRIGo features spheres of confusion of <1 µm, <7 µm and <10 µm for ω, χ and ϕ, respectively, and is therefore very well suited for micro-crystallography. PRIGo enables optimal strategies for both native and experimental phasing crystallographic data collection. Herein, PRIGo hardware and software, its calibration, as well as applications in macromolecular crystallography are described.

  15. Data Management System at the Photon Factory Macromolecular Crystallography Beamline

    International Nuclear Information System (INIS)

    Yamada, Y; Matsugaki, N; Chavas, L M G; Hiraki, M; Igarashi, N; Wakatsuki, S

    2013-01-01

    Macromolecular crystallography is a very powerful tool for investigating three-dimensional structures of macromolecules at the atomic level, and is in widespread use among structural biology researchers. Recent upgrades of the macromolecular crystallography beamlines at the Photon Factory have improved beamline throughput, allowing more experiments to be conducted during a user's beam time. As the number of beamlines has increased, so has the number of beam time applications. Consequently, both the experimental data from users' experiments and the data derived from beamline operations have increased dramatically, making it difficult for beamline operation staff and users to organize these diverse and large amounts of data. To overcome this problem, we have developed a data management system built on commercial middleware, which consists of a controller, a database, and web servers. We have prepared several database projects using this system, each dedicated to a certain aspect such as experimental results, beam time applications, the beam time schedule, or beamline operation reports. We then designed a scheme to link all the database projects.

  16. Enzymes as Green Catalysts for Precision Macromolecular Synthesis.

    Science.gov (United States)

    Shoda, Shin-ichiro; Uyama, Hiroshi; Kadokawa, Jun-ichi; Kimura, Shunsaku; Kobayashi, Shiro

    2016-02-24

    The present article comprehensively reviews macromolecular synthesis using enzymes as catalysts. Among the six main classes of enzymes, three (oxidoreductases, transferases, and hydrolases) have been employed as catalysts for in vitro macromolecular synthesis and modification reactions. Appropriate design of the reaction, including the monomer and the enzyme catalyst, produces macromolecules with precisely controlled structure, much as in vivo enzymatic reactions do. The reaction controls the product structure with respect to substrate selectivity, chemo-selectivity, regio-selectivity, stereoselectivity, and choro-selectivity. Oxidoreductases catalyze various oxidation polymerizations of aromatic compounds as well as vinyl polymerizations. Transferases are effective catalysts for producing polysaccharides having a variety of structures, as well as polyesters. Hydrolases, which catalyze bond cleavage of macromolecules in vivo, catalyze the reverse, bond-forming reaction in vitro to give various polysaccharides and functionalized polyesters. Enzymatic polymerizations allowed the first in vitro synthesis of natural polysaccharides having complicated structures, such as cellulose, amylose, xylan, chitin, hyaluronan, and chondroitin. These polymerizations are "green" in several respects: nontoxicity of the enzyme, high catalyst efficiency, selective reactions under mild conditions using green solvents and renewable starting materials, and minimal byproducts. Thus, enzymatic polymerization is desirable for the environment and contributes to "green polymer chemistry" for maintaining a sustainable society.

  17. SEMANTIC PATCH INFERENCE

    DEFF Research Database (Denmark)

    Andersen, Jesper

    2009-01-01

    Collateral evolution is the problem of updating several library-using programs in response to API changes in the used library. In this dissertation we address the issue of understanding collateral evolutions by automatically inferring a high-level specification of the changes evident in a given set ...... specifications inferred by spdiff in Linux are shown. We find that the inferred specifications concisely capture the actual collateral evolution performed in the examples....

  18. Macromolecular crowding directs extracellular matrix organization and mesenchymal stem cell behavior.

    Directory of Open Access Journals (Sweden)

    Adam S Zeiger

    Full Text Available Microenvironments of biological cells are dominated in vivo by macromolecular crowding and resultant excluded volume effects. This feature is absent in dilute in vitro cell culture. Here, we induced macromolecular crowding in vitro by using synthetic macromolecular globules of nm-scale radius at physiological levels of fractional volume occupancy. We quantified the impact of induced crowding on the extracellular and intracellular protein organization of human mesenchymal stem cells (MSCs) via immunocytochemistry, atomic force microscopy (AFM), and AFM-enabled nanoindentation. Macromolecular crowding in extracellular culture media directly induced supramolecular assembly and alignment of extracellular matrix proteins deposited by cells, which in turn increased alignment of the intracellular actin cytoskeleton. The resulting cell-matrix reciprocity further affected adhesion, proliferation, and migration behavior of MSCs. Macromolecular crowding can thus aid the design of more physiologically relevant in vitro studies and devices for MSCs and other cells, by increasing the fidelity between materials synthesized by cells in vivo and in vitro.

  19. Macromolecular crowding directs extracellular matrix organization and mesenchymal stem cell behavior.

    Science.gov (United States)

    Zeiger, Adam S; Loe, Felicia C; Li, Ran; Raghunath, Michael; Van Vliet, Krystyn J

    2012-01-01

    Microenvironments of biological cells are dominated in vivo by macromolecular crowding and resultant excluded volume effects. This feature is absent in dilute in vitro cell culture. Here, we induced macromolecular crowding in vitro by using synthetic macromolecular globules of nm-scale radius at physiological levels of fractional volume occupancy. We quantified the impact of induced crowding on the extracellular and intracellular protein organization of human mesenchymal stem cells (MSCs) via immunocytochemistry, atomic force microscopy (AFM), and AFM-enabled nanoindentation. Macromolecular crowding in extracellular culture media directly induced supramolecular assembly and alignment of extracellular matrix proteins deposited by cells, which in turn increased alignment of the intracellular actin cytoskeleton. The resulting cell-matrix reciprocity further affected adhesion, proliferation, and migration behavior of MSCs. Macromolecular crowding can thus aid the design of more physiologically relevant in vitro studies and devices for MSCs and other cells, by increasing the fidelity between materials synthesized by cells in vivo and in vitro.

  20. Electron damage in organic crystals

    International Nuclear Information System (INIS)

    Howitt, D.G.; Thomas, G.

    1977-01-01

    The effects of radiation damage in three crystalline organic materials (l-valine, cytosine, copper phthalocyanine) have been investigated by electron microscopy. The degradation of these materials has been found to be consistent with a gradual collapse of their crystal structures brought about by ionization damage to the constituent molecules. It is inferred that the crystallinity of these materials is destroyed by ionizing radiation because the damaged molecules cannot be incorporated into the framework of the original structures. (author)

  1. In-vacuum long-wavelength macromolecular crystallography.

    Science.gov (United States)

    Wagner, Armin; Duman, Ramona; Henderson, Keith; Mykhaylyk, Vitaliy

    2016-03-01

    Structure solution based on the weak anomalous signal from native (protein and DNA) crystals is increasingly being attempted as part of synchrotron experiments. Maximizing the measurable anomalous signal by collecting diffraction data at longer wavelengths presents a series of technical challenges caused by the increased absorption of X-rays and larger diffraction angles. A new beamline at Diamond Light Source has been built specifically for collecting data at wavelengths beyond the capability of other synchrotron macromolecular crystallography beamlines. Here, the theoretical considerations in support of the long-wavelength beamline are outlined and the in-vacuum design of the endstation is discussed, as well as other hardware features aimed at enhancing the accuracy of the diffraction data. The first commissioning results, representing the first in-vacuum protein structure solution, demonstrate the promising potential of the beamline.

  2. Efficient analysis of macromolecular rotational diffusion from heteronuclear relaxation data

    International Nuclear Information System (INIS)

    Dosset, Patrice; Hus, Jean-Christophe; Blackledge, Martin; Marion, Dominique

    2000-01-01

    A novel program has been developed for the interpretation of 15N relaxation rates in terms of macromolecular anisotropic rotational diffusion. The program is based on a highly efficient simulated annealing/minimization algorithm, designed specifically to search the parametric space described by the isotropic, axially symmetric and fully anisotropic rotational diffusion tensor models. The high efficiency of this algorithm allows extensive noise-based Monte Carlo error analysis. Relevant statistical tests are systematically applied to provide confidence limits for the proposed tensorial models. The program is illustrated here using the example of cytochrome c' from Rhodobacter capsulatus, a four-helix bundle heme protein, for which data at three different field strengths were independently analysed and compared.
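
The simulated annealing/minimization strategy described above can be sketched generically. In the sketch below, the objective is a stand-in anisotropic quadratic, not the program's actual diffusion-tensor target function, and all schedule parameters are illustrative assumptions.

```python
import math
import random

def anneal(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000, seed=1):
    """Generic simulated-annealing minimizer (illustrative parameters).

    Proposes random perturbations of the current point; always accepts
    downhill moves and accepts uphill moves with Boltzmann probability,
    while the temperature is cooled geometrically."""
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = objective(cand)
        # Metropolis acceptance criterion at the current temperature.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# Toy anisotropic surface standing in for a chi-squared fit to relaxation data.
obj = lambda p: (p[0] - 1.0) ** 2 + 10.0 * (p[1] + 2.0) ** 2
best, fbest = anneal(obj, [5.0, 5.0])
```

The best-so-far bookkeeping means the returned point is never worse than any point visited during the cooling schedule.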

  3. Macromolecular Crystallization in Microfluidics for the International Space Station

    Science.gov (United States)

    Monaco, Lisa A.; Spearing, Scott

    2003-01-01

    At NASA's Marshall Space Flight Center, the Iterative Biological Crystallization (IBC) project has begun development of scientific hardware for macromolecular crystallization on the International Space Station (ISS). Currently, ISS crystallization research is limited to solution recipes that were prepared on the ground prior to launch. The proposed hardware will conduct solution mixing and dispensing on board the ISS, be fully automated, and have imaging functions via remote commanding from the ground. Utilizing microfluidic technology, IBC will allow for on-orbit iterations. The microfluidic LabChip(R) devices, developed together with Caliper Technologies, will greatly benefit researchers by allowing precise fluid handling of nano/picoliter-sized volumes. IBC will maximize the science return by utilizing the microfluidic approach and will be a valuable tool for structural biologists investigating medically relevant projects.

  4. Macromolecular and dendrimer-based magnetic resonance contrast agents

    Energy Technology Data Exchange (ETDEWEB)

    Bumb, Ambika; Brechbiel, Martin W. (Radiation Oncology Branch, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States)), e-mail: pchoyke@mail.nih.gov; Choyke, Peter (Molecular Imaging Program, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States))

    2010-09-15

    Magnetic resonance imaging (MRI) is a powerful imaging modality that can provide an assessment of function or molecular expression in tandem with anatomic detail. Over the last 20-25 years, a number of gadolinium-based MR contrast agents have been developed to enhance signal by altering proton relaxation properties. This review explores a range of these agents from small molecule chelates, such as Gd-DTPA and Gd-DOTA, to macromolecular structures composed of albumin, polylysine, polysaccharides (dextran, inulin, starch), poly(ethylene glycol), copolymers of cystamine and cystine with Gd-DTPA, and various dendritic structures based on polyamidoamine and polylysine (Gadomers). The synthesis, structure, biodistribution, and targeting of dendrimer-based MR contrast agents are also discussed.

  5. The monitoring system for macromolecular crystallography beamlines at BSRF

    International Nuclear Information System (INIS)

    Guo Xian; Chang Guangcai; Gan Quan; Shi Hong; Liu Peng; Sun Gongxing

    2012-01-01

    The monitoring system for the macromolecular crystallography beamlines at BSRF (Beijing Synchrotron Radiation Facility), based on LabVIEW, is introduced. To guarantee safe, stable, and reliable running of the beamline devices, the system monitors the state of the vacuum, cooling water, optical components, beam, and liquid nitrogen in the beamlines in real time, detects faults, and raises alarms promptly. The underlying layer uses drivers developed for the field devices for data acquisition; the collected data are uploaded to a data-sharing platform that makes them accessible over the network. The upper layer is divided into modules according to function and provides the main interface of the beamline monitoring system. To facilitate data storage, management, and querying, the system uses the LabSQL toolkit to interconnect with a MySQL database to which the collected data are sent. (authors)

  6. 129 Xe NMR Relaxation-Based Macromolecular Sensing

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Muller D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Dao, Phuong [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Jeong, Keunhong [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Slack, Clancy C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Vassiliou, Christophoros C. [Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Finbloom, Joel A. [Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Francis, Matthew B. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Wemmer, David E. [Univ. of California, Berkeley, CA (United States). Dept. of Chemistry; Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Physical Biosciences Division; Pines, Alexander [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Materials Sciences Division; Univ. of California, Berkeley, CA (United States). Dept. of Chemistry

    2016-07-29

    A 129Xe NMR relaxation-based sensing approach is reported that exploits changes in the bulk xenon relaxation rate induced by slowed tumbling of a cryptophane-based sensor upon target binding. The amplification afforded by detection of the bulk dissolved xenon allows sensitive detection of targets. The sensor comprises a xenon-binding cryptophane cage, a target interaction element, and a metal chelating agent. Xenon associated with the target-bound cryptophane cage is rapidly relaxed and then detected after exchange with the bulk. Here we show that large macromolecular targets increase the rotational correlation time of xenon, increasing its relaxation rate. Upon binding of a biotin-containing sensor to avidin at 1.5 μM concentration, the free xenon T2 is reduced by a factor of 4.
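
In the fast-exchange limit, the bulk relaxation enhancement described here is commonly modeled as a population-weighted average of bound and free relaxation rates. The sketch below uses hypothetical numbers (not values from the paper) to show how a tiny bound fraction can still produce a roughly four-fold increase in the bulk rate.

```python
# Fast-exchange two-site model for the bulk 129Xe relaxation rate.
# All numerical values below are illustrative assumptions, not data
# from the reported experiment.

def observed_r2(f_bound, r2_bound, r2_free):
    """Population-weighted relaxation rate in the fast-exchange limit:
    R2_obs = f_bound * R2_bound + (1 - f_bound) * R2_free."""
    return f_bound * r2_bound + (1.0 - f_bound) * r2_free

r2_free = 0.05    # s^-1, free dissolved xenon (hypothetical)
r2_bound = 200.0  # s^-1, xenon in the target-bound cage (hypothetical)
f_bound = 7.5e-4  # tiny bound fraction, amplified via exchange with the bulk

r2_obs = observed_r2(f_bound, r2_bound, r2_free)
print(r2_obs / r2_free)  # roughly a 4-fold rate increase, i.e. T2 cut ~4x
```

Because exchange with the cage is fast relative to relaxation, even a sub-millimolar bound population dominates the observed rate, which is the amplification mechanism the abstract describes.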

  7. E-MSD: the European Bioinformatics Institute Macromolecular Structure Database.

    Science.gov (United States)

    Boutselakis, H; Dimitropoulos, D; Fillon, J; Golovin, A; Henrick, K; Hussain, A; Ionides, J; John, M; Keller, P A; Krissinel, E; McNeil, P; Naim, A; Newman, R; Oldfield, T; Pineda, J; Rachedi, A; Copeland, J; Sitnov, A; Sobhany, S; Suarez-Uruena, A; Swaminathan, J; Tagari, M; Tate, J; Tromm, S; Velankar, S; Vranken, W

    2003-01-01

    The E-MSD macromolecular structure relational database (http://www.ebi.ac.uk/msd) is designed to be a single access point for protein and nucleic acid structures and related information. The database is derived from Protein Data Bank (PDB) entries. Relational database technologies are used in a comprehensive cleaning procedure to ensure data uniformity across the whole archive. The search database contains an extensive set of derived properties, goodness-of-fit indicators, and links to other EBI databases including InterPro, GO, and SWISS-PROT, together with links to SCOP, CATH, PFAM and PROSITE. A generic search interface is available, coupled with a fast secondary structure domain search tool.

  8. NATO Advanced Study Institute on Evolving Methods for Macromolecular Crystallography

    CERN Document Server

    Read, Randy J

    2007-01-01

    X-ray crystallography is the pre-eminent technique for visualizing the structures of macromolecules at atomic resolution. These structures are central to understanding the detailed mechanisms of biological processes, and to discovering novel therapeutics using a structure-based approach. As yet, structures are known for only a small fraction of the proteins encoded by human and pathogenic genomes. To counter the myriad modern threats of disease, there is an urgent need to determine the structures of the thousands of proteins whose structure and function remain unknown. This volume draws on the expertise of leaders in the field of macromolecular crystallography to illuminate the dramatic developments that are accelerating progress in structural biology. Their contributions span the range of techniques from crystallization through data collection, structure solution and analysis, and show how modern high-throughput methods are contributing to a deeper understanding of medical problems.

  9. MR lymphography with macromolecular Gd-DTPA compounds

    International Nuclear Information System (INIS)

    Hamm, B.; Wagner, S.; Branding, G.; Taupitz, M.; Wolf, K.J.

    1990-01-01

    This paper investigates the suitability of macromolecular Gd-DTPA compounds as signal-enhancing lymphographic agents in MR imaging. Two Gd-DTPA polylysine compounds and Gd-DTPA albumin, with molecular weights of 48,000, 170,000, and 87,000 daltons, respectively, were tested in rabbits at gadolinium doses of 5 and 15 μmol per animal. Three animals were examined at each dose with T1-weighted sequences. The iliac lymph nodes were imaged prior to and during unilateral endolymphatic infusion into a femoral lymph vessel, as well as over a period of 2 hours thereafter. All contrast media showed homogeneous and pronounced signal enhancement in the lymph nodes during infusion at both doses.

  10. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    Science.gov (United States)

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.
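
The ordinary least-squares superposition that THESEUS improves upon can be sketched with the standard Kabsch algorithm; the code below is a textbook baseline written for illustration, not code from THESEUS, and it omits the maximum-likelihood weighting that is the program's actual contribution.

```python
import numpy as np

def kabsch_superpose(mobile, target):
    """Ordinary least-squares superposition (Kabsch algorithm).

    Rotates and translates `mobile` (N x 3 coordinates) onto `target`
    (N x 3), minimizing the unweighted RMSD -- the conventional
    criterion that maximum-likelihood superposition down-weights
    variable regions of."""
    mob_c = mobile - mobile.mean(axis=0)
    tgt_c = target - target.mean(axis=0)
    # Optimal rotation from the SVD of the covariance matrix.
    V, S, Wt = np.linalg.svd(mob_c.T @ tgt_c)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(V @ Wt))
    R = V @ np.diag([1.0, 1.0, d]) @ Wt
    aligned = mob_c @ R + target.mean(axis=0)
    rmsd = np.sqrt(((aligned - target) ** 2).sum() / len(target))
    return aligned, rmsd
```

Aligning a rigidly rotated and translated copy of a structure back onto the original recovers it to machine precision; ML superposition differs in weighting each atom by its estimated variance rather than treating all atoms equally.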

  11. Extracting trends from two decades of microgravity macromolecular crystallization history.

    Science.gov (United States)

    Judge, Russell A; Snell, Edward H; van der Woerd, Mark J

    2005-06-01

    Since the 1980s hundreds of macromolecular crystal growth experiments have been performed in the reduced acceleration environment of an orbiting spacecraft. Significant enhancements in structural knowledge have resulted from X-ray diffraction of the crystals grown. Equally, many samples have shown no improvement, or even degradation, in comparison to those grown on the ground. A complex series of interrelated factors affects these experiments, and by building a comprehensive archive of the results we aimed to identify the factors that lead to success and those that lead to failure. Specifically, it was found that dedicated microgravity missions increase the chance of success compared with missions where crystallization took place as a parasitic aspect of the mission. It was also found that the chance of success could not be predicted from any discernible property of the macromolecule available to us.

  12. Macromolecular contrast agents for MR mammography: current status

    International Nuclear Information System (INIS)

    Daldrup-Link, Heike E.; Brasch, Robert C.

    2003-01-01

    Macromolecular contrast media (MMCM) encompass a new class of diagnostic drugs that can be applied with dynamic MRI to extract both physiologic and morphologic information in breast lesions. Kinetic analysis of dynamic MMCM-enhanced MR data in breast tumor patients provides useful estimates of tumor blood volume and microvascular permeability, typically increased in cancer. These tumor characteristics can be applied to differentiate benign from malignant lesions, to define the angiogenesis status of cancers, and to monitor tumor response to therapy. The most immediate challenge to the development of MMCM-enhanced mammography is the identification of those candidate compounds that demonstrate the requisite long intravascular distribution and have the high tolerance necessary for clinical use. Potential mammographic applications and limitations of various MMCM, defined by either experimental animal testing or clinical testing in patients, are reviewed in this article. (orig.)

  13. Macromolecular organization of xyloglucan and cellulose in pea epicotyls

    International Nuclear Information System (INIS)

    Hayashi, T.; Maclachlan, G.

    1984-01-01

    Xyloglucan is known to occur widely in the primary cell walls of higher plants. In most dicots this polysaccharide possesses a cellulose-like main chain with three of every four consecutive residues substituted with xylose and minor additions of other sugars. Xyloglucan and cellulose metabolism are regulated by different processes, since different enzyme systems are probably required for the synthesis of their 1,4-β-linkages. A macromolecular complex composed of xyloglucan and cellulose only was obtained from elongating regions of etiolated pea stems. It was examined by light microscopy using iodine staining, by radioautography after labeling with [3H]fructose, by fluorescence microscopy using a fluorescein-lectin (fructose-binding) as probe, and by electron microscopy after shadowing. These techniques all demonstrated that the macromolecule was present in files of cell shapes, referred to here as cell-wall ghosts, in which xyloglucan was localized both on and between the cellulose microfibrils.

  14. Inference in 'poor' languages

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules ('poor' languages) are considered. The problem of the existence of a finite complete and consistent inference rule system for a 'poor' language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  15. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian statistical inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  16. Geometric statistical inference

    International Nuclear Information System (INIS)

    Periwal, Vipul

    1999-01-01

    A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined

  17. Practical Bayesian Inference

    Science.gov (United States)

    Bailer-Jones, Coryn A. L.

    2017-04-01

    Preface; 1. Probability basics; 2. Estimation and uncertainty; 3. Statistical models and inference; 4. Linear models, least squares, and maximum likelihood; 5. Parameter estimation: single parameter; 6. Parameter estimation: multiple parameters; 7. Approximating distributions; 8. Monte Carlo methods for inference; 9. Parameter estimation: Markov chain Monte Carlo; 10. Frequentist hypothesis testing; 11. Model comparison; 12. Dealing with more complicated problems; References; Index.

  18. Probing the hydration water diffusion of macromolecular surfaces and interfaces

    International Nuclear Information System (INIS)

    Ortony, Julia H; Cheng, Chi-Yuan; Franck, John M; Pavlova, Anna; Hunt, Jasmine; Han, Songi; Kausik, Ravinath

    2011-01-01

    We probe the translational dynamics of the hydration water surrounding the macromolecular surfaces of selected polyelectrolytes, lipid vesicles and intrinsically disordered proteins with site specificity in aqueous solutions. These measurements are made possible by the recent development of a new instrumental and methodological approach based on Overhauser dynamic nuclear polarization (DNP)-enhanced nuclear magnetic resonance (NMR) spectroscopy. This technique selectively amplifies 1H NMR signals of hydration water around a spin label that is attached to a molecular site of interest. The selective 1H NMR amplification within molecular length scales of a spin label is achieved by utilizing short-range (~r^-3) magnetic dipolar interactions between the 1H spin of water and the electron spin of a nitroxide radical-based label. Key features include the fact that only minute quantities (<10 μl) and dilute (≥100 μM) sample concentrations are needed. There is no size limit on the macromolecule or molecular assembly to be analyzed. Hydration water with translational correlation times between 10 and 800 ps is measured within ~10 Å of the spin label, encompassing the typical thickness of a hydration layer of about three water molecules across. The hydration water moving on this time scale has significant implications, as it is what is modulated whenever macromolecules or molecular assemblies undergo interactions, binding or conformational changes. We demonstrate, with the examples of polymer complexation, protein aggregation and lipid-polymer interaction, that measurements of interfacial hydration dynamics can sensitively and site-specifically probe macromolecular interactions.

  19. Knowledge and inference

    CERN Document Server

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer, and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in artificial intelligence.

  20. Logical inference and evaluation

    International Nuclear Information System (INIS)

    Perey, F.G.

    1981-01-01

    Most methodologies of evaluation currently used are based upon the theory of statistical inference. It is generally perceived that this theory is not capable of dealing satisfactorily with what are called systematic errors. Theories of logical inference should be capable of treating all of the information available, including that not involving frequency data. A theory of logical inference is presented as an extension of deductive logic via the concept of plausibility and the application of group theory. Some conclusions, based upon the application of this theory to evaluation of data, are also given

  1. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  2. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have

  3. INFERENCE BUILDING BLOCKS

    Science.gov (United States)

    2018-02-15

    expressed a variety of inference techniques on discrete and continuous distributions: exact inference, importance sampling, Metropolis-Hastings (MH)...without redoing any math or rewriting any code. And although our main goal is composable reuse, our performance is also good because we can use...control paths. • The Hakaru language can express mixtures of discrete and continuous distributions, but the current disintegration transformation
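
Of the inference techniques listed, importance sampling is compact enough to sketch. The target and proposal below are illustrative choices for this sketch, not code or models from the report.

```python
import math
import random

# Minimal self-normalized importance sampling, one of the inference
# techniques named above.
# Target (unnormalized): standard normal density restricted to x > 0.
# Proposal: Exponential(1), whose support covers the target's.

def target_unnorm(x):
    return math.exp(-0.5 * x * x) if x > 0 else 0.0

def posterior_mean(n, seed=0):
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = rng.expovariate(1.0)             # draw from the proposal
        w = target_unnorm(x) / math.exp(-x)  # importance weight
        num += w * x
        den += w
    return num / den  # self-normalized estimate of E[x] under the target

# The half-normal mean is sqrt(2/pi) ~ 0.798; the estimate converges to it.
print(posterior_mean(200_000))
```

Self-normalization (dividing by the weight sum) is what lets the target be unnormalized, which is the usual situation for a posterior.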

  4. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques. Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  5. Fluid Physics and Macromolecular Crystal Growth in Microgravity

    Science.gov (United States)

    Helliwell, John R.; Snell, Edward H.; Chayen, Naomi E.; Judge, Russell A.; Boggon, Titus J.; Pusey, M. L.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

The first protein crystallization experiment in microgravity was launched in April 1981 and used Germany's Technologische Experimente unter Schwerelosigkeit (TEXUS 3) sounding rocket. The protein β-galactosidase (molecular weight 465 kDa) was chosen as the sample, with a liquid-liquid diffusion growth method. A sliding device brought the protein, buffer and salt solution into contact when microgravity was reached. The sounding rocket gave six minutes of microgravity time, with a cine camera and schlieren optics used to monitor the experiment, a single growth cell. In microgravity a strictly laminar diffusion process was observed, in contrast to the turbulent convection seen on the ground. Several single crystals, approximately 100 µm in length, were formed in the flight; these were of inferior, although comparable, visual quality to those grown on the ground over several days. A second experiment using the same protocol but with solutions cooled to −8°C (kept liquid with glycerol antifreeze) again showed laminar diffusion. The science of macromolecular structural crystallography involves crystallization of the macromolecule followed by use of the crystal in X-ray diffraction experiments to determine the three-dimensional structure of the macromolecule. Neutron protein crystallography is employed for elucidation of H/D exchange and for improved definition of the bound solvent (D₂O). The structural information enables an understanding of how the molecule functions, with important potential for rational drug design, improved efficiency of industrial enzymes and agricultural chemical development. The removal of turbulent convection and sedimentation in microgravity, and the assumption that higher quality crystals will be produced, has given rise to the growing number of crystallization experiments now flown. Many experiments can be flown in a small volume with simple, largely automated equipment - an ideal combination for a microgravity experiment.
The term "protein crystal growth

  6. Functionalization of Planet-Satellite Nanostructures Revealed by Nanoscopic Localization of Distinct Macromolecular Species

    KAUST Repository

    Rossner, Christian; Roddatis, Vladimir; Lopatin, Sergei; Vana, Philipp

    2016-01-01

    The development of a straightforward method is reported to form hybrid polymer/gold planet-satellite nanostructures (PlSNs) with functional polymer. Polyacrylate type polymer with benzyl chloride in its backbone as a macromolecular tracer

  7. Nitrogen isotopic composition of macromolecular organic matter in interplanetary dust particles

    Science.gov (United States)

    Aléon, Jérôme; Robert, François; Chaussidon, Marc; Marty, Bernard

    2003-10-01

Nitrogen concentrations and isotopic compositions were measured by ion microprobe scanning imaging in two interplanetary dust particles, L2021 K1 and L2036 E22, in which imaging of D/H and C/H ratios had previously evidenced the presence of D-rich macromolecular organic components. High nitrogen concentrations of 10-20 wt% and δ¹⁵N values up to +400‰ are observed in these D-rich macromolecular components. The previous study of D/H and C/H ratios revealed three different D-rich macromolecular phases. The one previously ascribed to macromolecular organic matter akin to the insoluble organic matter (IOM) from carbonaceous chondrites is enriched in nitrogen by one order of magnitude compared with carbonaceous chondrite IOM, although its isotopic composition is still similar to what is known from Renazzo (δ¹⁵N = +208‰). The correlation observed in macromolecular organic material between the D- and ¹⁵N-excesses suggests that the latter probably originate from chemical reactions typical of the cold interstellar medium. These interstellar materials, preserved to some extent in IDPs, are therefore macromolecular organic components of varying aliphaticity and aromaticity. They are heavily N-heterosubstituted, as shown by their high nitrogen concentrations >10 wt%. They have high D/H ratios >10⁻³ and δ¹⁵N values ≥ +400‰. In L2021 K1 a mixture of interstellar and chondritic-like organic phases is observed at the micron scale. This indicates that some IDPs contain organic materials processed at various heliocentric distances in a turbulent nebula. Comparison with observations in comets suggests that these molecules may be cometary macromolecules. A correlation is observed between the D/H ratios and δ¹⁵N values of macromolecular organic matter from IDPs, meteorites, the Earth and major nebular reservoirs. This suggests that most macromolecular organic matter in the inner solar system was probably derived from interstellar precursors and further processed
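The δ¹⁵N values quoted throughout this record use the standard permil deviation from the atmospheric-nitrogen (AIR) reference; a small sketch of the conversion (the AIR ratio is the conventional literature value, not taken from this record):

```python
# Delta notation sketch. AIR (atmospheric N2) is the conventional nitrogen
# isotope standard; its 15N/14N ratio of ~3.676e-3 is the standard literature
# value, quoted here as an assumption rather than from the record above.
R_AIR = 3.676e-3

def delta15N(r_sample: float, r_standard: float = R_AIR) -> float:
    """delta-15N in permil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A delta-15N of +400 permil corresponds to a 15N/14N ratio 1.4x the standard.
print(round(delta15N(1.4 * R_AIR)))  # 400
```

The same permil convention applies to the D/H values when expressed as δD against the SMOW standard.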

  8. Timely deposition of macromolecular structures is necessary for peer review

    International Nuclear Information System (INIS)

    Joosten, Robbie P.; Soueidan, Hayssam; Wessels, Lodewyk F. A.; Perrakis, Anastassis

    2013-01-01

    Deposition of crystallographic structures should be concurrent with or prior to manuscript submission for peer review, enabling validation and increasing reliability of the PDB. Most of the macromolecular structures in the Protein Data Bank (PDB), which are used daily by thousands of educators and scientists alike, are determined by X-ray crystallography. It was examined whether the crystallographic models and data were deposited to the PDB at the same time as the publications that describe them were submitted for peer review. This condition is necessary to ensure pre-publication validation and the quality of the PDB public archive. It was found that a significant proportion of PDB entries were submitted to the PDB after peer review of the corresponding publication started, and many were only submitted after peer review had ended. It is argued that clear description of journal policies and effective policing is important for pre-publication validation, which is key in ensuring the quality of the PDB and of peer-reviewed literature

  9. Timely deposition of macromolecular structures is necessary for peer review

    Energy Technology Data Exchange (ETDEWEB)

    Joosten, Robbie P. [Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX Amsterdam (Netherlands); Soueidan, Hayssam; Wessels, Lodewyk F. A. [Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX, Amsterdam (Netherlands); Perrakis, Anastassis, E-mail: a.perrakis@nki.nl [Netherlands Cancer Institute, Plesmanlaan 121, 1066 CX Amsterdam (Netherlands)

    2013-12-01

    Deposition of crystallographic structures should be concurrent with or prior to manuscript submission for peer review, enabling validation and increasing reliability of the PDB. Most of the macromolecular structures in the Protein Data Bank (PDB), which are used daily by thousands of educators and scientists alike, are determined by X-ray crystallography. It was examined whether the crystallographic models and data were deposited to the PDB at the same time as the publications that describe them were submitted for peer review. This condition is necessary to ensure pre-publication validation and the quality of the PDB public archive. It was found that a significant proportion of PDB entries were submitted to the PDB after peer review of the corresponding publication started, and many were only submitted after peer review had ended. It is argued that clear description of journal policies and effective policing is important for pre-publication validation, which is key in ensuring the quality of the PDB and of peer-reviewed literature.

  10. JBluIce-EPICS control system for macromolecular crystallography

    International Nuclear Information System (INIS)

    Stepanov, S.; Makarov, O.; Hilgart, M.; Pothineni, S.; Urakhchin, A.; Devarapalli, S.; Yoder, D.; Becker, M.; Ogata, C.; Sanishvili, R.; Nagarajan, V.; Smith, J.L.; Fischetti, R.F.

    2011-01-01

The three macromolecular crystallography beamlines constructed by the General Medicine and Cancer Institutes Collaborative Access Team (GM/CA-CAT) in Sector 23 of the Advanced Photon Source (APS) have been in growing demand owing to their outstanding beam quality and their capacity to measure data from crystals of only a few micrometres in size. To take full advantage of the state-of-the-art mechanical and optical design of these beamlines, a significant effort has been devoted to designing fast, convenient, intuitive and robust beamline controls that can easily accommodate new beamline developments. The GM/CA-CAT beamline controls are based on the power of EPICS for distributed hardware control, the rich Java graphical user interface of Eclipse RCP, and the task-oriented philosophy as well as the look and feel of the successful SSRL BluIce graphical user interface for crystallography. These beamline controls feature a minimum number of software layers, the wide use of plug-ins that can be written in any language, and unified motion controls that allow on-the-fly scanning and optimization of any beamline component. This paper describes the ways in which BluIce was combined with EPICS and converted into the Java-based JBluIce, discusses the solutions aimed at streamlining and speeding up operations, and gives an overview of the tools that are provided by this new open-source control system for facilitating crystallographic experiments, especially in the field of microcrystallography.

  11. Structural changes in the ordering processes of macromolecular compounds

    International Nuclear Information System (INIS)

    Kobayashi, M.; Tashiro, K.

    1998-01-01

    In order to clarify the microscopically-viewed relationship between the conformational ordering process and the aggregation process of the macromolecular chains in the phase transitions from melt to solid or from solution to gel, the time-resolved Fourier-transform infrared spectra and small-angle X-ray or neutron scattering data have been analyzed in an organized manner. Two concrete examples were presented. (1) In the gelation phenomenon of syndiotactic polystyrene-organic solvent system, the ordered TTGG conformation is formed and develops with time. This conformational ordering is accelerated by the aggregation of these chain segments, resulting in the formation of macroscopic gel network. (2) In the isothermal crystallization process from the melt of polyethylene, the following ordering mechanism was revealed. The conformationally-disordered short trans conformers appear at first in the random coils of the melt. These disordered trans sequences grow to longer and more regular trans sequences of the orthorhombic-type crystal and then the isolated lamellae are formed. Afterwards, the stacked lamellar structure is developed without change of lamellar thickness but with small decrease in the long period, indicating an insertion of new lamellae between the already produced lamellar layers

  12. On macromolecular refinement at subatomic resolution with interatomic scatterers

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V., E-mail: pafonine@lbl.gov; Grosse-Kunstleve, Ralf W.; Adams, Paul D. [Lawrence Berkeley National Laboratory, One Cyclotron Road, BLDG 64R0121, Berkeley, CA 94720 (United States); Lunin, Vladimir Y. [Institute of Mathematical Problems of Biology, Russian Academy of Sciences, Pushchino 142290 (Russian Federation); Urzhumtsev, Alexandre [IGMBC, 1 Rue L. Fries, 67404 Illkirch and IBMC, 15 Rue R. Descartes, 67084 Strasbourg (France); Faculty of Sciences, Nancy University, 54506 Vandoeuvre-lès-Nancy (France); Lawrence Berkeley National Laboratory, One Cyclotron Road, BLDG 64R0121, Berkeley, CA 94720 (United States)

    2007-11-01

Modelling deformation electron density using interatomic scatterers is simpler than multipolar methods, produces comparable results at subatomic resolution and can easily be applied to macromolecules. A study of the accurate electron-density distribution in molecular crystals at subatomic resolution (better than ∼1.0 Å) requires more detailed models than those based on independent spherical atoms. A tool that is conventionally used in small-molecule crystallography is the multipolar model. Even at upper resolution limits of 0.8–1.0 Å, the number of experimental data is insufficient for full multipolar model refinement. As an alternative, a simpler model composed of conventional independent spherical atoms augmented by additional scatterers to model bonding effects has been proposed. Refinement of these mixed models for several benchmark data sets gave results that were comparable in quality with the results of multipolar refinement and superior to those for conventional models. Applications to several data sets of both small molecules and macromolecules are shown. These refinements were performed using the general-purpose macromolecular refinement module phenix.refine of the PHENIX package.
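The underlying model is the usual structure-factor sum over scatterers; a minimal sketch of how adding an interatomic scatterer perturbs a reflection (atom positions and form-factor values below are illustrative, not from the paper):

```python
import cmath

# Spherical-atom structure factor F(hkl) = sum_j f_j * exp(2*pi*i * h.x_j).
# The mixed model described above augments the atom list with extra
# scatterers at bonding positions; everything numeric here is made up.

def structure_factor(hkl, atoms):
    """atoms: list of (form_factor, fractional_xyz) pairs."""
    h, k, l = hkl
    return sum(f * cmath.exp(2j * cmath.pi * (h * x + k * y + l * z))
               for f, (x, y, z) in atoms)

# Two "spherical" atoms, e.g. carbon-like and oxygen-like form factors.
atoms = [(6.0, (0.0, 0.0, 0.0)), (8.0, (0.25, 0.25, 0.25))]
# A weak interatomic scatterer placed midway along the bond.
atoms.append((0.2, (0.125, 0.125, 0.125)))

F = structure_factor((1, 1, 1), atoms)
print(round(abs(F), 3))  # 9.802
```

In refinement the occupancy-like weight of each interatomic scatterer (0.2 above) is an adjustable parameter, which is what keeps the model far cheaper than a full multipolar expansion.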

  13. Polycapillary x-ray optics for macromolecular crystallography

    International Nuclear Information System (INIS)

    Owens, S.M.; Gibson, W.M.; Carter, D.C.; Sisk, R.C.; Ho, J.X.

    1996-01-01

    Polycapillary x-ray optics have found potential application in many different fields, including antiscatter and magnification in mammography, radiography, x-ray fluorescence, x-ray lithography, and x-ray diffraction techniques. In x-ray diffraction, an optic is used to collect divergent x-rays from a point source and redirect them into a quasi-parallel, or slightly focused beam. Monolithic polycapillary optics have been developed recently for macromolecular crystallography and have already shown considerable gains in diffracted beam intensity over pinhole collimation. Development is being pursued through a series of simulations and prototype optics. Many improvements have been made over the stage 1 prototype reported previously, which include better control over the manufacturing process, reducing the diameter of the output beam, and addition of a slight focusing at the output of the optic to further increase x-ray flux at the sample. The authors report the characteristics and performance of the stage 1 and stage 2 optics

  14. A beamline for macromolecular crystallography at the Advanced Light Source

    International Nuclear Information System (INIS)

    Padmore, H.A.; Earnest, T.; Kim, S.H.; Thompson, A.C.; Robinson, A.L.

    1994-08-01

A beamline for macromolecular crystallography has been designed for the ALS. The source will be a 37-pole wiggler with a 2-T on-axis peak field. The wiggler will illuminate three beamlines, each accepting 3 mrad of horizontal aperture. The central beamline will primarily be used for multiple-wavelength anomalous dispersion measurements in the wavelength range from 4 to 0.9 angstrom. The beamline optics will comprise a double-crystal monochromator with a collimating pre-mirror and a double-focusing mirror after the monochromator. The two side stations will be used for fixed-wavelength experiments within the wavelength range from 1.5 to 0.95 angstrom. The optics will consist of a conventional vertically focusing cylindrical mirror followed by an asymmetrically cut curved-crystal monochromator. This paper presents details of the optimization of the wiggler source for crystallography, gives a description of the beamline configuration, and discusses the reasons for the choices made

  15. Macromolecular crystallization in microgravity generated by a superconducting magnet.

    Science.gov (United States)

    Wakayama, N I; Yin, D C; Harata, K; Kiyoshi, T; Fujiwara, M; Tanimoto, Y

    2006-09-01

About 30% of the protein crystals grown in space yield better X-ray diffraction data than the best crystals grown on the earth. The microgravity environments provided by the application of an upward magnetic force constitute excellent candidates for simulating the microgravity conditions of space. Here, we describe a method to control effective gravity and the formation of protein crystals at various levels of effective gravity. Since 2002, the stable, long-duration microgravity generated by a convenient type of superconducting magnet has been available for protein crystal growth. For the first time, protein crystals (orthorhombic lysozyme) were grown at microgravity on the earth, and it was shown that this microgravity improved the crystal quality effectively and reproducibly. The present method is always accompanied by a strong magnetic field, and the magnetic field itself seems to improve crystal quality. Microgravity is not always effective for improving crystal quality: when we applied this microgravity to the formation of cubic porcine insulin and tetragonal lysozyme crystals, we observed no dependence of crystal quality on effective gravity. Thus, this kind of test will be useful for selecting promising proteins prior to space experiments. Finally, the microgravity generated by the magnet is compared with that in space, considering the cost, the quality of microgravity, experimental convenience, etc., and the future use of this microgravity for macromolecular crystal growth is discussed.

  16. Macromolecular crystallography with a large format CMOS detector

    Energy Technology Data Exchange (ETDEWEB)

    Nix, Jay C., E-mail: jcnix@lbl.gov [Molecular Biology Consortium 12003 S. Pulaski Rd. #166 Alsip, IL 60803 U.S.A (United States)

    2016-07-27

Recent advances in CMOS technology have allowed the production of large surface area detectors suitable for macromolecular crystallography experiments [1]. The Molecular Biology Consortium (MBC) Beamline 4.2.2 at the Advanced Light Source in Berkeley, CA, has installed a 2952 × 2820 pixel RDI CMOS-8M detector with funds from NIH grant S10OD012073. The detector has a 20 ns dead pixel time and performs well with shutterless data collection strategies. The sensor achieves a sharp point response and minimal optical distortion through the use of a thin fiber-optic plate between the phosphor and the sensor module. Shutterless data collection produces high-quality redundant datasets that can be obtained in minutes. The fine-sliced data are suitable for processing in standard crystallographic software packages (XDS, HKL2000, D*TREK, MOSFLM). Faster collection times relative to the previous CCD detector have resulted in a record number of datasets collected in a calendar year, and de novo phasing experiments have resulted in publications in both Science and Nature [2,3]. The faster collections are due to a combination of the decreased overhead of shutterless collection and exposure times that have decreased by over a factor of 2 for images with signal to noise comparable to that of the NOIR-1 detector. The overall increase in productivity has allowed the development of new beamline capabilities and data collection strategies.

  17. Macromolecular refinement by model morphing using non-atomic parameterizations.

    Science.gov (United States)

    Cowtan, Kevin; Agirre, Jon

    2018-02-01

    Refinement is a critical step in the determination of a model which explains the crystallographic observations and thus best accounts for the missing phase components. The scattering density is usually described in terms of atomic parameters; however, in macromolecular crystallography the resolution of the data is generally insufficient to determine the values of these parameters for individual atoms. Stereochemical and geometric restraints are used to provide additional information, but produce interrelationships between parameters which slow convergence, resulting in longer refinement times. An alternative approach is proposed in which parameters are not attached to atoms, but to regions of the electron-density map. These parameters can move the density or change the local temperature factor to better explain the structure factors. Varying the size of the region which determines the parameters at a particular position in the map allows the method to be applied at different resolutions without the use of restraints. Potential applications include initial refinement of molecular-replacement models with domain motions, and potentially the use of electron density from other sources such as electron cryo-microscopy (cryo-EM) as the refinement model.

  18. Hypoxic tumor environments exhibit disrupted collagen I fibers and low macromolecular transport.

    Directory of Open Access Journals (Sweden)

    Samata M Kakkad

Hypoxic tumor microenvironments result in an aggressive phenotype and resistance to therapy that lead to tumor progression, recurrence, and metastasis. While poor vascularization and the resultant inadequate drug delivery are known to contribute to drug resistance, the effect of hypoxia on molecular transport through the interstitium, and the role of the extracellular matrix (ECM) in mediating this transport, are unexplored. The dense mesh of fibers present in the ECM can especially influence the movement of macromolecules. Collagen 1 (Col1) fibers form a key component of the ECM in breast cancers. Here we characterized the influence of hypoxia on macromolecular transport in tumors, and the role of Col1 fibers in mediating this transport, using an MDA-MB-231 breast cancer xenograft model engineered to express red fluorescent protein under hypoxia. Magnetic resonance imaging of macromolecular transport was combined with second harmonic generation microscopy of Col1 fibers. Hypoxic tumor regions displayed significantly decreased Col1 fiber density and volume, as well as significantly lower macromolecular draining and pooling rates, than normoxic regions. Regions adjacent to severely hypoxic areas revealed higher deposition of Col1 fibers and increased macromolecular transport. These data suggest that Col1 fibers may facilitate macromolecular transport in tumors, and that their reduction in hypoxic regions may reduce this transport. Decreased macromolecular transport in hypoxic regions may also contribute to poor drug delivery and tumor recurrence in hypoxic regions. The high Col1 fiber density observed around hypoxic regions may facilitate the escape of aggressive cancer cells from hypoxic regions.

  19. Development of an online UV-visible microspectrophotometer for a macromolecular crystallography beamline.

    Science.gov (United States)

    Shimizu, Nobutaka; Shimizu, Tetsuya; Baba, Seiki; Hasegawa, Kazuya; Yamamoto, Masaki; Kumasaka, Takashi

    2013-11-01

    Measurement of the UV-visible absorption spectrum is a convenient technique for detecting chemical changes of proteins, and it is therefore useful to combine spectroscopy and diffraction studies. An online microspectrophotometer for the UV-visible region was developed and installed on the macromolecular crystallography beamline, BL38B1, at SPring-8. This spectrophotometer is equipped with a difference dispersive double monochromator, a mercury-xenon lamp as the light source, and a photomultiplier as the detector. The optical path is mostly constructed using mirrors, in order to obtain high brightness in the UV region, and the confocal optics are assembled using a cross-slit diaphragm like an iris to eliminate stray light. This system can measure optical densities up to a maximum of 4.0. To study the effect of radiation damage, preliminary measurements of glucose isomerase and thaumatin crystals were conducted in the UV region. Spectral changes dependent on X-ray dose were observed at around 280 nm, suggesting that structural changes involving Trp or Tyr residues occurred in the protein crystal. In the case of the thaumatin crystal, a broad peak around 400 nm was also generated after X-ray irradiation, suggesting the cleavage of a disulfide bond. Dose-dependent spectral changes were also observed in cryo-solutions alone, and these changes differed with the composition of the cryo-solution. These responses in the UV region are informative regarding the state of the sample; consequently, this device might be useful for X-ray crystallography.

  20. Type Inference with Inequalities

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    1991-01-01

Type inference can be phrased as constraint-solving over types. We consider an implicitly typed language equipped with recursive types, multiple inheritance, 1st-order parametric polymorphism, and assignments. Type correctness is expressed as satisfiability of a possibly infinite collection of (monotonic) inequalities on the types of variables and expressions. A general result about systems of inequalities over semilattices yields a solvable form. We distinguish between deciding typability (the existence of solutions) and type inference (the computation of a minimal solution). In our case, both…
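The minimal-solution computation described in the abstract can be sketched as a Kleene fixpoint iteration over a small semilattice; the lattice and constraints below are illustrative, not Schwartzbach's actual type system:

```python
# Sketch: minimal solution of monotone type inequalities over a toy
# semilattice Bot <= Int, Float <= Num, found by iterating from bottom.
# The lattice, constraints and variable names are all assumptions.

JOIN = {  # least upper bounds (symmetric pairs looked up both ways)
    ("Bot", "Bot"): "Bot", ("Bot", "Int"): "Int", ("Bot", "Float"): "Float",
    ("Bot", "Num"): "Num", ("Int", "Int"): "Int", ("Int", "Float"): "Num",
    ("Int", "Num"): "Num", ("Float", "Float"): "Float",
    ("Float", "Num"): "Num", ("Num", "Num"): "Num",
}

def join(a, b):
    return JOIN.get((a, b)) or JOIN[(b, a)]

def minimal_solution(constraints, variables):
    """constraints: (lower, var) pairs meaning lower <= var.
    Start every variable at Bot and raise until a fixpoint is reached;
    monotonicity over a finite lattice guarantees termination."""
    env = {v: "Bot" for v in variables}
    changed = True
    while changed:
        changed = False
        for lower, var in constraints:
            bound = env.get(lower, lower)   # lower is a constant or a variable
            new = join(env[var], bound)
            if new != env[var]:
                env[var], changed = new, True
    return env

# x := 1 (Int <= x); y := 2.0 (Float <= y); z := x + y (x <= z, y <= z)
env = minimal_solution([("Int", "x"), ("Float", "y"), ("x", "z"), ("y", "z")],
                       ["x", "y", "z"])
print(env)  # {'x': 'Int', 'y': 'Float', 'z': 'Num'}
```

The iteration computes the least solution, which is exactly the "minimal solution" distinction the abstract draws between typability and inference.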

  1. Macromolecular query language (MMQL): prototype data model and implementation.

    Science.gov (United States)

    Shindyalov, I N; Chang, W; Pu, C; Bourne, P E

    1994-11-01

Macromolecular query language (MMQL) is an extensible interpretive language in which to pose questions concerning the experimental or derived features of the 3-D structure of biological macromolecules. MMQL is intended to be intuitive, with a simple syntax, so that from a user's perspective complex queries are easily written. A number of basic queries and a more complex query - determination of structures containing a five-strand Greek key motif - are presented to illustrate the strengths and weaknesses of the language. The predominant features of MMQL are a filter and pattern grammar, which are combined to express a wide range of interesting biological queries. Filters permit the selection of object attributes, for example compound name and resolution, whereas the patterns currently implemented query primary sequence, close contacts, hydrogen bonding, secondary structure, conformation and amino acid properties (volume, polarity, isoelectric point, hydrophobicity and different forms of exposure). MMQL queries are processed by MMQLlib, a C++ class library to which new query methods and pattern types are easily added. The prototype implementation described uses PDBlib, another C++-based class library, for representing the features of biological macromolecules at the level of detail parsable from a PDB file. Since PDBlib can represent data stored in relational and object-oriented databases, as well as PDB files, once these data are loaded they too can be queried by MMQL. Performance metrics are given for queries of PDB files for which all derived data are calculated at run time, and compared with a preliminary version of OOPDB, a prototype object-oriented database with a schema based on a persistent version of PDBlib which offers more efficient data access and the potential to maintain derived information. MMQLlib, PDBlib and associated software are available via anonymous ftp from cuhhca.hhmi.columbia.edu.

  2. Macromolecular weight specificity in covalent binding of bromobenzene

    International Nuclear Information System (INIS)

    Sun, J.D.; Dent, J.G.

    1984-01-01

Bromobenzene is a hepatotoxicant that causes centrilobular necrosis. Pretreatment of animals with 3-methylcholanthrene decreases, and phenobarbital pretreatment enhances, the hepatotoxic action of this compound. We have investigated the macromolecular weight specificity of the covalent interactions of bromobenzene with liver macromolecules following incubation of [¹⁴C]bromobenzene in isolated hepatocytes. Hepatocytes were prepared from Fischer-344 rats treated for 3 days with 3-methylcholanthrene, phenobarbital, or normal saline. After a 1-hr incubation, total covalent binding, as measured by sodium dodecyl sulfate-equilibrium dialysis, was twofold less in hepatocytes from 3-methylcholanthrene-treated rats and sixfold greater in hepatocytes from phenobarbital-treated rats, compared with hepatocytes from control animals. Analysis of the arylated macromolecules by electrophoresis on 15% sodium dodecyl sulfate-polyacrylamide disc gels indicated that in the first 1 to 3 min of incubation substantial amounts of covalently bound radiolabel were associated with macromolecules of molecular weight between 20,000 and 40,000. The amount of radioactivity associated with these macromolecules rapidly diminished in hepatocytes from control and 3-methylcholanthrene-treated animals. In hepatocytes from phenobarbital-treated animals, the amount of radioactivity associated with macromolecules >20,000 increased throughout the incubation. The amount of radiolabel associated with macromolecules <20,000 increased in all incubations. When nontoxic doses of phenylmethylsulfonyl fluoride, a specific inhibitor of serine proteases, were added to control hepatocytes incubated with [¹⁴C]bromobenzene, the decrease in radioactivity associated with larger (greater than 20,000) macromolecules was inhibited, and a corresponding lack of increase in radioactivity associated with smaller macromolecules was observed

  3. Inference as Prediction

    Science.gov (United States)

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  4. Hybrid Optical Inference Machines

    Science.gov (United States)

    1991-09-27

…with labels. Now, a set of facts can be generated in the dyadic form "u, R 1,2" and can… Eichmann and Caulfield [19] consider the same type of encoding schemes. These architectures are based primarily on optical inner… [19. G. Eichmann and H. J. Caulfield, "Optical Learning (Inference)"]

  5. A simple quantitative model of macromolecular crowding effects on protein folding: Application to the murine prion protein(121-231)

    Science.gov (United States)

    Bergasa-Caceres, Fernando; Rabitz, Herschel A.

    2013-06-01

    A model of protein folding kinetics is applied to study the effects of macromolecular crowding on protein folding rate and stability. Macromolecular crowding is found to promote a decrease of the entropic cost of folding of proteins that produces an increase of both the stability and the folding rate. The acceleration of the folding rate due to macromolecular crowding is shown to be a topology-dependent effect. The model is applied to the folding dynamics of the murine prion protein (121-231). The differential effect of macromolecular crowding as a function of protein topology suffices to make non-native configurations relatively more accessible.

  6. Inference rule and problem solving

    Energy Technology Data Exchange (ETDEWEB)

    Goto, S

    1982-04-01

Intelligent information processing refers to having human intellectual activity carried out on a computer, in which inference, in place of ordinary calculation, is used as the basic operational mechanism. Many inference rules are derived from syllogisms in formal logic. The problem of programming this inference function is referred to as problem solving. Although inference and problem solving are closely related logically, the calculating ability of current computers is at a low level for inference. For clarifying the relation between inference and computers, nonmonotonic logic has been considered. The paper deals with the above topics. 16 references.
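The link between inference rules and problem solving can be illustrated with a minimal forward-chaining sketch, a programmed syllogism; the rule encoding and fact names are assumptions, not from Goto's paper:

```python
# Sketch of inference as the basic operational mechanism: forward chaining
# over simple rules (premises -> conclusion) until no new fact is derivable.
# Facts and rules here are illustrative only.

def forward_chain(facts, rules):
    """rules: list of (premises, conclusion) pairs; returns the closure."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and all(p in derived for p in premises):
                derived.add(conclusion)      # fire the rule: a syllogism step
                changed = True
    return derived

rules = [
    ({"man(socrates)"}, "mortal(socrates)"),       # all men are mortal
    ({"mortal(socrates)"}, "has_end(socrates)"),   # all mortals have an end
]
facts = forward_chain({"man(socrates)"}, rules)
print("mortal(socrates)" in facts)  # True
```

Monotonic chaining like this never retracts a derived fact, which is precisely the limitation that motivates the nonmonotonic logic mentioned in the abstract.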

  7. Towards a compact and precise sample holder for macromolecular crystallography.

    Science.gov (United States)

    Papp, Gergely; Rossi, Christopher; Janocha, Robert; Sorez, Clement; Lopez-Marrero, Marcos; Astruc, Anthony; McCarthy, Andrew; Belrhali, Hassan; Bowler, Matthew W; Cipriani, Florent

    2017-10-01

    Most of the sample holders currently used in macromolecular crystallography offer limited storage density and poor initial crystal-positioning precision upon mounting on a goniometer. This has now become a limiting factor at high-throughput beamlines, where data collection can be performed in a matter of seconds. Furthermore, this lack of precision limits the potential benefits emerging from automated harvesting systems that could provide crystal-position information which would further enhance alignment at beamlines. This situation provided the motivation for the development of a compact and precise sample holder with corresponding pucks, handling tools and robotic transfer protocols. The development process included four main phases: design, prototype manufacture, testing with a robotic sample changer and validation under real conditions on a beamline. Two sample-holder designs are proposed: NewPin and miniSPINE. They share the same robot gripper and allow the storage of 36 sample holders in uni-puck footprint-style pucks, which represents 252 samples in a dry-shipping dewar commonly used in the field. The pucks are identified with human- and machine-readable codes, as well as with radio-frequency identification (RFID) tags. NewPin offers a crystal-repositioning precision of up to 10 µm but requires a specific goniometer socket. The storage density could reach 64 samples using a special puck designed for fully robotic handling. miniSPINE is less precise but uses a goniometer mount compatible with the current SPINE standard. miniSPINE is proposed for the first implementation of the new standard, since it is easier to integrate at beamlines. An upgraded version of the SPINE sample holder with a corresponding puck named SPINEplus is also proposed in order to offer a homogenous and interoperable system. The project involved several European synchrotrons and industrial companies in the fields of consumables and sample-changer robotics. Manual handling of mini

  8. Effects of macromolecular crowding on protein conformational changes.

    Directory of Open Access Journals (Sweden)

    Hao Dong

    2010-07-01

    Full Text Available Many protein functions can be directly linked to conformational changes. Inside cells, the equilibria and transition rates between different conformations may be affected by macromolecular crowding. We have recently developed a new approach for modeling crowding effects, which enables an atomistic representation of "test" proteins. Here this approach is applied to study how crowding affects the equilibria and transition rates between open and closed conformations of seven proteins: yeast protein disulfide isomerase (yPDI), adenylate kinase (AdK), orotidine phosphate decarboxylase (ODCase), Trp repressor (TrpR), hemoglobin, DNA beta-glucosyltransferase, and Ap4A hydrolase. For each protein, molecular dynamics simulations of the open and closed states are separately run. Representative open and closed conformations are then used to calculate the crowding-induced changes in chemical potential for the two states. The difference in chemical-potential change between the two states finally predicts the effects of crowding on the population ratio of the two states. Crowding is found to reduce the open population to various extents. In the presence of crowders with a 15 Å radius and occupying 35% of volume, the open-to-closed population ratios of yPDI, AdK, ODCase and TrpR are reduced by 79%, 78%, 62% and 55%, respectively. The reductions for the remaining three proteins are 20-44%. As expected, the four proteins experiencing the stronger crowding effects are those with larger conformational changes between open and closed states (e.g., as measured by the change in radius of gyration). Larger proteins also tend to experience stronger crowding effects than smaller ones [e.g., comparing yPDI (480 residues) and TrpR (98 residues)]. The potentials of mean force along the open-closed reaction coordinate of apo and ligand-bound ODCase are altered by crowding, suggesting that transition rates are also affected. These quantitative results and qualitative trends will
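    The population-ratio arithmetic in this record follows standard statistical mechanics: a chemical-potential difference ΔΔμ between the open and closed states scales their population ratio by exp(-ΔΔμ/kT). A minimal sketch (the ΔΔμ value below is back-calculated from the reported 79% reduction for illustration, not taken from the paper):

```python
import math

kT = 0.593  # kcal/mol at ~298 K

def open_reduction(ddmu_kcal):
    """Fractional reduction of the open-to-closed population ratio when
    crowding raises the open state's chemical potential by ddmu_kcal
    (kcal/mol) relative to the closed state's."""
    return 1.0 - math.exp(-ddmu_kcal / kT)

# Back-calculate the chemical-potential difference implied by the
# reported 79% reduction for yPDI (illustrative only):
ddmu_ypdi = -kT * math.log(1.0 - 0.79)   # roughly 0.9 kcal/mol
```

A sub-kcal/mol shift in relative chemical potential is thus enough to produce the large population changes reported for the most flexible proteins.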

  9. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  10. Making Type Inference Practical

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Oxhøj, Nicholas; Palsberg, Jens

    1992-01-01

    We present the implementation of a type inference algorithm for untyped object-oriented programs with inheritance, assignments, and late binding. The algorithm significantly improves our previous one, presented at OOPSLA'91, since it can handle collection classes, such as List, in a useful way. Also, the complexity has been dramatically improved, from exponential time to low polynomial time. The implementation uses the techniques of incremental graph construction and constraint template instantiation to avoid representing intermediate results, doing superfluous work, and recomputing type information. Experiments indicate that the implementation type checks as much as 100 lines per second. This results in a mature product, on which a number of tools can be based, for example a safety tool, an image compression tool, a code optimization tool, and an annotation tool. This may make type inference for object...
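    Constraint-based type inference of the kind this record describes can be illustrated with a minimal unification sketch for a toy lambda language. This is a hedged illustration of the general technique, not the paper's algorithm; it omits the incremental-graph and constraint-template machinery, and the node encoding is invented for the example:

```python
import itertools

class TVar:
    """A fresh type variable."""
    _ids = itertools.count()
    def __init__(self):
        self.id = next(TVar._ids)
    def __repr__(self):
        return f"t{self.id}"

class Fun:
    """A function type: arg -> res."""
    def __init__(self, arg, res):
        self.arg, self.res = arg, res
    def __repr__(self):
        return f"({self.arg} -> {self.res})"

INT = "int"

def resolve(t, subst):
    """Follow substitution links until an unbound type is reached."""
    while isinstance(t, TVar) and t.id in subst:
        t = subst[t.id]
    return t

def unify(a, b, subst):
    """Solve the constraint a = b, extending subst in place."""
    a, b = resolve(a, subst), resolve(b, subst)
    if a is b:
        return
    if isinstance(a, TVar):
        subst[a.id] = b
    elif isinstance(b, TVar):
        subst[b.id] = a
    elif isinstance(a, Fun) and isinstance(b, Fun):
        unify(a.arg, b.arg, subst)
        unify(a.res, b.res, subst)
    elif a != b:
        raise TypeError(f"cannot unify {a} with {b}")

def infer(expr, env, subst):
    """Infer the type of a tuple-encoded expression."""
    kind = expr[0]
    if kind == "lit":                      # ("lit", 3)
        return INT
    if kind == "var":                      # ("var", "x")
        return env[expr[1]]
    if kind == "lam":                      # ("lam", "x", body)
        tv = TVar()
        body_t = infer(expr[2], {**env, expr[1]: tv}, subst)
        return Fun(tv, body_t)
    if kind == "app":                      # ("app", fn, arg)
        fn_t = infer(expr[1], env, subst)
        arg_t = infer(expr[2], env, subst)
        res = TVar()
        unify(fn_t, Fun(arg_t, res), subst)
        return resolve(res, subst)
    raise ValueError(f"unknown node: {kind}")

# (lambda x. x) applied to a literal comes out as int
subst = {}
result = infer(("app", ("lam", "x", ("var", "x")), ("lit", 0)), {}, subst)
```

Generating constraints and solving them by unification is the common core; the paper's contribution is making this scale to inheritance and late binding in polynomial time.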

  11. Russell and Humean Inferences

    Directory of Open Access Journals (Sweden)

    João Paulo Monteiro

    2001-12-01

    Full Text Available Russell's The Problems of Philosophy tries to establish a new theory of induction, at the same time that Hume is there accused of an "irrational scepticism about induction". But a careful analysis of the theory of knowledge explicitly acknowledged by Hume reveals that, contrary to the standard interpretation in the XXth century, possibly influenced by Russell, Hume deals exclusively with causal inference (which he never classifies as "causal induction", although now we are entitled to do so), never with inductive inference in general, mainly generalizations about sensible qualities of objects (whether, e.g., "all crows are black" or not is not among Hume's concerns). Russell's theories are thus only false alternatives to Hume's, whether in 1912 or in 1948.

  12. Causal inference in econometrics

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak

    2016-01-01

    This book is devoted to the analysis of causal inference, which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other, or whether the two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.

  13. New Paradigm for Macromolecular Crystallography Experiments at SSRL: Automated Crystal Screening And Remote Data Collection

    International Nuclear Information System (INIS)

    Soltis, S.M.; Cohen, A.E.; Deacon, A.; Eriksson, T.; Gonzalez, A.; McPhillips, S.; Chui, H.; Dunten, P.; Hollenbeck, M.; Mathews, I.; Miller, M.; Moorhead, P.; Phizackerley, R.P.; Smith, C.; Song, J.; van den Bedem, H.; Ellis, P.; Kuhn, P.; McPhillips, T.; Sauter, N.; Sharp, K.

    2009-01-01

    Complete automation of the macromolecular crystallography experiment has been achieved at Stanford Synchrotron Radiation Lightsource (SSRL) through the combination of robust mechanized experimental hardware and a flexible control system with an intuitive user interface. These highly reliable systems have enabled crystallography experiments to be carried out from the researchers' home institutions and other remote locations while retaining complete control over even the most challenging systems. A breakthrough component of the system, the Stanford Auto-Mounter (SAM), has enabled the efficient mounting of cryocooled samples without human intervention. Taking advantage of this automation, researchers have successfully screened more than 200 000 samples to select the crystals with the best diffraction quality for data collection as well as to determine optimal crystallization and cryocooling conditions. These systems, which have been deployed on all SSRL macromolecular crystallography beamlines and several beamlines worldwide, are used by more than 80 research groups in remote locations, establishing a new paradigm for macromolecular crystallography experimentation.

  14. Active inference and learning.

    Science.gov (United States)

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; O'Doherty, John; Pezzulo, Giovanni

    2016-09-01

    This paper offers an active inference account of choice behaviour and learning. It focuses on the distinction between goal-directed and habitual behaviour and how they contextualise each other. We show that habits emerge naturally (and autodidactically) from sequential policy optimisation when agents are equipped with state-action policies. In active inference, behaviour has explorative (epistemic) and exploitative (pragmatic) aspects that are sensitive to ambiguity and risk respectively, where epistemic (ambiguity-resolving) behaviour enables pragmatic (reward-seeking) behaviour and the subsequent emergence of habits. Although goal-directed and habitual policies are usually associated with model-based and model-free schemes, we find the more important distinction is between belief-free and belief-based schemes. The underlying (variational) belief updating provides a comprehensive (if metaphorical) process theory for several phenomena, including the transfer of dopamine responses, reversal learning, habit formation and devaluation. Finally, we show that active inference reduces to a classical (Bellman) scheme, in the absence of ambiguity. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Learning Convex Inference of Marginals

    OpenAIRE

    Domke, Justin

    2012-01-01

    Graphical models trained using maximum likelihood are a common tool for probabilistic inference of marginal distributions. However, this approach suffers difficulties when either the inference process or the model is approximate. In this paper, the inference process is first defined to be the minimization of a convex function, inspired by free energy approximations. Learning is then done directly in terms of the performance of the inference process at univariate marginal prediction. The main ...

  16. Probabilistic inductive inference: a survey

    OpenAIRE

    Ambainis, Andris

    2001-01-01

    Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.

  17. Multimodel inference and adaptive management

    Science.gov (United States)

    Rehme, S.E.; Powell, L.A.; Allen, Craig R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
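    The review does not give formulas, but multimodel inference in this literature is standardly implemented with AIC differences and Akaike weights (the Burnham-Anderson recipe). A minimal sketch with made-up scores showing how closely spaced AICs produce the equivocal, weak-inference weights the review found to be common:

```python
import math

def akaike_weights(aics):
    """Convert AIC scores into Akaike weights: the relative likelihood
    of each model in the candidate set, normalized to sum to 1."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three competing models. Because the
# scores are close, no single model dominates: weak inference.
weights = akaike_weights([100.0, 101.2, 103.5])
```

When the top weight stays well below 1, model-selection uncertainty is substantial, which is exactly the situation where the authors argue recommendations should be made through adaptive management rather than from the "best" model alone.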

  18. Dexamethasone attenuates grain sorghum dust extract-induced increase in macromolecular efflux in vivo.

    Science.gov (United States)

    Akhter, S R; Ikezaki, H; Gao, X P; Rubinstein, I

    1999-05-01

    The purpose of this study was to determine whether dexamethasone attenuates grain sorghum dust extract-induced increase in macromolecular efflux from the in situ hamster cheek pouch and, if so, whether this response is specific. By using intravital microscopy, we found that an aqueous extract of grain sorghum dust elicited significant, concentration-dependent leaky site formation and an increase in clearance of FITC-labeled dextran (FITC-dextran; mol mass, 70 kDa) from the in situ hamster cheek pouch (P < 0.05). Dexamethasone attenuated the grain sorghum dust extract- and substance P-induced increases in macromolecular efflux from the in situ hamster cheek pouch in a specific fashion.

  19. An acoustic on-chip goniometer for room temperature macromolecular crystallography.

    Science.gov (United States)

    Burton, C G; Axford, D; Edwards, A M J; Gildea, R J; Morris, R H; Newton, M I; Orville, A M; Prince, M; Topham, P D; Docker, P T

    2017-12-05

    This paper describes the design, development and successful use of an on-chip goniometer for room-temperature macromolecular crystallography via acoustically induced rotations. We present for the first time a low cost, rate-tunable, acoustic actuator for gradual in-fluid sample reorientation about varying axes and its utilisation for protein structure determination on a synchrotron beamline. The device enables the efficient collection of diffraction data via a rotation method from a sample within a surface confined droplet. This method facilitates efficient macromolecular structural data acquisition in fluid environments for dynamical studies.

  20. The Joint Structural Biology Group beam lines at the ESRF: Modern macromolecular crystallography

    CERN Document Server

    Mitchell, E P

    2001-01-01

    Macromolecular crystallography has evolved considerably over the last decade. Complete data sets can now be collected in under an hour on high-throughput beam lines, leading to electron density and, possibly, initial models calculated on-site. Five beam lines are currently dedicated to macromolecular crystallography: the ID14 complex and BM-14 (soon to be superseded by ID-29). These lines handle over five hundred projects every six months, and demand is increasing. Automated sample handling, alignment and data management protocols will be required to work efficiently under this demanding load. Projects developing these themes are underway within the JSBG.

  1. Aging changes of macromolecular synthesis in the mitochondria of mouse hepatocytes as revealed by microscopic radioautography

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, Tetsuji [Shinshu University, Matsumoto (Japan). Dept. of Anatomy and Cell Biology

    2007-07-01

    This mini-review reports aging changes of macromolecular synthesis in the mitochondria of mouse hepatocytes. We have observed macromolecular synthesis (DNA, RNA and proteins) in the mitochondria of various mammalian cells by means of the electron microscopic radioautography technique developed in our laboratory. The number of mitochondria per cell and the number of mitochondria labeled with 3H-thymidine, 3H-uridine and 3H-leucine (precursors for DNA, RNA and proteins, respectively) were counted, and labeling indices were calculated at various ages, from fetal and early postnatal days through several months to 1 and 2 years in senescence, revealing variations due to aging. (author)

  2. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference. -Eugenia Stoimenova, Journal of Applied Statistics, June 2012 ... one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. ... a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students. -Biometrics, 67, September 2011 ... This excellently presente

  3. Emotional inferences by pragmatics

    OpenAIRE

    Iza-Miqueleiz, Mauricio

    2017-01-01

    It has long been taken for granted that, in the course of reading a text, world knowledge is often required in order to establish coherent links between sentences (McKoon & Ratcliff 1992, Iza & Ezquerro 2000). The content grasped from a text turns out to be strongly dependent upon the reader's additional knowledge, which allows a coherent interpretation of the text as a whole. The world knowledge directing the inference may be of a distinctive nature. Gygax et al. (2007) showed that m...

  4. Generic patch inference

    DEFF Research Database (Denmark)

    Andersen, Jesper; Lawall, Julia

    2010-01-01

    A key issue in maintaining Linux device drivers is the need to keep them up to date with respect to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spdiff, that identifies common changes...... developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect...

  5. The Postgraduate Study of Macromolecular Sciences at the University of Zagreb (1971-1980

    Directory of Open Access Journals (Sweden)

    Kunst, B.

    2008-07-01

    Full Text Available The postgraduate study of macromolecular sciences (PSMS) was established at the University of Zagreb in 1971 as a university-wide study, at a time of pronounced interdisciplinary permeation of the natural sciences - physics, chemistry and biology - and application of their achievements in technological disciplines. PSMS was established by a group of prominent university professors from the schools of Science, Chemical Technology, Pharmacy and Medicine, as well as from the Institute of Biology. The study comprised the basic fields of the macromolecular sciences: organic chemistry of synthetic macromolecules, physical chemistry of macromolecules, physics of macromolecules, biological macromolecules, and polymer engineering with polymer application and processing; teaching was performed in 29 lecture courses led by 30 professors with their collaborators. PSMS ceased to exist with a change of legislation in Croatia in 1980, when the prevailing attitude was to return postgraduate studies to the individual university schools. During the nine years of its existence, PSMS awarded the MSci degree to 37 macromolecular experts. Some thirty years ago, PSMS was thus an important example of modern postgraduate education by international standards. In concordance with the recent introduction of similar interdisciplinary studies in the macromolecular sciences elsewhere in the world, the establishment of a modern interdisciplinary study in the field would be important for the further development of these sciences in Croatia.

  6. MMTF-An efficient file format for the transmission, visualization, and analysis of macromolecular structures.

    Directory of Open Access Journals (Sweden)

    Anthony R Bradley

    2017-06-01

    Full Text Available Recent advances in experimental techniques have led to a rapid growth in the complexity, size, and number of macromolecular structures made available through the Protein Data Bank. This creates a challenge for macromolecular visualization and analysis. Macromolecular structure files, such as PDB or PDBx/mmCIF files, can be slow to transfer and parse, and hard to incorporate into third-party software tools. Here, we present a new binary and compressed data representation, the MacroMolecular Transmission Format (MMTF), as well as software implementations in several languages that have been developed around it, which address these issues. We describe the new format and its APIs and demonstrate that it is several times faster to parse, and about a quarter of the file size of, the current standard format, PDBx/mmCIF. As a consequence of the new data representation, it is now possible to visualize structures with millions of atoms in a web browser, and to keep the whole PDB archive in memory or parse it within a few minutes on average computers, which opens up a new way of thinking about how to design and implement efficient algorithms in structural bioinformatics. The PDB archive is available in the MMTF file format through web services, with data updated on a weekly basis.
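    The size and parsing advantage of a binary, compressed representation can be illustrated with standard-library tools. This is only a toy coordinate table, not the actual MMTF codec (which uses a MessagePack container plus per-field encodings); the data and field widths are invented for the comparison:

```python
import gzip
import struct

# The same atom coordinates stored as fixed-width text lines
# (PDB-style) vs. packed little-endian 32-bit floats.
coords = [(float(i), float(i) * 0.5, float(i) * 0.25) for i in range(10000)]

text = "".join(f"{x:10.3f}{y:10.3f}{z:10.3f}\n" for x, y, z in coords).encode()
binary = b"".join(struct.pack("<3f", x, y, z) for x, y, z in coords)

# Compression on top of either representation shrinks transfer size further.
text_gz = gzip.compress(text)
binary_gz = gzip.compress(binary)
```

Raw binary is already well under half the text size here, and it parses without any string-to-float conversion; MMTF additionally applies delta and run-length encodings before compression, which is how it reaches roughly a quarter of PDBx/mmCIF size.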

  7. Synthesis and characterization of macromolecular rhodamine tethers and their interactions with P-glycoprotein.

    Science.gov (United States)

    Crawford, Lindsey; Putnam, David

    2014-08-20

    Rhodamine dyes are well-known P-glycoprotein (P-gp) substrates that have played an important role in the detection of inhibitors and other substrates of P-gp, as well as in the understanding of P-gp function. Macromolecular conjugates of rhodamines could prove useful as tethers for further probing of P-gp structure and function. Two macromolecular derivatives of rhodamine, methoxypolyethylene glycol-rhodamine6G and methoxypolyethylene glycol-rhodamine123, were synthesized through the 2'-position of rhodamine6G and rhodamine123, thoroughly characterized, and then evaluated by inhibition with verapamil for their ability to interact with P-gp and to act as efflux substrates. To put the results into context, the P-gp interactions of the new conjugates were compared to the commercially available methoxypolyethylene glycol-rhodamineB. FACS analysis confirmed that macromolecular tethers of rhodamine6G, rhodamine123, and rhodamineB were accumulated in P-gp expressing cells 5.2 ± 0.3%, 26.2 ± 4%, and 64.2 ± 6%, respectively, compared to a sensitive cell line that does not overexpress P-gp. Along with confocal imaging, the efflux analysis confirmed that the macromolecular rhodamine tethers remain P-gp substrates. These results open potential avenues for new ways to probe the function of P-gp both in vitro and in vivo.

  8. Interplay between the bacterial nucleoid protein H-NS and macromolecular crowding in compacting DNA

    NARCIS (Netherlands)

    Wintraecken, C.H.J.M.

    2012-01-01

    In this dissertation we discuss H-NS and its connection to nucleoid compaction and organization. Nucleoid formation involves a dramatic reduction in coil volume of the genomic DNA. Four factors are thought to influence coil volume: supercoiling, DNA charge neutralization, macromolecular

  9. Effect of macromolecular crowding on the rate of diffusion-limited ...

    Indian Academy of Sciences (India)

    The enzymatic reaction rate has been shown to be affected by the presence of such macromolecules. A simple numerical model is proposed here based on percolation and diffusion in disordered systems to study the effect of macromolecular crowding on the enzymatic reaction rates. The model qualitatively explains some ...

  10. Detection and cellular localisation of the synthetic soluble macromolecular drug carrier pHPMA

    Czech Academy of Sciences Publication Activity Database

    Kissel, M.; Peschke, P.; Šubr, Vladimír; Ulbrich, Karel; Strunz, A. M.; Kühnlein, R.; Debus, J.; Friedrich, E.

    2002-01-01

    Roč. 29, č. 8 (2002), s. 1055-1062 ISSN 1619-7070 R&D Projects: GA ČR GV307/96/K226 Institutional research plan: CEZ:AV0Z4050913 Keywords : EPR effect * Radiolabelled macromolecules * Pharmacokinetic Subject RIV: CD - Macromolecular Chemistry Impact factor: 3.568, year: 2002

  11. Atomic Scale Structural Studies of Macromolecular Assemblies by Solid-state Nuclear Magnetic Resonance Spectroscopy.

    Science.gov (United States)

    Loquet, Antoine; Tolchard, James; Berbon, Melanie; Martinez, Denis; Habenstein, Birgit

    2017-09-17

    Supramolecular protein assemblies play fundamental roles in biological processes ranging from host-pathogen interaction and viral infection to the propagation of neurodegenerative disorders. Such assemblies consist of multiple protein subunits organized in a non-covalent way to form large macromolecular objects that can execute a variety of cellular functions or cause detrimental consequences. Atomic insights into the assembly mechanisms and the functioning of those macromolecular assemblies often remain scarce, since their inherent insolubility and non-crystallinity drastically reduce the quality of the data obtained from most techniques used in structural biology, such as X-ray crystallography and solution Nuclear Magnetic Resonance (NMR). We here present magic-angle spinning solid-state NMR spectroscopy (SSNMR) as a powerful method to investigate structures of macromolecular assemblies at atomic resolution. SSNMR can reveal atomic details of the assembled complex without size and solubility limitations. The protocol presented here describes the essential steps from the production of 13C/15N isotope-labeled macromolecular protein assemblies to the acquisition of standard SSNMR spectra and their analysis and interpretation. As an example, we show the pipeline of a SSNMR structural analysis of a filamentous protein assembly.

  12. Thermodynamics of Macromolecular Association in Heterogeneous Crowding Environments: Theoretical and Simulation Studies with a Simplified Model.

    Science.gov (United States)

    Ando, Tadashi; Yu, Isseki; Feig, Michael; Sugita, Yuji

    2016-11-23

    The cytoplasm of a cell is crowded with many different kinds of macromolecules. The macromolecular crowding affects the thermodynamics and kinetics of biological reactions in a living cell, such as protein folding, association, and diffusion. Theoretical and simulation studies using simplified models focus on the essential features of the crowding effects and provide a basis for analyzing experimental data. In most of the previous studies on the crowding effects, a uniform crowder size is assumed, which is in contrast to the inhomogeneous size distribution of macromolecules in a living cell. Here, we evaluate the free energy changes upon macromolecular association in a cell-like inhomogeneous crowding system via a theory of hard-sphere fluids and free energy calculations using Brownian dynamics trajectories. The inhomogeneous crowding model based on 41 different types of macromolecules represented by spheres with different radii mimics the physiological concentrations of macromolecules in the cytoplasm of Mycoplasma genitalium. The free energy changes of macromolecular association evaluated by the theory and simulations were in good agreement with each other. The crowder size distribution affects both specific and nonspecific molecular associations, suggesting that not only the volume fraction but also the size distribution of macromolecules are important factors for evaluating in vivo crowding effects. This study relates in vitro experiments on macromolecular crowding to in vivo crowding effects by using the theory of hard-sphere fluids with crowder-size heterogeneity.
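    The distinction the authors draw between volume fraction and size distribution can be made concrete with a toy hard-sphere calculation. The radii, counts, and box size below are hypothetical, not the paper's 41-species Mycoplasma genitalium model:

```python
import math

def volume_fraction(radii, counts, box_volume):
    """Occupied volume fraction of a hard-sphere crowder mixture."""
    occupied = sum(n * (4.0 / 3.0) * math.pi * r ** 3
                   for r, n in zip(radii, counts))
    return occupied / box_volume

BOX = 50.0 ** 3  # box volume in nm^3, hypothetical

# Identical volume fraction, very different size distributions:
phi_many_small = volume_fraction([1.0], [1000], BOX)  # 1000 crowders, r = 1 nm
phi_one_large = volume_fraction([10.0], [1], BOX)     # 1 crowder,     r = 10 nm
```

Both mixtures occupy the same fraction of the box (since 1000 x 1 nm^3 equals 1 x 1000 nm^3 in r^3 terms), yet their excluded-volume effect on an associating protein pair differs, which is precisely why the study argues that volume fraction alone does not capture in vivo crowding.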

  13. Hyperbranched Polyethylenebased Macromolecular Architectures: Synthesis, Characterization, and Selfassembly

    KAUST Repository

    Al-Sulami, Ahlam

    2018-05-01

    "Chain walking" catalytic polymerization (CWCP) is a powerful tool for the one-pot synthesis of a unique class of hyperbranched polyethylene (HBPE)-based macromolecules with controllable molecular weight, topology, and composition. This dissertation focuses on new synthetic routes to prepare HBPE-based macromolecular architectures by combining the CWCP technique with ring-opening polymerization (ROP), atom-transfer radical polymerization (ATRP), and "click" chemistry. Taking advantage of end-functionalized HBPE and a new ethynyl-soketal star-shaped agent, we were able to synthesize different types of HBPE-based architectures, including a hyperbranched-on-hyperbranched core-shell nanostructure and miktoarm-star HBPE-based block copolymers. The first part of the dissertation provides a general introduction to the synthesis of polyethylene types with controllable structures; well-defined polyethylenes with different macromolecular architectures have been synthesized for both academic and industrial purposes. In the second part, HBPE with different topologies was synthesized by CWCP using an α-diimine Pd(II) catalyst. The effects of temperature and pressure on the catalyst activity and polymer properties, including branch content, molecular weight, distribution, and thermal properties, were studied. Two series of samples were synthesized: a) series A under pressures of 1, 5, and 27 atm at 5 °C, and b) series B at temperatures of 5, 15, and 35 °C under 5 atm. Proton nuclear magnetic resonance spectroscopy (1H NMR) and gel permeation chromatography (GPC) were used to determine the branching content, molecular weight, and distribution, whereas differential scanning calorimetry (DSC) was used to record the melting and glass transition temperatures as well as the degree of crystallinity. Well-defined HBPE-based core diblock copolymers with predictable amphiphilic properties are studied in the third part of the project. Hyperbranched

  14. The complex portal--an encyclopaedia of macromolecular complexes.

    Science.gov (United States)

    Meldal, Birgit H M; Forner-Martinez, Oscar; Costanzo, Maria C; Dana, Jose; Demeter, Janos; Dumousseau, Marine; Dwight, Selina S; Gaulton, Anna; Licata, Luana; Melidoni, Anna N; Ricard-Blum, Sylvie; Roechert, Bernd; Skrzypek, Marek S; Tiwari, Manu; Velankar, Sameer; Wong, Edith D; Hermjakob, Henning; Orchard, Sandra

    2015-01-01

    The IntAct molecular interaction database has created a new, free, open-source, manually curated resource, the Complex Portal (www.ebi.ac.uk/intact/complex), through which protein complexes from major model organisms are being collated and made available for search, viewing and download. It has been built in close collaboration with other bioinformatics services and populated with data from ChEMBL, MatrixDB, PDBe, Reactome and UniProtKB. Each entry contains information about the participating molecules (including small molecules and nucleic acids), their stoichiometry, topology and structural assembly. Complexes are annotated with details about their function, properties and complex-specific Gene Ontology (GO) terms. Consistent nomenclature is used throughout the resource with systematic names, recommended names and a list of synonyms all provided. The use of the Evidence Code Ontology allows us to indicate for which entries direct experimental evidence is available or if the complex has been inferred based on homology or orthology. The data are searchable using standard identifiers, such as UniProt, ChEBI and GO IDs, protein, gene and complex names or synonyms. This reference resource will be maintained and grow to encompass an increasing number of organisms. Input from groups and individuals with specific areas of expertise is welcome. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Flexibility damps macromolecular crowding effects on protein folding dynamics: Application to the murine prion protein (121-231)

    Science.gov (United States)

    Bergasa-Caceres, Fernando; Rabitz, Herschel A.

    2014-01-01

    A model of protein folding kinetics is applied to study the combined effects of protein flexibility and macromolecular crowding on protein folding rate and stability. It is found that the increase in stability and folding rate promoted by macromolecular crowding is damped for proteins with highly flexible native structures. Applied to the folding dynamics of the murine prion protein (121-231), the model shows that the high flexibility of the native isoform reduces the effects of macromolecular crowding on its folding dynamics. The relevance of these findings for the pathogenic mechanism is discussed.

  16. Feature Inference Learning and Eyetracking

    Science.gov (United States)

    Rehder, Bob; Colner, Robert M.; Hoffman, Aaron B.

    2009-01-01

    Besides traditional supervised classification learning, people can learn categories by inferring the missing features of category members. It has been proposed that feature inference learning promotes learning a category's internal structure (e.g., its typical features and interfeature correlations) whereas classification promotes the learning of…

  17. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framewor...

  18. Gauging Variational Inference

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ahn, Sungsoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of); Shin, Jinwoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2017-05-25

    Computing the partition function is the most important statistical inference task arising in applications of Graphical Models (GMs). Since it is computationally intractable, approximate methods are used in practice, of which mean-field (MF) and belief propagation (BP) are arguably the most popular and successful variational approaches. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving MF and BP, respectively. Both provide lower bounds for the partition function by utilizing the so-called gauge transformation, which modifies factors of the GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs with a single loop of a special structure, even though the bare MF and BP perform badly in this case. Our extensive experiments, on complete GMs of relatively small size and on large GMs (up to 300 variables), confirm that the newly proposed algorithms outperform and generalize MF and BP.
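    The mean-field lower bound on the partition function referred to above can be illustrated on a toy model. The sketch below uses a hypothetical 3-spin chain (field and coupling values are arbitrary choices, not from the paper): it computes the exact log partition function by enumeration and a naive mean-field ELBO, which is guaranteed not to exceed log Z for any product distribution.

```python
import itertools
import math

# Tiny pairwise binary GM (3-spin chain), x_i in {-1, +1}:
# w(x) = exp( sum_i h_i x_i + sum_(ij) J_ij x_i x_j )
h = [0.3, -0.2, 0.5]
J = {(0, 1): 0.8, (1, 2): -0.4}

def log_weight(x):
    return (sum(h[i] * x[i] for i in range(3))
            + sum(Jij * x[i] * x[j] for (i, j), Jij in J.items()))

# Exact log partition function by brute-force enumeration (tractable only for tiny GMs).
logZ = math.log(sum(math.exp(log_weight(x))
                    for x in itertools.product([-1, 1], repeat=3)))

# Naive mean-field: q(x) = prod_i q_i(x_i), parametrized by magnetizations
# m_i = E_q[x_i]. Coordinate ascent on the ELBO: m_i <- tanh(h_i + sum_j J_ij m_j).
nbrs = {i: [] for i in range(3)}
for (i, j), Jij in J.items():
    nbrs[i].append((j, Jij))
    nbrs[j].append((i, Jij))

m = [0.0, 0.0, 0.0]
for _ in range(200):
    for i in range(3):
        m[i] = math.tanh(h[i] + sum(Jij * m[j] for j, Jij in nbrs[i]))

def H2(p):  # binary entropy in nats
    return 0.0 if p <= 0.0 or p >= 1.0 else -p * math.log(p) - (1 - p) * math.log(1 - p)

# ELBO = E_q[log w] + H(q); for any product q this lower-bounds logZ.
elbo = (sum(h[i] * m[i] for i in range(3))
        + sum(Jij * m[i] * m[j] for (i, j), Jij in J.items())
        + sum(H2((1 + mi) / 2) for mi in m))
```

    The gauged variants in the paper tighten exactly this kind of bound by reparametrizing the factors before the variational optimization.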

  19. Social Inference Through Technology

    Science.gov (United States)

    Oulasvirta, Antti

    Awareness cues are computer-mediated, real-time indicators of people’s undertakings, whereabouts, and intentions. Already in the mid-1970s, UNIX users could use commands such as “finger” and “talk” to find out who was online and to chat. The small icons in instant messaging (IM) applications that indicate coconversants’ presence in the discussion space are the successors of “finger” output. Similar indicators can be found in online communities, media-sharing services, Internet relay chat (IRC), and location-based messaging applications. But presence and availability indicators are only the tip of the iceberg. Technological progress has enabled richer, more accurate, and more intimate indicators. For example, there are mobile services that allow friends to query and follow each other’s locations. Remote monitoring systems developed for health care allow relatives and doctors to assess the well-being of homebound patients (see, e.g., Tang and Venables 2000). But users also utilize cues that have not been deliberately designed for this purpose. For example, online gamers pay attention to other characters’ behavior to infer what the other players are like “in real life.” There is a common denominator underlying these examples: shared activities rely on the technology’s representation of the remote person. The other human being is not physically present but present only through a narrow technological channel.

  20. Room-temperature macromolecular serial crystallography using synchrotron radiation

    Directory of Open Access Journals (Sweden)

    Francesco Stellato

    2014-07-01

    Full Text Available A new approach for collecting data from many hundreds of thousands of microcrystals using X-ray pulses from a free-electron laser has recently been developed. In this approach, referred to as serial crystallography, diffraction patterns are recorded at a constant rate as a suspension of protein crystals flows across the path of an X-ray beam. Events that by chance contain single-crystal diffraction patterns are retained, then indexed and merged to form a three-dimensional set of reflection intensities for structure determination. This approach relies upon several innovations: an intense X-ray beam; a fast detector system; a means to rapidly flow a suspension of crystals across the X-ray beam; and the computational infrastructure to process the large volume of data. Originally conceived for radiation-damage-free measurements with ultrafast X-ray pulses, the same methods can be employed with synchrotron radiation. As in powder diffraction, the averaging of thousands of observations per Bragg peak may improve the signal-to-noise ratio of low-dose exposures. Here, it is shown that this paradigm can be implemented for room-temperature data collection using synchrotron radiation and exposure times of less than 3 ms. Using lysozyme microcrystals as a model system, over 40 000 single-crystal diffraction patterns were obtained and merged to produce a structural model that could be refined to 2.1 Å resolution. The resulting electron density is in excellent agreement with that obtained using standard X-ray data collection techniques. With further improvements the method is well suited for even shorter exposures at future and upgraded synchrotron radiation facilities that may deliver beams with 1000 times higher brightness than they currently produce.

  1. Coevolutionary constraints in the sequence-space of macromolecular complexes reflect their self-assembly pathways.

    Science.gov (United States)

    Mallik, Saurav; Kundu, Sudip

    2017-07-01

    Is the order in which biomolecular subunits self-assemble into functional macromolecular complexes imprinted in their sequence-space? Here, we demonstrate that the temporal order of macromolecular complex self-assembly can be efficiently captured using the landscape of residue-level coevolutionary constraints. This predictive power of coevolutionary constraints is irrespective of the structural, functional, and phylogenetic classification of the complex and of the stoichiometry and quaternary arrangement of the constituent monomers. Combining this result with a number of structural attributes estimated from the crystal structure data, we find indications that stronger coevolutionary constraints at interfaces formed early in the assembly hierarchy probably promote coordinated fixation of mutations, leading to high-affinity binding with larger surface area, increased surface complementarity and an elevated number of molecular contacts, compared with interfaces that form late in the assembly. Proteins 2017; 85:1183-1189. © 2017 Wiley Periodicals, Inc.

  2. Variationally optimal selection of slow coordinates and reaction coordinates in macromolecular systems

    Science.gov (United States)

    Noe, Frank

    To efficiently simulate complex macromolecular systems and generate understanding from those simulations, the concept of slow collective coordinates, or reaction coordinates, is of fundamental importance. Here we introduce variational approaches to approximate the slow coordinates and the reaction coordinates between selected end-states, given MD simulations of the macromolecular system and a (possibly large) basis set of candidate coordinates. We then discuss how to select physically intuitive order parameters that are good surrogates of this variationally optimal result. These results can be used to construct Markov state models or other models of the stationary and kinetic properties, and to parametrize low-dimensional/coarse-grained models of the dynamics. Deutsche Forschungsgemeinschaft, European Research Council.
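    For linear basis functions, the variational approach described above reduces to a time-lagged covariance (TICA-style) generalized eigenproblem. The sketch below is a minimal illustration under that assumption; the synthetic two-coordinate AR(1) trajectory is a stand-in for MD data, with relaxation rates chosen arbitrarily so that one coordinate is slow and one is fast.

```python
import numpy as np

# Synthetic "MD" data: coordinate 0 relaxes slowly (a = 0.99), coordinate 1
# quickly (a = 0.5); the slow one is what the method should recover.
rng = np.random.default_rng(0)
n, lag = 20000, 10
X = np.zeros((n, 2))
for t in range(1, n):
    X[t, 0] = 0.99 * X[t - 1, 0] + rng.normal()
    X[t, 1] = 0.50 * X[t - 1, 1] + rng.normal()

Y = X - X.mean(axis=0)
C0 = Y[:-lag].T @ Y[:-lag] / (n - lag)   # instantaneous covariance
Ct = Y[:-lag].T @ Y[lag:] / (n - lag)    # time-lagged covariance
Ct = 0.5 * (Ct + Ct.T)                   # symmetrize (reversible-dynamics estimate)

# Generalized eigenproblem Ct v = lam C0 v, solved by whitening with C0:
# eigenvalues lam estimate autocorrelations at the chosen lag; the top
# eigenvector is the variationally optimal linear slow coordinate.
L = np.linalg.cholesky(C0)
Linv = np.linalg.inv(L)
lam, V = np.linalg.eigh(Linv @ Ct @ Linv.T)
slow_mode = Linv.T @ V[:, -1]
slow_mode /= np.linalg.norm(slow_mode)   # should point mostly along coordinate 0
```

    The largest eigenvalue should land near 0.99**lag ≈ 0.90, i.e. the method identifies the slowly relaxing coordinate and its implied timescale from data alone.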

  3. Local analysis of strains and rotations for macromolecular electron microscopy maps

    Energy Technology Data Exchange (ETDEWEB)

    Martin-Ramos, A.; Prieto, F.; Melero, R.; Martin-Benito, J.; Jonic, S.; Navas-Calvente, J.; Vargas, J.; Oton, J.; Abrishami, V.; Rosa-Trevin, J.L. de la; Gomez-Blanco, J.; Vilas, J.L.; Marabini, R.; Carazo, R.; Sorzano, C.O.S.

    2016-07-01

    Macromolecular complexes can be considered as molecular nano-machines that must have mobile parts in order to perform their physiological functions. The reordering of their parts is essential to the execution of their task. These rearrangements induce local strains and rotations which, once analyzed, may provide relevant information about how the proteins perform their function. In this project these deformations of macromolecular complexes are characterized, translating the conformational changes that the complexes undergo while performing their function into a “mathematical language”. Electron Microscopy (EM) volumes are analyzed using a method based on B-spline basis functions. The results obtained are shown to be consistent with the conformational changes described in the corresponding reference publications. (Author)

  4. Macromolecularly crowded in vitro microenvironments accelerate the production of extracellular matrix-rich supramolecular assemblies.

    Science.gov (United States)

    Kumar, Pramod; Satyam, Abhigyan; Fan, Xingliang; Collin, Estelle; Rochev, Yury; Rodriguez, Brian J; Gorelov, Alexander; Dillon, Simon; Joshi, Lokesh; Raghunath, Michael; Pandit, Abhay; Zeugolis, Dimitrios I

    2015-03-04

    Therapeutic strategies based on the principles of tissue engineering by self-assembly put forward the notion that functional regeneration can be achieved by utilising the inherent capacity of cells to create highly sophisticated supramolecular assemblies. However, in dilute ex vivo microenvironments, prolonged culture time is required to develop an extracellular matrix-rich implantable device. Herein, we assessed the influence of macromolecular crowding, a biophysical phenomenon that regulates intra- and extra-cellular activities in multicellular organisms, in human corneal fibroblast culture. In the presence of macromolecules, abundant extracellular matrix deposition was evident as early as 48 h in culture, even at low serum concentration. Temperature-responsive copolymers allowed the detachment of dense and cohesive supramolecularly assembled living substitutes within 6 days in culture. Morphological, histological, gene and protein analysis assays demonstrated maintenance of tissue-specific function. Macromolecular crowding opens new avenues for a more rational design in engineering of clinically relevant tissue modules in vitro.

  5. Atomic force microscopy applied to study macromolecular content of embedded biological material

    Energy Technology Data Exchange (ETDEWEB)

    Matsko, Nadejda B. [Electron Microscopy Centre, Institute of Applied Physics, HPM C 15.1, ETH-Hoenggerberg, CH-8093, Zurich (Switzerland)]. E-mail: matsko@iap.phys.ethz.ch

    2007-02-15

    We demonstrate that atomic force microscopy represents a powerful tool for estimating the structural preservation of biological samples embedded in epoxy resin, in terms of their macromolecular distribution and architecture. The comparison of atomic force microscopy (AFM) and transmission electron microscopy (TEM) images of a biosample (Caenorhabditis elegans) prepared following different freeze-substitution protocols (conventional OsO{sub 4} fixation, epoxy fixation) led to the conclusion that high TEM stainability of the sample results from a low macromolecular density of the cellular matrix. We propose a novel procedure aimed at obtaining AFM and TEM images of the same organelle, which strongly facilitates AFM image interpretation and reveals new ultrastructural aspects (mainly protein arrangement) of a biosample in addition to TEM data.

  6. Extraction of cobalt ion from textile using a complexing macromolecular surfactant in supercritical carbon dioxide

    International Nuclear Information System (INIS)

    Chirat, Mathieu; Ribaut, Tiphaine; Clerc, Sebastien; Lacroix-Desmazes, Patrick; Charton, Frederic; Fournel, Bruno

    2013-01-01

    Cobalt ion, in the form of cobalt nitrate, is removed from a textile lab coat using supercritical carbon dioxide extraction. The process involves a macromolecular additive of well-defined architecture, acting both as a surfactant and as a complexing agent. The extraction efficiency of cobalt reaches 66% when using a poly(1,1,2,2-tetrahydroperfluoro-decyl-acrylate-co-vinyl-benzylphosphonic diacid) gradient copolymer in the presence of water at 160 bar and 40 °C. The synergy of the two additives, namely the copolymer and water, which are ineffective if used separately, is pointed out. The potential of the supercritical carbon dioxide process using a complexing macromolecular surfactant lies in the ability to tailor the complexing unit to the target metal, as well as the architecture of the surface-active agent, for applications ranging from nuclear decontamination to the recovery of strategic metals. (authors)

  7. Pi sampling: a methodical and flexible approach to initial macromolecular crystallization screening

    International Nuclear Information System (INIS)

    Gorrec, Fabrice; Palmer, Colin M.; Lebon, Guillaume; Warne, Tony

    2011-01-01

    Pi sampling, derived from the incomplete factorial approach, is an effort to maximize the diversity of macromolecular crystallization conditions and to facilitate the preparation of 96-condition initial screens. The Pi sampling method is derived from the incomplete factorial approach to macromolecular crystallization screen design. The resulting ‘Pi screens’ have a modular distribution of a given set of up to 36 stock solutions. Maximally diverse conditions can be produced by taking into account the properties of the chemicals used in the formulation and the concentrations of the corresponding solutions. The Pi sampling method has been implemented in a web-based application that generates screen formulations and recipes. It is particularly adapted to screens consisting of 96 different conditions. The flexibility and efficiency of Pi sampling is demonstrated by the crystallization of soluble proteins and of an integral membrane-protein sample

  8. Macromolecular Engineering: New Routes Towards the Synthesis of Well-Defined Polyethers/Polyesters Co/Terpolymers with Different Architectures

    KAUST Repository

    Alamri, Haleema

    2016-01-01

    Macromolecular engineering (as discussed in the first chapter) of homo/copolymers refers to the specific tailoring of these materials for achieving an easy and reproducible synthesis that results in precise molecular

  9. The Postgraduate Study of Macromolecular Sciences at the University of Zagreb (1971-1980)

    OpenAIRE

    Kunst, B.; Dezelic, D.; Veksli, Z.

    2008-01-01

    The postgraduate study of macromolecular sciences (PSMS) was established at the University of Zagreb in 1971 as a university study in the time of expressed interdisciplinary permeation of natural sciences - physics, chemistry and biology, and application of their achievements in technological disciplines. PSMS was established by a group of prominent university professors from the schools of Science, Chemical Technology, Pharmacy and Medicine, as well as from the Institute of Biology. The study...

  10. The Postgraduate Study of Macromolecular Sciences at the University of Zagreb (1971– 1980)

    OpenAIRE

    Deželić, D.; Kunst, B.; Veksli, Zorica

    2008-01-01

    The postgraduate study of macromolecular sciences (PSMS) was established at the University of Zagreb in 1971 as a university study in the time of expressed interdisciplinary permeation of natural sciences - physics, chemistry and biology, and application of their achievements in technological disciplines. PSMS was established by a group of prominent university professors from the schools of Science, Chemical Technology, Pharmacy and Medicine, as well as from the Institute of Biology. The s...

  11. Measurement and Interpretation of Diffuse Scattering in X-Ray Diffraction for Macromolecular Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Michael E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-16

    X-ray diffraction from macromolecular crystals includes both sharply peaked Bragg reflections and diffuse intensity between the peaks. The information in Bragg scattering reflects the mean electron density in the unit cells of the crystal. The diffuse scattering arises from correlations in the variations of electron density that may occur from one unit cell to another, and therefore contains information about collective motions in proteins.

  12. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operation. Data processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiments via a Web browser. (author)

  13. Macromolecular shape and interactions in layer-by-layer assemblies within cylindrical nanopores.

    Science.gov (United States)

    Lazzara, Thomas D; Lau, K H Aaron; Knoll, Wolfgang; Janshoff, Andreas; Steinem, Claudia

    2012-01-01

    Layer-by-layer (LbL) deposition of polyelectrolytes and proteins within the cylindrical nanopores of anodic aluminum oxide (AAO) membranes was studied by optical waveguide spectroscopy (OWS). AAO has aligned cylindrical, nonintersecting pores with a defined pore diameter d(0) and functions as a planar optical waveguide so as to monitor, in situ, the LbL process by OWS. The LbL deposition of globular proteins, i.e., avidin and biotinylated bovine serum albumin, was compared with that of linear polyelectrolytes (linear-PEs), both species being of similar molecular weight. LbL deposition within the cylindrical AAO geometry for different pore diameters (d(0) = 25-80 nm) for the various macromolecular species showed that the multilayer film growth was inhibited at different maximum numbers of LbL steps (n(max)). The value of n(max) was greatest for linear-PEs, while proteins had a lower value. The cylindrical pore geometry imposes a physical limit to LbL growth such that n(max) is strongly dependent on the overall internal structure of the LbL film. For all macromolecular species, deposition was inhibited in native AAO, having pores of d(0) = 25-30 nm. Both OWS and scanning electron microscopy showed that LbL growth in larger AAO pores (d(0) > 25-30 nm) became inhibited when approaching a pore diameter of d(eff,n_max) = 25-35 nm, a similar size to that of native AAO pores, with d(0) = 25-30 nm. For a reasonable estimation of d(eff,n_max), the actual volume occupied by a macromolecular assembly must be taken into consideration. The results clearly show that electrostatic LbL allowed for compact macromolecular layers, whereas proteins formed loosely packed multilayers.
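    The geometric limit on LbL growth described above admits a back-of-the-envelope estimate. The helper below is a hypothetical illustration: it assumes each deposition step uniformly narrows the pore diameter by twice an assumed layer thickness, and that growth stalls near the reported d(eff,n_max) of roughly 30 nm. Both numerical defaults are illustrative assumptions, not fitted values from the study.

```python
import math

def n_max_layers(d0_nm, d_stop_nm=30.0, layer_nm=2.0):
    """Estimate how many LbL steps fit in a cylindrical pore before growth stalls.

    Assumes each step narrows the pore diameter by 2 * layer_nm, and that
    deposition stops once the effective diameter reaches d_stop_nm (~25-35 nm
    per the study). layer_nm and d_stop_nm are illustrative assumptions.
    """
    return max(0, math.floor((d0_nm - d_stop_nm) / (2.0 * layer_nm)))
```

    With these assumptions, an 80 nm pore admits n_max_layers(80) = 12 steps before stalling, while n_max_layers(25) = 0, consistent with the inhibited deposition observed in native 25-30 nm AAO pores.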

  14. Tuning the properties of an anthracene-based PPE-PPV copolymer by fine variation of its macromolecular parameters

    Czech Academy of Sciences Publication Activity Database

    Tinti, F.; Sabir, F. K.; Gazzano, M.; Righi, S.; Ulbricht, C.; Usluer, Ö.; Pokorná, Veronika; Cimrová, Věra; Yohannes, T.; Egbe, D. A. M.; Camaioni, N.

    2013-01-01

    Roč. 3, č. 19 (2013), s. 6972-6980 ISSN 2046-2069 R&D Projects: GA ČR GAP106/12/0827; GA ČR(CZ) GA13-26542S Institutional support: RVO:61389013 Keywords : anthracene-containing PPE-PPV copolymer * macromolecular parameters * structural and transport properties Subject RIV: CD - Macromolecular Chemistry Impact factor: 3.708, year: 2013

  15. Macromolecular diffusion in crowded media beyond the hard-sphere model.

    Science.gov (United States)

    Blanco, Pablo M; Garcés, Josep Lluís; Madurga, Sergio; Mas, Francesc

    2018-04-25

    The effect of macromolecular crowding on diffusion beyond the hard-core sphere model is studied. A new coarse-grained model is presented, the Chain Entanglement Softened Potential (CESP) model, which takes into account macromolecular flexibility and chain entanglement. The CESP model uses a shoulder-shaped interaction potential that is implemented in the Brownian Dynamics (BD) computations. The interaction potential contains only one parameter, associated with the chain entanglement energetic cost (Ur). The hydrodynamic interactions are included in the BD computations via Tokuyama mean-field equations. The model is used to analyze the diffusion of a streptavidin protein among different sized dextran obstacles. For this system, Ur is obtained by fitting the streptavidin experimental long-time diffusion coefficient Dlong versus the macromolecular concentration for D50 dextran obstacles (the label indicating their molecular weight in kg mol-1). The obtained Dlong values show better quantitative agreement with experiments than those obtained with hard-core spheres. Moreover, once parametrized, the CESP model is also able to quantitatively predict Dlong and the anomalous exponent (α) for streptavidin diffusion among D10, D400 and D700 dextran obstacles. Dlong, the short-time diffusion coefficient (Dshort) and α are obtained from the BD simulations by using a new empirical expression able to describe the full temporal evolution of the diffusion coefficient.
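    As a point of reference for the quantities Dlong and α mentioned above, the sketch below extracts a diffusion coefficient and an anomalous exponent from the mean-squared displacement (MSD) of a simulated ensemble of free 2-D Brownian walkers. This is not the CESP model, only the standard MSD analysis: for free diffusion MSD(t) = 4 D t, so the fitted α should come out near 1, while crowded simulations would yield α < 1 at intermediate times.

```python
import numpy as np

# Free 2-D Brownian motion: per-step displacements are Gaussian with
# variance 2 * D * dt in each dimension.
rng = np.random.default_rng(1)
D_true, dt = 0.5, 0.01
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(200, 5000, 2))
paths = np.cumsum(steps, axis=1)       # 200 independent walkers

# Time-origin- and ensemble-averaged MSD at a set of lag times.
lags = np.array([1, 2, 5, 10, 20, 50, 100])
msd = np.array([np.mean(np.sum((paths[:, lag:] - paths[:, :-lag]) ** 2, axis=2))
                for lag in lags])

# Fit MSD = k * t^alpha in log-log space; in 2-D, k = 4 D for normal diffusion.
alpha, log_k = np.polyfit(np.log(lags * dt), np.log(msd), 1)
D_est = np.exp(log_k) / 4.0
```

    The same log-log fit applied to a crowded trajectory is one simple way to read off the anomalous exponent the abstract refers to.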

  16. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    Science.gov (United States)

    Donovan, Preston; Chehreghanianzabi, Yasaman; Rathinam, Muruhan; Zustiak, Silviya Petrova

    2016-01-01

    The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.
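    For the same geometry of stationary impermeable spherical obstacles there is a classical closed-form point of comparison. The function below implements Maxwell's dilute-limit estimate; it is a textbook stand-in for illustration, not the homogenization formula derived in the paper, though any such formula should approach it as the obstacle volume fraction goes to zero.

```python
def maxwell_diffusivity(D0, phi):
    """Classical Maxwell estimate of the effective diffusivity of a solute among
    impermeable spherical obstacles at volume fraction phi.

    Dilute-limit textbook formula (conductivity analogy), used here only as an
    illustrative baseline: D_eff = D0 * 2 * (1 - phi) / (2 + phi).
    """
    return D0 * 2.0 * (1.0 - phi) / (2.0 + phi)
```

    The estimate decreases monotonically from D0 at phi = 0 (no obstacles) toward zero as the obstacles fill the volume, capturing the qualitative obstruction effect the paper quantifies more rigorously.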

  17. Principles and Overview of Sampling Methods for Modeling Macromolecular Structure and Dynamics.

    Science.gov (United States)

    Maximova, Tatiana; Moffatt, Ryan; Ma, Buyong; Nussinov, Ruth; Shehu, Amarda

    2016-04-01

    Investigation of macromolecular structure and dynamics is fundamental to understanding how macromolecules carry out their functions in the cell. Significant advances have been made toward this end in silico, with a growing number of computational methods proposed yearly to study and simulate various aspects of macromolecular structure and dynamics. This review aims to provide an overview of recent advances, focusing primarily on methods proposed for exploring the structure space of macromolecules in isolation and in assemblies for the purpose of characterizing equilibrium structure and dynamics. In addition to surveying recent applications that showcase current capabilities of computational methods, this review highlights state-of-the-art algorithmic techniques proposed to overcome challenges posed in silico by the disparate spatial and time scales accessed by dynamic macromolecules. This review is not meant to be exhaustive, as such an endeavor is impossible, but rather aims to balance breadth and depth of strategies for modeling macromolecular structure and dynamics for a broad audience of novices and experts.
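    One of the canonical sampling strategies surveyed in reviews like this is Metropolis Monte Carlo. The sketch below runs it on a toy one-dimensional double-well potential, an illustrative stand-in for a molecular energy landscape; the inverse temperature and proposal step size are arbitrary choices.

```python
import math
import random

random.seed(0)

def U(x):
    """Toy double-well potential with minima at x = +/-1 and a barrier at x = 0."""
    return (x * x - 1.0) ** 2

beta = 3.0                                    # inverse temperature (illustrative)
x, samples = 0.0, []
for _ in range(50000):
    prop = x + random.uniform(-0.5, 0.5)      # symmetric random-walk proposal
    dU = U(prop) - U(x)
    # Metropolis criterion: always accept downhill moves; accept uphill moves
    # with probability exp(-beta * dU), which targets the Boltzmann distribution.
    if dU <= 0.0 or random.random() < math.exp(-beta * dU):
        x = prop
    samples.append(x)

# At equilibrium the chain concentrates in the two wells at x = +/-1.
frac_in_wells = sum(0.5 < abs(s) < 1.5 for s in samples) / len(samples)
```

    The same acceptance rule underlies many of the structure-space exploration methods the review surveys, with the toy potential replaced by a molecular force field or scoring function.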

  18. In Vitro and In Vivo Evaluation of Microparticulate Drug Delivery Systems Composed of Macromolecular Prodrugs

    Directory of Open Access Journals (Sweden)

    Yoshiharu Machida

    2008-08-01

    Full Text Available Macromolecular prodrugs are very useful systems for achieving controlled drug release and drug targeting. In particular, various macromolecule-antitumor drug conjugates enhance effectiveness and mitigate toxic side effects. Also, polymeric micro- and nanoparticles have been actively examined and their in vivo behaviors elucidated, and it has been recognized that their particle characteristics are very useful for controlling drug behavior. Recently, research based on the combination of the concepts of macromolecular prodrugs and micro- or nanoparticles has been reported, although it remains limited. Macromolecular prodrugs enable drugs to be released at a controlled rate determined by the features of the macromolecule-drug linkage. Micro- and nanoparticles can control in vivo behavior based on their size, surface charge and surface structure. These merits are expected for systems produced by the combination of the two concepts. In this review, several micro- or nanoparticles composed of macromolecule-drug conjugates are described in terms of their preparation, in vitro properties and/or in vivo behavior.

  19. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    Directory of Open Access Journals (Sweden)

    Preston Donovan

    Full Text Available The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.

  20. MxCuBE: a synchrotron beamline control environment customized for macromolecular crystallography experiments

    International Nuclear Information System (INIS)

    Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren

    2010-01-01

    MxCuBE is a beamline control environment optimized for the needs of macromolecular crystallography. This paper describes the design of the software and the features that MxCuBE currently provides. The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1

  1. A smooth and differentiable bulk-solvent model for macromolecular diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Fenn, T. D. [Department of Molecular and Cellular Physiology and Howard Hughes Medical Institute, Stanford, California (United States); Schnieders, M. J. [Department of Chemistry, Stanford, California (United States); Brunger, A. T., E-mail: brunger@stanford.edu [Department of Molecular and Cellular Physiology and Howard Hughes Medical Institute, Stanford, California (United States); Departments of Neurology and Neurological Sciences, Structural Biology and Photon Science, Stanford, California (United States)

    2010-09-01

A new method for modeling the bulk solvent in macromolecular diffraction data based on Babinet’s principle is presented. The proposed models offer the advantage of differentiability with respect to atomic coordinates. Inclusion of low-resolution data in macromolecular crystallography requires a model for the bulk solvent. Previous methods have used a binary mask to accomplish this, which has proven to be very effective, but the mask is discontinuous at the solute–solvent boundary (i.e. the mask value jumps from zero to one) and is not differentiable with respect to atomic parameters. Here, two algorithms are introduced for computing bulk-solvent models using either a polynomial switch or a smoothly thresholded product of Gaussians, and both models are shown to be efficient and differentiable with respect to atomic coordinates. These alternative bulk-solvent models offer algorithmic improvements, while showing similar agreement of the model with the observed amplitudes relative to the binary model as monitored using R, R free and differences between experimental and model phases. As with the standard solvent models, the alternative models improve the agreement primarily with lower resolution (>6 Å) data versus no bulk solvent. The models are easily implemented into crystallographic software packages and can be used as a general method for bulk-solvent correction in macromolecular crystallography.
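The key contrast with a binary mask, which jumps from zero to one at the boundary, can be sketched with a cubic "smoothstep" polynomial switch. This is an illustrative stand-in, not the exact functional form used in the paper; the radii are hypothetical parameters.

```python
def polynomial_switch(r, r_inner, r_outer):
    """Differentiable 0 -> 1 switch across a solute-solvent boundary:
    0 inside r_inner (solute), 1 beyond r_outer (bulk solvent), and a
    cubic smoothstep in between. Unlike a binary mask, the value and
    its first derivative are continuous everywhere."""
    if r <= r_inner:
        return 0.0
    if r >= r_outer:
        return 1.0
    t = (r - r_inner) / (r_outer - r_inner)
    return t * t * (3.0 - 2.0 * t)
```

Because the switch is differentiable in r (and hence in the atomic coordinates that define r_inner and r_outer), gradients of the solvent model with respect to atomic parameters are well defined, which is the property the binary mask lacks.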

  2. A smooth and differentiable bulk-solvent model for macromolecular diffraction

    International Nuclear Information System (INIS)

    Fenn, T. D.; Schnieders, M. J.; Brunger, A. T.

    2010-01-01

    A new method for modeling the bulk solvent in macromolecular diffraction data based on Babinet’s principle is presented. The proposed models offer the advantage of differentiability with respect to atomic coordinates. Inclusion of low-resolution data in macromolecular crystallography requires a model for the bulk solvent. Previous methods have used a binary mask to accomplish this, which has proven to be very effective, but the mask is discontinuous at the solute–solvent boundary (i.e. the mask value jumps from zero to one) and is not differentiable with respect to atomic parameters. Here, two algorithms are introduced for computing bulk-solvent models using either a polynomial switch or a smoothly thresholded product of Gaussians, and both models are shown to be efficient and differentiable with respect to atomic coordinates. These alternative bulk-solvent models offer algorithmic improvements, while showing similar agreement of the model with the observed amplitudes relative to the binary model as monitored using R, R free and differences between experimental and model phases. As with the standard solvent models, the alternative models improve the agreement primarily with lower resolution (>6 Å) data versus no bulk solvent. The models are easily implemented into crystallographic software packages and can be used as a general method for bulk-solvent correction in macromolecular crystallography

  3. Gaussian-Based Smooth Dielectric Function: A Surface-Free Approach for Modeling Macromolecular Binding in Solvents

    Directory of Open Access Journals (Sweden)

    Arghya Chakravorty

    2018-03-01

Full Text Available Conventional modeling techniques to model macromolecular solvation and its effect on binding in the framework of Poisson-Boltzmann based implicit solvent models make use of a geometrically defined surface to depict the separation of the macromolecular interior (low dielectric constant) from the solvent phase (high dielectric constant). Though this simplification saves time and computational resources without significantly compromising the accuracy of free energy calculations, it bypasses some of the key physico-chemical properties of the solute-solvent interface, e.g., the altered flexibility of water molecules and that of side chains at the interface, which results in dielectric properties different from both bulk water and macromolecular interior, respectively. Here we present a Gaussian-based smooth dielectric model, an inhomogeneous dielectric distribution model that mimics the effect of macromolecular flexibility and captures the altered properties of surface bound water molecules. Thus, the model delivers a smooth transition of dielectric properties from the macromolecular interior to the solvent phase, eliminating any unphysical surface separating the two phases. Using various examples of macromolecular binding, we demonstrate its utility and illustrate the comparison with the conventional 2-dielectric model. We also showcase some additional abilities of this model, viz. to account for the effect of electrolytes in the solution and to render the distribution profile of water across a lipid membrane.
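The idea of a surface-free dielectric can be sketched in a few lines: build a smooth atom-density field from Gaussians and interpolate between interior and solvent dielectric constants. The density form, the width sigma and the dielectric values are illustrative assumptions, not the authors' exact parameterization.

```python
import math

def gaussian_density(r, atoms, sigma=1.0):
    """Smooth 'inside-ness' in [0, 1]: 1 at atom centers, 0 far away.
    Built as 1 - prod(1 - exp(-d^2 / sigma^2)) over all atoms."""
    prod = 1.0
    for (x, y, z) in atoms:
        d2 = (r[0] - x) ** 2 + (r[1] - y) ** 2 + (r[2] - z) ** 2
        prod *= 1.0 - math.exp(-d2 / sigma ** 2)
    return 1.0 - prod

def smooth_dielectric(r, atoms, eps_in=2.0, eps_out=80.0, sigma=1.0):
    """Dielectric that varies smoothly from eps_in (macromolecular
    interior) to eps_out (bulk solvent), with no sharp surface."""
    rho = gaussian_density(r, atoms, sigma)
    return eps_in * rho + eps_out * (1.0 - rho)
```

At an atom center the function returns the interior value; far from all atoms it tends to the bulk-water value, with a continuous transition in between rather than a geometric boundary.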

  4. Optimization methods for logical inference

    CERN Document Server

    Chandru, Vijay

    2011-01-01

Merging logic and mathematics in deductive inference: an innovative, cutting-edge approach. Optimization methods for logical inference? Absolutely, say Vijay Chandru and John Hooker, two major contributors to this rapidly expanding field. And even though "solving logical inference problems with optimization methods may seem a bit like eating sauerkraut with chopsticks ... it is the mathematical structure of a problem that determines whether an optimization model can help solve it, not the context in which the problem occurs." Presenting powerful, proven optimization techniques for logic in ...

  5. Radiation damage

    CERN Document Server

    Heijne, Erik H M; CERN. Geneva

    1998-01-01

a) Radiation damage in organic materials. This series of lectures will give an overview of radiation effects on materials and components frequently used in accelerator engineering and experiments. Basic degradation phenomena will be presented for organic materials with comprehensive damage threshold doses for commonly used rubbers, thermoplastics, thermosets and composite materials. Some indications will be given for glass, scintillators and optical fibres. b) Radiation effects in semiconductor materials and devices. The major part of the time will be devoted to treat radiation effects in semiconductor sensors and the associated electronics, in particular displacement damage, interface and single event phenomena. Evaluation methods and practical aspects will be shown. Strategies will be developed for the survival of the materials under the expected environmental conditions of the LHC machine and detectors.

  6. On principles of inductive inference

    OpenAIRE

    Kostecki, Ryszard Paweł

    2011-01-01

We propose an intersubjective epistemic approach to foundations of probability theory and statistical inference, based on relative entropy and category theory, and aimed at bypassing the mathematical and conceptual problems of existing foundational approaches.

  7. Statistical inference via fiducial methods

    OpenAIRE

    Salomé, Diemer

    1998-01-01

In this thesis the attention is restricted to inductive reasoning using a mathematical probability model. A statistical procedure prescribes, for every theoretically possible set of data, the inference about the unknown of interest. ... See: Summary

  8. Statistical inference for stochastic processes

    National Research Council Canada - National Science Library

    Basawa, Ishwar V; Prakasa Rao, B. L. S

    1980-01-01

    The aim of this monograph is to attempt to reduce the gap between theory and applications in the area of stochastic modelling, by directing the interest of future researchers to the inference aspects...

  9. Active inference, communication and hermeneutics.

    Science.gov (United States)

    Friston, Karl J; Frith, Christopher D

    2015-07-01

Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Tort Damages

    NARCIS (Netherlands)

    L.T. Visscher (Louis)

    2008-01-01

In this Chapter, I provide an overview of the Law and Economics literature regarding tort damages. Where necessary, attention is also paid to rules of tort liability. Both types of rules provide behavioral incentives to both injurers and victims, with respect to their level of

  11. Optimal inference with suboptimal models: Addiction and active Bayesian inference

    Science.gov (United States)

    Schwartenbeck, Philipp; FitzGerald, Thomas H.B.; Mathys, Christoph; Dolan, Ray; Wurst, Friedrich; Kronbichler, Martin; Friston, Karl

    2015-01-01

    When casting behaviour as active (Bayesian) inference, optimal inference is defined with respect to an agent’s beliefs – based on its generative model of the world. This contrasts with normative accounts of choice behaviour, in which optimal actions are considered in relation to the true structure of the environment – as opposed to the agent’s beliefs about worldly states (or the task). This distinction shifts an understanding of suboptimal or pathological behaviour away from aberrant inference as such, to understanding the prior beliefs of a subject that cause them to behave less ‘optimally’ than our prior beliefs suggest they should behave. Put simply, suboptimal or pathological behaviour does not speak against understanding behaviour in terms of (Bayes optimal) inference, but rather calls for a more refined understanding of the subject’s generative model upon which their (optimal) Bayesian inference is based. Here, we discuss this fundamental distinction and its implications for understanding optimality, bounded rationality and pathological (choice) behaviour. We illustrate our argument using addictive choice behaviour in a recently described ‘limited offer’ task. Our simulations of pathological choices and addictive behaviour also generate some clear hypotheses, which we hope to pursue in ongoing empirical work. PMID:25561321

  12. Human brain lesion-deficit inference remapped.

    Science.gov (United States)

    Mah, Yee-Haur; Husain, Masud; Rees, Geraint; Nachev, Parashkev

    2014-09-01

    Our knowledge of the anatomical organization of the human brain in health and disease draws heavily on the study of patients with focal brain lesions. Historically the first method of mapping brain function, it is still potentially the most powerful, establishing the necessity of any putative neural substrate for a given function or deficit. Great inferential power, however, carries a crucial vulnerability: without stronger alternatives any consistent error cannot be easily detected. A hitherto unexamined source of such error is the structure of the high-dimensional distribution of patterns of focal damage, especially in ischaemic injury-the commonest aetiology in lesion-deficit studies-where the anatomy is naturally shaped by the architecture of the vascular tree. This distribution is so complex that analysis of lesion data sets of conventional size cannot illuminate its structure, leaving us in the dark about the presence or absence of such error. To examine this crucial question we assembled the largest known set of focal brain lesions (n = 581), derived from unselected patients with acute ischaemic injury (mean age = 62.3 years, standard deviation = 17.8, male:female ratio = 0.547), visualized with diffusion-weighted magnetic resonance imaging, and processed with validated automated lesion segmentation routines. High-dimensional analysis of this data revealed a hidden bias within the multivariate patterns of damage that will consistently distort lesion-deficit maps, displacing inferred critical regions from their true locations, in a manner opaque to replication. Quantifying the size of this mislocalization demonstrates that past lesion-deficit relationships estimated with conventional inferential methodology are likely to be significantly displaced, by a magnitude dependent on the unknown underlying lesion-deficit relationship itself. Past studies therefore cannot be retrospectively corrected, except by new knowledge that would render them redundant

  13. Auto- and cross-power spectral analysis of dual trap optical tweezer experiments using Bayesian inference.

    Science.gov (United States)

    von Hansen, Yann; Mehlich, Alexander; Pelz, Benjamin; Rief, Matthias; Netz, Roland R

    2012-09-01

    The thermal fluctuations of micron-sized beads in dual trap optical tweezer experiments contain complete dynamic information about the viscoelastic properties of the embedding medium and-if present-macromolecular constructs connecting the two beads. To quantitatively interpret the spectral properties of the measured signals, a detailed understanding of the instrumental characteristics is required. To this end, we present a theoretical description of the signal processing in a typical dual trap optical tweezer experiment accounting for polarization crosstalk and instrumental noise and discuss the effect of finite statistics. To infer the unknown parameters from experimental data, a maximum likelihood method based on the statistical properties of the stochastic signals is derived. In a first step, the method can be used for calibration purposes: We propose a scheme involving three consecutive measurements (both traps empty, first one occupied and second empty, and vice versa), by which all instrumental and physical parameters of the setup are determined. We test our approach for a simple model system, namely a pair of unconnected, but hydrodynamically interacting spheres. The comparison to theoretical predictions based on instantaneous as well as retarded hydrodynamics emphasizes the importance of hydrodynamic retardation effects due to vorticity diffusion in the fluid. For more complex experimental scenarios, where macromolecular constructs are tethered between the two beads, the same maximum likelihood method in conjunction with dynamic deconvolution theory will in a second step allow one to determine the viscoelastic properties of the tethered element connecting the two beads.
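For a single trapped bead (a simpler setting than the dual-trap, crosstalk-corrected analysis described above), the periodogram of bead positions is exponentially distributed about a Lorentzian power spectral density, and the maximum likelihood idea can be sketched as follows. The functional form and parameter names are the standard single-trap calibration expressions, offered here as an assumption-laden illustration rather than the paper's full model.

```python
import math

def lorentzian_psd(f, diffusion, f_corner):
    """Power spectral density of an overdamped bead in a harmonic trap:
    S(f) = D / (pi^2 * (fc^2 + f^2))."""
    return diffusion / (math.pi ** 2 * (f_corner ** 2 + f ** 2))

def neg_log_likelihood(params, freqs, periodogram):
    """Periodogram values scatter exponentially about the true PSD,
    so (up to constants) -log L = sum over frequencies of
    log S(f) + P(f) / S(f). Minimizing this over (D, fc) gives the
    maximum likelihood estimates of diffusion constant and corner
    frequency."""
    diffusion, f_corner = params
    nll = 0.0
    for f, power in zip(freqs, periodogram):
        s = lorentzian_psd(f, diffusion, f_corner)
        nll += math.log(s) + power / s
    return nll
```

In practice this objective would be handed to a numerical minimizer over (D, fc); the exponential-scatter likelihood is what makes the fit statistically well founded compared with least squares on the raw periodogram.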

  14. The influence of oxygen exposure time on the composition of macromolecular organic matter as revealed by surface sediments on the Murray Ridge (Arabian Sea)

    Science.gov (United States)

    Nierop, Klaas G. J.; Reichart, Gert-Jan; Veld, Harry; Sinninghe Damsté, Jaap S.

    2017-06-01

The Arabian Sea represents a prime example of an open-ocean extended oxygen minimum zone (OMZ), with low oxygen concentrations (down to less than 2 μM) between 200 and 1000 m water depth. The OMZ impinges on the ocean floor, affecting organic matter (OM) mineralization. We investigated the impact of oxygen depletion on the composition of macromolecular OM (MOM) along a transect through the OMZ on the slopes of the Murray Ridge. This submarine high in the northern Arabian Sea, with its top at approximately 500 m below sea surface (mbss), intersects the OMZ. We analyzed sediments deposited in the core of the OMZ (suboxic conditions), directly below the OMZ (dysoxic conditions) and well below the OMZ (fully oxic conditions). The upper 18 cm of sediments from three stations recovered at different depths were studied. MOM was investigated by Rock Eval and flash pyrolysis techniques. The MOM was of predominantly marine origin and, as inferred from their pyrolysis products, most biomolecules (tetra-alkylpyrrole pigments, polysaccharides, proteins and their transformation products, and polyphenols including phlorotannins) showed progressive relative degradation with increasing exposure to oxygen. Alkylbenzenes and, in particular, aliphatic macromolecules increased relatively. The observed differences in MOM composition between sediments deposited under various bottom-water oxygen conditions (i.e. in terms of concentration and exposure time) were much larger than those within sediment cores, implying that early diagenetic alteration of organic matter depends largely on bottom-water oxygenation rather than on subsequent anaerobic degradation within the sediments, even at longer time scales.

  15. Localization of protein aggregation in Escherichia coli is governed by diffusion and nucleoid macromolecular crowding effect.

    Directory of Open Access Journals (Sweden)

    Anne-Sophie Coquel

    2013-04-01

Full Text Available Aggregates of misfolded proteins are a hallmark of many age-related diseases. Recently, they have been linked to aging of Escherichia coli (E. coli where protein aggregates accumulate at the old pole region of the aging bacterium. Because of the potential of E. coli as a model organism, elucidating aging and protein aggregation in this bacterium may pave the way to significant advances in our global understanding of aging. A first obstacle along this path is to decipher the mechanisms by which protein aggregates are targeted to specific intracellular locations. Here, using an integrated approach based on individual-based modeling, time-lapse fluorescence microscopy and automated image analysis, we show that the movement of aging-related protein aggregates in E. coli is purely diffusive (Brownian. Using single-particle tracking of protein aggregates in live E. coli cells, we estimated the average size and diffusion constant of the aggregates. Our results provide evidence that the aggregates passively diffuse within the cell, with diffusion constants that depend on their size in agreement with the Stokes-Einstein law. However, the aggregate displacements along the cell long axis are confined to a region that roughly corresponds to the nucleoid-free space in the cell pole, thus confirming the importance of increased macromolecular crowding in the nucleoids. We thus used 3D individual-based modeling to show that these three ingredients (diffusion, aggregation and diffusion hindrance in the nucleoids are sufficient and necessary to reproduce the available experimental data on aggregate localization in the cells. Taken together, our results strongly support the hypothesis that the localization of aging-related protein aggregates in the poles of E. coli results from the coupling of passive diffusion-aggregation with spatially non-homogeneous macromolecular crowding. They further support the importance of "soft" intracellular structuring (based on
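The size-from-tracking step mentioned above can be sketched as a two-stage calculation: fit the mean-squared displacement (MSD) to extract a diffusion constant, then invert the Stokes-Einstein relation for a hydrodynamic radius. This is a generic illustration; the water-like viscosity used here is a placeholder, since effective cytoplasmic viscosity is considerably higher.

```python
import math

def diffusion_constant_from_msd(msd, dt):
    """Estimate D from a 2D trajectory's MSD, assuming pure Brownian
    motion MSD(t) = 4 D t. Fits a least-squares slope through the
    origin: slope = sum(t * msd) / sum(t^2)."""
    times = [dt * (i + 1) for i in range(len(msd))]
    slope = sum(t * m for t, m in zip(times, msd)) / sum(t * t for t in times)
    return slope / 4.0

def stokes_einstein_radius(diffusion, temperature=298.0, viscosity=1.0e-3):
    """Hydrodynamic radius (m) from the Stokes-Einstein law,
    r = kB T / (6 pi eta D). Default viscosity is that of water
    (an assumption; not representative of the E. coli cytoplasm)."""
    k_boltzmann = 1.380649e-23
    return k_boltzmann * temperature / (6.0 * math.pi * viscosity * diffusion)
```

The size dependence the paper reports follows directly: larger aggregates give smaller fitted D, hence larger inferred radii.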

  16. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Shane Ó Conchúir

    Full Text Available The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.

  17. [Macromolecular aromatic network characteristics of Chinese power coal analyzed by synchronous fluorescence and X-ray diffraction].

    Science.gov (United States)

    Ye, Cui-Ping; Feng, Jie; Li, Wen-Ying

    2012-07-01

Coal structure, especially the macromolecular aromatic skeleton, strongly influences coke reactivity and coal gasification, so understanding the macromolecular aromatic skeleton structure is key to the rational, high-efficiency utilization of coal. It is, however, difficult to acquire this information because of the complex composition and structure of coal. It has been found that the macromolecular aromatic network of coal is best isolated if the small molecules in coal are first extracted; the macromolecular aromatic skeleton can then be clearly analyzed by instruments such as X-ray diffraction (XRD), fluorescence spectroscopy in synchronous mode (Syn-F) and gel permeation chromatography (GPC). Based on previous results, and following a stepwise fractional liquid extraction, two typical Chinese power coals, PS and HDG, were extracted using silica gel as the stationary phase and acetonitrile, tetrahydrofuran (THF), pyridine and 1-methyl-2-pyrrolidinone (NMP) as a solvent group for sequential elution. GPC, Syn-F and XRD were applied to investigate the molecular mass distribution, condensed aromatic structure and crystal characteristics. The results showed that the size of the aromatic layers (La) is small (3-3.95 nm) and the stacking heights (Lc) are 0.8-1.2 nm. The molecular mass distribution of the macromolecular aromatic network structure is between 400 and 1130 amu, with condensed aromatic numbers of 3-7 in the structure units.
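Crystallite dimensions such as La and Lc are conventionally obtained from XRD peak widths via the Scherrer equation; a sketch follows, using the shape factors commonly quoted for carbon materials (K of about 0.89 for Lc from the (002) band and about 1.84 for La from the (100) band). These constants and the example peak values are standard assumptions, not numbers taken from this paper.

```python
import math

def scherrer_size(shape_factor, wavelength_nm, fwhm_deg, two_theta_deg):
    """Crystallite size (nm) from the Scherrer equation:
    L = K * lambda / (beta * cos(theta)), with beta the peak FWHM in
    radians and theta half the diffraction angle."""
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)
    return shape_factor * wavelength_nm / (beta * math.cos(theta))

# Hypothetical (002) band of a coal char with Cu K-alpha radiation:
lc = scherrer_size(0.89, 0.15406, 1.0, 25.0)  # stacking height Lc
```

Broader peaks correspond to smaller crystallites, which is why the narrow size ranges quoted in the abstract map directly onto the measured band widths.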

  18. Grain sorghum dust increases macromolecular efflux from the in situ nasal mucosa.

    Science.gov (United States)

    Gao, X P

    1998-04-01

The purpose of this study was to determine whether an aqueous extract of grain sorghum dust increases macromolecular efflux from the nasal mucosa in vivo and, if so, whether this response is mediated, in part, by substance P. Suffusion of grain sorghum dust extract on the in situ nasal mucosa of anesthetized hamsters elicits a significant increase in clearance of fluorescein isothiocyanate-labeled dextran (FITC-dextran; mol mass, 70 kDa; P < ...). ... grain sorghum dust elicits neurogenic plasma exudation from the in situ nasal mucosa.

  19. Evaluation of quantum-chemical methods of radiolysis stability for macromolecular structures

    International Nuclear Information System (INIS)

    Postolache, Cristian; Matei, Lidia

    2005-01-01

The behavior of macromolecular structures in ionising fields was analyzed by quantum-chemical methods. In this study the primary radiolytic effect was analyzed using a two-step radiolytic mechanism: a) ionisation of the molecule and spatial redistribution of atoms in order to reach a minimum value of energy, characteristic of the quantum state; b) neutralisation of the molecule by electron capture and its rapid dissociation into free radicals. Chemical bonds suspected to break are located in the distribution region of the LUMO orbital and have minimal homolytic dissociation energies. Representative polymer structures (polyethylene, polypropylene, polystyrene, poly α and β polystyrene, polyisobutylene, polytetrafluoroethylene, polymethylsiloxanes) were analyzed. (authors)

  20. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    Science.gov (United States)

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L.; Armour, Wes; Waterman, David G.; Iwata, So; Evans, Gwyndaf

    2013-01-01

The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals, using data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein. PMID:23897484
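The first step in merging multi-crystal data is grouping crystals with compatible unit cells; this can be caricatured with a greedy single-linkage pass over cell parameters. This is a toy sketch under stated assumptions (Euclidean distance on raw cell edges, a hypothetical threshold); BLEND itself uses more careful normalization and hierarchical clustering.

```python
def cell_distance(cell_a, cell_b):
    """Euclidean distance between two unit cells given as (a, b, c) in Å."""
    return sum((x - y) ** 2 for x, y in zip(cell_a, cell_b)) ** 0.5

def cluster_cells(cells, threshold):
    """Greedy single-linkage grouping: a crystal joins the first cluster
    containing any member closer than `threshold`, else starts a new one.
    Returns clusters as lists of indices into `cells`."""
    clusters = []
    for i, cell in enumerate(cells):
        placed = False
        for cluster in clusters:
            if any(cell_distance(cell, cells[j]) < threshold for j in cluster):
                cluster.append(i)
                placed = True
                break
        if not placed:
            clusters.append([i])
    return clusters
```

Datasets within one cluster would then be scaled and merged together, while outlying cells (from non-isomorphous crystals) are kept apart.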

  1. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    Energy Technology Data Exchange (ETDEWEB)

    A Soares; D Schneider; J Skinner; M Cowan; R Buono; H Robinson; A Heroux; M Carlucci-Dayton; A Saxena; R Sweet

    2011-12-31

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from an heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  2. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography.

    Science.gov (United States)

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L; Armour, Wes; Waterman, David G; Iwata, So; Evans, Gwyndaf

    2013-08-01

The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein.

  3. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    International Nuclear Information System (INIS)

    Soares, A.; Schneider, D.; Skinner, J.; Cowan, M.; Buono, R.; Robinson, H.; Heroux, A.; Carlucci-Dayton, M.; Saxena, A.; Sweet, R.

    2008-01-01

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from an heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  4. Interactive Instruction in Bayesian Inference

    DEFF Research Database (Denmark)

    Khan, Azam; Breslav, Simon; Hornbæk, Kasper

    2018-01-01

An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer’s principles of instruction. These principles concern coherence, personalization, signaling, segmenting, multimedia, spatial contiguity, and pretraining. Principles of self-explanation and interactivity are also applied. Four experiments on the Mammography Problem showed that these principles help participants answer the questions, suggesting that an instructional approach to improving human performance in Bayesian inference is a promising direction.
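The Mammography Problem itself is a one-line application of Bayes' theorem. With commonly quoted illustrative numbers (1% prevalence, 80% sensitivity, 9.6% false-positive rate; these values are assumptions for the sketch, not taken from this study):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / p_positive

# With the illustrative numbers, only about 7.8% of women with a
# positive mammogram actually have breast cancer, far below the
# intuitive guess most participants give.
p = posterior(0.01, 0.8, 0.096)
```

The gap between this small posterior and people's typical answers is precisely what the instructional interventions above aim to close.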

  5. On Maximum Entropy and Inference

    Directory of Open Access Journals (Sweden)

    Luigi Gresele

    2017-11-01

Full Text Available Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate a model from data, that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is not set by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method showing its ability to recover the correct model in a few prototype cases and discuss its application on a real dataset.
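The core maximum-entropy recipe (maximize entropy subject to measured constraints) can be sketched for the simplest case of a fixed mean, where the solution is an exponential family whose Lagrange multiplier is found numerically. This is a generic illustration, not the spin-model machinery of the paper.

```python
import math

def maxent_distribution(values, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy distribution over discrete `values` with a fixed
    mean. The solution has the form p_i proportional to exp(-lam * x_i);
    lam is found by bisection on the (monotonically decreasing) mean."""
    def mean_for(lam):
        weights = [math.exp(-lam * x) for x in values]
        total = sum(weights)
        return sum(x * w for x, w in zip(values, weights)) / total

    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:
            lo = mid  # mean too high: need a larger multiplier
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    weights = [math.exp(-lam * x) for x in values]
    total = sum(weights)
    return [w / total for w in weights]
```

When the target mean equals the unconstrained (uniform) mean, the multiplier vanishes and the uniform distribution is recovered, the hallmark of adding no information beyond the constraint.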

  6. Eight challenges in phylodynamic inference

    Directory of Open Access Journals (Sweden)

    Simon D.W. Frost

    2015-03-01

    Full Text Available The field of phylodynamics, which attempts to enhance our understanding of infectious disease dynamics using pathogen phylogenies, has made great strides in the past decade. Basic epidemiological and evolutionary models are now well characterized, with inferential frameworks in place. However, significant challenges remain in extending phylodynamic inference to more complex systems. These challenges include accounting for evolutionary complexities such as changing mutation rates, selection, reassortment, and recombination, as well as epidemiological complexities such as stochastic population dynamics, host population structure, and different patterns at the within-host and between-host scales. An additional challenge lies in making efficient inferences from an ever-increasing corpus of sequence data.

  7. Problem solving and inference mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, K; Nakajima, R; Yonezawa, A; Goto, S; Aoyama, A

    1982-01-01

    The heart of the fifth generation computer will be powerful mechanisms for problem solving and inference. A deduction-oriented language is to be designed, which will form the core of the whole computing system. The language is based on predicate logic with the extended features of structuring facilities, meta structures and relational data base interfaces. Parallel computation mechanisms and specialized hardware architectures are being investigated to make possible efficient realization of the language features. The project includes research into an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core. 30 references.

  8. A brief history of macromolecular crystallography, illustrated by a family tree and its Nobel fruits.

    Science.gov (United States)

    Jaskolski, Mariusz; Dauter, Zbigniew; Wlodawer, Alexander

    2014-09-01

    As a contribution to the celebration of the year 2014, declared by the United Nations to be 'The International Year of Crystallography', the FEBS Journal is dedicating this issue to papers showcasing the intimate union between macromolecular crystallography and structural biology, both in historical perspective and in current research. Instead of a formal editorial piece, by way of introduction, this review discusses the most important, often iconic, achievements of crystallographers that led to major advances in our understanding of the structure and function of biological macromolecules. We identified at least 42 scientists who received Nobel Prizes in Physics, Chemistry or Medicine for their contributions that included the use of X-rays or neutrons and crystallography, including 24 who made seminal discoveries in macromolecular sciences. Our spotlight is mostly, but not only, on the recipients of this most prestigious scientific honor, presented in approximately chronological order. As a summary of the review, we attempt to construct a genealogy tree of the principal lineages of protein crystallography, leading from the founding members to the present generation. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  9. Long-wavelength macromolecular crystallography - First successful native SAD experiment close to the sulfur edge

    Science.gov (United States)

    Aurelius, O.; Duman, R.; El Omari, K.; Mykhaylyk, V.; Wagner, A.

    2017-11-01

    Phasing of novel macromolecular crystal structures has been challenging since the start of structural biology. Making use of the anomalous diffraction of natively present elements, such as sulfur and phosphorus, for phasing has been possible for some systems, but has been hindered by the need to access longer X-ray wavelengths in order to make the most of the anomalous scattering contributions of these elements. Presented here are the results of the first successful experimental phasing study of a macromolecular crystal structure at a wavelength close to the sulfur K edge. This was made possible by the in-vacuum, long-wavelength-optimised experimental setup of the I23 beamline at Diamond Light Source. In these early commissioning experiments only standard data collection and processing procedures were applied; in particular, no dedicated absorption correction was used. Nevertheless, the success of the experiment demonstrates that the capability to extract phase information can be improved even further once data collection protocols and data processing have been optimised.

  10. Organ specific acute toxicity of the carcinogen trans-4-acetylaminostilbene is not correlated with macromolecular binding.

    Science.gov (United States)

    Pfeifer, A; Neumann, H G

    1986-09-01

    trans-4-Acetylaminostilbene (trans-AAS) is acutely toxic in rats and lesions are produced specifically in the glandular stomach. Toxicity is slightly increased by pretreating the animals with phenobarbital (PB) and is completely prevented by pretreatment with methylcholanthrene (MC). The prostaglandin inhibitors, indomethacin and acetyl salicylic acid, do not reduce toxicity. The high efficiency of MC suggested that toxicity is caused by reactive metabolites. trans-[3H]-AAS was administered orally to untreated and to PB- or MC-pretreated female Wistar rats and target doses in different tissues were measured by means of covalent binding to proteins, RNA and DNA. Macromolecular binding in the target tissue of poisoned animals was significantly lower than in liver and kidney and comparable to other non-target tissues. Pretreatment with MC lowered macromolecular binding in all extrahepatic tissues but not in liver. These findings are not in line with tissue specific metabolic activation. The only unique property of the target tissue, glandular stomach, that we observed was a particular affinity for the systemically available parent compound. In the early phase of poisoning, tissue concentrations were exceedingly high and the stomach function was impaired.

  11. Can visco-elastic phase separation, macromolecular crowding and colloidal physics explain nuclear organisation?

    Directory of Open Access Journals (Sweden)

    Iborra Francisco J

    2007-04-01

    Full Text Available Abstract Background The cell nucleus is highly compartmentalized, with well-defined domains, yet it is not well understood how this nuclear order is maintained. Many scientists are fascinated by the variety of structures observed in the nucleus and seek to attribute functions to them. In order to distinguish functional compartments from non-functional aggregates, I believe it is important to investigate the biophysical nature of nuclear organisation. Results The various nuclear compartments can be divided broadly into those based on chromatin and those based on protein and/or RNA, and the two classes have very different dynamic properties. The chromatin compartment displays slow, constrained diffusional motion, whereas the protein/RNA compartment is very dynamic. Physical systems with such dynamical asymmetry undergo viscoelastic phase separation. This phase-separation phenomenon leads to the formation of a long-lived interaction network of slow components (chromatin) scattered within domains rich in fast components (protein/RNA). Moreover, the nucleus is packed with macromolecules at concentrations of the order of 300 mg/ml. This high concentration of macromolecules produces volume-exclusion effects that enhance attractive interactions between macromolecules, known as macromolecular crowding, which favours the formation of compartments. In this paper I hypothesise that nuclear compartmentalization can be explained by viscoelastic phase separation of the dynamically different nuclear components, in combination with macromolecular crowding and the properties of colloidal particles. Conclusion I demonstrate that nuclear structure can satisfy the predictions of this hypothesis, and I discuss the functional implications of this phenomenon.

  12. Time-efficient, high-resolution, whole brain three-dimensional macromolecular proton fraction mapping.

    Science.gov (United States)

    Yarnykh, Vasily L

    2016-05-01

    Macromolecular proton fraction (MPF) mapping is a quantitative MRI method that reconstructs parametric maps of the relative amount of macromolecular protons causing the magnetization transfer (MT) effect, and provides a biomarker of myelination in neural tissues. This study aimed to develop a high-resolution whole-brain MPF mapping technique using a minimal number of source images to reduce scan time. The described technique is based on replacing an actually acquired reference image without MT saturation by a synthetic one reconstructed from R1 and proton density maps, thus requiring only three source images. This approach enabled whole-brain three-dimensional MPF mapping with an isotropic 1.25 × 1.25 × 1.25 mm³ voxel size and a scan time of 20 min. The synthetic reference method was validated against standard MPF mapping with acquired reference images based on data from eight healthy subjects. Mean MPF values in segmented white and gray matter appeared in close agreement, with no significant bias and small within-subject coefficients of variation. MPF maps demonstrated sharp white-gray matter contrast and clear visualization of anatomical details, including gray matter structures with high iron content. The proposed synthetic reference method improves the resolution of MPF mapping and combines accurate MPF measurements with unique neuroanatomical contrast features. © 2015 Wiley Periodicals, Inc.
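The synthetic-reference idea can be sketched as follows, assuming a spoiled gradient echo (SPGR) steady-state signal model for the non-saturated reference image; the TR and flip-angle values and the toy maps below are hypothetical and not taken from the study:

```python
import numpy as np

def spgr_signal(pd, r1, tr=0.02, flip_deg=20.0):
    """Steady-state SPGR signal: S = PD*sin(a)*(1 - E1) / (1 - cos(a)*E1),
    with E1 = exp(-TR*R1). TR in seconds, R1 in 1/s; tr and flip_deg are
    hypothetical sequence settings, not the study's protocol."""
    a = np.deg2rad(flip_deg)
    e1 = np.exp(-tr * r1)
    return pd * np.sin(a) * (1 - e1) / (1 - np.cos(a) * e1)

# Toy 2x2 "maps": proton density (arbitrary units) and R1 (1/s).
pd_map = np.array([[1.0, 0.9], [0.8, 1.1]])
r1_map = np.array([[1.0, 0.7], [1.3, 0.9]])

# Synthesize the reference image voxel-wise from the two parametric maps.
synthetic_reference = spgr_signal(pd_map, r1_map)
print(synthetic_reference.round(4))
```

The point of the technique is that this image need not be acquired: once R1 and PD maps exist, the reference can be computed, dropping one source acquisition from the protocol.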

  13. ISPyB: an information management system for synchrotron macromolecular crystallography.

    Science.gov (United States)

    Delagenière, Solange; Brenchereau, Patrice; Launer, Ludovic; Ashton, Alun W; Leal, Ricardo; Veyrier, Stéphanie; Gabadinho, José; Gordon, Elspeth J; Jones, Samuel D; Levik, Karl Erik; McSweeney, Seán M; Monaco, Stéphanie; Nanao, Max; Spruce, Darren; Svensson, Olof; Walsh, Martin A; Leonard, Gordon A

    2011-11-15

    Individual research groups now analyze thousands of samples per year at synchrotron macromolecular crystallography (MX) resources. The efficient management of experimental data is thus essential if the best possible experiments are to be performed and the best possible data used in downstream processes in structure determination pipelines. The Information System for Protein crystallography Beamlines (ISPyB), a Laboratory Information Management System (LIMS) with an underlying data model allowing for the integration of analyses downstream of the data collection experiment, was developed to facilitate such data management. ISPyB is now a multisite, generic LIMS for synchrotron-based MX experiments. Its initial functionality has been enhanced to include improved sample tracking and reporting of experimental protocols, the direct ranking of the diffraction characteristics of individual samples, and the archiving of raw data and results from ancillary experiments and post-experiment data processing protocols. This latter feature paves the way for ISPyB to play a central role in future macromolecular structure solution pipelines and validates the application of the approach used in ISPyB to other experimental techniques, such as biological solution small-angle X-ray scattering and spectroscopy, which have similar sample tracking and data handling requirements.

  14. A new paradigm for macromolecular crystallography beamlines derived from high-pressure methodology and results

    Energy Technology Data Exchange (ETDEWEB)

    Fourme, Roger, E-mail: roger.fourme@synchrotron-soleil.fr [Synchrotron SOLEIL, BP 48, Saint Aubin, 91192 Gif-sur-Yvette (France); Girard, Eric [IBS (UMR 5075 CEA-CNRS-UJF-PSB), 41 rue Jules Horowitz, 38027 Grenoble Cedex (France); Dhaussy, Anne-Claire [CRISMAT, ENSICAEN, 6 Boulevard du Maréchal Juin, 14000 Caen (France); Medjoubi, Kadda [Synchrotron SOLEIL, BP 48, Saint Aubin, 91192 Gif-sur-Yvette (France); Prangé, Thierry [LCRB (UMR 8015 CNRS), Université Paris Descartes, Faculté de Pharmacie, 4 avenue de l’Observatoire, 75270 Paris (France); Ascone, Isabella [ENSCP (UMR CNRS 7223), 11 rue Pierre et Marie Curie, 75231 Paris Cedex 05 (France); Mezouar, Mohamed [ESRF, BP 220, 38043 Grenoble (France); Kahn, Richard [IBS (UMR 5075 CEA-CNRS-UJF-PSB), 41 rue Jules Horowitz, 38027 Grenoble Cedex (France)

    2011-01-01

    Macromolecular crystallography at high pressure (HPMX) is a mature technique. Shorter X-ray wavelengths increase data collection efficiency on cryocooled crystals. Extending applications and exploiting spin-off of HPMX will require dedicated synchrotron radiation beamlines based on a new paradigm. Biological structures can now be investigated at high resolution by high-pressure X-ray macromolecular crystallography (HPMX). The number of HPMX studies is growing, with applications to polynucleotides, monomeric and multimeric proteins, complex assemblies and even a virus capsid. Investigations of the effects of pressure perturbation have encompassed elastic compression of the native state, study of proteins from extremophiles and trapping of higher-energy conformers that are often of biological interest; measurements of the compressibility of crystals and macromolecules were also performed. HPMX results were an incentive to investigate short and ultra-short wavelengths for standard biocrystallography. On cryocooled lysozyme crystals it was found that the data collection efficiency using 33 keV photons is increased with respect to 18 keV photons. This conclusion was extended from 33 keV down to 6.5 keV by exploiting previously published data. To be fully exploited, the potential of higher-energy photons requires detectors with a good efficiency. Accordingly, a new paradigm for MX beamlines was suggested, using conventional short and ultra-short wavelengths, aiming at the collection of very high accuracy data on crystals under standard conditions or under high pressure. The main elements of such beamlines are outlined.

  15. Polydisulfide Manganese(II) Complexes as Non-Gadolinium Biodegradable Macromolecular MRI Contrast Agents

    Science.gov (United States)

    Ye, Zhen; Jeong, Eun-Kee; Wu, Xueming; Tan, Mingqian; Yin, Shouyu; Lu, Zheng-Rong

    2011-01-01

    Purpose To develop safe and effective manganese(II)-based biodegradable macromolecular MRI contrast agents. Materials and Methods In this study, we synthesized and characterized two polydisulfide manganese(II) complexes, Mn-DTPA cystamine copolymers and Mn-EDTA cystamine copolymers, as new biodegradable macromolecular MRI contrast agents. The contrast enhancement of the two manganese-based contrast agents was evaluated in mice bearing MDA-MB-231 human breast carcinoma xenografts, in comparison with MnCl2. Results The T1 and T2 relaxivities were 4.74 and 10.38 mM−1s−1 per manganese at 3T for Mn-DTPA cystamine copolymers (Mn = 30.50 kDa) and 6.41 and 9.72 mM−1s−1 for Mn-EDTA cystamine copolymers (Mn = 61.80 kDa). Both polydisulfide Mn(II) complexes showed significant liver, myocardium and tumor enhancement. Conclusion The manganese-based polydisulfide contrast agents have the potential to be developed as alternative non-gadolinium contrast agents for MR cancer and myocardium imaging. PMID:22031457
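As a quick illustration of how a relaxivity figure like those above is used: relaxation rates add linearly with agent concentration, 1/T1_obs = 1/T1_0 + r1·C. The r1 value below is the one reported in the abstract, while the baseline T1 and the concentration are illustrative assumptions:

```python
def t1_with_agent(t1_baseline_s, r1_per_mM_s, conc_mM):
    """Observed T1 when relaxation rates add: 1/T1_obs = 1/T1_0 + r1 * C."""
    return 1.0 / (1.0 / t1_baseline_s + r1_per_mM_s * conc_mM)

# r1 = 4.74 mM^-1 s^-1 is the abstract's value for Mn-DTPA cystamine copolymers
# at 3T; the 1.6 s baseline tissue T1 and 0.1 mM concentration are assumed.
t1_obs = t1_with_agent(1.6, 4.74, 0.1)
print(f"T1 shortened from 1.6 s to {t1_obs:.2f} s")
```

The higher the relaxivity, the stronger the T1 shortening at a given dose, which is why relaxivity is the headline figure of merit for a contrast agent.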

  16. Object-Oriented Type Inference

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

    1991-01-01

    We present a new approach to inferring types in untyped object-oriented programs with inheritance, assignments, and late binding. It guarantees that all messages are understood, annotates the program with type information, allows polymorphic methods, and can be used as the basis of an op...

  17. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees) ... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  18. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the ...

  19. Statistical inference and Aristotle's Rhetoric.

    Science.gov (United States)

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  20. MX1: a bending-magnet crystallography beamline serving both chemical and macromolecular crystallography communities at the Australian Synchrotron

    International Nuclear Information System (INIS)

    Cowieson, Nathan Philip; Aragao, David; Clift, Mark; Ericsson, Daniel J.; Gee, Christine; Harrop, Stephen J.; Mudie, Nathan; Panjikar, Santosh; Price, Jason R.; Riboldi-Tunnicliffe, Alan; Williamson, Rachel; Caradoc-Davies, Tom

    2015-01-01

    The macromolecular crystallography beamline MX1 at the Australian Synchrotron is described. MX1 is a bending-magnet crystallography beamline at the 3 GeV Australian Synchrotron. The beamline delivers hard X-rays in the energy range from 8 to 18 keV to a focal spot at the sample position of 120 µm FWHM. The beamline endstation and ancillary equipment facilitate local and remote access for both chemical and biological macromolecular crystallography. Here, the design of the beamline and endstation are discussed. The beamline has enjoyed a full user program for the last seven years and scientific highlights from the user program are also presented

  1. Irradiation damage

    Energy Technology Data Exchange (ETDEWEB)

    Howe, L.M

    2000-07-01

    There is considerable interest in irradiation effects in intermetallic compounds from both the applied and fundamental aspects. Initially, this interest was associated mainly with nuclear reactor programs but it now extends to the fields of ion-beam modification of metals, behaviour of amorphous materials, ion-beam processing of electronic materials, and ion-beam simulations of various kinds. The field of irradiation damage in intermetallic compounds is rapidly expanding, and no attempt will be made in this chapter to cover all of the various aspects. Instead, attention will be focused on some specific areas and, hopefully, through these, some insight will be given into the physical processes involved, the present state of our knowledge, and the challenge of obtaining more comprehensive understanding in the future. The specific areas that will be covered are: point defects in intermetallic compounds; irradiation-enhanced ordering and irradiation-induced disordering of ordered alloys; irradiation-induced amorphization.

  2. Irradiation damage

    International Nuclear Information System (INIS)

    Howe, L.M.

    2000-01-01

    There is considerable interest in irradiation effects in intermetallic compounds from both the applied and fundamental aspects. Initially, this interest was associated mainly with nuclear reactor programs but it now extends to the fields of ion-beam modification of metals, behaviour of amorphous materials, ion-beam processing of electronic materials, and ion-beam simulations of various kinds. The field of irradiation damage in intermetallic compounds is rapidly expanding, and no attempt will be made in this chapter to cover all of the various aspects. Instead, attention will be focused on some specific areas and, hopefully, through these, some insight will be given into the physical processes involved, the present state of our knowledge, and the challenge of obtaining more comprehensive understanding in the future. The specific areas that will be covered are: point defects in intermetallic compounds; irradiation-enhanced ordering and irradiation-induced disordering of ordered alloys; irradiation-induced amorphization

  3. Statistical learning and selective inference.

    Science.gov (United States)

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
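The "cherry-picking" problem the authors describe is easy to demonstrate by simulation. The sketch below is an illustration of the issue, not the authors' post-selection methods: it contrasts naively testing the best-looking feature on the full data with simple sample splitting, under pure noise where no association is real:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n, p, reps = 200, 50, 500  # observations, candidate features (all noise), trials
half = n // 2

def z_pvalue(r, m):
    """Two-sided p-value for a sample correlation r on m points (normal approx)."""
    z = abs(r) * math.sqrt(m)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

naive, split = [], []
for _ in range(reps):
    X, y = rng.standard_normal((n, p)), rng.standard_normal(n)
    # Cherry-pick: choose the feature most correlated with y on ALL the data,
    # then test it on the same data -- the classic selective-inference trap.
    r_all = X.T @ y / n
    naive.append(z_pvalue(r_all[np.argmax(np.abs(r_all))], n))
    # Honest alternative: select on the first half, test on the held-out half.
    r_half = X[:half].T @ y[:half] / half
    j = int(np.argmax(np.abs(r_half)))
    split.append(z_pvalue(float(X[half:, j] @ y[half:] / half), half))

naive_fpr = float(np.mean(np.array(naive) < 0.05))
split_fpr = float(np.mean(np.array(split) < 0.05))
print(f"false-positive rate at 5%: naive={naive_fpr:.2f}, split={split_fpr:.2f}")
```

With 50 null features, the naive procedure rejects far more than 5% of the time, while the split-sample test stays near its nominal level; more refined selective-inference methods achieve the same validity without sacrificing half the data.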

  4. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject. Examples drawn from ecology and wildlife research. An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference. Companion website with analyt...

  5. Statistical inference an integrated approach

    CERN Document Server

    Migon, Helio S; Louzada, Francisco

    2014-01-01

    Introduction: Information; The concept of probability; Assessing subjective probabilities; An example; Linear algebra and probability; Notation; Outline of the book. Elements of Inference: Common statistical models; Likelihood-based functions; Bayes theorem; Exchangeability; Sufficiency and exponential family; Parameter elimination. Prior Distribution: Entirely subjective specification; Specification through functional forms; Conjugacy with the exponential family; Non-informative priors; Hierarchical priors. Estimation: Introduction to decision theory; Bayesian point estimation; Classical point estimation; Empirical Bayes estimation; Comparison of estimators; Interval estimation; Estimation in the Normal model. Approximating Methods: The general problem of inference; Optimization techniques; Asymptotic theory; Other analytical approximations; Numerical integration methods; Simulation methods. Hypothesis Testing: Introduction; Classical hypothesis testing; Bayesian hypothesis testing; Hypothesis testing and confidence intervals; Asymptotic tests. Prediction...

  6. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional voting systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will gain representation in the Chamber of Deputies. Inferences were made in a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
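A minimal sketch of this kind of Monte Carlo computation: draw plausible vote shares from a posterior, allocate seats, and count how often each party is represented. The poll tallies are assumed, the Dirichlet posterior is an illustrative stand-in for the paper's model, and a D'Hondt-style highest-averages rule stands in for the full Brazilian seat-distribution system:

```python
import numpy as np

rng = np.random.default_rng(42)

def dhondt(votes, seats):
    """Allocate seats by a D'Hondt-style highest-averages rule."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        quotients = [v / (a + 1) for v, a in zip(votes, alloc)]
        alloc[int(np.argmax(quotients))] += 1
    return alloc

# Hypothetical poll tallies for four parties (assumed, not from the paper).
poll_counts = np.array([420, 310, 180, 90])
seats, n_sim = 10, 20_000
hits = np.zeros(len(poll_counts))
for _ in range(n_sim):
    shares = rng.dirichlet(poll_counts + 1)      # one plausible election outcome
    hits += np.array(dhondt(shares, seats)) > 0  # did each party win a seat?

print("P(at least one seat):", (hits / n_sim).round(3))
```

The output is exactly the quantity the abstract targets: the posterior probability that each party gains representation, rather than a point estimate of its vote share.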

  7. Causal inference based on counterfactuals

    Directory of Open Access Journals (Sweden)

    Höfler M

    2005-09-01

    Full Text Available Abstract Background The counterfactual or potential outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in health sciences and relates to many statistical procedures. Summary Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept.

  8. System Support for Forensic Inference

    Science.gov (United States)

    Gehani, Ashish; Kirchner, Florent; Shankar, Natarajan

    Digital evidence is playing an increasingly important role in prosecuting crimes. The reasons are manifold: financially lucrative targets are now connected online, systems are so complex that vulnerabilities abound and strong digital identities are being adopted, making audit trails more useful. If the discoveries of forensic analysts are to hold up to scrutiny in court, they must meet the standard for scientific evidence. Software systems are currently developed without consideration of this fact. This paper argues for the development of a formal framework for constructing “digital artifacts” that can serve as proxies for physical evidence; a system so imbued would facilitate sound digital forensic inference. A case study involving a filesystem augmentation that provides transparent support for forensic inference is described.

  9. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre C. R. Martins

    2006-11-01

    Full Text Available In this article, I show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  10. Statistical inference on residual life

    CERN Document Server

    Jeong, Jong-Hyeon

    2014-01-01

    This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.

  11. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  12. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests on the assumptions of randomness and normality, and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median, while also providing coverage of ratio estimation, randomness, and causal...

  13. On Quantum Statistical Inference, II

    OpenAIRE

    Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P. E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes and inte...

  14. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on the introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere
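    The Bernoulli application mentioned above can be sketched numerically, assuming the standard NPI result that, after observing s successes in n trials, the lower and upper probabilities of a success at trial n+1 are s/(n+1) and (s+1)/(n+1); the function name and example values below are illustrative, not from the paper.

    ```python
    from fractions import Fraction

    def npi_bernoulli_bounds(n: int, s: int) -> tuple:
        """Lower/upper NPI probabilities that observation n+1 is a success,
        given s successes observed in n Bernoulli trials."""
        if not 0 <= s <= n:
            raise ValueError("need 0 <= s <= n")
        return Fraction(s, n + 1), Fraction(s + 1, n + 1)

    # Example: 7 successes in 10 trials -> imprecise probability [7/11, 8/11]
    lo, hi = npi_bernoulli_bounds(10, 7)
    print(lo, hi)  # 7/11 8/11
    ```

    The gap between the lower and upper probability (here 1/11) shrinks as n grows, reflecting reduced imprecision with more data.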

  15. Damaged Skylab

    Science.gov (United States)

    1973-01-01

    The Saturn V vehicle, carrying the unmanned orbital workshop for the Skylab-1 mission, lifted off successfully and all systems performed normally. Sixty-three seconds into the flight, engineers in the operations support and control center saw an unexpected telemetry indication signalling that damage had occurred to one solar array and to the micrometeoroid shield during launch. The micrometeoroid shield, a thin protective cylinder surrounding the workshop to protect it from tiny space particles and the sun's scorching heat, had ripped loose from its position around the workshop. This caused the loss of one solar wing and jammed the other. Still unoccupied, the Skylab was stricken: with the heat shield lost, sunlight beat mercilessly on the lab's sensitive skin. Internal temperatures soared, rendering the station uninhabitable and threatening foods, medicines, films, and experiments. This image, taken during a fly-around inspection by the Skylab-2 crew, shows the crippled Skylab in orbit. The crew found their home in space to be in serious shape: the heat shield gone, one solar wing gone, and the other jammed. The Marshall Space Flight Center (MSFC) developed, tested, rehearsed, and approved three repair options: a parasol sunshade and a twin-pole sunshade to restore the temperature inside the workshop, and a set of metal-cutting tools to free the jammed solar panel.

  16. Structural damage

    International Nuclear Information System (INIS)

    Gray, R.E.; Bruhn, R.W.

    1992-01-01

    Virtually all structures show some signs of distress due to deterioration of the building components, to changed loads, or to changed support conditions. Changed support conditions result from ground movements. In mining regions many cases of structural distress are attributed to mining without considering alternative causes. This is particularly true of coal mining since it occurs under extensive areas. Coal mining is estimated to have already undermined more than eight million acres and may eventually undermine 40 million acres in the United States. Other nonmetal and metal underground mines impact much smaller areas. Although it is sometimes difficult, even with careful study, to identify the actual cause of damage, persons responsible for underground coal mining should at least be aware of possible causes of building stress other than mine subsidence. This paper presents information on distress to structures and briefly reviews a number of causes of ground movements other than subsidence: Mass movements, dissolution, erosion, frost action, shrinking and swelling, yield into excavations and compressibility

  17. Variational inference & deep learning : A new synthesis

    NARCIS (Netherlands)

    Kingma, D.P.

    2017-01-01

    In this thesis, Variational Inference and Deep Learning: A New Synthesis, we propose novel solutions to the problems of variational (Bayesian) inference, generative modeling, representation learning, semi-supervised learning, and stochastic optimization.

  18. Variational inference & deep learning: A new synthesis

    OpenAIRE

    Kingma, D.P.

    2017-01-01

    In this thesis, Variational Inference and Deep Learning: A New Synthesis, we propose novel solutions to the problems of variational (Bayesian) inference, generative modeling, representation learning, semi-supervised learning, and stochastic optimization.

  19. Continuous Integrated Invariant Inference, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project will develop a new technique for invariant inference and embed this and other current invariant inference and checking techniques in an...

  20. Proceedings of a one-week course on exploiting anomalous scattering in macromolecular structure determination (EMBO'07)

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, M.S.; Shepard, W.; Dauter, Z.; Leslie, A.; Diederichs, K.; Evans, G.; Svensson, O.; Schneider, T.; Bricogne, G.; Dauter, Z.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Leslie, A.; Kabsch, W.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Read, R.; Panjikar, S.; Pannu, N.S.; Dauter, Z.; Weiss, M.S.; McSweeney, S

    2007-07-01

    This course, which was directed to young scientists, illustrated both theoretical and practical aspects of macromolecular crystal structure solution using synchrotron radiation. Some software dedicated to data collection, processing and analysis were presented. This document gathers only the slides of the presentations.

  1. Macromolecular crowding compacts unfolded apoflavodoxin and causes severe aggregation of the off-pathway intermediate during apoflavodoxin folding

    NARCIS (Netherlands)

    Engel, R.; Westphal, A.H.; Huberts, D.; Nabuurs, S.M.; Lindhoud, S.; Visser, A.J.W.G.; Mierlo, van C.P.M.

    2008-01-01

    To understand how proteins fold in vivo, it is important to investigate the effects of macromolecular crowding on protein folding. Here, the influence of crowding on in vitro apoflavodoxin folding, which involves a relatively stable off-pathway intermediate with molten globule characteristics, is

  2. Proceedings of a one-week course on exploiting anomalous scattering in macromolecular structure determination (EMBO'07)

    International Nuclear Information System (INIS)

    Weiss, M.S.; Shepard, W.; Dauter, Z.; Leslie, A.; Diederichs, K.; Evans, G.; Svensson, O.; Schneider, T.; Bricogne, G.; Dauter, Z.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Leslie, A.; Kabsch, W.; Flensburg, C.; Terwilliger, T.; Lamzin, V.; Read, R.; Panjikar, S.; Pannu, N.S.; Dauter, Z.; Weiss, M.S.; McSweeney, S.

    2007-01-01

    This course, which was directed to young scientists, illustrated both theoretical and practical aspects of macromolecular crystal structure solution using synchrotron radiation. Some software dedicated to data collection, processing and analysis were presented. This document gathers only the slides of the presentations

  3. Probing the Interplay of Size, Shape, and Solution Environment in Macromolecular Diffusion Using a Simple Refraction Experiment

    Science.gov (United States)

    Mankidy, Bijith D.; Coutinho, Cecil A.; Gupta, Vinay K.

    2010-01-01

    The diffusion coefficient of polymers is a critical parameter in biomedicine, catalysis, chemical separations, nanotechnology, and other industrial applications. Here, measurement of macromolecular diffusion in solutions is described using a visually instructive, undergraduate-level optical refraction experiment based on Weiner's method. To…

  4. Proceedings of a one-week course on exploiting anomalous scattering in macromolecular structure determination (EMBO'07)

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, M S; Shepard, W; Dauter, Z; Leslie, A; Diederichs, K; Evans, G; Svensson, O; Schneider, T; Bricogne, G; Dauter, Z; Flensburg, C; Terwilliger, T; Lamzin, V; Leslie, A; Kabsch, W; Flensburg, C; Terwilliger, T; Lamzin, V; Read, R; Panjikar, S; Pannu, N S; Dauter, Z; Weiss, M S; McSweeney, S

    2007-07-01

    This course, which was directed to young scientists, illustrated both theoretical and practical aspects of macromolecular crystal structure solution using synchrotron radiation. Some software dedicated to data collection, processing and analysis were presented. This document gathers only the slides of the presentations.

  5. Radiation damage prediction system using damage function

    International Nuclear Information System (INIS)

    Tanaka, Yoshihisa; Mori, Seiji

    1979-01-01

    An irradiation damage analysis system using a damage function was investigated. The system consists of three processes: unfolding of the damage function, calculation of the neutron flux spectrum for the object of the damage analysis, and estimation of the irradiation effect on that object. The damage function is calculated by applying the SAND-2 code; the ANISN and DOT3.5 codes are used to calculate the neutron flux. The neutron irradiation and the allowable time of reactor operation can be estimated from these calculations of the damage function and neutron flux. The flow diagram of the damage-function analysis process and the flow diagram of the SAND-2 code are presented, and the analytical code for estimating damage, which is determined from a damage function and a neutron spectrum, is explained. The system was applied to the core support structure of a fast breeder reactor for damage estimation and uncertainty evaluation. The fundamental analytical conditions and the analytical model for this work are presented; then the irradiation data for SUS304, the initial estimates of the damage function, the error analysis for the damage function, and the analytical results are explained for the computation of a damage function for 10% total elongation. Concerning the damage estimation of the FBR core support structure, the standard and lower limiting values of damage, the permissible neutron flux, and the allowable years of reactor operation are presented and evaluated. (Nakai, Y.)

  6. A Test of Macromolecular Crystallization in Microgravity: Large, Well-Ordered Insulin Crystals

    Science.gov (United States)

    Borgstahl, Gloria E. O.; Vahedi-Faridi, Ardeschir; Lovelace, Jeff; Bellamy, Henry D.; Snell, Edward H.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Crystals of insulin grown in microgravity on space shuttle mission STS-95 were extremely well-ordered and unusually large (many > 2 mm). The physical characteristics of six microgravity and six earth-grown crystals were examined by X-ray analysis employing superfine φ-slicing and unfocused synchrotron radiation. This experimental setup allowed hundreds of reflections to be precisely examined for each crystal in a short period of time. The microgravity crystals were on average 34 times larger, had 7 times lower mosaicity, had 54 times higher reflection peak heights, and diffracted to significantly higher resolution than their earth-grown counterparts. A single mosaic domain model could account for reflections in microgravity crystals, whereas reflections from earth crystals required a model with multiple mosaic domains. This statistically significant and unbiased characterization indicates that the microgravity environment was useful for improving crystal growth and the resultant diffraction quality of insulin crystals, and may be similarly useful for macromolecular crystals in general.

  7. Acoustic methods for high-throughput protein crystal mounting at next-generation macromolecular crystallographic beamlines.

    Science.gov (United States)

    Roessler, Christian G; Kuczewski, Anthony; Stearns, Richard; Ellson, Richard; Olechno, Joseph; Orville, Allen M; Allaire, Marc; Soares, Alexei S; Héroux, Annie

    2013-09-01

    To take full advantage of advanced data collection techniques and high beam flux at next-generation macromolecular crystallography beamlines, rapid and reliable methods will be needed to mount and align many samples per second. One approach is to use an acoustic ejector to eject crystal-containing droplets onto a solid X-ray transparent surface, which can then be positioned and rotated for data collection. Proof-of-concept experiments were conducted at the National Synchrotron Light Source on thermolysin crystals acoustically ejected onto a polyimide `conveyor belt'. Small wedges of data were collected on each crystal, and a complete dataset was assembled from a well diffracting subset of these crystals. Future developments and implementation will focus on achieving ejection and translation of single droplets at a rate of over one hundred per second.

  8. Proteome-wide dataset supporting the study of ancient metazoan macromolecular complexes

    Directory of Open Access Journals (Sweden)

    Sadhna Phanse

    2016-03-01

    Full Text Available Our analysis examines the conservation of multiprotein complexes among metazoa through the use of high-resolution biochemical fractionation and precision mass spectrometry applied to soluble cell extracts from 5 representative model organisms: Caenorhabditis elegans, Drosophila melanogaster, Mus musculus, Strongylocentrotus purpuratus, and Homo sapiens. The interaction network obtained from the data was validated globally in 4 distant species (Xenopus laevis, Nematostella vectensis, Dictyostelium discoideum, Saccharomyces cerevisiae) and locally by targeted affinity-purification experiments. Here we provide details of our massive set of supporting biochemical fractionation data, available via ProteomeXchange (http://www.ebi.ac.uk/pride/archive/projects/PXD002319-http://www.ebi.ac.uk/pride/archive/projects/PXD002328), PPIs via BioGRID (185267), and interaction network projections via http://metazoa.med.utoronto.ca, all made fully accessible to allow further exploration. The datasets here are related to the research article on metazoan macromolecular complexes in Nature [1]. Keywords: Proteomics, Metazoa, Protein complexes, Biochemical, Fractionation

  9. Functionalization of Planet-Satellite Nanostructures Revealed by Nanoscopic Localization of Distinct Macromolecular Species

    KAUST Repository

    Rossner, Christian

    2016-09-26

    The development of a straightforward method is reported to form hybrid polymer/gold planet-satellite nanostructures (PlSNs) with functional polymer. Polyacrylate type polymer with benzyl chloride in its backbone as a macromolecular tracer is synthesized to study its localization within PlSNs by analyzing the elemental distribution of chlorine. The functionalized nanohybrid structures are analyzed by scanning transmission electron microscopy, electron energy loss spectroscopy, and spectrum imaging. The results show that the RAFT (reversible addition-fragmentation chain transfer) polymers' sulfur containing end groups are colocalized at the gold cores, both within nanohybrids of simple core-shell morphology and within higher order PlSNs, providing microscopic evidence for the affinity of the RAFT group toward gold surfaces. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA., Weinheim.

  10. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    Science.gov (United States)

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
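    The thermogram-integration step described here can be sketched in its simplest form: baseline-subtracted trapezoidal integration of a synthetic injection peak. This is a generic illustration, not the automated peak-shape analysis of NITPIC; the decay constant, baseline, and amplitude are arbitrary values chosen for the example.

    ```python
    import numpy as np

    # Synthetic injection peak: exponential decay on a flat baseline (power in uW)
    t = np.linspace(0.0, 240.0, 2401)        # seconds, 0.1 s sampling
    baseline = 5.0                           # assumed flat baseline, uW
    signal = baseline + 12.0 * np.exp(-t / 20.0)

    # Heat of the injection = area between thermogram and baseline (uJ),
    # computed with the trapezoidal rule
    y = signal - baseline
    heat_uJ = float(np.sum((y[:-1] + y[1:]) / 2.0 * np.diff(t)))
    print(round(heat_uJ, 1))  # ~240.0 (analytic value: 12 * 20 * (1 - e^-12))
    ```

    Real thermograms have sloping, noisy baselines and overlapping peaks, which is exactly why the protocol's automated shape analysis improves on naive integration like this.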

  11. Recent Major Improvements to the ALS Sector 5 Macromolecular Crystallography Beamlines

    International Nuclear Information System (INIS)

    Morton, Simon A.; Glossinger, James; Smith-Baumann, Alexis; McKean, John P.; Trame, Christine; Dickert, Jeff; Rozales, Anthony; Dauz, Azer; Taylor, John; Zwart, Petrus; Duarte, Robert; Padmore, Howard; McDermott, Gerry; Adams, Paul

    2007-01-01

    Although the Advanced Light Source (ALS) was initially conceived primarily as a low-energy (1.9 GeV) third-generation source of VUV and soft X-ray radiation, it was realized very early in the development of the facility that a multipole wiggler source coupled with high-quality (brightness-preserving) optics would result in a beamline whose performance across the optimal energy range (5-15 keV) for macromolecular crystallography (MX) would be comparable to, or would even exceed, that of many existing crystallography beamlines at higher-energy facilities. Hence, starting in 1996, a suite of three beamlines branching off a single wiggler source was constructed, which together formed the ALS Macromolecular Crystallography Facility. From the outset this facility was designed to cater equally to the needs of both academic and industrial users, with a heavy emphasis placed on the development and introduction of high-throughput crystallographic tools, techniques, and facilities, such as large-area CCD detectors, robotic sample handling and automounting facilities, a service crystallography program, and a tightly integrated, centralized, and highly automated beamline control environment for users. This facility was immediately successful, with the primary Multiwavelength Anomalous Diffraction beamline (5.0.2) in particular rapidly becoming one of the foremost crystallographic facilities in the US, responsible for structures such as the 70S ribosome. This success in turn triggered enormous growth of the ALS macromolecular crystallography community and spurred the development of five additional ALS MX beamlines, all utilizing the newly developed superconducting bending magnets ('superbends') as sources. However, in the years since the original Sector 5.0 beamlines were built, the performance demands of macromolecular crystallography users have become ever more exacting, with growing emphasis placed on studying larger complexes, more difficult structures, weakly diffracting or smaller

  12. Mix and Inject: Reaction Initiation by Diffusion for Time-Resolved Macromolecular Crystallography

    Directory of Open Access Journals (Sweden)

    Marius Schmidt

    2013-01-01

    Full Text Available Time-resolved macromolecular crystallography unifies structure determination with chemical kinetics, since the structures of transient states and chemical and kinetic mechanisms can be determined simultaneously from the same data. To start a reaction in an enzyme, typically an initially inactive substrate present in the crystal is activated. This has particular disadvantages that are circumvented when active substrate is provided directly by diffusion. However, macroscopic crystals are then prohibitive, because diffusion times become too long. With small micro- and nanocrystals, diffusion times are adequately short for most enzymes and the reaction can be swiftly initiated. We demonstrate here that a time-resolved crystallographic experiment becomes feasible by mixing substrate with enzyme nanocrystals, which are subsequently injected into the X-ray beam of a pulsed X-ray source.
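    The diffusion-time argument can be made concrete with the standard scaling t ≈ L²/D for a substrate diffusing a distance L. The diffusion coefficient below is an assumed, illustrative value for a small substrate in crystal solvent channels, not a number from the paper.

    ```python
    def diffusion_time_s(size_m: float, D_m2_per_s: float = 1e-10) -> float:
        """Characteristic diffusion time t ~ L^2 / D for a crystal of edge length L.
        D = 1e-10 m^2/s is an assumed order-of-magnitude value."""
        return size_m ** 2 / D_m2_per_s

    macro = diffusion_time_s(100e-6)  # 100 um macroscopic crystal -> 100 s
    nano = diffusion_time_s(1e-6)     # 1 um microcrystal -> 0.01 s
    print(macro, nano)
    ```

    The quadratic dependence on crystal size is why shrinking the crystal by a factor of 100 shortens mixing-initiated reaction start times by a factor of 10,000, bringing them below typical enzyme turnover times.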

  13. OCTOPUS: an innovative multimodal diffractometer for neutron macromolecular crystallography across the length scales

    International Nuclear Information System (INIS)

    Blakeley, M.P.; Andersen, K.; Kreuz, M.; Giroud, B.; McSweeney, S.; Mitchell, E.; Teixeira, S.C.M.; Forsyth, V.T.

    2011-01-01

    We propose to construct a novel protein diffractometer at position H112B. The new instrument will deliver major efficiency gains, as well as offering greatly extended flexibility through the option of several easily interchangeable modes of operation. This proposal builds on the demonstrable need to extend ILL's capacity for high resolution structural studies of protein systems, as well as a need to widen the scope of biological crystallography - in particular for monochromatic studies at both high and low resolution. The development will be carried out in close collaboration with structural biologists at the ESRF, and engineered in such a way that the user interface of the instrument (from sample to software) will be transparently identifiable to a large, dynamic, and driven community of European synchrotron X-ray macromolecular crystallographers. (authors)

  14. C1 Polymerization: a unique tool towards polyethylene-based complex macromolecular architectures

    KAUST Repository

    Wang, De

    2017-05-09

    The recent developments in organoborane initiated C1 polymerization (chain grows by one atom at a time) of ylides opens unique horizons towards well-defined/perfectly linear polymethylenes (equivalent to polyethylenes, PE) and PE-based complex macromolecular architectures. The general mechanism of C1 polymerization (polyhomologation) involves the formation of a Lewis complex between a methylide (monomer) and a borane (initiator), followed by migration/insertion of a methylene into the initiator and after oxidation/hydrolysis to afford OH-terminated polyethylenes. This review summarizes efforts towards conventional and newly discovered borane-initiators and ylides (monomers), as well as a combination of polyhomologation with other polymerization methods. Initial efforts dealing with C3 polymerization and the synthesis of the first C1/C3 copolymers are also given. Finally, some thoughts for the future of these polymerizations are presented.

  15. Control and data acquisition system for the macromolecular crystallography beamline of SSRF

    International Nuclear Information System (INIS)

    Wang Qisheng; Huang Sheng; Sun Bo; Tang Lin; He Jianhua

    2012-01-01

    The macromolecular crystallography beamline BL17U1 of the Shanghai Synchrotron Radiation Facility (SSRF) is an important platform for structural biology. High performance of the beamline greatly benefits users in their experiments and data acquisition. To take full advantage of the state-of-the-art mechanical and physical design of the beamline, we have made a series of efforts to develop a robust control and data acquisition system with a user-friendly GUI. This was done by adopting the EPICS and Blu-Ice systems on the BL17U1 beamline, with consideration given to easy accommodation of new beamline components. In this paper, we report the integration of the EPICS and Blu-Ice systems. By using the EPICS gateway interface and several new DHS, Blu-Ice was successfully established for the BL17U1 beamline. As a result, the experiment control and data acquisition system is reliable and functional for users. (authors)

  16. Reliable and efficient solution of genome-scale models of Metabolism and macromolecular Expression

    DEFF Research Database (Denmark)

    Ma, Ding; Yang, Laurence; Fleming, Ronan M. T.

    2017-01-01

    Constraint-Based Reconstruction and Analysis (COBRA) is currently the only methodology that permits integrated modeling of Metabolism and macromolecular Expression (ME) at genome-scale. Linear optimization computes steady-state flux solutions to ME models, but flux values are spread over many orders of magnitude. Data values also have greatly varying magnitudes. Standard double-precision solvers may return inaccurate solutions or report that no solution exists. Exact simplex solvers based on rational arithmetic require a near-optimal warm start to be practical on large problems (current ME models have 70,000 constraints and variables and will grow larger). We have developed a quadruple-precision version of our linear and nonlinear optimizer MINOS, and a solution procedure (DQQ) involving Double and Quad MINOS that achieves reliability and efficiency for ME models and other challenging...
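    The precision issue that motivates a quad-precision solver can be seen in a minimal example; this is a generic illustration of double-precision roundoff across disparate magnitudes, not the DQQ procedure itself.

    ```python
    from decimal import Decimal, getcontext

    # Double precision carries ~16 significant digits, so a small flux
    # added to a large one is silently lost:
    big, small = 1e16, 1.0
    print((big + small) - big)   # 0.0 -- the small value vanished

    # Emulating a wider significand (cf. quad-precision MINOS) preserves it:
    getcontext().prec = 34       # roughly a quad-precision significand
    d = (Decimal(big) + Decimal(small)) - Decimal(big)
    print(d)                     # 1
    ```

    In an LP with constraint coefficients and fluxes spanning many orders of magnitude, this same cancellation can make a feasible model appear infeasible to a double-precision simplex code.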

  17. Site-selective electroless nickel plating on patterned thin films of macromolecular metal complexes.

    Science.gov (United States)

    Kimura, Mutsumi; Yamagiwa, Hiroki; Asakawa, Daisuke; Noguchi, Makoto; Kurashina, Tadashi; Fukawa, Tadashi; Shirai, Hirofusa

    2010-12-01

    We demonstrate a simple route to depositing nickel layer patterns using photocross-linked polymer thin films containing palladium catalysts, which can be used as adhesive interlayers for the fabrication of nickel patterns on glass and plastic substrates. Electroless nickel patterns can be obtained in three steps: (i) pattern formation of partially quaternized poly(vinyl pyridine) by UV irradiation, (ii) formation of a macromolecular metal complex with palladium, and (iii) nickel metallization using an electroless plating bath. Metallization is site-selective and allows for high resolution, and the resulting nickel layered structure shows good adhesion to glass and plastic substrates. The direct patterning of metallic layers onto insulating substrates indicates a great potential for fabricating micro/nano devices.

  18. Structure, function and folding of phosphoglycerate kinase are strongly perturbed by macromolecular crowding.

    Science.gov (United States)

    Samiotakis, Antonios; Dhar, Apratim; Ebbinghaus, Simon; Nienhaus, Lea; Homouz, Dirar; Gruebele, Martin; Cheung, Margaret

    2010-10-01

    We combine experiment and computer simulation to show how macromolecular crowding dramatically affects the structure, function and folding landscape of phosphoglycerate kinase (PGK). Fluorescence labeling shows that compact states of yeast PGK are populated as the amount of crowding agents (Ficoll 70) increases. Coarse-grained molecular simulations reveal three compact ensembles: C (crystal structure), CC (collapsed crystal) and Sph (spherical compact). With an adjustment for viscosity, crowded wild type PGK and fluorescent PGK are about 15 times or more active in 200 mg/ml Ficoll than in aqueous solution. Our results suggest a new solution to the classic problem of how the ADP and diphosphoglycerate binding sites of PGK come together to make ATP: rather than undergoing a hinge motion, the ADP and substrate sites are already located in proximity under crowded conditions that mimic the in vivo conditions under which the enzyme actually operates.

  19. Macromolecular contrast media. A new approach for characterising breast tumors with MR-mammography

    International Nuclear Information System (INIS)

    Daldrup, H.E.; Gossmann, A.; Koeln Univ.; Wendland, M.; Brasch, R.C.; Rosenau, W.

    1997-01-01

    The value of macromolecular contrast agents (MMCM) for the characterization of benign and malignant breast tumors will be demonstrated in this review. Animal studies suggest a high potential of MMCM to increase the specificity of MR-mammography. The concept of tumor differentiation is based on the pathological hyperpermeability of microvessels in malignant tumors: MMCM leak into the interstitium of carcinomas, whereas they are confined to the intravascular space in benign tumors. The capabilities and limitations of the MMCM prototype albumin-Gd-DTPA for breast tumor characterization will be summarized and compared to the standard low-molecular-weight contrast agent Gd-DTPA. Initial experience with new MMCM, such as dendrimers, Gd-DTPA-polylysine, and MS-325, will be outlined. The potential of 'blood-pool' iron oxides, such as AMI-227, for the evaluation of tumor microvascular permeabilities will be discussed. (orig.) [de

  20. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines.

    Science.gov (United States)

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

  1. C1 Polymerization: a unique tool towards polyethylene-based complex macromolecular architectures

    KAUST Repository

    Wang, De; Zhang, Zhen; Hadjichristidis, Nikolaos

    2017-01-01

    The recent developments in organoborane initiated C1 polymerization (chain grows by one atom at a time) of ylides opens unique horizons towards well-defined/perfectly linear polymethylenes (equivalent to polyethylenes, PE) and PE-based complex macromolecular architectures. The general mechanism of C1 polymerization (polyhomologation) involves the formation of a Lewis complex between a methylide (monomer) and a borane (initiator), followed by migration/insertion of a methylene into the initiator and after oxidation/hydrolysis to afford OH-terminated polyethylenes. This review summarizes efforts towards conventional and newly discovered borane-initiators and ylides (monomers), as well as a combination of polyhomologation with other polymerization methods. Initial efforts dealing with C3 polymerization and the synthesis of the first C1/C3 copolymers are also given. Finally, some thoughts for the future of these polymerizations are presented.

  2. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

    There are a number of statistical inference problems that are not generally formulated via a full probability model. ... For the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle.

  3. Adaptive Inference on General Graphical Models

    OpenAIRE

    Acar, Umut A.; Ihler, Alexander T.; Mettu, Ramgopal; Sumer, Ozgur

    2012-01-01

    Many algorithms and applications involve repeatedly solving variations of the same inference problem; for example we may want to introduce new evidence to the model or perform updates to conditional dependencies. The goal of adaptive inference is to take advantage of what is preserved in the model and perform inference more rapidly than from scratch. In this paper, we describe techniques for adaptive inference on general graphs that support marginal computation and updates to the conditional ...

  4. A 3D Image Filter for Parameter-Free Segmentation of Macromolecular Structures from Electron Tomograms

    Science.gov (United States)

    Ali, Rubbiya A.; Landsberg, Michael J.; Knauth, Emily; Morgan, Garry P.; Marsh, Brad J.; Hankamer, Ben

    2012-01-01

    3D image reconstruction of large cellular volumes by electron tomography (ET) at high (≤5 nm) resolution can now routinely resolve organellar and compartmental membrane structures, protein coats, cytoskeletal filaments, and macromolecules. However, current image analysis methods for identifying in situ macromolecular structures within the crowded 3D ultrastructural landscape of a cell remain labor-intensive, time-consuming, and prone to user bias and/or error. This paper demonstrates the development and application of a parameter-free, 3D implementation of the bilateral edge-detection (BLE) algorithm for the rapid and accurate segmentation of cellular tomograms. The performance of the 3D BLE filter has been tested on a range of synthetic and real biological data sets and validated against the current leading filters, the pseudo-3D recursive and Canny filters. The performance of the 3D BLE filter was found to be comparable to or better than that of both the 3D recursive and Canny filters, while offering the significant advantage that it requires no parameter input or optimisation. Edge widths as small as 2 pixels are reproducibly detected, with signal intensity and grey-scale values as low as 0.72% above the mean of the background noise. The 3D BLE thus provides an efficient method for the automated segmentation of complex cellular structures across multiple scales for further downstream processing, such as cellular annotation and sub-tomogram averaging. It is a valuable tool for the accurate, high-throughput identification and annotation of 3D structural complexity at the subcellular level, and for mapping the spatial and temporal rearrangement of macromolecular assemblies in situ within cellular tomograms. PMID:22479430
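    For contrast with the parameter-free BLE filter the abstract describes, the simplest 3D edge detector (thresholded gradient magnitude) requires exactly the kind of user-chosen parameter BLE avoids. A minimal sketch with an illustrative toy volume and threshold (not the authors' algorithm):

```python
import numpy as np

# Baseline 3D edge detection: gradient magnitude with a manual threshold.
# The volume and threshold below are illustrative only.
def gradient_edges(volume, threshold):
    gx, gy, gz = np.gradient(volume.astype(float))
    mag = np.sqrt(gx**2 + gy**2 + gz**2)
    return mag > threshold

vol = np.zeros((8, 8, 8))
vol[:, :, 4:] = 1.0                       # a flat intensity boundary at z = 4
edges = gradient_edges(vol, threshold=0.25)
assert edges[:, :, 3:5].all()             # the boundary is detected
assert not edges[:, :, :3].any()          # flat regions are not
```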

  5. Optimization of selective inversion recovery magnetization transfer imaging for macromolecular content mapping in the human brain.

    Science.gov (United States)

    Dortch, Richard D; Bagnato, Francesca; Gochberg, Daniel F; Gore, John C; Smith, Seth A

    2018-03-24

    To optimize a selective inversion recovery (SIR) sequence for macromolecular content mapping in the human brain at 3.0 T. SIR is a quantitative method for measuring magnetization transfer (qMT) that uses a low-power, on-resonance inversion pulse. This results in a biexponential recovery of the free water signal that can be sampled at various inversion/predelay times (tI/tD) to estimate a subset of qMT parameters, including the macromolecular-to-free pool-size ratio (PSR), the R1 of free water (R1f), and the rate of MT exchange (kmf). The adoption of SIR has been limited by long acquisition times (≈4 min/slice). Here, we use Cramér-Rao lower bound theory and data reduction strategies to select optimal tI/tD combinations to reduce imaging times. The schemes were experimentally validated in phantoms, and tested in healthy volunteers (N = 4) and a multiple sclerosis patient. Two optimal sampling schemes were determined: (i) a 5-point scheme (kmf estimated) and (ii) a 4-point scheme (kmf assumed). In phantoms, the 5/4-point schemes yielded parameter estimates with similar SNRs to our previous 16-point scheme, but with 4.1/6.1-fold shorter scan times. Pair-wise comparisons between schemes did not detect significant differences for any scheme/parameter. In humans, parameter values were consistent with published values, and similar levels of precision were obtained from all schemes. Furthermore, fixing kmf reduced the sensitivity of PSR to partial-volume averaging, yielding more consistent estimates throughout the brain. qMT parameters can be robustly estimated in ≤1 min/slice (without independent measures of ΔB0, B1+, and T1) when optimized tI/tD combinations are selected. © 2018 International Society for Magnetic Resonance in Medicine.
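    The biexponential recovery the abstract describes can be sketched as follows; the amplitudes and rates below are illustrative placeholders, not the paper's fitted values:

```python
import math

# Hedged sketch: biexponential recovery of the free-water signal after a
# low-power inversion pulse, sampled at inversion time t_i (seconds).
# S(t) = S_inf * (1 + b1*exp(-r1*t) + b2*exp(-r2*t)); all parameter values
# here are made up for illustration.
def sir_signal(t_i, s_inf=1.0, b1=-1.2, b2=-0.3, r1=1.0, r2=10.0):
    return s_inf * (1.0 + b1 * math.exp(-r1 * t_i) + b2 * math.exp(-r2 * t_i))

# Signal is inverted at short t_i and recovers toward S_inf at long t_i.
assert sir_signal(0.01) < 0.0
assert abs(sir_signal(50.0) - 1.0) < 1e-6
```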

  6. Macromolecular composition of terrestrial and marine organic matter in sediments across the East Siberian Arctic Shelf

    Science.gov (United States)

    Sparkes, Robert B.; Doğrul Selver, Ayça; Gustafsson, Örjan; Semiletov, Igor P.; Haghipour, Negar; Wacker, Lukas; Eglinton, Timothy I.; Talbot, Helen M.; van Dongen, Bart E.

    2016-10-01

    Mobilisation of terrestrial organic carbon (terrOC) from permafrost environments in eastern Siberia has the potential to deliver significant amounts of carbon to the Arctic Ocean, via both fluvial and coastal erosion. Eroded terrOC can be degraded during offshore transport or deposited across the wide East Siberian Arctic Shelf (ESAS). Most studies of terrOC on the ESAS have concentrated on solvent-extractable organic matter, but this represents only a small proportion of the total terrOC load. In this study we have used pyrolysis-gas chromatography-mass spectrometry (py-GCMS) to study all major groups of macromolecular components of the terrOC; this is the first time that this technique has been applied to the ESAS. This has shown that there is a strong offshore trend from terrestrial phenols, aromatics and cyclopentenones to marine pyridines. There is good agreement between the proportion of phenols measured using py-GCMS and independent quantification of lignin phenol concentrations (r2 = 0.67, p < 0.01, n = 24). Furfurals, thought to represent carbohydrates, show no offshore trend and are likely found in both marine and terrestrial organic matter. We have also collected new radiocarbon data for bulk OC (14COC) which, when coupled with previous measurements, allows us to produce the most comprehensive 14COC map of the ESAS to date. Combining the 14COC and py-GCMS data suggests that the aromatics group of compounds is likely sourced from old, aged terrOC, in contrast to the phenols group, which is likely sourced from modern woody material. We propose that an index of the relative proportions of phenols and pyridines can be used as a novel terrestrial vs. marine proxy measurement for macromolecular organic matter. Principal component analysis found that various terrestrial vs. marine proxies show different patterns across the ESAS, and it shows that multiple river-ocean transects of surface sediments transition from river-dominated to coastal-erosion-dominated to marine-dominated signatures.
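    The proposed index of relative phenol and pyridine proportions reduces to a simple ratio. A minimal sketch (the function name and 0-1 convention are our assumptions; the values are illustrative, not data from the study):

```python
# Terrestrial vs. marine proxy for macromolecular organic matter:
# phenols (terrestrial) relative to pyridines (marine) among pyrolysis
# products. 1 = fully terrestrial, 0 = fully marine. Illustrative only.
def phenol_pyridine_index(phenols, pyridines):
    total = phenols + pyridines
    if total == 0:
        raise ValueError("no phenols or pyridines detected")
    return phenols / total

print(phenol_pyridine_index(8.0, 2.0))   # 0.8, a terrestrially dominated sample
```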

  7. Superhydrophobic hybrid membranes by grafting arc-like macromolecular bridges on graphene sheets: Synthesis, characterization and properties

    Science.gov (United States)

    Mo, Zhao-Hua; Luo, Zheng; Huang, Qiang; Deng, Jian-Ping; Wu, Yi-Xian

    2018-05-01

    Grafting single end-tethered polymer chains onto the surface of graphene is a conventional way to modify the surface properties of graphene oxide, but grafting arc-like macromolecular bridges onto graphene surfaces has barely been reported. Herein, graphene sheets grafted with novel arc-like polydimethylsiloxane (PDMS) macromolecular bridges (GO-g-Arc PDMS) were successfully synthesized via a confined interface reaction at 90 °C. Both the hydrophilic α- and ω-amino groups of linear hydrophobic NH2-PDMS-NH2 macromolecular chains rapidly reacted with epoxy and carboxyl groups on the surfaces of graphene oxide in water suspension to form arc-like PDMS macromolecular bridges on the graphene sheets. The grafting density of arc-like PDMS bridges on graphene sheets can reach up to 0.80 mmol g-1, or 1.32 arc-like bridges per nm2, by this confined interface reaction. The water contact angle (WCA) of the hybrid membrane increased with both the grafting density and the content of the covalent arc-like bridge architecture. A superhydrophobic hybrid membrane with a WCA of 153.4° was prepared by grinding the above arc-like-PDMS-bridge-grafted graphene hybrid, dispersing it in ethanol, and filtering it through an organic filter membrane. This superhydrophobic hybrid membrane shows good self-cleaning and complete oil-water separation properties, which provides potential applications in anticontamination coatings and oil-water separation. To the best of our knowledge, this is the first report on the synthesis of functional hybrid membranes by grafting arc-like PDMS macromolecular bridges on graphene sheets via a confined interface reaction.

  8. More than one kind of inference: re-examining what's learned in feature inference and classification.

    Science.gov (United States)

    Sweller, Naomi; Hayes, Brett K

    2010-08-01

    Three studies examined how task demands that affect attention to typical or atypical category features shape the category representations formed through classification learning and inference learning. During training, categories were learned via exemplar classification or by inferring missing exemplar features. In the latter condition, inferences were made either about missing typical features alone (typical feature inference) or about both missing typical and atypical features (mixed feature inference). Classification and mixed feature inference led to the incorporation of typical and atypical features into category representations, with both kinds of features influencing inferences about familiar (Experiments 1 and 2) and novel (Experiment 3) test items. Those in the typical inference condition focused primarily on typical features. Together with formal modelling, these results challenge previous accounts that have characterized inference learning as producing a focus on typical category features. The results show that two different kinds of inference learning are possible and that these are subserved by different kinds of category representations.

  9. Generative inference for cultural evolution.

    Science.gov (United States)

    Kandler, Anne; Powell, Adam

    2018-04-05

    One of the major challenges in cultural evolution is to understand why and how various forms of social learning are used in human populations, both now and in the past. To date, much of the theoretical work on social learning has been done in isolation of data, and consequently many insights focus on revealing the learning processes or the distributions of cultural variants that are expected to have evolved in human populations. In population genetics, recent methodological advances have allowed a greater understanding of the explicit demographic and/or selection mechanisms that underlie observed allele frequency distributions across the globe, and their change through time. In particular, generative frameworks, often using coalescent-based simulation coupled with approximate Bayesian computation (ABC), have provided robust inferences on the human past, with no reliance on a priori assumptions of equilibrium. Here, we demonstrate the applicability and utility of generative inference approaches to the field of cultural evolution. The framework advocated here uses observed population-level frequency data directly to establish the likely presence or absence of particular hypothesized learning strategies. In this context, we discuss the problem of equifinality and argue that, in the light of sparse cultural data and the multiplicity of possible social learning processes, the exclusion of those processes inconsistent with the observed data might be the most instructive outcome. Finally, we summarize the findings of generative inference approaches applied to a number of case studies. This article is part of the theme issue 'Bridging cultural gaps: interdisciplinary studies in human cultural evolution'. © 2018 The Author(s).
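    The generative ABC workflow the abstract advocates can be sketched in miniature: draw parameters from a prior, simulate variant frequencies under a hypothesized learning model, and keep draws whose simulated outcome is close to the observed data. The unbiased-copying (pure drift) simulator and all numbers below are deliberately simple stand-ins, not the authors' models:

```python
import random

# Minimal ABC rejection sampler for a cultural-transmission model.
def simulate_freq(p0, n_agents, n_gens, rng):
    """Final frequency of a cultural variant under unbiased copying (drift)."""
    p = p0
    for _ in range(n_gens):
        k = sum(1 for _ in range(n_agents) if rng.random() < p)
        p = k / n_agents
    return p

def abc_rejection(observed, n_draws, tol, rng):
    """Keep prior draws of the initial frequency whose simulation matches the data."""
    accepted = []
    for _ in range(n_draws):
        p0 = rng.random()                 # uniform prior on initial frequency
        sim = simulate_freq(p0, n_agents=50, n_gens=10, rng=rng)
        if abs(sim - observed) < tol:
            accepted.append(p0)
    return accepted

rng = random.Random(0)
post = abc_rejection(observed=0.7, n_draws=2000, tol=0.05, rng=rng)
assert post                               # some draws are accepted
# Accepted p0 values concentrate around the observed frequency.
```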

  10. sick: The Spectroscopic Inference Crank

    Science.gov (United States)

    Casey, Andrew R.

    2016-03-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal
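    One piece of the machinery described above, approximating model intensities from an existing grid of synthetic spectra by linear interpolation, can be sketched as follows (a 1D toy grid over a single "temperature" axis; real grids are multi-dimensional and the values here are made up):

```python
import numpy as np

# Toy grid of model spectra: flux at 3 pixels for each of 3 temperature nodes.
temps = np.array([4000.0, 5000.0, 6000.0])
spectra = np.array([[1.0, 0.8, 0.6],
                    [1.0, 0.7, 0.5],
                    [1.0, 0.6, 0.4]])

def interpolate_spectrum(t):
    """Linearly interpolate each pixel of the grid at temperature t."""
    return np.array([np.interp(t, temps, spectra[:, j])
                     for j in range(spectra.shape[1])])

mid = interpolate_spectrum(4500.0)
print(mid)   # [1.   0.75 0.55]
```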

  11. Inferring network structure from cascades

    Science.gov (United States)

    Ghonge, Sushrut; Vural, Dervis Can

    2017-07-01

    Many physical, biological, and social phenomena can be described by cascades taking place on a network. Often, the activity can be empirically observed, but not the underlying network of interactions. In this paper we offer three topological methods to infer the structure of any directed network given a set of cascade arrival times. Our formulas hold for a very general class of models where the activation probability of a node is a generic function of its degree and the number of its active neighbors. We report high success rates for synthetic and real networks, for several different cascade models.
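    A deliberately simplified version of this setting: score a candidate edge i -> j by how often j activates shortly after i across cascades. This co-activation heuristic only illustrates the inference problem; it is not the paper's estimator:

```python
from collections import defaultdict

# Infer directed edges from cascade arrival times. "window" and "threshold"
# are illustrative tuning choices for this sketch.
def infer_edges(cascades, window=1.5, threshold=0.5):
    """cascades: list of {node: arrival_time}. Return a set of inferred (i, j) edges."""
    hits, trials = defaultdict(int), defaultdict(int)
    for times in cascades:
        for i, ti in times.items():
            for j, tj in times.items():
                if i == j:
                    continue
                trials[(i, j)] += 1
                if 0.0 < tj - ti <= window:
                    hits[(i, j)] += 1
    return {e for e in trials if hits[e] / trials[e] >= threshold}

cascades = [{"a": 0.0, "b": 1.0, "c": 2.0},
            {"a": 0.0, "b": 0.8, "c": 1.9}]
edges = infer_edges(cascades)
print(("a", "b") in edges)   # True: b consistently follows a within the window
```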

  12. SICK: THE SPECTROSCOPIC INFERENCE CRANK

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Andrew R., E-mail: arc@ast.cam.ac.uk [Institute of Astronomy, University of Cambridge, Madingley Road, Cambdridge, CB3 0HA (United Kingdom)

    2016-03-15

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  13. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.
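    The conditional intensity function on which the first approach rests has a simple closed form for the common exponential kernel; a sketch with illustrative parameter values:

```python
import math

# Hawkes conditional intensity with exponential kernel:
# lambda(t) = mu + sum over past events t_k < t of alpha * exp(-beta * (t - t_k)).
# mu, alpha, beta below are illustrative, not estimated values.
def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
    return mu + sum(alpha * math.exp(-beta * (t - tk)) for tk in events if tk < t)

events = [1.0, 2.5, 2.7]
print(hawkes_intensity(0.5, events))  # 0.5: baseline only, no past events yet
# Intensity jumps after each event, then decays back toward the baseline mu.
assert hawkes_intensity(2.71, events) > hawkes_intensity(2.4, events)
```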

  14. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.

  15. Inference in hybrid Bayesian networks

    International Nuclear Information System (INIS)

    Langseth, Helge; Nielsen, Thomas D.; Rumi, Rafael; Salmeron, Antonio

    2009-01-01

    Since the 1980s, Bayesian networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for Boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (the so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.
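    The smallest possible hybrid network, one discrete node with one continuous (Gaussian) child, already shows what such inference computes. In the human-reliability flavour of the abstract: a discrete "operator error" node E and a continuous "response time" Y, with exact inference of P(E | Y = y) by Bayes' rule. All numbers are illustrative assumptions:

```python
import math

def normal_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# P(error | response time y) in a two-node hybrid BN with conditional
# Gaussian child. Prior and Gaussian parameters are made up for illustration.
def posterior_error(y, p_error=0.1, mu_err=12.0, mu_ok=6.0, sigma=2.0):
    num = p_error * normal_pdf(y, mu_err, sigma)
    den = num + (1.0 - p_error) * normal_pdf(y, mu_ok, sigma)
    return num / den

print(posterior_error(11.0) > posterior_error(6.0))  # True: slow responses raise P(error)
```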

  16. SICK: THE SPECTROSCOPIC INFERENCE CRANK

    International Nuclear Information System (INIS)

    Casey, Andrew R.

    2016-01-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  17. Estimation of insurance premiums for coverage against natural disaster risk: an application of Bayesian Inference

    NARCIS (Netherlands)

    Paudel, Y.; Botzen, W.J.W.; Aerts, J.C.J.H.

    2013-01-01

    This study applies Bayesian Inference to estimate flood risk for 53 dyke ring areas in the Netherlands, and focuses particularly on the data scarcity and extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on
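    The Monte Carlo step mentioned above can be sketched as estimating an actuarially fair annual premium as expected damage, i.e. P(flood) times mean damage given a flood. The lognormal damage model and every number below are illustrative assumptions, not the study's dyke-ring data:

```python
import random

# Monte Carlo estimate of expected annual flood damage (a proxy for the
# actuarially fair premium). Damage distribution and parameters are made up.
def expected_annual_damage(p_flood, n_sims, rng):
    total = 0.0
    for _ in range(n_sims):
        if rng.random() < p_flood:                       # a flood occurs this year
            total += rng.lognormvariate(13.0, 1.0)       # damage in euros (toy model)
    return total / n_sims

rng = random.Random(42)
premium = expected_annual_damage(p_flood=0.001, n_sims=200_000, rng=rng)
print(f"fair premium per insured year: about {premium:,.0f} EUR")
```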

  18. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
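    The core idea, randomness as the evidence a sequence provides for a random versus a regular generating process, can be sketched as a log-likelihood ratio. The "regular" process here (repeat the previous symbol with probability 0.9) is an illustrative stand-in, not the paper's model:

```python
import math

# Randomness score for a binary sequence: log P(seq | fair coin) minus
# log P(seq | repetition-favoring process). Higher = seems more random.
def randomness_score(seq, p_repeat=0.9):
    ll_random = len(seq) * math.log(0.5)
    ll_regular = math.log(0.5)                     # first symbol is 50/50
    for prev, cur in zip(seq, seq[1:]):
        ll_regular += math.log(p_repeat if cur == prev else 1.0 - p_repeat)
    return ll_random - ll_regular

# Eight heads in a row scores as far less random than a mixed sequence.
print(randomness_score("HHHHHHHH") < randomness_score("HTTHTHHT"))  # True
```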

  19. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  20. Macromolecular composition of terrestrial and marine organic matter in sediments across the East Siberian Arctic Shelf

    Directory of Open Access Journals (Sweden)

    R. B. Sparkes

    2016-10-01

    Mobilisation of terrestrial organic carbon (terrOC) from permafrost environments in eastern Siberia has the potential to deliver significant amounts of carbon to the Arctic Ocean, via both fluvial and coastal erosion. Eroded terrOC can be degraded during offshore transport or deposited across the wide East Siberian Arctic Shelf (ESAS). Most studies of terrOC on the ESAS have concentrated on solvent-extractable organic matter, but this represents only a small proportion of the total terrOC load. In this study we have used pyrolysis–gas chromatography–mass spectrometry (py-GCMS) to study all major groups of macromolecular components of the terrOC; this is the first time that this technique has been applied to the ESAS. This has shown that there is a strong offshore trend from terrestrial phenols, aromatics and cyclopentenones to marine pyridines. There is good agreement between the proportion of phenols measured using py-GCMS and independent quantification of lignin phenol concentrations (r2 = 0.67, p < 0.01, n = 24). Furfurals, thought to represent carbohydrates, show no offshore trend and are likely found in both marine and terrestrial organic matter. We have also collected new radiocarbon data for bulk OC (14COC) which, when coupled with previous measurements, allows us to produce the most comprehensive 14COC map of the ESAS to date. Combining the 14COC and py-GCMS data suggests that the aromatics group of compounds is likely sourced from old, aged terrOC, in contrast to the phenols group, which is likely sourced from modern woody material. We propose that an index of the relative proportions of phenols and pyridines can be used as a novel terrestrial vs. marine proxy measurement for macromolecular organic matter. Principal component analysis found that various terrestrial vs. marine proxies show different patterns across the ESAS, and it shows that multiple river–ocean transects of surface sediments transition from river-dominated to coastal-erosion-dominated to marine-dominated signatures.

  1. Synthesis of branched polymers under continuous-flow microprocess: an improvement of the control of macromolecular architectures.

    Science.gov (United States)

    Bally, Florence; Serra, Christophe A; Brochon, Cyril; Hadziioannou, Georges

    2011-11-15

    Polymerization reactions can benefit from continuous-flow microprocesses in terms of kinetics control, reactant mixing, or simply efficiency when high-throughput screening experiments are carried out. In this work, we perform for the first time the synthesis of a branched macromolecular architecture through a controlled/'living' polymerization technique in a tubular microreactor. Simply by tuning process parameters, such as the flow rates of the reactants, we generate a library of polymers with various macromolecular characteristics. Compared to a conventional batch process, the polymerization kinetics show a faster initiation step and, more interestingly, an improved branching efficiency. Thanks to the reduced diffusion pathways characteristic of microsystems, it is thus possible to obtain branched polymers exhibiting a denser architecture, and potentially a higher functionality for later applications. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. The 2D Structure of the T. brucei Preedited RPS12 mRNA Is Not Affected by Macromolecular Crowding

    Directory of Open Access Journals (Sweden)

    W.-Matthias Leeder

    2017-01-01

    Mitochondrial transcript maturation in African trypanosomes requires RNA editing to convert sequence-deficient pre-mRNAs into translatable mRNAs. The different pre-mRNAs have been shown to adopt highly stable 2D folds; however, it is not known whether these structures resemble the in vivo folds given the extreme “crowding” conditions within the mitochondrion. Here, we analyze the effects of macromolecular crowding on the structure of the mitochondrial RPS12 pre-mRNA. We use high molecular mass polyethylene glycol as a macromolecular cosolute and monitor the structure of the RNA globally and with nucleotide resolution. We demonstrate that crowding has no impact on the 2D fold and we conclude that the MFE structure in dilute solvent conditions represents a good proxy for the folding of the pre-mRNA in its mitochondrial solvent context.

  3. Distribution and enzymatic activity of heterotrophic bacteria decomposing selected macromolecular compounds in a Baltic Sea sandy beach

    Science.gov (United States)

    Podgórska, B.; Mudryk, Z. J.

    2003-03-01

    The potential capability to decompose macromolecular compounds, and the level of extracellular enzyme activity, were determined in heterotrophic bacteria isolated from a sandy beach in Sopot on the southern Baltic Sea coast. Individual isolates were capable of hydrolysing a wide spectrum of macromolecular organic compounds. Lipids, gelatine, and DNA were hydrolysed most efficiently. Only a very small percentage of strains were able to decompose cellulose, and no pectinolytic bacteria were found. Except for starch hydrolysis, no significant differences in the intensity of organic compound decomposition were recorded between the horizontal and vertical profiles of the studied beach. Of all the studied extracellular enzymes, alkaline phosphatase, esterase lipase, and leucine arylamidase were the most active; in contrast, the activities of α-fucosidase, α-galactosidase and β-glucuronidase were the weakest. The level of extracellular enzyme activity was similar in both sand layers.

  4. Lower complexity bounds for lifted inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2015-01-01

    instances of the model. Numerous approaches for such “lifted inference” techniques have been proposed. While it has been demonstrated that these techniques will lead to significantly more efficient inference on some specific models, there are only very recent and still quite restricted results that show the feasibility of lifted inference on certain syntactically defined classes of models. Lower complexity bounds that imply some limitations for the feasibility of lifted inference on more expressive model classes were established earlier in Jaeger (2000). We show that under the assumption that NETIME ≠ ETIME, there is no polynomial lifted inference algorithm for knowledge bases of weighted, quantifier-, and function-free formulas. Further strengthening earlier results, this is also shown to hold for approximate inference and for knowledge bases not containing ...

  5. Statistical inference for financial engineering

    CERN Document Server

    Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki

    2014-01-01

    This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.
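    As a concrete illustration of one of the estimators the monograph covers, realized volatility can be computed as the square root of the sum of squared intraday log returns. The price path below is hypothetical; this is a minimal sketch of the standard estimator, not code from the book:

```python
import numpy as np

def realized_volatility(prices):
    """Realized volatility over one period: the square root of the sum
    of squared intraday log returns (a standard estimator)."""
    log_returns = np.diff(np.log(prices))
    return float(np.sqrt(np.sum(log_returns ** 2)))

# Hypothetical intraday price path for a single trading day
prices = [100.0, 100.5, 99.8, 100.2, 101.0]
rv = realized_volatility(prices)
```

    With finer intraday sampling, this quantity converges (under standard assumptions) to the integrated variance of the underlying price process.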

  6. Type inference for correspondence types

    DEFF Research Database (Denmark)

    Hüttel, Hans; Gordon, Andy; Hansen, Rene Rydhof

    2009-01-01

    We present a correspondence type/effect system for authenticity in a π-calculus with polarized channels, dependent pair types and effect terms and show how one may, given a process P and an a priori type environment E, generate constraints that are formulae in the Alternating Least Fixed-Point (ALFP) logic. We then show how a reasonable model of the generated constraints yields a type/effect assignment such that P becomes well-typed with respect to E if and only if this is possible. The formulae generated satisfy a finite model property; a system of constraints is satisfiable if and only if it has a finite model. As a consequence, we obtain the result that type/effect inference in our system is polynomial-time decidable....

  7. Causal inference in public health.

    Science.gov (United States)

    Glass, Thomas A; Goodman, Steven N; Hernán, Miguel A; Samet, Jonathan M

    2013-01-01

    Causal inference has a central role in public health; the determination that an association is causal indicates the possibility for intervention. We review and comment on the long-used guidelines for interpreting evidence as supporting a causal association and contrast them with the potential outcomes framework that encourages thinking in terms of causes that are interventions. We argue that in public health this framework is more suitable, providing an estimate of an action's consequences rather than the less precise notion of a risk factor's causal effect. A variety of modern statistical methods adopt this approach. When an intervention cannot be specified, causal relations can still exist, but how to intervene to change the outcome will be unclear. In application, the often-complex structure of causal processes needs to be acknowledged and appropriate data collected to study them. These newer approaches need to be brought to bear on the increasingly complex public health challenges of our globalized world.
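    The potential outcomes framework mentioned above can be illustrated with a toy calculation: if both potential outcomes were observable for every individual (which they never are in practice, hence the need for the estimation methods the review discusses), the average treatment effect would be a simple mean difference. All data below are hypothetical:

```python
def average_treatment_effect(y_treated, y_untreated):
    """Average treatment effect (ATE) under the potential-outcomes
    framework: the mean difference between each individual's outcome
    under the intervention and without it. Toy illustration only;
    real-world estimation must address confounding and missing
    counterfactuals."""
    return sum(a - b for a, b in zip(y_treated, y_untreated)) / len(y_treated)

# Hypothetical fully observed potential outcomes for five individuals
y_treated = [1, 0, 1, 1, 0]
y_untreated = [0, 0, 1, 0, 0]
ate = average_treatment_effect(y_treated, y_untreated)
```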

  8. A facile metal-free "grafting-from" route from acrylamide-based substrate toward complex macromolecular combs

    KAUST Repository

    Zhao, Junpeng

    2013-01-01

    High-molecular-weight poly(N,N-dimethylacrylamide-co-acrylamide) was used as a model functional substrate to investigate phosphazene base (t-BuP4)-promoted metal-free anionic graft polymerization utilizing primary amide moieties as initiating sites. The (co)polymerization of epoxides was proven to be effective, leading to macromolecular combs with side chains being single- or double-graft homopolymer, block copolymer and statistical copolymer. © 2013 The Royal Society of Chemistry.

  9. Macromolecular pHPMA-based nanoparticles with cholesterol for solid tumor targeting: behavior in HSA protein environment

    Czech Academy of Sciences Publication Activity Database

    Zhang, X.; Niebuur, B.-J.; Chytil, Petr; Etrych, Tomáš; Filippov, Sergey K.; Kikhney, A.; Wieland, D. C. F.; Svergun, D. I.; Papadakis, C. M.

    2018-01-01

    Vol. 19, No. 2 (2018), pp. 470-480. ISSN 1525-7797 R&D Projects: GA ČR(CZ) GC15-10527J; GA MZd(CZ) NV16-28594A; GA MŠk(CZ) LO1507 Institutional support: RVO:61389013 Keywords: polymer carriers * N-(2-hydroxypropyl)methacrylamide * tumor targeting Subject RIV: CD - Macromolecular Chemistry OBOR OECD: Polymer science Impact factor: 5.246, year: 2016

  10. C,N-2-[(Dimethylamino)methyl]phenylplatinum Complexes Functionalized with C60 as Macromolecular Building Blocks

    NARCIS (Netherlands)

    Koten, G. van; Meijer, M.D.; Wolf, E. de; Lutz, M.H.; Spek, A.L.; Klink, G.P.M. van

    2001-01-01

    The application of platinum(II) complexes based on the N,N-dimethylbenzylamine ligand (abbreviated as H-C,N) in macromolecular synthesis was demonstrated. Two cationic C,N-platinum moieties were linked with a 4,4'-bipyridine bridge, giving [{C6H4(CH2NMe2)-2-Pt(PPh3)}2(4,4'-bpy)](BF4)2 (2), the

  11. uQlust: combining profile hashing with linear-time ranking for efficient clustering and analysis of big macromolecular data.

    Science.gov (United States)

    Adamczak, Rafal; Meller, Jarek

    2016-12-28

    Advances in computing have enabled current protein and RNA structure prediction and molecular simulation methods to dramatically increase their sampling of conformational spaces. The quickly growing number of experimentally resolved structures, and databases such as the Protein Data Bank, also implies large-scale structural similarity analyses to retrieve and classify macromolecular data. Consequently, the computational cost of structure comparison and clustering for large sets of macromolecular structures has become a bottleneck that necessitates further algorithmic improvements and development of efficient software solutions. uQlust is a versatile and easy-to-use tool for ultrafast ranking and clustering of macromolecular structures. uQlust makes use of structural profiles of proteins and nucleic acids, combining a linear-time algorithm for implicit comparison of all pairs of models with profile hashing to enable efficient clustering of large data sets with a low memory footprint. In addition to ranking and clustering of large sets of models of the same protein or RNA molecule, uQlust can also be used in conjunction with fragment-based profiles in order to cluster structures of arbitrary length. For example, hierarchical clustering of the entire PDB using profile hashing can be performed on a typical laptop, thus opening an avenue for structural explorations previously limited to dedicated resources. The uQlust package is freely available under the GNU General Public License at https://github.com/uQlust. uQlust represents a drastic reduction in computational complexity and memory requirements with respect to existing clustering and model quality assessment methods for macromolecular structure analysis, while yielding results on par with traditional approaches for both proteins and RNAs.
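    The profile-hashing idea summarized above (discretizing structural profiles so that similar models collide into the same hash bucket, which allows grouping in a single linear pass) can be sketched as follows. This is an illustrative toy under invented profile values, not uQlust's actual algorithm or data format:

```python
from collections import defaultdict

def profile_hash_cluster(profiles, bins=4):
    """Group structural profiles by a coarse hash key in linear time.
    Each profile value in [0, 1) is discretized into a small number of
    bins; profiles that agree in every bin land in the same bucket."""
    clusters = defaultdict(list)
    for name, profile in profiles.items():
        key = tuple(min(int(v * bins), bins - 1) for v in profile)
        clusters[key].append(name)
    return dict(clusters)

# Hypothetical per-model profiles (e.g., normalized structural descriptors)
profiles = {
    "model_a": [0.10, 0.80, 0.30],
    "model_b": [0.12, 0.79, 0.33],  # similar to model_a
    "model_c": [0.90, 0.10, 0.70],  # distinct
}
clusters = profile_hash_cluster(profiles)
```

    Because each model is hashed exactly once, the pass is linear in the number of models, avoiding the quadratic cost of explicit all-pairs comparison.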

  12. Macromolecular basis for homocysteine-induced changes in proteoglycan structure in growth and arteriosclerosis.

    Science.gov (United States)

    McCully, K S

    1972-01-01

    Cell culture monolayers deficient in cystathionine synthetase bound more inorganic sulfate than normal cell monolayers during growth to confluence; this was correlated with the production of granular proteoglycan by the abnormal cells and fibrillar proteoglycan by normal cells. Homocysteine was demonstrated to be an active precursor of esterified sulfate, confirming our previous finding of this sulfation pathway in liver. The cell cultures deficient in cystathionine synthetase were found to assume an abnormal cellular distribution on the surface of the culture dish, resembling the distribution assumed by neoplastic cells with loss of contact inhibition; the degree of abnormality of the cellular distribution was correlated with the amount of granular proteoglycan produced by the cells and the amount of inorganic sulfate bound by the cell monolayers. Pyridoxine was found to increase the growth rate of cell cultures from a patient with pyridoxine-responsive homocystinuria and to increase the production of fibrillar proteoglycan by the cells; no effect of pyridoxine was observed in the cell cultures from a patient who failed to respond to pyridoxine therapy. The findings suggest that the change in macromolecular conformation of cellular proteoglycans from fibrillar to granular is due to increased sulfation of the carbohydrate envelope of the molecule. The significance of the findings is related to the pathogenesis of homocystinuria, the phenomenon of contact inhibition, the action of growth hormone, and the initiation of arteriosclerotic plaques.

  13. Lactoferricin B inhibits bacterial macromolecular synthesis in Escherichia coli and Bacillus subtilis.

    Science.gov (United States)

    Ulvatne, Hilde; Samuelsen, Ørjan; Haukland, Hanne H; Krämer, Manuela; Vorland, Lars H

    2004-08-15

    Most antimicrobial peptides have an amphipathic, cationic structure, and an effect on the cytoplasmic membrane of susceptible bacteria has been postulated as the main mode of action. Other mechanisms have been reported, including inhibition of cellular functions by binding to DNA, RNA and proteins, and the inhibition of DNA and/or protein synthesis. Lactoferricin B (Lfcin B), a cationic peptide derived from bovine lactoferrin, exerts slow inhibitory and bactericidal activity and does not lyse susceptible bacteria, indicating a possible intracellular target. In the present study, incorporation of radioactive precursors into DNA, RNA and proteins was used to demonstrate the effects of Lfcin B on macromolecular synthesis in bacteria. In Escherichia coli UC 6782, Lfcin B induces an initial increase in protein and RNA synthesis and a decrease in DNA synthesis. After 10 min, DNA synthesis increases while protein and RNA synthesis decrease significantly. In Bacillus subtilis, however, all macromolecular synthesis is inhibited for at least 20 min, after which RNA synthesis increases. The results presented here show that Lfcin B, at concentrations not sufficient to kill bacterial cells, inhibits incorporation of radioactive precursors into macromolecules in both Gram-positive and Gram-negative bacteria.

  14. About Small Streams and Shiny Rocks: Macromolecular Crystal Growth in Microfluidics

    Science.gov (United States)

    vanderWoerd, Mark; Ferree, Darren; Spearing, Scott; Monaco, Lisa; Molho, Josh; Spaid, Michael; Brasseur, Mike; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    We are developing a novel technique with which we have grown diffraction-quality protein crystals in very small volumes, utilizing chip-based, microfluidic ("LabChip") technology. With this technology, volumes smaller than achievable with any laboratory pipette can be dispensed with high accuracy. We have performed a feasibility study in which we crystallized several proteins with the aid of a LabChip device. The protein crystals are of excellent quality as shown by X-ray diffraction. The advantages of this new technology include improved accuracy of dispensing for small volumes, complete mixing of solution constituents without bubble formation, highly repeatable recipe and growth condition replication, and easy automation of the method. We have designed a first LabChip device specifically for protein crystallization in batch mode and can reliably dispense and mix from a range of solution constituents. We are currently testing this design. Upon completion, additional crystallization techniques, such as vapor diffusion and liquid-liquid diffusion, will be accommodated. Macromolecular crystallization using microfluidic technology is envisioned as a fully automated system, which will use the 'tele-science' concept of remote operation and will be developed into a research facility aboard the International Space Station.

  15. Improved reproducibility of unit-cell parameters in macromolecular cryocrystallography by limiting dehydration during crystal mounting.

    Science.gov (United States)

    Farley, Christopher; Burks, Geoffry; Siegert, Thomas; Juers, Douglas H

    2014-08-01

    In macromolecular cryocrystallography unit-cell parameters can have low reproducibility, limiting the effectiveness of combining data sets from multiple crystals and inhibiting the development of defined repeatable cooling protocols. Here, potential sources of unit-cell variation are investigated and crystal dehydration during loop-mounting is found to be an important factor. The amount of water lost by the unit cell depends on the crystal size, the loop size, the ambient relative humidity and the transfer distance to the cooling medium. To limit water loss during crystal mounting, a threefold strategy has been implemented. Firstly, crystal manipulations are performed in a humid environment similar to the humidity of the crystal-growth or soaking solution. Secondly, the looped crystal is transferred to a vial containing a small amount of the crystal soaking solution. Upon loop transfer, the vial is sealed, which allows transport of the crystal at its equilibrated humidity. Thirdly, the crystal loop is directly mounted from the vial into the cold gas stream. This strategy minimizes the exposure of the crystal to relatively low humidity ambient air, improves the reproducibility of low-temperature unit-cell parameters and offers some new approaches to crystal handling and cryoprotection.

  16. A vibrating membrane bioreactor (VMBR): Macromolecular transmission-influence of extracellular polymeric substances

    DEFF Research Database (Denmark)

    Beier, Søren; Jonsson, Gunnar Eigil

    2009-01-01

    The vibrating membrane bioreactor (VMBR) system facilitates the separation of macromolecules (BSA) from larger biological components (yeast cells) with a relatively high and stable macromolecular transmission at sub-critical flux. This is not possible to achieve with a static, non-vibrating membrane module. A BSA transmission of 74% has been measured in the separation of 4 g/L BSA from 8 g/L dry weight yeast cells in suspension at sub-critical flux (20 L/(m² h)). However, this transmission is lower than the 85% BSA transmission measured for a pure 4 g/L BSA solution. This can be ascribed to the presence of extracellular polymeric substances (EPS) from the yeast cells. The initial fouling rate for constant sub-critical flux filtration of unwashed yeast cells is 3-4 times larger than for washed yeast cells (18 mbar/h versus 5 mbar/h). At sub-critical flux, an EPS transmission...

  17. Synthesis and Self-Assembly of Amphiphilic Triblock Terpolymers with Complex Macromolecular Architecture

    KAUST Repository

    Polymeropoulos, George; Zapsas, George; Hadjichristidis, Nikolaos; Avgeropoulos, Apostolos

    2015-01-01

    Two star triblock terpolymers (PS-b-P2VP-b-PEO)3 and one dendritic-like terpolymer [PS-b-P2VP-b-(PEO)2]3 of PS (polystyrene), P2VP (poly(2-vinylpyridine)), and PEO (poly(ethylene oxide)), never reported before, were synthesized by combining atom transfer radical and anionic polymerizations. The synthesis involves the transformation of the -Br groups of the previously reported Br-terminated 3-arm star diblock copolymers to one or two -OH groups, followed by anionic polymerization of ethylene oxide to afford the star or dendritic structure, respectively. The well-defined structure of the terpolymers was confirmed by static light scattering, size exclusion chromatography, and NMR spectroscopy. The self-assembly in solution and the morphology in bulk of the terpolymers, studied by dynamic light scattering and transmission electron microscopy, respectively, reveal new insights in the phase separation of these materials with complex macromolecular architecture. © 2015 American Chemical Society.

  18. Using support vector machines to improve elemental ion identification in macromolecular crystal structures

    Energy Technology Data Exchange (ETDEWEB)

    Morshed, Nader [University of California, Berkeley, CA 94720 (United States); Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Echols, Nathaniel, E-mail: nechols@lbl.gov [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Adams, Paul D., E-mail: nechols@lbl.gov [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); University of California, Berkeley, CA 94720 (United States)

    2015-05-01

    A method to automatically identify possible elemental ions in X-ray crystal structures has been extended to use support vector machine (SVM) classifiers trained on selected structures in the PDB, with significantly improved sensitivity over manually encoded heuristics. In the process of macromolecular model building, crystallographers must examine electron density for isolated atoms and differentiate sites containing structured solvent molecules from those containing elemental ions. This task requires specific knowledge of metal-binding chemistry and scattering properties and is prone to error. A method has previously been described to identify ions based on manually chosen criteria for a number of elements. Here, the use of support vector machines (SVMs) to automatically classify isolated atoms as either solvent or one of various ions is described. Two data sets of protein crystal structures, one containing manually curated structures deposited with anomalous diffraction data and another with automatically filtered, high-resolution structures, were constructed. On the manually curated data set, an SVM classifier was able to distinguish calcium from manganese, zinc, iron and nickel, as well as all five of these ions from water molecules, with a high degree of accuracy. Additionally, SVMs trained on the automatically curated set of high-resolution structures were able to successfully classify most common elemental ions in an independent validation test set. This method is readily extensible to other elemental ions and can also be used in conjunction with previous methods based on a priori expectations of the chemical environment and X-ray scattering.
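    The classification step described above can be sketched with scikit-learn's SVC. The two site features below (peak density and coordination number) and all numeric values are invented for illustration; they are not the descriptors or training data used in the study:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical two-feature description of an isolated atom site:
# (peak electron density in sigma, number of coordinating atoms).
# Labels: 0 = water, 1 = elemental ion. Synthetic, linearly separable data.
X = np.array([[3.0, 1], [3.5, 2], [4.0, 2], [2.8, 1],    # water-like sites
              [8.0, 6], [9.5, 6], [7.5, 5], [10.0, 4]])  # ion-like sites
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Train a linear-kernel SVM and classify two unseen sites
clf = SVC(kernel="linear").fit(X, y)
pred = clf.predict([[9.0, 6], [3.2, 1]])
```

    In practice the published method trains on curated PDB structures and many more features (scattering properties, chemical environment), but the fit/predict workflow is the same.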

  19. A technique for determining the deuterium/hydrogen contrast map in neutron macromolecular crystallography.

    Science.gov (United States)

    Chatake, Toshiyuki; Fujiwara, Satoru

    2016-01-01

    A difference in the neutron scattering length between hydrogen and deuterium leads to a high density contrast in neutron Fourier maps. In this study, a technique for determining the deuterium/hydrogen (D/H) contrast map in neutron macromolecular crystallography is developed and evaluated using ribonuclease A. The contrast map between the D2O-solvent and H2O-solvent crystals is calculated in real space, rather than in reciprocal space as performed in previous neutron D/H contrast crystallography. The present technique can thus utilize all of the amplitudes of the neutron structure factors for both D2O-solvent and H2O-solvent crystals. The neutron D/H contrast maps clearly demonstrate the powerful detectability of H/D exchange in proteins. In fact, alternative protonation states and alternative conformations of hydroxyl groups are observed at medium resolution (1.8 Å). Moreover, water molecules can be categorized into three types according to their tendency towards rotational disorder. These results directly indicate improvement in the neutron crystal structure analysis. This technique is suitable for incorporation into the standard structure-determination process used in neutron protein crystallography; consequently, more precise and efficient determination of the D-atom positions is possible using a combination of this D/H contrast technique and standard neutron structure-determination protocols.
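    The real-space contrast calculation described above amounts to a voxel-wise subtraction of the two neutron density maps. A schematic sketch follows; normalizing each map to zero mean and unit variance before subtracting is an assumption of this illustration, not necessarily the published protocol:

```python
import numpy as np

def dh_contrast_map(rho_d2o, rho_h2o):
    """Real-space D/H contrast: voxel-wise difference between neutron
    density maps from D2O-solvent and H2O-solvent crystals. Positive
    peaks suggest exchanged (D-occupied) sites, since D has a positive
    and H a negative coherent neutron scattering length."""
    def norm(rho):
        # Put both maps on a comparable scale (illustrative choice)
        return (rho - rho.mean()) / rho.std()
    return norm(rho_d2o) - norm(rho_h2o)

# Hypothetical map grids standing in for real neutron density maps
rng = np.random.default_rng(0)
contrast = dh_contrast_map(rng.normal(size=(4, 4, 4)),
                           rng.normal(size=(4, 4, 4)))
```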

  20. In Vitro and In Vivo Biocompatibility Evaluation of Polyallylamine and Macromolecular Heparin Conjugates Modified Alginate Microbeads.

    Science.gov (United States)

    Vaithilingam, Vijayaganapathy; Steinkjer, Bjørg; Ryan, Liv; Larsson, Rolf; Tuch, Bernard Edward; Oberholzer, Jose; Rokstad, Anne Mari

    2017-09-15

    Host reactivity to biocompatible immunoisolation devices is a major challenge for cellular therapies, and a human screening model would be of great value. We designed new types of surface modified barium alginate microspheres, and evaluated their inflammatory properties using human whole blood, and the intraperitoneal response after three weeks in Wistar rats. Microspheres were modified using proprietary polyallylamine (PAV) and coupled with macromolecular heparin conjugates (Corline Heparin Conjugate, CHC). The PAV-CHC strategy resulted in uniform and stable coatings with increased anti-clot activity and low cytotoxicity. In human whole blood, PAV coating at high dose (100 µg/ml) induced elevated complement, leukocyte CD11b and inflammatory mediators, and in Wistar rats increased fibrotic overgrowth. Coating of high dose PAV with CHC significantly reduced these responses. Low dose PAV (10 µg/ml) ± CHC and unmodified alginate microbeads showed low responses. That the human whole blood inflammatory reactions paralleled the host response shows a link between inflammatory potential and initial fibrotic response. CHC possessed anti-inflammatory activity, but failed to improve overall biocompatibility. We conclude that the human whole blood assay is an efficient first-phase screening model for inflammation, and a guiding tool in development of new generation microspheres for cell encapsulation therapy.

  1. Photochemical internalisation of a macromolecular protein toxin using a cell penetrating peptide-photosensitiser conjugate.

    Science.gov (United States)

    Wang, Julie T-W; Giuntini, Francesca; Eggleston, Ian M; Bown, Stephen G; MacRobert, Alexander J

    2012-01-30

    Photochemical internalisation (PCI) is a site-specific technique for improving cellular delivery of macromolecular drugs. In this study, a cell penetrating peptide, containing the core HIV-1 Tat 48-57 sequence, conjugated with a porphyrin photosensitiser has been shown to be effective for PCI. Herein we report an investigation of the photophysical and photobiological properties of a water soluble bioconjugate of the cationic Tat peptide with a hydrophobic tetraphenylporphyrin derivative. The cellular uptake and localisation of the amphiphilic bioconjugate was examined in the HN5 human head and neck squamous cell carcinoma cell line. Efficient cellular uptake and localisation in endo/lysosomal vesicles was found using fluorescence detection, and light-induced rupture of the vesicles, resulting in a more diffuse intracellular fluorescence distribution, was observed. Conjugation of the Tat sequence with a hydrophobic porphyrin thus enables cellular delivery of an amphiphilic photosensitiser which can then localise in endo/lysosomal membranes, as required for effective PCI treatment. PCI efficacy was tested in combination with a protein toxin, saporin, and a significant reduction in cell viability was measured versus saporin or photosensitiser treatment alone. This study demonstrates that the cell penetrating peptide-photosensitiser bioconjugation strategy is a promising and versatile approach for enhancing the therapeutic potential of bioactive agents through photochemical internalisation. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. The Effect of Attractive Interactions and Macromolecular Crowding on Crystallins Association.

    Directory of Open Access Journals (Sweden)

    Jiachen Wei

    In living systems proteins are typically found in crowded environments where their effective interactions strongly depend on the surrounding medium. Yet, their association and dissociation need to be robustly controlled in order to enable biological function. Uncontrolled protein aggregation often causes disease. For instance, cataract is caused by the clustering of lens proteins, i.e., crystallins, resulting in enhanced light scattering and impaired vision or blindness. To investigate the molecular origins of cataract formation and to design efficient treatments, a better understanding of crystallin association in macromolecular crowded environments is needed. Here we present a theoretical study of simple coarse-grained colloidal models to characterize the general features of how the association equilibrium of proteins depends on the magnitude of intermolecular attraction. By comparing the analytic results to the available experimental data on the osmotic pressure in crystallin solutions, we identify the effective parameter regimes applicable to crystallins. Moreover, the combination of two models allows us to predict that the number of binding sites on crystallin is small, i.e., one to three per protein, which is different from previous estimates. We further observe that the crowding factor is sensitive to the size asymmetry between the reactants and crowding agents, the shape of the protein clusters, and to small variations of intermolecular attraction. Our work may provide general guidelines on how to steer protein interactions in order to control their association.

  3. Endocytic Uptake, Transport and Macromolecular Interactions of Anionic PAMAM Dendrimers within Lung Tissue.

    Science.gov (United States)

    Morris, Christopher J; Aljayyoussi, Ghaith; Mansour, Omar; Griffiths, Peter; Gumbleton, Mark

    2017-12-01

    Polyamidoamine (PAMAM) dendrimers are a promising class of nanocarrier with applications in both small and large molecule drug delivery. Here we report a comprehensive evaluation of the uptake and transport pathways that contribute to the lung disposition of dendrimers. Anionic PAMAM dendrimers and control dextran probes were applied to an isolated perfused rat lung (IPRL) model and lung epithelial monolayers. Endocytosis pathways were examined in primary alveolar epithelial cultures by confocal microscopy. Molecular interactions of dendrimers with protein and lipid lung fluid components were studied using small angle neutron scattering (SANS). Dendrimers were absorbed across the intact lung via a passive, size-dependent transport pathway at rates slower than dextrans of similar molecular sizes. SANS investigations of concentration-dependent PAMAM transport in the IPRL confirmed no aggregation of PAMAMs with either albumin or dipalmitoylphosphatidylcholine lung lining fluid components. Distinct endocytic compartments were identified within primary alveolar epithelial cells and their functionality in the rapid uptake of fluorescent dendrimers and model macromolecular probes was confirmed by co-localisation studies. PAMAM dendrimers display favourable lung biocompatibility but modest lung to blood absorption kinetics. These data support the investigation of dendrimer-based carriers for controlled-release drug delivery to the deep lung.

  4. FitEM2EM--tools for low resolution study of macromolecular assembly and dynamics.

    Directory of Open Access Journals (Sweden)

    Ziv Frankenstein

    Studies of the structure and dynamics of macromolecular assemblies often involve comparison of low-resolution models obtained using different techniques such as electron microscopy or atomic force microscopy. We present new computational tools for comparing (matching) and docking of low-resolution structures, based on shape complementarity. The matched or docked objects are represented by three-dimensional grids where the value of each grid point depends on its position with regard to the interior, surface or exterior of the object. The grids are correlated using fast Fourier transforms, producing either matches of related objects or docking models depending on the details of the grid representations. The procedures incorporate thickening and smoothing of the surfaces of the objects, which effectively compensates for differences in the resolution of the matched/docked objects, circumventing the need for resolution modification. The presented matching tool FitEM2EMin successfully fitted electron microscopy structures obtained at different resolutions, different conformers of the same structure and partial structures, ranking correct matches at the top in every case. The differences between the grid representations of the matched objects can be used to study conformational differences or to characterize the size and shape of substructures. The presented low-to-low docking tool FitEM2EMout ranked the expected models at the top.
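    The FFT-based grid correlation at the heart of such matching procedures can be sketched in a few lines: transforming both grids, multiplying one transform by the conjugate of the other, and inverse-transforming yields the correlation score for every relative shift at once. This is a schematic illustration of the general technique, not FitEM2EM code:

```python
import numpy as np

def fft_correlate(grid_a, grid_b):
    """Correlate two 3-D object grids over all translations via FFTs.
    The product of one transform with the conjugate of the other gives,
    after an inverse transform, the overlap score for every shift."""
    fa = np.fft.fftn(grid_a)
    fb = np.fft.fftn(grid_b)
    return np.real(np.fft.ifftn(fa * np.conj(fb)))

# Toy test object: a 3x3x3 cube of ones inside an 8x8x8 grid.
# Correlating the object with itself should peak at zero shift,
# with a peak value equal to the number of occupied voxels (27).
a = np.zeros((8, 8, 8))
a[2:5, 2:5, 2:5] = 1.0
score = fft_correlate(a, a)
best = np.unravel_index(np.argmax(score), score.shape)
```

    Real matching tools add the interior/surface/exterior weighting and surface smoothing described in the abstract, and search over rotations as well as translations.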

  5. Olfactory nerve transport of macromolecular drugs to the brain. A problem in olfactory impaired patients

    International Nuclear Information System (INIS)

    Shiga, Hideaki; Yamamoto, Junpei; Miwa, Takaki

    2012-01-01

    Nasal administration of macromolecular drugs (including peptides and nanoparticles) has the potential to enable drug delivery beyond the blood-brain barrier (BBB) via olfactory nerve transport. Basic research on drug delivery systems to the brain via nasal administration has been well reported. Insulin-like growth factor-I (IGF-I) is associated with the development and growth of the central nervous system. Clinical application of nasally administered IGF-I is intended to enable drug delivery to the brain through the BBB. Uptake of IGF-I in the olfactory bulb and central nervous system increased with the dosage of nasally administered IGF-I in normal ICR mice, whereas IGF-I uptake in the trigeminal nerve remained unchanged. Olfactory nerve transport is therefore important for the delivery of nasally administered IGF-I to the brain in vivo. Because a safe olfactory nerve tracer has not been clinically available, olfactory nerve transport has not been well studied in humans. Nasal thallium-201 (²⁰¹Tl) administration has been safely used to assess the direct pathway to the brain via the nose in healthy volunteers with a normal olfactory threshold. ²⁰¹Tl olfactory nerve transport has recently been shown to decrease in patients with hyposmia. The olfactory nerve transport function in patients with olfactory disorders will be determined using ²⁰¹Tl olfacto-scintigraphy for the exclusion of candidates in a clinical trial to assess the usefulness of nasal administration of IGF-I. (author)

  6. Macromolecular crowding-assisted fabrication of liquid-crystalline imprinted polymers.

    Science.gov (United States)

    Zhang, Chen; Zhang, Jing; Huang, Yan-Ping; Liu, Zhao-Sheng

    2015-04-01

    A macromolecular crowding-assisted liquid-crystalline molecularly imprinted monolith (LC-MIM) was prepared successfully for the first time. The imprinted stationary phase was synthesized with polymethyl methacrylate (PMMA) or polystyrene (PS) as the crowding agent, 4-cyanophenyl dicyclohexyl propylene (CPCE) as the liquid-crystal monomer, and hydroquinidine as the pseudo-template for the chiral separation of cinchona alkaloids in HPLC. A low level of cross-linker (26%) has been found to be sufficient to achieve molecular recognition on the crowding-assisted LC-MIM due to the physical cross-linking of mesogenic groups in place of chemical cross-linking, and baseline separation of quinidine and quinine could be achieved with good resolution (Rs = 2.96), selectivity factor (α = 2.16), and column efficiency (N = 2650 plates/m). In contrast, the LC-MIM prepared without crowding agents displayed the smallest diastereoselectivity (α = 1.90), while the crowding-assisted MIM with a high level of cross-linker (80%) gave the greatest selectivity factor (α = 7.65) but the lowest column efficiency (N = 177 plates/m).

  7. Quickly Getting the Best Data from Your Macromolecular Crystals with a New Generation of Beamline Instruments

    International Nuclear Information System (INIS)

    Cipriani, Florent; Felisaz, Franck; Lavault, Bernard; Brockhauser, Sandor; Ravelli, Raimond; Launer, Ludovic; Leonard, Gordon; Renier, Michel

    2007-01-01

    While routine macromolecular X-ray (MX) crystallography has relied on well-established techniques for some years, synchrotrons around the world are improving the throughput of their MX beamlines. Third-generation synchrotrons provide small, intense beams that make data collection from 5-10 μm crystals possible. The EMBL/ESRF MX Group in Grenoble has developed a new generation of instruments to easily collect data on 10 μm crystals in an automated environment. This work is part of the Grenoble automation program that enables FedEx-style crystallography using fully automated data collection and web-monitored experiments. Seven ESRF beamlines and the MRC BM14 ESRF/CRG beamline are currently equipped with these latest instruments. We describe here the main features of the MD2x diffractometer family and the SC3 sample changer robot. Although the SC3 was primarily designed to increase the throughput of MX beamlines, it has also been shown to be effective in improving the quality of the data collected. Strategies for screening a large number of crystals, selecting the best, and collecting a full data set from several re-oriented micro-crystals can now be run with minimal time and effort. The MD2x and SC3 instruments are now commercialised by the company ACCEL GmbH.

  8. Macromolecular scaffolding: the relationship between nanoscale architecture and function in multichromophoric arrays for organic electronics.

    Science.gov (United States)

    Palermo, Vincenzo; Schwartz, Erik; Finlayson, Chris E; Liscio, Andrea; Otten, Matthijs B J; Trapani, Sara; Müllen, Klaus; Beljonne, David; Friend, Richard H; Nolte, Roeland J M; Rowan, Alan E; Samorì, Paolo

    2010-02-23

    The optimization of the electronic properties of molecular materials based on optically or electrically active organic building blocks requires a fine-tuning of their self-assembly properties at surfaces. Such a fine-tuning can be obtained on a scale up to 10 nm by mastering principles of supramolecular chemistry, i.e., by using suitably designed molecules interacting via pre-programmed noncovalent forces. The control and fine-tuning on a greater length scale is more difficult and challenging. This Research News highlights recent results we obtained on a new class of macromolecules that possess a very rigid backbone and side chains that point away from this backbone. Each side chain contains an organic semiconducting moiety, whose position and electronic interaction with neighboring moieties are dictated by the central macromolecular scaffold. A combined experimental and theoretical approach has made it possible to unravel the physical and chemical properties of this system across multiple length scales. The (opto)electronic properties of the new functional architectures have been explored by constructing prototypes of field-effect transistors and solar cells, thereby providing direct insight into the relationship between architecture and function.

  9. The structural biology center at the APS: an integrated user facility for macromolecular crystallography

    International Nuclear Information System (INIS)

    Rosenbaum, G.; Westbrook, E.M.

    1997-01-01

    The Structural Biology Center (SBC) has developed and operates a sector (undulator and bending magnet) of the APS as a user facility for macromolecular crystallography. Crystallographically determined structures of proteins, nucleic acids and their complexes with proteins, viruses, and complexes between macromolecules and small ligands have become of central importance in molecular and cellular biology. Major design goals were to make the extremely high brilliance of the APS available for brilliance-limited studies, to achieve a high throughput of less demanding studies, and to optimize for MAD phasing. Crystal samples will include extremely small crystals, crystals with large unit cells (viruses, ribosomes, etc.) and ensembles of closely similar crystal structures for drug design, protein engineering, etc. Data are recorded on a 3000x3000 pixel CCD area detector (optionally on image plates). The X-ray optics of both beamlines have been designed to produce a highly demagnified image of the source in order to match the focal size with the sizes of the sample and the resolution element of the detector. Vertical focusing is achieved by a flat, cylindrically bent mirror. Horizontal focusing is achieved by sagittally bending the second crystal of the double-crystal monochromator. Monochromatic fluxes of 1.3 x 10^13 ph/s into focal sizes of 0.08 mm (horizontal) x 0.04 mm (vertical) FWHM (flux density 3.5 x 10^15 ph/s/mm^2) have been recorded. copyright 1997 American Institute of Physics

  10. Rapid automated superposition of shapes and macromolecular models using spherical harmonics.

    Science.gov (United States)

    Konarev, Petr V; Petoukhov, Maxim V; Svergun, Dmitri I

    2016-06-01

    A rapid algorithm to superimpose macromolecular models in Fourier space is proposed and implemented (SUPALM). The method uses a normalized integrated cross-term of the scattering amplitudes as a proximity measure between two three-dimensional objects. The reciprocal-space algorithm allows for direct matching of heterogeneous objects including high- and low-resolution models represented by atomic coordinates, beads or dummy residue chains, as well as electron microscopy density maps and inhomogeneous multi-phase models (e.g. of protein-nucleic acid complexes). Using spherical harmonics for the computation of the amplitudes, the method is up to an order of magnitude faster than the real-space algorithm implemented in SUPCOMB by Kozin & Svergun [J. Appl. Cryst. (2001), 34, 33-41]. The utility of the new method is demonstrated in a number of test cases and compared with the results of SUPCOMB. The spherical harmonics algorithm is best suited for low-resolution shape models, e.g. those provided by solution scattering experiments, but also facilitates a rapid cross-validation against structural models obtained by other methods.
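
    The proximity measure described above can be illustrated with a toy calculation: a normalized cross-term of the Fourier amplitudes of two density maps. This sketch omits the rotational/translational search (which SUPALM performs via spherical harmonics) and simply scores two maps in a fixed orientation; the grids and random densities are invented for illustration.

```python
import numpy as np

def fourier_overlap(rho1, rho2):
    """Normalized cross-term of Fourier amplitudes of two density maps.
    Returns ~1.0 for identical maps; by Cauchy-Schwarz the score is <= 1."""
    A = np.fft.fftn(rho1)
    B = np.fft.fftn(rho2)
    cross = np.real(np.sum(A * np.conj(B)))
    norm = np.sqrt(np.sum(np.abs(A) ** 2) * np.sum(np.abs(B) ** 2))
    return cross / norm

# Two toy 3D "shapes" sampled on a grid.
rng = np.random.default_rng(0)
rho = rng.random((16, 16, 16))
print(fourier_overlap(rho, rho))   # ~1.0 for identical objects
```

By Parseval's theorem this score equals the normalized real-space correlation of the two maps; the advantage of the reciprocal-space formulation is that amplitudes can be expanded in spherical harmonics for fast rotational matching.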

  11. Synthesis and Self-Assembly of Amphiphilic Triblock Terpolymers with Complex Macromolecular Architecture

    KAUST Repository

    Polymeropoulos, George

    2015-11-25

    Two star triblock terpolymers (PS-b-P2VP-b-PEO)3 and one dendritic-like terpolymer [PS-b-P2VP-b-(PEO)2]3 of PS (polystyrene), P2VP (poly(2-vinylpyridine)), and PEO (poly(ethylene oxide)), never reported before, were synthesized by combining atom transfer radical and anionic polymerizations. The synthesis involves the transformation of the -Br groups of the previously reported Br-terminated 3-arm star diblock copolymers to one or two -OH groups, followed by anionic polymerization of ethylene oxide to afford the star or dendritic structure, respectively. The well-defined structure of the terpolymers was confirmed by static light scattering, size exclusion chromatography, and NMR spectroscopy. The self-assembly in solution and the morphology in bulk of the terpolymers, studied by dynamic light scattering and transmission electron microscopy, respectively, reveal new insights in the phase separation of these materials with complex macromolecular architecture. © 2015 American Chemical Society.

  12. Inference Attacks and Control on Database Structures

    Directory of Open Access Journals (Sweden)

    Muhamed Turkanovic

    2015-02-01

    Full Text Available Today’s databases store information with sensitivity levels that range from public to highly sensitive; hence ensuring confidentiality can be highly important, but it also requires costly control. This paper focuses on the inference problem in different database structures. It presents possible threats to privacy related to inference, and control methods for mitigating these threats. The paper shows that using only access control, without any inference control, is inadequate, since such models are unable to protect against indirect data access. Furthermore, it covers new inference problems which arise from the dimensions of new technologies like XML, semantics, etc.
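
    The point that access control alone cannot stop indirect data access can be illustrated with a classic differencing attack; the table and values below are invented for illustration:

```python
# Toy illustration of an inference attack that defeats pure access
# control: the attacker may only run aggregate queries (no single-row
# access), yet recovers an individual's sensitive value by differencing.

salaries = {"alice": 52000, "bob": 61000, "carol": 58000}

def aggregate_sum(exclude=None):
    """The only permitted query: an aggregate, never a single row."""
    return sum(v for k, v in salaries.items() if k != exclude)

# Two legal aggregate queries, one inference:
leaked_bob = aggregate_sum() - aggregate_sum(exclude="bob")
print(leaked_bob)  # 61000 -- Bob's salary, obtained indirectly
```

Inference control methods counter exactly this: query auditing, minimum query-set sizes, or noise addition, rather than per-row access checks.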

  13. Predictive Mechanical Characterization of Macro-Molecular Material Chemistry Structures of Cement Paste at Nano Scale - Two-phase Macro-Molecular Structures of Calcium Silicate Hydrate, Tri-Calcium Silicate, Di-Calcium Silicate and Calcium Hydroxide

    Science.gov (United States)

    Padilla Espinosa, Ingrid Marcela

    Concrete is a hierarchical composite material with a random structure over a wide range of length scales. At submicron length scale the main component of concrete is cement paste, formed by the reaction of Portland cement clinkers and water. Cement paste acts as a binding matrix for the other components and is responsible for the strength of concrete. Cement paste microstructure contains voids, hydrated and unhydrated cement phases. The main crystalline phases of unhydrated cement are tri-calcium silicate (C3S) and di-calcium silicate (C2S), and of hydrated cement are calcium silicate hydrate (CSH) and calcium hydroxide (CH). Although efforts have been made to comprehend the chemical and physical nature of cement paste, studies at molecular level have primarily been focused on individual components. Present research focuses on the development of a method to model, at molecular level, and analysis of the two-phase combination of hydrated and unhydrated phases of cement paste as macromolecular systems. Computational molecular modeling could help in understanding the influence of the phase interactions on the material properties, and mechanical performance of cement paste. Present work also strives to create a framework for molecular level models suitable for potential better comparisons with low length scale experimental methods, in which the sizes of the samples involve the mixture of different hydrated and unhydrated crystalline phases of cement paste. Two approaches based on two-phase cement paste macromolecular structures, one involving admixed molecular phases, and the second involving cluster of two molecular phases are investigated. The mechanical properties of two-phase macromolecular systems of cement paste consisting of key hydrated phase CSH and unhydrated phases C3S or C2S, as well as CSH with the second hydrated phase CH were calculated. It was found that these cement paste two-phase macromolecular systems predicted an isotropic material behavior. 

  14. Damage analysis: damage function development and application

    International Nuclear Information System (INIS)

    Simons, R.L.; Odette, G.R.

    1975-01-01

    The derivation and application of damage functions, including recent developments for the U.S. LMFBR and CTR programs, are reviewed. A primary application of damage functions is in predicting component life expectancies, i.e., the fluence required in a service spectrum to attain a specified design property change. An important part of the analysis is the estimation of the uncertainty in such fluence limit predictions. The status of standardizing the procedures for the derivation and application of damage functions is discussed. Improvements in several areas of damage function development are needed before standardization can be completed. These include increasing the quantity and quality of the data used in the analysis, determining the limitations of the analysis due to the presence of multiple damage mechanisms, and, finally, testing damage function predictions against data obtained from material surveillance programs in operating thermal and fast reactors. 23 references. (auth)
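
    The life-expectancy prediction described above can be sketched as a spectral average: a damage function G(E) weights the service flux spectrum φ(E), and the fluence limit is the exposure at which the predicted property change reaches a design limit. The group structure and all numbers below are invented for illustration, not taken from the review:

```python
# Hedged sketch of a damage-function calculation on a 3-group spectrum.
# G gives the damage effectiveness per energy group; phi the group fluxes.

def spectrum_averaged_damage(G, phi):
    """Spectrum-averaged damage effectiveness <G> = sum(G*phi)/sum(phi)."""
    return sum(g * p for g, p in zip(G, phi)) / sum(phi)

def fluence_limit(G, phi, design_limit):
    """Fluence at which the predicted property change reaches the design
    limit, assuming the change scales as <G> * fluence."""
    return design_limit / spectrum_averaged_damage(G, phi)

G = [0.1, 0.5, 2.0]        # damage effectiveness per group (illustrative)
phi = [1e14, 5e13, 1e13]   # group fluxes, n/cm^2/s (illustrative)

print(spectrum_averaged_damage(G, phi))      # ~0.34375
print(fluence_limit(G, phi, design_limit=1.0))
```

Uncertainty in the fluence limit then follows from propagating uncertainties in G and φ through these expressions, which is the estimation problem the review emphasizes.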

  15. Plasma membrane damage detected by nucleic acid leakage

    International Nuclear Information System (INIS)

    Fortunati, E.; Bianchi, V.

    1989-01-01

    Among the indicators of membrane damage, the leakage of intracellular components into the medium is the most directly related to perturbations of the membrane's molecular organization. The extent of the damage can be evaluated from the size of the released components. We have designed a protocol for the detection of membrane leakage based on the preincubation of cells with tritiated adenine for 24 h, followed by a 24-h chase in nonradioactive medium. The treatment takes place when the distribution of the precursor among its end products has reached a plateau, so that the differences in radioactivity in the fractions obtained from the control and treated cultures (medium, nucleotide pool, RNA, DNA) correspond to actual quantitative variations induced by the test chemical. Aliquots of the medium are processed to determine what percentage of the released material is macromolecular, in order to distinguish between mild and severe membrane damage. The origin of the extracellular radioactivity can be recognized from the variations in RNA counts in the treated cells. DNA radioactivity is used to evaluate the number of cells that remain attached to the plates under the different treatment conditions. By this means, generalized permeabilization of membranes to macromolecules is distinguished from complete solubilization of only a subpopulation of cells. We present some examples of application of the protocol with detergents (LAS, SDS, Triton X-100) and with Cr(VI), which damages cell membranes by a different mechanism of action
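
    The bookkeeping behind the protocol can be sketched in a few lines: released radioactivity is expressed as a fraction of the total, and its macromolecular share is used to grade the damage. The counts and the severity threshold below are invented for illustration, not the paper's data or criteria:

```python
# Toy sketch of leakage accounting. All counts (in cpm) are illustrative,
# and the 0.5 threshold for "severe" is an arbitrary example value.

def leakage_profile(medium_total, medium_macromolecular, cell_total):
    """Return (fraction released, macromolecular fraction of the release,
    a toy severity label)."""
    released = medium_total / (medium_total + cell_total)
    macro_fraction = medium_macromolecular / medium_total
    severity = "severe" if macro_fraction > 0.5 else "mild"
    return released, macro_fraction, severity

rel, mac, sev = leakage_profile(medium_total=2000.0,
                                medium_macromolecular=1400.0,
                                cell_total=8000.0)
print(rel, mac, sev)  # 0.2 0.7 severe
```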

  16. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptibility loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing packages require specifically formatted input files and generate output files of various types, which is inconvenient in practice. We developed a tool set, the Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input formats, as well as standardize and summarize inference results, for four popular local ancestry inference packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that it makes running multiple local ancestry inference packages convenient. In addition, we evaluated the performance of the supported packages, mainly focusing on inference accuracy and the computational resources used. We provide a toolkit that facilitates the use of local ancestry inference software, especially for users with a limited bioinformatics background.

  17. Forward and backward inference in spatial cognition.

    Directory of Open Access Journals (Sweden)

    Will D Penny

    Full Text Available This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
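
    The "optimally combine" step described above, fusing a path-integration estimate with a sensory estimate of location, is a precision-weighted (inverse-variance) combination of two Gaussian beliefs. A minimal sketch with invented numbers:

```python
# Precision-weighted fusion of two independent Gaussian location
# estimates (e.g. path integration vs. sensory input). Values invented.

def fuse(mu1, var1, mu2, var2):
    """Return the posterior mean and variance of the fused estimate."""
    p1, p2 = 1.0 / var1, 1.0 / var2        # precisions
    var = 1.0 / (p1 + p2)                  # fused variance shrinks
    mu = var * (p1 * mu1 + p2 * mu2)       # precision-weighted mean
    return mu, var

mu, var = fuse(mu1=10.0, var1=4.0, mu2=12.0, var2=1.0)
print(mu, var)  # ~11.6, 0.8 -- pulled toward the more precise estimate
```

Note that the fused variance is smaller than either input variance, which is why combining path integration with sensory input improves localization.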

  18. Generative Inferences Based on Learned Relations

    Science.gov (United States)

    Chen, Dawn; Lu, Hongjing; Holyoak, Keith J.

    2017-01-01

    A key property of relational representations is their "generativity": From partial descriptions of relations between entities, additional inferences can be drawn about other entities. A major theoretical challenge is to demonstrate how the capacity to make generative inferences could arise as a result of learning relations from…

  19. Inference in models with adaptive learning

    NARCIS (Netherlands)

    Chevillon, G.; Massmann, M.; Mavroeidis, S.

    2010-01-01

    Identification of structural parameters in models with adaptive learning can be weak, causing standard inference procedures to become unreliable. Learning also induces persistent dynamics, and this makes the distribution of estimators and test statistics non-standard. Valid inference can be

  20. Fiducial inference - A Neyman-Pearson interpretation

    NARCIS (Netherlands)

    Salome, D; VonderLinden, W; Dose,; Fischer, R; Preuss, R

    1999-01-01

    Fisher's fiducial argument is a tool for deriving inferences in the form of a probability distribution on the parameter space, not based on Bayes's Theorem. Lindley established that in exceptional situations fiducial inferences coincide with posterior distributions; in the other situations fiducial

  1. Uncertainty in prediction and in inference

    NARCIS (Netherlands)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in

  2. Causal inference in economics and marketing.

    Science.gov (United States)

    Varian, Hal R

    2016-07-05

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual: a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.
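
    One simple version of the counterfactual estimate described above is to fit an outcome model on untreated units only and use it to predict what a treated unit would have done without treatment. The least-squares model and all data below are invented for illustration, not from the article:

```python
# Two-model counterfactual sketch: learn y(x) from control units, then
# predict the treated unit's untreated outcome. Data are illustrative.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Control units: (covariate, outcome) pairs lying on y = 2x.
ctrl_x, ctrl_y = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
a, b = fit_line(ctrl_x, ctrl_y)

treated_x, treated_y = 2.5, 9.0
counterfactual = a + b * treated_x        # predicted outcome w/o treatment
effect = treated_y - counterfactual       # estimated treatment effect
print(counterfactual, effect)  # 5.0 4.0
```

In practice the outcome model would be a flexible machine learning method rather than a line, which is exactly the improvement the article anticipates.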

  3. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  4. The Impact of Disablers on Predictive Inference

    Science.gov (United States)

    Cummins, Denise Dellarosa

    2014-01-01

    People consider alternative causes when deciding whether a cause is responsible for an effect (diagnostic inference) but appear to neglect them when deciding whether an effect will occur (predictive inference). Five experiments were conducted to test a 2-part explanation of this phenomenon: namely, (a) that people interpret standard predictive…

  5. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  6. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...

  7. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist
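
    One concrete instance of the exponential-model predictions described above: with a gamma(a, b) prior on the failure rate and n observed failures in total time T, the posterior is gamma(a + n, b + T), and the next failure time then follows a Lomax (Pareto type II) predictive distribution. The parameter values below are invented for illustration:

```python
# Predictive distribution of the next failure time under an exponential
# failure model with a gamma prior on the rate. Numbers are illustrative.

def posterior(a, b, n, total_time):
    """Gamma prior (a, b) updated by n failures over total_time."""
    return a + n, b + total_time

def pred_survival(t, a_post, b_post):
    """P(next failure time > t) = (b/(b + t))^a under the Lomax
    predictive obtained by integrating the rate out of the model."""
    return (b_post / (b_post + t)) ** a_post

a_post, b_post = posterior(a=2.0, b=100.0, n=8, total_time=400.0)
print(pred_survival(50.0, a_post, b_post))   # P(T > 50 h), illustrative
```

The number of failures in a fixed window and the time to a given number of failures, also treated in the report, follow from the same posterior by analogous integrations.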

  8. Reinforcement learning or active inference?

    Science.gov (United States)

    Friston, Karl J; Daunizeau, Jean; Kiebel, Stefan J

    2009-07-29

    This paper questions the need for reinforcement learning or control theory when optimising behaviour. We show that it is fairly simple to teach an agent complicated and adaptive behaviours using a free-energy formulation of perception. In this formulation, agents adjust their internal states and sampling of the environment to minimize their free-energy. Such agents learn causal structure in the environment and sample it in an adaptive and self-supervised fashion. This results in behavioural policies that reproduce those optimised by reinforcement learning and dynamic programming. Critically, we do not need to invoke the notion of reward, value or utility. We illustrate these points by solving a benchmark problem in dynamic programming; namely the mountain-car problem, using active perception or inference under the free-energy principle. The ensuing proof-of-concept may be important because the free-energy formulation furnishes a unified account of both action and perception and may speak to a reappraisal of the role of dopamine in the brain.

  9. Reinforcement learning or active inference?

    Directory of Open Access Journals (Sweden)

    Karl J Friston

    2009-07-01

    Full Text Available This paper questions the need for reinforcement learning or control theory when optimising behaviour. We show that it is fairly simple to teach an agent complicated and adaptive behaviours using a free-energy formulation of perception. In this formulation, agents adjust their internal states and sampling of the environment to minimize their free-energy. Such agents learn causal structure in the environment and sample it in an adaptive and self-supervised fashion. This results in behavioural policies that reproduce those optimised by reinforcement learning and dynamic programming. Critically, we do not need to invoke the notion of reward, value or utility. We illustrate these points by solving a benchmark problem in dynamic programming; namely the mountain-car problem, using active perception or inference under the free-energy principle. The ensuing proof-of-concept may be important because the free-energy formulation furnishes a unified account of both action and perception and may speak to a reappraisal of the role of dopamine in the brain.

  10. Active inference and epistemic value.

    Science.gov (United States)

    Friston, Karl; Rigoli, Francesco; Ognibene, Dimitri; Mathys, Christoph; Fitzgerald, Thomas; Pezzulo, Giovanni

    2015-01-01

    We offer a formal treatment of choice behavior based on the premise that agents minimize the expected free energy of future outcomes. Crucially, the negative free energy or quality of a policy can be decomposed into extrinsic and epistemic (or intrinsic) value. Minimizing expected free energy is therefore equivalent to maximizing extrinsic value or expected utility (defined in terms of prior preferences or goals), while maximizing information gain or intrinsic value (or reducing uncertainty about the causes of valuable outcomes). The resulting scheme resolves the exploration-exploitation dilemma: Epistemic value is maximized until there is no further information gain, after which exploitation is assured through maximization of extrinsic value. This is formally consistent with the Infomax principle, generalizing formulations of active vision based upon salience (Bayesian surprise) and optimal decisions based on expected utility and risk-sensitive (Kullback-Leibler) control. Furthermore, as with previous active inference formulations of discrete (Markovian) problems, ad hoc softmax parameters become the expected (Bayes-optimal) precision of beliefs about, or confidence in, policies. This article focuses on the basic theory, illustrating the ideas with simulations. A key aspect of these simulations is the similarity between precision updates and dopaminergic discharges observed in conditioning paradigms.
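
    The decomposition described above is commonly written as follows (for a policy π, with approximate posterior Q and prior preferences encoded in P(o); notation follows the active inference literature):

```latex
G(\pi)
= \underbrace{-\,\mathbb{E}_{Q(o\mid\pi)}\!\big[\ln P(o)\big]}_{\text{expected cost (negative extrinsic value)}}
\;-\;
\underbrace{\mathbb{E}_{Q(o\mid\pi)}\!\Big[D_{\mathrm{KL}}\big(Q(s\mid o,\pi)\,\|\,Q(s\mid\pi)\big)\Big]}_{\text{expected information gain (epistemic value)}}
```

Minimizing G therefore maximizes extrinsic value (expected utility) plus epistemic value (expected information gain), which is the resolution of the exploration-exploitation dilemma the abstract describes.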

  11. Ancient Biomolecules and Evolutionary Inference.

    Science.gov (United States)

    Cappellini, Enrico; Prohaska, Ana; Racimo, Fernando; Welker, Frido; Pedersen, Mikkel Winther; Allentoft, Morten E; de Barros Damgaard, Peter; Gutenbrunner, Petra; Dunne, Julie; Hammann, Simon; Roffet-Salque, Mélanie; Ilardo, Melissa; Moreno-Mayar, J Víctor; Wang, Yucheng; Sikora, Martin; Vinner, Lasse; Cox, Jürgen; Evershed, Richard P; Willerslev, Eske

    2018-04-25

    Over the last decade, studies of ancient biomolecules, particularly ancient DNA, proteins, and lipids, have revolutionized our understanding of evolutionary history. Though initially fraught with many challenges, the field now stands on firm foundations. Researchers now successfully retrieve nucleotide and amino acid sequences, as well as lipid signatures, from progressively older samples, originating from geographic areas and depositional environments that, until recently, were regarded as hostile to long-term preservation of biomolecules. Sampling frequencies and the spatial and temporal scope of studies have also increased markedly, and with them the size and quality of the data sets generated. This progress has been made possible by continuous technical innovations in analytical methods, enhanced criteria for the selection of ancient samples, integrated experimental methods, and advanced computational approaches. Here, we discuss the history and current state of ancient biomolecule research, its applications to evolutionary inference, and future directions for this young and exciting field. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  12. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  13. EI: A Program for Ecological Inference

    Directory of Open Access Journals (Sweden)

    Gary King

    2004-09-01

    Full Text Available The program EI provides a method of inferring individual behavior from aggregate data. It implements the statistical procedures, diagnostics, and graphics from the book A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data (King 1997). Ecological inference, as traditionally defined, is the process of using aggregate (i.e., "ecological") data to infer discrete individual-level relationships of interest when individual-level data are not available. Ecological inferences are required in political science research when individual-level surveys are unavailable (e.g., local or comparative electoral politics), unreliable (racial politics), insufficient (political geography), or infeasible (political history). They are also required in numerous areas of major significance in public policy (e.g., for applying the Voting Rights Act) and in academic disciplines ranging from epidemiology and marketing to sociology and quantitative history.
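
    A deterministic building block of ecological inference (the classical "method of bounds", which King's approach combines with a statistical model) can be sketched directly: the aggregate margins alone bound the unknown group-specific rate. The numbers below are invented for illustration:

```python
# Deterministic bounds on a group-specific rate from aggregate data.
# X: the group's share of the unit's population; T: the unit-wide rate.

def bounds(X, T):
    """Bounds on beta, the rate within a group forming fraction X of a
    unit whose overall rate is T (classical method-of-bounds result)."""
    lo = max(0.0, (T - (1.0 - X)) / X)
    hi = min(1.0, T / X)
    return lo, hi

print(bounds(X=0.7, T=0.5))   # -> (~0.286, ~0.714): informative
print(bounds(X=0.4, T=0.5))   # -> (0.0, 1.0): margins alone say nothing
```

Statistical ecological inference, as implemented in EI, narrows these intervals by borrowing strength across units under distributional assumptions.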

  14. Macromolecular crystallographic results obtained using a 2048x2048 CCD detector at CHESS

    International Nuclear Information System (INIS)

    Thiel, D.J.; Ealick, S.E.; Tate, M.W.; Gruner, S.M.; Eikenberry, E.F.

    1996-01-01

    We present results of macromolecular crystallographic experiments performed at the Cornell High Energy Synchrotron Source (CHESS) with a new CCD-based detector. This detector, installed in January 1995, complements a 1024x1024 CCD detector that has been in continuous operation at CHESS since December 1993. The new detector is based on a 4-port, 2048x2048 pixel CCD that is directly coupled to a Gd2O2S:Tb phosphor by a 3:1 tapered fiber optic. The active area of the phosphor is a square 82 mm on an edge. The readout time is 7 seconds. In the standard mode of operation, the pixel size at the active area is 41 μm on the edge, leading to the capability of resolving approximately 200 orders of diffraction across the detector face. The detector also operates in a 1024x1024 mode in which the pixel area is electronically increased by a factor of 4, resulting in smaller data files and faster detector readout, but at the expense of spatial resolution. Most of the data collected with this detector have been acquired in this mode. Dozens of data sets were collected by many experimenters using this detector at CHESS during the four-month period from its installation until the start of the six-month down period of the storage ring. The capabilities of the detector are illustrated with results from various crystallographic measurements, including experiments in which the recorded diffraction patterns extend in resolution as far as 1 Å. The results demonstrate that this detector is capable of collecting data of quality at least equal to that of imaging plates but, in many circumstances, with much greater beamline efficiency. copyright 1996 American Institute of Physics

  15. Interpretation of ensembles created by multiple iterative rebuilding of macromolecular models

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Adams, Paul D.; Moriarty, Nigel W.; Zwart, Peter; Read, Randy J.; Turk, Dusan; Hung, Li-Wei

    2007-01-01

    Heterogeneity in ensembles generated by independent model rebuilding principally reflects the limitations of the data and of the model-building process rather than the diversity of structures in the crystal. Automation of iterative model building, density modification and refinement in macromolecular crystallography has made it feasible to carry out this entire process multiple times. By using different random seeds in the process, a number of different models compatible with experimental data can be created. Sets of models were generated in this way using real data for ten protein structures from the Protein Data Bank and using synthetic data generated at various resolutions. Most of the heterogeneity among models produced in this way is in the side chains and loops on the protein surface. Possible interpretations of the variation among models created by repetitive rebuilding were investigated. Synthetic data were created in which a crystal structure was modelled as the average of a set of ‘perfect’ structures and the range of models obtained by rebuilding a single starting model was examined. The standard deviations of coordinates in models obtained by repetitive rebuilding at high resolution are small, while those obtained for the same synthetic crystal structure at low resolution are large, so that the diversity within a group of models cannot generally be a quantitative reflection of the actual structures in a crystal. Instead, the group of structures obtained by repetitive rebuilding reflects the precision of the models, and the standard deviation of coordinates of these structures is a lower bound estimate of the uncertainty in coordinates of the individual models
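
    The paper's central quantity, the per-atom coordinate spread across an ensemble of independently rebuilt models, is straightforward to compute. The sketch below is a minimal illustration with invented toy coordinates, not the authors' code: it takes the RMS deviation of each atom from the ensemble mean position, which the study interprets as a lower-bound estimate of coordinate uncertainty.

```python
import numpy as np

def per_atom_spread(models):
    """models: array-like of shape (n_models, n_atoms, 3).
    Returns, per atom, the RMS deviation from the ensemble mean position."""
    models = np.asarray(models, dtype=float)
    mean = models.mean(axis=0)        # (n_atoms, 3) ensemble-average positions
    dev = models - mean               # per-model displacement vectors
    return np.sqrt((dev ** 2).sum(axis=2).mean(axis=0))

# Three toy "rebuilt models" of a two-atom structure (coordinates in Å):
ensemble = [
    [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]],
    [[0.1, 0.0, 0.0], [10.0, 0.5, 0.0]],
    [[-0.1, 0.0, 0.0], [10.0, -0.5, 0.0]],
]
spread = per_atom_spread(ensemble)
print(spread)  # second atom varies more across rebuilds than the first
```

In the paper's terms, a small spread at high resolution reflects model precision, not the true conformational diversity in the crystal.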

  16. A Study of the Concentration Dependence of Macromolecular Diffusion Using Photon Correlation Spectroscopy.

    Science.gov (United States)

    Marlowe, Robert Lloyd

    The dynamic light scattering technique of photon correlation spectroscopy has been used to investigate the dependence of the mutual diffusion coefficient of a macromolecular system upon concentration. The first part of the research was devoted to the design and construction of a single-clipping autocorrelator based on newly-developed integrated circuits. The resulting 128-channel instrument can perform real-time autocorrelation for sample time intervals ≥10 μs, and batch-processed autocorrelation for intervals down to 3 μs. An improved design for a newer, all-digital autocorrelator is given. Homodyne light scattering experiments were then undertaken on monodisperse solutions of polystyrene spheres. The single-mode TEM00 beam of an argon-ion laser (λ = 5145 Å) was used as the light source; all solutions were studied at room temperature. The scattering angle was varied from 30° to 110°. Excellent agreement with the manufacturer's specification for the particle size was obtained from the photon correlation studies. Finally, aqueous solutions of the globular protein ovalbumin, ranging in concentration from 18.9 to 244.3 mg/ml, were illuminated under the same conditions of temperature and wavelength as before; the homodyne scattered light was detected at a fixed scattering angle of 30°. The single-clipped photocount autocorrelation function was analyzed using the homodyne exponential integral method of Meneely et al. The resulting diffusion coefficients showed a general linear dependence upon concentration, as predicted by the generalized Stokes-Einstein equation. However, a clear peak in the data was evident at c ≈ 100 mg/ml, which could not be explained on the basis of a non-interacting particle theory. A semi-quantitative approach based on the Debye-Hückel theory of electrostatic interactions is suggested as the probable cause for the peak's rise, and an excluded volume effect for its decline.
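
    The analysis chain behind such an experiment can be sketched in a few lines. Assuming a homodyne correlation function g2(τ) = 1 + β·exp(−2Γτ) with Γ = Dq², the fitted decay rate yields the diffusion coefficient, and Stokes-Einstein converts it to a hydrodynamic radius. The decay rate, temperature, and viscosity below are illustrative assumptions, not the thesis's measured values; the wavelength and angle match those quoted in the abstract.

```python
import math

k_B = 1.380649e-23      # J/K, Boltzmann constant
T = 293.15              # K, room temperature (assumed)
eta = 1.0e-3            # Pa·s, viscosity of water (assumed)
n = 1.33                # refractive index of water
lam = 514.5e-9          # m, argon-ion line (5145 Å)
theta = math.radians(30.0)  # scattering angle used for the protein runs

# Scattering vector magnitude:
q = 4.0 * math.pi * n / lam * math.sin(theta / 2.0)   # 1/m

# Homodyne decay: g2(tau) = 1 + beta*exp(-2*Gamma*tau), Gamma = D*q^2.
Gamma = 2.0e3            # 1/s, assumed decay rate from an exponential fit
D = Gamma / q ** 2       # m^2/s, mutual diffusion coefficient
R_h = k_B * T / (6.0 * math.pi * eta * D)  # Stokes-Einstein hydrodynamic radius

print(D, R_h)
```

The concentration dependence studied in the thesis would then appear as a drift of D with c after repeating this extraction at each concentration.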

  17. Novel use for polyvinylpyrrolidone as a macromolecular crowder for enhanced extracellular matrix deposition and cell proliferation.

    Science.gov (United States)

    Rashid, Rafi; Lim, Natalie Sheng Jie; Chee, Stella Min Ling; Png, Si Ning; Wohland, Thorsten; Raghunath, Michael

    2014-12-01

    Macromolecular crowding (MMC) is a biophysical effect that governs biochemical processes inside and outside of cells. Since standard cell culture media lack this effect, the physiological performance of differentiated and progenitor cells, including extracellular matrix (ECM) deposition, is impaired in vitro. To bring back physiological crowdedness to in vitro systems, we have previously introduced carbohydrate-based macromolecules to culture media and have achieved marked improvements with mixed MMC in terms of ECM deposition and differentiation of mesenchymal stem cells (MSCs). We show here that although this system is successful, it is limited, owing to viscosity, to only 33% of the fractional volume occupancy (FVO) of full serum, which we calculated to have an FVO of approximately 54% v/v. Here we show that full-serum FVO can be achieved using polyvinylpyrrolidone (PVP) 360 kDa. Under these conditions, ECM deposition in human fibroblasts and MSCs is on par with, if not stronger than, that obtained with the original carbohydrate-based MMC protocols, but with a viscosity that is not significantly changed. In addition, we have found that the proliferation rate of bone marrow-derived MSCs and fibroblasts increases slightly in the presence of PVP360, similar to that observed with carbohydrate-based crowders. A palette of MMC compounds is now emerging that enables us to tune the crowdedness of culture media seamlessly from interstitial fluid (9% FVO), in which the majority of tissue cells might be based, to serum environments mimicking intravascular conditions. Despite identical FVOs, individual crowder size effects play a role, and different cell types appear to have preferences in terms of FVO and the crowder with which it is achieved. However, in the quest for crowders that we predict to have a smoother regulatory approval path, PVP is a highly interesting compound, as it has been widely used in the medical and food industries and shows a novel promising use in cell culture and

  18. Improving 2D and 3D Skin In Vitro Models Using Macromolecular Crowding.

    Science.gov (United States)

    Benny, Paula; Badowski, Cedric; Lane, E Birgitte; Raghunath, Michael

    2016-08-22

    Collagens, a family of glycoproteins, are the main structural proteins in the human body and key components of biomaterials used in modern tissue engineering. A technical bottleneck is the deposition of collagen in vitro, which is notoriously slow, resulting in sub-optimal formation of connective tissue and subsequent tissue cohesion, particularly in skin models. Here, we describe a method in which differentially-sized sucrose co-polymers are added to skin cultures to generate macromolecular crowding (MMC), resulting in a dramatic enhancement of collagen deposition. In particular, dermal fibroblasts deposited significant amounts of collagen I/IV/VII and fibronectin under MMC in comparison to controls. The protocol also describes a method to decellularize crowded cell layers, exposing significant amounts of extracellular matrix (ECM) which were retained on the culture surface, as evidenced by immunocytochemistry. Total matrix mass and distribution pattern were studied using interference reflection microscopy. Interestingly, fibroblasts, keratinocytes and co-cultures produced cell-derived matrices (CDM) of varying composition and morphology. CDM could be used as "bio-scaffolds" for secondary cell seeding, where the current use of coatings or scaffolds, typically from xenogenic animal sources, can be avoided, thus moving towards more clinically relevant applications. In addition, this protocol describes the application of MMC during the submerged phase of a 3D-organotypic skin co-culture model, which was sufficient to enhance ECM deposition at the dermo-epidermal junction (DEJ), in particular of collagen VII, the major component of anchoring fibrils. Electron microscopy confirmed the presence of anchoring fibrils in cultures developed with MMC, as compared to controls. This is significant as anchoring fibrils tether the dermis to the epidermis; hence, having a pre-formed mature DEJ may benefit skin graft recipients in terms of graft stability and

  19. Phase behaviour of macromolecular liquid crystalline materials. Computational studies at the molecular level

    International Nuclear Information System (INIS)

    Stimson, Lorna M.

    2003-01-01

    Molecular simulations provide an increasingly useful insight into the static and dynamic characteristics of materials. In this thesis, molecular simulations of macromolecular liquid crystalline materials are reported. The first liquid crystalline material investigated is a side chain liquid crystal polymer (SCLCP). In this study, semi-atomistic molecular dynamics simulations were conducted at a range of temperatures, and an aligning potential was applied to mimic the effect of a magnetic field. On cooling the SCLCP from an isotropic melt, microphase separation was observed, yielding a domain structure. The application of a magnetic field to this structure aligns the domains, producing a stable smectic mesophase. This is the first study in which mesophases have been observed using an off-lattice model of an SCLCP. The second material investigated is a dendrimer with terminal mesogenic functionalization. Here, a multi-scale approach has been taken, with Monte Carlo studies of a single dendrimer molecule in the gas phase at the atomistic level, semi-atomistic molecular dynamics of a single molecule in liquid crystalline solvents, and a coarse-grained molecular dynamics study of the dendrimer in the bulk. The coarse-grained model was developed and parameterized using the results of the atomistic and semi-atomistic work. The single-molecule studies showed that the liquid crystalline dendrimer was able to change its structure by conformational changes in the flexible chains that link the mesogenic groups to the core. Structural change was seen under the application of a mean-field ordering potential in the gas phase, and in the presence of liquid crystalline solvents. No liquid crystalline phases were observed in the bulk-phase studies of the coarse-grained model. However, when the length of the mesogenic units was increased there was some evidence for microphase separation in these systems. (author)

  20. Assessing Exhaustiveness of Stochastic Sampling for Integrative Modeling of Macromolecular Structures.

    Science.gov (United States)

    Viswanath, Shruthi; Chemmama, Ilan E; Cimermancic, Peter; Sali, Andrej

    2017-12-05

    Modeling of macromolecular structures involves structural sampling guided by a scoring function, resulting in an ensemble of good-scoring models. By necessity, the sampling is often stochastic, and must be exhaustive at a precision sufficient for accurate modeling and assessment of model uncertainty. Therefore, the very first step in analyzing the ensemble is an estimation of the highest precision at which the sampling is exhaustive. Here, we present an objective and automated method for this task. As a proxy for sampling exhaustiveness, we evaluate whether two independently and stochastically generated sets of models are sufficiently similar. The protocol includes testing 1) convergence of the model score, 2) whether model scores for the two samples were drawn from the same parent distribution, 3) whether each structural cluster includes models from each sample proportionally to its size, and 4) whether there is sufficient structural similarity between the two model samples in each cluster. The evaluation also provides the sampling precision, defined as the smallest clustering threshold that satisfies the third, most stringent test. We validate the protocol with the aid of enumerated good-scoring models for five illustrative cases of binary protein complexes. Passing the proposed four tests is necessary, but not sufficient for thorough sampling. The protocol is general in nature and can be applied to the stochastic sampling of any set of models, not just structural models. In addition, the tests can be used to stop stochastic sampling as soon as exhaustiveness at desired precision is reached, thereby improving sampling efficiency; they may also help in selecting a model representation that is sufficiently detailed to be informative, yet also sufficiently coarse for sampling to be exhaustive. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
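
    Test (2) of the protocol asks whether the model scores from two independent sampling runs could have come from the same parent distribution. As a minimal, self-contained illustration (the published protocol's exact statistical test may differ), the sketch below computes the two-sample Kolmogorov-Smirnov statistic on two invented score samples; a small D suggests the runs are statistically indistinguishable.

```python
def ks_statistic(a, b):
    """Maximum distance between the empirical CDFs of two score samples."""
    a, b = sorted(a), sorted(b)
    values = sorted(set(a) | set(b))
    d = 0.0
    for v in values:
        cdf_a = sum(x <= v for x in a) / len(a)
        cdf_b = sum(x <= v for x in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

# Hypothetical good-scoring model scores from two independent stochastic runs:
scores_run1 = [10.2, 11.5, 9.8, 10.9, 11.1, 10.4]
scores_run2 = [10.1, 11.4, 10.0, 10.8, 11.2, 10.5]
d = ks_statistic(scores_run1, scores_run2)
print(d)  # small D suggests the two runs sample the same score distribution
```

In practice one would compare D against a significance threshold (or use scipy.stats.ks_2samp), and only then proceed to the clustering-based tests (3) and (4).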

  1. Methyl-β-cyclodextrin quaternary ammonium chitosan conjugate: nanoparticles vs macromolecular soluble complex

    Science.gov (United States)

    Piras, Anna Maria; Fabiano, Angela; Chiellini, Federica; Zambito, Ylenia

    2018-01-01

    Purpose: The present study aimed to compare a novel cyclodextrin–polymer–drug complex in solution with a dispersed supramolecular nanosized system, made of the same complex, for the ability to carry dexamethasone (DEX) across excised rat intestine. Results: Methyl-β-cyclodextrin–quaternary ammonium chitosan conjugate (QA-Ch-MCD) was obtained by covalent grafting through a 10-atom spacer. The conjugate was characterized by 1H-NMR, resulting in 24.4% w/w of MCD content. Phase-solubility profile analysis of the QA-Ch-MCD/DEX complex yielded an association constant of 14037 M−1, vs 4428 M−1 for the plain MCD/DEX complex. Nanoparticle (NP) dispersions resulted from ionotropic gelation of the QA-Ch-MCD/DEX complex with sodium tripolyphosphate, leading to 9.9%±1.4% drug loading efficiency. The mean diameter and zeta potential for NP were 299±32 nm (polydispersity index [PI] 0.049) and 11.5±1.1 mV, respectively; those for QA-Ch-MCD/DEX were 2.7±0.4 nm (PI 0.048) and 6.7±0.6 mV. QA-Ch-MCD/DEX solutions and the corresponding NP dispersions were compared in vitro for water-assisted transport through mucus, DEX permeation through excised rat intestine, and ex vivo mucoadhesivity. The complex showed higher mucoadhesion and a lower transport rate through mucus; it also provided faster drug permeation across excised rat intestine. Conclusion: Carrier adhesion to the mucus surface played the most important role in favoring transepithelial permeation. Thus, within the scope of the present study, the use of NP does not seem to provide any decisive advantage over the simpler macromolecular complex. PMID:29731628
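
    For an A_L-type phase-solubility diagram, the 1:1 association constant is conventionally obtained from the Higuchi-Connors relation Ka = slope / (S0·(1 − slope)), where S0 is the intrinsic drug solubility. The sketch below illustrates this calculation with invented inputs (the paper reports only the resulting constants, not the underlying slope and S0):

```python
def association_constant(slope, S0):
    """Higuchi-Connors 1:1 association constant (M^-1) from an A_L-type
    phase-solubility diagram: Ka = slope / (S0 * (1 - slope))."""
    if not 0.0 < slope < 1.0:
        raise ValueError("A_L analysis assumes 0 < slope < 1")
    return slope / (S0 * (1.0 - slope))

# Hypothetical inputs: slope 0.12, intrinsic solubility S0 = 1e-5 M.
Ka = association_constant(0.12, 1e-5)
print(Ka)  # on the order of 10^4 M^-1, comparable to the reported value
```

The slope being well below 1 is what makes the linear A_L treatment valid; steeper diagrams indicate higher-order complexes and require a different model.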

  2. The use of workflows in the design and implementation of complex experiments in macromolecular crystallography

    International Nuclear Information System (INIS)

    Brockhauser, Sandor; Svensson, Olof; Bowler, Matthew W.; Nanao, Max; Gordon, Elspeth; Leal, Ricardo M. F.; Popov, Alexander; Gerring, Matthew; McCarthy, Andrew A.; Gotz, Andy

    2012-01-01

    A powerful and easy-to-use workflow environment has been developed at the ESRF for combining experiment control with online data analysis on synchrotron beamlines. This tool provides the possibility of automating complex experiments without the need for expertise in instrumentation control and programming, but rather by accessing defined beamline services. The automation of beam delivery, sample handling and data analysis, together with increasing photon flux, diminishing focal spot size and the appearance of fast-readout detectors on synchrotron beamlines, have changed the way that many macromolecular crystallography experiments are planned and executed. Screening for the best diffracting crystal, or even the best diffracting part of a selected crystal, has been enabled by the development of microfocus beams, precise goniometers and fast-readout detectors that all require rapid feedback from the initial processing of images in order to be effective. All of these advances require the coupling of data feedback to the experimental control system and depend on immediate online data-analysis results during the experiment. To facilitate this, a Data Analysis WorkBench (DAWB) for the flexible creation of complex automated protocols has been developed. Here, example workflows designed and implemented using DAWB are presented for enhanced multi-step crystal characterizations, experiments involving crystal reorientation with kappa goniometers, crystal-burning experiments for empirically determining the radiation sensitivity of a crystal system and the application of mesh scans to find the best location of a crystal to obtain the highest diffraction quality. Beamline users interact with the prepared workflows through a specific brick within the beamline-control GUI MXCuBE

  3. MolProbity: all-atom structure validation for macromolecular crystallography

    International Nuclear Information System (INIS)

    Chen, Vincent B.; Arendall, W. Bryan III; Headd, Jeffrey J.; Keedy, Daniel A.; Immormino, Robert M.; Kapral, Gary J.; Murray, Laura W.; Richardson, Jane S.; Richardson, David C.

    2010-01-01

    MolProbity structure validation will diagnose most local errors in macromolecular crystal structures and help to guide their correction. MolProbity is a structure-validation web service that provides broad-spectrum solidly based evaluation of model quality at both the global and local levels for both proteins and nucleic acids. It relies heavily on the power and sensitivity provided by optimized hydrogen placement and all-atom contact analysis, complemented by updated versions of covalent-geometry and torsion-angle criteria. Some of the local corrections can be performed automatically in MolProbity and all of the diagnostics are presented in chart and graphical forms that help guide manual rebuilding. X-ray crystallography provides a wealth of biologically important molecular data in the form of atomic three-dimensional structures of proteins, nucleic acids and increasingly large complexes in multiple forms and states. Advances in automation, in everything from crystallization to data collection to phasing to model building to refinement, have made solving a structure using crystallography easier than ever. However, despite these improvements, local errors that can affect biological interpretation are widespread at low resolution and even high-resolution structures nearly all contain at least a few local errors such as Ramachandran outliers, flipped branched protein side chains and incorrect sugar puckers. It is critical both for the crystallographer and for the end user that there are easy and reliable methods to diagnose and correct these sorts of errors in structures. MolProbity is the authors’ contribution to helping solve this problem and this article reviews its general capabilities, reports on recent enhancements and usage, and presents evidence that the resulting improvements are now beneficially affecting the global database

  4. Prospects for simulating macromolecular surfactant chemistry at the ocean–atmosphere boundary

    International Nuclear Information System (INIS)

    Elliott, S; Burrows, S M; Liu, X; Deal, C; Long, M; Ogunro, O; Wingenter, O; Russell, L M

    2014-01-01

    Biogenic lipids and polymers are surveyed for their ability to adsorb at the water–air interfaces associated with bubbles, marine microlayers and particles in the overlying boundary layer. Representative ocean biogeochemical regimes are defined in order to estimate local concentrations for the major macromolecular classes. Surfactant equilibria and maximum excess are then derived based on a network of model compounds. Relative local coverage and upward mass transport follow directly, and specific chemical structures can be placed into regional rank order. Lipids and denatured protein-like polymers dominate at the selected locations. The assigned monolayer phase states are variable, whether assessed along bubbles or at the atmospheric spray droplet perimeter. Since oceanic film compositions prove to be irregular, effects on gas and organic transfer are expected to exhibit geographic dependence as well. Moreover, the core arguments extend across the sea–air interface into aerosol–cloud systems. Fundamental nascent chemical properties including mass to carbon ratio and density depend strongly on the geochemical state of source waters. High surface pressures may suppress the Kelvin effect, and marine organic hygroscopicities are almost entirely unconstrained. While bubble adsorption provides a well-known means for transporting lipidic or proteinaceous material into sea spray, the same cannot be said of polysaccharides. Carbohydrates tend to be strongly hydrophilic so that their excess carbon mass is low despite stacked polymeric geometries. Since sugars are abundant in the marine aerosol, gel-based mechanisms may be required to achieve uplift. Uncertainties distill to a global scale dearth of information regarding two dimensional kinetics and equilibria. Nonetheless simulations are recommended, to initiate the process of systems level quantification. (papers)

  5. Design of cellulose ether-based macromolecular prodrugs of ciprofloxacin for extended release and enhanced bioavailability.

    Science.gov (United States)

    Amin, Muhammad; Abbas, Nazia Shahana; Hussain, Muhammad Ajaz; Sher, Muhammad; Edgar, Kevin J

    2018-07-01

    The present study reports the syntheses of hydroxypropylcellulose (HPC)- and hydroxyethylcellulose (HEC)-based macromolecular prodrugs (MPDs) of ciprofloxacin (CIP) using a homogeneous reaction methodology. The covalently loaded drug content (DC) of each prodrug was quantified using UV-Vis spectrophotometry to determine the degree of substitution (DS). HPC-ciprofloxacin (HPC-CIP) conjugates showed a DS of CIP in the range 0.87-1.15, whereas HEC-ciprofloxacin (HEC-CIP) conjugates showed a DS range of 0.51-0.75. Transmission electron microscopy revealed that HPC-CIP conjugate 2 and HEC-CIP conjugate 6 self-assembled into nanoparticles of 150-300 and 180-250 nm, respectively. Size exclusion chromatography revealed HPC-CIP conjugate 2 and HEC-CIP conjugate 6 to be monodisperse systems. In vitro drug release studies indicated 15 and 43% CIP release from HPC-CIP conjugate 2 after 6 h in simulated gastric and simulated intestinal fluids (SGF and SIF), respectively. HEC-CIP conjugate 6 showed 16% and 46% release after 6 h in SGF and SIF, respectively. HPC-CIP conjugate 2 and HEC-CIP conjugate 6 exhibited half-lives of 10.87 and 11.71 h, respectively, with area-under-the-curve values of 164 and 175 h·μg·mL−1, respectively, indicating enhanced bioavailability and improved pharmacokinetic profiles in an animal model. Antibacterial activities equal to that of unmodified CIP confirmed their competitive efficacies. Cytotoxicity studies supported their non-toxic nature and biocompatibility. Copyright © 2018 Elsevier B.V. All rights reserved.
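
    The pharmacokinetic figures quoted above (AUC, half-life) are typically derived from a plasma concentration-time profile. The sketch below uses invented data points, not the paper's, to show the standard calculation: AUC by the linear trapezoidal rule, and the terminal half-life from the log-linear slope of the last sampling points.

```python
import math

def auc_trapezoid(times, conc):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    return sum((conc[i] + conc[i + 1]) / 2.0 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

# Hypothetical plasma profile (times in h, concentrations in μg/mL):
times = [0.5, 1, 2, 4, 8, 12, 24]
conc = [20.0, 18.0, 15.0, 11.0, 6.0, 3.5, 0.9]

auc = auc_trapezoid(times, conc)  # h·μg/mL

# Terminal elimination rate from the last two log-concentration points:
k_el = (math.log(conc[-2]) - math.log(conc[-1])) / (times[-1] - times[-2])
t_half = math.log(2) / k_el       # h

print(auc, t_half)
```

A real analysis would fit the terminal slope over several points by log-linear regression and add an extrapolated tail (C_last/k_el) to report AUC to infinity.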

  6. Phase-Separated Liposomes Enhance the Efficiency of Macromolecular Delivery to the Cellular Cytoplasm.

    Science.gov (United States)

    Imam, Zachary I; Kenyon, Laura E; Ashby, Grant; Nagib, Fatema; Mendicino, Morgan; Zhao, Chi; Gadok, Avinash K; Stachowiak, Jeanne C

    2017-10-01

    From viruses to organelles, fusion of biological membranes is used by diverse biological systems to deliver macromolecules across membrane barriers. Membrane fusion is also a potentially efficient mechanism for the delivery of macromolecular therapeutics to the cellular cytoplasm. However, a key shortcoming of existing fusogenic liposomal systems is that they are inefficient, requiring a high concentration of fusion-promoting lipids in order to cross cellular membrane barriers. Toward addressing this limitation, our experiments explore the extent to which membrane fusion can be amplified by using the process of lipid membrane phase separation to concentrate fusion-promoting lipids within distinct regions of the membrane surface. We used confocal fluorescence microscopy to investigate the integration of fusion-promoting lipids into a ternary lipid membrane system that separated into liquid-ordered and liquid-disordered membrane phases. Additionally, we quantified the impact of membrane phase separation on the efficiency with which liposomes transferred lipids and encapsulated macromolecules to cells, using a combination of confocal fluorescence imaging and flow cytometry. Here we report that concentrating fusion-promoting lipids within phase-separated lipid domains on the surfaces of liposomes significantly increases the efficiency of liposome fusion with model membranes and cells. In particular, membrane phase separation enhanced the delivery of lipids and model macromolecules to the cytoplasm of tumor cells by at least 4-fold in comparison to homogeneous liposomes. Our findings demonstrate that phase separation can enhance membrane fusion by locally concentrating fusion-promoting lipids on the surface of liposomes. This work represents the first application of lipid membrane phase separation in the design of biomaterials-based delivery systems. Additionally, these results lay the groundwork for developing fusogenic liposomes that are triggered by physical and

  7. Enhanced conjugation stability and blood circulation time of macromolecular gadolinium-DTPA contrast agent.

    Science.gov (United States)

    Jenjob, Ratchapol; Kun, Na; Ghee, Jung Yeon; Shen, Zheyu; Wu, Xiaoxia; Cho, Steve K; Lee, Don Haeng; Yang, Su-Geun

    2016-04-01

    In this study, we prepared a macromolecular MR T1 contrast agent, pullulan-conjugated Gd diethylenetriaminepentaacetate (Gd-DTPA-Pullulan), and estimated its residual free Gd(3+), chelation stability in competition with metal ions, plasma and tissue pharmacokinetics, and abdominal MR contrast in rats. Residual free Gd(3+) in Gd-DTPA-Pullulan was measured using colorimetric spectroscopy. Transmetalation of Gd(3+) incubated with Ca(2+) was performed using a dialysis membrane (MWCO 100-500 Da) and investigated by ICP-OES. The plasma concentration profiles of Gd-DTPA-Pullulan were estimated after intravenous injection at a dose of 0.1 mmol/kg of Gd. Coronal-plane abdominal images of normal rats were observed by MR imaging. The content of free Gd(3+), the toxic residual form, was less than 0.01%. The chelation stability of Gd-DTPA-Pullulan was high: only 0.2% and 0.00045% of Gd(3+) were released from Gd-DTPA-Pullulan after 2 h of incubation with Ca(2+) and Fe(2+), respectively. Gd-DTPA-Pullulan displayed an extended plasma half-life (t1/2,α = 0.43 h, t1/2,β = 2.32 h), much longer than the 0.11 h and 0.79 h of Gd-EOB-DTPA. Abdominal MR imaging showed that Gd-DTPA-Pullulan maintained its initial MR contrast for 30 min. The extended plasma half-life of Gd-DTPA-Pullulan probably allows prolonged MR acquisition times in the clinic with enhanced MR contrast. Copyright © 2016 Elsevier B.V. All rights reserved.
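
    The two reported half-lives describe a biexponential (two-compartment) plasma curve, C(t) = A·e^(−αt) + B·e^(−βt). As a back-of-envelope check, the sketch below converts the quoted half-lives to rate constants and, under an assumed 50/50 split between the two phases (the paper does not report the coefficients A and B), estimates the fraction of agent still circulating at the end of the 30 min imaging window:

```python
import math

# Rate constants from the reported half-lives: k = ln(2) / t_half.
alpha = math.log(2) / 0.43   # 1/h, distribution phase
beta = math.log(2) / 2.32    # 1/h, elimination phase

def fraction_remaining(t, f_alpha=0.5):
    """Fraction of the dose in plasma at time t (h), assuming an
    illustrative f_alpha : (1 - f_alpha) split between the two phases."""
    return f_alpha * math.exp(-alpha * t) + (1 - f_alpha) * math.exp(-beta * t)

f_30min = fraction_remaining(0.5)
print(f_30min)  # a substantial fraction persists through the imaging window
```

This is consistent with the observation that the agent maintained its initial MR contrast for 30 min, whereas the much shorter half-lives of Gd-EOB-DTPA would not support such a window.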

  8. Novel types of DNA-sugar damage in neocarzinostatin cytotoxicity and mutagenesis

    International Nuclear Information System (INIS)

    Goldberg, I.H.

    1986-01-01

    Although a number of antitumor antibiotics interact with DNA to form covalent adducts with the bases, relatively few damage DNA by interacting with the deoxyribose moiety. Neocarzinostatin (NCS), a member of a family of macromolecular antibiotics obtained from filtrates of Streptomyces, is such an agent. Many of the biochemical and cellular effects of NCS resemble those of ionizing radiation. Most, possibly all, of the DNA lesions caused by NCS appear to result from the direct attack of an activated form of the drug on the deoxyribose of DNA. This is in contrast to ionizing radiation or the antibiotic bleomycin, which damage DNA deoxyribose through the intervention of a reduced form of oxygen. This paper describes the nature of the interaction between the active component of NCS and DNA, the mechanism of the ensuing deoxyribose damage, and some of the biological consequences of these actions. 24 refs., 7 figs

  9. Radiation damage of nonmetallic solids

    International Nuclear Information System (INIS)

    Goland, A.N.

    1975-01-01

    A review of data and information on radiation damage in nonmetallic solids is presented. Discussions are included on defects in nonmetals, radiation damage processes in nonmetals, electronic damage processes, physical damage processes, atomic displacement, photochemical damage processes, and ion implantation

  10. Femoral nerve damage (image)

    Science.gov (United States)

    The femoral nerve is located in the leg and supplies the muscles that help straighten the leg. It supplies sensation ... leg. One risk of damage to the femoral nerve is pelvic fracture. Symptoms of femoral nerve damage ...

  11. Statistical inference: an integrated Bayesian/likelihood approach

    CERN Document Server

    Aitkin, Murray

    2010-01-01

    Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to difficult model comparison problems and offers direct Bayesian counterparts of frequentist t-tests and other standard statistical methods for hypothesis testing.After an overview of the competing theories of statistical inference, the book introduces the Bayes/likelihood approach used throughout. It pre

  12. Permeability to macromolecular contrast media quantified by dynamic MRI correlates with tumor tissue assays of vascular endothelial growth factor (VEGF)

    International Nuclear Information System (INIS)

    Cyran, Clemens C.; Sennino, Barbara; Fu, Yanjun; Rogut, Victor; Shames, David M.; Chaopathomkul, Bundit; Wendland, Michael F.; McDonald, Donald M.; Brasch, Robert C.; Raatschen, Hans-Juergen

    2012-01-01

    Purpose: To correlate dynamic MRI assays of macromolecular endothelial permeability with microscopic area–density measurements of vascular endothelial growth factor (VEGF) in tumors. Methods and material: This study compared tumor xenografts from two different human cancer cell lines, MDA-MB-231 (n = 5) and MDA-MB-435 (n = 8), reported to express higher and lower levels of VEGF, respectively. Dynamic MRI was enhanced by a prototype macromolecular contrast medium (MMCM), albumin-(Gd-DTPA)35. Quantitative estimates of tumor microvascular permeability (K^PS; μl/min·100 cm³), obtained using a two-compartment kinetic model, were correlated with immunohistochemical measurements of VEGF in each tumor. Results: Mean K^PS was 2.4 times greater in MDA-MB-231 tumors (K^PS = 58 ± 30.9 μl/min·100 cm³) than in MDA-MB-435 tumors (K^PS = 24 ± 8.4 μl/min·100 cm³) (p < 0.05). Correspondingly, the area–density of VEGF in MDA-MB-231 tumors was 2.6 times greater (27.3 ± 2.2%, p < 0.05) than in MDA-MB-435 cancers (10.5 ± 0.5%, p < 0.05). Considering all tumors without regard to cell type, a significant positive correlation (r = 0.67, p < 0.05) was observed between MRI-estimated endothelial permeability and VEGF immunoreactivity. Conclusion: The correlation between MRI assays of endothelial permeability to an MMCM and the VEGF immunoreactivity of tumors supports the hypothesis that VEGF is a major contributor to increased macromolecular permeability in cancers. When applied clinically, the MMCM-enhanced MRI approach could help to optimize the application of VEGF-inhibiting therapy on an individual-patient basis.
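
    A common way to extract a permeability constant of this kind from dynamic contrast-enhanced data is a Patlak-type linearization of the two-compartment model: C_t(t) = K^PS·∫C_p dt + f_PV·C_p(t), so plotting C_t/C_p against ∫C_p dt / C_p gives K^PS as the slope and the fractional plasma volume f_PV as the intercept. The sketch below runs this fit on synthetic curves (all values invented; the study's actual kinetic model and fitting procedure may differ in detail):

```python
import numpy as np

t = np.linspace(0, 30, 61)            # min, dynamic acquisition times
C_p = 1.0 * np.exp(-t / 40.0)         # simulated plasma curve (arbitrary units)

KPS_true, f_PV = 0.004, 0.05          # simulated tissue parameters

# Running integral of the plasma curve (trapezoidal):
int_Cp = np.concatenate(
    ([0.0], np.cumsum((C_p[1:] + C_p[:-1]) / 2.0 * np.diff(t))))
C_t = KPS_true * int_Cp + f_PV * C_p  # simulated tumor tissue curve

# Patlak linearization (skip t = 0 to avoid dividing by the initial point):
x = int_Cp[1:] / C_p[1:]
y = C_t[1:] / C_p[1:]
KPS_fit, f_PV_fit = np.polyfit(x, y, 1)  # slope = K^PS, intercept = f_PV
print(KPS_fit, f_PV_fit)
```

Because the synthetic tissue curve is built from the same model, the fit recovers the input parameters exactly; with real data the quality of the linear regime indicates whether the unidirectional-leakage assumption holds.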

  13. Nitrogen limitation in natural populations of cyanobacteria (Spirulina and Oscillatoria spp.) and its effect on macromolecular synthesis

    International Nuclear Information System (INIS)

    van Rijn, J.; Shilo, M.

    1986-01-01

    Natural populations of the cyanobacteria Spirulina species and Oscillatoria species obtained from Israeli fish ponds were limited in growth by nitrogen availability in summer. Physiological indicators of nitrogen limitation, such as phycocyanin, chlorophyll a, and carbohydrate content, did not show clear evidence of nitrogen-limited growth, since these organisms are capable of vertical migration to and from the nitrogen-rich bottom. By means of ¹⁴C labeling of the cells under simulated pond conditions followed by cell fractionation into macromolecular compounds, it was found that carbohydrates synthesized at the lighted surface were partially utilized for dark protein synthesis at the bottom of these ponds.

  14. Comparison of two self-assembled macromolecular prodrug micelles with different conjugate positions of SN38 for enhancing antitumor activity

    Directory of Open Access Journals (Sweden)

    Liu Y

    2015-03-01

    Full Text Available Yi Liu,1 Hongyu Piao,1 Ying Gao,1 Caihong Xu,2 Ye Tian,1 Lihong Wang,1 Jinwen Liu,1 Bo Tang,1 Meijuan Zou,1 Gang Cheng1 1Department of Pharmaceutics, Shenyang Pharmaceutical University, Shenyang, Liaoning Province, People’s Republic of China; 2Department of Food Science, Shenyang Normal University, Shenyang, Liaoning Province, People’s Republic of China Abstract: 7-Ethyl-10-hydroxycamptothecin (SN38), an active metabolite of irinotecan (CPT-11), is a remarkably potent antitumor agent. The clinical application of SN38 has been extremely restricted by its insolubility in water. In this study, we successfully synthesized two macromolecular prodrugs of SN38 with different conjugate positions (chitosan-(C10-OH)SN38 and chitosan-(C20-OH)SN38) to improve the water solubility and antitumor activity of SN38. These prodrugs can self-assemble into micelles in aqueous medium. The particle size, morphology, zeta potential, and in vitro drug release of SN38 and its derivatives, as well as their cytotoxicity, pharmacokinetics, and in vivo antitumor activity in a xenograft BALB/c mouse model, were studied. In vitro, chitosan-(C10-OH)SN38 (CS-(10s)SN38) and chitosan-(C20-OH)SN38 (CS-(20s)SN38) were 13.3- and 25.9-fold more potent than CPT-11 in the murine colon adenocarcinoma cell line CT26, respectively. The area under the curve (AUC0–24) of SN38 after intravenous administration of CS-(10s)SN38 and CS-(20s)SN38 to Sprague Dawley rats was greatly improved compared with CPT-11 (both P<0.01). A larger AUC0–24 was observed for CS-(20s)SN38 than for CS-(10s)SN38 (P<0.05). Both of the novel self-assembled chitosan-SN38 prodrugs demonstrated superior anticancer activity to CPT-11 in the CT26 xenograft BALB/c mouse model. We also investigated the differences between these macromolecular prodrug micelles with regard to enhancing the antitumor activity of SN38. CS-(20s)SN38 exhibited better in vivo antitumor activity than CS-(10s)SN38 at a dose of 2.5 mg/kg (P<0

  15. Glycogen-graft-poly(2-alkyl-2-oxazolines) - the new versatile biopolymer-based thermoresponsive macromolecular toolbox

    Czech Academy of Sciences Publication Activity Database

    Pospíšilová, Aneta; Filippov, Sergey K.; Bogomolova, Anna; Turner, S.; Sedláček, Ondřej; Matushkin, Nikolai; Černochová, Zulfiya; Štěpánek, Petr; Hrubý, Martin

    2014-01-01

    Roč. 4, č. 106 (2014), s. 61580-61588 ISSN 2046-2069 R&D Projects: GA ČR GA13-08336S; GA MŠk(CZ) LH14079 Grant - others:AV ČR(CZ) M200501201; AV ČR(CZ) ASCR/CONICET 2012CZ006 Program:M Institutional support: RVO:61389013 Keywords : glycogen * poly(2-alkyl-2-oxazoline) * hybrid copolymer Subject RIV: CD - Macromolecular Chemistry Impact factor: 3.840, year: 2014

  16. Structure analysis of molecular systems in the Institute of Macromolecular Chemistry of the Czech Academy of Sciences

    Czech Academy of Sciences Publication Activity Database

    Hašek, Jindřich

    2010-01-01

    Roč. 17, 2a (2010), k32-k34 ISSN 1211-5894. [Struktura 2010. Soláň, 14.06.2010-17.06.2010] R&D Projects: GA AV ČR IAA500500701; GA ČR GA305/07/1073 Institutional research plan: CEZ:AV0Z40500505 Keywords: Academy of Sciences of the Czech Republic * X-ray structure analysis * crystallography Subject RIV: CD - Macromolecular Chemistry http://xray.cz/ms/bul2010-2a/hasek.pdf

  17. Macromolecular crowding gives rise to microviscosity, anomalous diffusion and accelerated actin polymerization.

    Science.gov (United States)

    Rashid, Rafi; Chee, Stella Min Ling; Raghunath, Michael; Wohland, Thorsten

    2015-04-30

    Macromolecular crowding (MMC) has been used in various in vitro experimental systems to mimic in vivo physiology. This is because the crowded cytoplasm of cells contains many different types of solutes dissolved in an aqueous medium. MMC in the extracellular microenvironment is involved in maintaining stem cells in their undifferentiated state (niche) as well as in aiding their differentiation after they have travelled to new locations outside the niche. MMC at physiologically relevant fractional volume occupancies (FVOs) significantly enhances the adipogenic differentiation of human bone marrow-derived mesenchymal stem cells during chemically induced adipogenesis. The mechanism by which MMC produces this enhancement is not entirely known. In the context of extracellular collagen deposition, we have recently reported the importance of optimizing the FVO while minimizing the bulk viscosity. Two opposing properties will determine the net rate of a biochemical reaction: the negative effect of bulk viscosity and the positive effect of the excluded volume, the latter being expressed by the FVO. In this study we have looked more closely at the effect of viscosity on reaction rates. We have used fluorimetry to measure the rate of actin polymerization and fluorescence correlation spectroscopy (FCS) to measure diffusion of various probes in solutions containing the crowder Ficoll at physiological concentrations. Similar to its effect on collagen, Ficoll enhanced the actin polymerization rate despite increasing the bulk viscosity. Our FCS measurements reveal a relatively minor component of anomalous diffusion. In addition, our measurements do suggest that microviscosity becomes relevant in a crowded environment. We ruled out bulk viscosity as a cause of the rate enhancement by performing the actin polymerization assay in glycerol. These opposite effects of Ficoll and glycerol led us to conclude that microviscosity becomes relevant at the length scale of the reacting

  19. Leaching of organic acids from macromolecular organic matter by non-supercritical CO2

    Science.gov (United States)

    Sauer, P.; Glombitza, C.; Kallmeyer, J.

    2012-04-01

    The storage of CO2 in underground reservoirs is discussed controversially in the scientific literature. The worldwide search for suitable storage formations also considers coal-bearing strata, and CO2 is already injected into coal seams for enhanced recovery of coal-bed methane. However, the effects of increased CO2 concentrations, especially on organic-matter-rich formations, are rarely investigated. The injected CO2 will dissolve in the pore water, causing a decrease in pH and resulting in acidic formation waters. Huge amounts of low-molecular-weight organic acids (LMWOAs) are chemically bound to the macromolecular matrix of sedimentary organic matter and may be liberated by hydrolysis, which is enhanced by the acidic pore water. Recent investigations outlined the importance of LMWOAs as a feedstock for microbial life in the subsurface [1]. Therefore, injection of CO2 into coal formations may result in an enhanced nutrient supply for subsurface microbes. To investigate the effect of high concentrations of dissolved CO2 on the release of LMWOAs from coal, we developed an inexpensive high-pressure, high-temperature system that allows manipulating the partial pressure of dissolved gases at pressures and temperatures up to 60 MPa and 120 °C, respectively. In a reservoir vessel, gases are added to saturate the extraction medium to the desired level. Inside the extraction vessel hangs a flexible and inert PVDF sleeve (polyvinylidene fluoride, almost impermeable to gases), holding the sample and separating it from the pressure fluid. The flexibility of the sleeve allows for subsampling without loss of pressure. Coal samples from the DEBITS-1 well, Waikato Basin, NZ (R0 = 0.29, TOC = 30%) were extracted at 90 °C and 5 MPa, either with pure or with CO2-saturated water. Subsamples were taken at different time points during the extraction. The extracted LMWOAs, such as formate, acetate and oxalate, were analysed by ion chromatography. Yields of LMWOAs were higher with pure water than with CO2

  20. Photon-counting single-molecule spectroscopy for studying conformational dynamics and macromolecular interactions

    Energy Technology Data Exchange (ETDEWEB)

    Laurence, Ted Alfred [Univ. of California, Berkeley, CA (United States)

    2002-01-01

    Single-molecule methods have the potential to provide information about conformational dynamics and molecular interactions that cannot be obtained by other methods. Removal of ensemble averaging provides several benefits, including the ability to detect heterogeneous populations and the ability to observe asynchronous reactions. Single-molecule diffusion methodologies using fluorescence resonance energy transfer (FRET) are developed to monitor conformational dynamics while minimizing perturbations introduced by interactions between molecules and surfaces. These methods are used to perform studies of the folding of Chymotrypsin Inhibitor 2, a small, single-domain protein, and of single-stranded DNA (ssDNA) homopolymers. Confocal microscopy is used in combination with sensitive detectors to detect bursts of photons from fluorescently labeled biomolecules as they diffuse through the focal volume. These bursts are analyzed to extract fluorescence resonance energy transfer (FRET) efficiency. Advances in data acquisition and analysis techniques that are providing a more complete picture of the accessible molecular information are discussed. Photon Arrival-time Interval Distribution (PAID) analysis is a new method for monitoring macromolecular interactions by fluorescence detection with simultaneous determination of coincidence, brightness, diffusion time, and occupancy (proportional to concentration) of fluorescently-labeled molecules undergoing diffusion in a confocal detection volume. This method is based on recording the time of arrival of all detected photons, and then plotting the two-dimensional histogram of photon pairs, where one axis is the time interval between each pair of photons 1 and 2, and the second axis is the number of other photons detected in the time interval between photons 1 and 2. PAID is related to Fluorescence Correlation Spectroscopy (FCS) by a collapse of this histogram onto the time interval axis. PAID extends auto- and cross-correlation FCS
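
    The PAID histogram described above bins photon pairs by (1) the time interval between the pair and (2) the number of other photons detected within that interval. A minimal single-channel sketch of that construction (arrival times and bin choices are invented for illustration, not the published implementation):

```python
import numpy as np

def paid_histogram(arrival_times, max_interval, n_tau_bins=8, n_count_bins=8):
    """2D PAID histogram: one axis = time interval between a photon pair,
    other axis = number of other photons detected within that interval."""
    t = np.sort(np.asarray(arrival_times))
    taus, counts = [], []
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > max_interval:
                break  # times are sorted, so later pairs are wider still
            taus.append(dt)
            counts.append(j - i - 1)  # photons strictly between the pair
    hist, _, _ = np.histogram2d(
        taus, counts,
        bins=[n_tau_bins, n_count_bins],
        range=[[0.0, max_interval], [0, n_count_bins]],
    )
    return hist

# Hypothetical photon arrival times (ms) from a diffusing labeled molecule
times = [0.1, 0.15, 0.17, 0.9, 1.4, 1.45, 1.5, 1.52, 3.0]
h = paid_histogram(times, max_interval=1.0)
```

    Collapsing the histogram onto the interval axis (h.sum(axis=1)) discards the photon-count axis and recovers an FCS-like correlation estimate, which is the relationship to FCS the abstract states.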

  1. Enhanced conjugation stability and blood circulation time of macromolecular gadolinium-DTPA contrast agent

    Energy Technology Data Exchange (ETDEWEB)

    Jenjob, Ratchapol [Department of New Drug Development, School of Medicine, Inha University, 2F A-dong, Jeongseok Bldg., Sinheung-dong 3-ga, Jung-gu, Incheon 400-712 (Korea, Republic of); Kun, Na [Department of Biotechnology, The Catholic University of Korea, 43 Jibong-ro, Wonmi-gu, Bucheon-si, Gyeonggi-do 420-743 (Korea, Republic of); Ghee, Jung Yeon [Utah-Inha DDS and Advanced Therapeutics, B-403 Meet-You-All Tower, SongdoTechnopark, 7–50, Songdo-dong, Yeonsu-gu, Incheon 406-840 (Korea, Republic of); Shen, Zheyu; Wu, Xiaoxia [Division of Functional Materials and Nano-Devices, Ningbo Institute of Materials Technology & Engineering (NIMTE), Chinese Academy of Sciences, 519 Zhuangshi Street, Zhenhai District, Ningbo, Zhejiang 315201 (China); Cho, Steve K., E-mail: scho@gist.ac.kr [Division of Liberal Arts and Science, GIST College, Gwangju Institute of Science and Technology, Gwangju 500-712 (Korea, Republic of); Lee, Don Haeng [Utah-Inha DDS and Advanced Therapeutics, B-403 Meet-You-All Tower, SongdoTechnopark, 7–50, Songdo-dong, Yeonsu-gu, Incheon 406-840 (Korea, Republic of); Department of Internal Medicine, School of Medicine, Inha University Hospital, Incheon 420-751 (Korea, Republic of); Yang, Su-Geun, E-mail: Sugeun.Yang@Inha.ac.kr [Department of New Drug Development, School of Medicine, Inha University, 2F A-dong, Jeongseok Bldg., Sinheung-dong 3-ga, Jung-gu, Incheon 400-712 (Korea, Republic of)

    2016-04-01

    In this study, we prepared a macromolecular MR T1 contrast agent, pullulan-conjugated Gd diethylenetriamine pentaacetate (Gd-DTPA-Pullulan), and estimated residual free Gd³⁺, chelation stability in competition with metal ions, plasma and tissue pharmacokinetics, and abdominal MR contrast in rats. Residual free Gd³⁺ in Gd-DTPA-Pullulan was measured using colorimetric spectroscopy. The transmetalation of Gd³⁺ incubated with Ca²⁺ was performed using a dialysis membrane (MWCO 100–500 Da) and investigated by ICP-OES. The plasma concentration profiles of Gd-DTPA-Pullulan were estimated after intravenous injection at a dose of 0.1 mmol/kg of Gd. Coronal-plane abdominal images of normal rats were observed by MR imaging. The content of free Gd³⁺, the toxic residual form, was less than 0.01%. The chelation stability of Gd-DTPA-Pullulan was estimated: only 0.2% and 0.00045% of Gd³⁺ were released from Gd-DTPA-Pullulan after 2 h incubation with Ca²⁺ and Fe²⁺, respectively. Gd-DTPA-Pullulan displayed an extended plasma half-life (t1/2,α = 0.43 h, t1/2,β = 2.32 h), much longer than the 0.11 h and 0.79 h of Gd-EOB-DTPA. Abdominal MR imaging showed that Gd-DTPA-Pullulan maintained its initial MR contrast for 30 min. The extended plasma half-life of Gd-DTPA-Pullulan probably allows a prolonged MR acquisition time in the clinic with enhanced MR contrast. - Highlights: • Macromolecule (pullulan)-conjugated Gd contrast agent (Gd-DTPA-Pullulan) showed an extended plasma half-life (t1/2,α = 0.43 h, t1/2,β = 2.32 h) in comparison with Gd-EOB-DTPA. • The Gd-DTPA-Pullulan T1 contrast agent exhibited strong Gd chelation stability. • The extended blood circulation contributed to the enhanced and prolonged MR contrast in the abdominal region of rats. • The extended blood circulation may provide a prolonged MR acquisition time window in clinics.
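
    The paired α/β half-lives quoted above correspond to a biexponential (two-compartment) plasma model, C(t) = A·exp(−αt) + B·exp(−βt), where each phase's half-life is ln 2 divided by its rate constant. A small sketch, with the rate constants back-calculated from the reported half-lives purely for illustration:

```python
import math

def half_life(rate_per_hour):
    """Half-life (h) of an exponential phase with the given rate constant."""
    return math.log(2) / rate_per_hour

def biexponential(t, A, alpha, B, beta):
    """Two-compartment plasma model C(t) = A*exp(-alpha*t) + B*exp(-beta*t)."""
    return A * math.exp(-alpha * t) + B * math.exp(-beta * t)

# Rates chosen so the half-lives match the reported 0.43 h and 2.32 h
alpha = math.log(2) / 0.43   # fast (distribution) phase
beta  = math.log(2) / 2.32   # slow (elimination) phase
print(round(half_life(alpha), 2), round(half_life(beta), 2))
```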

  2. Photon-counting single-molecule spectroscopy for studying conformational dynamics and macromolecular interactions

    International Nuclear Information System (INIS)

    Laurence, Ted Alfred

    2002-01-01

    Single-molecule methods have the potential to provide information about conformational dynamics and molecular interactions that cannot be obtained by other methods. Removal of ensemble averaging provides several benefits, including the ability to detect heterogeneous populations and the ability to observe asynchronous reactions. Single-molecule diffusion methodologies using fluorescence resonance energy transfer (FRET) are developed to monitor conformational dynamics while minimizing perturbations introduced by interactions between molecules and surfaces. These methods are used to perform studies of the folding of Chymotrypsin Inhibitor 2, a small, single-domain protein, and of single-stranded DNA (ssDNA) homopolymers. Confocal microscopy is used in combination with sensitive detectors to detect bursts of photons from fluorescently labeled biomolecules as they diffuse through the focal volume. These bursts are analyzed to extract fluorescence resonance energy transfer (FRET) efficiency. Advances in data acquisition and analysis techniques that are providing a more complete picture of the accessible molecular information are discussed. Photon Arrival-time Interval Distribution (PAID) analysis is a new method for monitoring macromolecular interactions by fluorescence detection with simultaneous determination of coincidence, brightness, diffusion time, and occupancy (proportional to concentration) of fluorescently-labeled molecules undergoing diffusion in a confocal detection volume. This method is based on recording the time of arrival of all detected photons, and then plotting the two-dimensional histogram of photon pairs, where one axis is the time interval between each pair of photons 1 and 2, and the second axis is the number of other photons detected in the time interval between photons 1 and 2. PAID is related to Fluorescence Correlation Spectroscopy (FCS) by a collapse of this histogram onto the time interval axis. PAID extends auto- and cross-correlation FCS

  3. Mineral Grains, Dimples, and Hot Volcanic Organic Streams: Dynamic Geological Backstage of Macromolecular Evolution.

    Science.gov (United States)

    Skoblikow, Nikolai E; Zimin, Andrei A

    2018-04-01

    , polycondensation, and formation of proto-cellular structures) are combined within a common dynamic geological process. We suppose macromolecular evolution had an extremely fast, "flash" start: the period from volcanic eruption to the formation of lithocyte "populations" took not millions of years but just several tens of minutes. The scenario proposed can be verified experimentally with a three-module setup working on the principles of dynamic (flow) chemistry in its core element.

  4. Inferring Domain Plans in Question-Answering

    National Research Council Canada - National Science Library

    Pollack, Martha E

    1986-01-01

    The importance of plan inference in models of conversation has been widely noted in the computational-linguistics literature, and its incorporation in question-answering systems has enabled a range...

  5. Scalable inference for stochastic block models

    KAUST Repository

    Peng, Chengbin; Zhang, Zhihua; Wong, Ka-Chun; Zhang, Xiangliang; Keyes, David E.

    2017-01-01

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of "big data," traditional inference
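
    For context, the stochastic block model named in this record assigns each node to a community and draws each edge independently with a probability that depends only on the two endpoint communities. A minimal generative sketch (community sizes and edge probabilities are arbitrary illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sbm(block_sizes, p_matrix):
    """Sample an undirected adjacency matrix from a stochastic block model.
    block_sizes[k] nodes belong to community k; p_matrix[k, l] is the
    edge probability between communities k and l."""
    labels = np.repeat(np.arange(len(block_sizes)), block_sizes)
    n = labels.size
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_matrix[labels[i], labels[j]]:
                adj[i, j] = adj[j, i] = 1
    return adj, labels

p = np.array([[0.8, 0.05],
              [0.05, 0.8]])          # dense within blocks, sparse between
adj, labels = sample_sbm([30, 30], p)
```

    Inference runs this generative story in reverse: given only adj, recover the labels and p, typically by maximizing the block-model likelihood; the scalability problem the abstract raises is doing that on very large graphs.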

  6. Efficient algorithms for conditional independence inference

    Czech Academy of Sciences Publication Activity Database

    Bouckaert, R.; Hemmecke, R.; Lindner, S.; Studený, Milan

    2010-01-01

    Roč. 11, č. 1 (2010), s. 3453-3479 ISSN 1532-4435 R&D Projects: GA ČR GA201/08/0539; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional independence inference * linear programming approach Subject RIV: BA - General Mathematics Impact factor: 2.949, year: 2010 http://library.utia.cas.cz/separaty/2010/MTR/studeny-efficient algorithms for conditional independence inference.pdf

  7. Maintaining Genome Stability in Defiance of Mitotic DNA Damage

    Science.gov (United States)

    Ferrari, Stefano; Gentili, Christian

    2016-01-01

    The implementation of decisions affecting cell viability and proliferation is based on prompt detection of the issue to be addressed, formulation and transmission of a correct set of instructions and fidelity in the execution of orders. While the first and the last are purely mechanical processes relying on the faithful functioning of single proteins or macromolecular complexes (sensors and effectors), information is the real cue, with signal amplitude, duration, and frequency ultimately determining the type of response. The cellular response to DNA damage is no exception to the rule. In this review article we focus on DNA damage responses in G2 and Mitosis. First, we set the stage describing mitosis and the machineries in charge of assembling the apparatus responsible for chromosome alignment and segregation as well as the inputs that control its function (checkpoints). Next, we examine the type of issues that a cell approaching mitosis might face, presenting the impact of post-translational modifications (PTMs) on the correct and timely functioning of pathways correcting errors or damage before chromosome segregation. We conclude this essay with a perspective on the current status of mitotic signaling pathway inhibitors and their potential use in cancer therapy. PMID:27493659

  8. On the criticality of inferred models

    Science.gov (United States)

    Mastromatteo, Iacopo; Marsili, Matteo

    2011-10-01

    Advanced inference techniques allow one to reconstruct a pattern of interaction from high dimensional data sets, from probing simultaneously thousands of units of extended systems—such as cells, neural tissues and financial markets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to singular values of parameters, akin to critical points in physics where phase transitions occur. These are points where the response of physical systems to external perturbations, as measured by the susceptibility, is very large and diverges in the limit of infinite size. We show that the reparameterization invariant metrics in the space of probability distributions of these models (the Fisher information) are directly related to the susceptibility of the inferred model. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. This region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time scales naturally yield models which are close to criticality.
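
    The claimed identity between the Fisher information metric and the susceptibility can be made explicit for an exponential-family (maximum-entropy) model, the standard setting in this inference literature; the notation below is illustrative rather than the paper's own:

```latex
p(\mathbf{s}\mid \mathbf{J}) \;=\; \frac{1}{Z(\mathbf{J})}\,
  \exp\!\Big(\textstyle\sum_a J_a\,\phi_a(\mathbf{s})\Big),
\qquad
g_{ab}(\mathbf{J}) \;=\; \partial_{J_a}\partial_{J_b}\log Z(\mathbf{J})
  \;=\; \langle\phi_a\phi_b\rangle-\langle\phi_a\rangle\langle\phi_b\rangle
  \;=\; \chi_{ab}.
```

    Near a critical point the susceptibility matrix χ diverges with system size, so the Fisher metric density is largest there: distinguishable models pack together near criticality, which is the accumulation effect the abstract describes.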

  9. On the criticality of inferred models

    International Nuclear Information System (INIS)

    Mastromatteo, Iacopo; Marsili, Matteo

    2011-01-01

    Advanced inference techniques allow one to reconstruct a pattern of interaction from high dimensional data sets, from probing simultaneously thousands of units of extended systems—such as cells, neural tissues and financial markets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to singular values of parameters, akin to critical points in physics where phase transitions occur. These are points where the response of physical systems to external perturbations, as measured by the susceptibility, is very large and diverges in the limit of infinite size. We show that the reparameterization invariant metrics in the space of probability distributions of these models (the Fisher information) are directly related to the susceptibility of the inferred model. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. This region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time scales naturally yield models which are close to criticality

  10. Polynomial Chaos Surrogates for Bayesian Inference

    KAUST Repository

    Le Maitre, Olivier

    2016-01-06

    Bayesian inference is a popular probabilistic method for solving inverse problems, such as the identification of a field parameter in a PDE model. The inference relies on Bayes' rule to update the prior density of the sought field from observations and derive its posterior distribution. In most cases the posterior distribution has no explicit form and has to be sampled, for instance using a Markov chain Monte Carlo method. In practice the prior field parameter is decomposed and truncated (e.g. by means of a Karhunen–Loève decomposition) to recast the inference problem into the inference of a finite number of coordinates. Although proved effective in many situations, the Bayesian inference sketched above faces several difficulties requiring improvements. First, sampling the posterior can be an extremely costly task, as it requires multiple resolutions of the PDE model for different values of the field parameter. Second, when the observations are not very informative, the inferred parameter field can depend strongly on its prior, which can be somewhat arbitrary. These issues have motivated the introduction of reduced models or surrogates for the (approximate) determination of the parametrized PDE solution and of hyperparameters in the description of the prior field. Our contribution focuses on recent developments in these two directions: the acceleration of posterior sampling by means of polynomial chaos expansions and the efficient treatment of parametrized covariance functions for the prior field. We also discuss the possibility of making such an approach adaptive to further improve its efficiency.
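
    The acceleration idea can be sketched in one dimension: fit a polynomial chaos surrogate (Hermite basis, least squares) to a handful of forward-model evaluations, then run Metropolis sampling against the cheap surrogate instead of the expensive model. Everything below (the toy forward model, the observation, the tuning constants) is an invented stand-in for a PDE solve, not the abstract's method in detail:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an expensive forward model (e.g. a PDE solve), scalar parameter
def forward(theta):
    return np.sin(theta) + 0.5 * theta

# 1) Polynomial chaos surrogate: least-squares fit of probabilists' Hermite
#    polynomials on a small design of forward-model evaluations.
design = rng.standard_normal(50)
coeffs = np.polynomial.hermite_e.hermefit(design, forward(design), deg=7)
surrogate = lambda th: np.polynomial.hermite_e.hermeval(th, coeffs)

# 2) Metropolis sampling of the posterior; the surrogate replaces the
#    forward model inside the likelihood, so no PDE solves are needed here.
y_obs, sigma = 0.9, 0.1                       # synthetic observation
def log_post(th):                             # N(0, 1) prior on theta
    return -0.5 * th**2 - 0.5 * ((y_obs - surrogate(th)) / sigma) ** 2

chain, th = [], 0.0
lp = log_post(th)
for _ in range(5000):
    prop = th + 0.5 * rng.standard_normal()   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        th, lp = prop, lp_prop
    chain.append(th)
posterior_mean = np.mean(chain[1000:])        # discard burn-in
```

    The 50 design evaluations are the only calls to the expensive model; the 5000 MCMC likelihood evaluations all hit the surrogate, which is the cost saving the abstract targets.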

  11. Inference of segmented color and texture description by tensor voting.

    Science.gov (United States)

    Jia, Jiaya; Tang, Chi-Keung

    2004-06-01

    A robust synthesis method is proposed to automatically infer missing color and texture information from a damaged 2D image by (N)D tensor voting (N > 3). The same approach is generalized to range and 3D data in the presence of occlusion, missing data and noise. Our method translates texture information into an adaptive (N)D tensor, followed by a voting process that infers noniteratively the optimal color values in the (N)D texture space. A two-step method is proposed. First, we perform segmentation based on insufficient geometry, color, and texture information in the input, and extrapolate partitioning boundaries by either 2D or 3D tensor voting to generate a complete segmentation for the input. Missing colors are synthesized using (N)D tensor voting in each segment. Different feature scales in the input are automatically adapted by our tensor scale analysis. Results on a variety of difficult inputs demonstrate the effectiveness of our tensor voting approach.

  12. A Bayesian Network Schema for Lessening Database Inference

    National Research Council Canada - National Science Library

    Chang, LiWu; Moskowitz, Ira S

    2001-01-01

    .... The authors introduce a formal schema for database inference analysis, based upon a Bayesian network structure, which identifies critical parameters involved in the inference problem and represents...

  13. Exercise-Induced Muscle Damage and Hypertrophy: A Closer Look Reveals the Jury is Still Out

    OpenAIRE

    Schoenfeld, Brad; Contreras, Bret

    2018-01-01

    This letter is a response to the paper by Damas et al (2017) titled, “The development of skeletal muscle hypertrophy through resistance training: the role of muscle damage and muscle protein synthesis,” which, in part, endeavored to review the role of exercise-induced muscle damage on muscle hypertrophy. We feel there are a number of issues in interpretation of research and extrapolation that preclude drawing the inference expressed in the paper that muscle damage neither explains nor potenti...

  14. Identification of transcriptional macromolecular associations in human bone using browser based in silico analysis in a giant correlation matrix.

    Science.gov (United States)

    Reppe, Sjur; Sachse, Daniel; Olstad, Ole K; Gautvik, Vigdis T; Sanderson, Paul; Datta, Harish K; Berg, Jens P; Gautvik, Kaare M

    2013-03-01

    Intracellular signaling is critically dependent on gene regulatory networks comprising physical molecular interactions. Presently, there is a lack of comprehensive databases for most human tissue types to verify such macromolecular interactions. We present a user-friendly browser which helps to identify functional macromolecular interactions in human bone as significant correlations at the transcriptional level. The molecular skeletal phenotype has been characterized by transcriptome analysis of iliac crest bone biopsies from 84 postmenopausal women through quantification of ~23,000 mRNA species. When the signal levels were inter-correlated, an array containing >260 million correlations was generated, thus capturing the human bone interactome at the RNA level. The matrix correlations and p values were made easily accessible through a freely available online browser. We show that significant correlations within the giant matrix are reproduced in a replication set of 13 male vertebral biopsies. The identified correlations differ somewhat from transcriptional interactions identified in cell-culture experiments and transgenic mice, demonstrating that care should be taken in extrapolating such results to the in vivo situation in human bone. The giant matrix and web browser are a valuable tool for easy access to the human bone transcriptome and to molecular interactions represented as significant correlations at the RNA level. The browser and matrix should be a valuable hypothesis-generating tool for the identification of regulatory mechanisms and serve as a library of transcript relationships in human bone, a relatively inaccessible tissue. Copyright © 2012 Elsevier Inc. All rights reserved.
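
    The "giant matrix" described here is, at its core, an all-against-all correlation of transcript signal levels across biopsies. A toy version of its construction (random expression values stand in for the ~23,000-probe, 84-biopsy data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy expression matrix: rows = transcripts, columns = biopsies
n_transcripts, n_biopsies = 200, 84
expr = rng.normal(size=(n_transcripts, n_biopsies))

# All-against-all Pearson correlation of transcript signal levels; the real
# matrix (~23,000 x ~23,000) holds the >260 million entries the paper browses.
corr = np.corrcoef(expr)

# A candidate "macromolecular association": the strongest off-diagonal pair
off = corr - np.eye(n_transcripts)            # mask self-correlations
i, j = np.unravel_index(np.abs(off).argmax(), off.shape)
```

    On real data, each correlation would additionally carry a p value (and a multiple-testing correction, given the matrix size) before being reported as significant.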

  15. AR-NE3A, a New Macromolecular Crystallography Beamline for Pharmaceutical Applications at the Photon Factory

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi

    2010-01-01

    Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample-exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed: sample exchange, centering, data collection, and data processing are carried out automatically based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.

  16. Implementation of fast macromolecular proton fraction mapping on 1.5 and 3 Tesla clinical MRI scanners: preliminary experience

    Science.gov (United States)

    Yarnykh, V.; Korostyshevskaya, A.

    2017-08-01

    Macromolecular proton fraction (MPF) is a biophysical parameter describing the amount of macromolecular protons involved in magnetization exchange with water protons in tissues. MPF is of significant interest as a magnetic resonance imaging (MRI) biomarker of myelin for clinical applications. A recent fast MPF mapping method enabled clinical translation of MPF measurements due to time-efficient acquisition based on the single-point constrained fit algorithm. However, previous MPF mapping applications utilized only 3 Tesla MRI scanners and modified pulse sequences, which are not commonly available. This study aimed to test the feasibility of MPF mapping implementation on a 1.5 Tesla clinical scanner using standard manufacturer's sequences and to compare the performance of this method between 1.5 and 3 Tesla scanners. MPF mapping was implemented on 1.5 and 3 Tesla MRI units of one manufacturer with either optimized custom-written or standard product pulse sequences. Whole-brain three-dimensional MPF maps obtained from a single volunteer were compared between field strengths and implementation options. MPF maps demonstrated similar quality at both field strengths. MPF values in segmented brain tissues and specific anatomic regions appeared in close agreement. This experiment demonstrates the feasibility of fast MPF mapping using standard sequences on 1.5 T and 3 T clinical scanners.

  17. Evaluation of macromolecular electron-density map quality using the correlation of local r.m.s. density

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    The correlation of local r.m.s. density is shown to be a good measure of the presence of distinct solvent and macromolecule regions in macromolecular electron-density maps. It has recently been shown that the standard deviation of local r.m.s. electron density is a good indicator of the presence of distinct regions of solvent and protein in macromolecular electron-density maps [Terwilliger & Berendzen (1999). Acta Cryst. D55, 501–505]. Here, it is demonstrated that a complementary measure, the correlation of local r.m.s. density in adjacent regions of the unit cell, is also a good measure of the presence of distinct solvent and protein regions. The correlation of local r.m.s. density is essentially a measure of how contiguous the solvent (and protein) regions are in the electron-density map. This statistic can be calculated in real space or in reciprocal space and has potential uses in the evaluation of heavy-atom solutions in the MIR and MAD methods as well as in the evaluation of trial phase sets in ab initio phasing procedures.
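The statistic can be sketched in real space with numpy/scipy on a synthetic map; the window size and the choice of comparing regions offset along one axis are illustrative assumptions, not the authors' exact protocol:

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)
# Synthetic "map": a contiguous protein-like blob with structure,
# surrounded by flat solvent-like noise.
grid = rng.normal(scale=0.1, size=(32, 32, 32))
grid[8:24, 8:24, 8:24] += rng.normal(scale=1.0, size=(16, 16, 16))

# Local r.m.s. density: windowed standard deviation of the map
# (window of 5 grid points is an assumption for illustration).
w = 5
local_mean = uniform_filter(grid, size=w)
local_var = uniform_filter(grid**2, size=w) - local_mean**2
local_rms = np.sqrt(np.clip(local_var, 0.0, None))

# Correlation of local r.m.s. density in adjacent regions: compare each
# point with its neighbour one window-width away along x. A contiguous
# solvent/protein partition gives a high positive correlation.
corr = np.corrcoef(local_rms[:-w].ravel(), local_rms[w:].ravel())[0, 1]
```

For a map with contiguous solvent and macromolecule regions, `corr` is strongly positive; for a scrambled map it falls toward zero.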

  18. Stably engineered nanobubbles and ultrasound - An effective platform for enhanced macromolecular delivery to representative cells of the retina.

    Directory of Open Access Journals (Sweden)

    Sachin S Thakur

    Herein we showcase the potential of ultrasound-responsive nanobubbles in enhancing macromolecular permeation through layers of the retina, ultimately leading to significant and direct intracellular delivery; this being effectively demonstrated across three relevant and distinct retinal cell lines. Stably engineered nanobubbles of a highly homogenous and echogenic nature were fully characterised using dynamic light scattering, B-scan ultrasound and transmission electron microscopy (TEM). The nanobubbles appeared as spherical liposome-like structures under TEM, accompanied by an opaque luminal core and a darkened corona around their periphery, with both features indicative of efficient gas entrapment and adsorption, respectively. A nanobubble +/- ultrasound sweeping study was conducted next, which determined the maximum tolerated dose for each cell line. Detection of underlying cellular stress was verified using the biomarker heat shock protein 70, measured before and after treatment with optimised ultrasound. Next, with safety of nanobubbles and optimised ultrasound demonstrated, each human- or mouse-derived cell population was incubated with biotinylated rabbit IgG in the presence and absence of ultrasound +/- nanobubbles. Intracellular delivery of antibody in each cell type was then quantified using Cy3-streptavidin. Nanobubbles and optimised ultrasound were found to be negligibly toxic across all cell lines tested. Macromolecular internalisation was achieved to significant, yet varying, degrees in all three cell lines. The results of this study pave the way towards better understanding of the mechanisms underlying cellular responsiveness to ultrasound-triggered drug delivery in future ex vivo and in vivo models of the posterior eye.

  19. A formal model of interpersonal inference

    Directory of Open Access Journals (Sweden)

    Michael Moutoussis

    2014-03-01

    Introduction: We propose that active Bayesian inference – a general framework for decision-making – can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regard to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: 1. Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to 'mentalising' in the psychological literature, is based upon the outcomes of interpersonal exchanges. 2. We show how some well-known social-psychological phenomena (e.g. self-serving biases) can be explained in terms of active interpersonal inference. 3. Mentalising naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes-optimal framework for modelling intersubject variability in mentalising during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalising is distorted.

  20. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
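The core inference loop can be sketched in one dimension: a cheap analytic forward model stands in for the finite-element surrogate, and plain Metropolis stands in for the DRAM sampler the paper adopts. The sensor layout, decay-shaped forward model, and noise level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy forward model standing in for the costly finite-element solve:
# predicted strain at each sensor decays with distance to damage site x.
sensors = np.linspace(0.0, 1.0, 8)
def forward(x):
    return np.exp(-10.0 * (sensors - x) ** 2)

true_x, sigma = 0.6, 0.05
data = forward(true_x) + rng.normal(scale=sigma, size=sensors.size)

def log_post(x):
    if not 0.0 <= x <= 1.0:           # uniform prior on [0, 1]
        return -np.inf
    r = data - forward(x)
    return -0.5 * np.sum(r**2) / sigma**2

# Plain Metropolis sampling of the posterior over the damage location.
x, lp = 0.5, log_post(0.5)
samples = []
for _ in range(5000):
    cand = x + rng.normal(scale=0.05)
    lp_cand = log_post(cand)
    if np.log(rng.uniform()) < lp_cand - lp:   # accept/reject step
        x, lp = cand, lp_cand
    samples.append(x)

est = np.mean(samples[1000:])   # posterior mean after burn-in
```

The spread of the retained samples, not just `est`, is what quantifies the uncertainty in the diagnosis.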

  1. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  2. SASSIE: A program to study intrinsically disordered biological molecules and macromolecular ensembles using experimental scattering restraints

    Science.gov (United States)

    Curtis, Joseph E.; Raghunandan, Sindhu; Nanda, Hirsh; Krueger, Susan

    2012-02-01

    A program to construct ensembles of biomolecular structures that are consistent with experimental scattering data is described. Specifically, we generate an ensemble of biomolecular structures by varying sets of backbone dihedral angles that are then filtered using experimentally determined restraints to rapidly determine structures that have scattering profiles consistent with scattering data. We discuss an application of these tools to predict a set of structures for the HIV-1 Gag protein, an intrinsically disordered protein, that are consistent with small-angle neutron scattering experimental data. We have assembled these algorithms into a program called SASSIE for structure generation, visualization, and analysis of intrinsically disordered proteins and other macromolecular ensembles using neutron and X-ray scattering restraints. Program summary: Program title: SASSIE. Catalogue identifier: AEKL_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKL_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License v3. No. of lines in distributed program, including test data, etc.: 3 991 624. No. of bytes in distributed program, including test data, etc.: 826. Distribution format: tar.gz. Programming language: Python, C/C++, Fortran. Computer: PC/Mac. Operating system: 32- and 64-bit Linux (Ubuntu 10.04, Centos 5.6) and Mac OS X (10.6.6). RAM: 1 GB. Classification: 3. External routines: Python 2.6.5, numpy 1.4.0, swig 1.3.40, scipy 0.8.0, Gnuplot-py-1.8, Tcl 8.5, Tk 8.5; Mac installation requires aquaterm 1.0 (or X window system) and Xcode 3 development tools. Nature of problem: The lack of open source software to generate structures of disordered biological molecules that subsequently allow for the comparison of computational and experimental results is limiting the use of scattering resources. Solution method: Starting with an all-atom model of a protein, for example, users can input

  3. Liberation of microbial substrates from macromolecular organic matter by non-supercritical CO2

    Science.gov (United States)

    Sauer, P.; Glombitza, C.; Kallmeyer, J.

    2012-12-01

    The worldwide search for suitable underground storage formations for CO2 also considers coal-bearing strata. CO2 is already injected into coal seams for enhanced recovery of coal bed methane. However, the geochemical and microbiological effects of increased CO2 concentrations on organic-matter-rich formations are rarely investigated. The injected CO2 will dissolve in the pore water, causing a decrease in pH and resulting in acidic formation waters. Low molecular weight organic acids (LMWOAs) are chemically bound to the macromolecular matrix of sedimentary organic matter and may be liberated by hydrolysis, which is enhanced under acidic conditions. Recent investigations outlined the importance of LMWOAs as a feedstock for subsurface microbial life [1]. Therefore, injection of CO2 into coal formations may result in enhanced nutrient supply for subsurface microbes. To investigate the effects of highly CO2-saturated waters on the release of LMWOAs from coal, we developed an inexpensive high-pressure, high-temperature system that allows manipulating the concentration of dissolved gases at up to 60 MPa and 120 °C, respectively. The sample is placed in a flexible, gas-tight and inert PVDF sleeve, separating it from the pressure fluid and allowing for subsampling without loss of pressure. Lignite samples from the DEBITS-1 well, Waikato Basin, NZ, and the Welzow-Süd open-cast mine, Niederlausitz, Germany, were extracted at 90 °C and 5 MPa with either pure water, CO2-saturated water, or CO2/NO2- or CO2/SO2-saturated water. Subsamples were taken at different time points during the 72 h extraction. Extraction of LMWOAs from coal samples with our pressurised system resulted in yields that were up to four times higher than those reported for Soxhlet extraction [2]. These higher yields may be explained by the fact that during Soxhlet extraction the sample only gets into contact with freshly distilled water, whereas in our system the extraction fluid is circulated, resulting in

  4. Deep Learning for Population Genetic Inference.

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.
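The likelihood-free idea above, training a network to map summary statistics to parameters on simulated pairs, can be sketched with a tiny one-hidden-layer network in plain numpy. The simulator, network size, and learning rate below are illustrative assumptions; the paper's networks and summary statistics are far richer:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for likelihood-free inference: simulate (statistics, parameter)
# pairs, then regress the parameter on the statistics. Here the "parameter"
# is the mean of the 10 "summary statistics" plus noise.
S = rng.normal(size=(500, 10))
theta = S.mean(axis=1, keepdims=True) + 0.05 * rng.normal(size=(500, 1))

# One-hidden-layer network, squared-error loss, plain gradient descent.
W1 = 0.1 * rng.normal(size=(10, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.normal(size=(16, 1));  b2 = np.zeros(1)

def mse():
    h = np.tanh(S @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - theta) ** 2))

loss_before = mse()
lr = 0.05
for _ in range(300):
    h = np.tanh(S @ W1 + b1)            # forward pass
    g = 2 * (h @ W2 + b2 - theta) / len(S)   # dLoss/dPrediction
    gW2 = h.T @ g; gb2 = g.sum(axis=0)       # backprop through output layer
    gh = g @ W2.T * (1 - h**2)               # backprop through tanh
    gW1 = S.T @ gh; gb1 = gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
loss_after = mse()
```

Once trained, the network is applied to the summary statistics of the observed genomes to read off parameter estimates without ever evaluating the intractable likelihood.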

  5. Deep Learning for Population Genetic Inference.

    Directory of Open Access Journals (Sweden)

    Sara Sheehan

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  6. Deep Learning for Population Genetic Inference

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908

  7. Inferring Phylogenetic Networks Using PhyloNet.

    Science.gov (United States)

    Wen, Dingqiao; Yu, Yun; Zhu, Jiafan; Nakhleh, Luay

    2018-07-01

    PhyloNet was released in 2008 as a software package for representing and analyzing phylogenetic networks. At the time of its release, the main functionalities in PhyloNet consisted of measures for comparing network topologies and a single heuristic for reconciling gene trees with a species tree. Since then, PhyloNet has grown significantly. The software package now includes a wide array of methods for inferring phylogenetic networks from data sets of unlinked loci while accounting for both reticulation (e.g., hybridization) and incomplete lineage sorting. In particular, PhyloNet now allows for maximum parsimony, maximum likelihood, and Bayesian inference of phylogenetic networks from gene tree estimates. Furthermore, Bayesian inference directly from sequence data (sequence alignments or biallelic markers) is implemented. Maximum parsimony is based on an extension of the "minimizing deep coalescences" criterion to phylogenetic networks, whereas maximum likelihood and Bayesian inference are based on the multispecies network coalescent. All methods allow for multiple individuals per species. As computing the likelihood of a phylogenetic network is computationally hard, PhyloNet allows for evaluation and inference of networks using a pseudolikelihood measure. PhyloNet summarizes the results of the various analyses and generates phylogenetic networks in the extended Newick format that is readily viewable by existing visualization software.

  8. Goal inferences about robot behavior : goal inferences and human response behaviors

    NARCIS (Netherlands)

    Broers, H.A.T.; Ham, J.R.C.; Broeders, R.; De Silva, P.; Okada, M.

    2014-01-01

    This explorative research focused on the goal inferences human observers draw based on a robot's behavior, and the extent to which those inferences predict people's behavior in response to that robot. Results show that different robot behaviors cause different response behavior from people.

  9. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
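The "statistically optimal Bayesian inference" benchmark in such coin tasks is typically the conjugate Beta-Binomial update, sketched here on hypothetical data (a uniform prior and 7 heads in 10 flips are illustrative choices, not the paper's stimuli):

```python
# Beta-Binomial update for the bias of a coin: start from a uniform
# Beta(1, 1) prior and observe 7 heads in 10 flips.
a, b = 1, 1
heads, flips = 7, 10
a_post, b_post = a + heads, b + flips - heads   # posterior is Beta(8, 4)

# Posterior mean of the bias; it is also the posterior predictive
# probability that the next flip lands heads.
post_mean = a_post / (a_post + b_post)
```

Human responses can then be compared against `post_mean` (and the full Beta posterior) to ask how closely individual inference tracks the Bayesian ideal.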

  10. Explanatory Preferences Shape Learning and Inference.

    Science.gov (United States)

    Lombrozo, Tania

    2016-10-01

    Explanations play an important role in learning and inference. People often learn by seeking explanations, and they assess the viability of hypotheses by considering how well they explain the data. An emerging body of work reveals that both children and adults have strong and systematic intuitions about what constitutes a good explanation, and that these explanatory preferences have a systematic impact on explanation-based processes. In particular, people favor explanations that are simple and broad, with the consequence that engaging in explanation can shape learning and inference by leading people to seek patterns and favor hypotheses that support broad and simple explanations. Given the prevalence of explanation in everyday cognition, understanding explanation is therefore crucial to understanding learning and inference. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Fuzzy logic controller using different inference methods

    International Nuclear Information System (INIS)

    Liu, Z.; De Keyser, R.

    1994-01-01

    In this paper the design of fuzzy controllers using different inference methods is introduced. The configuration of the fuzzy controllers includes a general rule base which is a collection of fuzzy PI or PD rules, a triangular fuzzy data model and a centre-of-gravity defuzzification algorithm. The generalized modus ponens (GMP) is used with the minimum operator of the triangular norm. Under the sup-min inference rule, six fuzzy implication operators are employed to calculate the fuzzy look-up tables for each rule base. The performance is tested in simulated systems with MATLAB/SIMULINK. Results show the effects of using fuzzy controllers with different inference methods when applied to different test processes
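The sup-min (Mamdani) inference with triangular membership functions and centre-of-gravity defuzzification mentioned above can be sketched as follows; the two rules, their membership parameters, and the input value are illustrative assumptions:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    x = np.asarray(x, dtype=float)
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Two illustrative PI-style rules on the error e:
#   if e is Negative then u is Decrease; if e is Positive then u is Increase.
e = 0.3                       # crisp input
u = np.linspace(-1, 1, 201)   # output universe of discourse

w_neg = float(tri(e, -1.0, -0.5, 0.0))   # firing strength of rule 1
w_pos = float(tri(e, 0.0, 0.5, 1.0))     # firing strength of rule 2

# sup-min inference: clip each consequent at its rule's firing strength
# (min), then aggregate the clipped sets with max.
agg = np.maximum(np.minimum(w_neg, tri(u, -1.0, -0.5, 0.0)),
                 np.minimum(w_pos, tri(u, 0.0, 0.5, 1.0)))

# Centre-of-gravity defuzzification yields the crisp control output.
u_crisp = float(np.sum(u * agg) / np.sum(agg))
```

With `e = 0.3` only the Positive rule fires (strength 0.6), so the output is the centroid of the clipped Increase triangle.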

  12. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
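One common formalisation of the statistical distance between probability distributions referred to above is the Bhattacharyya angle (which variant the authors use is not stated here, so this is an assumption); it vanishes for identical distributions and reaches its maximum for non-overlapping ones:

```python
import numpy as np

def stat_distance(p, q):
    """Statistical distance d(p, q) = arccos( sum_i sqrt(p_i * q_i) )
    between two discrete probability distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    bc = np.sum(np.sqrt(p * q))               # Bhattacharyya coefficient
    return float(np.arccos(np.clip(bc, -1.0, 1.0)))

d_same = stat_distance([0.5, 0.5], [0.5, 0.5])   # identical: distance 0
d_max = stat_distance([1.0, 0.0], [0.0, 1.0])    # disjoint: distance pi/2
```

Translated to quantum states, the analogous quantity is governed by the overlap (matrix element) between the states, which is the connection the abstract draws.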

  13. A Learning Algorithm for Multimodal Grammar Inference.

    Science.gov (United States)

    D'Ulizia, A; Ferri, F; Grifoni, P

    2011-12-01

    The high costs of development and maintenance of multimodal grammars in integrating and understanding input in multimodal interfaces lead to the investigation of novel algorithmic solutions in automating grammar generation and in updating processes. Many algorithms for context-free grammar inference have been developed in the natural language processing literature. An extension of these algorithms toward the inference of multimodal grammars is necessary for multimodal input processing. In this paper, we propose a novel grammar inference mechanism that allows us to learn a multimodal grammar from its positive samples of multimodal sentences. The algorithm first generates the multimodal grammar that is able to parse the positive samples of sentences and, afterward, makes use of two learning operators and the minimum description length metrics in improving the grammar description and in avoiding the over-generalization problem. The experimental results highlight the acceptable performances of the algorithm proposed in this paper since it has a very high probability of parsing valid sentences.

  14. Examples in parametric inference with R

    CERN Document Server

    Dixit, Ulhas Jayram

    2016-01-01

    This book discusses examples in parametric inference with R. Combining basic theory with modern approaches, it presents the latest developments and trends in statistical inference for students who do not have an advanced mathematical and statistical background. The topics discussed in the book are fundamental and common to many fields of statistical inference and thus serve as a point of departure for in-depth study. The book is divided into eight chapters: Chapter 1 provides an overview of topics on sufficiency and completeness, while Chapter 2 briefly discusses unbiased estimation. Chapter 3 focuses on the study of moments and maximum likelihood estimators, and Chapter 4 presents bounds for the variance. In Chapter 5, topics on consistent estimator are discussed. Chapter 6 discusses Bayes, while Chapter 7 studies some more powerful tests. Lastly, Chapter 8 examines unbiased and other tests. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory cou...

  15. Grammatical inference algorithms, routines and applications

    CERN Document Server

    Wieczorek, Wojciech

    2017-01-01

    This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.

  16. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis is on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...

  17. Constructing irregular surfaces to enclose macromolecular complexes for mesoscale modeling using the discrete surface charge optimization (DISCO) algorithm.

    Science.gov (United States)

    Zhang, Qing; Beard, Daniel A; Schlick, Tamar

    2003-12-01

    Salt-mediated electrostatic interactions play an essential role in biomolecular structures and dynamics. Because macromolecular systems modeled at atomic resolution contain thousands of solute atoms, the electrostatic computations constitute an expensive part of the force and energy calculations. Implicit solvent models are one way to simplify the model and the associated calculations, but they are generally used in combination with standard atomic models for the solute. To approximate electrostatic interactions in models on the polymer level (e.g., supercoiled DNA) that are simulated over long times (e.g., milliseconds) using Brownian dynamics, Beard and Schlick have developed the DiSCO (Discrete Surface Charge Optimization) algorithm. DiSCO represents a macromolecular complex by a few hundred discrete charges on a surface enclosing the system, modeled by the Debye-Hückel (screened Coulombic) approximation to the Poisson-Boltzmann equation, and treats the salt solution as continuum solvation. DiSCO can represent the nucleosome core particle (>12,000 atoms), for example, by 353 discrete surface charges distributed on the surfaces of a large disk for the nucleosome core particle and a slender cylinder for the histone tail; the charges are optimized with respect to the Poisson-Boltzmann solution for the electric field, yielding an approximately 5.5% residual. Because regular surfaces enclosing macromolecules are not sufficiently general and may be suboptimal for certain systems, we develop a general method to construct irregular models tailored to the geometry of macromolecules. We also compare charge optimization based on both the electric field and electrostatic potential refinement. Results indicate that irregular surfaces can lead to a more accurate approximation (lower residuals), and the refinement in terms of the electric field is more robust. We also show that surface smoothing for irregular models is important, that the charge optimization (by the TNPACK
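The Debye-Hückel (screened Coulombic) form that DiSCO charges approximate can be sketched in simplified units (prefactors and the value of the screening constant kappa are omitted/assumed here for illustration):

```python
import numpy as np

def debye_huckel(q, r, kappa):
    """Screened Coulomb potential, phi(r) ~ q * exp(-kappa * r) / r,
    where 1/kappa is the Debye screening length set by the salt
    concentration (constants and units simplified)."""
    return q * np.exp(-kappa * r) / r

r = np.linspace(0.5, 10.0, 50)
phi_screened = debye_huckel(1.0, r, kappa=1.0)   # with salt screening
phi_coulomb = debye_huckel(1.0, r, kappa=0.0)    # kappa = 0: bare Coulomb
```

Screening strictly reduces the potential at every distance, which is why a few hundred optimized surface charges can reproduce the far field of thousands of atoms.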

  18. Improved Inference of Heteroscedastic Fixed Effects Models

    Directory of Open Access Journals (Sweden)

    Afshan Saeed

    2016-12-01

    Heteroscedasticity is a serious problem that distorts estimation and testing in the panel data model (PDM). Arellano (1987) proposed the White (1980) estimator for PDMs with heteroscedastic errors, but it provides erroneous inference for data sets that include high-leverage points. In this paper, our aim is to improve the heteroscedasticity-consistent covariance matrix estimator (HCCME) for panel data sets with high-leverage points. To draw robust inference for the PDM, we focus on improving the kernel bootstrap estimators proposed by Racine and MacKinnon (2007). A Monte Carlo scheme is used to assess the results.
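The leverage problem the abstract describes can be illustrated on a simple cross-section: White's HC0 estimator weights squared residuals directly, while the leverage-corrected HC3 variant rescales them by (1 − hᵢ)², inflating the variance contribution of high-leverage observations. This sketch is not the authors' panel/kernel-bootstrap method; the data-generating process and the single high-leverage point are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
x[0] = 8.0                      # one high-leverage point
X = np.column_stack([np.ones(n), x])
# Heteroscedastic errors: noise scale grows with |x|
y = 1.0 + 2.0 * x + rng.normal(size=n) * (1 + np.abs(x))

beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta

XtXi = np.linalg.inv(X.T @ X)
h = np.diag(X @ XtXi @ X.T)     # leverages

def sandwich(w):
    """Sandwich covariance with per-observation weights w."""
    return XtXi @ (X.T * w) @ X @ XtXi

hc0 = sandwich(e ** 2)                  # White (1980)
hc3 = sandwich((e / (1 - h)) ** 2)      # leverage-corrected (HC3)

se0 = np.sqrt(np.diag(hc0))
se3 = np.sqrt(np.diag(hc3))
```

With a point of leverage around 0.4 in the sample, HC3 reports a visibly larger slope standard error than HC0, which is exactly the correction high-leverage data demand.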

  19. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point...... process, where the germs are the centres and the marks are the associated radii of the discs. We propose to use a recent parametric class of interacting disc process models, where the minimal sufficient statistic depends on various geometric properties of the random set, and the density is specified......-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  20. IMAGINE: Interstellar MAGnetic field INference Engine

    Science.gov (United States)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.

  1. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength...... and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise...
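The cross-mapping idea can be sketched on unidirectionally coupled logistic maps, a standard CCM testbed: when x drives y, the shadow manifold of y "remembers" x, so cross-mapping from y recovers x with high skill, while the reverse direction does not. The map parameters, coupling strength and the simplified nearest-neighbour weighting below are illustrative assumptions, not the exact algorithm or settings of the paper:

```python
import numpy as np

# Unidirectionally coupled logistic maps: x drives y, y does not affect x
n = 1000
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.4 * x[t])

def shadow(s, E, tau=1):
    """Delay-coordinate embedding of a scalar series."""
    m = len(s) - (E - 1) * tau
    return np.column_stack([s[i * tau: i * tau + m] for i in range(E)])

def ccm_skill(source, target, E=3):
    """Estimate `target` from nearest neighbours on the shadow manifold
    of `source`; high skill suggests `target` causally drives `source`."""
    M = shadow(source, E)
    tgt = target[(E - 1):]
    D = np.linalg.norm(M[:, None, :] - M[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    idx = np.argsort(D, axis=1)[:, : E + 1]      # E+1 nearest neighbours
    rows = np.arange(len(M))[:, None]
    w = np.exp(-D[rows, idx] / (D[rows, idx[:, :1]] + 1e-12))
    w /= w.sum(axis=1, keepdims=True)
    est = (w * tgt[idx]).sum(axis=1)
    return np.corrcoef(est, tgt)[0, 1]

skill_y_xmap_x = ccm_skill(y, x)   # y's manifold encodes its driver x
skill_x_xmap_y = ccm_skill(x, y)   # x carries no information about y
```

The asymmetry between the two skills is the CCM signature of the causal direction; the noise and coupling-strength failure modes discussed in the abstract appear when this asymmetry is degraded.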

  2. Macromolecular crowding: chemistry and physics meet biology (Ascona, Switzerland, 10-14 June 2012)

    Science.gov (United States)

    Foffi, G.; Pastore, A.; Piazza, F.; Temussi, P. A.

    2013-08-01

    More than 60 years of biochemical and biophysical studies have accustomed us to think of proteins as highly purified entities that act in isolation, more or less freely diffusing until they find their cognate partner to bind to. While in vitro experiments that reproduce these conditions largely remain the only way to investigate the intrinsic properties of molecules, this approach ignores an important factor: in their natural milieu, proteins are surrounded by several other molecules of different chemical nature, and this crowded environment can considerably modify their behaviour. About 40% of the cellular volume on average is occupied by all sorts of molecules. Furthermore, biological macromolecules live and operate in an extremely structured and complex environment within the cell (endoplasmic reticulum, Golgi apparatus, cytoskeletal structures, etc). Hence, to further complicate the picture, the interior of the cell is by no means a simply crowded medium, rather, a most crowded and confining one. In recent times, several approaches have been developed in the attempt to take into account important factors such as the ones mentioned above, at both theoretical and experimental levels, so that this field of research is now emerging as one of the most thriving in molecular and cell biology (see figure 1). Figure 1. Left: number of articles containing the word 'crowding' as a keyword limited to the biological and chemical science domains (source: ISI Web of Science). The arrow flags the 2003 'EMBO Workshop on Biological Implications of Macromolecular Crowding' (EMBO, 2012). Right: number of citations to articles containing the word 'crowding' limited to the same domains (bars) and an exponential regression curve (source: Elsevier Scopus).
To promote the importance of molecular crowding and confinement and provide researchers active in this field an interdisciplinary forum for meeting and exchanging ideas, we recently organized an international conference

  3. Macromolecular crowding: chemistry and physics meet biology (Ascona, Switzerland, 10-14 June 2012).

    Science.gov (United States)

    Foffi, G; Pastore, A; Piazza, F; Temussi, P A

    2013-08-02

    More than 60 years of biochemical and biophysical studies have accustomed us to think of proteins as highly purified entities that act in isolation, more or less freely diffusing until they find their cognate partner to bind to. While in vitro experiments that reproduce these conditions largely remain the only way to investigate the intrinsic properties of molecules, this approach ignores an important factor: in their natural milieu, proteins are surrounded by several other molecules of different chemical nature, and this crowded environment can considerably modify their behaviour. About 40% of the cellular volume on average is occupied by all sorts of molecules. Furthermore, biological macromolecules live and operate in an extremely structured and complex environment within the cell (endoplasmic reticulum, Golgi apparatus, cytoskeletal structures, etc). Hence, to further complicate the picture, the interior of the cell is by no means a simply crowded medium, rather, a most crowded and confining one. In recent times, several approaches have been developed in the attempt to take into account important factors such as the ones mentioned above, at both theoretical and experimental levels, so that this field of research is now emerging as one of the most thriving in molecular and cell biology (see figure 1). Figure 1. Left: number of articles containing the word 'crowding' as a keyword limited to the biological and chemical science domains (source: ISI Web of Science). The arrow flags the 2003 'EMBO Workshop on Biological Implications of Macromolecular Crowding' (EMBO, 2012). Right: number of citations to articles containing the word 'crowding' limited to the same domains (bars) and an exponential regression curve (source: Elsevier Scopus). To promote the importance of molecular crowding and confinement and provide researchers active in this field an interdisciplinary forum for meeting and exchanging ideas, we recently organized an international

  4. Radiation damage to mushrooms

    International Nuclear Information System (INIS)

    Sattler, P.W.

    1986-01-01

    This document contains newspaper cuttings and correspondence with various ministries in Hessen on the subject of radiation damage to mushrooms from the Odenwald area. The reader is given, amongst other things, detailed information on radiation damage to different types of mushroom in 1986. (MG) [de

  5. Animal damage to birch

    Science.gov (United States)

    James S. Jordan; Francis M. Rushmore

    1969-01-01

    A relatively few animal species are responsible for most of the reported damage to the birches. White-tailed deer, yellow-bellied sapsuckers, porcupines, moose, and hares are the major animals involved. We will review reports of damage, discuss the underlying causes, and describe possible methods of control. For example, heavy deer browsing that eliminates birch...

  6. Animal damage management handbook.

    Science.gov (United States)

    Hugh C. Black

    1994-01-01

    This handbook treats animal damage management (ADM) in the West in relation to forest, range, and recreation resources; predator management is not addressed. It provides a comprehensive reference of safe, effective, and practical methods for managing animal damage on National Forest System lands. Supporting information is included in references after each chapter and...

  7. Nuclear damage - civil liability

    International Nuclear Information System (INIS)

    Simoes, A.C.

    1980-01-01

    An analysis is made of the civil liability for nuclear damage since there is a need to adjust the existing rules to the new situations created. The conventions that set up the new disciplining rules not considered in the common law for the liability of nuclear damage are also mentioned. (A.L.) [pt

  8. DNA damage and autophagy

    International Nuclear Information System (INIS)

    Rodriguez-Rocha, Humberto; Garcia-Garcia, Aracely; Panayiotidis, Mihalis I.; Franco, Rodrigo

    2011-01-01

    Both exogenous and endogenous agents are a threat to DNA integrity. Exogenous environmental agents such as ultraviolet (UV) and ionizing radiation and genotoxic chemicals, as well as endogenous byproducts of metabolism including reactive oxygen species, can cause alterations in DNA structure (DNA damage). Unrepaired DNA damage has been linked to a variety of human disorders including cancer and neurodegenerative disease. Thus, efficient mechanisms to detect DNA lesions, signal their presence and promote their repair have evolved in cells. If DNA is effectively repaired, the DNA damage response is inactivated and normal cell functioning resumes. In contrast, when DNA lesions cannot be removed, chronic DNA damage triggers specific cell responses such as cell death and senescence. Recently, DNA damage has been shown to induce autophagy, a cellular catabolic process that maintains a balance between synthesis, degradation, and recycling of cellular components. However, the exact mechanisms by which DNA damage triggers autophagy are unclear. More importantly, the role of autophagy in the DNA damage response and cellular fate is unknown. In this review we analyze evidence that supports a role for autophagy as an integral part of the DNA damage response.

  9. Metabolic growth rate control in Escherichia coli may be a consequence of subsaturation of the macromolecular biosynthetic apparatus with substrates and catalytic components

    DEFF Research Database (Denmark)

    Jensen, Kaj Frank; Pedersen, Steen

    1990-01-01

    In this paper, the Escherichia coli cell is considered as a system designed for rapid growth, but limited by the medium. We propose that this very design causes the cell to become subsaturated with precursors and catalytic components at all levels of macromolecular biosynthesis and leads to a mol...

  10. Errors in macromolecular synthesis after stress. A study of the possible protective role of the small heat shock proteins

    NARCIS (Netherlands)

    Marin Vinader, L.

    2006-01-01

    The general goal of this thesis was to gain insight in what small heat shock proteins (sHsps) do with respect to macromolecular synthesis during a stressful situation in the cell. It is known that after a non-lethal heat shock, cells are better protected against a subsequent more severe heat shock,

  11. A fast band–Krylov eigensolver for macromolecular functional motion simulation on multicore architectures and graphics processors

    Energy Technology Data Exchange (ETDEWEB)

    Aliaga, José I., E-mail: aliaga@uji.es [Depto. Ingeniería y Ciencia de Computadores, Universitat Jaume I, Castellón (Spain); Alonso, Pedro [Departamento de Sistemas Informáticos y Computación, Universitat Politècnica de València (Spain); Badía, José M. [Depto. Ingeniería y Ciencia de Computadores, Universitat Jaume I, Castellón (Spain); Chacón, Pablo [Dept. Biological Chemical Physics, Rocasolano Physics and Chemistry Institute, CSIC, Madrid (Spain); Davidović, Davor [Rudjer Bošković Institute, Centar za Informatiku i Računarstvo – CIR, Zagreb (Croatia); López-Blanco, José R. [Dept. Biological Chemical Physics, Rocasolano Physics and Chemistry Institute, CSIC, Madrid (Spain); Quintana-Ortí, Enrique S. [Depto. Ingeniería y Ciencia de Computadores, Universitat Jaume I, Castellón (Spain)

    2016-03-15

    We introduce a new iterative Krylov subspace-based eigensolver for the simulation of macromolecular motions on desktop multithreaded platforms equipped with multicore processors and, possibly, a graphics accelerator (GPU). The method consists of two stages: the original problem is first reduced to a simpler band-structured form by means of a high-performance compute-intensive procedure. This is followed by a memory-intensive but low-cost Krylov iteration, which is off-loaded to the GPU by means of an efficient data-parallel kernel. The experimental results reveal the performance of the new eigensolver. Concretely, when applied to the simulation of macromolecules with a few thousand degrees of freedom and a small to moderate number of eigenpairs to be computed, the new solver outperforms other methods implemented as part of high-performance numerical linear algebra packages for multithreaded architectures.
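The Krylov stage of such a solver rests on the Lanczos iteration, which projects a large symmetric matrix onto a small tridiagonal one whose extremal eigenvalues (the Ritz values) converge quickly, so only a few eigenpairs need to be extracted. A generic sketch follows; this is not the authors' band-reduction solver, and the test matrix and dimensions are illustrative:

```python
import numpy as np

def lanczos_extremal(A, m, rng):
    """Plain Lanczos with full reorthogonalization: build an m-step Krylov
    basis and return the Ritz values of the tridiagonal projection."""
    n = A.shape[0]
    Q = np.zeros((n, m + 1))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    q = rng.normal(size=n)
    Q[:, 0] = q / np.linalg.norm(q)
    for j in range(m):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # Full reorthogonalization keeps the basis numerically orthonormal
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-12:
            break
        Q[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta[: m - 1], 1) + np.diag(beta[: m - 1], -1)
    return np.linalg.eigvalsh(T)

rng = np.random.default_rng(2)
# Symmetric test matrix with known spectrum 1..120
n = 120
Qr, _ = np.linalg.qr(rng.normal(size=(n, n)))
A = Qr @ np.diag(np.arange(1.0, n + 1)) @ Qr.T

ritz = lanczos_extremal(A, m=80, rng=rng)
```

The smallest and largest Ritz values converge to the true extremal eigenvalues long before the Krylov space spans the full matrix, which is what makes the memory-bound iteration a good fit for a GPU off-load.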

  12. The macromolecular complex of ICP and falcipain-2 from Plasmodium: preparation, crystallization and preliminary X-ray diffraction analysis

    International Nuclear Information System (INIS)

    Hansen, Guido; Schwarzloh, Britta; Rennenberg, Annika; Heussler, Volker T.; Hilgenfeld, Rolf

    2011-01-01

    The macromolecular complex of ICP (inhibitor of cysteine proteases) from P. berghei and falcipain-2 from P. falciparum has been prepared and crystallized, and a diffraction data set has been collected to a resolution of 2.6 Å. The malaria parasite Plasmodium depends on the tight control of cysteine-protease activity throughout its life cycle. Recently, the characterization of a new class of potent inhibitors of cysteine proteases (ICPs) secreted by Plasmodium has been reported. Here, the recombinant production, purification and crystallization of the inhibitory C-terminal domain of ICP from P. berghei in complex with the P. falciparum haemoglobinase falcipain-2 is described. The 1:1 complex was crystallized in space group P4₃, with unit-cell parameters a = b = 71.15, c = 120.09 Å. A complete diffraction data set was collected to a resolution of 2.6 Å.

  13. A new on-axis multimode spectrometer for the macromolecular crystallography beamlines of the Swiss Light Source

    International Nuclear Information System (INIS)

    Owen, Robin L.; Pearson, Arwen R.; Meents, Alke; Boehler, Pirmin; Thominet, Vincent; Schulze-Briese, Clemens

    2009-01-01

    Complementary techniques greatly aid the interpretation of macromolecule structures to yield functional information, and can also help to track radiation-induced changes. A new on-axis spectrometer being integrated into the macromolecular crystallography beamlines of the Swiss Light Source is presented. X-ray crystallography at third-generation synchrotron sources permits tremendous insight into the three-dimensional structure of macromolecules. Additional information is, however, often required to aid the transition from structure to function. In situ spectroscopic methods such as UV–Vis absorption and (resonance) Raman can provide this, and can also provide a means of detecting X-ray-induced changes. Here, preliminary results are introduced from an on-axis UV–Vis absorption and Raman multimode spectrometer currently being integrated into the beamline environment at X10SA of the Swiss Light Source. The continuing development of the spectrometer is also outlined

  14. A fast band–Krylov eigensolver for macromolecular functional motion simulation on multicore architectures and graphics processors

    International Nuclear Information System (INIS)

    Aliaga, José I.; Alonso, Pedro; Badía, José M.; Chacón, Pablo; Davidović, Davor; López-Blanco, José R.; Quintana-Ortí, Enrique S.

    2016-01-01

    We introduce a new iterative Krylov subspace-based eigensolver for the simulation of macromolecular motions on desktop multithreaded platforms equipped with multicore processors and, possibly, a graphics accelerator (GPU). The method consists of two stages: the original problem is first reduced to a simpler band-structured form by means of a high-performance compute-intensive procedure. This is followed by a memory-intensive but low-cost Krylov iteration, which is off-loaded to the GPU by means of an efficient data-parallel kernel. The experimental results reveal the performance of the new eigensolver. Concretely, when applied to the simulation of macromolecules with a few thousand degrees of freedom and a small to moderate number of eigenpairs to be computed, the new solver outperforms other methods implemented as part of high-performance numerical linear algebra packages for multithreaded architectures.

  15. Recent Advances in the Analysis of Macromolecular Interactions Using the Matrix-Free Method of Sedimentation in the Analytical Ultracentrifuge

    Directory of Open Access Journals (Sweden)

    Stephen E. Harding

    2015-03-01

    Sedimentation in the analytical ultracentrifuge is a matrix-free solution technique, with no immobilisation, columns, or membranes required, and can be used to study self-association and complex or “hetero”-interactions, stoichiometry, reversibility and interaction strength of a wide variety of macromolecular types and across a very large dynamic range (dissociation constants from 10⁻¹² M to 10⁻¹ M). We extend an earlier review specifically highlighting advances in sedimentation velocity and sedimentation equilibrium in the analytical ultracentrifuge applied to protein interactions and mucoadhesion, and review recent applications in protein self-association (tetanus toxoid, agrin), protein-like carbohydrate association (aminocelluloses), carbohydrate-protein interactions (polysaccharide-gliadin), nucleic acid-protein (G-duplexes), nucleic acid-carbohydrate (DNA-chitosan) and finally carbohydrate-carbohydrate (xanthan-chitosan and a ternary polysaccharide complex) interactions.

  16. Metabolite Damage and Metabolite Damage Control in Plants

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, Andrew D. [Horticultural Sciences Department]; Henry, Christopher S. [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, Illinois 60439; Computation Institute, University of Chicago, Chicago, Illinois 60637]; Fiehn, Oliver [Genome Center, University of California, Davis, California 95616]; de Crécy-Lagard, Valérie [Microbiology and Cell Science Department, University of Florida, Gainesville, Florida 32611]

    2016-04-29

    It is increasingly clear that (a) many metabolites undergo spontaneous or enzyme-catalyzed side reactions in vivo, (b) the damaged metabolites formed by these reactions can be harmful, and (c) organisms have biochemical systems that limit the buildup of damaged metabolites. These damage-control systems either return a damaged molecule to its pristine state (metabolite repair) or convert harmful molecules to harmless ones (damage preemption). Because all organisms share a core set of metabolites that suffer the same chemical and enzymatic damage reactions, certain damage-control systems are widely conserved across the kingdoms of life. Relatively few damage reactions and damage-control systems are well known. Uncovering new damage reactions and identifying the corresponding damaged metabolites, damage-control genes, and enzymes demands a coordinated mix of chemistry, metabolomics, cheminformatics, biochemistry, and comparative genomics. This review illustrates the above points using examples from plants, which are at least as prone to metabolite damage as other organisms.

  17. A neutral polydisulfide containing Gd(III) DOTA monoamide as a redox-sensitive biodegradable macromolecular MRI contrast agent.

    Science.gov (United States)

    Ye, Zhen; Zhou, Zhuxian; Ayat, Nadia; Wu, Xueming; Jin, Erlei; Shi, Xiaoyue; Lu, Zheng-Rong

    2016-01-01

    This work aims to develop safe and effective gadolinium (III)-based biodegradable macromolecular MRI contrast agents for blood pool and cancer imaging. A neutral polydisulfide containing macrocyclic Gd-DOTA monoamide (GOLS) was synthesized and characterized. In addition to studying the in vitro degradation of GOLS, its kinetic stability was also investigated in an in vivo model. The efficacy of GOLS for contrast-enhanced MRI was examined with female BALB/c mice bearing 4T1 breast cancer xenografts. The pharmacokinetics, biodistribution, and metabolism of GOLS were also determined in mice. GOLS has an apparent molecular weight of 23.0 kDa with T1 relaxivities of 7.20 mM⁻¹ s⁻¹ per Gd at 1.5 T, and 6.62 mM⁻¹ s⁻¹ at 7.0 T. GOLS had high kinetic inertness against transmetallation with Zn²⁺ ions, and its polymer backbone was readily cleaved by L-cysteine. The agent showed improved efficacy for blood pool and tumor MR imaging. The structural effect on biodistribution and in vivo chelation stability was assessed by comparing GOLS with Gd(HP-DO3A), a negatively charged polydisulfide containing Gd-DOTA monoamide (GODC), and a polydisulfide containing Gd-DTPA-bisamide (GDCC). GOLS showed high in vivo chelation stability and minimal tissue deposition of gadolinium. The biodegradable macromolecular contrast agent GOLS is a promising polymeric contrast agent for clinical MR cardiovascular imaging and cancer imaging. Copyright © 2015 John Wiley & Sons, Ltd.

  18. Model averaging, optimal inference and habit formation

    Directory of Open Access Journals (Sweden)

    Thomas H B FitzGerald

    2014-06-01

    Postulating that the brain performs approximate Bayesian inference generates principled and empirically testable models of neuronal function – the subject of much current interest in neuroscience and related disciplines. Current formulations address inference and learning under some assumed and particular model. In reality, organisms are often faced with an additional challenge – that of determining which model or models of their environment are the best for guiding behaviour. Bayesian model averaging – which says that an agent should weight the predictions of different models according to their evidence – provides a principled way to solve this problem. Importantly, because model evidence is determined by both the accuracy and complexity of the model, optimal inference requires that these be traded off against one another. This means an agent’s behaviour should show an equivalent balance. We hypothesise that Bayesian model averaging plays an important role in cognition, given that it is both optimal and realisable within a plausible neuronal architecture. We outline model averaging and how it might be implemented, and then explore a number of implications for brain and behaviour. In particular, we propose that model averaging can explain a number of apparently suboptimal phenomena within the framework of approximate (bounded) Bayesian inference, focussing particularly upon the relationship between goal-directed and habitual behaviour.
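The evidence-weighted averaging described above can be made concrete with a toy coin example, in which a simple model (a fair coin, no free parameters) competes with a more flexible one (unknown bias with a uniform prior); the evidence automatically penalizes the flexible model's complexity. The data and priors here are purely illustrative:

```python
import math

# Observed data: k heads in n tosses
n, k = 10, 7

# Model A ("fair"): p = 0.5 fixed -- simple, no free parameters
ev_fair = math.comb(n, k) * 0.5 ** n

# Model B ("unknown bias"): p ~ Uniform(0, 1) -- more flexible, so its
# evidence integrates the likelihood over the prior; for a uniform
# prior this marginal likelihood reduces to 1 / (n + 1)
ev_unif = 1.0 / (n + 1)

# Posterior model probabilities under equal prior model weights
z = ev_fair + ev_unif
w_fair, w_unif = ev_fair / z, ev_unif / z

# Model-averaged predictive probability that the next toss is heads:
# each model's prediction, weighted by its posterior probability
pred_fair = 0.5
pred_unif = (k + 1) / (n + 2)      # Laplace's rule under Model B
p_heads = w_fair * pred_fair + w_unif * pred_unif
```

The averaged prediction lands between the two models' predictions, pulled toward whichever model the evidence favours, which is the accuracy-versus-complexity trade-off the abstract invokes.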

  19. Efficient Bayesian inference for ARFIMA processes

    Science.gov (United States)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
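The ingredient that distinguishes ARFIMA from ordinary ARMA is the fractional differencing operator (1 − B)^d, whose binomial weights decay hyperbolically rather than geometrically and thereby encode long-range dependence. A small sketch of the standard weight recursion follows (illustrative only; it is not the authors' approximate-likelihood machinery):

```python
import numpy as np

def frac_diff_weights(d, m):
    """First m coefficients of (1 - B)^d = sum_k w_k B^k,
    via the standard binomial recursion w_k = w_{k-1} (k - 1 - d) / k."""
    w = np.empty(m)
    w[0] = 1.0
    for k in range(1, m):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

d, m = 0.3, 400
w = frac_diff_weights(d, m)      # the differencing filter
v = frac_diff_weights(-d, m)     # its formal inverse, (1 - B)^{-d}

# Long memory: the inverse weights are positive and decay hyperbolically
# (roughly like k^(d-1)), not geometrically as in a stationary ARMA model.
# Composing the two filters recovers the identity up to the truncation lag:
delta = np.convolve(w, v)[:m]
```

For 0 < d < 1/2 the resulting ARFIMA(0, d, 0) process is stationary with an autocorrelation that also decays as a power law, which is the LRD signature the abstract sets out to infer.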

  20. Campbell's and Rubin's Perspectives on Causal Inference

    Science.gov (United States)

    West, Stephen G.; Thoemmes, Felix

    2010-01-01

    Donald Campbell's approach to causal inference (D. T. Campbell, 1957; W. R. Shadish, T. D. Cook, & D. T. Campbell, 2002) is widely used in psychology and education, whereas Donald Rubin's causal model (P. W. Holland, 1986; D. B. Rubin, 1974, 2005) is widely used in economics, statistics, medicine, and public health. Campbell's approach focuses on…

  1. Bayesian structural inference for hidden processes

    Science.gov (United States)

    Strelioff, Christopher C.; Crutchfield, James P.

    2014-04-01

    We introduce a Bayesian approach to discovering patterns in structurally complex processes. The proposed method of Bayesian structural inference (BSI) relies on a set of candidate unifilar hidden Markov model (uHMM) topologies for inference of process structure from a data series. We employ a recently developed exact enumeration of topological ɛ-machines. (A sequel then removes the topological restriction.) This subset of the uHMM topologies has the added benefit that inferred models are guaranteed to be ɛ-machines, irrespective of estimated transition probabilities. Properties of ɛ-machines and uHMMs allow for the derivation of analytic expressions for estimating transition probabilities, inferring start states, and comparing the posterior probability of candidate model topologies, despite process internal structure being only indirectly present in data. We demonstrate BSI's effectiveness in estimating a process's randomness, as reflected by the Shannon entropy rate, and its structure, as quantified by the statistical complexity. We also compare using the posterior distribution over candidate models and the single, maximum a posteriori model for point estimation and show that the former more accurately reflects uncertainty in estimated values. We apply BSI to in-class examples of finite- and infinite-order Markov processes, as well as to an out-of-class, infinite-state hidden process.

  2. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    International Nuclear Information System (INIS)

    Schneider, Michael D.; Dawson, William A.; Hogg, David W.; Marshall, Philip J.; Bard, Deborah J.; Meyers, Joshua; Lang, Dustin

    2015-01-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics

  3. Interest, Inferences, and Learning from Texts

    Science.gov (United States)

    Clinton, Virginia; van den Broek, Paul

    2012-01-01

    Topic interest and learning from texts have been found to be positively associated with each other. However, the reason for this positive association is not well understood. The purpose of this study is to examine a cognitive process, inference generation, that could explain the positive association between interest and learning from texts. In…

  4. Ignorability in Statistical and Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...

  5. Inverse Ising inference with correlated samples

    International Nuclear Information System (INIS)

    Obermayer, Benedikt; Levine, Erel

    2014-01-01

    Correlations between two variables of a high-dimensional system can be indicative of an underlying interaction, but can also result from indirect effects. Inverse Ising inference is a method to distinguish one from the other. Essentially, the parameters of the least constrained statistical model are learned from the observed correlations such that direct interactions can be separated from indirect correlations. Among many other applications, this approach has been helpful for protein structure prediction, because residues which interact in the 3D structure often show correlated substitutions in a multiple sequence alignment. In this context, samples used for inference are not independent but share an evolutionary history on a phylogenetic tree. Here, we discuss the effects of correlations between samples on global inference. Such correlations could arise due to phylogeny but also via other slow dynamical processes. We present a simple analytical model to address the resulting inference biases, and develop an exact method accounting for background correlations in alignment data by combining phylogenetic modeling with an adaptive cluster expansion algorithm. We find that popular reweighting schemes are only marginally effective at removing phylogenetic bias, suggest a rescaling strategy that yields better results, and provide evidence that our conclusions carry over to the frequently used mean-field approach to the inverse Ising problem. (paper)
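The core inversion step the abstract builds on can be illustrated with the naive mean-field result, in which direct couplings are read off the inverse correlation matrix, J ≈ −(C⁻¹) off the diagonal. The sketch below uses an Ising chain small enough to enumerate exactly, so no sampling (and hence no sample-correlation bias) enters; the size and coupling strength are illustrative assumptions, and the paper's adaptive cluster expansion is considerably more sophisticated:

```python
import numpy as np
from itertools import product

# Tiny nearest-neighbour Ising chain with known couplings and zero field
N, J0 = 5, 0.3
J_true = np.zeros((N, N))
for i in range(N - 1):
    J_true[i, i + 1] = J_true[i + 1, i] = J0

# Exact Boltzmann distribution by enumerating all 2^N spin states
s = np.array(list(product([-1.0, 1.0], repeat=N)))
E = -0.5 * np.einsum('si,ij,sj->s', s, J_true, s)
p = np.exp(-E)
p /= p.sum()

# Exact magnetisations and connected correlation matrix
m = p @ s
C = np.einsum('s,si,sj->ij', p, s, s) - np.outer(m, m)

# Naive mean-field inversion: direct couplings from the inverse
# correlation matrix, J_ij ~ -(C^{-1})_ij for i != j
J_mf = -np.linalg.inv(C)
np.fill_diagonal(J_mf, 0.0)
```

Spins 0 and 2 are visibly correlated even though they are not directly coupled; the matrix inversion strips away that indirect correlation and recovers the chain's direct couplings, which is exactly the separation of direct interactions from indirect correlations that inverse Ising inference aims for.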

  6. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.

  7. Culture and Pragmatic Inference in Interpersonal Communication

    African Journals Online (AJOL)

    cognitive process, and that the human capacity for inference is crucially important ... been noted that research in interpersonal communication is currently pushing the ... communicative actions, the social-cultural world of everyday life is not only ... personal experiences of the authors', as documented over time and recreated ...

  8. Inference and the Introductory Statistics Course

    Science.gov (United States)

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-01-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its…

  9. Statistical Inference on the Canadian Middle Class

    Directory of Open Access Journals (Sweden)

    Russell Davidson

    2018-03-01

    Full Text Available Conventional wisdom says that the middle classes in many developed countries have recently suffered losses, in terms of both the share of the total population belonging to the middle class, and also their share in total income. Here, distribution-free methods are developed for inference on these shares, by deriving expressions for the asymptotic variances and covariance of the sample estimates. Asymptotic inference can be undertaken based on asymptotic normality. Bootstrap inference can be expected to be more reliable, and appropriate bootstrap procedures are proposed. As an illustration, samples of individual earnings drawn from Canadian census data are used to test various hypotheses about the middle-class shares, and confidence intervals for them are computed. It is found that, for the earlier censuses, sample sizes are large enough for asymptotic and bootstrap inference to be almost identical, but that, in the twenty-first century, the bootstrap fails on account of a strange phenomenon whereby many presumably different incomes in the data are rounded to one and the same value. Another difference between the centuries is the appearance of heavy right-hand tails in the income distributions of both men and women.
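    The bootstrap procedure described above can be sketched for a single middle-class share. The income band (0.75 to 1.5 times the median) and the synthetic lognormal sample below are illustrative assumptions, not the paper's definitions or data:

```python
import numpy as np

rng = np.random.default_rng(0)

def middle_share(incomes, lo=0.75, hi=1.5):
    """Share of the population with income within [lo, hi] times the median."""
    m = np.median(incomes)
    return np.mean((incomes >= lo * m) & (incomes <= hi * m))

def bootstrap_ci(incomes, n_boot=1000, alpha=0.05):
    """Percentile bootstrap confidence interval for the middle-class share."""
    stats = [middle_share(rng.choice(incomes, size=len(incomes), replace=True))
             for _ in range(n_boot)]
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

incomes = rng.lognormal(mean=10.5, sigma=0.6, size=5000)  # synthetic earnings
point = middle_share(incomes)
lo_ci, hi_ci = bootstrap_ci(incomes)
```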

  10. Spurious correlations and inference in landscape genetics

    Science.gov (United States)

    Samuel A. Cushman; Erin L. Landguth

    2010-01-01

    Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causal modelling with partial...

  11. Cortical information flow during inferences of agency

    NARCIS (Netherlands)

    Dogge, Myrthel; Hofman, Dennis; Boersma, Maria; Dijkerman, H Chris; Aarts, Henk

    2014-01-01

    Building on the recent finding that agency experiences do not merely rely on sensorimotor information but also on cognitive cues, this exploratory study uses electroencephalographic recordings to examine functional connectivity during agency inference processing in a setting where action and outcome

  12. Quasi-Experimental Designs for Causal Inference

    Science.gov (United States)

    Kim, Yongnam; Steiner, Peter

    2016-01-01

    When randomized experiments are infeasible, quasi-experimental designs can be exploited to evaluate causal treatment effects. The strongest quasi-experimental designs for causal inference are regression discontinuity designs, instrumental variable designs, matching and propensity score designs, and comparative interrupted time series designs. This…

  13. The importance of learning when making inferences

    Directory of Open Access Journals (Sweden)

    Jorg Rieskamp

    2008-03-01

    Full Text Available The assumption that people possess a repertoire of strategies to solve the inference problems they face has been made repeatedly. The experimental findings of two previous studies on strategy selection are reexamined from a learning perspective, which argues that people learn to select strategies for making probabilistic inferences. This learning process is modeled with the strategy selection learning (SSL) theory, which assumes that people develop subjective expectancies for the strategies they have. They select strategies proportional to their expectancies, which are updated on the basis of experience. For the study by Newell, Weston, and Shanks (2003) it can be shown that people did not anticipate the success of a strategy from the beginning of the experiment. Instead, the behavior observed at the end of the experiment was the result of a learning process that can be described by the SSL theory. For the second study, by Bröder and Schiffer (2006), the SSL theory is able to provide an explanation for why participants only slowly adapted to new environments in a dynamic inference situation. The reanalysis of the previous studies illustrates the importance of learning for probabilistic inferences.
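    The SSL learning process described above (strategies chosen proportional to expectancies, expectancies updated by experienced payoffs) can be sketched as follows; the initial expectancies and payoffs are hypothetical values, not the studies' estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

expectancy = np.array([10.0, 10.0])   # initial expectancies (hypothetical)
payoff = [1.0, 0.2]                   # strategy 0 pays off more (assumed)

for _ in range(500):
    # Choose a strategy with probability proportional to its expectancy.
    p = expectancy / expectancy.sum()
    s = rng.choice(2, p=p)
    # Reinforce the chosen strategy with the payoff it produced.
    expectancy[s] += payoff[s]

p_final = expectancy / expectancy.sum()
# Choice probability gradually concentrates on the more successful strategy.
```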

  14. Colligation, Or the Logical Inference of Interconnection

    DEFF Research Database (Denmark)

    Falster, Peter

    1998-01-01

    laws or assumptions. Yet interconnection as an abstract concept seems to be without scientific underpinning in pure logic. Adopting a historical viewpoint, our aim is to show that the reasoning of interconnection may be identified with a neglected kind of logical inference, called "colligation...

  15. Colligation or, The Logical Inference of Interconnection

    DEFF Research Database (Denmark)

    Franksen, Ole Immanuel; Falster, Peter

    2000-01-01

    laws or assumptions. Yet interconnection as an abstract concept seems to be without scientific underpinning in oure logic. Adopting a historical viewpoint, our aim is to show that the reasoning of interconnection may be identified with a neglected kind of logical inference, called "colligation...

  16. Inferring motion and location using WLAN RSSI

    NARCIS (Netherlands)

    Kavitha Muthukrishnan, K.; van der Zwaag, B.J.; Havinga, Paul J.M.; Fuller, R.; Koutsoukos, X.

    2009-01-01

    We present novel algorithms to infer movement by making use of inherent fluctuations in the received signal strengths from existing WLAN infrastructure. We evaluate the performance of the presented algorithms based on classification metrics such as recall and precision using annotated traces
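    A movement-inference scheme of the general kind described above can be approximated by thresholding the variance of the received signal strength over short windows; the window length, threshold, and simulated RSSI values below are illustrative, not the authors' parameters:

```python
import numpy as np

def infer_motion(rssi, window=10, var_threshold=4.0):
    """Label fixed-size windows of an RSSI trace as 'moving' when the
    signal-strength variance exceeds a threshold (illustrative values)."""
    rssi = np.asarray(rssi, dtype=float)
    labels = []
    for start in range(0, len(rssi) - window + 1, window):
        seg = rssi[start:start + window]
        labels.append("moving" if seg.var() > var_threshold else "still")
    return labels

rng = np.random.default_rng(2)
still = -60 + rng.normal(0.0, 0.5, 50)   # stable RSSI while stationary
moving = -60 + rng.normal(0.0, 6.0, 50)  # large fluctuations while walking
labels = infer_motion(np.concatenate([still, moving]))
```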

  17. Macromolecular Engineering: New Routes Towards the Synthesis of Well-Defined Polyethers/Polyesters Co/Terpolymers with Different Architectures

    KAUST Repository

    Alamri, Haleema

    2016-05-18

    The primary objective of this research was to develop a new and efficient pathway for well-defined multicomponent homo/co/terpolymers of cyclic esters/ethers using an organocatalytic approach with an emphasis on the macromolecular engineering aspects of the overall synthesis. Macromolecular engineering (as discussed in the first chapter) of homo/copolymers refers to the specific tailoring of these materials for achieving an easy and reproducible synthesis that results in precise molecular characteristics, i.e. molecular weight and polydispersity, as well as specific structure and end-group choices. Precise control of these molecular characteristics will provide access to new materials that can be used for pre-targeted purposes such as biomedical applications. Among the most commonly used engineering materials are polyesters (biocompatible and biodegradable) and polyethers (biocompatible), either as homopolymers or as copolymers with linear structures. The ability to create non-linear structures, for example stars, will open new horizons in the applications of these important polymeric materials. The second part of this thesis describes the synthesis of aliphatic polyesters, particularly polycaprolactone and polylactide, using a metal-free initiator/catalyst system. A phosphazene base (t-BuP2) was used as the catalyst for the ring-opening copolymerization of ε-caprolactone (ε-CL) and L-lactide (LLA) at room temperature with a variety of protic initiators in different solvents. These studies provided important information for the design of a metal-free route toward the synthesis of polyester-based (bio)materials. The third part of the thesis describes a novel route for the one-pot synthesis of polyether-b-polyester block copolymers with either a linear or a specific macromolecular architecture. Poly(styrene oxide)-b-poly(caprolactone)-b-poly(L-lactide) was prepared using this method with the goal of synthesizing poly(styrene oxide)-based materials since this

  18. DNA damage and polyploidization.

    Science.gov (United States)

    Chow, Jeremy; Poon, Randy Y C

    2010-01-01

    A growing body of evidence indicates that polyploidization triggers chromosomal instability and contributes to tumorigenesis. DNA damage is increasingly being recognized for its roles in promoting polyploidization. Although elegant mechanisms known as the DNA damage checkpoints are responsible for halting the cell cycle after DNA damage, agents that uncouple the checkpoints can induce unscheduled entry into mitosis. Likewise, defects of the checkpoints in several disorders permit mitotic entry even in the presence of DNA damage. Forcing cells with damaged DNA into mitosis causes severe chromosome segregation defects, including lagging chromosomes, chromosomal fragments and chromosomal bridges. The presence of these lesions in the cleavage plane is believed to abort cytokinesis. It is postulated that if cytokinesis failure is coupled with defects of the p53-dependent postmitotic checkpoint pathway, cells can enter S phase and become polyploids. Progress in the past several years has unraveled some of the underlying principles of these pathways and underscored the important role of DNA damage in polyploidization. Furthermore, polyploidization per se may also be an important determinant of sensitivity to DNA damage, thereby may offer an opportunity for novel therapies.

  19. Active inference, sensory attenuation and illusions.

    Science.gov (United States)

    Brown, Harriet; Adams, Rick A; Parees, Isabel; Edwards, Mark; Friston, Karl

    2013-11-01

    Active inference provides a simple and neurobiologically plausible account of how action and perception are coupled in producing (Bayes) optimal behaviour. This can be seen most easily as minimising prediction error: we can either change our predictions to explain sensory input through perception, or actively change sensory input to fulfil our predictions. In active inference, this action is mediated by classical reflex arcs that minimise proprioceptive prediction error created by descending proprioceptive predictions. However, this creates a conflict between action and perception, in that self-generated movements require predictions to override the sensory evidence that one is not actually moving. Yet ignoring sensory evidence means that externally generated sensations will not be perceived. Conversely, attending to (proprioceptive and somatosensory) sensations enables the detection of externally generated events but precludes generation of actions. This conflict can be resolved by attenuating the precision of sensory evidence during movement or, equivalently, attending away from the consequences of self-made acts. We propose that this Bayes optimal withdrawal of precise sensory evidence during movement is the cause of psychophysical sensory attenuation. Furthermore, it explains the force-matching illusion and reproduces empirical results almost exactly. Finally, if attenuation is removed, the force-matching illusion disappears and false (delusional) inferences about agency emerge. This is important, given the negative correlation between sensory attenuation and delusional beliefs in normal subjects, and the reduction in the magnitude of the illusion in schizophrenia. Active inference therefore links the neuromodulatory optimisation of precision to sensory attenuation and illusory phenomena during the attribution of agency in normal subjects. It also provides a functional account of deficits in syndromes characterised by false inference

  20. Precise Manipulation and Patterning of Protein Crystals for Macromolecular Crystallography Using Surface Acoustic Waves.

    Science.gov (United States)

    Guo, Feng; Zhou, Weijie; Li, Peng; Mao, Zhangming; Yennawar, Neela H; French, Jarrod B; Huang, Tony Jun

    2015-06-01

    Advances in modern X-ray sources and detector technology have made it possible for crystallographers to collect usable data on crystals of only a few micrometers or less in size. Despite these developments, sample handling techniques have significantly lagged behind and often prevent the full realization of current beamline capabilities. In order to address this shortcoming, a surface acoustic wave-based method for manipulating and patterning crystals is developed. This method, which does not damage the fragile protein crystals, can precisely manipulate and pattern micrometer and submicrometer-sized crystals for data collection and screening. The technique is robust, inexpensive, and easy to implement. This method not only promises to significantly increase efficiency and throughput of both conventional and serial crystallography experiments, but will also make it possible to collect data on samples that were previously intractable. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. LSD and Genetic Damage

    Science.gov (United States)

    Dishotsky, Norman I.; And Others

    1971-01-01

    Reviews studies of the effects of lysergic acid diethylamide (LSD) on man and other organisms. Concludes that pure LSD injected in moderate doses does not cause chromosome or detectable genetic damage and is not a teratogen or carcinogen. (JM)

  2. Diabetes and nerve damage

    Science.gov (United States)

    Diabetic neuropathy; Diabetes - neuropathy; Diabetes - peripheral neuropathy ... In people with diabetes, the body's nerves can be damaged by decreased blood flow and a high blood sugar level. This condition is ...

  3. Adaptive neuro fuzzy inference system modeling to predict damage level of non-reshaped berm breakwater

    Digital Repository Service at National Institute of Oceanography (India)

    Harish, N.; Mandal, S.; Rao, S.; Lokesha

    coefficient (CC) and scatter index (SI) for test data are 8.072, 2.841, 0.92, and 0.218 respectively. Comparing with the artificial neural network model, ANFIS yields higher CC and lower SI. From the results it can be concluded that ANFIS can be an efficient...
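    The quoted metrics can be computed under their usual definitions: Pearson correlation coefficient, and scatter index as RMSE normalised by the observed mean. This is a generic sketch with toy data, not the authors' code or dataset:

```python
import numpy as np

def correlation_coefficient(obs, pred):
    """Pearson correlation between observed and predicted values."""
    return np.corrcoef(np.asarray(obs, float), np.asarray(pred, float))[0, 1]

def scatter_index(obs, pred):
    """Root-mean-square error normalised by the mean observation."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    rmse = np.sqrt(np.mean((pred - obs) ** 2))
    return rmse / obs.mean()

obs = [1.0, 2.0, 3.0, 4.0]       # toy observed damage levels
pred = [1.1, 1.9, 3.2, 3.8]      # toy model predictions
cc = correlation_coefficient(obs, pred)
si = scatter_index(obs, pred)
```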

  4. A fuzzy logic-based damage identification method for simply-supported bridge using modal shape ratios

    Directory of Open Access Journals (Sweden)

    Hanbing Liu

    2012-08-01

    Full Text Available A fuzzy logic system (FLS) is established for damage identification of a simply supported bridge. A novel damage indicator is developed based on ratios of mode shape components before and after damage. Numerical simulation of a simply-supported bridge is presented to demonstrate the memory, inference and anti-noise ability of the proposed method. The bridge is divided into eight elements and nine nodes, and the damage indicator vector at characteristic nodes is used as the input measurement of the FLS. Results reveal that the FLS can detect damage of training patterns with an accuracy of 100%. For other test patterns, the FLS also possesses favorable inference ability, with identification accuracy for a single damage location of up to 93.75%. Tests with noise-simulated data show that the FLS possesses favorable anti-noise ability.
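    A fuzzy inference step of the general kind used here maps a crisp damage indicator onto linguistic damage levels through membership functions; the triangular memberships and breakpoints below are illustrative, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def damage_level(indicator):
    """Map a crisp damage indicator to the linguistic level with the
    highest membership (breakpoints are illustrative)."""
    levels = {
        "none":   tri(indicator, -0.1, 0.0, 0.3),
        "slight": tri(indicator,  0.1, 0.4, 0.7),
        "severe": tri(indicator,  0.5, 1.0, 1.5),
    }
    return max(levels, key=levels.get)
```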

  5. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled... is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results... of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  6. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Reddy, Vikas; Farr, Anna Charisse; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K D V

    2014-01-01

    Current Bayesian network software packages provide good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing and at times it could be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user-interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++

  7. The NIFTY way of Bayesian signal inference

    International Nuclear Information System (INIS)

    Selig, Marco

    2014-01-01

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy

  8. The NIFTy way of Bayesian signal inference

    Science.gov (United States)

    Selig, Marco

    2014-12-01

    We introduce NIFTy, "Numerical Information Field Theory", a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTy can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTy as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.

  9. Bayesianism and inference to the best explanation

    Directory of Open Access Journals (Sweden)

    Valeriano IRANZO

    2008-01-01

    Full Text Available Bayesianism and Inference to the Best Explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of “bayesianizing” IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes’s Theorem. Then I distinguish two different interpretations of prior probabilities: “IBE-Bayesianism” (IBE-Bay) and “frequentist-Bayesianism” (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.

  10. Dopamine, reward learning, and active inference

    Directory of Open Access Journals (Sweden)

    Thomas eFitzgerald

    2015-11-01

    Full Text Available Temporal difference learning models propose phasic dopamine signalling encodes reward prediction errors that drive learning. This is supported by studies where optogenetic stimulation of dopamine neurons can stand in lieu of actual reward. Nevertheless, a large body of data also shows that dopamine is not necessary for learning, and that dopamine depletion primarily affects task performance. We offer a resolution to this paradox based on an hypothesis that dopamine encodes the precision of beliefs about alternative actions, and thus controls the outcome-sensitivity of behaviour. We extend an active inference scheme for solving Markov decision processes to include learning, and show that simulated dopamine dynamics strongly resemble those actually observed during instrumental conditioning. Furthermore, simulated dopamine depletion impairs performance but spares learning, while simulated excitation of dopamine neurons drives reward learning, through aberrant inference about outcome states. Our formal approach provides a novel and parsimonious reconciliation of apparently divergent experimental findings.

  11. Dopamine, reward learning, and active inference.

    Science.gov (United States)

    FitzGerald, Thomas H B; Dolan, Raymond J; Friston, Karl

    2015-01-01

    Temporal difference learning models propose phasic dopamine signaling encodes reward prediction errors that drive learning. This is supported by studies where optogenetic stimulation of dopamine neurons can stand in lieu of actual reward. Nevertheless, a large body of data also shows that dopamine is not necessary for learning, and that dopamine depletion primarily affects task performance. We offer a resolution to this paradox based on an hypothesis that dopamine encodes the precision of beliefs about alternative actions, and thus controls the outcome-sensitivity of behavior. We extend an active inference scheme for solving Markov decision processes to include learning, and show that simulated dopamine dynamics strongly resemble those actually observed during instrumental conditioning. Furthermore, simulated dopamine depletion impairs performance but spares learning, while simulated excitation of dopamine neurons drives reward learning, through aberrant inference about outcome states. Our formal approach provides a novel and parsimonious reconciliation of apparently divergent experimental findings.
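    The paper's central claim, that dopamine encodes the precision of beliefs about alternative actions, is often operationalised as a precision-weighted softmax over expected action values; the values and precision levels below are illustrative:

```python
import numpy as np

def action_probabilities(values, precision):
    """Softmax over expected action values; `precision` plays the role the
    authors assign to dopamine, scaling the outcome-sensitivity of choice."""
    v = precision * np.asarray(values, dtype=float)
    e = np.exp(v - v.max())          # subtract max for numerical stability
    return e / e.sum()

values = [1.0, 0.5]                  # illustrative expected values
sharp = action_probabilities(values, precision=8.0)  # intact dopamine
flat = action_probabilities(values, precision=0.5)   # simulated depletion
# High precision concentrates choice on the better action; low precision
# leaves behaviour nearly insensitive to the difference in values.
```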

  12. Inferring genetic interactions from comparative fitness data.

    Science.gov (United States)

    Crona, Kristina; Gavryushkin, Alex; Greene, Devin; Beerenwinkel, Niko

    2017-12-20

    Darwinian fitness is a central concept in evolutionary biology. In practice, however, it is hardly possible to measure fitness for all genotypes in a natural population. Here, we present quantitative tools to make inferences about epistatic gene interactions when the fitness landscape is only incompletely determined due to imprecise measurements or missing observations. We demonstrate that genetic interactions can often be inferred from fitness rank orders, where all genotypes are ordered according to fitness, and even from partial fitness orders. We provide a complete characterization of rank orders that imply higher order epistasis. Our theory applies to all common types of gene interactions and facilitates comprehensive investigations of diverse genetic interactions. We analyzed various genetic systems comprising HIV-1, the malaria-causing parasite Plasmodium vivax, the fungus Aspergillus niger, and the TEM-family of β-lactamase associated with antibiotic resistance. For all systems, our approach revealed higher order interactions among mutations.
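    For the simplest case of two biallelic loci, the pairwise epistasis whose sign the rank-order conditions can sometimes determine is the double mutant's deviation from the additive expectation; the fitness values below are hypothetical:

```python
def epistasis(f00, f01, f10, f11):
    """Pairwise epistasis for two biallelic loci: the double mutant's
    deviation from the additive (no-interaction) expectation."""
    return f11 - f10 - f01 + f00

# Hypothetical fitness values: the double mutant outperforms the additive
# prediction, so the interaction is positive.
e = epistasis(f00=1.0, f01=1.2, f10=1.1, f11=1.6)
```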

  13. An emergent approach to analogical inference

    Science.gov (United States)

    Thibodeau, Paul H.; Flusberg, Stephen J.; Glick, Jeremy J.; Sternberg, Daniel A.

    2013-03-01

    In recent years, a growing number of researchers have proposed that analogy is a core component of human cognition. According to the dominant theoretical viewpoint, analogical reasoning requires a specific suite of cognitive machinery, including explicitly coded symbolic representations and a mapping or binding mechanism that operates over these representations. Here we offer an alternative approach: we find that analogical inference can emerge naturally and spontaneously from a relatively simple, error-driven learning mechanism without the need to posit any additional analogy-specific machinery. The results also parallel findings from the developmental literature on analogy, demonstrating a shift from an initial reliance on surface feature similarity to the use of relational similarity later in training. Variants of the model allow us to consider and rule out alternative accounts of its performance. We conclude by discussing how these findings can potentially refine our understanding of the processes that are required to perform analogical inference.

  14. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Statistical inference from imperfect photon detection

    International Nuclear Information System (INIS)

    Audenaert, Koenraad M R; Scheel, Stefan

    2009-01-01

    We consider the statistical properties of photon detection with imperfect detectors that exhibit dark counts and less than unit efficiency, in the context of tomographic reconstruction. In this context, the detectors are used to implement certain positive operator-valued measures (POVMs) that would allow us to reconstruct the quantum state or quantum process under consideration. Here we look at the intermediate step of inferring outcome probabilities from measured outcome frequencies, and show how this inference can be performed in a statistically sound way in the presence of detector imperfections. Merging outcome probabilities for different sets of POVMs into a consistent quantum state picture has been treated elsewhere (Audenaert and Scheel 2009 New J. Phys. 11 023028). Single-photon pulsed measurements as well as continuous wave measurements are covered.
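    Under a deliberately simplified detector model (true events thinned by a sub-unit efficiency, plus independent dark counts), inferring the underlying probability from the observed click frequency is a linear inversion. The model and numbers are an illustrative sketch, not the paper's POVM-based treatment:

```python
def infer_click_probability(freq_click, efficiency, dark_prob):
    """Invert a toy detector model: a true event (probability p) is seen
    with probability `efficiency`; otherwise a dark count fires with
    probability `dark_prob`.  Observed: q = d + (1 - d) * eta * p."""
    p = (freq_click - dark_prob) / ((1.0 - dark_prob) * efficiency)
    return min(max(p, 0.0), 1.0)   # clip to the valid probability range

p_hat = infer_click_probability(freq_click=0.46, efficiency=0.9, dark_prob=0.01)
```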

  16. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing and at times it could be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user-interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++.

  17. Working with sample data exploration and inference

    CERN Document Server

    Chaffe-Stengel, Priscilla

    2014-01-01

    Managers and analysts routinely collect and examine key performance measures to better understand their operations and make good decisions. Being able to render the complexity of operations data into a coherent account of significant events requires an understanding of how to work well with raw data and to make appropriate inferences. Although some statistical techniques for analyzing data and making inferences are sophisticated and require specialized expertise, there are methods that are understandable and applicable by anyone with basic algebra skills and the support of a spreadsheet package. By applying these fundamental methods themselves rather than turning over both the data and the responsibility for analysis and interpretation to an expert, managers will develop a richer understanding and potentially gain better control over their environment. This text is intended to describe these fundamental statistical techniques to managers, data analysts, and students. Statistical analysis of sample data is enh...

  18. Parametric inference for biological sequence analysis.

    Science.gov (United States)

    Pachter, Lior; Sturmfels, Bernd

    2004-11-16

    One of the major successes in computational biology has been the unification, by using the graphical model formalism, of a multitude of algorithms for annotating and comparing biological sequences. Graphical models that have been applied to these problems include hidden Markov models for annotation, tree models for phylogenetics, and pair hidden Markov models for alignment. A single algorithm, the sum-product algorithm, solves many of the inference problems that are associated with different statistical models. This article introduces the polytope propagation algorithm for computing the Newton polytope of an observation from a graphical model. This algorithm is a geometric version of the sum-product algorithm and is used to analyze the parametric behavior of maximum a posteriori inference calculations for graphical models.
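The sum-product algorithm the article builds on can be illustrated on the simplest chain-structured graphical model, a hidden Markov model, where it reduces to the forward recursion; the toy parameters below are assumed for illustration only.

```python
# Sum-product on a chain (an HMM): the forward recursion computes the
# probability of an observation sequence by summing over hidden paths.
# Toy parameters, not from the article.

states = (0, 1)
init  = [0.6, 0.4]                       # P(h_1)
trans = [[0.7, 0.3], [0.4, 0.6]]         # P(h_t | h_{t-1})
emit  = [[0.9, 0.1], [0.2, 0.8]]         # P(x_t | h_t), observations in {0, 1}

def forward(obs):
    """Left-to-right sum-product message passing; returns P(obs)."""
    alpha = [init[s] * emit[s][obs[0]] for s in states]
    for x in obs[1:]:
        alpha = [sum(alpha[sp] * trans[sp][s] for sp in states) * emit[s][x]
                 for s in states]
    return sum(alpha)

print(forward([0, 1, 0]))
```

The polytope propagation algorithm of the article replaces these sums and products of numbers with the corresponding operations on Newton polytopes, keeping the same message-passing structure.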

  19. Inferences on Children’s Reading Groups

    Directory of Open Access Journals (Sweden)

    Javier González García

    2009-05-01

    Full Text Available This article focuses on the non-literal information of a text, which can be inferred from key elements or clues offered by the text itself. This kind of content is called implicit text or inference, because of the thinking process it stimulates. The explicit resources that lead to information retrieval are related to others carrying implicit information, which have grown in relevance. In this study, carried out over two school years, we analysed how two teachers interpret three stories and how they set up a debate by dividing the class into three student groups. The sample was formed by two classes from two urban public schools in Burgos (Spain) and two from public schools in Tampico (Mexico). This allowed us to observe an increasing percentage for the group focused on text comprehension, and a smaller percentage for the group perceiving comprehension as a secondary objective.

  20. Inferring Genetic Ancestry: Opportunities, Challenges, and Implications

    OpenAIRE

    Royal, Charmaine D.; Novembre, John; Fullerton, Stephanie M.; Goldstein, David B.; Long, Jeffrey C.; Bamshad, Michael J.; Clark, Andrew G.

    2010-01-01

    Increasing public interest in direct-to-consumer (DTC) genetic ancestry testing has been accompanied by growing concern about issues ranging from the personal and societal implications of the testing to the scientific validity of ancestry inference. The very concept of “ancestry” is subject to misunderstanding in both the general and scientific communities. What do we mean by ancestry? How exactly is ancestry measured? How far back can such ancestry be defined and by which genetic tools? How ...

  1. Spatial Inference Based on Geometric Proportional Analogies

    OpenAIRE

    Mullally, Emma-Claire; O'Donoghue, Diarmuid P.

    2006-01-01

    We describe an instance-based reasoning solution to a variety of spatial reasoning problems. The solution centers on identifying an isomorphic mapping between labelled graphs that represent some problem data and a known solution instance. We describe a number of spatial reasoning problems that are solved by generating non-deductive inferences, integrating topology with area (and other) features. We report the accuracy of our algorithm on different categories of spatial reasoning tasks from th...

  2. Inferring ontology graph structures using OWL reasoning

    KAUST Repository

    Rodriguez-Garcia, Miguel Angel

    2018-01-05

    Ontologies are representations of a conceptualization of a domain. Traditionally, ontologies in biology were represented as directed acyclic graphs (DAG) which represent the backbone taxonomy and additional relations between classes. These graphs are widely exploited for data analysis in the form of ontology enrichment or computation of semantic similarity. More recently, ontologies are developed in a formal language such as the Web Ontology Language (OWL) and consist of a set of axioms through which classes are defined or constrained. While the taxonomy of an ontology can be inferred directly from the axioms of an ontology as one of the standard OWL reasoning tasks, creating general graph structures from OWL ontologies that exploit the ontologies' semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies, we can identify relations between classes that are implied but not asserted and generate graph structures that encode for a large part of the ontologies' semantic content. We demonstrate the advantages of our method by applying it to inference of protein-protein interactions through semantic similarity over the Gene Ontology and demonstrate that performance is increased when graph structures are inferred using deductive inference according to our method. Our software and experiment results are available at http://github.com/bio-ontology-research-group/Onto2Graph. Onto2Graph is a method to generate graph structures from OWL ontologies using automated reasoning. The resulting graphs can be used for improved ontology visualization and ontology-based data analysis.

  3. Role of Speaker Cues in Attention Inference

    OpenAIRE

    Jin Joo Lee; Cynthia Breazeal; David DeSteno

    2017-01-01

    Current state-of-the-art approaches to emotion recognition primarily focus on modeling the nonverbal expressions of the sole individual without reference to contextual elements such as the co-presence of the partner. In this paper, we demonstrate that the accurate inference of listeners’ social-emotional state of attention depends on accounting for the nonverbal behaviors of their storytelling partner, namely their speaker cues. To gain a deeper understanding of the role of speaker cues in at...

  4. Inferring ontology graph structures using OWL reasoning.

    Science.gov (United States)

    Rodríguez-García, Miguel Ángel; Hoehndorf, Robert

    2018-01-05

    Ontologies are representations of a conceptualization of a domain. Traditionally, ontologies in biology were represented as directed acyclic graphs (DAG) which represent the backbone taxonomy and additional relations between classes. These graphs are widely exploited for data analysis in the form of ontology enrichment or computation of semantic similarity. More recently, ontologies are developed in a formal language such as the Web Ontology Language (OWL) and consist of a set of axioms through which classes are defined or constrained. While the taxonomy of an ontology can be inferred directly from the axioms of an ontology as one of the standard OWL reasoning tasks, creating general graph structures from OWL ontologies that exploit the ontologies' semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies, we can identify relations between classes that are implied but not asserted and generate graph structures that encode for a large part of the ontologies' semantic content. We demonstrate the advantages of our method by applying it to inference of protein-protein interactions through semantic similarity over the Gene Ontology and demonstrate that performance is increased when graph structures are inferred using deductive inference according to our method. Our software and experiment results are available at http://github.com/bio-ontology-research-group/Onto2Graph . Onto2Graph is a method to generate graph structures from OWL ontologies using automated reasoning. The resulting graphs can be used for improved ontology visualization and ontology-based data analysis.
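One way the resulting graphs feed into ontology-based analysis is through graph-based semantic similarity. A minimal sketch, assuming a toy child-to-parent edge list rather than the Gene Ontology, and using Jaccard overlap of ancestor sets, one of several common graph-based measures:

```python
# Toy ontology graph (edges point child -> parent; names are illustrative,
# not real GO terms). Similarity of two classes is the Jaccard overlap of
# their ancestor sets in the graph.

edges = {
    "binding": ["molecular_function"],
    "protein_binding": ["binding"],
    "dna_binding": ["binding"],
    "kinase_activity": ["molecular_function"],
}

def ancestors(term):
    """All superclasses reachable from term, term itself included."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(edges.get(t, []))
    return seen

def similarity(a, b):
    """Jaccard overlap of ancestor sets."""
    sa, sb = ancestors(a), ancestors(b)
    return len(sa & sb) / len(sa | sb)

print(similarity("protein_binding", "dna_binding"))
```

Onto2Graph's contribution is producing a richer `edges` structure from the deductive closure of the OWL axioms, so that implied (not just asserted) relations contribute to measures like this one.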

  5. Constrained bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  6. Using metacognitive cues to infer others' thinking

    OpenAIRE

    André Mata; Tiago Almeida

    2014-01-01

    Three studies tested whether people use cues about the way other people think (for example, whether others respond fast vs. slow) to infer what responses other people might give to reasoning problems. People who solve reasoning problems using deliberative thinking have better insight than intuitive problem-solvers into the responses that other people might give to the same problems. Presumably because deliberative responders think of intuitive responses before they think o...

  7. Thermodynamics of statistical inference by cells.

    Science.gov (United States)

    Lang, Alex H; Fisher, Charles K; Mora, Thierry; Mehta, Pankaj

    2014-10-03

    The deep connection between thermodynamics, computation, and information is now well established both theoretically and experimentally. Here, we extend these ideas to show that thermodynamics also places fundamental constraints on statistical estimation and learning. To do so, we investigate the constraints placed by (nonequilibrium) thermodynamics on the ability of biochemical signaling networks to estimate the concentration of an external signal. We show that accuracy is limited by energy consumption, suggesting that there are fundamental thermodynamic constraints on statistical inference.
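The qualitative point, that estimation accuracy improves with the number of costly measurements, can be illustrated with a toy receptor estimating a ligand concentration from independent occupancy readings. This is an illustration of the statistical scaling only, not the paper's thermodynamic derivation; all parameter values are assumed.

```python
# A receptor with dissociation constant KD is occupied with probability
# p = c / (c + KD). Estimating c from N independent occupancy readings
# gives an error shrinking roughly like 1/sqrt(N), so better estimates
# require more sampling (and, per the paper, more energy).
import random
random.seed(0)

KD = 1.0        # dissociation constant (arbitrary units, assumed)
C_TRUE = 1.0    # true concentration; occupancy p = 0.5 here

def estimate_c(n_samples):
    p_occ = C_TRUE / (C_TRUE + KD)
    bound = sum(random.random() < p_occ for _ in range(n_samples))
    p_hat = min(max(bound / n_samples, 1e-9), 1 - 1e-9)  # avoid divide-by-zero
    return KD * p_hat / (1 - p_hat)                      # invert p = c/(c+KD)

for n in (100, 10000):
    errs = [abs(estimate_c(n) - C_TRUE) for _ in range(200)]
    print(n, round(sum(errs) / len(errs), 3))
```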

  8. Bootstrap inference when using multiple imputation.

    Science.gov (United States)

    Schomaker, Michael; Heumann, Christian

    2018-04-16

    Many modern estimators require bootstrapping to calculate confidence intervals because either no analytic standard error is available or the distribution of the parameter of interest is nonsymmetric. It remains however unclear how to obtain valid bootstrap inference when dealing with multiple imputation to address missing data. We present 4 methods that are intuitively appealing, easy to implement, and combine bootstrap estimation with multiple imputation. We show that 3 of the 4 approaches yield valid inference, but that the performance of the methods varies with respect to the number of imputed data sets and the extent of missingness. Simulation studies reveal the behavior of our approaches in finite samples. A topical analysis from HIV treatment research, which determines the optimal timing of antiretroviral treatment initiation in young children, demonstrates the practical implications of the 4 methods in a sophisticated and realistic setting. This analysis suffers from missing data and uses the g-formula for inference, a method for which no standard errors are available. Copyright © 2018 John Wiley & Sons, Ltd.
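One of the intuitively appealing combinations, resampling the incomplete data first and imputing within each bootstrap replicate, can be sketched as follows. Proper multiple imputation is replaced here by single mean imputation purely to keep the example short; the loop structure is the point.

```python
# Bootstrap-then-impute sketch: resample the incomplete data set with
# replacement, impute inside each replicate, and take percentile limits
# of the replicate statistics. Mean imputation stands in for a real
# imputation model; the data are made up.
import random
random.seed(0)

data = [1.2, 2.3, None, 3.1, None, 2.8, 1.9, 2.5, None, 3.4]

def impute(xs):
    obs = [x for x in xs if x is not None]
    m = sum(obs) / len(obs) if obs else 0.0   # guard: all-missing resample
    return [m if x is None else x for x in xs]

def boot_ci(xs, n_boot=2000, alpha=0.05):
    stats = []
    for _ in range(n_boot):
        sample = [random.choice(xs) for _ in xs]   # resample with missingness
        complete = impute(sample)                  # impute inside the loop
        stats.append(sum(complete) / len(complete))
    stats.sort()
    return (stats[int(n_boot * alpha / 2)],
            stats[int(n_boot * (1 - alpha / 2))])

lo, hi = boot_ci(data)
print(round(lo, 2), round(hi, 2))
```

Because the interval comes from percentiles of the replicate statistics, no analytic standard error is needed, which is what makes the approach usable with estimators like the g-formula mentioned in the abstract.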

  9. Inferring epidemic network topology from surveillance data.

    Directory of Open Access Journals (Sweden)

    Xiang Wan

    Full Text Available The transmission of infectious diseases can be affected by many or even hidden factors, making it difficult to accurately predict when and where outbreaks may emerge. One approach at the moment is to develop and deploy surveillance systems in an effort to detect outbreaks as early as possible. This enables policy makers to modify and implement strategies for the control of the transmission. The accumulated surveillance data, including temporal, spatial, clinical, and demographic information, can provide valuable information with which to infer the underlying epidemic networks. Such networks can be quite informative and insightful as they characterize how infectious diseases transmit from one location to another. The aim of this work is to develop a computational model that allows inferences to be made regarding epidemic network topology in heterogeneous populations. We apply our model on the surveillance data from the 2009 H1N1 pandemic in Hong Kong. The inferred epidemic network displays a significant effect on the propagation of infectious diseases.

  10. Role of Speaker Cues in Attention Inference

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2017-10-01

    Full Text Available Current state-of-the-art approaches to emotion recognition primarily focus on modeling the nonverbal expressions of the sole individual without reference to contextual elements such as the co-presence of the partner. In this paper, we demonstrate that the accurate inference of listeners’ social-emotional state of attention depends on accounting for the nonverbal behaviors of their storytelling partner, namely their speaker cues. To gain a deeper understanding of the role of speaker cues in attention inference, we conduct investigations into real-world interactions of children (5–6 years old) storytelling with their peers. Through in-depth analysis of human–human interaction data, we first identify nonverbal speaker cues (i.e., backchannel-inviting cues) and listener responses (i.e., backchannel feedback). We then demonstrate how speaker cues can modify the interpretation of attention-related backchannels as well as serve as a means to regulate the responsiveness of listeners. We discuss the design implications of our findings toward our primary goal of developing attention recognition models for storytelling robots, and we argue that social robots can proactively use speaker cues to form more accurate inferences about the attentive state of their human partners.

  11. Cortical information flow during inferences of agency

    Directory of Open Access Journals (Sweden)

    Myrthel Dogge

    2014-08-01

    Full Text Available Building on the recent finding that agency experiences do not merely rely on sensorimotor information but also on cognitive cues, this exploratory study uses electroencephalographic recordings to examine functional connectivity during agency inference processing in a setting where action and outcome are independent. Participants completed a computerized task in which they pressed a button followed by one of two color words (red or blue) and rated their experienced agency over producing the color. Before executing the action, a matching or mismatching color word was pre-activated by explicitly instructing participants to produce the color (goal condition) or by briefly presenting the color word (prime condition). In both conditions, experienced agency was higher in matching versus mismatching trials. Furthermore, increased electroencephalography (EEG)-based connectivity strength was observed between parietal and frontal nodes and within the (pre)frontal cortex when color-outcomes matched with goals and participants reported high agency. This pattern of increased connectivity was not identified in trials where outcomes were pre-activated through primes. These results suggest that different connections are involved in the experience and in the loss of agency, as well as in inferences of agency resulting from different types of pre-activation. Moreover, the findings provide novel support for the involvement of a fronto-parietal network in agency inferences.

  12. Phylogenetic Inference of HIV Transmission Clusters

    Directory of Open Access Journals (Sweden)

    Vlad Novitsky

    2017-10-01

    Full Text Available Better understanding the structure and dynamics of HIV transmission networks is essential for designing the most efficient interventions to prevent new HIV transmissions, and ultimately for gaining control of the HIV epidemic. The inference of phylogenetic relationships and the interpretation of results rely on the definition of the HIV transmission cluster. The definition of the HIV cluster is complex and dependent on multiple factors, including the design of sampling, accuracy of sequencing, precision of sequence alignment, evolutionary models, the phylogenetic method of inference, and specified thresholds for cluster support. While the majority of studies focus on clusters, non-clustered cases could also be highly informative. A new dimension in the analysis of the global and local HIV epidemics is the concept of phylogenetically distinct HIV sub-epidemics. The identification of active HIV sub-epidemics reveals spreading viral lineages and may help in the design of targeted interventions. HIV clustering can also be affected by sampling density. Obtaining a proper sampling density may increase statistical power and reduce sampling bias, so sampling density should be taken into account in study design and in interpretation of phylogenetic results. Finally, recent advances in long-range genotyping may enable more accurate inference of HIV transmission networks. If performed in real time, it could both inform public-health strategies and be clinically relevant (e.g., drug-resistance testing).

  13. Causal inference of asynchronous audiovisual speech

    Directory of Open Access Journals (Sweden)

    John F Magnotti

    2013-11-01

    Full Text Available During speech perception, humans integrate auditory information from the voice with visual information from the face. This multisensory integration increases perceptual precision, but only if the two cues come from the same talker; this requirement has been largely ignored by current models of speech perception. We describe a generative model of multisensory speech perception that includes this critical step of determining the likelihood that the voice and face information have a common cause. A key feature of the model is that it is based on a principled analysis of how an observer should solve this causal inference problem using the asynchrony between two cues and the reliability of the cues. This allows the model to make predictions about the behavior of subjects performing a synchrony judgment task, predictive power that does not exist in other approaches, such as post hoc fitting of Gaussian curves to behavioral data. We tested the model predictions against the performance of 37 subjects performing a synchrony judgment task viewing audiovisual speech under a variety of manipulations, including varying asynchronies, intelligibility, and visual cue reliability. The causal inference model outperformed the Gaussian model across two experiments, providing a better fit to the behavioral data with fewer parameters. Because the causal inference model is derived from a principled understanding of the task, model parameters are directly interpretable in terms of stimulus and subject properties.
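The core causal-inference step, computing the posterior probability that voice and face share a common cause given the measured asynchrony, can be sketched as follows; the Gaussian common-cause likelihood, uniform separate-cause likelihood, and all parameter values are illustrative, not the paper's fitted model.

```python
# Posterior probability of a common cause for an audiovisual asynchrony dt:
# under a common cause, dt is Gaussian around zero; under separate causes,
# dt is uniform over a broad window. Parameters are illustrative.
import math

def p_common(dt_ms, prior=0.5, sigma=80.0, window=500.0):
    like_c1 = (math.exp(-dt_ms**2 / (2 * sigma**2))
               / (sigma * math.sqrt(2 * math.pi)))   # common cause
    like_c2 = 1.0 / (2 * window)                     # anywhere in +/- window ms
    num = prior * like_c1
    return num / (num + (1 - prior) * like_c2)

for dt in (0.0, 100.0, 300.0):
    print(dt, round(p_common(dt), 3))
```

A synchrony judgment can then be modeled as reporting "synchronous" when this posterior exceeds a threshold, which is how such a model generates predictions for the task described in the abstract.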

  14. Functional neuroanatomy of intuitive physical inference.

    Science.gov (United States)

    Fischer, Jason; Mikhael, John G; Tenenbaum, Joshua B; Kanwisher, Nancy

    2016-08-23

    To engage with the world-to understand the scene in front of us, plan actions, and predict what will happen next-we must have an intuitive grasp of the world's physical structure and dynamics. How do the objects in front of us rest on and support each other, how much force would be required to move them, and how will they behave when they fall, roll, or collide? Despite the centrality of physical inferences in daily life, little is known about the brain mechanisms recruited to interpret the physical structure of a scene and predict how physical events will unfold. Here, in a series of fMRI experiments, we identified a set of cortical regions that are selectively engaged when people watch and predict the unfolding of physical events-a "physics engine" in the brain. These brain regions are selective to physical inferences relative to nonphysical but otherwise highly similar scenes and tasks. However, these regions are not exclusively engaged in physical inferences per se or, indeed, even in scene understanding; they overlap with the domain-general "multiple demand" system, especially the parts of that system involved in action planning and tool use, pointing to a close relationship between the cognitive and neural mechanisms involved in parsing the physical content of a scene and preparing an appropriate action.

  15. Elements of Causal Inference: Foundations and Learning Algorithms

    DEFF Research Database (Denmark)

    Peters, Jonas Martin; Janzing, Dominik; Schölkopf, Bernhard

    A concise and self-contained introduction to causal inference, increasingly important in data science and machine learning.

  16. Integrating distributed Bayesian inference and reinforcement learning for sensor management

    NARCIS (Netherlands)

    Grappiolo, C.; Whiteson, S.; Pavlin, G.; Bakker, B.

    2009-01-01

    This paper introduces a sensor management approach that integrates distributed Bayesian inference (DBI) and reinforcement learning (RL). DBI is implemented using distributed perception networks (DPNs), a multiagent approach to performing efficient inference, while RL is used to automatically

  17. The In-Situ One-Step Synthesis of a PDC Macromolecular Pro-Drug and the Fabrication of a Novel Core-Shell Micelle.

    Science.gov (United States)

    Yu, Cui-Yun; Yang, Sa; Li, Zhi-Ping; Huang, Can; Ning, Qian; Huang, Wen; Yang, Wen-Tong; He, Dongxiu; Sun, Lichun

    2016-01-01

    The development of slow-release nano-sized carriers for efficient antineoplastic drug delivery with a biocompatible and biodegradable pectin-based macromolecular pro-drug for tumor therapy has been reported in this study. Pectin-doxorubicin conjugates (PDC), a macromolecular pro-drug, were prepared via an amide condensation reaction, and a novel amphiphilic core-shell micelle based on the PDC macromolecular pro-drug (PDC-M) was self-assembled in situ, with pectin as the hydrophilic shell and doxorubicin (DOX) as the hydrophobic core. The chemical structure of the PDC macromolecular pro-drug was identified by both Fourier transform infrared spectroscopy (FTIR) and nuclear magnetic resonance spectroscopy (¹H-NMR), which proved that the doxorubicin combined well with the pectin and formed the macromolecular pro-drug. Scanning electron microscopy (SEM) showed that the PDC-M have an irregular spherical shape and a uniform size. The average particle size of PDC-M, further measured by a Zetasizer nanoparticle analyzer (Nano ZS, Malvern Instruments), was about 140 nm. The encapsulation efficiency and drug loading were 57.82% ± 3.7% (n = 3) and 23.852% ± 2.3% (n = 3), respectively. The in vitro drug release behaviors of the resulting PDC-M were studied in a simulated tumor environment (pH 5.0), blood (pH 7.4) and a lysosome medium (pH 6.8), and showed a prolonged slow-release profile. Assays for antiproliferative effects and flow cytometry of the resulting PDC-M in HepG2 cell lines demonstrated greater delayed- and slow-release properties as compared to free DOX. A cell viability study against endothelial cells further revealed that the resulting PDC-M possesses excellent cell compatibility and low cytotoxicity in comparison with free DOX. Hemolysis activity was investigated in rabbits, and the results also demonstrated that the PDC-M has greater compatibility in comparison with free DOX. This shows that the resulting PDC-M can ameliorate the

  18. Spondias mombin L. (Anacardiaceae) enhances detoxification of hepatic and macromolecular oxidants in acetaminophen-intoxicated rats.

    Science.gov (United States)

    Saheed, Sabiu; Taofik, Sunmonu Olatunde; Oladipo, Ajani Emmanuel; Tom, Ashafa Anofi Omotayo

    2017-11-01

    Oxidative stress is a common pathological condition associated with drug-induced hepatotoxicity. This study investigated Spondias mombin L. aqueous leaf extract on reactive oxygen species and acetaminophen-mediated oxidative onslaught in rats' hepatocytes. Hepatotoxic rats were orally administered the extract and vitamin C for 4 weeks. The extract dose-dependently scavenged DPPH, hydrogen peroxide and hydroxyl radicals, with IC50 values of 0.13, 0.66, and 0.64 mg/mL, and corresponding % inhibitions of 89, 80, and 90%, respectively, at 1.0 mg/mL. Ferric ion was also significantly reduced. The marked (p<0.05) increases in the activities of alkaline phosphatase, alanine aminotransferase and aspartate aminotransferase were reduced following treatment with the extract. The extract also significantly (p<0.05) induced the activities of antioxidant enzymes. These inductions reversed the acetaminophen-enhanced reduction in the specific activities of these enzymes as well as attenuated the observed elevated concentrations of autooxidized products and rived DNA in the acetaminophen-intoxicated animals. The observed effects competed with those of vitamin C and are suggestive of hepatoprotective and antioxidative attributes of the extract. Overall, the data from the present findings suggest that S. mombin aqueous leaf extract is capable of ameliorating acetaminophen-mediated oxidative hepatic damage via enhancement of antioxidant defense systems.

  19. Radiation induced formation of giant cells in Saccharomyces uvarum. Pt. 4. Macromolecular synthesis and protein patterns

    Energy Technology Data Exchange (ETDEWEB)

    Rink, H; Baumstark-Khan, C; Partke, H J

    1986-08-01

    X-irradiated (1.0 kGy) yeast cells (Saccharomyces uvarum, ATCC 9080) grown in liquid medium stop their mitotic activities and form giant cells by developing several buds which do not separate from the mother cells. Depending on the time in culture, wet and dry weights per cell, protein, RNA and DNA contents per cell, as well as incorporation rates of ¹⁴C-leucine per cell and per hour, and patterns (isoelectric focusing) of water-soluble proteins were studied. Weights per cell, RNA and protein contents per cell and ¹⁴C-leucine incorporation rates increase markedly in giant cells, whereas the DNA content per cell is only duplicated. Protein patterns in isoelectric focusing show one interesting difference: in samples from giant cells one protein band (pI = 6.63) decreases after 8 h in culture and later disappears completely. This finding is not due to primary damage in X-irradiated DNA but seems to be related to the control of cell cycle events.

  20. Bootstrapping phylogenies inferred from rearrangement data

    Directory of Open Access Journals (Sweden)

    Lin Yu

    2012-08-01

    Full Text Available Abstract Background Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. Results We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its

  1. Bootstrapping phylogenies inferred from rearrangement data.

    Science.gov (United States)

    Lin, Yu; Rajan, Vaibhav; Moret, Bernard Me

    2012-08-29

    Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its support values follow a similar scale and its receiver
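The classic sequence bootstrap that the paper benchmarks against resamples alignment columns with replacement and re-evaluates the grouping of interest on each replicate. A minimal sketch, with a toy alignment and a deliberately trivial distance rule standing in for real tree inference:

```python
# Classic phylogenetic bootstrap on a toy alignment: resample columns with
# replacement, re-test whether taxa A and B group together on each replicate,
# and report the supporting fraction. The "inference" is a simple Hamming-
# distance rule, purely for illustration.
import random
random.seed(0)

alignment = {                       # 12 aligned columns per taxon
    "A": "ACGTACGTACGA",
    "B": "ACGTACGTACGT",
    "C": "TGCAACGTTGCA",
}

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

def supports_ab(aln):
    """True if A and B are closer to each other than either is to C."""
    return hamming(aln["A"], aln["B"]) < min(hamming(aln["A"], aln["C"]),
                                             hamming(aln["B"], aln["C"]))

def bootstrap_support(aln, n_rep=500):
    ncol = len(next(iter(aln.values())))
    hits = 0
    for _ in range(n_rep):
        cols = [random.randrange(ncol) for _ in range(ncol)]   # resample columns
        rep = {t: "".join(s[c] for c in cols) for t, s in aln.items()}
        hits += supports_ab(rep)
    return hits / n_rep

print(bootstrap_support(alignment))
```

The paper's difficulty is precisely that this column-resampling step has no analogue for rearrangement data, where the whole genome is one character; its contribution is a jackknife/bootstrap hybrid playing the role of the `cols` resampling above.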

  2. Type Inference for Session Types in the Pi-Calculus

    DEFF Research Database (Denmark)

    Graversen, Eva Fajstrup; Harbo, Jacob Buchreitz; Huttel, Hans

    2014-01-01

    In this paper we present a direct algorithm for session type inference for the π-calculus. Type inference for session types has previously been achieved by either imposing limitations and restrictions on the π-calculus, or by reducing the type inference problem to that for linear types. Our approach...

  3. Reasoning about Informal Statistical Inference: One Statistician's View

    Science.gov (United States)

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  4. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  5. Gd-DTPA L-cystine bisamide copolymers as novel biodegradable macromolecular contrast agents for MR blood pool imaging.

    Science.gov (United States)

    Kaneshiro, Todd L; Ke, Tianyi; Jeong, Eun-Kee; Parker, Dennis L; Lu, Zheng-Rong

    2006-06-01

The purpose of this study was to synthesize biodegradable Gd-DTPA L-cystine bisamide copolymers (GCAC) as safe and effective macromolecular contrast agents for magnetic resonance imaging (MRI) and to evaluate their biodegradability and efficacy in MR blood pool imaging in an animal model. Three new biodegradable GCAC with different substituents at the cystine bisamide [R = H (GCAC), CH2CH2CH3 (Gd-DTPA L-cystine bispropyl amide copolymers, GCPC), and CH(CH3)2 (Gd-DTPA cystine bisisopropyl copolymers, GCIC)] were prepared by the condensation copolymerization of diethylenetriamine pentaacetic acid (DTPA) dianhydride with cystine bisamide or bisalkyl amides, followed by complexation with gadolinium triacetate. The degradability of the agents was studied in vitro by incubation in 15 µM cysteine and in vivo in Sprague-Dawley rats. The kinetics of in vivo contrast enhancement was investigated in Sprague-Dawley rats on a Siemens Trio 3 T scanner. The apparent molecular weight of the polydisulfide Gd(III) chelates ranged from 22 to 25 kDa. The longitudinal (T1) relaxivities of GCAC, GCPC, and GCIC were 4.37, 5.28, and 5.56 mM⁻¹ s⁻¹ at 3 T, respectively. The polymeric ligands and polymeric Gd(III) chelates readily degraded into smaller molecules upon incubation with 15 µM cysteine via disulfide-thiol exchange reactions. The in vitro degradation rates of both the polymeric ligands and the macromolecular Gd(III) chelates decreased as the steric hindrance around the disulfide bonds increased. The agents readily degraded in vivo, and the catabolic degradation products were detected in rat urine samples collected after intravenous injection. The agents showed strong contrast enhancement in the blood pool, major organs, and tissues at a dose of 0.1 mmol Gd/kg. The differences in their in vitro degradability did not significantly alter the kinetics of in vivo contrast enhancement. These novel GCAC are promising contrast agents for cardiovascular and tumor MRI.
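The reported relaxivities can be turned into an expected T1 shortening via the standard linear relation R1_obs = 1/T1_baseline + r1·[Gd]. A small sketch using the GCAC value from the abstract; the baseline blood T1 is an assumed round number for illustration, not a value from the study:

```python
def t1_with_agent(r1_relaxivity, gd_mM, t1_baseline_s):
    """Observed T1 (s) in the presence of a Gd contrast agent.

    Standard fast-exchange relation: R1_obs = 1/T1_baseline + r1 * [Gd],
    with r1 in mM^-1 s^-1 and [Gd] in mM. The baseline T1 passed in is an
    assumption of this sketch, not a measurement from the paper.
    """
    r1_obs = 1.0 / t1_baseline_s + r1_relaxivity * gd_mM
    return 1.0 / r1_obs

# GCAC at 3 T: r1 = 4.37 mM^-1 s^-1 (value from the abstract);
# 0.5 mM Gd in blood with an assumed ~1.9 s baseline T1.
t1 = t1_with_agent(4.37, 0.5, 1.9)
```

The roughly five-fold drop in blood T1 in this toy calculation is what produces the strong blood-pool enhancement the abstract describes.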

  6. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Science.gov (United States)

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences, from intentionality and desire to belief to personality, that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research. (c) 2012 APA, all rights reserved.

  7. Coal transportation road damage

    International Nuclear Information System (INIS)

    Burtraw, D.; Harrison, K.; Pawlowski, J.A.

    1994-01-01

Heavy trucks are primarily responsible for pavement damage to the nation's highways. In this paper we evaluate the pavement damage caused by coal trucks. We analyze the chief source of pavement damage (vehicle weight per axle, not total vehicle weight) and the chief cost involved (the periodic overlay required when a road's surface becomes worn). This analysis is presented in two stages. In the first section we present a synopsis of current economic theory, including simple versions of the formulas that can be used to calculate the costs of pavement wear. In the second section we apply this theory to a specific example proximate to the reference environment for the Fuel Cycle Study in New Mexico, in order to provide a numerical measure of the magnitude of the costs.
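The abstract's central point, that damage depends on weight per axle rather than gross vehicle weight, is commonly quantified with the AASHTO "fourth-power" rule of thumb. The sketch below uses that standard approximation, not the paper's own formulas; loads are in kips (thousands of pounds) against the conventional 18-kip reference axle:

```python
def esal_per_axle(axle_load_kips, reference_load_kips=18.0):
    """Equivalent single-axle loads (ESALs) for one axle pass.

    Fourth-power approximation: pavement damage grows roughly as
    (axle load / 18-kip reference load)**4.
    """
    return (axle_load_kips / reference_load_kips) ** 4

def truck_damage(axle_loads_kips):
    """Damage of one truck pass, in ESALs, summed over its axles."""
    return sum(esal_per_axle(w) for w in axle_loads_kips)

# Same 72-kip gross weight, distributed differently:
spread = truck_damage([18, 18, 18, 18])   # four 18-kip axles -> 4.0 ESALs
concentrated = truck_damage([36, 36])     # two 36-kip axles -> 32.0 ESALs
```

The eight-fold difference at identical gross weight is why per-axle load, not total weight, drives the overlay costs the paper estimates.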

  8. Natural resource damage assessment

    International Nuclear Information System (INIS)

    Seddelmeyer, J.

    1991-01-01

    The assessment and collection of natural resource damages from petroleum and chemical companies unfortunate enough to have injured publicly owned natural resources is perhaps the most rapidly expanding area of environmental liability. The idea of recovering for injury to publicly owned natural resources is an extension of traditional common law tort concepts under which a person who negligently injures another or his property is called upon to compensate the injured party. Normally, once liability has been established, it is a fairly straightforward matter to calculate the various elements of loss, such as the cost to repair or replace damaged property, or medical expenses, and lost income. More difficult questions, such as the amount to be awarded for pain and suffering or emotional distress, are left to the jury, although courts limit the circumstances in which the jury is permitted to award such damages

  9. The inference from a single case: moral versus scientific inferences in implementing new biotechnologies.

    Science.gov (United States)

    Hofmann, B

    2008-06-01

    Are there similarities between scientific and moral inference? This is the key question in this article. It takes as its point of departure an instance of one person's story in the media changing both Norwegian public opinion and a brand-new Norwegian law prohibiting the use of saviour siblings. The case appears to falsify existing norms and to establish new ones. The analysis of this case reveals similarities in the modes of inference in science and morals, inasmuch as (a) a single case functions as a counter-example to an existing rule; (b) there is a common presupposition of stability, similarity and order, which makes it possible to reason from a few cases to a general rule; and (c) this makes it possible to hold things together and retain order. In science, these modes of inference are referred to as falsification, induction and consistency. In morals, they have a variety of other names. Hence, even without abandoning the fact-value divide, there appear to be similarities between inference in science and inference in morals, which may encourage communication across the boundaries between "the two cultures" and which are relevant to medical humanities.

  10. Web-Ice: Integrated Data Collection and Analysis for Macromolecular Crystallography

    International Nuclear Information System (INIS)

    Gonzalez, Ana; Gonzalez, Ana; Moorhead, Penjit; McPhillips, Scott E.; Song, Jinhu; Sharp, Ken; Taylor, John R.; Adams, Paul D.; Sauter, Nicholas K.; Soltis, S. Michael

    2007-01-01

New software tools are introduced to facilitate diffraction experiments involving large numbers of crystals. While existing programs have long provided a framework for lattice indexing, Bragg spot integration, and symmetry determination, these initial data processing steps often require significant manual effort. This limits the timely availability of the data analysis needed for high-throughput procedures, including the selection of the best crystals from a large sample pool and the calculation of optimal data collection parameters to assure complete spot coverage with minimal radiation damage. To make these protocols more efficient, we developed a network of software applications and application servers, collectively known as Web-Ice. When the package is installed at a crystallography beamline, a programming interface allows the beamline control software (e.g., Blu-Ice/DCSS) to trigger data analysis automatically. Results are organized based on a list of samples that the user provides and are examined within a Web page, accessible both locally at the beamline and remotely. Optional programming interfaces permit the user to control data acquisition through the Web browser. The system as a whole is implemented to support multiple users and multiple processors, and can be expanded to provide additional scientific functionality. Web-Ice has a distributed architecture consisting of several stand-alone software components working together via a well-defined interface. Other synchrotrons or institutions may integrate selected components or the whole of Web-Ice with their own data acquisition software. Updated information about current developments may be obtained at http://smb.slac.stanford.edu/research/developments/webice

  11. An integrated native mass spectrometry and top-down proteomics method that connects sequence to structure and function of macromolecular complexes

    Science.gov (United States)

    Li, Huilin; Nguyen, Hong Hanh; Ogorzalek Loo, Rachel R.; Campuzano, Iain D. G.; Loo, Joseph A.

    2018-02-01

Mass spectrometry (MS) has become a crucial technique for the analysis of protein complexes. Native MS has traditionally examined protein subunit arrangements, while proteomics MS has focused on sequence identification. These two techniques are usually performed separately, without taking advantage of the synergies between them. Here we describe the development of an integrated native MS and top-down proteomics method using Fourier-transform ion cyclotron resonance (FTICR) to analyse macromolecular protein complexes in a single experiment. We address previous concerns about employing FTICR MS to measure large macromolecular complexes by demonstrating the detection of complexes up to 1.8 MDa, and we demonstrate the efficacy of this technique for directly connecting sequence to higher-order structural information for several large complexes. We then summarize the unique functionalities of different activation/dissociation techniques. The platform expands the ability of MS to integrate proteomics and structural biology to provide insights into protein structure, function and regulation.

  12. Making microenvironments: A look into incorporating macromolecular crowding into in vitro experiments, to generate biomimetic microenvironments which are capable of directing cell function for tissue engineering applications.

    Science.gov (United States)

    Benny, Paula; Raghunath, Michael

    2017-01-01

Biomimetic microenvironments are key components of successful cell culture and tissue engineering in vitro. Among the most accurate biomimetic microenvironments are those made by the cells themselves. Cell-made microenvironments most closely resemble the in vivo state, as they are cell-specific and produced by the very cells that reside in that microenvironment. However, cell-made microenvironments have been challenging to re-create in vitro because the required extracellular matrix composition, volume, and complexity are difficult to achieve. By applying macromolecular crowding to current cell culture protocols, cell-made microenvironments, or cell-derived matrices, can be generated at significant rates in vitro. In this review, we examine the causes and effects of macromolecular crowding and how it has been applied in several in vitro systems, including tissue engineering.

  13. Macromolecular HPMA-based nanoparticles with cholesterol for solid-tumor targeting: detailed study of the inner structure of a highly efficient drug delivery system

    Czech Academy of Sciences Publication Activity Database

    Filippov, Sergey K.; Chytil, Petr; Konarev, P. V.; Dyakonova, M.; Papadakis, C. M.; Zhigunov, Alexander; Pleštil, Josef; Štěpánek, Petr; Etrych, Tomáš; Ulbrich, Karel; Svergun, D. I.

    2012-01-01

Vol. 13, No. 8 (2012), pp. 2594-2604 ISSN 1525-7797 R&D Projects: GA MŠk ME09059; GA AV ČR IAAX00500803; GA ČR GAP108/12/0640 Institutional research plan: CEZ:AV0Z40500505 Institutional support: RVO:61389013 Keywords: HPMA * cholesterol * SAXS Subject RIV: CD - Macromolecular Chemistry Impact factor: 5.371, year: 2012

  14. Synthesis and evaluation of water-soluble poly(vinyl alcohol)-paclitaxel conjugate as a macromolecular prodrug

    International Nuclear Information System (INIS)

    Kakinoki, Atsufumi; Kaneo, Yoshiharu; Tanaka, Tetsuro; Hosokawa, Yoshitsugu

    2008-01-01

Paclitaxel (PTX) is an antitumor agent used in the treatment of various human cancers. Cremophor EL and ethanol are used to formulate PTX in commercial injection solutions because of its poor solubility in water. However, these agents cause severe allergic reactions upon intravenous administration. The aim of this study was to synthesize water-soluble macromolecular prodrugs of PTX to enhance its therapeutic efficacy. Poly(vinyl alcohol) (PVA, 80 kDa), a water-soluble synthetic polymer that is safe and stable in the body, was used as the drug carrier. The 2'-hydroxyl group of PTX was reacted with succinic anhydride, and the carboxylic group of the succinyl spacer was then coupled to PVA via an ethylene diamine spacer, yielding the water-soluble prodrug poly(vinyl alcohol)-paclitaxel conjugate (PVA-SPTX). The solubility of PTX was greatly enhanced by conjugation to PVA. The release of PTX from the conjugate was accelerated under neutral to basic conditions in an in vitro release experiment. [125I]-labeled PVA-SPTX was retained in the blood circulation for several days and was gradually distributed into tumorous tissue after intravenous injection into tumor-bearing mice. PVA-SPTX inhibited the growth of sarcoma 180 cells subcutaneously inoculated in mice. These results suggest that the water solubility of PTX was markedly enhanced by conjugation to PVA and that PVA-SPTX effectively delivered PTX to tumorous tissue via the enhanced permeability and retention (EPR) effect. (author)

  15. Innovative High-Throughput SAXS Methodologies Based on Photonic Lab-on-a-Chip Sensors: Application to Macromolecular Studies.

    Science.gov (United States)

    Rodríguez-Ruiz, Isaac; Radajewski, Dimitri; Charton, Sophie; Phamvan, Nhat; Brennich, Martha; Pernot, Petra; Bonneté, Françoise; Teychené, Sébastien

    2017-06-02

The relevance of coupling droplet-based Photonic Lab-on-a-Chip (PhLoC) platforms with the Small-Angle X-Ray Scattering (SAXS) technique is highlighted here for high-throughput investigations of protein macromolecular interactions. With this configuration, minute amounts of sample are required to obtain reliable statistical data. The PhLoC platforms presented in this work are designed to allow and control effective mixing of precise amounts of proteins, crystallization reagents, and buffer in nanoliter volumes, and the subsequent generation of nanodroplets by means of a two-phase flow. Spectrophotometric sensing permits fine control of droplet generation frequency and stability as well as of concentration conditions, and finally the droplet flow is synchronized to perform synchrotron-radiation SAXS measurements in individual droplets (each one acting as an isolated microreactor) to probe protein interactions. With this configuration, droplet physico-chemical conditions can be reproducibly and finely tuned and monitored without cross-contamination, allowing the screening of a substantial number of saturation conditions with a small amount of biological material. The setup was tested and validated using lysozyme as a model of study. By means of SAXS experiments, the protein's radius of gyration and structure envelope were calculated as a function of protein concentration. The values obtained were found to be in good agreement with previously reported data, but with a dramatic reduction in sample volume requirements compared to studies reported in the literature.
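The radius of gyration reported from SAXS data is conventionally extracted with a Guinier fit, ln I(q) ≈ ln I0 - (Rg²/3)q² in the low-q regime. A self-contained sketch on synthetic data (not the lysozyme measurements from the paper):

```python
import math

def guinier_rg(q_values, intensities):
    """Estimate the radius of gyration Rg from a low-q Guinier fit.

    Fits a least-squares line of ln(I) against q^2; the Guinier law
    ln I = ln I0 - (Rg^2 / 3) q^2 gives slope = -Rg^2 / 3.
    """
    x = [q * q for q in q_values]
    y = [math.log(i) for i in intensities]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return math.sqrt(-3.0 * slope)

# Synthetic scattering curve for a particle with Rg = 15 Å,
# sampled inside the Guinier regime (q * Rg below ~1.3).
rg_true = 15.0
qs = [0.01 + 0.005 * k for k in range(10)]
intens = [100.0 * math.exp(-(q * rg_true) ** 2 / 3.0) for q in qs]
rg_est = guinier_rg(qs, intens)  # recovers rg_true on this noiseless data
```

On real droplet data the fit window must be restricted to q·Rg ≲ 1.3 and the intensities are noisy; this sketch only shows the relation used.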

  16. Accelerated Development of Supramolecular Corneal Stromal-Like Assemblies from Corneal Fibroblasts in the Presence of Macromolecular Crowders.

    Science.gov (United States)

    Kumar, Pramod; Satyam, Abhigyan; Fan, Xingliang; Rochev, Yury; Rodriguez, Brian J; Gorelov, Alexander; Joshi, Lokesh; Raghunath, Michael; Pandit, Abhay; Zeugolis, Dimitrios I

    2015-07-01

Tissue engineering by self-assembly uses the cells' secretome as a regeneration template and biological factory of trophic factors. Despite the several advantages witnessed in preclinical and clinical settings, the major obstacle to wide acceptance of this technology remains the slow formation of extracellular matrix. In this study, we assessed the influence of macromolecular crowding (MMC)/the excluded volume effect, a biophysical phenomenon that accelerates thermodynamic activities and biological processes by several orders of magnitude, on human corneal fibroblast (HCF) culture. Our data indicate that the addition of a negatively charged galactose derivative (carrageenan) to HCF culture, even at 0.5% serum, increases tissue-specific matrix deposition 12-fold while maintaining physiological cell morphology and protein/gene expression. Gene analysis indicates that a glucose derivative (dextran sulfate) may drive corneal fibroblasts toward a myofibroblast lineage. Collectively, these results indicate that MMC may be suitable not only for clinical translation and commercialization of tissue-engineering-by-self-assembly therapies, but also for the development of in vitro pathophysiology models.

  17. New insight on aliphatic linkages in the macromolecular organic fraction of Orgueil and Murchison meteorites through ruthenium tetroxide oxidation

    Science.gov (United States)

    Remusat, Laurent; Derenne, Sylvie; Robert, François

    2005-09-01

Ruthenium tetroxide oxidation was used to examine the macromolecular insoluble organic matter (IOM) from the Orgueil and Murchison meteorites and especially to characterize the aliphatic linkages. Already applied to various terrestrial samples, ruthenium tetroxide is a selective oxidant which destroys aromatic units, converting them into CO₂, and yields aliphatic and aromatic acids. In our experiment on chondritic IOM, it produces mainly short aliphatic diacids and polycarboxylic aromatic acids. Some short hydroxyacids are also detected. Aliphatic diacids are interpreted as aliphatic bridges between aromatic units in the chemical structure, and polycarboxylic aromatic acids are the result of the fusion of polyaromatic units. The product distribution shows that aliphatic links are short with numerous substitutions. No indigenous monocarboxylic acid was detected, showing that free aliphatic chains must be very short (less than three carbon atoms). The hydroxyacids are related to the occurrence of ester and ether functional groups within the aliphatic bridges between the aromatic units. This technique thus allows us to characterize in detail the aliphatic linkages of the IOMs, and the derived conclusions are in agreement with spectroscopic, pyrolytic, and degradative results previously reported. Compared to terrestrial samples, the aliphatic part of chondritic IOM is shorter and highly substituted. Aromatic units are smaller and more cross-linked than in coals, as already proposed from NMR data. Orgueil and Murchison IOM exhibit some tiny differences, especially in the length of aliphatic chains.

  18. Supramolecular Assembly of Comb-like Macromolecules Induced by Chemical Reactions that Modulate the Macromolecular Interactions In Situ.

    Science.gov (United States)

    Xia, Hongwei; Fu, Hailin; Zhang, Yanfeng; Shih, Kuo-Chih; Ren, Yuan; Anuganti, Murali; Nieh, Mu-Ping; Cheng, Jianjun; Lin, Yao

    2017-08-16

Supramolecular polymerization or assembly of proteins or large macromolecular units by a homogeneous nucleation mechanism can be quite slow and require specific solution conditions. In nature, protein assembly is often regulated by molecules that modulate the electrostatic interactions of the protein subunits to achieve various association strengths. The key to this regulation is the coupling of the assembly process with a reversible or irreversible chemical reaction that occurs within the constituent subunits. However, realizing this complex process through the rational design of synthetic molecules or macromolecules remains a challenge. Herein, we use a synthetic polypeptide-grafted comb macromolecule to demonstrate how the in situ modulation of interactions between the charged macromolecules affects their resulting supramolecular structures. The kinetics of structural formation was studied and can be described by a generalized model of nucleated polymerization containing secondary pathways. Basic thermodynamic analysis indicated the delicate role of the electrostatic interactions between the charged subunits in the reaction-induced assembly process. This approach may be applicable to assembling a variety of ionic soft matter that is amenable to chemical reactions in situ.
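Generalized nucleated-polymerization models with secondary pathways are usually written as moment equations for the fibril number P and the assembled mass M. The sketch below integrates one common form (primary nucleation from free monomer plus a monomer- and mass-dependent secondary pathway) with illustrative rate constants; the specific equations and parameters are assumptions of this sketch, not the model fitted in the paper:

```python
def polymerization_kinetics(m_total, kn, kplus, k2, nc=2, n2=2,
                            dt=0.01, steps=20000):
    """Forward-Euler integration of nucleated polymerization kinetics.

    Moment equations (one common generalized form, not the paper's own):
        dP/dt = kn * m**nc + k2 * m**n2 * M   # primary + secondary nucleation
        dM/dt = 2 * kplus * m * P             # elongation at both fibril ends
    with free monomer m = m_total - M. Returns the assembled-mass trace.
    """
    P, M = 0.0, 0.0
    mass = []
    for _ in range(steps):
        m = m_total - M
        P += dt * (kn * m**nc + k2 * m**n2 * M)
        M += dt * (2.0 * kplus * m * P)
        M = min(M, m_total)  # mass conservation guard for the explicit scheme
        mass.append(M)
    return mass

# Illustrative constants chosen to show the characteristic sigmoidal curve:
curve = polymerization_kinetics(1.0, kn=1e-4, kplus=5.0, k2=1e-3)
```

The lag phase followed by rapid growth to a plateau is the qualitative signature of nucleated assembly with secondary pathways that such kinetic data are fitted against.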

  19. Effects of nicotine on cellular proliferation, cell cycle phase distribution, and macromolecular synthesis in human promyelocytic HL-60 leukaemia cells

    International Nuclear Information System (INIS)

    Konno, S.; Wu, J.M.; Chiao, J.W.

    1986-01-01

Addition of nicotine causes a dose- and time-dependent inhibition of cell growth in human promyelocytic HL-60 leukemia cells, with 4 mM nicotine resulting in 50% inhibition of cellular proliferation after 48-50 h. Accompanying the anticellular effect of nicotine is a significant change in the cell cycle distribution of HL-60 cells. For example, treatment with 4 mM nicotine for 20 h causes an increase in the proportion of G1-phase cells (from 49% to 57%) and a significant decrease in the proportion of S-phase cells (from 41% to 32%). These results suggest that nicotine causes partial cell arrest in the G1 phase, which may in part account for its effects on cell growth. To determine whether nicotine changes the cellular uptake/transport of macromolecular precursors, HL-60 cells were treated with 2-16 mM nicotine for 30 h, at the end of which cells were labelled with (3H)thymidine, (3H)uridine, (14C)lysine and (35S)methionine, and the trichloroacetic acid-soluble and -insoluble radioactivities from each of the labelling conditions were determined. These studies show that nicotine mainly affects the de novo synthesis of proteins. (author)

  20. Determination of macromolecular exchange and PO2 in the microcirculation: a simple system for in vivo fluorescence and phosphorescence videomicroscopy

    Directory of Open Access Journals (Sweden)

    Torres L.N.

    2001-01-01

We have developed a system with two epi-illumination sources, a DC-regulated lamp for transillumination, and mechanical switches for rapid shifts of illumination and detection over defined areas (250-750 µm²) by fluorescence and phosphorescence videomicroscopy. The system permits investigation of standard microvascular parameters, vascular permeability, and intra- and extravascular PO2 by phosphorescence quenching of Pd-meso-tetra(4-carboxyphenyl)porphine (PORPH). A Pechan prism was used to position a defined region over the photomultiplier and TV camera. In order to validate the system for in vivo use, in vitro tests were performed with probes at concentrations that can be found in microvascular studies. Extensive in vitro evaluations were performed by filling glass capillaries with solutions of various concentrations of FITC-dextran (diluted in blood and in saline) mixed with different amounts of PORPH. Fluorescence intensity and phosphorescence decay were determined for each mixture. FITC-dextran solutions without PORPH and PORPH solutions without FITC-dextran were used as references. Phosphorescence decay curves were relatively unaffected by the presence of FITC-dextran at all concentrations tested (0.1 µg/ml to 5 mg/ml). Likewise, fluorescence determinations were performed in the presence of PORPH (0.05 to 0.5 mg/ml). The system was successfully used to study macromolecular extravasation and PO2 in the rat mesenteric circulation under controlled conditions and during ischemia-reperfusion.
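Oxygen tension from phosphorescence quenching of probes such as PORPH is obtained from the Stern-Volmer relation 1/τ = 1/τ0 + kq·PO2, where τ is the measured phosphorescence lifetime. A sketch with lifetime constants in the range typical of Pd-porphyrin probes; the numbers are assumed for illustration, not the calibration used in this work:

```python
def po2_from_lifetime(tau_us, tau0_us, kq):
    """PO2 (mmHg) from a phosphorescence lifetime via Stern-Volmer quenching.

    1/tau = 1/tau0 + kq * PO2  =>  PO2 = (1/tau - 1/tau0) / kq
    tau and tau0 in microseconds; kq in 1/(mmHg * us).
    """
    return (1.0 / tau_us - 1.0 / tau0_us) / kq

# Assumed Pd-porphyrin-like calibration constants (not from the study):
tau0 = 600.0   # zero-oxygen lifetime, us
kq = 3.0e-4    # quenching constant, 1/(mmHg * us)
po2 = po2_from_lifetime(100.0, tau0, kq)  # a measured 100 us decay
```

Shorter measured lifetimes map to higher PO2, which is why fitting the phosphorescence decay curve per region yields the intra- and extravascular oxygen maps described above.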