WorldWideScience

Sample records for strong inference science

  1. Strong Inference in Mathematical Modeling: A Method for Robust Science in the Twenty-First Century.

    Science.gov (United States)

    Ganusov, Vitaly V

    2016-01-01

    While there are many opinions on what mathematical modeling in biology is, in essence, modeling is a mathematical tool, like a microscope, which allows consequences to logically follow from a set of assumptions. Only when this tool is applied appropriately, as a microscope is used to look at small items, can it help us understand the importance of specific mechanisms/assumptions in biological processes. Mathematical modeling can be less useful or even misleading if used inappropriately, for example, when a microscope is used to study stars. According to some philosophers (Oreskes et al., 1994), the best use of mathematical models is not when a model is used to confirm a hypothesis but rather when a model shows the inconsistency of the model (defined by a specific set of assumptions) with data. Following the principle of strong inference for experimental sciences proposed by Platt (1964), I suggest "strong inference in mathematical modeling" as an effective and robust way of using mathematical modeling to understand the mechanisms driving the dynamics of biological systems. The major steps of strong inference in mathematical modeling are (1) to develop multiple alternative models for the phenomenon in question; (2) to compare the models with available experimental data and to determine which of the models are not consistent with the data; (3) to determine the reasons why rejected models failed to explain the data; and (4) to suggest experiments which would allow one to discriminate between the remaining alternative models. The use of strong inference is likely to provide better robustness of predictions of mathematical models, and it should be strongly encouraged in mathematical modeling-based publications in the Twenty-First Century.
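
    A minimal sketch of steps (1)-(2) under stated assumptions: the synthetic growth-curve data, the two stand-in candidate models (exponential vs. logistic), and the AIC-based comparison below are illustrative choices, not code from the paper.

```python
# A sketch of steps (1)-(2): fit two alternative growth models to the same
# data and ask which one the data reject. The synthetic data, model forms,
# and AIC comparison are illustrative assumptions, not code from the paper.
import numpy as np
from scipy.optimize import curve_fit

def exponential(t, n0, r):
    return n0 * np.exp(r * t)

def logistic(t, n0, r, k):
    return k / (1 + (k / n0 - 1) * np.exp(-r * t))

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 25)
data = logistic(t, 10, 0.9, 500) * rng.lognormal(0, 0.05, t.size)  # truth: logistic

for name, model, p0 in [("exponential", exponential, (10, 0.5)),
                        ("logistic", logistic, (10, 0.5, 400))]:
    popt, _ = curve_fit(model, t, data, p0=p0, maxfev=10000)
    rss = float(np.sum((data - model(t, *popt)) ** 2))
    n, k = t.size, len(popt)
    aic = n * np.log(rss / n) + 2 * k  # Gaussian-error AIC, up to a constant
    print(f"{name:12s} AIC = {aic:8.1f}")
# Step (3): the exponential model fails because it lacks saturation;
# step (4): design measurements where the surviving models disagree most.
```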

  2. Generic Patch Inference

    DEFF Research Database (Denmark)

    Andersen, Jesper; Lawall, Julia Laetitia

    2008-01-01

    A key issue in maintaining Linux device drivers is the need to update drivers in response to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spfind, that identifies common changes made... Developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect...

  3. The Freedom to Design Nature: Kant's Strong Ought→Can Inference in 21st Century Perspective

    Directory of Open Access Journals (Sweden)

    Edward Eugene Kleist

    2006-01-01

    Kant's attempts to formulate a conception of the harmony of nature and freedom have two logical presuppositions. The first presupposition is the separation of ought and is, which provides a logical formulation of the separation of freedom and nature. Kant might well have settled on the view that the separation between nature and freedom cannot be bridged. Why did Kant attempt to overcome said separation? The second presupposition of Kant's project to bridge nature and freedom involves an ought→can inference, stating that moral obligation implies the possibility of its fulfillment. There are at least two ways this inference can be understood. There is a weak sense of the inference, stating that no one is obliged to do the impossible. There is also a very strong sense of the inference, stating that if a moral obligation is found to obtain, it must then be possible to fulfill it. Kant interprets the ought→can inference in this strong as well as in the weak sense. Nature, the law-governed totality of what exists, must be understood as able to provide a suitable field for moral realization. The isomorphism between the lawfulness of nature and that of moral freedom animates Kant's account of moral judgment, and will provide the main focus of the current investigation. Kant conceives of nature and freedom as twin kingdoms, thus providing a theoretical model validating this ought→can inference. The weaker sense of this ought→can inference does justice to moral judgment without requiring the awesome task of bridging nature and freedom. Why, then, should we maintain the strong ought→can inference in our post-Kantian situation? I suggest that Kant's insistence on the strong ought→can inference may yield an ethical approach to the ever more powerful ways in which human beings technologically transform nature, including human nature itself.

  4. Population genetics inference for longitudinally-sampled mutants under strong selection.

    Science.gov (United States)

    Lacerda, Miguel; Seoighe, Cathal

    2014-11-01

    Longitudinal allele frequency data are becoming increasingly prevalent. Such samples permit statistical inference of the population genetics parameters that influence the fate of mutant variants. To infer these parameters by maximum likelihood, the mutant frequency is often assumed to evolve according to the Wright-Fisher model. For computational reasons, this discrete model is commonly approximated by a diffusion process that requires the assumption that the forces of natural selection and mutation are weak. This assumption is not always appropriate. For example, mutations that impart drug resistance in pathogens may evolve under strong selective pressure. Here, we present an alternative approximation to the mutant-frequency distribution that does not make any assumptions about the magnitude of selection or mutation and is much more computationally efficient than the standard diffusion approximation. Simulation studies are used to compare the performance of our method to that of the Wright-Fisher and Gaussian diffusion approximations. For large populations, our method is found to provide a much better approximation to the mutant-frequency distribution when selection is strong, while all three methods perform comparably when selection is weak. Importantly, maximum-likelihood estimates of the selection coefficient are severely attenuated when selection is strong under the two diffusion models, but not when our method is used. This is further demonstrated with an application to mutant-frequency data from an experimental study of bacteriophage evolution. We therefore recommend our method for estimating the selection coefficient when the effective population size is too large to utilize the discrete Wright-Fisher model. Copyright © 2014 by the Genetics Society of America.
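
    For intuition, a short forward simulation of the discrete Wright-Fisher model with selection, the process the approximations above target; the population size, selection coefficients, and initial frequency are illustrative assumptions.

```python
# Forward simulation of the discrete Wright-Fisher model with selection.
# Population size, selection coefficients, and the initial frequency are
# illustrative assumptions, not values from the paper.
import numpy as np

def wright_fisher(n_gen, pop_size, s, p0, rng):
    """Mutant frequency trajectory under selection coefficient s and drift."""
    p, traj = p0, [p0]
    for _ in range(n_gen):
        w = p * (1 + s) / (p * (1 + s) + (1 - p))   # deterministic selection step
        p = rng.binomial(pop_size, w) / pop_size    # drift: binomial resampling
        traj.append(p)
    return np.array(traj)

rng = np.random.default_rng(0)
strong = wright_fisher(50, 10_000, 0.5, 0.01, rng)
weak = wright_fisher(50, 10_000, 0.01, 0.01, rng)
print(f"frequency after 50 generations: strong s {strong[-1]:.3f}, weak s {weak[-1]:.3f}")
# Under s = 0.5 the mutant sweeps within tens of generations, exactly the
# regime where a weak-selection diffusion approximation becomes unreliable.
```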

  5. Probing the Small-scale Structure in Strongly Lensed Systems via Transdimensional Inference

    Science.gov (United States)

    Daylan, Tansu; Cyr-Racine, Francis-Yan; Diaz Rivero, Ana; Dvorkin, Cora; Finkbeiner, Douglas P.

    2018-02-01

    Strong lensing is a sensitive probe of the small-scale density fluctuations in the Universe. We implement a pipeline to model strongly lensed systems using probabilistic cataloging, which is a transdimensional, hierarchical, and Bayesian framework to sample from a metamodel (union of models with different dimensionality) consistent with observed photon count maps. Probabilistic cataloging allows one to robustly characterize modeling covariances within and across lens models with different numbers of subhalos. Unlike traditional cataloging of subhalos, it does not require model subhalos to improve the goodness of fit above the detection threshold. Instead, it allows the exploitation of all information contained in the photon count maps—for instance, when constraining the subhalo mass function. We further show that, by not including these small subhalos in the lens model, fixed-dimensional inference methods can significantly mismodel the data. Using a simulated Hubble Space Telescope data set, we show that the subhalo mass function can be probed even when many subhalos in the sample catalogs are individually below the detection threshold and would be absent in a traditional catalog. The implemented software, Probabilistic Cataloger (PCAT), is made publicly available at https://github.com/tdaylan/pcat.

  6. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    2010-01-01

    Chapter 9: This contribution concerns statistical inference for parametric models used in stochastic geometry and based on quick and simple simulation-free procedures as well as more comprehensive methods based on a maximum likelihood or Bayesian approach combined with Markov chain Monte Carlo (MCMC) techniques. Due to space limitations the focus is on spatial point processes.

  7. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text written by Jesper Møller, Aalborg University, is submitted for the collection ‘Stochastic Geometry: Highlights, Interactions and New Perspectives', edited by Wilfrid S. Kendall and Ilya Molchanov, to be published by Clarendon Press, Oxford, and planned to appear as Section 4.1 with the title ‘Inference'.) This contribution concerns statistical inference for parametric models used in stochastic geometry and based on quick and simple simulation-free procedures as well as more comprehensive methods using Markov chain Monte Carlo (MCMC) simulations. Due to space limitations the focus...

  8. Fault location and source process of the Boumerdes, Algeria, earthquake inferred from geodetic and strong motion data

    Science.gov (United States)

    Semmane, Fethi; Campillo, Michel; Cotton, Fabrice

    2005-01-01

    The Boumerdes earthquake occurred on a fault whose precise location, offshore the Algerian coast, was unknown. Geodetic data are used to determine the absolute position of the fault. The fault might emerge at about 15 km offshore. Accelerograms are used to infer the space-time history of the rupture using a two-step inversion in the spectral domain. The observed strong motion records agree with the synthetics for the fault location inferred from geodetic data. The fault plane ruptured for about 18 seconds. The slip distribution on the fault indicates one asperity northwest of the hypocenter with maximum slip amplitude about 3 m. This asperity is probably responsible for most of the damage. Another asperity with slightly smaller slip amplitude is located southeast of the hypocenter. The rupture stops its westward propagation close to the Thenia fault, a structure almost perpendicular to the main fault.

  9. Characterizing the Google Books Corpus: Strong Limits to Inferences of Socio-Cultural and Linguistic Evolution.

    Directory of Open Access Journals (Sweden)

    Eitan Adam Pechenick

    It is tempting to treat frequency trends from the Google Books data sets as indicators of the "true" popularity of various words and phrases. Doing so allows us to draw quantitatively strong conclusions about the evolution of cultural perception of a given topic, such as time or gender. However, the Google Books corpus suffers from a number of limitations which make it an obscure mask of cultural popularity. A primary issue is that the corpus is in effect a library, containing one of each book. A single, prolific author is thereby able to noticeably insert new phrases into the Google Books lexicon, whether the author is widely read or not. With this understood, the Google Books corpus remains an important data set to be considered more lexicon-like than text-like. Here, we show that a distinct problematic feature arises from the inclusion of scientific texts, which have become an increasingly substantive portion of the corpus throughout the 1900s. The result is a surge of phrases typical to academic articles but less common in general, such as references to time in the form of citations. We use information theoretic methods to highlight these dynamics by examining and comparing major contributions via a divergence measure of English data sets between decades in the period 1800-2000. We find that only the English Fiction data set from the second version of the corpus is not heavily affected by professional texts. Overall, our findings call into question the vast majority of existing claims drawn from the Google Books corpus, and point to the need to fully characterize the dynamics of the corpus before using these data sets to draw broad conclusions about cultural and linguistic evolution.

  10. Characterizing the Google Books Corpus: Strong Limits to Inferences of Socio-Cultural and Linguistic Evolution.

    Science.gov (United States)

    Pechenick, Eitan Adam; Danforth, Christopher M; Dodds, Peter Sheridan

    2015-01-01

    It is tempting to treat frequency trends from the Google Books data sets as indicators of the "true" popularity of various words and phrases. Doing so allows us to draw quantitatively strong conclusions about the evolution of cultural perception of a given topic, such as time or gender. However, the Google Books corpus suffers from a number of limitations which make it an obscure mask of cultural popularity. A primary issue is that the corpus is in effect a library, containing one of each book. A single, prolific author is thereby able to noticeably insert new phrases into the Google Books lexicon, whether the author is widely read or not. With this understood, the Google Books corpus remains an important data set to be considered more lexicon-like than text-like. Here, we show that a distinct problematic feature arises from the inclusion of scientific texts, which have become an increasingly substantive portion of the corpus throughout the 1900s. The result is a surge of phrases typical to academic articles but less common in general, such as references to time in the form of citations. We use information theoretic methods to highlight these dynamics by examining and comparing major contributions via a divergence measure of English data sets between decades in the period 1800-2000. We find that only the English Fiction data set from the second version of the corpus is not heavily affected by professional texts. Overall, our findings call into question the vast majority of existing claims drawn from the Google Books corpus, and point to the need to fully characterize the dynamics of the corpus before using these data sets to draw broad conclusions about cultural and linguistic evolution.
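
    A toy illustration of the divergence comparison described in these two records: the sketch below contrasts two invented decade-level word-frequency tables with Jensen-Shannon divergence; the counts are made up, and the per-word decomposition is one simple way to surface the terms driving a shift.

```python
# Toy version of the decade-to-decade divergence comparison: contrast two
# invented word-frequency tables with Jensen-Shannon divergence and surface
# the words contributing most to the shift. Counts here are made up.
from collections import Counter
import numpy as np
from scipy.spatial.distance import jensenshannon

decade_a = Counter({"the": 500, "love": 40, "phenotype": 1, "figure": 5})
decade_b = Counter({"the": 480, "love": 25, "phenotype": 30, "figure": 60})

vocab = sorted(set(decade_a) | set(decade_b))
p = np.array([decade_a[w] for w in vocab], float)
q = np.array([decade_b[w] for w in vocab], float)
p, q = p / p.sum(), q / q.sum()

jsd = jensenshannon(p, q, base=2) ** 2   # squared distance = divergence, in bits
m = (p + q) / 2
contrib = 0.5 * (p * np.log2(p / m) + q * np.log2(q / m))  # per-word terms
print(f"JSD = {jsd:.4f} bits")
for w, c in sorted(zip(vocab, contrib), key=lambda x: -x[1])[:2]:
    print(f"  {w}: {c:.4f}")   # the academic-register terms dominate the shift
```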

  11. Fault location and source process of the 2003 Boumerdes, Algeria, earthquake inferred from geodetic and strong motion data.

    Science.gov (United States)

    Semmane, F.; Campillo, M.; Cotton, F.

    2004-12-01

    The Boumerdes earthquake occurred on a fault whose precise location, offshore the Algerian coast, was unknown. Geodetic data consist of GPS measurements, levelling points and coastal uplifts. They are first used to determine the absolute position of the fault. We performed a series of inversions assuming different positions and chose the model giving the smallest misfit. According to this analysis, the fault emerges at about 15 km offshore. Accelerograms are then used to infer the space-time history of rupture on the fault plane using a two-step inversion in the spectral domain. The observed strong motion records are in good agreement with the synthetics for the fault location inferred from geodetic data. The fault plane ruptured for about 16 seconds. The slip distribution on the fault indicates one asperity north-west of the hypocenter with a maximum slip amplitude larger than 2.5 m. Another asperity with slightly smaller slip amplitude is located south-east of the hypocenter. The rupture seems to stop its westward propagation when it encounters the Thenia fault, a structure almost perpendicular to the main fault. We computed the spatial distribution of ground motion predicted by this fault model and compared it with the observed damage.

  12. Multimodel inference and adaptive management

    Science.gov (United States)

    Rehme, S.E.; Powell, L.A.; Allen, Craig R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
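
    A common concrete form of multimodel inference is Akaike weights; the sketch below, with placeholder AIC values, shows how near-equal weights signal the weak-inference outcome described above.

```python
# Akaike weights, a standard multimodel-inference summary (Burnham and
# Anderson). The AIC values below are placeholders for a set of fitted
# candidate models.
import numpy as np

aic = np.array([102.3, 103.1, 104.0, 110.5])   # one AIC per candidate model
delta = aic - aic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()
print(weights.round(3))   # [0.473 0.317 0.202 0.008]
# No single model dominates here: that spread is the "weak inference" outcome
# the review found to be common, and it should be carried into management
# recommendations (e.g., via adaptive management) rather than ignored.
```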

  13. Isotope ratio mass spectrometry as a tool for source inference in forensic science: A critical review.

    Science.gov (United States)

    Gentile, Natacha; Siegwolf, Rolf T W; Esseiva, Pierre; Doyle, Sean; Zollinger, Kurt; Delémont, Olivier

    2015-06-01

    Isotope ratio mass spectrometry (IRMS) has been used in numerous fields of forensic science from a source inference perspective. This review compiles the studies published so far on the application of IRMS to the traditional fields of forensic science. It completes the review of Benson et al. [1] and synthesises the extent of knowledge already gathered in the following fields: illicit drugs, flammable liquids, human provenancing, microtraces, explosives and other specific materials (packaging tapes, safety matches, plastics, etc.). For each field, a discussion assesses the state of science and highlights the relevance of the information in a forensic context. Through the different discussions which mark out the review, the potential and limitations of IRMS, as well as the needs and challenges of future studies, are emphasized. The paper elicits the various dimensions of the source which can be obtained from the isotope information and demonstrates the transversal nature of IRMS as a tool for source inference. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. Observation, Inference, and Imagination: Elements of Edgar Allan Poe's Philosophy of Science

    Science.gov (United States)

    Gelfert, Axel

    2014-03-01

    Edgar Allan Poe's standing as a literary figure, who drew on (and sometimes dabbled in) the scientific debates of his time, makes him an intriguing character for any exploration of the historical interrelationship between science, literature and philosophy. His sprawling `prose-poem' Eureka (1848), in particular, has sometimes been scrutinized for anticipations of later scientific developments. By contrast, the present paper argues that it should be understood as a contribution to the raging debates about scientific methodology at the time. This methodological interest, which is echoed in Poe's `tales of ratiocination', gives rise to a proposed new mode of—broadly abductive—inference, which Poe attributes to the hybrid figure of the `poet-mathematician'. Without creative imagination and intuition, Science would necessarily remain incomplete, even by its own standards. This concern with imaginative (abductive) inference ties in nicely with his coherentism, which grants pride of place to the twin virtues of Simplicity and Consistency, which must constrain imagination lest it degenerate into mere fancy.

  15. Targeted learning in data science: causal inference for complex longitudinal studies

    CERN Document Server

    van der Laan, Mark J

    2018-01-01

    This textbook for graduate students in statistics, data science, and public health deals with the practical challenges that come with big, complex, and dynamic data. It presents a scientific roadmap to translate real-world data science applications into formal statistical estimation problems by using the general template of targeted maximum likelihood estimators. These targeted machine learning algorithms estimate quantities of interest while still providing valid inference. Targeted learning methods within data science are a critical component for solving scientific problems in the modern age. The techniques can answer complex questions including optimal rules for assigning treatment based on longitudinal data with time-dependent confounding, as well as other estimands in dependent data structures, such as networks. Included in Targeted Learning in Data Science are demonstrations with software packages and real data sets that present a case that targeted learning is crucial for the next generation...

  16. Strong asymmetry of hemispheric climates during MIS-13 inferred from correlating China loess and Antarctica ice records

    Directory of Open Access Journals (Sweden)

    Z. T. Guo

    2009-02-01

    We correlate the China loess and Antarctica ice records to address the inter-hemispheric climate link over the past 800 ka. The results show a broad coupling between Asian and Antarctic climates at the glacial-interglacial scale. However, a number of decoupled aspects are revealed, among which marine isotope stage (MIS) 13 exhibits a strong anomaly compared with the other interglacials. It is characterized by unusually positive benthic oxygen (δ18O) and carbon isotope (δ13C) values in the world oceans, cooler Antarctic temperature, lower summer sea surface temperature in the South Atlantic, lower CO2 and CH4 concentrations, but by extremely strong Asian, Indian and African summer monsoons, the weakest Asian winter monsoon, and the lowest Asian dust and iron fluxes. Pervasive warm conditions were also evidenced by the records from northern high-latitude regions. These consistently indicate a warmer Northern Hemisphere and a cooler Southern Hemisphere, and hence a strong asymmetry of hemispheric climates during MIS-13. Similar anomalies of lesser extents also occurred during MIS-11 and MIS-5e. Thus, MIS-13 provides a case in which the Northern Hemisphere experienced substantial warming under relatively low concentrations of greenhouse gases. It suggests that the global climate system possesses a natural variability that is not predictable from the simple response of northern summer insolation and atmospheric CO2 changes. During MIS-13, both hemispheres responded in different ways, leading to anomalous continental, marine and atmospheric conditions at the global scale. The correlations also suggest that the marine δ18O record is not always a reliable indicator of northern ice-volume changes, and that the asymmetry of hemispheric climates is one of the prominent factors controlling the strength of the Asian, Indian and African monsoon circulations, most likely through modulating the position of...

  17. Strong Lensing Science Results from the Hyper Suprime-Cam Survey

    Science.gov (United States)

    Wong, Kenneth; HSC SSP Strong Lens Working Group

    2018-01-01

    Strong gravitational lenses are valuable objects for studying galaxy structure and cosmology. Lensing is a unique probe of the dark matter structure of galaxies, groups, and clusters, as well as an independent tool for constraining cosmological parameters. Lensing also magnifies the background source population, allowing for detailed studies of their properties at high resolution. However, strong lenses are rare and difficult to find, requiring deep wide-area high-resolution imaging surveys. With data from the ongoing Hyper Suprime-Cam (HSC) Subaru Strategic Program, we have discovered over 100 new strong lenses at the galaxy and group scale to expand the sample of lensing systems, particularly at redshifts z > 0.5, where there have previously been relatively few known lenses. We present a summary of the latest strong lensing science results from the HSC survey data taken through the S17A semester.

  18. The Semantic Web and Human Inference: A Lesson from Cognitive Science

    Science.gov (United States)

    Yamauchi, Takashi

    For the development of Semantic Web technology, researchers and developers in the Semantic Web community need to focus on the areas in which human reasoning is particularly difficult. Two studies in this paper demonstrate that people are predisposed to use class-inclusion labels for inductive judgments. This tendency appears to stem from a general characteristic of human reasoning - using heuristics to solve problems. The inference engines and interface designs that incorporate human reasoning need to integrate this general characteristic underlying human induction.

  19. Observation, Inference, and Imagination: Elements of Edgar Allan Poe's Philosophy of Science

    Science.gov (United States)

    Gelfert, Axel

    2014-01-01

    Edgar Allan Poe's standing as a literary figure, who drew on (and sometimes dabbled in) the scientific debates of his time, makes him an intriguing character for any exploration of the historical interrelationship between science, literature and philosophy. His sprawling "prose-poem" "Eureka" (1848), in particular, has…

  20. Living Learning Communities: An Intervention in Keeping Women Strong in Science, Technology, Engineering, and Mathematics

    Science.gov (United States)

    Belichesky, Jennifer

    2013-01-01

    The purpose of this study was to expand on the current research pertaining to women in science, technology, engineering, and mathematics (STEM) majors, better understand the experiences of undergraduate women in the sciences, identify barriers to female persistence in their intended STEM majors, and understand the impact of the STEM co-educational…

  1. Knowledge and inference

    CERN Document Server

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in the artificial intellig...

  2. Patterns of rationality: recurring inferences in science, social cognition and religious thinking

    CERN Document Server

    Bertolotti, Tommaso

    2015-01-01

    This book proposes an applied epistemological framework for investigating science, social cognition and religious thinking based on inferential patterns that recur in the different domains. It presents human rationality as a tool that allows us to make sense of our (physical or social) surroundings. It shows that the resulting cognitive activity produces a broad spectrum of outputs, such as scientific models and experimentation, gossip and social networks, but also ancient and contemporary deities. The book consists of three parts, the first of which addresses scientific modeling and experimentation, and their application to the analysis of scientific rationality. Thus, this part continues the tradition of eco-cognitive epistemology and abduction studies. The second part deals with the relationship between social cognition and cognitive niche construction, i.e. the evolutionarily relevant externalization of knowledge onto the environment, while the third part focuses on what is commonly defined as "irrational...

  3. Inferences about Supernova Physics from Gravitational-Wave Measurements: GW151226 Spin Misalignment as an Indicator of Strong Black-Hole Natal Kicks.

    Science.gov (United States)

    O'Shaughnessy, Richard; Gerosa, Davide; Wysocki, Daniel

    2017-07-07

    The inferred parameters of the binary black hole GW151226 are consistent with nonzero spin for the most massive black hole, misaligned from the binary's orbital angular momentum. If the black holes formed through isolated binary evolution from an initially aligned binary star, this misalignment would then arise from a natal kick imparted to the first-born black hole at its birth during stellar collapse. We use simple kinematic arguments to constrain the characteristic magnitude of this kick, and find that a natal kick v_k ≳ 50 km/s must be imparted to the black hole at birth to produce misalignments consistent with GW151226. Such large natal kicks exceed those adopted by default in most of the current supernova and binary evolution models.
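
    A toy version of the kinematic argument, assuming an illustrative pre-kick orbital velocity (not a value from the paper): the spin-orbit tilt scales roughly as the out-of-plane kick component over the orbital velocity.

```python
# Toy version of the kinematic argument: a kick v_k tilts the orbital plane
# by roughly the out-of-plane kick component over the orbital velocity.
# The 500 km/s pre-kick orbital velocity is an assumed illustrative value.
import numpy as np

rng = np.random.default_rng(42)
v_orb = 500.0                       # km/s, assumed relative orbital velocity
for v_k in (10.0, 50.0, 200.0):     # kick magnitudes to compare
    kick = v_k * rng.standard_normal((100_000, 3)) / np.sqrt(3)  # Maxwellian kicks
    tilt = np.degrees(np.arctan2(np.abs(kick[:, 2]), v_orb))     # out-of-plane tilt
    print(f"v_k = {v_k:5.0f} km/s -> median tilt {np.median(tilt):5.2f} deg")
# Tilt grows in proportion to v_k; the paper's full version of this argument
# finds v_k of at least about 50 km/s is needed for GW151226's misalignment.
```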

  4. Micro-Macro Compatibility: When Does a Complex Systems Approach Strongly Benefit Science Learning?

    Science.gov (United States)

    Samon, Sigal; Levy, Sharona T.

    2017-01-01

    The study explores how a complexity approach empowers science learning. A complexity approach represents systems as many interacting entities. The construct of micro-macro compatibility is introduced, the degree of similarity between behaviors at the micro- and macro-levels of the system. Seventh-grade students' learning about gases was studied…

  5. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; Nord, B.; Gaitsch, H.; Gaitsch, S.; Lin, H.; Allam, S.; Collett, T. E.; Furlanetto, C.; Gill, M. S. S.; More, A.; Nightingale, J.; Odden, C.; Pellico, A.; Tucker, D. L.; Costa, L. N. da; Neto, A. Fausti; Kuropatkin, N.; Soares-Santos, M.; Welch, B.; Zhang, Y.; Frieman, J. A.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Rosell, A. Carnero; Kind, M. Carrasco; Carretero, J.; Cunha, C. E.; D’Andrea, C. B.; Desai, S.; Dietrich, J. P.; Drlica-Wagner, A.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Nichol, R. C.; Nugent, P.; Ogando, R. L. C.; Plazas, A. A.; Reil, K.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.

    2017-09-01

    We report the results of our searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0...

  6. Philosophical skepticism not relativism is the problem with the Strong Programme in Science Studies and with Educational Constructivism

    Science.gov (United States)

    Papayannakos, Dimitris P.

    2008-06-01

    The structure of David Bloor's argument for the Strong Programme (SP) in Science Studies is criticized from the philosophical perspective of anti-skeptical, scientific realism. The paper transforms the common criticism of SP—that the symmetry principle of SP implies an untenable form of cognitive relativism—into the clear philosophical issue of naturalism versus Platonism. It is also argued that the concrete patterns of SP's interest-explanations and its sociological definition of knowledge involve philosophical skepticism. It is claimed, then, that the most problematic elements of SP reside primarily in philosophical skepticism. It is also claimed that this sort of criticism can be directed against other, more radical versions of constructivism in science and science education studies.

  7. Low Genetic Diversity and Strong Geographical Structure of the Critically Endangered White-Headed Langur (Trachypithecus leucocephalus) Inferred from Mitochondrial DNA Control Region Sequences.

    Science.gov (United States)

    Wang, Weiran; Qiao, Yu; Pan, Wenshi; Yao, Meng

    2015-01-01

    Many Asian colobine monkey species are suffering from habitat destruction and population size decline. There is a great need to understand their genetic diversity, population structure and demographic history for effective species conservation. The white-headed langur (Trachypithecus leucocephalus) is a Critically Endangered colobine species endemic to the limestone karst forests in southwestern China. We analyzed the mitochondrial DNA (mtDNA) control region sequences of 390 fecal samples from 40 social groups across the main distribution areas, which represented one-third of the total extant population. Only nine haplotypes and 10 polymorphic sites were identified, indicating remarkably low genetic diversity in the species. Using a subset of 77 samples from different individuals, we evaluated genetic variation, population structure, and population demographic history. We found very low values of haplotype diversity (h = 0.570 ± 0.056) and nucleotide diversity (π = 0.00323 ± 0.00044) in the hypervariable region I (HVRI) of the mtDNA control region. The distribution of haplotypes displayed a marked geographical pattern, with one population (Chongzuo, CZ) showing a complete lack of genetic diversity (having only one haplotype), whereas the other population (Fusui, FS) had all nine haplotypes. We detected strong population genetic structure among habitat patches (ΦST = 0.375, P ...) and inferred a stable population size with modest population expansion in the last 2,000 years. Our results indicate different genetic diversity and possibly distinct population histories for different local populations, and suggest that CZ and FS should be considered as one evolutionarily significant unit (ESU) and two management units (MUs) pending further investigation using nuclear markers.
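
    The two diversity statistics reported above have simple closed forms (Nei 1987); the sketch below evaluates them on invented haplotype counts and sequences.

```python
# The two statistics quoted above, computed from toy inputs. Only the
# formulas are standard (Nei 1987): h = n/(n-1) * (1 - sum p_i^2), and
# pi = expected pairwise difference per site. Counts and sequences invented.
from itertools import combinations
import numpy as np

counts = np.array([40, 20, 10, 7])            # individuals per haplotype
n = counts.sum()
p = counts / n
h = n / (n - 1) * (1 - np.sum(p ** 2))        # haplotype (gene) diversity

seqs = ["ACGTACGT", "ACGTACGA", "ACGAACGA", "TCGAACGA"]  # one per haplotype

def per_site_diff(a, b):
    return sum(x != y for x, y in zip(a, b)) / len(a)

pi = 2 * sum(p[i] * p[j] * per_site_diff(seqs[i], seqs[j])
             for i, j in combinations(range(len(seqs)), 2))
print(f"h = {h:.3f}, pi = {pi:.5f}")
# A fixed single-haplotype population such as CZ gives h = 0 and pi = 0.
```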

  8. A Strong Start-up Scene Flourishes in the Life Sciences Capital Basel.

    Science.gov (United States)

    Burckhardt, Peter

    2014-12-01

    Basel is known for its successful global players in Pharma, Agro and Chemicals. The wealth of top-tier companies, universities and academic institutions in such a small region is unparalleled. It creates an optimum climate for world-class research and its translation into successful businesses. This is also reflected in a strong start-up scene. Over the past decades the multinational players have shown that they are able to adapt to the ever-increasing challenges in the market. Basel has seen blue-chip company mergers, accompanied by the transfer of business assets into spin-off companies. This process created a mind-set for change which has positively influenced the local start-up environment. Actelion is one of these former spin-offs that successfully made the transition to become a global player. BioVersys and PIQUR are two of the most promising very early stage Swiss biotech companies. Many other examples can be found in Northwestern Switzerland. The region also offers a solid background of supporting activities. Infrastructure, coaching and all other support are offered and complement national innovation initiatives.

  9. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is...

  10. Research prioritization through prediction of future impact on biomedical science: a position paper on inference-analytics.

    Science.gov (United States)

    Ganapathiraju, Madhavi K; Orii, Naoki

    2013-08-30

    Advances in biotechnology have created "big-data" situations in molecular and cellular biology. Several sophisticated algorithms have been developed that process big data to generate hundreds of biomedical hypotheses (or predictions). The bottleneck to translating this large number of biological hypotheses is that each of them needs to be studied by experimentation for interpreting its functional significance. Even when the predictions are estimated to be very accurate, from a biologist's perspective, the choice of which of these predictions is to be studied further is made based on factors like the availability of reagents and resources and the possibility of formulating some reasonable hypothesis about its biological relevance. When viewed from a global perspective, say from that of a federal funding agency, ideally the choice of which prediction should be studied would be made based on which of them can make the most translational impact. We propose that algorithms be developed to identify which of the computationally generated hypotheses have potential for high translational impact; this way, funding agencies and the scientific community can invest resources and drive the research based on a global view of biomedical impact without being deterred by a local view of feasibility. In short, data-analytic algorithms analyze big data and generate hypotheses; in contrast, the proposed inference-analytic algorithms analyze these hypotheses and rank them by predicted biological impact. We demonstrate this through the development of an algorithm to predict the biomedical impact of protein-protein interactions (PPIs), which is estimated by the number of future publications that cite the paper which originally reported the PPI. This position paper describes a new computational problem that is relevant in the era of big data and discusses the challenges that exist in studying this problem, highlighting the need for the scientific community to engage in this line of research. The proposed...
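
    One hedged reading of the proposed inference-analytics step, as a sketch: train a regressor on hypothetical features of past PPIs against their later citation counts, then rank new predictions; the features, data, and model choice below are all assumptions.

```python
# Sketch of the proposed ranking step: train a regressor on features of past
# hypotheses (here PPIs) against later citation counts, then rank new
# predictions. Features, targets, and the model choice are all assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
n = 200
# Hypothetical per-PPI features: network degree, disease-association score,
# prediction confidence.
X = rng.random((n, 3))
citations = 20 * X[:, 1] + 5 * X[:, 2] + rng.normal(0, 2, n)  # synthetic target

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, citations)
new_ppis = rng.random((5, 3))
order = np.argsort(-model.predict(new_ppis))
print("suggested experimental priority:", order)
```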

  11. Elements of Causal Inference: Foundations and Learning Algorithms

    DEFF Research Database (Denmark)

    Peters, Jonas Martin; Janzing, Dominik; Schölkopf, Bernhard

    A concise and self-contained introduction to causal inference, increasingly important in data science and machine learning.

  12. The DES Bright Arcs Survey: Hundreds of Candidate Strongly Lensed Galaxy Systems from the Dark Energy Survey Science Verification and Year 1 Observations

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; Nord, B.; Gaitsch, H.; Gaitsch, S.; Lin, H.; Allam, S.; Odden, C.; Pellico, A.; Tucker, D. L.; Kuropatkin, N.; Soares-Santos, M. [Fermi National Accelerator Laboratory, P.O. Box 500, Batavia, IL 60510 (United States); Collett, T. E. [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth, PO1 3FX (United Kingdom); Furlanetto, C.; Nightingale, J. [University of Nottingham, School of Physics and Astronomy, Nottingham NG7 2RD (United Kingdom); Gill, M. S. S. [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); More, A. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba 277-8583 (Japan); Costa, L. N. da; Neto, A. Fausti, E-mail: diehl@fnal.gov [Laboratório Interinstitucional de e-Astronomia—LIneA, Rua Gal. José Cristino 77, Rio de Janeiro, RJ—20921-400 (Brazil); Collaboration: DES Collaboration; and others

    2017-09-01

    We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.
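
    A schematic of the catalog-level preselection step, with invented column names and cut values standing in for the actual DES selection.

```python
# Schematic catalog-level preselection: keep objects plausibly consistent
# with blue arcs near the survey depth. Column names and cut values are
# invented placeholders, not the actual DES selection.
import pandas as pd

catalog = pd.DataFrame({
    "mag_i":     [21.2, 22.8, 19.5, 23.5],
    "g_minus_r": [0.2, 1.4, 0.3, 0.1],
    "r_minus_i": [0.1, 0.8, 0.05, 0.0],
})
candidates = catalog[
    (catalog["mag_i"] < 23.0)         # brighter than the 10-sigma extended limit
    & (catalog["g_minus_r"] < 0.6)    # blue in g-r, as lensed arcs often are
    & (catalog["r_minus_i"] < 0.5)
]
print(candidates)   # survivors go on to cutout inspection and visual ranking
```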

  13. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth...

  14. Statistical inference on variance components

    NARCIS (Netherlands)

    Verdooren, L.R.

    1988-01-01

    In several sciences but especially in animal and plant breeding, the general mixed model with fixed and random effects plays a great role. Statistical inference on variance components means tests of hypotheses about variance components, constructing confidence intervals for them, estimating them, ...

  15. A strong 'filter' effect of the East China Sea land bridge for East Asia's temperate plant species: inferences from molecular phylogeography and ecological niche modelling of Platycrater arguta (Hydrangeaceae).

    Science.gov (United States)

    Qi, Xin-Shuai; Yuan, Na; Comes, Hans Peter; Sakaguchi, Shota; Qiu, Ying-Xiong

    2014-03-04

    In East Asia, an increasing number of studies on temperate forest tree species find evidence for migration and gene exchange across the East China Sea (ECS) land bridge up until the last glacial maximum (LGM). However, it is less clear when and how lineages diverged in this region, whether in full isolation or in the face of post-divergence gene flow. Here, we investigate the effects of Quaternary changes in climate and sea level on the evolutionary and demographic history of Platycrater arguta, a rare temperate understorey shrub with disjunct distributions in East China (var. sinensis) and South Japan (var. arguta). Molecular data were obtained from 14 P. arguta populations to infer current patterns of molecular structure and diversity in relation to past (Last Interglacial and Last Glacial Maximum) and present distributions based on ecological niche modelling (ENM). A coalescent-based isolation-with-migration (IM) model was used to estimate lineage divergence times and population demographic parameters. Combining information from nuclear/chloroplast sequence data with nuclear microsatellites, our IM analyses identify the two varieties as genetically distinct units that evolved in strict allopatry since the mid-Pleistocene, c. 0.89 (0.51-1.2) Ma. Together with Bayesian Skyline Plots, our data further suggest that both lineages experienced post-divergence demographic growth, followed by refugial isolation, divergence, and in the case of var. arguta post-glacial admixture. However, past species distribution modelling indicates that the species' overall distribution has not greatly changed over the last glacial cycles. Our findings highlight the important influence of ancient sea-level changes on the diversification of East Asia's temperate flora. Implicitly, they challenge the notion of general temperate forest expansion across the ECS land bridge, demonstrating instead its 'filter' effect owing to an unsuitable environment for certain species and their biological...

  16. A strong ‘filter’ effect of the East China Sea land bridge for East Asia’s temperate plant species: inferences from molecular phylogeography and ecological niche modelling of Platycrater arguta (Hydrangeaceae)

    Science.gov (United States)

    2014-01-01

    Background In East Asia, an increasing number of studies on temperate forest tree species find evidence for migration and gene exchange across the East China Sea (ECS) land bridge up until the last glacial maximum (LGM). However, it is less clear when and how lineages diverged in this region, whether in full isolation or in the face of post-divergence gene flow. Here, we investigate the effects of Quaternary changes in climate and sea level on the evolutionary and demographic history of Platycrater arguta, a rare temperate understorey shrub with disjunct distributions in East China (var. sinensis) and South Japan (var. arguta). Molecular data were obtained from 14 P. arguta populations to infer current patterns of molecular structure and diversity in relation to past (Last Interglacial and Last Glacial Maximum) and present distributions based on ecological niche modelling (ENM). A coalescent-based isolation-with-migration (IM) model was used to estimate lineage divergence times and population demographic parameters. Results Combining information from nuclear/chloroplast sequence data with nuclear microsatellites, our IM analyses identify the two varieties as genetically distinct units that evolved in strict allopatry since the mid-Pleistocene, c. 0.89 (0.51–1.2) Ma. Together with Bayesian Skyline Plots, our data further suggest that both lineages experienced post-divergence demographic growth, followed by refugial isolation, divergence, and in the case of var. arguta post-glacial admixture. However, past species distribution modelling indicates that the species’ overall distribution has not greatly changed over the last glacial cycles. Conclusions Our findings highlight the important influence of ancient sea-level changes on the diversification of East Asia’s temperate flora. Implicitly, they challenge the notion of general temperate forest expansion across the ECS land bridge, demonstrating instead its ‘filter’ effect owing to an unsuitable environment...

  17. Sociolinguistic Perception as Inference Under Uncertainty.

    Science.gov (United States)

    Kleinschmidt, Dave F; Weatherholtz, Kodi; Florian Jaeger, T

    2018-03-15

    Social and linguistic perceptions are linked. On one hand, talker identity affects speech perception. On the other hand, speech itself provides information about a talker's identity. Here, we propose that the same probabilistic knowledge might underlie both socially conditioned linguistic inferences and linguistically conditioned social inferences. Our computational-level approach, the ideal adapter, starts from the idea that listeners use probabilistic knowledge of covariation between social, linguistic, and acoustic cues in order to infer the most likely explanation of the speech signals they hear. As a first step toward understanding social inferences in this framework, we use a simple ideal observer model to show that it would be possible to infer aspects of a talker's identity using cue distributions based on actual speech production data. This suggests the possibility of a single formal framework for social and linguistic inferences and the interactions between them. Copyright © 2018 Cognitive Science Society, Inc.
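
    A minimal ideal-observer sketch in this spirit: infer a talker group from a single acoustic cue via Bayes' rule over group-specific Gaussian cue distributions; the means, spreads, and priors are invented stand-ins for estimates from production data.

```python
# Minimal ideal observer for a social inference: given one acoustic cue,
# score talker groups by Bayes' rule under group-specific Gaussian cue
# distributions. Means, spreads, and priors are invented stand-ins.
from scipy.stats import norm

groups = {"group_A": norm(loc=500, scale=40),   # cue: e.g. a formant in Hz
          "group_B": norm(loc=560, scale=50)}
prior = {"group_A": 0.5, "group_B": 0.5}

def posterior(cue):
    unnorm = {g: prior[g] * d.pdf(cue) for g, d in groups.items()}
    z = sum(unnorm.values())
    return {g: v / z for g, v in unnorm.items()}

print(posterior(540))   # a cue between the means yields a graded inference
```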

  18. 'Tagger' - a Mac OS X Interactive Graphical Application for Data Inference and Analysis of N-Dimensional Datasets in the Natural Physical Sciences.

    Science.gov (United States)

    Morse, P. E.; Reading, A. M.; Lueg, C.

    2014-12-01

    Pattern-recognition in scientific data is not only a computational problem but a human-observer problem as well. Human observation of - and interaction with - data visualization software can augment, select, interrupt and modify computational routines and facilitate processes of pattern and significant feature recognition for subsequent human analysis, machine learning, expert and artificial intelligence systems. 'Tagger' is a Mac OS X interactive data visualisation tool that facilitates human-computer interaction for the recognition of patterns and significant structures. It is a graphical application developed using the Quartz Composer framework. 'Tagger' follows a Model-View-Controller (MVC) software architecture: the application problem domain (the Model) is to facilitate novel ways of abstractly representing data to a human interlocutor, presenting these via different viewer modalities (e.g. chart representations, particle systems, parametric geometry) to the user (View) and enabling interaction with the data (Controller) via a variety of Human Interface Devices (HID). The software enables the user to create an arbitrary array of tags that may be appended to the visualised data, which are then saved into output files as forms of semantic metadata. Three fundamental problems that are not strongly supported by conventional scientific visualisation software are addressed: 1] how to visually animate data over time; 2] how to rapidly deploy unconventional, parametrically driven data visualisations; 3] how to construct and explore novel interaction models that capture the activity of the end-user as semantic metadata that can be used to computationally enhance subsequent interrogation. Saved tagged data files may be loaded into Tagger, so that tags may be tagged, if desired. Recursion opens up the possibility of refining or overlapping different types of tags, tagging a variety of different POIs or types of events, and of capturing different types of specialist...
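
    The recursive tagging idea can be sketched generically (this is not the Quartz Composer implementation): tags annotate data points or other tags, and the whole structure serializes as semantic metadata.

```python
# Generic sketch of the recursive tagging idea: a tag annotates a data point
# or another tag, and the structure serializes as semantic metadata.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Tag:
    label: str                    # e.g. "candidate event", "artifact"
    target: object                # a data index, a region, or another Tag
    children: list = field(default_factory=list)

event = Tag("spike", target=1042)            # tag a sample index
event.children.append(Tag("onset", target=1040))
reviewed = Tag("confirmed", target=event)    # a tag on a tag (recursion)

print(json.dumps(asdict(reviewed), default=str, indent=2))
```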

  19. EI: A Program for Ecological Inference

    Directory of Open Access Journals (Sweden)

    Gary King

    2004-09-01

    The program EI provides a method of inferring individual behavior from aggregate data. It implements the statistical procedures, diagnostics, and graphics from the book A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data (King 1997). Ecological inference, as traditionally defined, is the process of using aggregate (i.e., "ecological") data to infer discrete individual-level relationships of interest when individual-level data are not available. Ecological inferences are required in political science research when individual-level surveys are unavailable (e.g., local or comparative electoral politics), unreliable (racial politics), insufficient (political geography), or infeasible (political history). They are also required in numerous areas of major significance in public policy (e.g., for applying the Voting Rights Act) and other academic disciplines ranging from epidemiology and marketing to sociology and quantitative history.
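
    The deterministic core that ecological inference builds on is the method of bounds (Duncan and Davis 1953): aggregate margins alone bracket each precinct's individual-level rate, as the sketch below shows with invented precinct shares; EI's statistical model then borrows strength across precincts to narrow these bounds.

```python
# The method of bounds (Duncan and Davis 1953) underlying ecological
# inference: aggregate margins alone bracket the individual-level rate in
# each precinct. Precinct shares below are invented.
def bounds(x, t):
    """Bounds on P(outcome | group) given group share x and outcome share t."""
    lo = max(0.0, (t - (1.0 - x)) / x)
    hi = min(1.0, t / x)
    return lo, hi

for x, t in [(0.7, 0.6), (0.2, 0.5), (0.9, 0.85)]:
    lo, hi = bounds(x, t)
    print(f"group share {x:.1f}, outcome share {t:.2f} -> [{lo:.3f}, {hi:.3f}]")
# Small group shares give wide, uninformative bounds; EI's model combines
# precincts to sharpen them into estimates.
```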

  20. Density estimation in tiger populations: combining information for strong inference

    Science.gov (United States)

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Delampady, Mohan; Nichols, James D.; Karanth, K. Ullas; Macdonald, David W.

    2012-01-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture–recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km² [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km² and fecal DNA, 6.65 ± 2.37 tigers/100 km²). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
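
    As a back-of-envelope check on why combining sources helps, an inverse-variance-weighted average of the two single-source estimates quoted above lands near the joint-model posterior (the paper itself combines the raw data in one spatial model, which is strictly better).

```python
# Inverse-variance-weighted combination of the two single-source density
# estimates quoted in the abstract; a crude stand-in for the paper's joint
# spatial capture-recapture model.
photo_mean, photo_sd = 12.02, 3.02   # tigers / 100 km^2
scat_mean, scat_sd = 6.65, 2.37

w_photo, w_scat = photo_sd ** -2, scat_sd ** -2
mean = (w_photo * photo_mean + w_scat * scat_mean) / (w_photo + w_scat)
sd = (w_photo + w_scat) ** -0.5
print(f"combined: {mean:.2f} +/- {sd:.2f} tigers / 100 km^2")
# ~8.70 +/- 1.86: close to the joint model's 8.5 +/- 1.95 and tighter than
# either source alone.
```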

  1. Heuristics as Bayesian inference under extreme priors.

    Science.gov (United States)

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
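
    The continuum can be sketched directly with ridge regression on synthetic data: the penalty strength plays the role of the prior, with ordinary regression at (near) zero and heuristic-like behavior in the infinitely strong limit.

```python
# The prior-strength continuum, sketched with ridge regression on synthetic
# correlated cues. Data are synthetic; this illustrates the limit argument,
# not the paper's exact analyses.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]            # inter-correlated cues
y = X @ np.array([1.0, 0.0, 0.5, 0.3]) + rng.normal(0, 0.5, 200)

for alpha in (1e-8, 10.0, 1e6):
    w = Ridge(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha:g}: direction={np.round(w / np.abs(w).max(), 2)}")
# As alpha grows, the solution direction tends to X.T @ y: each cue is
# weighted by its own covariance with the outcome, ignoring cue
# inter-correlations, as tallying-like heuristics do. Intermediate alphas
# usually predict best out of sample.
```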

  2. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  3. Causal inference based on counterfactuals

    Directory of Open Access Journals (Sweden)

    Höfler M

    2005-09-01

    Full Text Available Abstract Background The counterfactual or potential-outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues in estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in the health sciences and relates to many statistical procedures. Summary Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and they do not invalidate the counterfactual concept.
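
    A minimal potential-outcomes sketch (numbers invented, not drawn from the paper) makes the central point concrete: the average causal effect is defined over counterfactual pairs that are never jointly observed, yet randomization lets a difference of observed group means estimate it without bias.

        import numpy as np

        # Potential outcomes: each unit has y0 (untreated) and y1 (treated),
        # but only one of the two is ever observed.
        rng = np.random.default_rng(1)
        n = 10_000
        y0 = rng.normal(50, 10, n)             # outcome without treatment
        y1 = y0 + 5 + rng.normal(0, 2, n)      # outcome with treatment
        ace = np.mean(y1 - y0)                 # needs both counterfactuals

        treated = rng.random(n) < 0.5          # randomization breaks confounding
        y_obs = np.where(treated, y1, y0)      # the one outcome we observe
        estimate = y_obs[treated].mean() - y_obs[~treated].mean()
        print(f"true ACE {ace:.2f}, randomized estimate {estimate:.2f}")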

  4. SEMANTIC PATCH INFERENCE

    DEFF Research Database (Denmark)

    Andersen, Jesper

    2009-01-01

    Collateral evolution is the problem of updating several library-using programs in response to API changes in the used library. In this dissertation we address the issue of understanding collateral evolutions by automatically inferring a high-level specification of the changes evident in a given set...... specifications inferred by spdiff in Linux are shown. We find that the inferred specifications concisely capture the actual collateral evolution performed in the examples.

  5. Inference in `poor` languages

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules ("poor" languages) are considered. The problem of the existence of a finite, complete and consistent inference rule system for a "poor" language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  6. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  7. Geometric statistical inference

    International Nuclear Information System (INIS)

    Periwal, Vipul

    1999-01-01

    A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined.

  8. Practical Bayesian Inference

    Science.gov (United States)

    Bailer-Jones, Coryn A. L.

    2017-04-01

    Preface; 1. Probability basics; 2. Estimation and uncertainty; 3. Statistical models and inference; 4. Linear models, least squares, and maximum likelihood; 5. Parameter estimation: single parameter; 6. Parameter estimation: multiple parameters; 7. Approximating distributions; 8. Monte Carlo methods for inference; 9. Parameter estimation: Markov chain Monte Carlo; 10. Frequentist hypothesis testing; 11. Model comparison; 12. Dealing with more complicated problems; References; Index.

  9. Logical inference and evaluation

    International Nuclear Information System (INIS)

    Perey, F.G.

    1981-01-01

    Most methodologies of evaluation currently used are based upon the theory of statistical inference. It is generally perceived that this theory is not capable of dealing satisfactorily with what are called systematic errors. Theories of logical inference should be capable of treating all of the information available, including that not involving frequency data. A theory of logical inference is presented as an extension of deductive logic via the concept of plausibility and the application of group theory. Some conclusions, based upon the application of this theory to evaluation of data, are also given

  10. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  11. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques. Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  12. 25 years and still going strong: 2'-O-(pyren-1-yl)methylribonucleotides - versatile building blocks for applications in molecular biology, diagnostics and materials science.

    Science.gov (United States)

    Hrdlicka, Patrick J; Karmakar, Saswata

    2017-11-29

    Oligonucleotides (ONs) modified with 2'-O-(pyren-1-yl)methylribonucleotides have been explored for a range of applications in molecular biology, nucleic acid diagnostics, and materials science for more than 25 years. The first part of this review provides an overview of synthetic strategies toward 2'-O-(pyren-1-yl)methylribonucleotides and is followed by a summary of biophysical properties of nucleic acid duplexes modified with these building blocks. Insights from structural studies are then presented to rationalize the reported properties. In the second part, applications of ONs modified with 2'-O-(pyren-1-yl)methyl-RNA monomers are reviewed, which include detection of RNA targets, discrimination of single nucleotide polymorphisms, formation of self-assembled pyrene arrays on nucleic acid scaffolds, the study of charge transfer phenomena in nucleic acid duplexes, and sequence-unrestricted recognition of double-stranded DNA. The predictable binding mode of the pyrene moiety, coupled with the microenvironment-dependent properties and synthetic feasibility, render 2'-O-(pyren-1-yl)methyl-RNA monomers as a promising class of pyrene-functionalized nucleotide building blocks for new applications in molecular biology, nucleic acid diagnostics, and materials science.

  13. Inference and uncertainty in radiology.

    Science.gov (United States)

    Sistrom, Chris

    2006-05-01

    This paper seeks to enhance understanding of the philosophical underpinnings of our discipline and the resulting practical implications. Radiology reports exist in order to convey new knowledge about a patient's condition based on empiric observations from anatomic or functional images of the body. The route to explanation and prediction from empiric evidence is mostly through inference based on inductive (and sometimes abductive) arguments. The conclusions of inductive arguments are, by definition, contingent and provisional. Therefore, it is necessary to deal in some way with the uncertainty of inferential conclusions (i.e. interpretations) made in radiology reports. Two paradigms for managing uncertainty in natural sciences exist in dialectic tension with each other. These are the frequentist and Bayesian theories of probability. Tension between them is mirrored during routine interactions among radiologists and clinicians. I will describe these core issues and argue that they are quite relevant to routine image interpretation and reporting.
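
    As a concrete instance of the Bayesian side of this tension, the toy calculation below (all numbers assumed) updates the probability of disease after a positive imaging finding; a frequentist analysis would instead frame the same numbers in terms of long-run error rates.

        # Bayes' theorem for a positive test result (illustrative numbers).
        prevalence = 0.01        # prior probability of disease
        sensitivity = 0.90       # P(positive | disease)
        specificity = 0.95       # P(negative | no disease)

        p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        posterior = sensitivity * prevalence / p_pos
        print(f"P(disease | positive) = {posterior:.3f}")   # ~0.154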

  14. An introduction to the philosophy of science

    CERN Document Server

    Staley, Kent W

    2014-01-01

    This book guides readers by gradual steps through the central concepts and debates in the philosophy of science. Using concrete examples from the history of science, Kent W. Staley shows how seemingly abstract philosophical issues are relevant to important aspects of scientific practice. Structured in two parts, the book first tackles the central concepts of the philosophy of science, such as the problem of induction, falsificationism, and underdetermination, and important figures and movements, such as the logical empiricists, Thomas Kuhn, and Paul Feyerabend. The second part turns to contemporary debates in the philosophy of science, such as scientific realism, explanation, the role of values in science, the different views of scientific inference, and probability. This broad yet detailed overview will give readers a strong grounding whilst also providing opportunities for further exploration. It will be of particular interest to students of philosophy, the philosophy of science, and science. Read more at h...

  15. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
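
    The flavor of the inference can be caricatured in a few lines (both generating models below are stand-ins of my own, not the authors' models): compare the likelihood of a sequence under a fair-coin process against a simple "regular" process that tends to repeat the previous symbol.

        import numpy as np

        # Log evidence for "random" vs. a repeat-biased "regular" process.
        def loglik_random(seq):
            return len(seq) * np.log(0.5)          # every symbol has P = 1/2

        def loglik_repeat(seq, p_repeat=0.9):
            ll = np.log(0.5)                       # first symbol unconstrained
            for prev, cur in zip(seq, seq[1:]):
                ll += np.log(p_repeat if cur == prev else 1 - p_repeat)
            return ll

        for seq in ("HHHHHHHH", "HTHHTHTT"):
            lbf = loglik_random(seq) - loglik_repeat(seq)
            print(f"{seq}: log evidence for 'random' = {lbf:+.2f}")
        # Eight heads in a row favors the regular process; the mixed
        # sequence favors the random one, matching the intuition above.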

  16. Making Type Inference Practical

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Oxhøj, Nicholas; Palsberg, Jens

    1992-01-01

    We present the implementation of a type inference algorithm for untyped object-oriented programs with inheritance, assignments, and late binding. The algorithm significantly improves our previous one, presented at OOPSLA'91, since it can handle collection classes, such as List, in a useful way. A......-oriented languages practical....

  17. Type Inference with Inequalities

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    1991-01-01

    of (monotonic) inequalities on the types of variables and expressions. A general result about systems of inequalities over semilattices yields a solvable form. We distinguish between deciding typability (the existence of solutions) and type inference (the computation of a minimal solution). In our case, both...

  18. Inference as Prediction

    Science.gov (United States)

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  19. Bayesian Inference of Nonstationary Precipitation Intensity-Duration-Frequency Curves for Infrastructure Design

    Science.gov (United States)

    2016-03-01

    Comparison of Bayesian-inferred precipitation intensity-duration-frequency (IDF) curves under stationary and nonstationary conditions for infrastructure design. (ERDC/CHL CHETN-X-2, March 2016; approved for public release, distribution unlimited.)

  20. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
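
    The surrogate idea can be caricatured as follows (everything here is a toy of my own; the paper builds stochastic spectral surrogates for expensive PDE-based forward models): fit a cheap polynomial to a handful of expensive forward solves, then run Metropolis-Hastings against the surrogate so that each posterior evaluation is fast.

        import numpy as np

        rng = np.random.default_rng(2)

        def forward_expensive(theta):          # stand-in for a costly solver
            return np.sin(theta) + 0.1 * theta ** 2

        # Cheap polynomial surrogate fitted offline to a few solves.
        train_t = np.linspace(-3.0, 3.0, 15)
        surrogate = np.polynomial.Polynomial.fit(
            train_t, forward_expensive(train_t), deg=6)

        y_obs, sigma = forward_expensive(1.3) + 0.05, 0.1

        def log_post(theta):                   # Gaussian likelihood, N(0, 2) prior
            return (-0.5 * ((y_obs - surrogate(theta)) / sigma) ** 2
                    - 0.5 * (theta / 2.0) ** 2)

        theta, samples = 0.0, []
        for _ in range(20_000):                # random-walk Metropolis
            prop = theta + 0.5 * rng.normal()
            if np.log(rng.random()) < log_post(prop) - log_post(theta):
                theta = prop
            samples.append(theta)
        print("posterior mean:", np.mean(samples[5_000:]))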

  1. Irony as Inferred Contradiction

    Directory of Open Access Journals (Sweden)

    Лаура Альба-Хуес

    2014-12-01

    Full Text Available “If we acknowledge the existence of an Irony Principle, we should also acknowledge another ‘higher-order principle’ which has the opposite effect. While irony is an apparently friendly way of being offensive (mock politeness), the type of verbal behaviour known as ‘banter’ is an offensive way of being friendly (mock impoliteness).” Geoffrey Leech, Principles of Pragmatics (1983: 144). In this work I present some theoretical considerations about what I consider to be a permanent and ever-present feature of verbal irony, namely, inferred contradiction, which has to be distinguished from plain, direct (non-inferred) contradiction as well as from indirect negation, for a contradiction which is directly expressed cannot be interpreted as ironical (since it lacks a crucial component: inference), and an indirect negation may or may not be ironic (depending on the situation), and thus cannot be considered a permanent feature of the phenomenon. In spite of the fact that many scholars have proposed different theories in order to capture the essence of this intricate and complex phenomenon, not all of them have managed to find a feature or characteristic that applies to or is found in all possible occurrences of irony. I briefly discuss the tenets of some of the best-known of these theories, namely the Classical theories (Socrates, Cicero, Quintilian), the Echoic-Mention Theory (later Echoic Theory), the Echoic Reminder Theory, the Pretence Theory and the Relevant Inappropriateness Theory, trying to show that in all the types of irony emerging from these proposals (e.g. echoic irony, pretence irony, etc.) it can be observed that the irony is triggered by inferred contradiction. The one theory that, according to my view and knowledge, seems to capture its whole essence to date is Attardo’s (2000) Relevant Inappropriateness Theory, to whose proposal I adhere, but I argue at the same time that inferred contradiction is another feature of irony (which

  2. Causal inference in econometrics

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak

    2016-01-01

    This book is devoted to the analysis of causal inference, one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other, or whether the two phenomena have a common cause. This analysis is the main focus of this volume. A good understanding of causal inference requires models of economic phenomena that are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.

  3. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  4. Multiple Instance Fuzzy Inference

    Science.gov (United States)

    2015-12-02

    and learn the fuzzy inference system's parameters [24, 25]. In this latter technique, supervised and unsupervised learning algorithms are devised to... algorithm (unsupervised learning) can be used to identify local contexts of the input space, and a linear classifier (supervised learning) can be used... instance-level (patch-level) labels and would require the image to be correctly segmented and labeled prior to learning.

  5. INFERENCE BUILDING BLOCKS

    Science.gov (United States)

    2018-02-15

    whether unsupervised (such as clustering) or supervised (such as Naive Bayes). We observed the following advantages: ... section, we explain our research in relation to DARPA's Probabilistic Programming for Advancing Machine Learning (PPAML) program and other approaches... develop machine-learning applications by combining probabilistic models and inference techniques. On one hand, a probabilistic model is a mathematical

  6. Active inference and learning.

    Science.gov (United States)

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; O'Doherty, John; Pezzulo, Giovanni

    2016-09-01

    This paper offers an active inference account of choice behaviour and learning. It focuses on the distinction between goal-directed and habitual behaviour and how they contextualise each other. We show that habits emerge naturally (and autodidactically) from sequential policy optimisation when agents are equipped with state-action policies. In active inference, behaviour has explorative (epistemic) and exploitative (pragmatic) aspects that are sensitive to ambiguity and risk respectively, where epistemic (ambiguity-resolving) behaviour enables pragmatic (reward-seeking) behaviour and the subsequent emergence of habits. Although goal-directed and habitual policies are usually associated with model-based and model-free schemes, we find the more important distinction is between belief-free and belief-based schemes. The underlying (variational) belief updating provides a comprehensive (if metaphorical) process theory for several phenomena, including the transfer of dopamine responses, reversal learning, habit formation and devaluation. Finally, we show that active inference reduces to a classical (Bellman) scheme, in the absence of ambiguity. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Continuous Integrated Invariant Inference Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project will develop a new technique for invariant inference and embed this and other current invariant inference and checking techniques in an...

  8. science

    International Development Research Centre (IDRC) Digital Library (Canada)

    David Spurgeon

    ...the "green revolution" seemed to confirm the value of science and technology to international development. Yet studies showed that, at that time, only about two percent of... the gap in science and technology between the Third World and the industrial... Finance; Treasury Board; Industry, Trade and Commerce; Agriculture.

  9. Atoms and clusters in strong laser fields

    NARCIS (Netherlands)

    Marchenko, T.

    2008-01-01

    This thesis describes experimental and theoretical studies on the interaction of strong infrared laser fields with atoms and atomic clusters. Part I provides an overview of the main strong-field phenomena in atoms, molecules and clusters and describes the state-of-the-art in strong-field science.

  10. 78 FR 15710 - Strong Sensitizer Guidance

    Science.gov (United States)

    2013-03-12

    ... definition of "strong sensitizer" found at 16 CFR 1500.3(c)(5). The Commission is proposing to revise the supplemental definition of "strong sensitizer" due to advancements in the science of sensitization that have... document is intended to clarify the "strong sensitizer" definition, assist manufacturers in understanding...

  11. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  12. Nanotechnology and statistical inference

    Science.gov (United States)

    Vesely, Sara; Vesely, Leonardo; Vesely, Alessandro

    2017-08-01

    We discuss some problems that arise when applying statistical inference to data with the aim of disclosing new functionalities. A predictive model analyzes the data taken from experiments on a specific material to assess the likelihood that another product, with similar structure and properties, will exhibit the same functionality. It doesn't have much predictive power if variability occurs as a consequence of a specific, non-linear behavior. We exemplify our discussion on some experiments with biased dice.

  13. Generic patch inference

    DEFF Research Database (Denmark)

    Andersen, Jesper; Lawall, Julia

    2010-01-01

    A key issue in maintaining Linux device drivers is the need to keep them up to date with respect to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spdiff, that identifies common changes...... developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect...

  14. Foundations of Inference

    Directory of Open Access Journals (Sweden)

    Kevin H. Knuth

    2012-06-01

    Full Text Available We present a simple and clear foundation for finite inference that unites and significantly extends the approaches of Kolmogorov and Cox. Our approach is based on quantifying lattices of logical statements in a way that satisfies general lattice symmetries. With other applications such as measure theory in mind, our derivations assume minimal symmetries, relying on neither negation nor continuity nor differentiability. Each relevant symmetry corresponds to an axiom of quantification, and these axioms are used to derive a unique set of quantifying rules that form the familiar probability calculus. We also derive a unique quantification of divergence, entropy and information.

  15. Efficient Bayesian inference for ARFIMA processes

    Science.gov (United States)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
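
    The fractional-differencing core of ARFIMA is easy to sketch (my own illustration, independent of the paper's Bayesian machinery): the operator (1 - B)^d expands into binomial weights, and inverting it turns white noise into a long-range dependent series with slowly decaying autocorrelations.

        import numpy as np

        # Binomial weights of (1 - B)^d, computed recursively.
        def frac_diff_weights(d, n):
            w = np.empty(n)
            w[0] = 1.0
            for k in range(1, n):
                w[k] = w[k - 1] * (k - 1 - d) / k
            return w

        rng = np.random.default_rng(3)
        d, n = 0.3, 1000
        eps = rng.normal(size=n)

        # Apply (1 - B)^(-d) to white noise: x_t is a moving average of past
        # innovations with slowly decaying weights, producing LRD.
        psi = frac_diff_weights(-d, n)
        x = np.array([psi[:t + 1][::-1] @ eps[:t + 1] for t in range(n)])

        # Slow autocorrelation decay is the signature of long memory.
        acf = [np.corrcoef(x[:-k], x[k:])[0, 1] for k in (1, 10, 50)]
        print(np.round(acf, 3))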

  16. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
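
    In that spirit, here is a minimal example in the modern PyMC API (the model and data are my own toy; the book's examples were written for earlier PyMC versions): estimating a coin's bias from ten flips.

        import numpy as np
        import pymc as pm

        flips = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])

        with pm.Model():
            p = pm.Beta("p", alpha=1, beta=1)          # uniform prior on bias
            pm.Bernoulli("obs", p=p, observed=flips)   # likelihood of the flips
            idata = pm.sample(2000, tune=1000, progressbar=False)

        print(idata.posterior["p"].mean().item())      # posterior mean near 0.67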

  17. Contingency inferences driven by base rates: Valid by sampling

    Directory of Open Access Journals (Sweden)

    Florian Kutzner

    2011-04-01

    Full Text Available Fiedler et al. (2009) reviewed evidence for the utilization of a contingency inference strategy termed pseudocontingencies (PCs). In PCs, the more frequent levels (and, by implication, the less frequent levels) are assumed to be associated. PCs have been obtained using a wide range of task settings and dependent measures. Yet, the readiness with which decision makers rely on PCs is poorly understood. A computer simulation explored two potential sources of subjective validity of PCs. First, PCs are shown to perform above chance level when the task is to infer the sign of moderate to strong population contingencies from a sample of observations. Second, contingency inferences based on PCs and inferences based on cell frequencies are shown to partially agree across samples. Intriguingly, this criterion and convergent validity are by-products of random sampling error, highlighting the inductive nature of contingency inferences.
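
    The logic of such a simulation can be sketched as follows (all parameters invented; this is not the authors' code): draw small samples from a population with a positive contingency and skewed base rates, then check how often aligning the frequent levels recovers the correct sign.

        import numpy as np

        rng = np.random.default_rng(4)
        trials, hits = 2000, 0
        for _ in range(trials):
            # Population with a positive contingency and skewed marginals.
            x = rng.random(20) < 0.7
            y = np.where(x, rng.random(20) < 0.8, rng.random(20) < 0.5)
            # PC strategy: pair the more frequent level of x with that of y.
            pc_sign = 1 if (x.mean() > 0.5) == (y.mean() > 0.5) else -1
            hits += pc_sign == 1               # true contingency is positive
        print(f"PC recovers the correct sign in {hits / trials:.0%} of samples")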

  18. Inferring horizontal gene transfer.

    Directory of Open Access Journals (Sweden)

    Matt Ravenhall

    2015-05-01

    Full Text Available Horizontal or Lateral Gene Transfer (HGT or LGT) is the transmission of portions of genomic DNA between organisms through a process decoupled from vertical inheritance. In the presence of HGT events, different fragments of the genome are the result of different evolutionary histories. This can therefore complicate investigations of the evolutionary relatedness of lineages and species. Also, as HGT can bring into genomes radically different genotypes from distant lineages, or even new genes bearing new functions, it is a major source of phenotypic innovation and a mechanism of niche adaptation. For example, of particular relevance to human health is the lateral transfer of antibiotic resistance and pathogenicity determinants, leading to the emergence of pathogenic lineages. Computational identification of HGT events relies upon the investigation of sequence composition or evolutionary history of genes. Sequence composition-based ("parametric") methods search for deviations from the genomic average, whereas evolutionary history-based ("phylogenetic") approaches identify genes whose evolutionary history significantly differs from that of the host species. The evaluation and benchmarking of HGT inference methods typically rely upon simulated genomes, for which the true history is known. On real data, different methods tend to infer different HGT events, and as a result it can be difficult to ascertain all but simple and clear-cut HGT events.
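
    A bare-bones "parametric" screen can be written in a few lines (toy genome and threshold invented; real pipelines also model codon usage and use calibrated null distributions): scan sliding windows and flag those whose GC content deviates strongly from the genomic average.

        from collections import Counter
        import random

        random.seed(5)
        genome = "".join(random.choice("ATGC") for _ in range(10_000))
        genome = genome[:4000] + "GC" * 250 + genome[4500:]   # GC-rich island

        def gc(seq):
            counts = Counter(seq)
            return (counts["G"] + counts["C"]) / len(seq)

        window, step = 500, 100
        avg = gc(genome)
        for start in range(0, len(genome) - window + 1, step):
            frac = gc(genome[start:start + window])
            if abs(frac - avg) > 0.15:                        # crude cutoff
                print(f"window {start}-{start + window}: GC={frac:.2f} "
                      f"(genome average {avg:.2f})")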

  19. A "crítica forte" da ciência e implicações para a educação em ciências (The "strong criticism" of science and its implications for science education)

    Directory of Open Access Journals (Sweden)

    Ileana María Greca

    2004-12-01

    Full Text Available In this paper we discuss some elements originating both from what may loosely be called postmodern tendencies in philosophy and from the fields of social history and the sociology of science, along with their possible implications for research and science education. Our assessment is that, whatever the problems of some of their assumptions, these currents can contribute to our understanding of science and to the formation of more responsible citizens. In this article we discuss some issues from what can be vaguely termed postmodernist tendencies in philosophy and social history, trying to establish some implications of these positions for science education. Although problematic aspects exist in their assumptions, we think they may contribute to our understanding of science and help us to prepare more responsible citizens.

  20. Bayesianism and inference to the best explanation

    Directory of Open Access Journals (Sweden)

    Valeriano IRANZO

    2008-01-01

    Full Text Available Bayesianism and Inference to the Best Explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of "bayesianizing" IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes's Theorem. Then I distinguish two different interpretations of prior probabilities: "IBE-Bayesianism" (IBE-Bay) and "frequentist-Bayesianism" (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.

  1. Statistical inferences in phylogeography

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Beaumont, Mark A

    2009-01-01

    can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis...... is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods...... may also be challenged by computational problems or poor model choice. In this review, we will describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods....

  2. Admissibility of logical inference rules

    CERN Document Server

    Rybakov, VV

    1997-01-01

    The aim of this book is to present the fundamental theoretical results concerning inference rules in deductive formal systems. Primary attention is focused on: admissible or permissible inference rules; the derivability of admissible inference rules; the structural completeness of logics; and the bases for admissible and valid inference rules. There is particular emphasis on propositional non-standard logics (primarily superintuitionistic and modal logics), but general logical consequence relations and classical first-order theories are also considered. The book is basically self-contained and

  3. Dopamine, reward learning, and active inference

    Directory of Open Access Journals (Sweden)

    Thomas eFitzgerald

    2015-11-01

    Full Text Available Temporal difference learning models propose phasic dopamine signalling encodes reward prediction errors that drive learning. This is supported by studies where optogenetic stimulation of dopamine neurons can stand in lieu of actual reward. Nevertheless, a large body of data also shows that dopamine is not necessary for learning, and that dopamine depletion primarily affects task performance. We offer a resolution to this paradox based on an hypothesis that dopamine encodes the precision of beliefs about alternative actions, and thus controls the outcome-sensitivity of behaviour. We extend an active inference scheme for solving Markov decision processes to include learning, and show that simulated dopamine dynamics strongly resemble those actually observed during instrumental conditioning. Furthermore, simulated dopamine depletion impairs performance but spares learning, while simulated excitation of dopamine neurons drives reward learning, through aberrant inference about outcome states. Our formal approach provides a novel and parsimonious reconciliation of apparently divergent experimental findings.

  4. Testing strong interaction theories

    International Nuclear Information System (INIS)

    Ellis, J.

    1979-01-01

    The author discusses possible tests of the current theories of the strong interaction, in particular, quantum chromodynamics. High energy e + e - interactions should provide an excellent means of studying the strong force. (W.D.L.)

  5. Gauging Variational Inference

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ahn, Sungsoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of); Shin, Jinwoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2017-05-25

    Computing the partition function is the most important statistical inference task arising in applications of graphical models (GM). Since it is computationally intractable, approximate methods have been used in practice, where mean-field (MF) and belief propagation (BP) are arguably the most popular and successful approaches of a variational type. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving on MF and BP, respectively. Both provide lower bounds on the partition function by utilizing the so-called gauge transformation, which modifies factors of the GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs with a single loop of a special structure, even though the bare MF and BP perform badly in this case. Our extensive experiments, on complete GMs of relatively small size and on large GMs (up to 300 variables), confirm that the newly proposed algorithms outperform and generalize MF and BP.
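
    For orientation, the plain mean-field bound that G-MF improves upon fits in a few lines (a toy Ising model of my own, with ±1 spins; this is not the gauged algorithm itself): iterate the self-consistent magnetization equations, then assemble the variational lower bound on log Z.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 8
        J = rng.normal(scale=0.3, size=(n, n))
        J = (J + J.T) / 2
        np.fill_diagonal(J, 0)                 # symmetric couplings, no self-loops
        h = rng.normal(scale=0.5, size=n)      # local fields

        m = np.zeros(n)
        for _ in range(200):                   # mean-field fixed point
            m = np.tanh(h + J @ m)

        # Variational bound: expected energy plus per-spin binary entropy.
        p = (1 + m) / 2
        entropy = -(p * np.log(p) + (1 - p) * np.log(1 - p)).sum()
        log_z_mf = h @ m + 0.5 * m @ J @ m + entropy
        print(f"mean-field lower bound on log Z: {log_z_mf:.3f}")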

  6. Multistability and perceptual inference.

    Science.gov (United States)

    Gershman, Samuel J; Vul, Edward; Tenenbaum, Joshua B

    2012-01-01

    Ambiguous images present a challenge to the visual system: How can uncertainty about the causes of visual inputs be represented when there are multiple equally plausible causes? A Bayesian ideal observer should represent uncertainty in the form of a posterior probability distribution over causes. However, in many real-world situations, computing this distribution is intractable and requires some form of approximation. We argue that the visual system approximates the posterior over underlying causes with a set of samples and that this approximation strategy produces perceptual multistability--stochastic alternation between percepts in consciousness. Under our analysis, multistability arises from a dynamic sample-generating process that explores the posterior through stochastic diffusion, implementing a rational form of approximate Bayesian inference known as Markov chain Monte Carlo (MCMC). We examine in detail the most extensively studied form of multistability, binocular rivalry, showing how a variety of experimental phenomena--gamma-like stochastic switching, patchy percepts, fusion, and traveling waves--can be understood in terms of MCMC sampling over simple graphical models of the underlying perceptual tasks. We conjecture that the stochastic nature of spiking neurons may lend itself to implementing sample-based posterior approximations in the brain.
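
    The proposal can be illustrated with a one-dimensional caricature (my own toy posterior, not the paper's perceptual models): a random-walk Metropolis sampler on a bimodal posterior dwells near one mode and switches stochastically, like alternating percepts.

        import numpy as np

        rng = np.random.default_rng(7)

        def log_post(x):       # two equally plausible "causes" at -2 and +2
            return np.logaddexp(-0.5 * (x - 2) ** 2, -0.5 * (x + 2) ** 2)

        x, trace = 0.0, []
        for _ in range(50_000):
            prop = x + 0.8 * rng.normal()
            if np.log(rng.random()) < log_post(prop) - log_post(x):
                x = prop
            trace.append(x)

        percept = np.sign(trace)               # which mode the sampler is near
        switches = int(np.sum(percept[1:] != percept[:-1]))
        print(f"perceptual switches: {switches}")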

  7. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  8. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framework...... is composed of a set of language primitives and of an inference engine based on a message-passing system that integrates cutting-edge computational tools, including proximal algorithms and high performance Hamiltonian Markov Chain Monte Carlo techniques. A set of domain-specific highly optimized GPU...

  9. Optimization methods for logical inference

    CERN Document Server

    Chandru, Vijay

    2011-01-01

    Merging logic and mathematics in deductive inference-an innovative, cutting-edge approach. Optimization methods for logical inference? Absolutely, say Vijay Chandru and John Hooker, two major contributors to this rapidly expanding field. And even though "solving logical inference problems with optimization methods may seem a bit like eating sauerkraut with chopsticks... it is the mathematical structure of a problem that determines whether an optimization model can help solve it, not the context in which the problem occurs." Presenting powerful, proven optimization techniques for logic in

  10. Explanation in causal inference methods for mediation and interaction

    CERN Document Server

    VanderWeele, Tyler

    2015-01-01

    A comprehensive examination of methods for mediation and interaction, VanderWeele's book is the first to approach this topic from the perspective of causal inference. Numerous software tools are provided, and the text is both accessible and easy to read, with examples drawn from diverse fields. The result is an essential reference for anyone conducting empirical research in the biomedical or social sciences.

  11. On quantum statistical inference

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics....... Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various...

  12. Sample Size and Robustness of Inferences from Logistic Regression in the Presence of Nonlinearity and Multicollinearity

    OpenAIRE

    Bergtold, Jason S.; Yeager, Elizabeth A.; Featherstone, Allen M.

    2011-01-01

    The logistic regression model has been widely used in the social and natural sciences, and results from studies using this model can have significant impact. Thus, confidence in the reliability of inferences drawn from these models is essential. The robustness of such inferences is dependent on sample size. The purpose of this study is to examine the impact of sample size on the mean estimated bias and efficiency of parameter estimation and inference for the logistic regression model. A numbe
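
    The kind of simulation the study describes can be sketched as follows (design, true coefficients, and replication counts are all assumed): fit a logistic regression repeatedly on simulated data at several sample sizes and track the bias of the slope estimate.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        beta_true = np.array([-0.5, 1.0])      # intercept and slope

        for n in (50, 200, 1000):
            estimates = []
            for _ in range(200):
                x = rng.normal(size=n)
                X = sm.add_constant(x)
                prob = 1 / (1 + np.exp(-X @ beta_true))
                y = (rng.random(n) < prob).astype(float)
                try:
                    fit = sm.Logit(y, X).fit(disp=0)
                    estimates.append(fit.params[1])
                except Exception:              # separation in small samples
                    continue
            bias = np.mean(estimates) - beta_true[1]
            print(f"n={n:5d}: mean slope bias {bias:+.3f}")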

  13. PREFACE: Strongly correlated electron systems Strongly correlated electron systems

    Science.gov (United States)

    Saxena, Siddharth S.; Littlewood, P. B.

    2012-07-01

    This special section is dedicated to the Strongly Correlated Electron Systems Conference (SCES) 2011, which was held from 29 August-3 September 2011, in Cambridge, UK. SCES'2011 is dedicated to 100 years of superconductivity and covers a range of topics in the area of strongly correlated systems. The correlated electronic and magnetic materials featured include f-electron based heavy fermion intermetallics and d-electron based transition metal compounds. The selected papers derived from invited presentations seek to deepen our understanding of the rich physical phenomena that arise from correlation effects. The focus is on quantum phase transitions, non-Fermi liquid phenomena, quantum magnetism, unconventional superconductivity and metal-insulator transitions. Both experimental and theoretical work is presented. Based on fundamental advances in the understanding of electronic materials, much of 20th century materials physics was driven by miniaturisation and integration in the electronics industry to the current generation of nanometre scale devices. The achievements of this industry have brought unprecedented advances to society and well-being, and no doubt there is much further to go—note that this progress is founded on investments and studies in the fundamentals of condensed matter physics from more than 50 years ago. Nevertheless, the defining challenges for the 21st century will lie in the discovery in science, and deployment through engineering, of technologies that can deliver the scale needed to have an impact on the sustainability agenda. Thus the big developments in nanotechnology may lie not in the pursuit of yet smaller transistors, but in the design of new structures that can revolutionise the performance of solar cells, batteries, fuel cells, light-weight structural materials, refrigeration, water purification, etc. The science presented in the papers of this special section also highlights the underlying interest in energy-dense materials, which

  14. Strongly correlated systems experimental techniques

    CERN Document Server

    Mancini, Ferdinando

    2015-01-01

    The continuous evolution and development of experimental techniques is at the basis of any fundamental achievement in modern physics. Strongly correlated systems (SCS), more than any other, need to be investigated through the greatest variety of experimental techniques in order to unveil and crosscheck the numerous and puzzling anomalous behaviors characterizing them. The study of SCS fostered the improvement of many old experimental techniques, but also the advent of many new ones just invented in order to analyze the complex behaviors of these systems. Many novel materials, with functional properties emerging from macroscopic quantum behaviors at the frontier of modern research in physics, chemistry and materials science, belong to this class of systems. The volume presents a representative collection of the modern experimental techniques specifically tailored for the analysis of strongly correlated systems. Any technique is presented in great detail by its own inventor or by one of the world-wide recognize...

  15. Strongly Correlated Systems Theoretical Methods

    CERN Document Server

    Avella, Adolfo

    2012-01-01

    The volume presents, for the very first time, an exhaustive collection of those modern theoretical methods specifically tailored for the analysis of Strongly Correlated Systems. Many novel materials, with functional properties emerging from macroscopic quantum behaviors at the frontier of modern research in physics, chemistry and materials science, belong to this class of systems. Each technique is presented in great detail by its own inventor or by one of the world-wide recognized main contributors. The exposition has a clear pedagogical cut and fully reports on the most relevant case study where the specific technique proved to be very successful in describing and enlightening the puzzling physics of a particular strongly correlated system. The book is intended for advanced graduate students and post-docs in the field as a textbook and/or main reference, but also for other researchers in the field who appreciate consulting a single, but comprehensive, source or wish to get acquainted, in a as painless as po...

  16. Strongly correlated systems numerical methods

    CERN Document Server

    Mancini, Ferdinando

    2013-01-01

    This volume presents, for the very first time, an exhaustive collection of those modern numerical methods specifically tailored for the analysis of Strongly Correlated Systems. Many novel materials, with functional properties emerging from macroscopic quantum behaviors at the frontier of modern research in physics, chemistry and material science, belong to this class of systems. Each technique is presented in great detail by its own inventor or by one of the world-wide recognized main contributors. The exposition has a clear pedagogical cut and fully reports on the most relevant case study where the specific technique proved to be very successful in describing and enlightening the puzzling physics of a particular strongly correlated system. The book is intended for advanced graduate students and post-docs in the field as a textbook and/or main reference, but also for other researchers in the field who appreciate consulting a single, but comprehensive, source or wish to get acquainted, in a as painless as possi...

  17. Statistical inference via fiducial methods

    OpenAIRE

    Salomé, Diemer

    1998-01-01

    In this thesis the attention is restricted to inductive reasoning using a mathematical probability model. A statistical procedure prescribes, for every theoretically possible set of data, the inference about the unknown of interest. ... See: Summary

  18. On principles of inductive inference

    OpenAIRE

    Kostecki, Ryszard Paweł

    2011-01-01

    We propose an intersubjective epistemic approach to foundations of probability theory and statistical inference, based on relative entropy and category theory, and aimed to bypass the mathematical and conceptual problems of existing foundational approaches.

  19. Statistical inference for stochastic processes

    National Research Council Canada - National Science Library

    Basawa, Ishwar V; Prakasa Rao, B. L. S

    1980-01-01

    The aim of this monograph is to attempt to reduce the gap between theory and applications in the area of stochastic modelling, by directing the interest of future researchers to the inference aspects...

  20. Strong field laser physics

    CERN Document Server

    2008-01-01

    Since the invention of the laser in the 1960s, people have strived to reach higher intensities and shorter pulse durations. High intensities and ultrashort pulse durations are intimately related. Recent developments have shown that high intensity lasers also open the way to realize pulses with the shortest durations to date, giving birth to the field of attosecond science (1 asec = 10-18s). This book is about high-intensity lasers and their applications. The goal is to give an up to date introduction to the technology behind these laser systems and to the broad range of intense laser applications. These applications include AMO (atomic molecular and optical) physics, x-ray science, attosecond science, plasma physics and particle acceleration, condensed matter science and laser micromachining, and finally even high-energy physics.

  1. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies, as well as students in advanced undergraduate statistics. The text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  2. Comprensión de textos de ciencias en estudiantes universitarios: generación de inferencias causales durante la lectura (Comprehension of Science Texts in College Students: Generation of Causal Inferences During Reading)

    Directory of Open Access Journals (Sweden)

    Gastón Saux

    2015-12-01

    Full Text Available We examined the generation of causal-antecedent inferences during and after the reading of expository texts with unfamiliar scientific content in 52 university students (age M = 24.48, SD = 3.6), who read 24 brief scientific texts. Two activation measures were recorded during reading (reading times and lexical decision), together with one post-reading measure (answers to questions); the familiarity of the materials was also controlled. The results suggest that causal-antecedent inferences are generated when reading unfamiliar scientific material, but that the degree of activation of the information required by the inference depends on the distance between the parts of the text to be integrated.

  3. Active inference, communication and hermeneutics.

    Science.gov (United States)

    Friston, Karl J; Frith, Christopher D

    2015-07-01

    Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Neural Correlates of Bridging Inferences and Coherence Processing

    Science.gov (United States)

    Kim, Sung-il; Yoon, Misun; Kim, Wonsik; Lee, Sunyoung; Kang, Eunjoo

    2012-01-01

    We explored the neural correlates of bridging inferences and coherence processing during story comprehension using Positron Emission Tomography (PET). Ten healthy right-handed volunteers were visually presented with three types of stories (Strong Coherence, Weak Coherence, and Control) consisting of three sentences. The causal connectedness among…

  6. Direct Evidence for a Dual Process Model of Deductive Inference

    Science.gov (United States)

    Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie

    2013-01-01

    In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences…

  7. Strongly Correlated Topological Insulators

    Science.gov (United States)

    2016-02-03

    Strongly Correlated Topological Insulators In the past year, the grant was used for work in the field of topological phases, with emphasis on finding...surface of topological insulators. In the past 3 years, we have started a new direction, that of fractional topological insulators. These are materials in which a topologically nontrivial quasi-flat band is fractionally filled and then subject to strong interactions.

  8. Strong Cosmic Censorship

    Science.gov (United States)

    Isenberg, James

    2017-01-01

    The Hawking-Penrose theorems tell us that solutions of Einstein's equations are generally singular, in the sense of the incompleteness of causal geodesics (the paths of physical observers). These singularities might be marked by the blowup of curvature and therefore crushing tidal forces, or by the breakdown of physical determinism. Penrose has conjectured (in his "Strong Cosmic Censorship Conjecture") that it is generically unbounded curvature that causes singularities, rather than causal breakdown. The verification that "AVTD behavior" (marked by the domination of time derivatives over space derivatives) is generically present in a family of solutions has proven to be a useful tool for studying model versions of Strong Cosmic Censorship in that family. I review some of the history of Strong Cosmic Censorship, discuss what is known about AVTD behavior and Strong Cosmic Censorship in families of solutions defined by varying degrees of isometry, and describe recent results which we believe will extend this knowledge and provide new support for Strong Cosmic Censorship. I also comment on some of the recent work on "Weak Null Singularities", and how this relates to Strong Cosmic Censorship.

  9. Embedding relations connected with strong approximation of Fourier ...

    Indian Academy of Sciences (India)

    Embedding relations connected with strong approximation of Fourier series. Bogdan Szal, Faculty of Mathematics, Computer Science and Econometrics, University of Zielona Góra, 65-516 Zielona Góra. © Indian Academy of Sciences.

  10. Strong Arcwise Connectedness

    OpenAIRE

    Espinoza, Benjamin; Gartside, Paul; Kovan-Bakan, Merve; Mamatelashvili, Ana

    2012-01-01

    A space is `n-strong arc connected' (n-sac) if for any n points in the space there is an arc in the space visiting them in order. A space is omega-strong arc connected (omega-sac) if it is n-sac for all n. We study these properties in finite graphs, regular continua, and rational continua. There are no 4-sac graphs, but there are 3-sac graphs and graphs which are 2-sac but not 3-sac. For every n there is an n-sac regular continuum, but no regular continuum is omega-sac. There is an omega-sac ...

  11. Abortion: Strong's counterexamples fail

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2009-01-01

    This paper shows that the counterexamples proposed by Strong in 2008 in the Journal of Medical Ethics to Marquis's argument against abortion fail. Strong's basic idea is that there are cases--for example, terminally ill patients--where killing an adult human being is prima facie seriously morally wrong... one must either admit that the patients in Strong's scenarios have some valuable future or admit that killing them is not seriously morally wrong. Finally, if "valuable future" is interpreted as referring to objective standards, one ends up with implausible and unpalatable moral claims.

  12. Locative inferences in medical texts.

    Science.gov (United States)

    Mayer, P S; Bailey, G H; Mayer, R J; Hillis, A; Dvoracek, J E

    1987-06-01

    Medical research relies on epidemiological studies conducted on a large set of clinical records that have been collected from physicians recording individual patient observations. These clinical records are recorded for the purpose of individual care of the patient with little consideration for their use by a biostatistician interested in studying a disease over a large population. Natural language processing of clinical records for epidemiological studies must deal with temporal, locative, and conceptual issues. This makes text understanding and data extraction of clinical records an excellent area for applied research. While much has been done in making temporal or conceptual inferences in medical texts, parallel work in locative inferences has not been done. This paper examines the locative inferences as well as the integration of temporal, locative, and conceptual issues in the clinical record understanding domain by presenting an application that utilizes two key concepts in its parsing strategy--a knowledge-based parsing strategy and a minimal lexicon.

  13. Quadratic inference functions in marginal models for longitudinal data.

    Science.gov (United States)

    Song, Peter X-K; Jiang, Zhichang; Park, Eunjoo; Qu, Annie

    2009-12-20

    The quadratic inference function (QIF) is a new statistical methodology developed for the estimation and inference in longitudinal data analysis using marginal models. This method is an alternative to the popular generalized estimating equations approach, and it has several useful properties such as robustness, a goodness-of-fit test and model selection. This paper presents an introductory review of the QIF, with a strong emphasis on its applications. In particular, a recently developed SAS MACRO QIF is illustrated in this paper to obtain numerical results.

  14. A strong comeback

    International Nuclear Information System (INIS)

    Marier, D.

    1992-01-01

    This article presents the results of a financial rankings survey which show a strong economic activity in the independent energy industry. The topics of the article include advisor turnover, overseas banks, and the increase in public offerings. The article identifies the top project finance investors for new projects and restructurings and rankings for lenders

  15. Object-Oriented Type Inference

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

    1991-01-01

    We present a new approach to inferring types in untyped object-oriented programs with inheritance, assignments, and late binding. It guarantees that all messages are understood, annotates the program with type information, allows polymorphic methods, and can be used as the basis of an optimizing compiler.

  16. Eight challenges in phylodynamic inference

    Directory of Open Access Journals (Sweden)

    Simon D.W. Frost

    2015-03-01

    Full Text Available The field of phylodynamics, which attempts to enhance our understanding of infectious disease dynamics using pathogen phylogenies, has made great strides in the past decade. Basic epidemiological and evolutionary models are now well characterized with inferential frameworks in place. However, significant challenges remain in extending phylodynamic inference to more complex systems. These challenges include accounting for evolutionary complexities such as changing mutation rates, selection, reassortment, and recombination, as well as epidemiological complexities such as stochastic population dynamics, host population structure, and different patterns at the within-host and between-host scales. An additional challenge exists in making efficient inferences from an ever increasing corpus of sequence data.

  17. Statistical Inference for Fractional Diffusion Processes

    CERN Document Server

    Rao, B L S Prakasa

    2010-01-01

    Statistical Inference for Fractional Diffusion Processes looks at statistical inference for stochastic processes modeled by stochastic differential equations driven by fractional Brownian motion. Related topics, such as sequential inference, nonparametric inference, and parametric estimation, are also discussed. The book deals with fractional diffusion processes (FDP) in relation to statistical inference for stochastic processes. The book's main focus is on parametric and nonparametric inference problems for fractional diffusion processes when a complete path of t

  18. Information Theory, Inference and Learning Algorithms

    Science.gov (United States)

    Mackay, David J. C.

    2003-10-01

    Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.

  19. Grouping preprocess for haplotype inference from SNP and CNV data

    International Nuclear Information System (INIS)

    Shindo, Hiroyuki; Chigira, Hiroshi; Nagaoka, Tomoyo; Inoue, Masato; Kamatani, Naoyuki

    2009-01-01

    The method of statistical haplotype inference is an indispensable technique in the field of medical science. The authors previously reported Hardy-Weinberg equilibrium-based haplotype inference that could manage single nucleotide polymorphism (SNP) data. We recently extended the method to cover copy number variation (CNV) data. Haplotype inference from mixed data is important because SNPs and CNVs are occasionally in linkage disequilibrium. The idea underlying the proposed method is simple, but the algorithm for it needs to be quite elaborate to reduce the calculation cost. Consequently, we have focused on the details of the algorithm in this study. Although the main advantage of the method is accuracy, in that it does not use any approximation, its main disadvantage is still the calculation cost, which is sometimes intractable for large data sets with missing values.

  20. Inferring Human Mobility from Sparse Low Accuracy Mobile Sensing Data

    DEFF Research Database (Denmark)

    Cuttone, Andrea; Jørgensen, Sune Lehmann; Larsen, Jakob Eg

    2014-01-01

    Understanding both collective and personal human mobility is a central topic in Computational Social Science. Smartphone sensing data is emerging as a promising source for studying human mobility. However, most literature focuses on high-precision GPS positioning and high-frequency sampling, which is not always feasible in a longitudinal study or for everyday applications because location sensing has a high battery cost. In this paper we study the feasibility of inferring human mobility from sparse, low accuracy mobile sensing data. We validate our results using participants' location diaries, and analyze the inferred geographical networks, the time spent at different places, and the number of unique places over time. Our results suggest that low resolution data allows accurate inference of human mobility patterns.

  1. Strong Electroweak Symmetry Breaking

    CERN Document Server

    Grinstein, Benjamin

    2011-01-01

    Models of spontaneous breaking of electroweak symmetry by a strong interaction do not have a fine-tuning/hierarchy problem. They are conceptually elegant and use the only mechanism of spontaneous breaking of a gauge symmetry that is known to occur in nature. The simplest model, minimal technicolor with extended technicolor interactions, is appealing because one can calculate by scaling up from QCD. But it is ruled out on many counts: inappropriately low quark and lepton masses (or excessive FCNC), bad electroweak data fits, light scalar and vector states, etc. However, nature may not choose the minimal model and then we are stuck: except possibly through lattice simulations, we are unable to compute and test the models. In the LHC era it therefore makes sense to abandon specific models (of strong EW breaking) and concentrate on generic features that may indicate discovery. The Technicolor Straw Man is not a model but a parametrized search strategy inspired by a remarkable generic feature of walking technicolor,...

  2. Inferring connectivity in networked dynamical systems: Challenges using Granger causality

    Science.gov (United States)

    Lusch, Bethany; Maia, Pedro D.; Kutz, J. Nathan

    2016-09-01

    Determining the interactions and causal relationships between nodes in an unknown networked dynamical system from measurement data alone is a challenging, contemporary task across the physical, biological, and engineering sciences. Statistical methods, such as the increasingly popular Granger causality, are being broadly applied for data-driven discovery of connectivity in fields from economics to neuroscience. A common version of the algorithm is called pairwise-conditional Granger causality, which we systematically test on data generated from a nonlinear model with known causal network structure. Specifically, we simulate networked systems of Kuramoto oscillators and use the Multivariate Granger Causality Toolbox to discover the underlying coupling structure of the system. We compare the inferred results to the original connectivity for a wide range of parameters such as initial conditions, connection strengths, community structures, and natural frequencies. Our results show a significant systematic disparity between the original and inferred network, unless the true structure is extremely sparse or dense. Specifically, the inferred networks have significant discrepancies in the number of edges and the eigenvalues of the connectivity matrix, demonstrating that they typically generate dynamics which are inconsistent with the ground truth. We provide a detailed account of the dynamics for the Erdős-Rényi network model due to its importance in random graph theory and network science. We conclude that Granger causal methods for inferring network structure are highly suspect and should always be checked against a ground truth model. The results also advocate the need to perform such comparisons with any network inference method since the inferred connectivity results appear to have very little to do with the ground truth system.
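
    The experimental design can be sketched in miniature. The Python fragment below simulates two Kuramoto oscillators with a single directed coupling (node 1 drives node 0) and applies a pairwise Granger test; it uses statsmodels rather than the MATLAB Multivariate Granger Causality Toolbox used in the study, and all parameter values are invented.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(1)
        dt, n = 0.05, 4000
        omega = np.array([1.0, 1.3])    # natural frequencies (assumed)
        K = 2.0                         # coupling strength on the edge 1 -> 0 only
        theta = rng.uniform(0, 2 * np.pi, 2)
        obs = np.empty((n, 2))
        for t in range(n):
            drive = np.array([K * np.sin(theta[1] - theta[0]), 0.0])
            theta = theta + dt * (omega + drive) + 0.05 * rng.standard_normal(2)
            obs[t] = np.sin(theta)

        # Column order (target, candidate cause): does node 1 Granger-cause node 0?
        res = grangercausalitytests(obs, maxlag=5, verbose=False)
        print({lag: round(r[0]['ssr_ftest'][1], 4) for lag, r in res.items()})  # p-values

    Repeating the test with the columns swapped probes the (absent) reverse edge, which is the kind of comparison against a known ground truth that the paper advocates.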

  3. Inference Optimization using Relational Algebra

    NARCIS (Netherlands)

    Evers, S.; Fokkinga, M.M.; Apers, Peter M.G.

    Exact inference procedures in Bayesian networks can be expressed using relational algebra; this provides a common ground for optimizations from the AI and database communities. Specifically, the ability to accomodate sparse representations of probability distributions opens up the way to optimize

  4. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the

  5. Plasmons in strong superconductors

    International Nuclear Information System (INIS)

    Baldo, M.; Ducoin, C.

    2011-01-01

    We present a study of the possible plasmon excitations that can occur in systems where strong superconductivity is present. In these systems the plasmon energy is comparable to or smaller than the pairing gap. As a prototype of these systems we consider the proton component of Neutron Star matter just below the crust when electron screening is not taken into account. For the realistic case we consider in detail the different aspects of the elementary excitations when the proton and electron components are treated within the Random-Phase Approximation generalized to the superfluid case, while the influence of the neutron component is considered only at a qualitative level. Electron screening plays a major role in modifying the proton spectrum and spectral function. At the same time the electron plasmon is strongly modified and damped by the indirect coupling with the superfluid proton component, even at moderately low values of the gap. The excitation spectrum shows the interplay of the different components and their relevance for each excitation mode. The results are relevant for neutrino physics and thermodynamical processes in neutron stars. If electron screening is neglected, the spectral properties of the proton component show some resemblance with the physical situation in high-Tc superconductors, and we briefly discuss similarities and differences in this connection. In a broader perspective, the results of the study emphasize the role of Coulomb interaction in strong superconductors.

  6. Causal inference for Mann-Whitney-Wilcoxon rank sum and other nonparametric statistics.

    Science.gov (United States)

    Wu, P; Han, Y; Chen, T; Tu, X M

    2014-04-15

    The nonparametric Mann-Whitney-Wilcoxon (MWW) rank sum test is widely used to test treatment effect by comparing the outcome distributions between two groups, especially when there are outliers in the data. However, such statistics generally yield invalid conclusions when applied to nonrandomized studies, particularly those in epidemiologic research. Although one may control for selection bias by using available approaches to covariate adjustment, such as matching, regression analysis, propensity score matching, and marginal structural models, such analyses yield results that are not only subjective, depending on how the outliers are handled, but also often difficult to interpret. A popular alternative is a conditional permutation test based on randomization inference [Rosenbaum PR. Covariance adjustment in randomized experiments and observational studies. Statistical Science 2002; 17(3):286-327]. Because it requires strong and implausible assumptions that may not be met in most applications, this approach has limited applicability in practice. In this paper, we address this gap in the literature by extending MWW and other nonparametric statistics to provide causal inference for nonrandomized study data by integrating the potential outcome paradigm with the functional response models (FRM). FRM is uniquely positioned to model dynamic relationships between subjects, rather than attributes of a single subject as in most regression models, such as the MWW test within our context. The proposed approach is illustrated with data from both real and simulated studies. Copyright © 2013 John Wiley & Sons, Ltd.
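
    As background, the following is a minimal sketch of the standard MWW comparison that the paper extends (Python with scipy; the data are invented). The rank-based statistic limits the influence of outliers, but, as the entry stresses, for nonrandomized data this comparison alone does not license a causal conclusion.

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(2)
        treated = rng.normal(1.0, 1.0, 80)
        control = rng.normal(0.5, 1.0, 80)
        control[:4] += 8.0   # a few outliers; ranks limit their influence

        u_stat, p_value = mannwhitneyu(treated, control, alternative='two-sided')
        print(u_stat, p_value)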

  7. Strong-coupling approximations

    International Nuclear Information System (INIS)

    Abbott, R.B.

    1984-03-01

    Standard path-integral techniques such as instanton calculations give good answers for weak-coupling problems, but become unreliable for strong coupling. Here we consider a method of replacing the original potential by a suitably chosen harmonic oscillator potential. Physically this is motivated by the fact that potential barriers below the level of the ground-state energy of a quantum-mechanical system have little effect. Numerically, results are good, both for quantum-mechanical problems and for massive φ⁴ field theory in 1 + 1 dimensions. 9 references, 6 figures

  8. Strong interaction and QFD

    International Nuclear Information System (INIS)

    Ebata, T.

    1981-01-01

    With an assumed weak multiplet structure for bosonic hadrons, which is consistent with the ΔI = 1/2 rule, it is shown that the strong interaction effective Hamiltonian is compatible with the weak SU(2) × U(1) gauge transformation. In particular, the rho-meson transforms as a triplet under SU(2)_w, and this is the origin of the rho-photon analogy. It is also shown that the existence of the non-vanishing Cabibbo angle is a necessary condition for the absence of the exotic hadrons. (orig.)

  9. Quantifying the multi-scale performance of network inference algorithms.

    Science.gov (United States)

    Oates, Chris J; Amos, Richard; Spencer, Simon E F

    2014-10-01

    Graphical models are widely used to study complex multivariate biological systems. Network inference algorithms aim to reverse-engineer such models from noisy experimental data. It is common to assess such algorithms using techniques from classifier analysis. These metrics, based on ability to correctly infer individual edges, possess a number of appealing features including invariance to rank-preserving transformation. However, regulation in biological systems occurs on multiple scales and existing metrics do not take into account the correctness of higher-order network structure. In this paper novel performance scores are presented that share the appealing properties of existing scores, whilst capturing ability to uncover regulation on multiple scales. Theoretical results confirm that performance of a network inference algorithm depends crucially on the scale at which inferences are to be made; in particular strong local performance does not guarantee accurate reconstruction of higher-order topology. Applying these scores to a large corpus of data from the DREAM5 challenge, we undertake a data-driven assessment of estimator performance. We find that the "wisdom of crowds" network, that demonstrated superior local performance in the DREAM5 challenge, is also among the best performing methodologies for inference of regulation on multiple length scales.
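
    The central observation, that strong edge-level performance need not imply accurate higher-order structure, can be illustrated crudely (Python; the spectral summary below is only an illustration, not one of the paper's proposed scores).

        import numpy as np

        rng = np.random.default_rng(3)
        n = 50
        truth = (rng.random((n, n)) < 0.08).astype(float)
        np.fill_diagonal(truth, 0)

        inferred = truth.copy()
        nodes = rng.choice(n, 3, replace=False)
        inferred[nodes, :] = 0.0   # drop every outgoing edge of three nodes

        edge_acc = (inferred == truth).mean()   # local, edge-level agreement
        spec_gap = np.linalg.norm(np.sort(np.linalg.eigvals(inferred).real)
                                  - np.sort(np.linalg.eigvals(truth).real))
        print(edge_acc, spec_gap)   # near-perfect edge accuracy, large spectral error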

  10. Rotating compressible fluids under strong stratification

    Czech Academy of Sciences Publication Activity Database

    Feireisl, Eduard; Lu, Y.; Novotný, A.

    2014-01-01

    Roč. 19, October (2014), s. 11-18 ISSN 1468-1218 Keywords : rotating fluid * compressible Navier-Stokes * strong stratification Subject RIV: BA - General Mathematics Impact factor: 2.519, year: 2014 http://www.sciencedirect.com/science/article/pii/S1468121814000212#

  11. Strong Coupling Holography

    CERN Document Server

    Dvali, Gia

    2009-01-01

    We show that whenever a 4-dimensional theory with N particle species emerges as a consistent low energy description of a 3-brane embedded in an asymptotically-flat (4+d)-dimensional space, the holographic scale of high-dimensional gravity sets the strong coupling scale of the 4D theory. This connection persists in the limit in which gravity can be consistently decoupled. We demonstrate this effect for orbifold planes, as well as for the solitonic branes and string theoretic D-branes. In all cases the emergence of a 4D strong coupling scale from bulk holography is a persistent phenomenon. The effect turns out to be insensitive even to such extreme deformations of the brane action that seemingly shield 4D theory from the bulk gravity effects. A well understood example of such deformation is given by large 4D Einstein term in the 3-brane action, which is known to suppress the strength of 5D gravity at short distances and change the 5D Newton's law into the four-dimensional one. Nevertheless, we observe that the ...

  12. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject Examples drawn from ecology and wildlife research An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference Companion website with analyt

  13. Statistical inference on residual life

    CERN Document Server

    Jeong, Jong-Hyeon

    2014-01-01

    This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.

  14. Bayesian Inference on Proportional Elections

    Science.gov (United States)

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to estimate the probability that a given party will have representation in the Chamber of Deputies was developed. Inferences were made in a Bayesian setting using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
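
    A minimal sketch of such a Monte Carlo computation is given below, in Python rather than the R implementation used in the paper; the poll counts are invented and a simplified D'Hondt-style largest-averages rule stands in for the exact Brazilian allocation.

        import numpy as np

        rng = np.random.default_rng(4)
        poll = np.array([420, 310, 150, 70, 50])   # hypothetical poll counts per party
        seats_total, n_sims = 10, 20_000

        def allocate(votes, seats):
            # D'Hondt-style largest-averages allocation (simplified).
            alloc = np.zeros(len(votes), dtype=int)
            for _ in range(seats):
                alloc[np.argmax(votes / (alloc + 1))] += 1
            return alloc

        # Posterior over vote shares: Dirichlet with a flat prior on the poll counts.
        shares = rng.dirichlet(poll + 1, size=n_sims)
        rep = np.array([allocate(s, seats_total) > 0 for s in shares])
        print(rep.mean(axis=0))   # estimated probability that each party wins a seat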

  15. Statistical inference an integrated approach

    CERN Document Server

    Migon, Helio S; Louzada, Francisco

    2014-01-01

    Introduction: Information; The concept of probability; Assessing subjective probabilities; An example; Linear algebra and probability; Notation; Outline of the book. Elements of Inference: Common statistical models; Likelihood-based functions; Bayes theorem; Exchangeability; Sufficiency and exponential family; Parameter elimination. Prior Distribution: Entirely subjective specification; Specification through functional forms; Conjugacy with the exponential family; Non-informative priors; Hierarchical priors. Estimation: Introduction to decision theory; Bayesian point estimation; Classical point estimation; Empirical Bayes estimation; Comparison of estimators; Interval estimation; Estimation in the Normal model. Approximating Methods: The general problem of inference; Optimization techniques; Asymptotic theory; Other analytical approximations; Numerical integration methods; Simulation methods. Hypothesis Testing: Introduction; Classical hypothesis testing; Bayesian hypothesis testing; Hypothesis testing and confidence intervals; Asymptotic tests. Prediction...

  16. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to estimate the probability that a given party will have representation in the Chamber of Deputies was developed. Inferences were made in a Bayesian setting using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.

  17. LIGO: The strong belief

    CERN Multimedia

    Antonella Del Rosso

    2016-01-01

    Twenty years of designing, building and testing a number of innovative technologies, with the strong belief that the endeavour would lead to a historic breakthrough. The Bulletin publishes an abstract of the Courier’s interview with Barry Barish, one of the founding fathers of LIGO.   The plots show the signals of gravitational waves detected by the twin LIGO observatories at Livingston, Louisiana, and Hanford, Washington. (Image: Caltech/MIT/LIGO Lab) On 11 February, the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo collaborations published a historic paper in which they showed a gravitational-wave signal emitted by the merger of two black holes. These results come after 20 years of hard work by a large collaboration of scientists operating the two LIGO observatories in the US. Barry Barish, Linde Professor of Physics, Emeritus at the California Institute of Technology and former Director of the Global Design Effort for the Internat...

  18. Racing for conditional independence inference

    Czech Academy of Sciences Publication Activity Database

    Bouckaert, R. R.; Studený, Milan

    2005-01-01

    Roč. 3571, - (2005), s. 221-232 ISSN 0302-9743. [ECSQARU 2005. European Conference /8./. Barcelona, 06.07.2005-08.07.2005] R&D Projects: GA ČR GA201/04/0393; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional independence inference * imset * racing algorithms Subject RIV: BA - General Mathematics

  19. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests on the assumptions of randomness and normality, and provides nonparametric methods where parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal

  20. On Quantum Statistical Inference, II

    OpenAIRE

    Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P. E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes and inte...

  1. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere

  2. Computational Neuropsychology and Bayesian Inference.

    Science.gov (United States)

    Parr, Thomas; Rees, Geraint; Friston, Karl J

    2018-01-01

    Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology - optimal inference with suboptimal priors - and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.
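
    The notion of a Bayes optimal pathology, optimal inference with suboptimal priors, can be made concrete with a Gaussian toy example (Python; all numbers are illustrative). Both calls below apply exact Bayes' rule; only the prior differs, and the rigid aberrant prior pulls the posterior far from the sensory evidence.

        def posterior(prior_mu, prior_var, obs, obs_var):
            # Conjugate Gaussian update: the posterior mean is a precision-weighted
            # compromise between the prior and the observation.
            w = prior_var / (prior_var + obs_var)
            return prior_mu + w * (obs - prior_mu), prior_var * obs_var / (prior_var + obs_var)

        obs = 2.0  # sensory sample; suppose the true cause is 2
        print(posterior(prior_mu=0.0, prior_var=4.0, obs=obs, obs_var=1.0))   # healthy prior
        print(posterior(prior_mu=-3.0, prior_var=0.1, obs=obs, obs_var=1.0))  # rigid, aberrant prior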

  3. Continuous Integrated Invariant Inference, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project will develop a new technique for invariant inference and embed this and other current invariant inference and checking techniques in an...

  4. Variational inference & deep learning : A new synthesis

    NARCIS (Netherlands)

    Kingma, D.P.

    2017-01-01

    In this thesis, Variational Inference and Deep Learning: A New Synthesis, we propose novel solutions to the problems of variational (Bayesian) inference, generative modeling, representation learning, semi-supervised learning, and stochastic optimization.

  5. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

    Background: There are a number of statistical inference problems that are not generally formulated via a full probability model. For the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle.

  6. John Strong (1941 - 2006)

    CERN Multimedia

    Wickens, F

    Our friend and colleague John Strong was cruelly taken from us by a brain tumour on Monday 31st July, a few days before his 65th birthday John started his career working with a group from Westfield College, under the leadership of Ted Bellamy. He obtained his PhD and spent the early part of his career on experiments at Rutherford Appleton Laboratory (RAL), but after the early 1970s his research was focussed on experiments at CERN. Over the years he made a number of notable contributions to experiments at CERN: The Omega spectrometer adopted a system John had originally developed for experiments at RAL using vidicon cameras to record the sparks in the spark chambers; He contributed to the success of NA1 and NA7, where he became heavily involved in the electronic trigger systems; He was responsible for the second level trigger system for the ALEPH detector and spent five years leading a team that designed and built the system, which ran for twelve years with only minor interventions. Following ALEPH he tur...

  7. Stirring Strongly Coupled Plasma

    CERN Document Server

    Fadafan, Kazem Bitaghsir; Rajagopal, Krishna; Wiedemann, Urs Achim

    2009-01-01

    We determine the energy it takes to move a test quark along a circle of radius L with angular frequency w through the strongly coupled plasma of N=4 supersymmetric Yang-Mills (SYM) theory. We find that for most values of L and w the energy deposited by stirring the plasma in this way is governed either by the drag force acting on a test quark moving through the plasma in a straight line with speed v=Lw or by the energy radiated by a quark in circular motion in the absence of any plasma, whichever is larger. There is a continuous crossover from the drag-dominated regime to the radiation-dominated regime. In the crossover regime we find evidence for significant destructive interference between energy loss due to drag and that due to radiation as if in vacuum. The rotating quark thus serves as a model system in which the relative strength of, and interplay between, two different mechanisms of parton energy loss is accessible via a controlled classical gravity calculation. We close by speculating on the implicati...

  8. Strong-interaction nonuniversality

    International Nuclear Information System (INIS)

    Volkas, R.R.; Foot, R.; He, X.; Joshi, G.C.

    1989-01-01

    The universal QCD color theory is extended to an SU(3)_1 × SU(3)_2 × SU(3)_3 gauge theory, where quarks of the ith generation transform as triplets under SU(3)_i and singlets under the other two factors. The usual color group is then identified with the diagonal subgroup, which remains exact after symmetry breaking. The gauge bosons associated with the 16 broken generators then form two massive octets under ordinary color. The interactions between quarks and these heavy gluonlike particles are explicitly nonuniversal and thus an exploration of their physical implications allows us to shed light on the fundamental issue of strong-interaction universality. Nonuniversality and weak flavor mixing are shown to generate heavy-gluon-induced flavor-changing neutral currents. The phenomenology of these processes is studied, as they provide the major experimental constraint on the extended theory. Three symmetry-breaking scenarios are presented. The first has color breaking occurring at the weak scale, while the second and third divorce the two scales. The third model has the interesting feature of radiatively induced off-diagonal Kobayashi-Maskawa matrix elements

  9. Computational Neuropsychology and Bayesian Inference

    Directory of Open Access Journals (Sweden)

    Thomas Parr

    2018-02-01

    Full Text Available Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine ‘prior’ beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology – optimal inference with suboptimal priors – and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient’s behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.

  10. Plasma pressure and anisotropy inferred from the Tsyganenkomagnetic field model

    Directory of Open Access Journals (Sweden)

    F. Cao

    Full Text Available A numerical procedure has been developed to deduce the plasma pressure and anisotropy from the Tsyganenko magnetic field model. The Tsyganenko empirical field model, which is based on vast satellite field data, provides a realistic description of the magnetic field configuration in the magnetosphere. When force balance under the static condition is assumed, the electromagnetic J×B force from the Tsyganenko field model can be used to infer the plasma pressure and anisotropy distributions consistent with the field model. It is found that the J×B force obtained from the Tsyganenko field model is not curl-free. The curl-free part of the J×B force in an empirical field model can be balanced by the gradient of the isotropic pressure, while the nonzero curl of the J×B force can only be associated with the pressure anisotropy. The plasma pressure and anisotropy in the near-Earth plasma sheet are numerically calculated to obtain a static equilibrium consistent with the Tsyganenko field model, both in the noon-midnight meridian and in the equatorial plane. The plasma pressure distribution deduced from the Tsyganenko 1989 field model is highly anisotropic and shows this feature early in the substorm growth phase. The pressure anisotropy parameter α_P, defined as α_P = 1 − P_∥/P_⊥, is typically ~0.3 at x ≈ −4.5 R_E and gradually decreases to a small negative value with increasing tailward distance. The pressure anisotropy from the Tsyganenko 1989 model accounts for 50% of the cross-tail current at maximum, and only in a highly localized region near x ≈ −10 R_E. In comparison, the plasma pressure anisotropy inferred from the Tsyganenko 1987 model is much smaller. We also find that the boundary
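
    The deduction rests on static force balance with a gyrotropic pressure tensor. A standard statement of the relations involved is sketched below in LaTeX; the notation, and in particular the reconstructed definition of α_P, is assumed here rather than copied from the paper.

        % Static force balance with a gyrotropic (CGL-form) pressure tensor:
        \nabla \cdot \mathsf{P} = \mathbf{J} \times \mathbf{B},
        \qquad
        \mathsf{P} = P_\perp \mathsf{I}
                   + \left(P_\parallel - P_\perp\right)\hat{\mathbf{b}}\hat{\mathbf{b}},
        \qquad
        \alpha_P = 1 - \frac{P_\parallel}{P_\perp}.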

  11. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
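
    The conditional-intensity route admits a compact sketch. The Python fragment below evaluates the log-likelihood of an exponential-kernel Hawkes process using Ogata's recursion; the kernel choice and parameter values are assumptions for illustration, and an MCMC sampler such as the ones discussed in the paper would evaluate this quantity together with a prior.

        import numpy as np

        def hawkes_loglik(times, T, mu, alpha, beta):
            # Intensity: lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta (t - t_i)).
            times = np.asarray(times)
            a = 0.0        # recursively maintained sum of decayed past excitations
            loglik = 0.0
            for i, t in enumerate(times):
                if i > 0:
                    a = np.exp(-beta * (t - times[i - 1])) * (1.0 + a)
                loglik += np.log(mu + alpha * a)
            # Subtract the compensator, the integral of the intensity over [0, T].
            loglik -= mu * T + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (T - times)))
            return loglik

        print(hawkes_loglik([0.5, 1.2, 1.3, 4.0], T=5.0, mu=0.5, alpha=0.8, beta=1.5))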

  12. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....

  13. SICK: THE SPECTROSCOPIC INFERENCE CRANK

    International Nuclear Information System (INIS)

    Casey, Andrew R.

    2016-01-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  14. sick: The Spectroscopic Inference Crank

    Science.gov (United States)

    Casey, Andrew R.

    2016-03-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  15. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  16. Type Inference of Turbo Pascal

    DEFF Research Database (Denmark)

    Hougaard, Ole Ildsgaard; Schwartzbach, Michael I; Askari, Hosein

    1995-01-01

    Type inference is generally thought of as being an exclusive property of the functional programming paradigm. We argue that such a feature may be of significant benefit for also standard imperative languages. We present a working tool (available by WWW) providing these benefits for a full version...... of Turbo Pascal. It has the form of a preprocessor that analyzes programs in which the type annotations are only partial or even absent. The resulting program has full type annotations, will be accepted by the standard Turbo Pascal compiler, and has polymorphic use of procedures resolved by means of code...

  17. Inferring network structure from cascades

    Science.gov (United States)

    Ghonge, Sushrut; Vural, Dervis Can

    2017-07-01

    Many physical, biological, and social phenomena can be described by cascades taking place on a network. Often, the activity can be empirically observed, but not the underlying network of interactions. In this paper we offer three topological methods to infer the structure of any directed network given a set of cascade arrival times. Our formulas hold for a very general class of models where the activation probability of a node is a generic function of its degree and the number of its active neighbors. We report high success rates for synthetic and real networks, for several different cascade models.
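
    In the spirit of such methods, though far cruder than the formulas in the paper, even a simple precedence count over arrival times yields a network estimate (Python; the window, threshold, and arrival times are all invented).

        import numpy as np

        def infer_edges(arrivals, window=1.5, threshold=0.5):
            # Score edge i -> j by how often i fires shortly before j across
            # cascades, then threshold the scores into an adjacency matrix.
            n_cascades, n_nodes = arrivals.shape
            score = np.zeros((n_nodes, n_nodes))
            for times in arrivals:   # one cascade = one arrival time per node
                for i in range(n_nodes):
                    for j in range(n_nodes):
                        if i != j and 0 < times[j] - times[i] <= window:
                            score[i, j] += 1.0 / n_cascades
            return score > threshold

        # Hypothetical arrival times (np.inf = never activated) for a chain 0 -> 1 -> 2.
        arrivals = np.array([[0.0, 1.0, 2.1],
                             [0.0, 0.9, 1.8],
                             [0.0, 1.1, np.inf]])
        print(infer_edges(arrivals).astype(int))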

  18. Inferring And Possibilities, Rather Than Natural Laws, In Robust Climate Modeling

    Science.gov (United States)

    Brumble, K. C.

    2011-12-01

    One concern raised about sciences which rely upon simulation models (such as climatology) is that the nature of simulations calls into question the soundness of the inferences that can be drawn from them. I argue that this concern stems from a belief that simulation models must provide laws in order to count as rigorous science, when in actual practice simulation models can investigate a variety of types of possibility with differing inferential potential, and that these differing inferences are necessary parts of the experimental process. I appeal to philosophical work in epistemology to make the argument that simulation models in general - and climate models in particular - explore different kinds of possibility, from logical possibilities to physical possibilities. I then argue that not only is this plurality of inference compatible with robust modeling practices, but that it leads to stronger inferences for climatology.

  19. Local temperatures inferred from plant communities suggest strong spatial buffering of climate warming across Northern Europe

    DEFF Research Database (Denmark)

    Lenoir, Jonathan; Graae, Bente; Aarrestad, Per

    2013-01-01

    Recent studies from mountainous areas of small spatial extent (<2500 km²) suggest that fine-grained thermal variability over tens or hundreds of metres exceeds much of the climate warming expected for the coming decades. Such variability in temperature provides buffering to mitigate climate-ch...

  20. Inferring Adolescent Social Networks Using Partial Ego-Network Substance Use Data

    Science.gov (United States)

    2008-05-15

    Inferring Adolescent Social Networks Using Partial Ego-Network Substance Use Data. Ju-Sung Lee, Department of Social and Decision Sciences, College of... The sample size of the social networks in the National Longitudinal Study of Adolescent Health (Add Health) (Bearman et al., 2004) is of

  1. Inferring Identity From Language: Linguistic Intergroup Bias Informs Social Categorization.

    Science.gov (United States)

    Porter, Shanette C; Rheinschmidt-Same, Michelle; Richeson, Jennifer A

    2016-01-01

    The present research examined whether a communicator's verbal, implicit message regarding a target is used as a cue for inferring that communicator's social identity. Previous research has found linguistic intergroup bias (LIB) in individuals' speech: They use abstract language to describe in-group targets' desirable behaviors and concrete language to describe their undesirable behaviors (favorable LIB), but use concrete language for out-group targets' desirable behaviors and abstract language for their undesirable behaviors (unfavorable LIB). Consequently, one can infer the type of language a communicator is likely to use to describe in-group and out-group targets. We hypothesized and found evidence for the reverse inference. Across four studies, individuals inferred a communicator's social identity on the basis of the communicator's use of an LIB. Specifically, participants more strongly believed that a communicator and target shared a social identity when the communicator used the favorable, rather than the unfavorable, LIB in describing that target. © The Author(s) 2015.

  2. Strong-Q-sequences and small d

    Czech Academy of Sciences Publication Activity Database

    Chodounský, David

    2012-01-01

    Roč. 159, č. 3 (2012), s. 2942-2946 ISSN 0166-8641. [Prague Symposium on General Topology and its Relations to Modern Analysis and Algebra /11./. Prague, 07.08.2011-12.08.2011] Institutional support: RVO:67985840 Keywords : Katowice problem * strong-Q-sequence * dominating number Subject RIV: BA - General Mathematics Impact factor: 0.562, year: 2012 http://www.sciencedirect.com/science/article/pii/S0166864112002222

  3. Bayesian Inference of Tumor Hypoxia

    Science.gov (United States)

    Gunawan, R.; Tenti, G.; Sivaloganathan, S.

    2009-12-01

    Tumor hypoxia is a state of oxygen deprivation in tumors. It has been associated with aggressive tumor phenotypes and with increased resistance to conventional cancer therapies. In this study, we report on the application of Bayesian sequential analysis in estimating the most probable value of tumor hypoxia quantification based on immunohistochemical assays of a biomarker. The "gold standard" of tumor hypoxia assessment is a direct measurement of pO2 in vivo by the Eppendorf polarographic electrode, which is an invasive technique restricted to accessible sites and living tissues. An attractive alternative is immunohistochemical staining to detect proteins expressed by cells during hypoxia. Carbonic anhydrase IX (CAIX) is an enzyme expressed on the cell membrane during hypoxia to balance the immediate extracellular microenvironment. CAIX is widely regarded as a surrogate marker of chronic hypoxia in various cancers. The study was conducted with two different experimental procedures. The first data set was a group of three patients with invasive cervical carcinomas, from which five biopsies were obtained. Each of the biopsies was fully sectioned and from each section, the proportion of CAIX-positive cells was estimated. Measurements were made by image analysis of multiple deep sections cut through these biopsies, labeled for CAIX using both immunofluorescence and immunohistochemical techniques [1]. The second data set was a group of 24 patients, also with invasive cervical carcinomas, from which two biopsies were obtained. Bayesian parameter estimation was applied to obtain a reliable inference about the proportion of CAIX-positive cells within the carcinomas, based on the available biopsies. From the first data set, two to three biopsies were found to be sufficient to infer the overall CAIX percentage in the simple form: best estimate ± uncertainty. The second data-set led to a similar result in 70% of the cases. In the remaining cases Bayes' theorem warned us
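
    As a minimal sketch of the kind of conjugate updating involved (made-up counts, not the study's data), the following Python snippet updates a Beta prior on the CAIX-positive fraction biopsy by biopsy and reports the posterior in the best estimate ± uncertainty form:

      import numpy as np
      from scipy import stats

      a, b = 1.0, 1.0                                  # flat Beta(1, 1) prior on the CAIX-positive fraction
      biopsies = [(120, 400), (95, 380), (110, 390)]   # illustrative (positive cells, cells counted) per biopsy

      for k, (pos, total) in enumerate(biopsies, start=1):
          a += pos                                     # conjugate update: Beta prior + binomial likelihood
          b += total - pos
          mean = a / (a + b)
          sd = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
          lo, hi = stats.beta.ppf([0.025, 0.975], a, b)
          print(f"after biopsy {k}: {mean:.3f} +/- {sd:.3f} (95% interval {lo:.3f}-{hi:.3f})")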

  4. Cognitive Inference Device for Activity Supervision in the Elderly

    Science.gov (United States)

    2014-01-01

    Human activity, life span, and quality of life are enhanced by innovations in science and technology. Aging individuals need to take advantage of these developments to lead a self-regulated life. However, maintaining a self-regulated life at old age involves a high degree of risk, and the elderly often fail at this goal. Thus, the objective of our study is to investigate the feasibility of implementing a cognitive inference device (CI-device) for effective activity supervision in the elderly. To frame the CI-device, we propose a device design framework along with an inference algorithm and implement the designs through an artificial neural model with different configurations, mapping the CI-device's functions to minimise the device's prediction error. An analysis and discussion are then provided to validate the feasibility of CI-device implementation for activity supervision in the elderly. PMID:25405211

  5. Spontaneous Trait Inferences on Social Media.

    Science.gov (United States)

    Levordashka, Ana; Utz, Sonja

    2017-01-01

    The present research investigates whether spontaneous trait inferences occur under conditions characteristic of social media and networking sites: nonextreme, ostensibly self-generated content, simultaneous presentation of multiple cues, and self-paced browsing. We used an established measure of trait inferences (false recognition paradigm) and a direct assessment of impressions. Without being asked to do so, participants spontaneously formed impressions of people whose status updates they saw. Our results suggest that trait inferences occurred from nonextreme self-generated content, which is commonly found in social media updates (Experiment 1) and when nine status updates from different people were presented in parallel (Experiment 2). Although inferences did occur during free browsing, the results suggest that participants did not necessarily associate the traits with the corresponding status update authors (Experiment 3). Overall, the findings suggest that spontaneous trait inferences occur on social media. We discuss implications for online communication and research on spontaneous trait inferences.

  6. Efficient Bayesian inference for natural time series using ARFIMA processes

    Science.gov (United States)

    Graves, Timothy; Gramacy, Robert; Franzke, Christian; Watkins, Nicholas

    2016-04-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. We present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators [1]. In addition we show how the method can be used to perform joint inference of the stability exponent and the memory parameter when ARFIMA is extended to allow for alpha-stable innovations. Such models can be used to study systems where heavy tails and long range memory coexist. [1] Graves et al, Nonlin. Processes Geophys., 22, 679-700, 2015; doi:10.5194/npg-22-679-2015.
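
    For readers unfamiliar with ARFIMA, the sketch below (a toy illustration under simplifying assumptions, not the authors' approximate likelihood) simulates an ARFIMA(0, d, 0) series via the fractional-differencing weights and recovers a grid posterior for the long-memory parameter d under a flat prior:

      import numpy as np

      rng = np.random.default_rng(0)

      def frac_diff_weights(d, n):
          # binomial expansion of (1 - B)^d: w_0 = 1, w_k = w_{k-1} (k - 1 - d) / k
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - d) / k
          return w

      def simulate_arfima(d, n, burn=500):
          eps = rng.standard_normal(n + burn)
          w = frac_diff_weights(d, n + burn)
          x = np.zeros(n + burn)
          for t in range(n + burn):
              # (1 - B)^d x_t = eps_t  =>  x_t = eps_t - sum_{k>=1} w_k x_{t-k}
              x[t] = eps[t] - np.dot(w[1:t + 1], x[:t][::-1])
          return x[burn:]

      def approx_loglik(d, x):
          # profile Gaussian log-likelihood of the fractionally differenced residuals
          w = frac_diff_weights(d, len(x))
          e = np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])
          return -0.5 * len(x) * np.log(np.mean(e ** 2))

      x = simulate_arfima(0.3, 400)
      grid = np.linspace(0.01, 0.49, 49)
      logp = np.array([approx_loglik(d, x) for d in grid])
      post = np.exp(logp - logp.max())
      post /= post.sum()
      print("posterior mean of d:", round(float(np.sum(grid * post)), 3))   # should land near 0.3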

  7. Statistical inference for financial engineering

    CERN Document Server

    Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki

    2014-01-01

    This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.

  8. Inferring echolocation in ancient bats.

    Science.gov (United States)

    Simmons, Nancy B; Seymour, Kevin L; Habersetzer, Jörg; Gunnell, Gregg F

    2010-08-19

    Laryngeal echolocation, used by most living bats to form images of their surroundings and to detect and capture flying prey, is considered to be a key innovation for the evolutionary success of bats, and palaeontologists have long sought osteological correlates of echolocation that can be used to infer the behaviour of fossil bats. Veselka et al. argued that the most reliable trait indicating echolocation capabilities in bats is an articulation between the stylohyal bone (part of the hyoid apparatus that supports the throat and larynx) and the tympanic bone, which forms the floor of the middle ear. They examined the oldest and most primitive known bat, Onychonycteris finneyi (early Eocene, USA), and argued that it showed evidence of this stylohyal-tympanic articulation, from which they concluded that O. finneyi may have been capable of echolocation. We disagree with their interpretation of key fossil data and instead argue that O. finneyi was probably not an echolocating bat.

  9. Polynomial Regressions and Nonsense Inference

    Directory of Open Access Journals (Sweden)

    Daniel Ventosa-Santaulària

    2013-11-01

    Full Text Available Polynomial specifications are widely used, not only in applied economics, but also in epidemiology, physics, political analysis and psychology, just to mention a few examples. In many cases, the data employed to estimate such specifications are time series that may exhibit stochastic nonstationary behavior. We extend Phillips' results (Phillips, P. Understanding spurious regressions in econometrics. J. Econom. 1986, 33, 311–340) by proving that an inference drawn from polynomial specifications, under stochastic nonstationarity, is misleading unless the variables cointegrate. We use a generalized polynomial specification as a vehicle to study its asymptotic and finite-sample properties. Our results, therefore, lead to a call to be cautious whenever practitioners estimate polynomial regressions.
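
    A quick way to see the danger, sketched below under simple assumptions (two independent driftless random walks, a quadratic specification), is to simulate the naive t-test on the linear term; the rejection rate lands far above the nominal 5%:

      import numpy as np

      rng = np.random.default_rng(1)
      n, reps, rejections = 200, 500, 0
      for _ in range(reps):
          x = np.cumsum(rng.standard_normal(n))        # I(1) regressor
          y = np.cumsum(rng.standard_normal(n))        # independent I(1) regressand
          X = np.column_stack([np.ones(n), x, x ** 2]) # quadratic polynomial specification
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          s2 = resid @ resid / (n - X.shape[1])
          se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
          if abs(beta[1] / se[1]) > 1.96:              # naive 5% t-test on the linear term
              rejections += 1
      print("spurious rejection rate:", rejections / reps)  # far above the nominal 0.05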

  10. Type inference for correspondence types

    DEFF Research Database (Denmark)

    Hüttel, Hans; Gordon, Andy; Hansen, Rene Rydhof

    2009-01-01

    We present a correspondence type/effect system for authenticity in a π-calculus with polarized channels, dependent pair types and effect terms and show how one may, given a process P and an a priori type environment E, generate constraints that are formulae in the Alternating Least Fixed-Point (ALFP) logic. We then show how a reasonable model of the generated constraints yields a type/effect assignment such that P becomes well-typed with respect to E if and only if this is possible. The formulae generated satisfy a finite model property; a system of constraints is satisfiable if and only if it has a finite model. As a consequence, we obtain the result that type/effect inference in our system is polynomial-time decidable.

  11. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose {\\primula}--generated propositional instances have thousands of variables, and whose jointrees have clusters...

  12. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... and computational complexity. We also analyze the impact of transceiver filters on the sparseness of the channel response, and propose a dictionary design that permits the deployment of sparse inference methods in conditions of low bandwidth....

  13. Inference Attacks and Control on Database Structures

    Directory of Open Access Journals (Sweden)

    Muhamed Turkanovic

    2015-02-01

    Full Text Available Today’s databases store information with sensitivity levels that range from public to highly sensitive, hence ensuring confidentiality can be highly important, but also requires costly control. This paper focuses on the inference problem on different database structures. It presents possible threats to privacy in relation to inference, and control methods for mitigating these threats. The paper shows that using only access control, without any inference control, is inadequate, since these models are unable to protect against indirect data access. Furthermore, it covers new inference problems which arise from the dimensions of new technologies like XML, semantics, etc.
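
    As a toy illustration of indirect data access (hypothetical names and salaries, and a deliberately naive policy that permits only aggregate queries), two allowed SUM queries that differ by a single row disclose that row's sensitive value exactly:

      # hypothetical table of sensitive values
      salaries = {"ana": 51000, "ben": 48000, "cara": 77000, "dan": 49000}

      def sum_query(predicate):
          # stand-in for an aggregate query the access-control policy allows
          return sum(v for k, v in salaries.items() if predicate(k))

      total_all = sum_query(lambda name: True)
      total_without_cara = sum_query(lambda name: name != "cara")
      # the difference of two permitted aggregates discloses one individual's value
      print("inferred salary of cara:", total_all - total_without_cara)   # 77000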

  14. Inferring gene regression networks with model trees

    Directory of Open Access Journals (Sweden)

    Aguilar-Ruiz Jesus S

    2010-10-01

    Full Text Available Abstract Background Novel strategies are required in order to handle the huge amount of data produced by microarray technologies. To infer gene regulatory networks, the first step is to find direct regulatory relationships between genes building the so-called gene co-expression networks. They are typically generated using correlation statistics as pairwise similarity measures. Correlation-based methods are very useful in order to determine whether two genes have a strong global similarity but do not detect local similarities. Results We propose model trees as a method to identify gene interaction networks. While correlation-based methods analyze each pair of genes, in our approach we generate a single regression tree for each gene from the remaining genes. Finally, a graph from all the relationships among output and input genes is built taking into account whether the pair of genes is statistically significant. For this reason we apply a statistical procedure to control the false discovery rate. The performance of our approach, named REGNET, is experimentally tested on two well-known data sets: the Saccharomyces cerevisiae and E.coli data sets. First, the biological coherence of the results is tested. Second, the E.coli transcriptional network (in the Regulon database) is used as a control to compare the results to those of a correlation-based method. This experiment shows that REGNET performs more accurately at detecting true gene associations than the Pearson and Spearman zeroth and first-order correlation-based methods. Conclusions REGNET generates gene association networks from gene expression data, and differs from correlation-based methods in that the relationship between one gene and others is calculated simultaneously. Model trees are very useful techniques to estimate the numerical values for the target genes by linear regression functions. They are very often more precise than linear regression models because they can add just different linear
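
    The following sketch conveys the per-gene idea in Python, with a plain regression tree standing in for a model tree and an ad hoc importance cutoff standing in for the paper's FDR-controlled significance test; gene names and expression data are synthetic:

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(2)
      n_samples, genes = 80, ["g1", "g2", "g3", "g4"]
      expr = rng.standard_normal((n_samples, len(genes)))
      # plant one known relationship: g4 is driven by g1 and g2
      expr[:, 3] = 0.9 * expr[:, 0] - 0.7 * expr[:, 1] + 0.1 * rng.standard_normal(n_samples)

      edges = []
      for j, target in enumerate(genes):
          inputs = [i for i in range(len(genes)) if i != j]       # all remaining genes
          tree = DecisionTreeRegressor(max_depth=3, random_state=0)
          tree.fit(expr[:, inputs], expr[:, j])
          for i, imp in zip(inputs, tree.feature_importances_):
              if imp > 0.2:                                       # ad hoc cutoff, no FDR control here
                  edges.append((genes[i], target, round(float(imp), 2)))
      print(edges)   # expect g1 -> g4 and g2 -> g4 among the reported edges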

  15. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing software packages require specific formatted input files and generate output files in various types, yielding practical inconvenience. We developed a tool set, Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats as well as standardize and summarize inference results for four popular local ancestry inference software: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that LAIT provides convenience to run multiple local ancestry inference software. In addition, we evaluated the performance of local ancestry software among different supported software packages, mainly focusing on inference accuracy and computational resources used. We provided a toolkit to facilitate the use of local ancestry inference software, especially for users with limited bioinformatics background.

  16. Forward and backward inference in spatial cognition.

    Directory of Open Access Journals (Sweden)

    Will D Penny

    Full Text Available This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
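
    A minimal discrete sketch of these lower-level computations (made-up numbers, a 1-D corridor of five locations) runs the classic forward and backward passes, combining a motion model with per-step sensory likelihoods:

      import numpy as np

      n_states = 5                       # five locations along a corridor
      T = np.zeros((n_states, n_states)) # motion model: mostly step right, sometimes stay
      for s in range(n_states):
          T[s, min(s + 1, n_states - 1)] += 0.8
          T[s, s] += 0.2
      obs_lik = np.array([               # p(observation_t | location) for 3 time steps
          [0.70, 0.10, 0.10, 0.05, 0.05],
          [0.10, 0.60, 0.20, 0.05, 0.05],
          [0.05, 0.10, 0.60, 0.20, 0.05],
      ])

      # forward pass: combine the motion prediction with each sensory likelihood
      alpha = np.full(n_states, 1.0 / n_states) * obs_lik[0]
      alphas = [alpha / alpha.sum()]
      for t in range(1, len(obs_lik)):
          alpha = (alphas[-1] @ T) * obs_lik[t]
          alphas.append(alpha / alpha.sum())

      # backward pass: propagate evidence from later observations to earlier states
      betas = [np.ones(n_states)]
      for t in range(len(obs_lik) - 1, 0, -1):
          beta = T @ (obs_lik[t] * betas[0])
          betas.insert(0, beta / beta.sum())

      smoothed = [a * b / np.sum(a * b) for a, b in zip(alphas, betas)]
      print("filtered location :", int(np.argmax(alphas[-1])))
      print("smoothed locations:", [int(np.argmax(g)) for g in smoothed])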

  17. Fiducial inference - A Neyman-Pearson interpretation

    NARCIS (Netherlands)

    Salome, D; VonderLinden, W; Dose; Fischer, R; Preuss, R

    1999-01-01

    Fisher's fiducial argument is a tool for deriving inferences in the form of a probability distribution on the parameter space, not based on Bayes's Theorem. Lindley established that in exceptional situations fiducial inferences coincide with posterior distributions; in the other situations fiducial

  18. Uncertainty in prediction and in inference

    NARCIS (Netherlands)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in

  19. Using the Weibull distribution reliability, modeling and inference

    CERN Document Server

    McCool, John I

    2012-01-01

    Understand and utilize the latest developments in Weibull inferential methods While the Weibull distribution is widely used in science and engineering, most engineers do not have the necessary statistical training to implement the methodology effectively. Using the Weibull Distribution: Reliability, Modeling, and Inference fills a gap in the current literature on the topic, introducing a self-contained presentation of the probabilistic basis for the methodology while providing powerful techniques for extracting information from data. The author explains the use of the Weibull distribution

  20. Reinforcement learning or active inference?

    Science.gov (United States)

    Friston, Karl J; Daunizeau, Jean; Kiebel, Stefan J

    2009-07-29

    This paper questions the need for reinforcement learning or control theory when optimising behaviour. We show that it is fairly simple to teach an agent complicated and adaptive behaviours using a free-energy formulation of perception. In this formulation, agents adjust their internal states and sampling of the environment to minimize their free-energy. Such agents learn causal structure in the environment and sample it in an adaptive and self-supervised fashion. This results in behavioural policies that reproduce those optimised by reinforcement learning and dynamic programming. Critically, we do not need to invoke the notion of reward, value or utility. We illustrate these points by solving a benchmark problem in dynamic programming; namely the mountain-car problem, using active perception or inference under the free-energy principle. The ensuing proof-of-concept may be important because the free-energy formulation furnishes a unified account of both action and perception and may speak to a reappraisal of the role of dopamine in the brain.

  1. Reinforcement learning or active inference?

    Directory of Open Access Journals (Sweden)

    Karl J Friston

    2009-07-01

    Full Text Available This paper questions the need for reinforcement learning or control theory when optimising behaviour. We show that it is fairly simple to teach an agent complicated and adaptive behaviours using a free-energy formulation of perception. In this formulation, agents adjust their internal states and sampling of the environment to minimize their free-energy. Such agents learn causal structure in the environment and sample it in an adaptive and self-supervised fashion. This results in behavioural policies that reproduce those optimised by reinforcement learning and dynamic programming. Critically, we do not need to invoke the notion of reward, value or utility. We illustrate these points by solving a benchmark problem in dynamic programming; namely the mountain-car problem, using active perception or inference under the free-energy principle. The ensuing proof-of-concept may be important because the free-energy formulation furnishes a unified account of both action and perception and may speak to a reappraisal of the role of dopamine in the brain.

  2. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist
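
    For the exponential model with a gamma prior on the failure rate, the next-failure-time prediction has a closed form (the Lomax, or Pareto type II, distribution); the small sketch below works through illustrative numbers rather than the report's examples:

      a, b = 2.0, 100.0     # prior on the failure rate: Gamma(shape a, rate-scale b hours)
      n, T = 5, 640.0       # data: n failures observed over total test time T hours
      a_post, b_post = a + n, b + T

      # posterior predictive of the next failure time t is Lomax (Pareto type II):
      #   p(t) = a_post * b_post**a_post / (b_post + t)**(a_post + 1)
      median_next = b_post * (2 ** (1.0 / a_post) - 1)
      mean_next = b_post / (a_post - 1)   # finite because a_post > 1
      print(f"predictive next failure: median {median_next:.1f} h, mean {mean_next:.1f} h")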

  3. Inferring evoked brain connectivity through adaptive perturbation.

    Science.gov (United States)

    Lepage, Kyle Q; Ching, ShiNung; Kramer, Mark A

    2013-04-01

    Inference of functional networks, representing the statistical associations between time series recorded from multiple sensors, has found important applications in neuroscience. However, networks exhibiting time-locked activity between physically independent elements can bias functional connectivity estimates employing passive measurements. Here, a perturbative and adaptive method of inferring network connectivity based on measurement and stimulation, so-called "evoked network connectivity", is introduced. This procedure, employing a recursive Bayesian update scheme, allows principled network stimulation given a current network estimate inferred from all previous stimulations and recordings. The method decouples stimulus and detector design from network inference and can be suitably applied to a wide range of clinical and basic neuroscience related problems. The proposed method demonstrates improved accuracy compared to network inference based on passive observation of node dynamics and an increased rate of convergence relative to network estimation employing a naïve stimulation strategy.
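
    The sketch below is a toy analog of the adaptive loop (my own simplification, not the authors' algorithm): a Beta belief is kept for every directed edge, the next stimulation targets the node whose outgoing edges are least certain, and the evoked responses update the beliefs:

      import numpy as np

      rng = np.random.default_rng(7)
      n = 4
      off_diag = ~np.eye(n, dtype=bool)
      truth = (rng.uniform(size=(n, n)) < 0.3) & off_diag   # hidden directed network
      a = np.ones((n, n))                                   # Beta(a, b) belief per directed edge
      b = np.ones((n, n))

      for _ in range(200):
          var = a * b / ((a + b) ** 2 * (a + b + 1))          # posterior variance of each edge belief
          src = int(np.argmax((var * off_diag).sum(axis=1)))  # stimulate the most uncertain node
          responded = truth[src] & (rng.uniform(size=n) < 0.9)  # noisy evoked responses (misses only)
          a[src] += responded                                 # Beta update from the evoked data
          b[src] += ~responded & off_diag[src]
      print(np.round(np.where(off_diag, a / (a + b), 0.0), 2))  # near 0.9 for true edges, near 0 otherwise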

  4. An Example of Bayesian Inference in Thermal Sciences

    Indian Academy of Sciences (India)

    Admin

    objective, non-informative prior. A subjective but usually informative and useful prior is the Gaussian or normal prior. For example, if the parameter to be retrieved is the specific heat of a solid and if we know that the material is a metal, then one can calculate the mean and standard deviation of the specific heat of all metals ...

  5. Quantum electrodynamics of strong fields

    International Nuclear Information System (INIS)

    Greiner, W.

    1983-01-01

    Quantum Electrodynamics of Strong Fields provides a broad survey of the theoretical and experimental work accomplished, presenting papers by a group of international researchers who have made significant contributions to this developing area. Exploring the quantum theory of strong fields, the volume focuses on the phase transition to a charged vacuum in strong electric fields. The contributors also discuss such related topics as QED at short distances, precision tests of QED, nonperturbative QCD and confinement, pion condensation, and strong gravitational fields. In addition, the volume features a historical paper on the roots of quantum field theory in the history of quantum physics by noted researcher Friedrich Hund

  6. Statistical Inference for Data Adaptive Target Parameters.

    Science.gov (United States)

    Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J

    2016-05-01

    Suppose one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target we partition the sample into V equal-size sub-samples, and use this partitioning to define V splits into an estimation sample (one of the V subsamples) and a corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V sample-specific target parameters. We present an estimator (and corresponding central limit theorem) for this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference in problems that are increasingly addressed by clever, yet ad hoc, pattern finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules, are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
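
    The sketch below illustrates the sample-splitting scheme under toy assumptions (a one-covariate rule whose cutoff is learned on each parameter-generating sample; a simple fold-to-fold standard error stands in for the paper's central-limit-theorem-based inference):

      import numpy as np

      rng = np.random.default_rng(3)
      n, V = 500, 5
      x = rng.uniform(0, 1, n)
      y = 2.0 * (x > 0.6) + rng.standard_normal(n)   # outcome improves past an unknown cutoff

      folds = np.array_split(rng.permutation(n), V)
      estimates = []
      for v in range(V):
          est_idx = folds[v]                                                # estimation sample
          gen_idx = np.concatenate([folds[u] for u in range(V) if u != v])  # parameter-generating sample
          # data-adaptive step: learn the cutoff that maximizes mean outcome above it
          cutoffs = np.linspace(0.1, 0.9, 17)
          best = max(cutoffs, key=lambda c: y[gen_idx][x[gen_idx] > c].mean())
          # evaluation step: mean outcome above the learned cutoff on held-out data
          estimates.append(y[est_idx][x[est_idx] > best].mean())

      est, se = np.mean(estimates), np.std(estimates, ddof=1) / np.sqrt(V)
      print(f"data-adaptive target estimate: {est:.2f} +/- {1.96 * se:.2f}")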

  7. Journal of Earth System Science | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    R V Krishnamurthy. Articles written in Journal of Earth System Science. Volume 109, Issue 1, March 2000, pp. 129-140. Late Glacial and Holocene Paleolimnology of two temperate lakes inferred from sediment organic C chronology. N A Lovan, R V Krishnamurthy.

  8. Bayesian inference of chemical kinetic models from proposed reactions

    KAUST Repository

    Galagali, Nikhil

    2015-02-01

    © 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure-such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.

  9. Inferring the gene network underlying the branching of tomato inflorescence.

    Directory of Open Access Journals (Sweden)

    Laura Astola

    Full Text Available The architecture of tomato inflorescence strongly affects flower production and subsequent crop yield. To understand the genetic activities involved, insight into the underlying network of genes that initiate and control the sympodial growth in the tomato is essential. In this paper, we show how the structure of this network can be derived from available data of the expressions of the involved genes. Our approach starts from employing biological expert knowledge to select the most probable gene candidates behind branching behavior. To find how these genes interact, we develop a stepwise procedure for computational inference of the network structure. Our data consists of expression levels from primary shoot meristems, measured at different developmental stages on three different genotypes of tomato. With the network inferred by our algorithm, we can explain the dynamics corresponding to all three genotypes simultaneously, despite their apparent dissimilarities. We also correctly predict the chronological order of expression peaks for the main hubs in the network. Based on the inferred network, using optimal experimental design criteria, we are able to suggest an informative set of experiments for further investigation of the mechanisms underlying branching behavior.

  10. Strong WW Interaction at LHC

    Energy Technology Data Exchange (ETDEWEB)

    Pelaez, Jose R

    1998-12-14

    We present a brief pedagogical introduction to the Effective Electroweak Chiral Lagrangians, which provide a model independent description of the WW interactions in the strong regime. When it is complemented with some unitarization or a dispersive approach, this formalism allows the study of the general strong scenario expected at the LHC, including resonances.

  11. Strong spin-photon coupling in silicon.

    Science.gov (United States)

    Samkharadze, N; Zheng, G; Kalhor, N; Brousse, D; Sammak, A; Mendes, U C; Blais, A; Scappucci, G; Vandersypen, L M K

    2018-03-09

    Long coherence times of single spins in silicon quantum dots make these systems highly attractive for quantum computation, but how to scale up spin qubit systems remains an open question. As a first step to address this issue, we demonstrate the strong coupling of a single electron spin and a single microwave photon. The electron spin is trapped in a silicon double quantum dot, and the microwave photon is stored in an on-chip high-impedance superconducting resonator. The electric field component of the cavity photon couples directly to the charge dipole of the electron in the double dot, and indirectly to the electron spin, through a strong local magnetic field gradient from a nearby micromagnet. Our results provide a route to realizing large networks of quantum dot-based spin qubit registers. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  12. Order statistics & inference estimation methods

    CERN Document Server

    Balakrishnan, N

    1991-01-01

    The literature on order statistics and inference is quite extensive and covers a large number of fields, but most of it is dispersed throughout numerous publications. This volume is the consolidation of the most important results and places an emphasis on estimation. Both theoretical and computational procedures are presented to meet the needs of researchers, professionals, and students. The methods of estimation discussed are well-illustrated with numerous practical examples from both the physical and life sciences, including sociology, psychology, and electrical and chemical engineering. A co

  13. Bayesian Inference in Statistical Analysis

    CERN Document Server

    Box, George E P

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Rob

  14. Strong-back safety latch

    International Nuclear Information System (INIS)

    DeSantis, G.N.

    1995-01-01

    The calculation assesses the integrity of the safety latch that will hold the strong-back to the pump during lifting. The safety latch will be welded to the strong-back and will latch to a 1.5-in. dia cantilever rod welded to the pump baseplate. The static and dynamic analysis shows that the safety latch will hold the strong-back to the pump if the friction clamps fail and the pump becomes free from the strong-back. Thus, the safety latch will meet the requirements of the Lifting and Rigging Manual for under-the-hook lifting for static loading; it can withstand shock loads from the strong-back falling 0.25 inch

  15. Strong-back safety latch

    Energy Technology Data Exchange (ETDEWEB)

    DeSantis, G.N.

    1995-03-06

    The calculation assesses the integrity of the safety latch that will hold the strong-back to the pump during lifting. The safety latch will be welded to the strong-back and will latch to a 1.5-in. dia cantilever rod welded to the pump baseplate. The static and dynamic analysis shows that the safety latch will hold the strong-back to the pump if the friction clamps fail and the pump becomes free from the strong-back. Thus, the safety latch will meet the requirements of the Lifting and Rigging Manual for under-the-hook lifting for static loading; it can withstand shock loads from the strong-back falling 0.25 inch.

  16. All of statistics a concise course in statistical inference

    CERN Document Server

    Wasserman, Larry

    2004-01-01

    This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...

  17. Inference and the introductory statistics course

    Science.gov (United States)

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-10-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its hypothetical probabilistic reasoning process is examined in some depth. We argue that the revolution in the teaching of inference must begin. We also discuss some perplexing issues, problematic areas and some new insights into language conundrums associated with introducing the logic of inference through randomization methods.
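
    A minimal example of the computer-based approach being advocated is the randomization (permutation) test sketched below with made-up measurements: re-randomize the group labels many times and count how often the shuffled difference in means beats the observed one:

      import numpy as np

      rng = np.random.default_rng(4)
      group_a = np.array([23.1, 25.4, 26.0, 22.8, 24.9, 27.2])   # made-up measurements
      group_b = np.array([21.0, 22.5, 20.8, 23.0, 21.9, 22.2])
      observed = group_a.mean() - group_b.mean()

      pooled = np.concatenate([group_a, group_b])
      count = 0
      for _ in range(10000):
          rng.shuffle(pooled)                       # re-randomize the group labels
          diff = pooled[:6].mean() - pooled[6:].mean()
          if abs(diff) >= abs(observed):
              count += 1
      print("randomization p-value:", count / 10000)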

  18. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...... by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA--generated propositional instances have thousands of variables, and whose jointrees have clusters...

  19. Examining Associations between Reading Motivation and Inference Generation beyond Reading Comprehension Skill

    Science.gov (United States)

    Clinton, Virginia

    2015-01-01

    The purpose of this study was to examine the associations between reading motivation and inference generation while reading. Undergraduate participants (N = 69) read two science articles while thinking aloud, completed a standardized reading comprehension assessment, and self reported their habitual reading motivation. Findings indicate that…

  20. Titanium: light, strong, and white

    Science.gov (United States)

    Woodruff, Laurel; Bedinger, George

    2013-01-01

    Titanium (Ti) is a strong silver-gray metal that is highly resistant to corrosion and is chemically inert. It is as strong as steel but 45 percent lighter, and it is twice as strong as aluminum but only 60 percent heavier. Titanium dioxide (TiO2) has a very high refractive index, which means that it has high light-scattering ability. As a result, TiO2 imparts whiteness, opacity, and brightness to many products. ...Because of the unique physical properties of titanium metal and the whiteness provided by TiO2, titanium is now used widely in modern industrial societies.

  1. Inferring Causalities in Landscape Genetics: An Extension of Wright's Causal Modeling to Distance Matrices.

    Science.gov (United States)

    Fourtune, Lisa; Prunier, Jérôme G; Paz-Vinas, Ivan; Loot, Géraldine; Veyssière, Charlotte; Blanchet, Simon

    2018-04-01

    Identifying landscape features that affect functional connectivity among populations is a major challenge in fundamental and applied sciences. Landscape genetics combines landscape and genetic data to address this issue, with the main objective of disentangling direct and indirect relationships among an intricate set of variables. Causal modeling has strong potential to address the complex nature of landscape genetic data sets. However, this statistical approach was not initially developed to address the pairwise distance matrices commonly used in landscape genetics. Here, we aimed to extend the applicability of two causal modeling methods (maximum-likelihood path analysis and the directional separation test) by developing statistical approaches aimed at handling distance matrices and improving functional connectivity inference. Using simulations, we showed that these approaches greatly improved the robustness of the absolute (using a frequentist approach) and relative (using an information-theoretic approach) fits of the tested models. We used an empirical data set combining genetic information on a freshwater fish species (Gobio occitaniae) and detailed landscape descriptors to demonstrate the usefulness of causal modeling to identify functional connectivity in wild populations. Specifically, we demonstrated how direct and indirect relationships involving altitude, temperature, and oxygen concentration influenced within- and between-population genetic diversity of G. occitaniae.

  2. Experimental evidence for circular inference in schizophrenia

    Science.gov (United States)

    Jardri, Renaud; Duverne, Sandrine; Litvinova, Alexandra S.; Denève, Sophie

    2017-01-01

    Schizophrenia (SCZ) is a complex mental disorder that may result in some combination of hallucinations, delusions and disorganized thinking. Here SCZ patients and healthy controls (CTLs) report their level of confidence on a forced-choice task that manipulated the strength of sensory evidence and prior information. Neither group's responses can be explained by simple Bayesian inference. Rather, individual responses are best captured by a model with different degrees of circular inference. Circular inference refers to a corruption of sensory data by prior information and vice versa, leading us to "see what we expect" (through descending loops), to "expect what we see" (through ascending loops) or both. Ascending loops are stronger for SCZ than CTLs and correlate with the severity of positive symptoms. Descending loops correlate with the severity of negative symptoms. Both loops correlate with disorganized symptoms. The findings suggest that circular inference might mediate the clinical manifestations of SCZ.
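
    The toy model below is one reading of the loop idea (a sketch, not the authors' fitted model): sensory and prior log-odds are fused as in Bayesian inference, but ascending and descending loop gains over-count one source of evidence or the other:

      import numpy as np

      def confidence(sensory, prior, a_s=0.0, a_p=0.0):
          # with a_s = a_p = 0 this is plain Bayesian fusion of log-odds;
          # loop gains a_s, a_p over-count sensory or prior evidence
          L = (1 + a_s) * sensory + (1 + a_p) * prior
          return 1.0 / (1.0 + np.exp(-L))           # confidence that option A is correct

      s, p = 0.8, -0.5   # weak sensory evidence for A, prior evidence against A
      print("Bayesian        :", round(confidence(s, p), 3))
      print("ascending loops :", round(confidence(s, p, a_s=3.0), 3))  # 'expect what we see'
      print("descending loops:", round(confidence(s, p, a_p=3.0), 3))  # 'see what we expect'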

  3. Artificial Hydrocarbon Networks Fuzzy Inference System

    Directory of Open Access Journals (Sweden)

    Hiram Ponce

    2013-01-01

    Full Text Available This paper presents a novel fuzzy inference model based on artificial hydrocarbon networks, a computational algorithm for modeling problems based on chemical hydrocarbon compounds. In particular, the proposed fuzzy-molecular inference model (FIM-model) uses molecular units of information to partition the output space in the defuzzification step. Moreover, these molecules are linguistic units that can be partially understandable due to the organized structure of the topology and metadata parameters involved in artificial hydrocarbon networks. In addition, a position controller for a direct current (DC) motor was implemented using the proposed FIM-model in type-1 and type-2 fuzzy inference systems. Experimental results demonstrate that the fuzzy-molecular inference model can be applied as an alternative to type-2 Mamdani fuzzy control systems because the set of molecular units can deal with dynamic uncertainties mostly present in real-world control applications.

  4. SEBINI: Software Environment for BIological Network Inference.

    Science.gov (United States)

    Taylor, Ronald C; Shah, Anuj; Treatman, Charles; Blevins, Meridith

    2006-11-01

    The Software Environment for BIological Network Inference (SEBINI) has been created to provide an interactive environment for the deployment and evaluation of algorithms used to reconstruct the structure of biological regulatory and interaction networks. SEBINI can be used to compare and train network inference methods on artificial networks and simulated gene expression perturbation data. It also allows the analysis within the same framework of experimental high-throughput expression data using the suite of (trained) inference methods; hence SEBINI should be useful to software developers wishing to evaluate, compare, refine or combine inference techniques, and to bioinformaticians analyzing experimental data. SEBINI provides a platform that aids in more accurate reconstruction of biological networks, with less effort, in less time. A demonstration website is located at https://www.emsl.pnl.gov/NIT/NIT.html. The Java source code and PostgreSQL database schema are available freely for non-commercial use.

  5. Inferring Domain Plans in Question-Answering

    National Research Council Canada - National Science Library

    Pollack, Martha E

    1986-01-01

    The importance of plan inference in models of conversation has been widely noted in the computational-linguistics literature, and its incorporation in question-answering systems has enabled a range...

  6. Quantum centipedes with strong global constraint

    Science.gov (United States)

    Grange, Pascal

    2017-06-01

    A centipede made of N quantum walkers on a one-dimensional lattice is considered. The distance between two consecutive legs is either one or two lattice spacings, and a global constraint is imposed: the maximal distance between the first and last leg is N  +  1. This is the strongest global constraint compatible with walking. For an initial value of the wave function corresponding to a localized configuration at the origin, the probability law of the first leg of the centipede can be expressed in closed form in terms of Bessel functions. The dispersion relation and the group velocities are worked out exactly. Their maximal group velocity goes to zero when N goes to infinity, which is in contrast with the behaviour of group velocities of quantum centipedes without global constraint, which were recently shown by Krapivsky, Luck and Mallick to give rise to ballistic spreading of extremal wave-front at non-zero velocity in the large-N limit. The corresponding Hamiltonians are implemented numerically, based on a block structure of the space of configurations corresponding to compositions of the integer N. The growth of the maximal group velocity when the strong constraint is gradually relaxed is explored, and observed to be linear in the density of gaps allowed in the configurations. Heuristic arguments are presented to infer that the large-N limit of the globally constrained model can yield finite group velocities provided the allowed number of gaps is a finite fraction of N.

  7. Scientific inference learning from data

    CERN Document Server

    Vaughan, Simon

    2013-01-01

    Providing the knowledge and practical experience to begin analysing scientific data, this book is ideal for physical sciences students wishing to improve their data handling skills. The book focuses on explaining and developing the practice and understanding of basic statistical analysis, concentrating on a few core ideas, such as the visual display of information, modelling using the likelihood function, and simulating random data. Key concepts are developed through a combination of graphical explanations, worked examples, example computer code and case studies using real data. Students will develop an understanding of the ideas behind statistical methods and gain experience in applying them in practice. Further resources are available at www.cambridge.org/9781107607590, including data files for the case studies so students can practise analysing data, and exercises to test students' understanding.

  8. Artificial Hydrocarbon Networks Fuzzy Inference System

    OpenAIRE

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2013-01-01

    This paper presents a novel fuzzy inference model based on artificial hydrocarbon networks, a computational algorithm for modeling problems based on chemical hydrocarbon compounds. In particular, the proposed fuzzy-molecular inference model (FIM-model) uses molecular units of information to partition the output space in the defuzzification step. Moreover, these molecules are linguistic units that can be partially understandable due to the organized structure of the topology and metadata param...

  9. Efficient algorithms for conditional independence inference

    Czech Academy of Sciences Publication Activity Database

    Bouckaert, R.; Hemmecke, R.; Lindner, S.; Studený, Milan

    2010-01-01

    Roč. 11, č. 1 (2010), s. 3453-3479 ISSN 1532-4435 R&D Projects: GA ČR GA201/08/0539; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional independence inference * linear programming approach Subject RIV: BA - General Mathematics Impact factor: 2.949, year: 2010 http://library.utia.cas.cz/separaty/2010/MTR/studeny-efficient algorithms for conditional independence inference.pdf

  10. Probabilistic inferences related to the measurement process

    International Nuclear Information System (INIS)

    Rossi, G. B.

    2010-01-01

    In measurement, indications from a measuring system are acquired and, on the basis of them, some inference about the measurand is made. The final result may be the assignment of a probability distribution for the possible values of the measurand. We discuss the logical structure of such an inference and some of its epistemological consequences. In particular, we propose a new solution to the problem of systematic effects in measurement.
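
    For the simplest Gaussian case the inference can be written in closed form; the sketch below (illustrative numbers, all variances assumed known) assigns a posterior to the measurand while marginalizing a zero-mean Gaussian systematic offset shared by the repeated indications:

      import numpy as np

      m0, s0 = 10.0, 2.0   # Gaussian prior for the measurand (e.g. in mV)
      sb = 0.3             # std of a zero-mean systematic offset shared by all indications
      se = 0.5             # std of the random indication noise
      y = np.array([10.42, 10.38, 10.51])   # repeated indications

      # marginalizing the offset b ~ N(0, sb^2): mean(y) | mu ~ N(mu, sb^2 + se^2/n)
      n = len(y)
      var_lik = sb ** 2 + se ** 2 / n
      var_post = 1.0 / (1.0 / s0 ** 2 + 1.0 / var_lik)
      mean_post = var_post * (m0 / s0 ** 2 + y.mean() / var_lik)
      print(f"measurand: {mean_post:.3f} +/- {np.sqrt(var_post):.3f}")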

  11. Polynomial Chaos Surrogates for Bayesian Inference

    KAUST Repository

    Le Maitre, Olivier

    2016-01-06

    Bayesian inference is a popular probabilistic method to solve inverse problems, such as the identification of a field parameter in a PDE model. The inference relies on the Bayes rule to update the prior density of the sought field from observations, and derive its posterior distribution. In most cases the posterior distribution has no explicit form and has to be sampled, for instance using a Markov-Chain Monte Carlo method. In practice the prior field parameter is decomposed and truncated (e.g. by means of Karhunen-Loève decomposition) to recast the inference problem into the inference of a finite number of coordinates. Although proved effective in many situations, the Bayesian inference as sketched above faces several difficulties requiring improvements. First, sampling the posterior can be an extremely costly task as it requires multiple resolutions of the PDE model for different values of the field parameter. Second, when the observations are not very informative, the inferred parameter field can depend strongly on its prior, which can be somewhat arbitrary. These issues have motivated the introduction of reduced modeling or surrogates for the (approximate) determination of the parametrized PDE solution and of hyperparameters in the description of the prior field. Our contribution focuses on recent developments in these two directions: the acceleration of the posterior sampling by means of Polynomial Chaos expansions and the efficient treatment of parametrized covariance functions for the prior field. We also discuss the possibility of making such an approach adaptive to further improve its efficiency.
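
    The sketch below compresses the first direction to one dimension (a cheap algebraic forward model standing in for a PDE solve, a standard Gaussian parameter): the model is projected onto a Hermite polynomial chaos basis by Gauss-Hermite quadrature, and a Metropolis chain then samples the posterior using only the surrogate:

      import math
      import numpy as np
      from numpy.polynomial import hermite_e as H

      forward = lambda q: np.exp(0.5 * q)   # cheap stand-in for an expensive PDE solve
      P = 8                                 # polynomial chaos order

      # Galerkin projection: c_k = E[forward(Q) He_k(Q)] / k!  with Q ~ N(0, 1)
      nodes, weights = H.hermegauss(40)
      weights = weights / np.sqrt(2 * np.pi)          # normalize to the N(0, 1) density
      coeffs = np.array([
          np.sum(weights * forward(nodes) * H.hermeval(nodes, np.eye(P + 1)[k])) / math.factorial(k)
          for k in range(P + 1)
      ])
      surrogate = lambda q: H.hermeval(q, coeffs)     # evaluates in microseconds

      # Metropolis on the surrogate posterior for data y = forward(0.7) + noise
      y_obs, sigma = forward(0.7) + 0.05, 0.1
      log_post = lambda q: -0.5 * q ** 2 - 0.5 * ((y_obs - surrogate(q)) / sigma) ** 2
      rng = np.random.default_rng(5)
      q, chain = 0.0, []
      for _ in range(5000):
          q_new = q + 0.5 * rng.standard_normal()
          if np.log(rng.uniform()) < log_post(q_new) - log_post(q):
              q = q_new
          chain.append(q)
      print("posterior mean of q:", round(float(np.mean(chain[1000:])), 3))   # near 0.75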

  12. On the criticality of inferred models

    International Nuclear Information System (INIS)

    Mastromatteo, Iacopo; Marsili, Matteo

    2011-01-01

    Advanced inference techniques allow one to reconstruct a pattern of interaction from high dimensional data sets, from probing simultaneously thousands of units of extended systems—such as cells, neural tissues and financial markets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to singular values of parameters, akin to critical points in physics where phase transitions occur. These are points where the response of physical systems to external perturbations, as measured by the susceptibility, is very large and diverges in the limit of infinite size. We show that the reparameterization invariant metrics in the space of probability distributions of these models (the Fisher information) are directly related to the susceptibility of the inferred model. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. This region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time scales naturally yield models which are close to criticality

  13. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of the Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data with an overview of the Bayesian signal detection and estimation methods and demonstration by a couple of simplified examples.
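
    In the same simplified spirit as the article's demonstrations (my own toy, not the authors' code), the sketch below runs a Metropolis sampler for the amplitude and frequency of a sinusoidal signal buried in Gaussian noise:

      import numpy as np

      rng = np.random.default_rng(6)
      t = np.linspace(0, 1, 256)
      true_A, true_f, sigma = 0.5, 12.0, 1.0
      data = true_A * np.sin(2 * np.pi * true_f * t) + sigma * rng.standard_normal(t.size)

      def log_post(A, f):
          if not (0 < A < 5 and 5 < f < 20):   # flat priors on a box
              return -np.inf
          resid = data - A * np.sin(2 * np.pi * f * t)
          return -0.5 * np.sum(resid ** 2) / sigma ** 2

      A, f = 1.0, 12.0   # chain started near the true frequency to keep the toy short
      lp, chain = log_post(A, f), []
      for _ in range(20000):
          A_new, f_new = A + 0.05 * rng.standard_normal(), f + 0.05 * rng.standard_normal()
          lp_new = log_post(A_new, f_new)
          if np.log(rng.uniform()) < lp_new - lp:
              A, f, lp = A_new, f_new, lp_new
          chain.append((A, f))
      post = np.array(chain[5000:])
      print(f"posterior means: A = {post[:, 0].mean():.3f}, f = {post[:, 1].mean():.3f}")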

  14. Ceres' Geophysical Evolution Inferred from Dawn Data

    Science.gov (United States)

    Castillo-Rogez, Julie; Bowling, Timothy; Ermakov, Anton I.; Fu, Roger; Park, Ryan; Raymond, Carol; De Sanctis, Maria Cristina; Ammannito, Eleonora; Ruesch, Ottaviano; Prettyman, Thomas H.; McSween, Harry Y.; Toplis, Michael J.; Russell, Christopher T.; Dawn Team

    2016-10-01

    If Ceres formed as an ice-rich body, as suggested by its low density and the detection of ammoniated phyllosilicates [1], then it should have differentiated an ice-dominated shell, analogous to large icy satellites [2]. Instead, Dawn observations revealed an enrichment of Ceres' shell in strong materials, either a rocky component and/or salts and gas hydrates [3, 4, 5, 6]. We have explored several scenarios for the emplacement of Ceres' surface. Endogenic processes cannot account for its overall homogeneity. Instead we suggest that Ceres differentiated an icy shell upon freezing of its early ocean that was removed as a consequence of frequent exposure by impacting after the dwarf planet migrated from a cold accretional environment to the warmer outer main belt (or when the solar nebula dissipated, if Ceres formed in situ). This scenario implies that Ceres' current surface represents the interface between the original ice shell and the top of the frozen ocean, a region that is extremely rich chemistry-wise, as illustrated by the mineralogical observations returned by Dawn [7]. Thermal modeling shows that the shell could remain warm over the long term and offer a setting for the generation of brines that may be responsible for the emplacement of Ahuna Mons [8] and Occator's bright spots [7] on an otherwise homogeneous surface [9]. An important implication is that Ceres' surface offers an analog for better understanding the deep interior and chemical evolution of large ice-rich bodies. References: [1] De Sanctis et al., Nature, 2015; [2] McCord and Sotin, Journal of Geophysical Research, 2005; [3] Park et al., Nature, 2016 (in press); [4] Hiesinger et al., Science (submitted); [5] Bland et al., Nature Geoscience, 2016 (in press); [6] Fu et al., AGU Fall Meeting, 2015; [7] De Sanctis et al., Nature, 2016 (in press); [8] Ruesch et al., Science, in revision; [9] Ammannito et al., Science, 2016 (accepted). Acknowledgements: Part of this work is being carried out at the Jet

  15. The SNAP Strong Lens Survey

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, P.

    2005-01-03

    Basic considerations of lens detection and identification indicate that wide-field surveys of the types planned for weak lensing and Type Ia SNe with SNAP are close to optimal for the optical detection of strong lenses. Such a "piggy-back" survey might be expected, even pessimistically, to provide a catalogue of a few thousand new strong lenses, with the numbers dominated by systems of faint blue galaxies lensed by foreground ellipticals. After sketching out our strategy for detecting and measuring these galaxy lenses using the SNAP images, we discuss some of the scientific applications of such a large sample of gravitational lenses: in particular we comment on the partition of information between lens structure, the source population properties and cosmology. Understanding this partitioning is key to assessing strong lens cosmography's value as a cosmological probe.

  16. Strong coupling phase in QED

    International Nuclear Information System (INIS)

    Aoki, Ken-ichi

    1988-01-01

    The existence of a strong coupling phase in QED has been suggested in solutions of the Schwinger-Dyson equation and in Monte Carlo simulations of lattice QED. In this article we recapitulate the previous arguments and formulate the problem in the modern framework of renormalization theory, Wilsonian renormalization. This scheme of renormalization gives the best understanding of the basic structure of a field theory, especially when it has a multi-phase structure. We resolve some misleading arguments in the previous literature. We then set up a strategy to attack the strong coupling phase, if any exists, and describe a trial: a coupled Schwinger-Dyson equation. A possible picture of the strong coupling phase of QED is presented. (author)

  17. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  18. Inferring Mathematical Equations Using Crowdsourcing.

    Directory of Open Access Journals (Sweden)

    Szymon Wasik

    Full Text Available Crowdsourcing, understood as outsourcing work to a large network of people in the form of an open call, has been utilized successfully many times, including a very interesting concept involving the implementation of computer games with the objective of solving a scientific problem by employing users to play a game, so-called crowdsourced serious games. Our main objective was to verify whether such an approach could be successfully applied to the discovery of mathematical equations that explain experimental data gathered during the observation of a given dynamic system. Moreover, we wanted to compare it with an approach based on artificial intelligence that uses symbolic regression to find such formulae automatically. To achieve this, we designed and implemented an Internet game in which players attempt to design a spaceship representing an equation that models the observed system. The game was designed while considering that it should be easy to use for people without strong mathematical backgrounds. Moreover, we tried to make use of the collective intelligence observed in crowdsourced systems by enabling many players to collaborate on a single solution. The idea was tested on several hundred players playing almost 10,000 games and by conducting a user opinion survey. The results prove that the proposed solution has very high potential. The function generated during weeklong tests was almost as precise as the analytical solution of the model of the system and, up to a certain complexity level of the formulae, it explained the data better than the solution generated automatically by Eureqa, the leading software application for the implementation of symbolic regression. Moreover, we observed benefits of using crowdsourcing; the chain of consecutive solutions that led to the best solution was obtained by the continuous collaboration of several players.

  19. Inferring Mathematical Equations Using Crowdsourcing.

    Science.gov (United States)

    Wasik, Szymon; Fratczak, Filip; Krzyskow, Jakub; Wulnikowski, Jaroslaw

    2015-01-01

    Crowdsourcing, understood as outsourcing work to a large network of people in the form of an open call, has been utilized successfully many times, including a very interesting concept involving the implementation of computer games with the objective of solving a scientific problem by employing users to play a game, so-called crowdsourced serious games. Our main objective was to verify whether such an approach could be successfully applied to the discovery of mathematical equations that explain experimental data gathered during the observation of a given dynamic system. Moreover, we wanted to compare it with an approach based on artificial intelligence that uses symbolic regression to find such formulae automatically. To achieve this, we designed and implemented an Internet game in which players attempt to design a spaceship representing an equation that models the observed system. The game was designed while considering that it should be easy to use for people without strong mathematical backgrounds. Moreover, we tried to make use of the collective intelligence observed in crowdsourced systems by enabling many players to collaborate on a single solution. The idea was tested on several hundred players playing almost 10,000 games and by conducting a user opinion survey. The results prove that the proposed solution has very high potential. The function generated during weeklong tests was almost as precise as the analytical solution of the model of the system and, up to a certain complexity level of the formulae, it explained the data better than the solution generated automatically by Eureqa, the leading software application for the implementation of symbolic regression. Moreover, we observed benefits of using crowdsourcing; the chain of consecutive solutions that led to the best solution was obtained by the continuous collaboration of several players.

  20. Estimating detection rates for the LIGO-Virgo search for gravitational-wave burst counterparts to gamma-ray bursts using inferred local GRB rates

    International Nuclear Information System (INIS)

    Leonor, I; Frey, R; Sutton, P J; Jones, G; Marka, S; Marka, Z

    2009-01-01

    One of the ongoing searches performed using the LIGO-Virgo network of gravitational-wave interferometers is the search for gravitational-wave burst (GWB) counterparts to gamma-ray bursts (GRBs). This type of analysis makes use of GRB time and position information from gamma-ray satellite detectors to trigger the GWB search, and the GWB detection rates possible for such an analysis thus strongly depend on the GRB detection efficiencies of the satellite detectors. Using local GRB rate densities inferred from observations reported in the scientific literature, we calculate estimates of the GWB detection rates for different configurations of the LIGO-Virgo network for this type of analysis.
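
    A back-of-envelope sketch of the rate estimate described here (the structure of the calculation only; all numbers below are placeholders, not values from the paper): the expected number of joint detections is the local GRB rate density times the GW-sensitive volume, the observation time, and the satellite detection efficiency.

      import math

      rho_grb = 0.5e-9   # assumed local GRB rate density [Mpc^-3 yr^-1]
      horizon = 50.0     # assumed GW detection horizon for the source class [Mpc]
      t_obs   = 1.0      # observation time [yr]
      f_sat   = 0.6      # assumed fraction of GRBs caught by satellite detectors

      volume = 4.0 / 3.0 * math.pi * horizon**3      # sensitive volume [Mpc^3]
      expected = rho_grb * volume * t_obs * f_sat    # expected joint detections
      print(f"expected GWB-GRB detections: {expected:.4f} per year")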

  1. Deep Learning for Population Genetic Inference.

    Directory of Open Access Journals (Sweden)

    Sara Sheehan

    2016-03-01

    Full Text Available Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.
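
    A minimal sketch of the likelihood-free idea (a stand-in of my own, not the authors' pipeline): simulate parameters, compute summary statistics, and train a neural network to map statistics back to parameters. The "simulator" below is a toy placeholder; a real application would use a coalescent simulator and hundreds of statistics.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)

      def simulate_stats(theta, n_sites=200):
          # toy stand-in for a population-genetic simulator: theta scales diversity
          data = rng.poisson(theta, size=n_sites)
          return np.array([data.mean(), data.var(), (data == 0).mean()])

      thetas = rng.uniform(0.5, 10.0, size=2000)
      X = np.array([simulate_stats(th) for th in thetas])   # summary statistics
      y = thetas                                            # parameter of interest

      model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
      model.fit(X[:1500], y[:1500])
      print("held-out R^2:", model.score(X[1500:], y[1500:]))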

  2. Deep Learning for Population Genetic Inference

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908

  3. Science Opens Doors

    Science.gov (United States)

    Smyth, Steve; Smyth, Jen

    2016-01-01

    Science Opens Doors is the creation of Clive Thompson of the Horners' Livery Company. The Science Opens Doors project philosophy is strongly based upon the King's College London ASPIRES project, which established that children like doing science in junior school (ages 7-11), but that by the age of 12-14 they are firmly against becoming scientists.…

  4. Strong Decomposition of Random Variables

    DEFF Research Database (Denmark)

    Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.

    2007-01-01

    A random variable X is strongly decomposable if X=Y+Z where Y=Φ(X) and Z=X-Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability...

  5. Strong interaction at finite temperature

    Indian Academy of Sciences (India)

    Abstract. We review two methods discussed in the literature to determine the effective parameters of strongly interacting particles as they move through a heat bath. The first one is the general method of chiral perturbation theory, which may be readily applied to this problem. The other is the method of thermal QCD sum rules ...

  6. Relevance of different prior knowledge sources for inferring gene interaction networks.

    Science.gov (United States)

    Olsen, Catharina; Bontempi, Gianluca; Emmert-Streib, Frank; Quackenbush, John; Haibe-Kains, Benjamin

    2014-01-01

    When inferring networks from high-throughput genomic data, one of the main challenges is the subsequent validation of these networks. In the best case scenario, the true network is partially known from previous research results published in structured databases or research articles. Traditionally, inferred networks are validated against these known interactions. Whenever the recovery rate is gauged to be high enough, subsequent high-scoring but unknown inferred interactions are deemed good candidates for further experimental validation. Such a validation framework therefore strongly depends on the quantity and quality of published interactions and presents serious pitfalls: (1) the known interactions available for the studied problem might be sparse; (2) quantitatively comparing different inference algorithms is not trivial; and (3) the use of these known interactions for validation prevents their integration into the inference procedure. The latter is particularly relevant, as it has recently been shown that integration of priors during network inference significantly improves the quality of inferred networks. To overcome these problems when validating inferred networks, we recently proposed a data-driven validation framework based on single gene knock-down experiments. Using this framework, we were able to demonstrate the benefits of integrating prior knowledge and expression data. In this paper we used this framework to assess the quality of different sources of prior knowledge on their own and in combination with different genomic data sets in colorectal cancer. We observed that most prior sources lead to significant F-scores. Furthermore, their integration with genomic data leads to a significant increase in F-scores, especially for priors extracted from full-text PubMed articles, known co-expression modules and genetic interactions. Lastly, we observed that the results are consistent for three different data sets: experimental knock-down data and two
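
    For concreteness, a minimal sketch (assumed conventions of mine, not the paper's code) of the F-score computation used to judge an inferred network against a set of known interactions:

      def f_score(inferred_edges, known_edges):
          inferred, known = set(inferred_edges), set(known_edges)
          tp = len(inferred & known)            # correctly recovered interactions
          if tp == 0:
              return 0.0
          precision = tp / len(inferred)
          recall = tp / len(known)
          return 2 * precision * recall / (precision + recall)

      inferred = [("TP53", "MDM2"), ("MYC", "MAX"), ("KRAS", "BRAF")]   # illustrative
      known    = [("TP53", "MDM2"), ("MYC", "MAX"), ("EGFR", "GRB2")]   # illustrative
      print("F-score:", round(f_score(inferred, known), 3))             # 0.667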

  7. Phylogeny and Divergence Times of Lemurs Inferred with Recent and Ancient Fossils in the Tree.

    Science.gov (United States)

    Herrera, James P; Dávalos, Liliana M

    2016-09-01

    Paleontological and neontological systematics seek to answer evolutionary questions with different data sets. Phylogenies inferred for combined extant and extinct taxa provide novel insights into the evolutionary history of life. Primates have an extensive, diverse fossil record, and molecular data for living and extinct taxa are rapidly becoming available. We used two models to infer the phylogeny and divergence times for living and fossil primates: tip-dating (TD) and the fossilized birth-death process (FBD). We collected new morphological data, especially on the living and extinct endemic lemurs of Madagascar. We combined the morphological data with published DNA sequences to infer near-complete (88% of lemurs) time-calibrated phylogenies. The results suggest that primates originated around the Cretaceous-Tertiary boundary, slightly earlier than indicated by the fossil record and later than previously inferred from molecular data alone. We infer novel relationships among extinct lemurs, and strong support for relationships that were previously unresolved. Dates inferred with TD were significantly older than those inferred with FBD, most likely related to an assumption of a uniform branching process in the TD compared with a birth-death process assumed in the FBD. This is the first study to combine morphological and DNA sequence data from extinct and extant primates to infer evolutionary relationships and divergence times, and our results shed new light on the tempo of lemur evolution and the efficacy of combined phylogenetic analyses.

  8. Strong-strong beam-beam simulation on parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Qiang, Ji

    2004-08-02

    The beam-beam interaction puts a strong limit on the luminosity of the high energy storage ring colliders. At the interaction points, the electromagnetic fields generated by one beam focus or defocus the opposite beam. This can cause beam blowup and a reduction of luminosity. An accurate simulation of the beam-beam interaction is needed to help optimize the luminosity in high energy colliders.

  9. Strong-strong beam-beam simulation on parallel computer

    International Nuclear Information System (INIS)

    Qiang, Ji

    2004-01-01

    The beam-beam interaction puts a strong limit on the luminosity of the high energy storage ring colliders. At the interaction points, the electromagnetic fields generated by one beam focus or defocus the opposite beam. This can cause beam blowup and a reduction of luminosity. An accurate simulation of the beam-beam interaction is needed to help optimize the luminosity in high energy colliders.

  10. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  11. Statistical inference for noisy nonlinear ecological dynamic systems.

    Science.gov (United States)

    Wood, Simon N

    2010-08-26

    Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
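
    A minimal sketch of the synthetic-likelihood recipe under stated assumptions (the toy map and statistics below are stand-ins of mine, not the paper's models): simulate the system many times at a candidate parameter, fit a Gaussian to the summary statistics, and evaluate the observed statistics under it.

      import numpy as np

      rng = np.random.default_rng(2)

      def simulate(theta, n=200):
          # toy noisy Ricker-like map standing in for an ecological model
          x = np.empty(n); x[0] = 1.0
          for t in range(1, n):
              x[t] = x[t-1] * np.exp(theta * (1 - x[t-1])) * np.exp(0.1 * rng.normal())
          return x

      def summaries(x):
          # phase-insensitive statistics: moments and lag-1 autocorrelation
          return np.array([x.mean(), x.std(), np.corrcoef(x[:-1], x[1:])[0, 1]])

      def synthetic_loglik(theta, s_obs, n_sim=100):
          S = np.array([summaries(simulate(theta)) for _ in range(n_sim)])
          mu, cov = S.mean(axis=0), np.cov(S.T) + 1e-8 * np.eye(len(s_obs))
          diff = s_obs - mu
          sign, logdet = np.linalg.slogdet(cov)
          return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet)

      s_obs = summaries(simulate(2.0))          # pretend these are the observed data
      for theta in (1.0, 2.0, 3.0):
          print(theta, round(synthetic_loglik(theta, s_obs), 2))

    In the full method this synthetic likelihood is explored with a Markov chain Monte Carlo sampler rather than evaluated on a handful of points.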

  12. Single board system for fuzzy inference

    Science.gov (United States)

    Symon, James R.; Watanabe, Hiroyuki

    1991-01-01

    The very large scale integration (VLSI) implementation of a fuzzy logic inference mechanism allows the use of rule-based control and decision making in demanding real-time applications. Researchers designed a full custom VLSI inference engine. The chip was fabricated using CMOS technology. The chip consists of 688,000 transistors of which 476,000 are used for RAM memory. The fuzzy logic inference engine board system incorporates the custom designed integrated circuit into a standard VMEbus environment. The Fuzzy Logic system uses Transistor-Transistor Logic (TTL) parts to provide the interface between the Fuzzy chip and a standard, double height VMEbus backplane, allowing the chip to perform application process control through the VMEbus host. High level C language functions hide details of the hardware system interface from the applications level programmer. The first version of the board was installed on a robot at Oak Ridge National Laboratory in January of 1990.
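
    A minimal software sketch of the min-max style fuzzy inference such hardware implements (the rule set and membership functions below are made up for illustration):

      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def infer(temp):
          # rules: IF temp is LOW THEN fan = 20; IF temp is HIGH THEN fan = 80
          w_low  = tri(temp, 0, 20, 40)    # firing strength of "temp is LOW"
          w_high = tri(temp, 30, 60, 90)   # firing strength of "temp is HIGH"
          if w_low + w_high == 0.0:
              return None                  # no rule fires
          # weighted-average defuzzification over the rule consequents
          return (w_low * 20 + w_high * 80) / (w_low + w_high)

      print(infer(35))   # both rules fire partially; output blends 20 and 80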

  13. A Learning Algorithm for Multimodal Grammar Inference.

    Science.gov (United States)

    D'Ulizia, A; Ferri, F; Grifoni, P

    2011-12-01

    The high costs of development and maintenance of multimodal grammars in integrating and understanding input in multimodal interfaces lead to the investigation of novel algorithmic solutions for automating grammar generation and updating processes. Many algorithms for context-free grammar inference have been developed in the natural language processing literature. An extension of these algorithms toward the inference of multimodal grammars is necessary for multimodal input processing. In this paper, we propose a novel grammar inference mechanism that allows us to learn a multimodal grammar from positive samples of multimodal sentences. The algorithm first generates the multimodal grammar that is able to parse the positive samples of sentences and, afterward, makes use of two learning operators and the minimum description length metric to improve the grammar description and to avoid the over-generalization problem. The experimental results highlight the acceptable performance of the algorithm proposed in this paper, since it has a very high probability of parsing valid sentences.

  14. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
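
    As a small worked example of the family of statistics the book studies (my own sketch, not a listing from the book), the power-divergence statistic for multinomial goodness of fit recovers Pearson's chi-squared at lambda = 1 and the likelihood-ratio statistic G^2 as lambda tends to 0:

      import numpy as np

      def power_divergence(observed, expected, lam=1.0):
          o = np.asarray(observed, dtype=float)
          e = np.asarray(expected, dtype=float)
          if abs(lam) < 1e-12:                     # limiting case: G^2
              return 2.0 * np.sum(o * np.log(o / e))
          return (2.0 / (lam * (lam + 1.0))) * np.sum(o * ((o / e) ** lam - 1.0))

      obs = [18, 22, 30, 30]
      exp = [25, 25, 25, 25]
      print(power_divergence(obs, exp, lam=1.0))   # 4.32, Pearson's chi-squared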

  15. Grammatical inference algorithms, routines and applications

    CERN Document Server

    Wieczorek, Wojciech

    2017-01-01

    This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.

  16. Automatic transformations in the inference process

    Energy Technology Data Exchange (ETDEWEB)

    Veroff, R. L.

    1980-07-01

    A technique for incorporating automatic transformations into processes such as the application of inference rules, subsumption, and demodulation provides a mechanism for improving search strategies for theorem proving problems arising from the field of program verification. The incorporation of automatic transformations into the inference process can alter the search space for a given problem, and is particularly useful for problems having broad rather than deep proofs. The technique can also be used to permit the generation of inferences that might otherwise be blocked and to build some commutativity or associativity into the unification process. Appropriate choice of transformations, and new literal clashing and unification algorithms for applying them, showed significant improvement on several real problems according to several distinct criteria. 22 references, 1 figure.

  17. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
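
    As a worked illustration (these are the standard definitions, not text quoted from the paper), the statistical distance between two discrete probability distributions p and q can be written in LaTeX notation as

      d(p, q) \;=\; \arccos\left( \sum_i \sqrt{p_i\, q_i} \right),

    and the corresponding distinguishability measure for two pure quantum states is d(\psi, \phi) = \arccos \lvert \langle \psi \mid \phi \rangle \rvert, consistent with the remark above that the distance is essentially governed by the absolute value of the matrix element between the states.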

  18. Examples in parametric inference with R

    CERN Document Server

    Dixit, Ulhas Jayram

    2016-01-01

    This book discusses examples in parametric inference with R. Combining basic theory with modern approaches, it presents the latest developments and trends in statistical inference for students who do not have an advanced mathematical and statistical background. The topics discussed in the book are fundamental and common to many fields of statistical inference and thus serve as a point of departure for in-depth study. The book is divided into eight chapters: Chapter 1 provides an overview of topics on sufficiency and completeness, while Chapter 2 briefly discusses unbiased estimation. Chapter 3 focuses on the study of moments and maximum likelihood estimators, and Chapter 4 presents bounds for the variance. In Chapter 5, topics on consistent estimator are discussed. Chapter 6 discusses Bayes, while Chapter 7 studies some more powerful tests. Lastly, Chapter 8 examines unbiased and other tests. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory cou...

  19. Inferences from counterfactual threats and promises.

    Science.gov (United States)

    Egan, Suzanne M; Byrne, Ruth M J

    2012-01-01

    We examine how people understand and reason from counterfactual threats, for example, "if you had hit your sister, I would have grounded you" and counterfactual promises, for example, "if you had tidied your room, I would have given you ice-cream." The first experiment shows that people consider counterfactual threats, but not counterfactual promises, to have the illocutionary force of an inducement. They also make the immediate inference that the action mentioned in the "if" part of the counterfactual threat and promise did not occur. The second experiment shows that people make more negative inferences (modus tollens and denial of the antecedent) than affirmative inferences (modus ponens and affirmation of the consequent) from counterfactual threats and promises, unlike indicative threats and promises. We discuss the implications of the results for theories of the mental representations and cognitive processes that underlie conditional inducements.

  20. Mathematical inference and control of molecular networks from perturbation experiments

    Science.gov (United States)

    Mohammed-Rasheed, Mohammed

    in order to affect the time evolution of molecular activity in a desirable manner. In this proposal, we address both the inference and control problems of GRNs. In the first part of the thesis, we consider the control problem. We assume that we are given a general topology network structure, whose dynamics follow a discrete-time Markov chain model. We subsequently develop a comprehensive framework for optimal perturbation control of the network. The aim of the perturbation is to drive the network away from undesirable steady-states and to force it to converge to a unique desirable steady-state. The proposed framework does not make any assumptions about the topology of the initial network (e.g., ergodicity, weak and strong connectivity), and is thus applicable to general topology networks. We define the optimal perturbation as the minimum-energy perturbation measured in terms of the Frobenius norm between the initial and perturbed networks. We subsequently demonstrate that there exists at most one optimal perturbation that forces the network into the desirable steady-state. In the event where the optimal perturbation does not exist, we construct a family of sub-optimal perturbations that approximate the optimal solution arbitrarily closely. In the second part of the thesis, we address the inference problem of GRNs from time series data. We model the dynamics of the molecules using a system of ordinary differential equations corrupted by additive white noise. For large-scale networks, we formulate the inference problem as a constrained maximum likelihood estimation problem. We derive the molecular interactions that maximize the likelihood function while constraining the network to be sparse. We further propose a procedure to recover weak interactions based on the Bayesian information criterion. For small-size networks, we investigated the inference of a globally stable 7-gene melanoma genetic regulatory network from genetic perturbation experiments. We considered five
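
    A minimal sketch of the control objective under stated assumptions (mine, not the thesis code): score a candidate perturbation of a row-stochastic transition matrix by its Frobenius-norm energy, and inspect the steady-state it induces.

      import numpy as np

      def steady_state(P, iters=10_000):
          """Stationary distribution of a row-stochastic matrix by power iteration."""
          pi = np.full(P.shape[0], 1.0 / P.shape[0])
          for _ in range(iters):
              pi = pi @ P
          return pi

      P0 = np.array([[0.9, 0.1],    # initial network: undesirable state 0 is sticky
                     [0.5, 0.5]])
      P1 = np.array([[0.2, 0.8],    # candidate perturbed network
                     [0.1, 0.9]])

      print("perturbation energy ||P1 - P0||_F:", np.linalg.norm(P1 - P0, "fro"))
      print("steady state before:", steady_state(P0))
      print("steady state after: ", steady_state(P1))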

  1. IMAGINE: Interstellar MAGnetic field INference Engine

    Science.gov (United States)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.

  2. Improved Inference of Heteroscedastic Fixed Effects Models

    Directory of Open Access Journals (Sweden)

    Afshan Saeed

    2016-12-01

    Full Text Available Heteroscedasticity is a stern problem that distorts estimation and testing of panel data models (PDM). Arellano (1987) proposed the White (1980) estimator for PDM with heteroscedastic errors, but it provides erroneous inference for data sets including high leverage points. In this paper, our attempt is to improve the heteroscedasticity-consistent covariance matrix estimator (HCCME) for panel data sets with high leverage points. To draw robust inference for the PDM, our focus is to improve the kernel bootstrap estimators proposed by Racine and MacKinnon (2007). The Monte Carlo scheme is used for assertion of the results.
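
    For orientation, a minimal numpy sketch of heteroscedasticity-consistent covariance estimation (White's HC0 and the leverage-adjusted HC3 variant); this is generic textbook material, not the improved estimator proposed in the paper:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200
      X = np.column_stack([np.ones(n), rng.normal(size=n)])
      e = rng.normal(size=n) * (1.0 + np.abs(X[:, 1]))     # heteroscedastic errors
      y = X @ np.array([1.0, 2.0]) + e

      XtX_inv = np.linalg.inv(X.T @ X)
      beta = XtX_inv @ X.T @ y
      resid = y - X @ beta
      h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)          # leverage values

      def hccme(weights):
          meat = (X * weights[:, None]).T @ X
          return XtX_inv @ meat @ XtX_inv

      hc0 = hccme(resid**2)                 # White (1980)
      hc3 = hccme(resid**2 / (1 - h)**2)    # downweights high-leverage points
      print("HC0 s.e.:", np.sqrt(np.diag(hc0)))
      print("HC3 s.e.:", np.sqrt(np.diag(hc3)))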

  3. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise...
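
    A compact sketch of convergent cross-mapping on coupled logistic maps (a generic toy re-implementation of the method, not the authors' code; embedding dimension and library size are fixed here, whereas the full method tracks convergence as the library grows):

      import numpy as np

      N = 1000
      x = np.empty(N); y = np.empty(N); x[0], y[0] = 0.4, 0.2
      for t in range(N - 1):                       # x drives y, not vice versa
          x[t+1] = x[t] * (3.8 - 3.8 * x[t])
          y[t+1] = y[t] * (3.5 - 3.5 * y[t] - 0.1 * x[t])

      def ccm_skill(source, target, E=2, lib=400):
          """Cross-map `source` from the delay embedding of `target`;
          high skill suggests `source` causally influences `target`."""
          M = np.column_stack([target[E-1:lib], target[E-2:lib-1]])   # shadow manifold
          truth = source[E-1:lib]
          est = np.empty(len(M))
          for i in range(len(M)):
              d = np.linalg.norm(M - M[i], axis=1)
              d[i] = np.inf                        # exclude the point itself
              nn = np.argsort(d)[:E + 1]           # simplex of nearest neighbours
              w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
              est[i] = np.sum(w * truth[nn]) / np.sum(w)
          return np.corrcoef(est, truth)[0, 1]

      print("x -> y (expected high):", round(ccm_skill(x, y), 3))
      print("y -> x (expected low): ", round(ccm_skill(y, x), 3))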

  4. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point ... with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analysing Peter Diggle's heather data set, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  5. The aggregate site frequency spectrum for comparative population genomic inference.

    Science.gov (United States)

    Xue, Alexander T; Hickerson, Michael J

    2015-12-01

    Understanding how assemblages of species responded to past climate change is a central goal of comparative phylogeography and comparative population genomics, an endeavour that has increasing potential to integrate with community ecology. New sequencing technology now provides the potential to perform complex demographic inference at unprecedented resolution across assemblages of nonmodel species. To this end, we introduce the aggregate site frequency spectrum (aSFS), an expansion of the site frequency spectrum to use single nucleotide polymorphism (SNP) data sets collected from multiple, co-distributed species for assemblage-level demographic inference. We describe how the aSFS is constructed over an arbitrary number of independent population samples and then demonstrate how the aSFS can differentiate various multispecies demographic histories under a wide range of sampling configurations while allowing effective population sizes and expansion magnitudes to vary independently. We subsequently couple the aSFS with a hierarchical approximate Bayesian computation (hABC) framework to estimate degree of temporal synchronicity in expansion times across taxa, including an empirical demonstration with a data set consisting of five populations of the threespine stickleback (Gasterosteus aculeatus). Corroborating what is generally understood about the recent postglacial origins of these populations, the joint aSFS/hABC analysis strongly suggests that the stickleback data are most consistent with synchronous expansion after the Last Glacial Maximum (posterior probability = 0.99). The aSFS will have general application for multilevel statistical frameworks to test models involving assemblages and/or communities, and as large-scale SNP data from nonmodel species become routine, the aSFS expands the potential for powerful next-generation comparative population genomic inference. © 2015 The Authors. Molecular Ecology Published by John Wiley & Sons Ltd.
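
    A minimal sketch of assembling an aggregate SFS from per-taxon SNP data (the details here are assumed for illustration, not taken from the paper): normalise each taxon's folded spectrum, then sort each frequency class across taxa before stacking, so the statistic is invariant to which taxon contributed which spectrum.

      import numpy as np

      def folded_sfs(minor_allele_counts, n_chrom):
          """Normalised folded site frequency spectrum for one taxon."""
          bins = np.arange(1, n_chrom // 2 + 2)
          counts = np.array([(minor_allele_counts == b).sum() for b in bins], float)
          return counts / counts.sum()

      def aggregate_sfs(taxa_counts, n_chrom):
          spectra = np.array([folded_sfs(c, n_chrom) for c in taxa_counts])
          return np.sort(spectra, axis=0).ravel()   # sort within each frequency class

      rng = np.random.default_rng(5)
      taxa = [rng.integers(1, 6, size=500) for _ in range(5)]   # 5 co-distributed taxa
      print(aggregate_sfs(taxa, n_chrom=10))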

  6. Strongly nonlinear oscillators analytical solutions

    CERN Document Server

    Cveticanin, Livija

    2014-01-01

    This book provides the presentation of the motion of pure nonlinear oscillatory systems and various solution procedures which give the approximate solutions of the strong nonlinear oscillator equations. The book presents the original author’s method for the analytical solution procedure of the pure nonlinear oscillator system. After an introduction, the physical explanation of the pure nonlinearity and of the pure nonlinear oscillator is given. The analytical solution for free and forced vibrations of the one-degree-of-freedom strong nonlinear system with constant and time variable parameter is considered. Special attention is given to the one and two mass oscillatory systems with two-degrees-of-freedom. The criteria for the deterministic chaos in ideal and non-ideal pure nonlinear oscillators are derived analytically. The method for suppressing chaos is developed. Important problems are discussed in didactic exercises. The book is self-consistent and suitable as a textbook for students and also for profess...

  7. Strong curvature effects in Neumann wave problems

    DEFF Research Database (Denmark)

    Willatzen, Morten; Pors, A.; Gravesen, Jens

    2012-01-01

    Waveguide phenomena play a major role in basic sciences and engineering. The Helmholtz equation is the governing equation for the electric field in electromagnetic wave propagation and the acoustic pressure in the study of pressure dynamics. The Schrödinger equation simplifies to the Helmholtz equation for a quantum-mechanical particle confined by infinite barriers relevant in semiconductor physics. With this in mind, and with the interest to tailor waveguides towards a desired spectrum and modal pattern structure in classical structures and nanostructures, it becomes increasingly important to understand the influence of curvature effects in waveguides. In this work, we demonstrate analytically strong curvature effects for the eigenvalue spectrum of the Helmholtz equation with Neumann boundary conditions in cases where the waveguide cross section is a circular sector. It is found that the linear...

  8. Flavour Democracy in Strong Unification

    CERN Document Server

    Abel, S A; Abel, Steven; King, Steven

    1998-01-01

    We show that the fermion mass spectrum may naturally be understood in terms of flavour democratic fixed points in supersymmetric theories which have a large domain of attraction in the presence of "strong unification". Our approach provides an alternative to the approximate Yukawa texture zeroes of the Froggatt-Nielsen mechanism. We discuss a particular model based on a broken gauged $SU(3)_L\\times SU(3)_R$ family symmetry which illustrates our approach.

  9. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed

    2013-09-03

    A number of studies have shown that functionally related genes are often co-expressed and that computational co-expression analysis can be used to accurately identify functional relationships between genes and, by inference, their encoded proteins. Here we describe how a computational co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example, we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens.
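
    A minimal sketch of the co-expression recipe in generic form (gene names and data below are illustrative, not the study's data): rank genes by the correlation of their expression profiles with the gene of interest; highly correlated genes are candidate functional partners.

      import numpy as np

      rng = np.random.default_rng(6)
      genes = ["WAKL10", "PR1", "PDF1.2", "ACT2", "TUB4"]   # illustrative names
      expr = rng.normal(size=(5, 30))                       # genes x conditions
      expr[1] = expr[0] * 0.9 + rng.normal(0, 0.3, 30)      # make PR1 track WAKL10

      target = 0                                            # index of the gene of interest
      r = [np.corrcoef(expr[target], expr[i])[0, 1] for i in range(len(genes))]
      for g, ri in sorted(zip(genes, r), key=lambda p: -abs(p[1])):
          print(f"{g:8s} r = {ri:+.2f}")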

  10. Strong Gravitational Lensing in a Brane-World Black Hole

    Science.gov (United States)

    Li, GuoPing; Cao, Biao; Feng, Zhongwen; Zu, Xiaotao

    2015-09-01

    Adopting the strong field limit approach, we investigate strong gravitational lensing by a Brane-World black hole and obtain the strong field limit coefficients and the deflection angle in this gravitational field. The results show that strong gravitational lensing depends closely on the metric of the gravitational field: the cosmological parameter α and the dark matter parameter β of the Brane-World black hole exert a great influence on it. Compared with the Schwarzschild-AdS and Schwarzschild-XCDM spacetimes, the parameters α and β of the black hole have similar effects on the lensing. If the real gravitational fields in our universe can be described by this metric, the predictions for strong gravitational lensing in this spacetime should be the more reasonable ones for observation. Finally, the influence that the parameters α and β exert on the main observable quantities of this gravitational field is discussed.

  11. Strong Purifying Selection at Synonymous Sites in D. melanogaster

    Science.gov (United States)

    Lawrie, David S.; Messer, Philipp W.; Hershberg, Ruth; Petrov, Dmitri A.

    2013-01-01

    Synonymous sites are generally assumed to be subject to weak selective constraint. For this reason, they are often neglected as a possible source of important functional variation. We use site frequency spectra from deep population sequencing data to show that, contrary to this expectation, 22% of four-fold synonymous (4D) sites in Drosophila melanogaster evolve under very strong selective constraint while few, if any, appear to be under weak constraint. Linking polymorphism with divergence data, we further find that the fraction of synonymous sites exposed to strong purifying selection is higher for those positions that show slower evolution on the Drosophila phylogeny. The function underlying the inferred strong constraint appears to be separate from splicing enhancers, nucleosome positioning, and the translational optimization generating canonical codon bias. The fraction of synonymous sites under strong constraint within a gene correlates well with gene expression, particularly in the mid-late embryo, pupae, and adult developmental stages. Genes enriched in strongly constrained synonymous sites tend to be particularly functionally important and are often involved in key developmental pathways. Given that the observed widespread constraint acting on synonymous sites is likely not limited to Drosophila, the role of synonymous sites in genetic disease and adaptation should be reevaluated. PMID:23737754

  12. Double jeopardy in inferring cognitive processes.

    Science.gov (United States)

    Fific, Mario

    2014-01-01

    Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy of cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity needed to learn about n cognitive processes should be equal to 2^n. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) the saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross-combined the levels of jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual-subject analyses and the utilization of full-factorial (SFT) experimental designs.
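
    The SFT design rule is easy to make concrete: with one two-level factor per hypothesised cognitive process, a full factorial design has 2^n cells. A minimal sketch (process and level names are made up for illustration):

      from itertools import product

      processes = ["visual search", "memory retrieval", "response selection"]
      levels = ["low salience", "high salience"]

      cells = list(product(levels, repeat=len(processes)))
      print(f"{len(processes)} processes -> {len(cells)} = 2^{len(processes)} conditions")
      for cell in cells:
          print(dict(zip(processes, cell)))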

  13. Colligation, Or the Logical Inference of Interconnection

    DEFF Research Database (Denmark)

    Falster, Peter

    1998-01-01

    laws or assumptions. Yet interconnection as an abstract concept seems to be without scientific underpinning in pure logic. Adopting a historical viewpoint, our aim is to show that the reasoning of interconnection may be identified with a neglected kind of logical inference, called "colligation...

  14. Colligation or, The Logical Inference of Interconnection

    DEFF Research Database (Denmark)

    Franksen, Ole Immanuel; Falster, Peter

    2000-01-01

    laws or assumptions. Yet interconnection as an abstract concept seems to be without scientific underpinning in pure logic. Adopting a historical viewpoint, our aim is to show that the reasoning of interconnection may be identified with a neglected kind of logical inference, called "colligation

  15. Investigating Mathematics Teachers' Thoughts of Statistical Inference

    Science.gov (United States)

    Yang, Kai-Lin

    2012-01-01

    Research on statistical cognition and application suggests that statistical inference concepts are commonly misunderstood by students and even misinterpreted by researchers. Although some research has been done on students' misunderstanding or misconceptions of confidence intervals (CIs), few studies explore either students' or mathematics…

  16. Theory change and Bayesian statistical inference

    NARCIS (Netherlands)

    Romeijn, Jan-Willem

    2005-01-01

    This paper addresses the problem that Bayesian statistical inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent

  17. lems that arise in statistical inference. Ther

    African Journals Online (AJOL)

    Administrateur

    ample, may require the integration of the nuisance parameters. Also, several optimization problems in statistical inference use numerical integration steps, such as the EM algorithm in [6] and [2]. Here we are concerned with simulation-based integration methods that use the generation of random variables. Such methods are.

  18. Eleusis: complexity and interaction in inductive inference

    NARCIS (Netherlands)

    Kurzen, L.; Arrazola, X.; Ponte, M.

    2010-01-01

    This paper analyzes the computational complexity of the inductive inference game Eleusis. Eleusis is a card game in which one player constructs a secret rule which has to be discovered by the other players. We determine the complexity of various decision problems that arise in Eleusis. We show that

  19. Quasi-Experimental Designs for Causal Inference

    Science.gov (United States)

    Kim, Yongnam; Steiner, Peter

    2016-01-01

    When randomized experiments are infeasible, quasi-experimental designs can be exploited to evaluate causal treatment effects. The strongest quasi-experimental designs for causal inference are regression discontinuity designs, instrumental variable designs, matching and propensity score designs, and comparative interrupted time series designs. This…

  20. Conditional Inference and Advanced Mathematical Study

    Science.gov (United States)

    Inglis, Matthew; Simpson, Adrian

    2008-01-01

    Many mathematicians and curriculum bodies have argued in favour of the theory of formal discipline: that studying advanced mathematics develops one's ability to reason logically. In this paper we explore this view by directly comparing the inferences drawn from abstract conditional statements by advanced mathematics students and well-educated arts…

  1. Culture and Pragmatic Inference in Interpersonal Communication

    African Journals Online (AJOL)

    cognitive process, and that the human capacity for inference is crucially important in interpersonal communication in these contexts. Generally, communication involves 'the transmission of messages between individuals acting consciously and intentionally for that end' (Harder, 2009, p. 62). It is an integral part of our ...

  2. Understanding COBOL systems using inferred types

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon)

    1999-01-01

    In a typical COBOL program, the data division consists of 50% of the lines of code. Automatic type inference can help to understand the large collections of variable declarations contained therein, showing how variables are related based on their actual usage. The most problematic aspect

  3. Diet elucidation: Supplementary inferences from mysid feeding ...

    African Journals Online (AJOL)

    Comparison of the structure of the feeding appendages of these two marine mysid species allows dietary inferences supplementing data derived from gut content analyses. Both species occur in abundance in the surf zone off sandy beaches where they contribute significantly to energy transfer through the food web.

  4. Theory Change and Bayesian Statistical Inference

    NARCIS (Netherlands)

    Romeyn, Jan-Willem

    2008-01-01

    This paper addresses the problem that Bayesian statistical inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent

  5. Ignorability in Statistical and Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed by ma...

  6. Spurious correlations and inference in landscape genetics

    Science.gov (United States)

    Samuel A. Cushman; Erin L. Landguth

    2010-01-01

    Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causal modelling with partial...

  7. Abduction and Inference to the Best Explanation

    Directory of Open Access Journals (Sweden)

    Valeriano Iranzo

    2009-12-01

    Full Text Available The paper deals with the relation between abduction and inference to the best explanation (IBE). A heuristic and a normative interpretation of IBE are distinguished. Besides, two different normative interpretations —those vindicated by I. Niiniluoto and S. Psillos— are discussed. I conclude that, in principle, Aliseda's theory of abduction fits better with a heuristic account of IBE.

  8. Supplementary inferences from mysid feeding appendage morphology

    African Journals Online (AJOL)

    Diet elucidation: Supplementary inferences from mysid feeding appendage morphology. P. Webb* and T.H. Wooldridge. Institute of Coastal Research and Department of Zoology, University of Port Elizabeth, P.O. Box 1600, Port Elizabeth,. 6000 Republic of South Africa. Received 9 August 1988; accepted 24 October 1988.

  9. Inference and the Introductory Statistics Course

    Science.gov (United States)

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-01-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its…

  10. Protein Inference Using Peptide Quantification Patterns

    NARCIS (Netherlands)

    Lukasse, P.N.J.; America, A.H.P.

    2014-01-01

    Determining the list of proteins present in a sample, based on the list of identified peptides, is a crucial step in the untargeted proteomics LC-MS/MS data-processing pipeline. This step, commonly referred to as protein inference, turns out to be a very challenging problem because many peptide

  11. Inferring motion and location using WLAN RSSI

    NARCIS (Netherlands)

    Kavitha Muthukrishnan, K.; van der Zwaag, B.J.; Havinga, Paul J.M.; Fuller, R.; Koutsoukos, X.

    2009-01-01

    We present novel algorithms to infer movement by making use of inherent fluctuations in the received signal strengths from existing WLAN infrastructure. We evaluate the performance of the presented algorithms based on classification metrics such as recall and precision using annotated traces
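
    A minimal sketch of the underlying idea (a generic variance-threshold detector, not the paper's algorithms): motion inflates the fluctuations of received signal strength, so classifying "moving" versus "still" from the variance of a sliding RSSI window already works as a crude baseline.

      from collections import deque
      from statistics import pvariance

      def motion_detector(rssi_stream, window=10, threshold=4.0):
          buf = deque(maxlen=window)
          for rssi in rssi_stream:
              buf.append(rssi)
              if len(buf) == window:
                  yield "moving" if pvariance(buf) > threshold else "still"

      still  = [-60, -61, -60, -59, -60, -61, -60, -60, -61, -60]   # stationary trace
      moving = [-60, -55, -66, -52, -70, -58, -64, -51, -68, -57]   # walking trace
      out = list(motion_detector(still + moving))
      print(out[0], out[-1])   # "still" then "moving"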

  12. Atoms in strong laser fields

    International Nuclear Information System (INIS)

    L'Huillier, A.

    2002-01-01

    When a high-power laser is focused into a gas of atoms, the electromagnetic field becomes of the same magnitude as the Coulomb field which binds a 1s electron in a hydrogen atom. Three highly non-linear phenomena can then occur: 1) ATI (above-threshold ionization): electrons initially in the ground state absorb a large number of photons, many more than the minimum number required for ionization; 2) multiple ionization: many electrons can be emitted one at a time, in a sequential process, or simultaneously in a mechanism called direct or non-sequential; and 3) high-order harmonic generation (HHG): efficient photon emission in the extreme ultraviolet range, in the form of high-order harmonics of the fundamental laser field, can occur. The theoretical problem consists in solving the time-dependent Schrödinger equation (TDSE) that describes the interaction of a many-electron atom with a laser field. A number of methods have been proposed to solve this problem in the case of a hydrogen atom or a single-active-electron atom in a strong laser field. A large effort is presently being devoted to going beyond the single-active-electron approximation. The understanding of the physics of the interaction between atoms and strong laser fields has been provided by a very simple model called "simple man's theory". A unified view of HHG, ATI, and non-sequential ionization, originating from the simple man's model and the strong field approximation, expressed in terms of electron trajectories or quantum paths, is slowly emerging. (A.C.)

  13. Strongly Interacting Light Dark Matter

    Directory of Open Access Journals (Sweden)

    Sebastian Bruggisser, Francesco Riva, Alfredo Urbano

    2017-09-01

    Full Text Available In the presence of approximate global symmetries that forbid relevant interactions, strongly coupled light Dark Matter (DM can appear weakly coupled at small energy and generate a sizable relic abundance. Fundamental principles like unitarity restrict these symmetries to a small class, where the leading interactions are captured by effective operators up to dimension-8. Chiral symmetry, spontaneously broken global symmetries and non-linearly realized supersymmetry are examples of this. Their DM candidates (composite fermions, pseudo Nambu-Goldstone Bosons and Goldstini are interesting targets for LHC missing-energy searches.

  14. Strongly interacting light dark matter

    International Nuclear Information System (INIS)

    Bruggisser, Sebastian; Riva, Francesco; Urbano, Alfredo

    2016-07-01

    In the presence of approximate global symmetries that forbid relevant interactions, strongly coupled light Dark Matter (DM) can appear weakly coupled at small energy and generate a sizable relic abundance. Fundamental principles like unitarity restrict these symmetries to a small class, where the leading interactions are captured by effective operators up to dimension-8. Chiral symmetry, spontaneously broken global symmetries and non-linearly realized supersymmetry are examples of this. Their DM candidates (composite fermions, pseudo-Nambu-Goldstone Bosons and Goldstini) are interesting targets for LHC missing-energy searches.

  15. Rydberg atoms in strong fields

    International Nuclear Information System (INIS)

    Kleppner, D.; Tsimmerman, M.

    1985-01-01

    Experimental and theoretical achievements in studying Rydberg atoms in external fields are considered. Only static (or quasistatic) fields and "one-electron" atoms, i.e. atoms that are well described by one-electron states, are discussed. Mainly the behaviour of alkali-metal atoms in an electric field is considered. The state of theoretical investigations of the hydrogen atom in a magnetic field is described, with experimental data for alkali-metal atoms presented as an illustration. Results of the latest experimental and theoretical investigations into the structure of Rydberg atoms in strong fields are presented.

  16. Scalar strong interaction hadron theory

    CERN Document Server

    Hoh, Fang Chao

    2015-01-01

    The scalar strong interaction hadron theory, SSI, is a first-principles, nonlocal theory at the quantum-mechanical level that provides an alternative to low-energy QCD and the Higgs-related part of the standard model. The quark-quark interaction is scalar rather than color-vectorial. A set of equations of motion for mesons and another set for baryons have been constructed. This book provides an account of the present state of a theory supposedly still at an early stage of development. This work will help researchers interested in entering the field and serve as a basis for possible future development of the theory.

  17. Strong Plate, Weak Slab Dichotomy

    Science.gov (United States)

    Petersen, R. I.; Stegman, D. R.; Tackley, P.

    2015-12-01

    Models of mantle convection on Earth produce styles of convection that are not observed on Earth. Moreover, non-Earth-like modes, such as two-sided downwellings, are the de facto mode of convection in such models. To recreate Earth-style subduction, i.e. one-sided asymmetric recycling of the lithosphere, proper treatment of the plates and the plate interface is required. Previous work has identified several model features that promote subduction. A free surface or pseudo-free surface and a layer of relatively low-strength material (weak crust) allow downgoing plates to bend and slide past overriding plates without creating undue stress at the plate interface (Crameri et al. 2012, GRL). A low-viscosity mantle wedge, possibly a result of slab dehydration, decouples the plates in the system (Gerya et al. 2007, Geo). Plates must be composed of material which, in the case of the overriding plate, is strong enough to resist the bending stresses imposed by the subducting plate and yet, as in the case of the subducting plate, is weak enough to bend and subduct when pulled by the already subducted slab (Petersen et al. 2015, PEPI). Though strong surface plates are required for subduction, such plates may present a problem when they encounter the lower mantle. As the subducting slab approaches the higher-viscosity lower mantle, stresses are imposed on the tip. Strong slabs transmit this stress to the surface, where the stress field at the plate interface is modified and potentially modifies the style of convection. In addition to modifying the stress at the plate interface, the strength of the slab affects the morphology of the slab at the base of the upper mantle (Stegman et al. 2010, Tectonophysics). Slabs that maintain a sufficient portion of their strength after being bent require high stresses to unbend or otherwise change their shape. On the other hand, slabs that are weakened through the bending process are more amenable to changes in morphology. We present the results of…

  18. Active inference, sensory attenuation and illusions.

    Science.gov (United States)

    Brown, Harriet; Adams, Rick A; Parees, Isabel; Edwards, Mark; Friston, Karl

    2013-11-01

    Active inference provides a simple and neurobiologically plausible account of how action and perception are coupled in producing (Bayes) optimal behaviour. This can be seen most easily as minimising prediction error: we can either change our predictions to explain sensory input (perception) or actively change sensory input to fulfil our predictions (action). In active inference, action is mediated by classical reflex arcs that minimise proprioceptive prediction error created by descending proprioceptive predictions. However, this creates a conflict between action and perception, in that self-generated movements require predictions to override the sensory evidence that one is not actually moving. Ignoring sensory evidence, though, means that externally generated sensations will not be perceived; conversely, attending to (proprioceptive and somatosensory) sensations enables the detection of externally generated events but precludes the generation of actions. This conflict can be resolved by attenuating the precision of sensory evidence during movement or, equivalently, attending away from the consequences of self-made acts. We propose that this Bayes-optimal withdrawal of precise sensory evidence during movement is the cause of psychophysical sensory attenuation. Furthermore, it explains the force-matching illusion and reproduces empirical results almost exactly. Finally, if attenuation is removed, the force-matching illusion disappears and false (delusional) inferences about agency emerge. This is important, given the negative correlation between sensory attenuation and delusional beliefs in normal subjects, and the reduction in the magnitude of the illusion in schizophrenia. Active inference therefore links the neuromodulatory optimisation of precision to sensory attenuation and illusory phenomena during the attribution of agency in normal subjects. It also provides a functional account of deficits in syndromes characterised by false inference.
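
    The precision-weighted balancing at the heart of this account can be illustrated with a toy two-source estimate (a simplification, not the authors' full generative model): a percept combines a descending prediction and sensory evidence in proportion to their precisions, so attenuating sensory precision during movement lets the prediction win. All numbers are invented.

```python
# Toy precision-weighted estimate: a percept combines a prior prediction and
# sensory evidence, each weighted by its precision (inverse variance).
def percept(prior, sensory, pi_prior, pi_sensory):
    return (pi_prior * prior + pi_sensory * sensory) / (pi_prior + pi_sensory)

prior, sensory = 1.0, 0.0   # "I am moving" prediction vs. "no movement" input

print(percept(prior, sensory, pi_prior=1.0, pi_sensory=4.0))  # 0.2: senses win
print(percept(prior, sensory, pi_prior=1.0, pi_sensory=0.1))  # ~0.91: attenuated
```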

  19. Artificial frame filling using adaptive neural fuzzy inference system for particle image velocimetry dataset

    Science.gov (United States)

    Akdemir, Bayram; Doǧan, Sercan; Aksoy, Muharrem H.; Canli, Eyüp; Özgören, Muammer

    2015-03-01

    Liquid behaviours are very important in many areas, especially in mechanical engineering. A fast camera is one way to observe and study them: it traces dust or coloured markers travelling in the liquid and takes as many pictures per second as possible. Every image carries a large data structure due to its resolution, and for fast liquid velocities it is not easy to evaluate the sequence or to produce a fluent animation from the captured images. Artificial intelligence is widely used in science to solve nonlinear problems, and the adaptive neural fuzzy inference system (ANFIS) is a common technique in the literature. Any particle in a liquid has a two-dimensional velocity and its derivatives. In this study, ANFIS was used offline to create an artificial frame between the previous and the following frames: it uses velocities and vorticities to construct a crossing-point vector between the previous and following points, filling virtual frames among the real frames in order to improve image continuity. This makes the images much more interpretable at chaotic or high-vorticity points. After running ANFIS, the image dataset doubles in size, alternating virtual and real frames. The obtained results were evaluated using R2 testing and mean squared error; R2 measures statistical similarity, and values of 0.82, 0.81, 0.85 and 0.8 were obtained for the velocities and their derivatives, respectively.
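
    The two scores used in this evaluation are straightforward to compute. A small sketch follows, with synthetic stand-in velocity fields rather than the study's PIV data:

```python
import numpy as np

# Evaluating an interpolated (virtual) PIV frame against a held-out real
# frame with the two scores used in the study: R^2 and mean squared error.
def r2_score(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

rng = np.random.default_rng(0)
true_frame = rng.normal(size=(64, 64))                         # held-out field
virtual_frame = true_frame + 0.4 * rng.normal(size=(64, 64))   # imperfect prediction

print("R2 :", r2_score(true_frame.ravel(), virtual_frame.ravel()))
print("MSE:", mse(true_frame, virtual_frame))
```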

  20. Pakistan strong industrial base urged for economic progress

    CERN Multimedia

    2001-01-01

    A conference organized by the Pakistan Nuclear Society urged that Pakistan should develop a strong industrial base and the capability to export equipment for economic progress. The chairman of PAEC pointed out that Pakistan is already showing remarkable progress in the export of science-related equipment to CERN. He also asked scientists to wage a war against Pakistan's inability to acquire indigenous technology (1 page).

  1. Characteristics of global strong earthquakes and their implications ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Earth System Science; Volume 126; Issue 7. Characteristics of global strong earthquakes and their implications for ... We grouped 518 of them into 12 regions (Boxes) based on their geographical proximity and tectonic setting. For each box, the present-day stress field and regime were obtained ...

  2. Earthquake source model using strong motion displacement as ...

    Indian Academy of Sciences (India)

    Earthquake source model using strong motion displacement as response of finite elastic media. R N IYENGAR* and SHAILESH KR AGRAWAL**. *Department of Civil Engineering, Indian Institute of Science, Bangalore 560 012, India. e-mail: rni@civil.iisc.ernet.in. **Central Building Research Institute, Roorkee, India.

  3. EDITORIAL: Strongly correlated electron systems

    Science.gov (United States)

    Ronning, Filip; Batista, Cristian

    2011-03-01

    Strongly correlated electrons is an exciting and diverse field in condensed matter physics. This special issue aims to capture some of that excitement and recent developments in the field. Given that this issue was inspired by the 2010 International Conference on Strongly Correlated Electron Systems (SCES 2010), we briefly give some history in order to place this issue in context. The 2010 International Conference on Strongly Correlated Electron Systems was held in Santa Fe, New Mexico, a reunion of sorts from the 1989 International Conference on the Physics of Highly Correlated Electron Systems that also convened in Santa Fe. SCES 2010—co-chaired by John Sarrao and Joe Thompson—followed the tradition of earlier conferences, in this century, hosted by Buzios (2008), Houston (2007), Vienna (2005), Karlsruhe (2004), Krakow (2002) and Ann Arbor (2001). Every three years since 1997, SCES has joined the International Conference on Magnetism (ICM), held in Recife (2000), Rome (2003), Kyoto (2006) and Karlsruhe (2009). Like its predecessors, SCES 2010 topics included strongly correlated f- and d-electron systems, heavy-fermion behaviors, quantum-phase transitions, non-Fermi liquid phenomena, unconventional superconductivity, and emergent states that arise from electronic correlations. Recent developments from studies of quantum magnetism and cold atoms complemented the traditional subjects and were included in SCES 2010. 2010 celebrated the 400th anniversary of Santa Fe as well as the birth of astronomy. So what's the connection to SCES? The Dutch invention of the first practical telescope and its use by Galileo in 1610 and subsequent years overturned dogma that the sun revolved about the earth. This revolutionary, and at the time heretical, conclusion required innovative combinations of new instrumentation, observation and mathematics. These same combinations are just as important 400 years later and are the foundation of scientific discoveries that were discussed

  4. Science and Science Fiction

    Science.gov (United States)

    Oravetz, David

    2005-01-01

    This article is for teachers looking for new ways to motivate students and increase science comprehension and understanding without using the old standard expository science textbook. The author suggests reading a science fiction novel in the science classroom as a way to engage students in learning. Using science fiction literature and language…

  5. Physics of Strongly Coupled Plasma

    Energy Technology Data Exchange (ETDEWEB)

    Kraeft, Wolf-Dietrich [Universitat Rostock (Germany)

    2007-07-15

    Strongly coupled plasmas (or non-ideal plasmas) are multi-component charged many-particle systems, in which the mean value of the potential energy of the system is of the same order as or even higher than the mean value of the kinetic energy. The constituents are electrons, ions, atoms and molecules. Dusty (or complex) plasmas additionally contain mesoscopic (multiply charged) particles. In such systems, the effects of strong coupling (non-ideality) lead to considerable deviations of physical properties from the corresponding properties of ideal plasmas, i.e., of plasmas in which the mean kinetic energy is essentially larger than the mean potential energy. For instance, bound-state energies become density dependent and vanish at higher densities (Mott effect) due to the interaction of the pair with the surrounding particles. Non-ideal plasmas are of interest both for general scientific reasons (including, for example, astrophysical questions) and for technical applications such as inertially confined fusion. In spite of great efforts both experimentally and theoretically, satisfactory information on the physical properties of strongly coupled plasmas is not at hand for every temperature and density. For example, the theoretical description of non-ideal plasmas is possible only at low densities/high temperatures and at extremely high densities (high degeneracy). For intermediate degeneracy, however, numerical experiments have to fill the gap. Experiments are difficult in the region of 'warm dense matter'. The monograph tries to present the state of the art concerning both theoretical and experimental attempts. It mainly includes results of work performed in famous Russian laboratories in recent decades. After outlining basic concepts (chapter 1), the generation of plasmas is considered (chapter 2, chapter 3). Questions of partial (chapter 4) and full ionization (chapter 5) are discussed, including the Mott transition and Wigner crystallization. Electrical and…

  6. Revelations from a single strong-motion record retrieved during the 27 June 1998 Adana (Turkey) earthquake

    Science.gov (United States)

    Celebi, M.

    2000-01-01

    During the 27 June 1998 Adana (Turkey) earthquake, only one strong-motion record was retrieved in the region where the most damage occurred. This single record from the station in Ceyhan, approximately 15 km from the epicenter of that earthquake, exhibits characteristics that are related to the dominant frequencies of the ground and structures. The purpose of this paper is to explain the causes of the damage as inferred from both field observations and the characteristics of a single strong-motion record retrieved from the immediate epicentral area. In the town of Ceyhan there was considerable but selective damage to a significant number of mid-rise (7-12 stories high) buildings. The strong-motion record exhibits dominant frequencies that are typically similar for the mid-rise building structures. This is further supported by spectral ratios derived using Nakamura's method [QR of RTRI, 30 (1989) 25] that facilitates computation of a spectral ratio from a single tri-axial record as the ratio of amplitude spectrum of horizontal component to that of the vertical component [R = H(f)/V(f)]. The correlation between the damage and the characteristics exhibited from the single strong-motion record is remarkable. Although deficient construction practices played a significant role in the extent of damage to the mid-rise buildings, it is clear that site resonance also contributed to the detrimental fate of most of the mid-rise buildings. Therefore, even a single record can be useful to explain the effect of site resonance on building response and performance. Such information can be very useful for developing zonation criteria in similar alluvial valleys. Published by Elsevier Science Ltd.
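
    Nakamura's single-station ratio is simple to compute from a tri-axial record. A minimal sketch follows, with a synthetic signal standing in for the Ceyhan accelerogram and an invented 1.5 Hz site peak:

```python
import numpy as np

# Nakamura-style spectral ratio R(f) = H(f)/V(f) from a single tri-axial
# record: the amplitude spectrum of a horizontal component divided by that
# of the vertical component; peaks suggest site resonance frequencies.
fs = 100.0                                  # sampling rate in Hz
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(1)
vert = rng.normal(size=t.size)
horiz = rng.normal(size=t.size) + 5.0 * np.sin(2 * np.pi * 1.5 * t)  # site peak

freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
H = np.abs(np.fft.rfft(horiz))
V = np.abs(np.fft.rfft(vert))
ratio = H / np.maximum(V, 1e-12)            # guard against division by zero
ratio = np.convolve(ratio, np.ones(9) / 9, mode="same")   # light smoothing

print("H/V peaks near %.2f Hz" % freqs[np.argmax(ratio[1:]) + 1])
```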

  7. Strongly coupled dust coulomb clusters

    International Nuclear Information System (INIS)

    Juan Wentau; Lai Yingju; Chen Mingheng; I Lin

    1999-01-01

    The structures and motions of quasi-two-dimensional, strongly coupled dust Coulomb clusters with particle number N from a few to hundreds in a cylindrical rf plasma trap are studied and compared with the results of molecular dynamics simulations using more idealized models. Shell structures with periodic packing in different shells, and excitations dominated by intershell rotational motion, are observed at small N. As N increases, the boundary has less effect and the system recovers the triangular lattice with isotropic vortex-type cooperative excitations similar to an infinite-N system, except in the outer shell region. These generic behaviors are mainly determined by the system symmetry and agree with the simulation results. The detailed form of the interaction causes only minor effects, such as the fine structure of the packing.

  8. Probability densities in strong turbulence

    Science.gov (United States)

    Yakhot, Victor

    2006-03-01

    In this work, using Mellin's transform combined with the Gaussian large-scale boundary condition, we calculate the probability densities (PDFs) of velocity increments P(δu,r), of velocity derivatives, and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the log-normal PDF often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from its Gaussian large-scale form. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.

  9. Inferences on Children’s Reading Groups

    Directory of Open Access Journals (Sweden)

    Javier González García

    2009-05-01

    Full Text Available This article focuses on the non-literal information of a text, which can be inferred from key elements or clues offered by the text itself. This kind of content is called implicit text or inference, because of the thinking process it stimulates. The explicit resources that support information retrieval are related to implicit ones, whose relevance has grown. In this study, carried out over two school years, we analysed how two teachers interpreted three stories and how they ran a debate, dividing each class into three student groups. The sample comprised two classes from two urban public schools in the city of Burgos (Spain) and two classes from public schools in Tampico (Mexico). This allowed us to observe a growing percentage of students in the group focused on text comprehension, and a smaller percentage in the group that treated comprehension as a secondary objective.

  10. Working with sample data exploration and inference

    CERN Document Server

    Chaffe-Stengel, Priscilla

    2014-01-01

    Managers and analysts routinely collect and examine key performance measures to better understand their operations and make good decisions. Being able to render the complexity of operations data into a coherent account of significant events requires an understanding of how to work well with raw data and to make appropriate inferences. Although some statistical techniques for analyzing data and making inferences are sophisticated and require specialized expertise, there are methods that are understandable and applicable by anyone with basic algebra skills and the support of a spreadsheet package. By applying these fundamental methods themselves rather than turning over both the data and the responsibility for analysis and interpretation to an expert, managers will develop a richer understanding and potentially gain better control over their environment. This text is intended to describe these fundamental statistical techniques to managers, data analysts, and students. Statistical analysis of sample data is enh...

  11. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process, whose likelihood is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models…

  12. An emergent approach to analogical inference

    Science.gov (United States)

    Thibodeau, Paul H.; Flusberg, Stephen J.; Glick, Jeremy J.; Sternberg, Daniel A.

    2013-03-01

    In recent years, a growing number of researchers have proposed that analogy is a core component of human cognition. According to the dominant theoretical viewpoint, analogical reasoning requires a specific suite of cognitive machinery, including explicitly coded symbolic representations and a mapping or binding mechanism that operates over these representations. Here we offer an alternative approach: we find that analogical inference can emerge naturally and spontaneously from a relatively simple, error-driven learning mechanism without the need to posit any additional analogy-specific machinery. The results also parallel findings from the developmental literature on analogy, demonstrating a shift from an initial reliance on surface feature similarity to the use of relational similarity later in training. Variants of the model allow us to consider and rule out alternative accounts of its performance. We conclude by discussing how these findings can potentially refine our understanding of the processes that are required to perform analogical inference.

  13. Inferring genetic interactions from comparative fitness data.

    Science.gov (United States)

    Crona, Kristina; Gavryushkin, Alex; Greene, Devin; Beerenwinkel, Niko

    2017-12-20

    Darwinian fitness is a central concept in evolutionary biology. In practice, however, it is hardly possible to measure fitness for all genotypes in a natural population. Here, we present quantitative tools to make inferences about epistatic gene interactions when the fitness landscape is only incompletely determined due to imprecise measurements or missing observations. We demonstrate that genetic interactions can often be inferred from fitness rank orders, where all genotypes are ordered according to fitness, and even from partial fitness orders. We provide a complete characterization of rank orders that imply higher order epistasis. Our theory applies to all common types of gene interactions and facilitates comprehensive investigations of diverse genetic interactions. We analyzed various genetic systems comprising HIV-1, the malaria-causing parasite Plasmodium vivax, the fungus Aspergillus niger, and the TEM-family of β-lactamase associated with antibiotic resistance. For all systems, our approach revealed higher order interactions among mutations.
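
    For the simplest two-locus case, the epistasis check behind such rank-order arguments can be made concrete. A small sketch with hypothetical fitness values (not the paper's datasets):

```python
# Two-locus epistasis from measured fitness values (illustrative numbers).
# For genotypes 00, 01, 10, 11, an additive (non-epistatic) landscape requires
# w11 - w10 - w01 + w00 = 0; a nonzero value signals gene interaction.
w = {"00": 1.00, "01": 1.20, "10": 1.15, "11": 1.10}   # hypothetical fitnesses

epistasis = w["11"] - w["10"] - w["01"] + w["00"]
print("epistasis =", round(epistasis, 3))   # -0.25: negative (antagonistic)

# A rank-order check: if each single mutant is fitter than both the wild type
# and the double mutant, no additive model can produce that ranking, so
# epistasis can be inferred from the rank order alone.
ranks_imply_epistasis = min(w["01"], w["10"]) > max(w["00"], w["11"])
print("rank order alone implies epistasis:", ranks_imply_epistasis)   # True
```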

  14. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.
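
    A simplified sketch of the reinforcement idea, not the authors' exact formulation: estimate a Gaussian mean where each observation's likelihood may be boosted by a non-negative reinforcement at a regularisation cost, so outliers "buy" reinforcement instead of dragging the estimate.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Estimate mu under a fixed-scale Gaussian, with a pointwise reinforcement
# r_i >= 0 added to each observation's likelihood and penalised by lam*r_i.
data = np.array([0.1, -0.3, 0.2, 0.05, 8.0])   # last point is an outlier
lam = 2.0

def objective(params):
    mu, r = params[0], params[1:]
    return -np.sum(np.log(norm.pdf(data, loc=mu) + r)) + lam * np.sum(r)

x0 = np.concatenate([[data.mean()], np.zeros(data.size)])
bounds = [(None, None)] + [(0.0, None)] * data.size
res = minimize(objective, x0, bounds=bounds, method="L-BFGS-B")

print("plain mean :", data.mean())          # ~1.61, dragged by the outlier
print("robust mean:", round(res.x[0], 3))   # close to the inlier cluster
print("reinforcements:", np.round(res.x[1:], 3))   # largest for the outlier
```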

  15. Statistical inference from imperfect photon detection

    International Nuclear Information System (INIS)

    Audenaert, Koenraad M R; Scheel, Stefan

    2009-01-01

    We consider the statistical properties of photon detection with imperfect detectors that exhibit dark counts and less than unit efficiency, in the context of tomographic reconstruction. In this context, the detectors are used to implement certain positive operator-valued measures (POVMs) that would allow us to reconstruct the quantum state or quantum process under consideration. Here we look at the intermediate step of inferring outcome probabilities from measured outcome frequencies, and show how this inference can be performed in a statistically sound way in the presence of detector imperfections. Merging outcome probabilities for different sets of POVMs into a consistent quantum state picture has been treated elsewhere (Audenaert and Scheel 2009 New J. Phys. 11 023028). Single-photon pulsed measurements as well as continuous wave measurements are covered.
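
    The forward relation for an on/off detector makes the inference step concrete: with efficiency eta and dark-count probability d, P(click | n photons) = 1 - (1 - d)(1 - eta)^n. A sketch of the resulting inversion for a source emitting at most one photon; the numbers are illustrative, not the paper's worked example.

```python
# On/off detector model: for a source emitting one photon with probability p1
# (vacuum otherwise), the click probability is
#   p_click = d + (1 - d) * eta * p1,
# which inverts to an estimator for p1 from the observed click frequency.
def infer_p1(click_freq, eta, d):
    p1 = (click_freq - d) / ((1.0 - d) * eta)
    return min(max(p1, 0.0), 1.0)   # clip: finite-sample frequencies can stray

eta, d, true_p1 = 0.6, 0.01, 0.5
p_click = d + (1 - d) * eta * true_p1   # exact expected click frequency
print(infer_p1(p_click, eta, d))        # recovers 0.5
```

    With finitely many trials the observed frequency fluctuates around p_click, which is why a statistically sound treatment, rather than naive inversion, matters near the boundaries of [0, 1].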

  16. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Reddy, Vikas; Farr, Anna Charisse; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K D V

    2014-01-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard which provides an additional layer of abstraction, enabling end-users to easily perform inferences over Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on their cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.

  17. Landsat: building a strong future

    Science.gov (United States)

    Loveland, Thomas R.; Dwyer, John L.

    2012-01-01

    Conceived in the 1960s, the Landsat program has seen six successful missions that have contributed to an unprecedented 39-year record of Earth observations capturing global land conditions and dynamics. Incremental improvements in imaging capabilities continue to improve the quality of Landsat science data, while ensuring continuity over the full instrument record. Landsats 5 and 7 are still collecting imagery. The planned launch of the Landsat Data Continuity Mission in December 2012 potentially extends the Landsat record to nearly 50 years. The U.S. Geological Survey (USGS) Landsat archive contains nearly three million Landsat images. All USGS Landsat data are available at no cost via the Internet. The USGS is committed to improving the content of the historical Landsat archive through the consolidation of Landsat data held in international archives. In addition, the USGS is working on a strategy to develop higher-level Landsat geo- and biophysical datasets. Finally, Federal efforts are underway to transition Landsat into a sustained operational program within the Department of the Interior and to authorize the development of the next two satellites, Landsats 9 and 10.

  18. Inferring Genetic Ancestry: Opportunities, Challenges, and Implications

    OpenAIRE

    Royal, Charmaine D.; Novembre, John; Fullerton, Stephanie M.; Goldstein, David B.; Long, Jeffrey C.; Bamshad, Michael J.; Clark, Andrew G.

    2010-01-01

    Increasing public interest in direct-to-consumer (DTC) genetic ancestry testing has been accompanied by growing concern about issues ranging from the personal and societal implications of the testing to the scientific validity of ancestry inference. The very concept of “ancestry” is subject to misunderstanding in both the general and scientific communities. What do we mean by ancestry? How exactly is ancestry measured? How far back can such ancestry be defined and by which genetic tools? How ...

  19. Inferring ontology graph structures using OWL reasoning.

    Science.gov (United States)

    Rodríguez-García, Miguel Ángel; Hoehndorf, Robert

    2018-01-05

    Ontologies are representations of a conceptualization of a domain. Traditionally, ontologies in biology were represented as directed acyclic graphs (DAG) which represent the backbone taxonomy and additional relations between classes. These graphs are widely exploited for data analysis in the form of ontology enrichment or computation of semantic similarity. More recently, ontologies are developed in a formal language such as the Web Ontology Language (OWL) and consist of a set of axioms through which classes are defined or constrained. While the taxonomy of an ontology can be inferred directly from the axioms of an ontology as one of the standard OWL reasoning tasks, creating general graph structures from OWL ontologies that exploit the ontologies' semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies, we can identify relations between classes that are implied but not asserted and generate graph structures that encode for a large part of the ontologies' semantic content. We demonstrate the advantages of our method by applying it to inference of protein-protein interactions through semantic similarity over the Gene Ontology and demonstrate that performance is increased when graph structures are inferred using deductive inference according to our method. Our software and experiment results are available at http://github.com/bio-ontology-research-group/Onto2Graph . Onto2Graph is a method to generate graph structures from OWL ontologies using automated reasoning. The resulting graphs can be used for improved ontology visualization and ontology-based data analysis.
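
    The step from asserted axioms to implied-but-not-asserted relations can be illustrated on the taxonomy fragment alone (a simplification: Onto2Graph itself runs an OWL reasoner over arbitrary axioms, not just is-a edges). A sketch with invented assertions:

```python
import networkx as nx

# Starting from asserted subclass (is-a) edges, deduce relations that are
# implied but not asserted, and emit them as extra graph edges.
asserted = [("ion transport", "transport"),
            ("cation transport", "ion transport"),
            ("calcium ion transport", "cation transport")]

g = nx.DiGraph(asserted)
closure = nx.transitive_closure(g)      # deductive closure of the is-a edges

implied = sorted(set(closure.edges()) - set(g.edges()))
for child, parent in implied:
    print(f"implied but not asserted: {child} is-a {parent}")
```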

  20. Context recognition for a hyperintensional inference machine

    Science.gov (United States)

    Duží, Marie; Fait, Michal; Menšík, Marek

    2017-07-01

    The goal of this paper is to introduce the algorithm of context recognition in the functional programming language TIL-Script, which is a necessary condition for the implementation of the TIL-Script inference machine. The TIL-Script language is an operationally isomorphic syntactic variant of Tichý's Transparent Intensional Logic (TIL). From the formal point of view, TIL is a hyperintensional, partial, typed λ-calculus with procedural semantics. Hyperintensional, because TIL λ-terms denote procedures (defined as TIL constructions) producing set-theoretic functions rather than the functions themselves; partial, because TIL is a logic of partial functions; and typed, because all the entities of TIL ontology, including constructions, receive a type within a ramified hierarchy of types. These features make it possible to distinguish three levels of abstraction at which TIL constructions operate. At the highest hyperintensional level the object to operate on is a construction (though a higher-order construction is needed to present this lower-order construction as an object of predication). At the middle intensional level the object to operate on is the function presented, or constructed, by a construction, while at the lowest extensional level the object to operate on is the value (if any) of the presented function. Thus a necessary condition for the development of an inference machine for the TIL-Script language is recognizing a context in which a construction occurs, namely extensional, intensional and hyperintensional context, in order to determine the type of an argument at which a given inference rule can be properly applied. As a result, our logic does not flout logical rules of extensional logic, which makes it possible to develop a hyperintensional inference machine for the TIL-Script language.

  1. Inferring ontology graph structures using OWL reasoning

    KAUST Repository

    Rodriguez-Garcia, Miguel Angel

    2018-01-05

    Ontologies are representations of a conceptualization of a domain. Traditionally, ontologies in biology were represented as directed acyclic graphs (DAG) which represent the backbone taxonomy and additional relations between classes. These graphs are widely exploited for data analysis in the form of ontology enrichment or computation of semantic similarity. More recently, ontologies are developed in a formal language such as the Web Ontology Language (OWL) and consist of a set of axioms through which classes are defined or constrained. While the taxonomy of an ontology can be inferred directly from the axioms of an ontology as one of the standard OWL reasoning tasks, creating general graph structures from OWL ontologies that exploit the ontologies' semantic content remains a challenge. We developed a method to transform ontologies into graphs using an automated reasoner while taking into account all relations between classes. Searching for (existential) patterns in the deductive closure of ontologies, we can identify relations between classes that are implied but not asserted and generate graph structures that encode for a large part of the ontologies' semantic content. We demonstrate the advantages of our method by applying it to inference of protein-protein interactions through semantic similarity over the Gene Ontology and demonstrate that performance is increased when graph structures are inferred using deductive inference according to our method. Our software and experiment results are available at http://github.com/bio-ontology-research-group/Onto2Graph . Onto2Graph is a method to generate graph structures from OWL ontologies using automated reasoning. The resulting graphs can be used for improved ontology visualization and ontology-based data analysis.

  2. Thermodynamics of statistical inference by cells.

    Science.gov (United States)

    Lang, Alex H; Fisher, Charles K; Mora, Thierry; Mehta, Pankaj

    2014-10-03

    The deep connection between thermodynamics, computation, and information is now well established both theoretically and experimentally. Here, we extend these ideas to show that thermodynamics also places fundamental constraints on statistical estimation and learning. To do so, we investigate the constraints placed by (nonequilibrium) thermodynamics on the ability of biochemical signaling networks to estimate the concentration of an external signal. We show that accuracy is limited by energy consumption, suggesting that there are fundamental thermodynamic constraints on statistical inference.

  3. Constrained bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability, such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information, including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  4. Racing algorithms for conditional independence inference

    Czech Academy of Sciences Publication Activity Database

    Bouckaert, R. R.; Studený, Milan

    2007-01-01

    Roč. 45, č. 2 (2007), s. 386-401 ISSN 0888-613X R&D Projects: GA ČR GA201/04/0393 Institutional research plan: CEZ:AV0Z10750506 Keywords: conditional independence * inference * imset * algorithm Subject RIV: BA - General Mathematics Impact factor: 1.220, year: 2007 http://library.utia.cas.cz/separaty/2007/mtr/studeny-0083472.pdf

  5. Controlling Selection Bias in Causal Inference

    Science.gov (United States)

    2012-02-01

    …and cervix (Journal of the National Cancer Institute 11, 1269–1275); Didelez, V., Kreiner, S. and Keiding, N. (2010), Graphical models for inference… The effect of Oestrogen on Endometrial Cancer (Y) was overestimated in the data studied. One of the symptoms of the use of Oestrogen is vaginal bleeding (W) (Fig. 1(c)), and the question is whether similar bounds can be derived in the presence of selection bias. We will show that selection bias can be removed entirely through the use of…

  6. Strong Ideal Convergence in Probabilistic Metric Spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  7. Strong ideal convergence in probabilistic metric spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  8. Bootstrap inference when using multiple imputation.

    Science.gov (United States)

    Schomaker, Michael; Heumann, Christian

    2018-04-16

    Many modern estimators require bootstrapping to calculate confidence intervals because either no analytic standard error is available or the distribution of the parameter of interest is nonsymmetric. It remains however unclear how to obtain valid bootstrap inference when dealing with multiple imputation to address missing data. We present 4 methods that are intuitively appealing, easy to implement, and combine bootstrap estimation with multiple imputation. We show that 3 of the 4 approaches yield valid inference, but that the performance of the methods varies with respect to the number of imputed data sets and the extent of missingness. Simulation studies reveal the behavior of our approaches in finite samples. A topical analysis from HIV treatment research, which determines the optimal timing of antiretroviral treatment initiation in young children, demonstrates the practical implications of the 4 methods in a sophisticated and realistic setting. This analysis suffers from missing data and uses the g-formula for inference, a method for which no standard errors are available. Copyright © 2018 John Wiley & Sons, Ltd.
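
    One intuitively appealing combination, resampling first and then imputing within each bootstrap sample, is easy to sketch. A toy version for a mean with values missing at random follows; the imputation model and all numbers are illustrative, not the paper's g-formula analysis.

```python
import numpy as np

# "Bootstrap, then impute": resample the incomplete data, impute each
# bootstrap sample M times, average the M estimates, and read the confidence
# interval off the bootstrap distribution of those averages.
rng = np.random.default_rng(42)
x = rng.normal(loc=5.0, scale=2.0, size=200)
x[rng.random(200) < 0.2] = np.nan           # ~20% of values missing

def impute_once(sample, rng):
    obs = sample[~np.isnan(sample)]
    out = sample.copy()
    # crude imputation model: draw from a normal fitted to the observed values
    out[np.isnan(sample)] = rng.normal(obs.mean(), obs.std(ddof=1),
                                       np.isnan(sample).sum())
    return out

B, M, estimates = 1000, 5, []
for _ in range(B):
    boot = rng.choice(x, size=x.size, replace=True)
    estimates.append(np.mean([impute_once(boot, rng).mean() for _ in range(M)]))

lo, hi = np.percentile(estimates, [2.5, 97.5])
print(f"mean ~ {np.mean(estimates):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```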

  9. Inferring gene ontologies from pairwise similarity data

    Science.gov (United States)

    Kramer, Michael; Dutkowski, Janusz; Yu, Michael; Bafna, Vineet; Ideker, Trey

    2014-01-01

    Motivation: While the manually curated Gene Ontology (GO) is widely used, inferring a GO directly from -omics data is a compelling new problem. Recognizing that ontologies are a directed acyclic graph (DAG) of terms and hierarchical relations, algorithms are needed that: (i) analyze a full matrix of gene–gene pairwise similarities from -omics data; (ii) infer true hierarchical structure in these data rather than enforcing hierarchy as a computational artifact; and (iii) respect biological pleiotropy, by which a term in the hierarchy can relate to multiple higher-level terms. Methods addressing these requirements are just beginning to emerge; none has been evaluated for GO inference. Methods: We consider two algorithms [Clique Extracted Ontology (CliXO), LocalFitness] that uniquely satisfy these requirements, compared with methods including standard clustering. CliXO is a new approach that finds maximal cliques in a network induced by progressive thresholding of a similarity matrix. We evaluate each method's ability to reconstruct the GO biological process ontology from a similarity matrix based on (a) semantic similarities for GO itself or (b) three -omics datasets for yeast. Results: For task (a) using semantic similarity, CliXO accurately reconstructs GO (>99% precision, recall) and outperforms the other approaches; for task (b) it recovers the ontology better than LocalFitness or standard clustering (20–25% precision, recall). Conclusion: This study provides an algorithmic foundation for building gene ontologies by capturing hierarchical and pleiotropic structure embedded in biomolecular data. Contact: tideker@ucsd.edu PMID:24932003
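
    The core CliXO loop, progressively thresholding the similarity matrix and reading off maximal cliques as candidate terms, can be sketched in a few lines (greatly simplified relative to the published algorithm; the genes and similarities are invented):

```python
import networkx as nx

# Sweep a similarity threshold from high to low; at each level the maximal
# cliques of the thresholded graph become candidate terms, with lower
# thresholds yielding broader terms higher in the hierarchy.
genes = ["g1", "g2", "g3", "g4"]
sim = {("g1", "g2"): 0.9, ("g1", "g3"): 0.8, ("g2", "g3"): 0.85,
       ("g1", "g4"): 0.3, ("g2", "g4"): 0.25, ("g3", "g4"): 0.35}

for threshold in (0.8, 0.3):
    g = nx.Graph((u, v) for (u, v), s in sim.items() if s >= threshold)
    g.add_nodes_from(genes)
    terms = sorted(sorted(c) for c in nx.find_cliques(g) if len(c) > 1)
    print(f"threshold {threshold}: candidate terms {terms}")
# threshold 0.8 -> [['g1', 'g2', 'g3']]; threshold 0.3 -> g4 is now linked in,
# producing an additional, broader term that contains it.
```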

  10. Inferring epidemic network topology from surveillance data.

    Directory of Open Access Journals (Sweden)

    Xiang Wan

    Full Text Available The transmission of infectious diseases can be affected by many or even hidden factors, making it difficult to accurately predict when and where outbreaks may emerge. One approach at the moment is to develop and deploy surveillance systems in an effort to detect outbreaks as early as possible. This enables policy makers to modify and implement strategies for the control of transmission. The accumulated surveillance data, including temporal, spatial, clinical, and demographic information, can provide valuable information with which to infer the underlying epidemic networks. Such networks can be quite informative and insightful as they characterize how infectious diseases transmit from one location to another. The aim of this work is to develop a computational model that allows inferences to be made regarding epidemic network topology in heterogeneous populations. We apply our model to the surveillance data from the 2009 H1N1 pandemic in Hong Kong. The inferred epidemic network displays a significant effect on the propagation of infectious diseases.

  11. Effects of low doses: Proof and inferences

    International Nuclear Information System (INIS)

    Hubert, Ph.

    2010-01-01

    It is essential to discuss the plausibility of 'low-dose' effects from environmental exposures. The question, nonetheless, is wrongly labelled, for it is not the magnitude of the dose that matters, but rather the effect. The question thus concerns 'doses with low effects'. More precisely, because the low effects on large populations are not that small, even when epidemiological tools fail to detect them, it would be more accurate to talk about 'doses with undetectable or barely detectable effects'. Hereafter, we describe this 'low-effect dose' concept from the viewpoint of toxicology and epidemiology and discuss the fragile boundary line for these low-effect doses. Next, we review the different types of inference from observed situations (i.e., with high effects) to situations relevant to public health, to characterize the level of confidence to be accorded them. The first type is extrapolation - from higher to lower doses or from higher to lower dose rates. The second type is transposition - from humans to other humans or from animals to humans. The third type can be called 'analogy' as in 'read across' approaches, where QSAR (Quantitative Structure Activity Relationship) methodology can be used. These three types of inferences can be based on an estimate of the 'distance' between observed and predicted areas, but can also rely on knowledge and theories of the relevant mechanisms. The new tools of predictive toxicology are helpful both in deriving quantitative estimates and grounding inferences on sound bases. (author)

  12. Intuitive Mechanics: Inferences of Vertical Projectile Motion

    Directory of Open Access Journals (Sweden)

    Milana Damjenić

    2016-07-01

    Full Text Available Our intuitive knowledge of mechanics, i.e. knowledge defined through personal experience of velocity, acceleration, the causes of motion, etc., is often wrong. This research examined whether such misconceptions occur systematically in the case of vertical projectiles launched upwards. The first experiment examined inferences about the velocity and acceleration of a ball moving vertically upwards, while the second experiment examined whether the mass of the thrown ball and the force of the throw have an impact on those inferences. The results showed that more than three quarters of the participants wrongly assumed that maximum velocity and peak acceleration did not occur at the initial launch of the projectile. There was no effect of object mass or of the force of the throw on inferences about the velocity and acceleration of the ball. The results exceed the explanatory reach of the impetus theory, most commonly used to explain naive understandings of the mechanics of object motion. This research supports the view that the actions-on-objects approach and the property transmission heuristic may more aptly explain the discrepancy between perceived and actual behaviour in projectile motion.

  13. Markov Logic Based Inference Engine for CDSS

    International Nuclear Information System (INIS)

    Bajwa, I.S.; Ramzan, B.; Ramzan, S.

    2017-01-01

    CDSS (Clinical Decision Support System) is typically a diagnostic application and a modern technology that can be employed to provide standardized, quality medical facilities to patients, especially when expert doctors are not available at medical centres. These days the use of CDSSs is quite common in medical practice in remote areas. A CDSS can be very helpful not only in preventive health care but also in computerized diagnosis. However, a typical problem of CDSS-based diagnosis is uncertainty: an ambiguity can occur when a patient is not able to explain the symptoms of his disease clearly. The forward-chaining mechanisms typically used in rule-based decision support systems must nevertheless reason over such uncertain data. ML (Markov Logic) is a technique that can deal with uncertainty in data by integrating FOL (First-Order Logic) with probabilistic graphical models. In this paper, we propose the architecture of an ML-based inference engine for a rule-based CDSS, and we present an algorithm for using an ML-based forward-chaining mechanism in the proposed inference engine. The results of the experiments show that the proposed inference engine is intelligent enough to diagnose a patient's disease even from uncertain or incomplete/partial information. (author)
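
    The weighted-formula semantics of Markov Logic can be illustrated by brute-force enumeration over a tiny grounded network (an illustration of the formalism, not the paper's engine; the rules, weights and atoms are invented):

```python
import itertools
import math

# Two ground atoms, Flu and Fever, and two weighted formulas. P(world) is
# proportional to exp(sum of weights of the formulas satisfied in it), so
# harder rules get larger weights but can still be violated.
w_rule = 1.5    # Flu -> Fever   (soft diagnostic rule)
w_prior = 0.4   # Flu            (weak prior that the patient has flu)

def world_weight(flu, fever):
    total = 0.0
    if (not flu) or fever:          # Flu -> Fever is satisfied
        total += w_rule
    if flu:                         # Flu is satisfied
        total += w_prior
    return math.exp(total)

worlds = list(itertools.product([False, True], repeat=2))
Z = sum(world_weight(flu, fever) for flu, fever in worlds)

# Query: P(Flu | Fever) by summing over worlds consistent with the evidence.
p_joint = world_weight(True, True) / Z
p_evidence = (world_weight(False, True) + world_weight(True, True)) / Z
print("P(Flu | Fever) =", round(p_joint / p_evidence, 3))   # ~0.599
```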

  14. Role of Speaker Cues in Attention Inference

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2017-10-01

    Full Text Available Current state-of-the-art approaches to emotion recognition primarily focus on modeling the nonverbal expressions of a sole individual, without reference to contextual elements such as the co-presence of a partner. In this paper, we demonstrate that accurate inference of listeners' social-emotional state of attention depends on accounting for the nonverbal behaviors of their storytelling partner, namely their speaker cues. To gain a deeper understanding of the role of speaker cues in attention inference, we investigate real-world interactions of children (5–6 years old) storytelling with their peers. Through in-depth analysis of human-human interaction data, we first identify nonverbal speaker cues (i.e., backchannel-inviting cues) and listener responses (i.e., backchannel feedback). We then demonstrate how speaker cues can modify the interpretation of attention-related backchannels as well as serve as a means to regulate the responsiveness of listeners. We discuss the design implications of our findings for our primary goal of developing attention recognition models for storytelling robots, and we argue that social robots can proactively use speaker cues to form more accurate inferences about the attentive state of their human partners.

  15. Efficient Bayesian inference under the structured coalescent.

    Science.gov (United States)

    Vaughan, Timothy G; Kühnert, Denise; Popinga, Alex; Welch, David; Drummond, Alexei J

    2014-08-15

    Population structure significantly affects evolutionary dynamics. Such structure may be due to spatial segregation, but may also reflect any other gene-flow-limiting aspect of a model. In combination with the structured coalescent, this fact can be used to inform phylogenetic tree reconstruction, as well as to infer parameters such as migration rates and subpopulation sizes from annotated sequence data. However, conducting Bayesian inference under the structured coalescent is impeded by the difficulty of constructing Markov Chain Monte Carlo (MCMC) sampling algorithms (samplers) capable of efficiently exploring the state space. In this article, we present a new MCMC sampler capable of sampling from posterior distributions over structured trees: timed phylogenetic trees in which lineages are associated with the distinct subpopulation in which they lie. The sampler includes a set of MCMC proposal functions that offer significant mixing improvements over a previously published method. Furthermore, its implementation as a BEAST 2 package ensures maximum flexibility with respect to model and prior specification. We demonstrate the usefulness of this new sampler by using it to infer migration rates and effective population sizes of H3N2 influenza between New Zealand, New York and Hong Kong from publicly available hemagglutinin (HA) gene sequences under the structured coalescent. The sampler has been implemented as a publicly available BEAST 2 package that is distributed under version 3 of the GNU General Public License at http://compevol.github.io/MultiTypeTree. © The Author 2014. Published by Oxford University Press.

  16. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Journal index fragments: experimental results and performance measures indicate that NSGA-II produces better-quality schedules than NSPSO (Volume 40, Issue 1, February 2015, pp. 107-119, Electrical and Computer Sciences); a fuzzy delay model based fault simulator for crosstalk delay fault test generation in asynchronous sequential circuits.

  17. Remnants of strong tidal interactions

    International Nuclear Information System (INIS)

    Mcglynn, T.A.

    1990-01-01

    This paper examines the properties of stellar systems that have recently undergone a strong tidal shock, i.e., a shock which removes a significant fraction of the particles in the system, and where the shocked system has a much smaller mass than the producer of the tidal field. N-body calculations of King models shocked in a variety of ways are performed, and the consequences of the shocks are investigated. The results confirm the prediction of Jaffe for shocked systems. Several models are also run where the tidal forces on the system are constant, simulating a circular orbit around a primary, and the development of tidal radii under these static conditions appears to be a mild process which does not dramatically affect material that is not stripped. The tidal radii are about twice as large as classical formulas would predict. Remnant density profiles are compared with a sample of elliptical galaxies, and the implications of the results for the development of stellar populations and galaxies are considered. 38 refs

  18. John Strong - 1941-2006

    CERN Document Server

    2006-01-01

    Our friend and colleague John Strong was cruelly taken from us by a brain tumour on 31 July, a few days before his 65th birthday. John started his career and obtained his PhD in a group from Westfield College, initially working on experiments at Rutherford Appleton Laboratory (RAL). From the early 1970s onwards, however, his research was focused on experiments in CERN, with several particularly notable contributions. The Omega spectrometer adopted a system John had originally developed for experiments at RAL using vidicon cameras (a type of television camera) to record the sparks in the spark chambers. This highly automated system allowed Omega to be used in a similar way to bubble chambers. He contributed to the success of NA1 and NA7, where he became heavily involved in the electronic trigger systems. In these experiments the Westfield group joined forces with Italian colleagues to measure the form factors of the pion and the kaon, and the lifetime of some of the newly discovered charm particles. Such h...

  19. Strong seismic ground motion propagation

    International Nuclear Information System (INIS)

    Seale, S.; Archuleta, R.; Pecker, A.; Bouchon, M.; Mohammadioun, G.; Murphy, A.; Mohammadioun, B.

    1988-10-01

    At the McGee Creek, California, site, 3-component strong-motion accelerometers are located at depths of 166 m, 35 m and 0 m. The surface material is glacial moraine, to a depth of 30.5 m, overlying hornfels. Accelerations were recorded from two California earthquakes: Round Valley, ML 5.8, November 23, 1984, 18:08 UTC, and Chalfant Valley, ML 6.4, July 21, 1986, 14:42 UTC. By separating out the SH components of acceleration, we were able to determine the orientations of the downhole instruments. By separating out the SV component of acceleration, we were able to determine the approximate angle of incidence of the signal at 166 m. A constant-phase-velocity Haskell-Thomson model was applied to generate synthetic SH seismograms at the surface using the accelerations recorded at 166 m. In the frequency band 0.0-10.0 Hz, we compared the filtered synthetic records to the filtered surface data. The onset of the SH pulse is clearly seen, as are the reflections from the interface at 30.5 m. The synthetic record closely matches the data in amplitude and phase. The fit between the synthetic accelerogram and the data shows that the seismic amplification at the surface is a result of the contrast of the impedances (shear stiffnesses) of the near-surface materials.
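
    The impedance-contrast amplification invoked in the last sentence has a simple first-order estimate: amplitude grows roughly as the square root of the impedance ratio between bedrock and the surface layer (from conservation of energy flux, ignoring damping and resonance). A sketch with illustrative material values, not the McGee Creek measurements:

```python
import math

# First-order site amplification from the impedance contrast between bedrock
# and the near-surface layer: A ~ sqrt(rho_r * v_r / (rho_s * v_s)).
rho_rock, v_rock = 2600.0, 2800.0   # kg/m^3, m/s (hornfels-like bedrock)
rho_soil, v_soil = 1900.0, 400.0    # glacial-moraine-like surface layer

amplification = math.sqrt((rho_rock * v_rock) / (rho_soil * v_soil))
print(f"impedance-contrast amplification ~ {amplification:.1f}x")   # ~3.1x
```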

  20. Strong curvature effects in Neumann wave problems

    International Nuclear Information System (INIS)

    Willatzen, M.; Pors, A.; Gravesen, J.

    2012-01-01

    Waveguide phenomena play a major role in basic sciences and engineering. The Helmholtz equation is the governing equation for the electric field in electromagnetic wave propagation and for the acoustic pressure in the study of pressure dynamics. The Schrödinger equation simplifies to the Helmholtz equation for a quantum-mechanical particle confined by infinite barriers, relevant in semiconductor physics. With this in mind, and given the interest in tailoring waveguides towards a desired spectrum and modal pattern in classical structures and nanostructures, it becomes increasingly important to understand the influence of curvature effects in waveguides. In this work, we demonstrate analytically strong curvature effects on the eigenvalue spectrum of the Helmholtz equation with Neumann boundary conditions in cases where the waveguide cross section is a circular sector. It is found that the linear-in-curvature contribution originates from parity symmetry breaking of eigenstates in circular-sector tori and hence vanishes in a torus with a complete circular cross section. The same strong curvature effect is not present in waveguides subject to Dirichlet boundary conditions, where curvature enters only at second order. We demonstrate this finding by considering wave propagation in a circular-sector torus with Neumann and Dirichlet boundary conditions, respectively. Results for relative eigenfrequency shifts and modes are determined and compared with three-dimensional finite element method results. Good agreement is found between the present analytical method, which combines differential geometry with perturbation theory, and the finite element results over a large range of curvature ratios.

  1. Bootstrapping phylogenies inferred from rearrangement data

    Directory of Open Access Journals (Sweden)

    Lin Yu

    2012-08-01

    Full Text Available Abstract Background Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. Results We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its

  2. Bootstrapping phylogenies inferred from rearrangement data.

    Science.gov (United States)

    Lin, Yu; Rajan, Vaibhav; Moret, Bernard Me

    2012-08-29

    Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its support values follow a similar scale and its receiver
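
    For contrast with the rearrangement setting, the classic sequence bootstrap that the authors use as their standard of comparison can be sketched in a few lines: resample alignment columns with replacement and count how often a grouping of interest recurs. The toy alignment, the mutual-nearest-neighbour grouping criterion, and all parameter values below are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def hamming(a, b):
    return np.mean(a != b)

def mutual_nearest(seqs, i, j):
    """True if taxa i and j are each other's closest neighbours."""
    n = len(seqs)
    d = np.array([[hamming(seqs[p], seqs[q]) for q in range(n)] for p in range(n)])
    np.fill_diagonal(d, np.inf)
    return d[i].argmin() == j and d[j].argmin() == i

# Toy alignment: 4 taxa x 200 sites over {0,1,2,3}; taxa 0 and 1 are related.
base = rng.integers(0, 4, 200)
seqs = np.array([np.where(rng.random(200) < m, rng.integers(0, 4, 200), base)
                 for m in (0.05, 0.10, 0.45, 0.50)])

# Classic bootstrap: resample alignment columns with replacement and ask how
# often the grouping of taxa 0 and 1 is recovered across replicates.
B = 200
support = np.mean([mutual_nearest(seqs[:, rng.integers(0, 200, 200)], 0, 1)
                   for _ in range(B)])
print(f"bootstrap support for (taxon0, taxon1): {support:.2f}")
```

    The difficulty the record identifies is precisely that this column-resampling step has no analogue when the whole genome arrangement is a single character.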

  3. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
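
    A minimal sketch of the SPC reasoning the paper contrasts with hypothesis testing: estimate control limits from in-control history, then flag subgroups that fall outside them. The sketch uses a simplified 3-sigma rule on subgroup means (textbook charts usually estimate sigma from subgroup ranges via the d2 constant), and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical in-control history: 25 subgroups of 5 measurements each.
history = rng.normal(loc=10.0, scale=0.2, size=(25, 5))
xbar = history.mean(axis=1)

center = xbar.mean()
sigma_xbar = xbar.std(ddof=1)
ucl, lcl = center + 3 * sigma_xbar, center - 3 * sigma_xbar  # 3-sigma limits

# Monitor a new subgroup: a point outside the limits signals a special cause.
new_subgroup = rng.normal(loc=10.5, scale=0.2, size=5)       # shifted process
m = new_subgroup.mean()
print(f"subgroup mean {m:.3f}, limits [{lcl:.3f}, {ucl:.3f}]",
      "-> out of control" if not (lcl <= m <= ucl) else "-> in control")
```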

  4. Type Inference for Session Types in the Pi-Calculus

    DEFF Research Database (Denmark)

    Graversen, Eva Fajstrup; Harbo, Jacob Buchreitz; Huttel, Hans

    2014-01-01

    In this paper we present a direct algorithm for session type inference for the π-calculus. Type inference for session types has previously been achieved by either imposing limitations and restriction on the π-calculus, or by reducing the type inference problem to that for linear types. Our approach...

  5. Reasoning about Informal Statistical Inference: One Statistician's View

    Science.gov (United States)

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  6. Strongly interacting photons and atoms

    International Nuclear Information System (INIS)

    Alge, W.

    1999-05-01

    This thesis contains the main results of the research topics I have pursued during my PhD studies at the University of Innsbruck, partly in collaboration with the Institut d'Optique in Orsay, France. It is divided into three parts. The first and largest part discusses the possibility of using strong standing waves as a tool to cool and trap neutral atoms in optical cavities. This is very important in the field of nonlinear optics, where several successful experiments with cold atoms in cavities have been performed recently. A discussion of the optical parametric oscillator in a regime where the nonlinearity dominates the evolution is the topic of the second part. We investigated mainly the statistical properties of the cavity output of the three interacting cavity modes. Very recently a system has been proposed which promises fantastic properties: it should exhibit a giant Kerr nonlinearity with negligible absorption, thus leading to a photonic turnstile device based on cold atoms in a cavity. We have shown that this model suffers from overly simplistic assumptions, and we developed several more comprehensive approaches to study the behavior of this system. Apart from the division into three parts of different content, the thesis is divided into publications, supplements and invisible stuff. The intention of the supplements is to reach researchers who work in related areas and provide them with more detailed information about the concepts and the numerical tools we used. They are written especially for diploma and PhD students, to give them a chance to use the third part of our work, which is actually the largest one: a large number of computer programs we wrote to investigate the behavior of the systems in parameter regions where no hope exists of solving the equations analytically. (author)

  7. Topics in strong Langmuir turbulence

    International Nuclear Information System (INIS)

    Skoric, M.M.

    1981-01-01

    This thesis discusses certain aspects of the turbulence of a fully ionised non-isothermal plasma dominated by the Langmuir mode. Some of the basic properties of strongly turbulent plasmas are reviewed. In particular, interest is focused on the state of Langmuir turbulence, that is, the turbulence of a simple externally unmagnetized plasma. The problem of the existence and dynamics of Langmuir collapse is discussed, often encountered as a non-linear stage of the modulational instability in the framework of the Zakharov equations (i.e. simple time-averaged dynamical equations). Possible macroscopic consequences of such dynamical turbulent models are investigated. In order to study highly non-linear collapse dynamics in its advanced stage, a set of generalized Zakharov equations is derived. Going beyond the original approximation, the author includes the effects of higher electron non-linearities and a breakdown of slow-timescale quasi-neutrality. He investigates how these corrections may influence the collapse stabilisation. Recently, it has been realised that the modulational instability in a Langmuir plasma will be accompanied by the collisionless generation of a slow-timescale magnetic field. Accordingly, a novel physical situation has emerged which is investigated in detail. The stability of monochromatic Langmuir waves in a self-magnetized Langmuir plasma is discussed, and the existence of a novel magneto-modulational instability is shown. The wave collapse dynamics is investigated and a physical interpretation of the basic results is given. Finally, the transient analysis of the interaction of time-dependent electromagnetic pulses with linear cold plasma media is investigated. (Auth.)
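
    For reference, the time-averaged dynamical equations the record refers to are the Zakharov system, given below in a common dimensionless form (the normalization is a standard convention and not necessarily that of the thesis): E is the slowly varying envelope of the Langmuir field and n the slow ion-density perturbation.

```latex
% Zakharov system in a common dimensionless normalization (assumed here).
\begin{align}
  i\,\partial_t E + \nabla^2 E &= n\,E, \\
  \partial_t^2 n - \nabla^2 n &= \nabla^2 |E|^2 .
\end{align}
```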

  8. Promoting Strong Written Communication Skills

    Science.gov (United States)

    Narayanan, M.

    2015-12-01

    The reason that an improvement in the quality of technical writing is still needed in the classroom is that universities are facing challenging problems not only on the technological front but also on the socio-economic front. The universities are actively responding to the changes that are taking place in the global consumer marketplace. Obviously, there are numerous benefits of promoting strong written communication skills. They can be summarized into the following six categories. First, and perhaps the most important: The University achieves learner satisfaction. The learner has documented, verbally, that the necessary knowledge has been successfully acquired. This results in learner loyalty that in turn will attract more qualified learners. Second, quality communication lowers the cost per pupil, consequently resulting in increased productivity backed by a stronger economic structure and forecast. Third, quality communications help to improve the cash flow and cash reserves of the university. Fourth, having high quality communication enables the university to justify the need for high costs of tuition and fees. Fifth, better quality in written communication skills results in attracting top-quality learners. This will lead to happier and satisfied learners, not to mention greater prosperity for the university as a whole. Sixth, quality written communication skills result in reduced complaints, thus meaning fewer hours spent on answering or correcting the situation. The University faculty and staff are thus able to devote more time to scholarly activities, meaningful research and productive community service. References: Boyer, Ernest L. (1990). Scholarship Reconsidered: Priorities of the Professoriate. Princeton, NJ: Carnegie Foundation for the Advancement of Teaching. Hawkins, P., & Winter, J. (1997). Mastering change: Learning the lessons of the enterprise. London: Department for Education and Employment. Buzzell, Robert D., and Bradley T. Gale. (1987

  9. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Science.gov (United States)

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

    People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences - from intentionality and desire to belief to personality - that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research. (c) 2012 APA, all rights reserved.

  10. Free will in Bayesian and inverse Bayesian inference-driven endo-consciousness.

    Science.gov (United States)

    Gunji, Yukio-Pegio; Minoura, Mai; Kojima, Kei; Horry, Yoichi

    2017-12-01

    How can we link challenging issues related to consciousness and/or qualia with natural science? The introduction of the endo-perspective, instead of the exo-perspective, as proposed by Matsuno, Rössler, and Gunji, is considered one of the most promising candidate approaches. Here, we distinguish the endo- from the exo-perspective in terms of whether the external is or is not directly operated. In the endo-perspective, the external can be neither perceived nor recognized directly; rather, one can only indirectly summon something outside of the perspective, which can be illustrated by a causation-reversal pair. On the one hand, causation logically proceeds from the cause to the effect. On the other hand, a reversal from the effect to the cause is non-logical and is equipped with a metaphorical structure. We argue that the differences in exo- and endo-perspectives result not from the difference between Western and Eastern cultures, but from differences between modernism and animism. Here, a causation-reversal pair is described using a pair of upward (from premise to consequence) and downward (from consequence to premise) causation, and a pair of Bayesian and inverse Bayesian inference (BIB inference). Accordingly, the notion of endo-consciousness is proposed as an agent equipped with BIB inference. We also argue that BIB inference can yield both highly efficient computations through Bayesian inference and robust computations through inverse Bayesian inference. By adapting a logical model of the free will theorem to BIB inference, we show that endo-consciousness can explain free will as a regression of the controllability of voluntary action. Copyright © 2017. Published by Elsevier Ltd.
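
    One schematic reading of the Bayesian/inverse-Bayesian pairing (our own toy construction, not the authors' formal definition) is that the Bayesian step contracts a distribution over hypotheses under a fixed likelihood model, while the inverse Bayesian step relaxes the model by re-estimating the likelihood itself from recent data.

```python
import numpy as np

rng = np.random.default_rng(2)

hypotheses = [0, 1]                      # two candidate environments
likelihood = np.array([[0.8, 0.2],       # P(d | h): rows h, columns data symbol
                       [0.3, 0.7]])
prior = np.array([0.5, 0.5])
recent = {h: [] for h in hypotheses}     # data attributed to each hypothesis

for t in range(50):
    d = int(rng.integers(0, 2))          # observed data symbol (toy stream)
    # Bayesian step: contract the distribution over hypotheses.
    post = likelihood[:, d] * prior
    post /= post.sum()
    # Inverse Bayesian step (schematic): attribute d to the dominant
    # hypothesis and nudge that hypothesis's likelihood toward the
    # empirical frequency of its recently attributed data.
    h = int(post.argmax())
    recent[h].append(d)
    freq1 = np.mean(recent[h][-10:])     # empirical P(d=1 | h), short memory
    likelihood[h] = 0.9 * likelihood[h] + 0.1 * np.array([1 - freq1, freq1])
    prior = post

print("final posterior:", prior, "\nfinal likelihood:\n", likelihood)
```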

  11. MIDER: network inference with mutual information distance and entropy reduction.

    Directory of Open Access Journals (Sweden)

    Alejandro F Villaverde

    Full Text Available The prediction of links among variables from a given dataset is a task referred to as network inference or reverse engineering. It is an open problem in bioinformatics and systems biology, as well as in other areas of science. Information theory, which uses concepts such as mutual information, provides a rigorous framework for addressing it. While a number of information-theoretic methods are already available, most of them focus on a particular type of problem, introducing assumptions that limit their generality. Furthermore, many of these methods lack a publicly available implementation. Here we present MIDER, a method for inferring network structures with information theoretic concepts. It consists of two steps: first, it provides a representation of the network in which the distance among nodes indicates their statistical closeness. Second, it refines the prediction of the existing links to distinguish between direct and indirect interactions and to assign directionality. The method accepts as input time-series data related to some quantitative features of the network nodes (such as e.g. concentrations, if the nodes are chemical species). It takes into account time delays between variables, and allows choosing among several definitions and normalizations of mutual information. It is general purpose: it may be applied to any type of network, cellular or otherwise. A Matlab implementation including source code and data is freely available (http://www.iim.csic.es/~gingproc/mider.html). The performance of MIDER has been evaluated on seven different benchmark problems that cover the main types of cellular networks, including metabolic, gene regulatory, and signaling. Comparisons with state of the art information-theoretic methods have demonstrated the competitive performance of MIDER, as well as its versatility. Its use does not demand any a priori knowledge from the user; the default settings and the adaptive nature of the method provide good

  12. MIDER: network inference with mutual information distance and entropy reduction.

    Science.gov (United States)

    Villaverde, Alejandro F; Ross, John; Morán, Federico; Banga, Julio R

    2014-01-01

    The prediction of links among variables from a given dataset is a task referred to as network inference or reverse engineering. It is an open problem in bioinformatics and systems biology, as well as in other areas of science. Information theory, which uses concepts such as mutual information, provides a rigorous framework for addressing it. While a number of information-theoretic methods are already available, most of them focus on a particular type of problem, introducing assumptions that limit their generality. Furthermore, many of these methods lack a publicly available implementation. Here we present MIDER, a method for inferring network structures with information theoretic concepts. It consists of two steps: first, it provides a representation of the network in which the distance among nodes indicates their statistical closeness. Second, it refines the prediction of the existing links to distinguish between direct and indirect interactions and to assign directionality. The method accepts as input time-series data related to some quantitative features of the network nodes (such as e.g. concentrations, if the nodes are chemical species). It takes into account time delays between variables, and allows choosing among several definitions and normalizations of mutual information. It is general purpose: it may be applied to any type of network, cellular or otherwise. A Matlab implementation including source code and data is freely available (http://www.iim.csic.es/~gingproc/mider.html). The performance of MIDER has been evaluated on seven different benchmark problems that cover the main types of cellular networks, including metabolic, gene regulatory, and signaling. Comparisons with state of the art information-theoretic methods have demonstrated the competitive performance of MIDER, as well as its versatility. Its use does not demand any a priori knowledge from the user; the default settings and the adaptive nature of the method provide good results for a wide
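
    The first, distance-layer step of a pipeline like MIDER's can be sketched with a plain histogram estimate of pairwise mutual information. MIDER itself is a Matlab toolbox with several MI definitions, time delays, and an entropy-reduction second step, none of which are reproduced here; the data and the edge threshold below are hypothetical.

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Histogram (plug-in) estimate of mutual information in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(3)
T = 2000
a = rng.normal(size=T)
b = 0.8 * a + 0.2 * rng.normal(size=T)   # b driven by a
c = rng.normal(size=T)                   # independent node
data = {"A": a, "B": b, "C": c}

names = list(data)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        mi = mutual_info(data[names[i]], data[names[j]])
        if mi > 0.05:                    # crude threshold -> candidate edge
            print(f"edge {names[i]} -- {names[j]} (MI = {mi:.2f})")
```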

  13. Science and data science.

    Science.gov (United States)

    Blei, David M; Smyth, Padhraic

    2017-08-07

    Data science has attracted a lot of attention, promising to turn vast amounts of data into useful predictions and insights. In this article, we ask why scientists should care about data science. To answer, we discuss data science from three perspectives: statistical, computational, and human. Although each of the three is a critical component of data science, we argue that the effective combination of all three components is the essence of what data science is about.

  14. Dimensionality reduction and network inference for sea surface temperature data

    Science.gov (United States)

    Falasca, Fabrizio; Bracco, Annalisa; Nenes, Athanasios; Dovrolis, Constantine; Fountalis, Ilias

    2017-04-01

    Earth's climate is a complex dynamical system. The underlying components of the system interact with each other (in a linear or non-linear way) on several spatial and time scales. Network science provides a set of tools to study the structure and dynamics of such systems. Here we propose an application of a novel network inference method, δ-MAPS, to investigate sea surface temperature (SST) fields in reanalyses and models. δ-MAPS first identifies the underlying components (domains) of the system, modeling them as spatially contiguous, potentially overlapping regions of highly correlated temporal activity, and then infers the weighted and potentially lagged interactions between them. The SST network is represented as a weighted and directed graph. Edge direction captures the temporal ordering of events, while edge weights capture the magnitude of the interaction between the domains. We focus on two reanalysis datasets (HadISST and COBE) and on a dozen runs of the CESM model (extracted from the so-called large ensemble). The networks are built using 45 years of data every 3 years for the total dataset temporal coverage (from 1871 to 2015 for HadISST, from 1891 to 2015 for COBE and from 1920 to 2100 for CESM members). We then explore similarities and differences between reanalyses and models in terms of the domains identified, the networks inferred and their time evolution. The spatial extent and shape of the identified domains is consistent between observations and models. According to our analysis the largest SST domain always corresponds to the El Niño Southern Oscillation (ENSO) while most of the other domains correspond to known climate modes. However, the network structure shows significant differences. For example, the unique role played by the South Tropical Atlantic in the observed network is not captured by any model run. Regarding the time evolution of the system we focus on the strength of ENSO: while we observe a positive trend for observations and
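
    The directed-edge idea can be illustrated with a lagged cross-correlation between two domain-averaged signals: the sign of the best lag gives the temporal ordering and the correlation magnitude plays the role of the edge weight. This toy sketch uses hypothetical series and omits the domain-identification and significance-testing machinery of δ-MAPS.

```python
import numpy as np

def lagged_corr(x, y, max_lag):
    """Pearson correlation of x(t) with y(t + lag) for lag in [-max_lag, max_lag]."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:lag]
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

rng = np.random.default_rng(4)
T = 600
enso = rng.normal(size=T).cumsum() * 0.1            # toy domain-mean SST signal
atl = np.roll(enso, 5) + 0.5 * rng.normal(size=T)   # responds ~5 steps later

corrs = lagged_corr(enso, atl, max_lag=12)
best = max(corrs, key=lambda k: abs(corrs[k]))
# A positive best lag means the first domain leads: a directed edge enso -> atl.
print(f"strongest link at lag {best} (r = {corrs[best]:.2f})")
```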

  15. Strong Correlation Physics in Aromatic Hydrocarbon Superconductors

    Science.gov (United States)

    Capone, Massimo; Giovannetti, Gianluca

    2012-02-01

    We show, by means of ab-initio calculations, that electron-electron correlations play an important role in doped aromatic hydrocarbon superconductors, including potassium-doped picene with Tc = 18 K [1], coronene and phenanthrene [2]. For the case of picene the inclusion of exchange interactions by means of hybrid functionals reproduces the correct gap for the undoped compound and predicts an antiferromagnetic state for x=3, where superconductivity has been observed [3]. The latter finding is compatible with a sizable value of the correlation strength. The differences between the different compounds are analyzed and results of Dynamical Mean-Field Theory including both correlation effects and electron-phonon interactions are presented. Finally we discuss the consequences of strong correlations in an organic superconductor in relation to the properties of Cs3C60, in which electron correlations drive an antiferromagnetic state [4] but also lead to an enhancement of superconductivity [5]. [1] R. Mitsuhashi et al., Nature 464, 76 (2010). [2] X.F. Wang et al., Nat. Commun. 2, 507 (2011). [3] G. Giovannetti and M. Capone, Phys. Rev. B 83, 134508 (2011). [4] Y. Takabayashi et al., Science 323, 1585 (2009). [5] M. Capone et al., Rev. Mod. Phys. 81, 943 (2009).

  16. Lessons for Religious Education from Cognitive Science of Religion

    Science.gov (United States)

    Brelsford, Theodore

    2005-01-01

    Recent work in the cognitive sciences provides new neurological/biological and evolutionary bases for understanding the construction of knowledge (in the form of sets of ideas containing functionally useful inferences) and the capacity for imagination (as the ability to run inferences and generate ideas from information) in the human mind. In…

  17. Improved functional overview of protein complexes using inferred epistatic relationships

    LENUS (Irish Health Repository)

    Ryan, Colm

    2011-05-23

    Abstract Background Epistatic Miniarray Profiling (E-MAP) quantifies the net effect on growth rate of disrupting pairs of genes, often producing phenotypes that may be more (negative epistasis) or less (positive epistasis) severe than the phenotype predicted based on single gene disruptions. Epistatic interactions are important for understanding cell biology because they define relationships between individual genes, and between sets of genes involved in biochemical pathways and protein complexes. Each E-MAP screen quantifies the interactions between a logically selected subset of genes (e.g. genes whose products share a common function). Interactions that occur between genes involved in different cellular processes are not as frequently measured, yet these interactions are important for providing an overview of cellular organization. Results We introduce a method for combining overlapping E-MAP screens and inferring new interactions between them. We use this method to infer with high confidence 2,240 new strongly epistatic interactions and 34,469 weakly epistatic or neutral interactions. We show that accuracy of the predicted interactions approaches that of replicate experiments and that, like measured interactions, they are enriched for features such as shared biochemical pathways and knockout phenotypes. We constructed an expanded epistasis map for yeast cell protein complexes and show that our new interactions increase the evidence for previously proposed inter-complex connections, and predict many new links. We validated a number of these in the laboratory, including new interactions linking the SWR-C chromatin modifying complex and the nuclear transport apparatus. Conclusion Overall, our data support a modular model of yeast cell protein network organization and show how prediction methods can considerably extend the information that can be extracted from overlapping E-MAP screens.

  18. Nonparametric inference of network structure and dynamics

    Science.gov (United States)

    Peixoto, Tiago P.

    The network structure of complex systems determines their function and serves as evidence for the evolutionary mechanisms that lie behind them. Despite considerable effort in recent years, it remains an open challenge to formulate general descriptions of the large-scale structure of network systems, and of how to reliably extract such information from data. Although many approaches have been proposed, few methods attempt to gauge the statistical significance of the uncovered structures, and hence the majority cannot reliably separate actual structure from stochastic fluctuations. Due to the sheer size and high dimensionality of many networks, this represents a major limitation that prevents meaningful interpretations of the results obtained with such nonstatistical methods. In this talk, I will show how these issues can be tackled in a principled and efficient fashion by formulating appropriate generative models of network structure that can have their parameters inferred from data. By employing a Bayesian description of such models, the inference can be performed in a nonparametric fashion, which does not require any a priori knowledge or ad hoc assumptions about the data. I will show how this approach can be used to perform model comparison, and how hierarchical models yield the most appropriate trade-off between model complexity and quality of fit based on the statistical evidence present in the data. I will also show how this general approach can be elegantly extended to networks with edge attributes, that are embedded in latent spaces, and that change in time. The latter is obtained via a fully dynamic generative network model, based on arbitrary-order Markov chains, that can also be inferred in a nonparametric fashion. Throughout the talk I will illustrate the application of the methods with many empirical networks such as the internet at the autonomous systems level, the global airport network, the network of actors and films, social networks, citations among
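
    As a concrete entry point, the approach described in this talk is implemented in the speaker's graph-tool library. Assuming that library is installed, a hierarchical stochastic block model can be fitted by minimizing the description length, which is equivalent to maximizing the nonparametric Bayesian posterior; the bundled "football" network is used purely for illustration.

```python
# Minimal sketch of nonparametric Bayesian network inference with graph-tool
# (assumed installed; API names as in its documented interface).
import graph_tool.all as gt

g = gt.collection.data["football"]             # bundled empirical network
# Fit a hierarchical stochastic block model by minimizing the description
# length, i.e. maximizing the posterior with noninformative priors.
state = gt.minimize_nested_blockmodel_dl(g)
state.print_summary()                          # groups found at each hierarchy level
print("description length:", state.entropy())
```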

  19. Gravity as a Strong Prior: Implications for Perception and Action

    Directory of Open Access Journals (Sweden)

    Joan López-Moliner

    2017-04-01

    Full Text Available In the future, humans are likely to be exposed to environments with altered gravity conditions, be it only visually (Virtual and Augmented Reality) or visually and bodily (space travel). As visually and bodily perceived gravity, as well as an interiorized representation of earth gravity, are involved in a series of tasks, such as catching, grasping, body orientation estimation and spatial inferences, humans will need to adapt to these new gravity conditions. Performance under earth-gravity-discrepant conditions has been shown to be relatively poor, and the few studies conducted on gravity adaptation are rather discouraging. Especially in VR on earth, conflicts between bodily and visual gravity cues seem to make a full adaptation to visually perceived earth-discrepant gravities nearly impossible, and even in space, when visual and bodily cues are congruent, adaptation is extremely slow. We invoke a Bayesian framework for gravity-related perceptual processes, in which earth gravity holds the status of a so-called “strong prior”. As with other strong priors, the gravity prior has developed through years and years of experience in an earth-gravity environment. For this reason, the reliability of this representation is extremely high and overrules any sensory information to its contrary. While other factors, such as the multisensory nature of gravity perception, also need to be taken into account, we present the strong prior account as a unifying explanation for empirical results in gravity perception and adaptation to earth-discrepant gravities.
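
    In the standard Gaussian cue-combination formulation, the "strong prior" claim has a compact expression: the posterior estimate is a precision-weighted average, and a prior with very small variance dominates it. The notation below is ours, not the paper's.

```latex
% Precision-weighted combination of a sensory estimate x_s with the
% earth-gravity prior (mean \mu_g); notation assumed.
\begin{equation}
  \hat{\mu} \;=\; \frac{x_s/\sigma_s^2 + \mu_g/\sigma_g^2}
                       {1/\sigma_s^2 + 1/\sigma_g^2}.
\end{equation}
% A "strong prior" is the limit of very high prior reliability,
% \sigma_g^2 \to 0, in which \hat{\mu} \to \mu_g and discrepant sensory
% evidence is effectively overruled.
```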

  20. Lower complexity bounds for lifted inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2015-01-01

    One of the big challenges in the development of probabilistic relational (or probabilistic logical) modeling and learning frameworks is the design of inference techniques that operate on the level of the abstract model representation language, rather than on the level of ground, propositional...... probabilistic relational models. Artificial Intelligence 117, 297–308). However, it is not immediate that these results also apply to the type of modeling languages that currently receive the most attention, i.e., weighted, quantifier-free formulas. In this paper we extend these earlier results, and show...

  1. Robust Inference with Multi-way Clustering

    OpenAIRE

    A. Colin Cameron; Jonah B. Gelbach; Douglas L. Miller

    2009-01-01

    In this paper we propose a variance estimator for the OLS estimator as well as for nonlinear estimators such as logit, probit and GMM. This variance estimator enables cluster-robust inference when there is two-way or multi-way clustering that is non-nested. The variance estimator extends the standard cluster-robust variance estimator or sandwich estimator for one-way clustering (e.g. Liang and Zeger (1986), Arellano (1987)) and relies on similar relatively weak distributional assumptions. Our...
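
    A minimal sketch of the two-way estimator for the OLS case (our reduction of the general formula; small-sample corrections omitted, data hypothetical): compute three one-way cluster-robust matrices and combine them by inclusion-exclusion. The combined matrix is not guaranteed positive semi-definite in finite samples, a caveat the literature notes.

```python
import numpy as np

def ols_cluster_vcov(X, resid, clusters):
    """One-way cluster-robust (sandwich) covariance for OLS coefficients."""
    XtX_inv = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(clusters):
        s = X[clusters == g].T @ resid[clusters == g]   # within-cluster score sum
        meat += np.outer(s, s)
    return XtX_inv @ meat @ XtX_inv

rng = np.random.default_rng(5)
n = 500
firm, year = rng.integers(0, 25, n), rng.integers(0, 10, n)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5]) + rng.normal(size=n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta
# Two-way clustering: add the two one-way matrices and subtract the matrix
# clustered on the firm-year intersection (inclusion-exclusion).
V = (ols_cluster_vcov(X, u, firm)
     + ols_cluster_vcov(X, u, year)
     - ols_cluster_vcov(X, u, firm * 100 + year))
print("two-way clustered SEs:", np.sqrt(np.diag(V)))
```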

  2. Bayesian inference and the parametric bootstrap

    Science.gov (United States)

    Efron, Bradley

    2013-01-01

    The parametric bootstrap can be used for the efficient computation of Bayes posterior distributions. Importance sampling formulas take on an easy form relating to the deviance in exponential families, and are particularly simple starting from Jeffreys invariant prior. Because of the i.i.d. nature of bootstrap sampling, familiar formulas describe the computational accuracy of the Bayes estimates. Besides computational methods, the theory provides a connection between Bayesian and frequentist analysis. Efficient algorithms for the frequentist accuracy of Bayesian inferences are developed and demonstrated in a model selection example. PMID:23843930
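
    A generic importance-sampling reading of the idea (not Efron's exponential-family deviance formulas): draw parameters from the parametric bootstrap distribution of the MLE and reweight them by prior times likelihood over the bootstrap density. The model and prior below are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Toy model: n observations, unknown mean theta, known sd = 1.
x = rng.normal(loc=2.0, scale=1.0, size=40)
theta_hat, se = x.mean(), 1.0 / np.sqrt(len(x))

# Parametric-bootstrap draws of the MLE act as the importance proposal g.
B = 5000
theta_star = rng.normal(theta_hat, se, size=B)

log_lik = np.array([stats.norm.logpdf(x, t, 1.0).sum() for t in theta_star])
log_prior = stats.norm.logpdf(theta_star, 0.0, 10.0)        # a vague prior
log_g = stats.norm.logpdf(theta_star, theta_hat, se)

logw = log_lik + log_prior - log_g
w = np.exp(logw - logw.max())                               # stabilized weights
w /= w.sum()
print(f"posterior mean {np.sum(w * theta_star):.3f} (MLE {theta_hat:.3f})")
```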

  3. Ignorability in Statistical and Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...... the question of when the mar/car assumption is warranted. First we provide a ``static'' analysis that characterizes the admissibility of the car assumption in terms of the support structure of the joint probability distribution of complete data and incomplete observations. Here we obtain an equivalence...

  4. Approximate Inference and Deep Generative Models

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Advances in deep generative models are at the forefront of deep learning research because of the promise they offer for allowing data-efficient learning, and for model-based reinforcement learning. In this talk I'll review a few standard methods for approximate inference and introduce modern approximations which allow for efficient large-scale training of a wide variety of generative models. Finally, I'll demonstrate several important applications of these models to density estimation, missing data imputation, data compression and planning.
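
    The workhorse objective behind most of the approximate-inference methods used to train such models (variational autoencoders in particular) is the evidence lower bound, stated here in standard notation for orientation:

```latex
% Evidence lower bound (ELBO), maximized jointly over generative
% parameters \theta and variational (inference-network) parameters \phi.
\begin{equation}
  \log p_\theta(x) \;\ge\;
  \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
  \;-\; \mathrm{KL}\!\left(q_\phi(z \mid x)\,\middle\|\,p(z)\right).
\end{equation}
```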

  5. Abductive Inference using Array-Based Logic

    DEFF Research Database (Denmark)

    Frisvad, Jeppe Revall; Falster, Peter; Møller, Gert L.

    The notion of abduction has found its usage within a wide variety of AI fields. Computing abductive solutions has, however, been shown to be highly intractable in logic programming. To avoid this intractability we present a new approach to logic-based abduction: through the geometrical view of data employed in array-based logic we embrace abduction in a simple structural operation. We argue that a theory of abduction of this form allows for an implementation which, at runtime, can perform abductive inference quite efficiently on arbitrary rules of logic representing knowledge of finite domains.

  6. Inverse Bayesian inference as a key of consciousness featuring a macroscopic quantum logical structure.

    Science.gov (United States)

    Gunji, Yukio-Pegio; Shinohara, Shuji; Haruna, Taichi; Basios, Vasileios

    2017-02-01

    To overcome the dualism between mind and matter and to implement consciousness in science, a physical entity has to be embedded with a measurement process. Although quantum mechanics has been regarded as a candidate for implementing consciousness, nature at its macroscopic level is inconsistent with quantum mechanics. We propose a measurement-oriented inference system comprising Bayesian and inverse Bayesian inferences. While Bayesian inference contracts probability space, the newly defined inverse one relaxes the space. These two inferences allow an agent to make a decision corresponding to an immediate change in its environment. They generate a particular pattern of joint probability for data and hypotheses, comprising multiple diagonal and noisy matrices. This is expressed as a nondistributive orthomodular lattice equivalent to quantum logic. We also show that an orthomodular lattice can reveal information generated by inverse syllogism as well as the solutions to the frame and symbol-grounding problems. Our model is the first to connect macroscopic cognitive processes with the mathematical structure of quantum mechanics with no additional assumptions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. POPPER, a simple programming language for probabilistic semantic inference in medicine.

    Science.gov (United States)

    Robson, Barry

    2015-01-01

    Our previous reports described the use of the Hyperbolic Dirac Net (HDN) as a method for probabilistic inference from medical data, and a proposed probabilistic medical Semantic Web (SW) language Q-UEL to provide that data. Rather like a traditional Bayes Net, the HDN provided estimates of joint and conditional probabilities, and was static, with no need for evolution due to "reasoning". Use of the SW will require, however, (a) at least the semantic triple, with more elaborate relations than conditional ones, as seen in the use of most verbs and prepositions, and (b) rules for logical, grammatical, and definitional manipulation that can generate changes in the inference net. Here we describe the simple POPPER language for medical inference. It can be written automatically by Q-UEL, or by hand. Based on studies with our medical students, it is believed that a tool like this may help in medical education and that a physician unfamiliar with SW science can understand it. It is here used to explore the considerable challenges of assigning probabilities, and not least what the meaning and utility of inference net evolution would be for a physician. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Response to Comment on "Estimating the reproducibility of psychological science"

    NARCIS (Netherlands)

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-01-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively

  9. On Valdivia strong version of Nikodym boundedness property

    Czech Academy of Sciences Publication Activity Database

    Kąkol, Jerzy; López-Pellicer, M.

    2017-01-01

    Roč. 446, č. 1 (2017), s. 1-17 ISSN 0022-247X R&D Projects: GA ČR GF16-34860L Institutional support: RVO:67985840 Keywords : finitely additive scalar measure * Nikodym and strong Nikodym property * increasing tree Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.064, year: 2016 http://www.sciencedirect.com/science/article/pii/S0022247X16304413

  10. Facility Activity Inference Using Radiation Networks

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL; Ramirez Aviles, Camila A. [ORNL

    2017-11-01

    We consider the problem of inferring the operational status of a reactor facility using measurements from a radiation sensor network deployed around the facility’s ventilation off-gas stack. The intensity of stack emissions decays with distance, and the sensor counts or measurements are inherently random with parameters determined by the intensity at the sensor’s location. We utilize the measurements to estimate the intensity at the stack, and use it in a one-sided Sequential Probability Ratio Test (SPRT) to infer on/off status of the reactor. We demonstrate the superior performance of this method over conventional majority fusers and individual sensors using (i) test measurements from a network of 21 NaI detectors, and (ii) effluence measurements collected at the stack of a reactor facility. We also analytically establish the superior detection performance of the network over individual sensors with fixed and adaptive thresholds by utilizing the Poisson distribution of the counts. We quantify the performance improvements of the network detection over individual sensors using the packing number of the intensity space.
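
    The decision rule can be sketched directly from the Poisson likelihood: each counting interval adds an increment to a log-likelihood ratio, and the one-sided test declares "on" once an upper threshold set by the tolerated false-alarm rate is crossed. The rates and threshold below are hypothetical, not the facility's.

```python
import numpy as np

rng = np.random.default_rng(7)

lam0, lam1 = 5.0, 8.0          # hypothetical background vs reactor-on count rates
alpha = 1e-3                   # tolerated false-alarm probability
A = np.log(1.0 / alpha)        # one-sided decision threshold

llr, decided = 0.0, None
counts = rng.poisson(lam1, size=200)       # simulate "reactor on"
for t, n in enumerate(counts, start=1):
    # Poisson log-likelihood-ratio increment for one counting interval.
    llr += n * np.log(lam1 / lam0) - (lam1 - lam0)
    if llr >= A:
        decided = t
        break
print(f"declared 'on' after {decided} intervals" if decided else "no decision")
```

    With a network, the intensity estimate feeding the test aggregates all sensors, which is what drives the reported gain over individual detectors.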

  11. Inferring network topology from complex dynamics

    International Nuclear Information System (INIS)

    Shandilya, Srinivas Gorur; Timme, Marc

    2011-01-01

    Inferring the network topology from dynamical observations is a fundamental problem pervading research on complex systems. Here, we present a simple, direct method for inferring the structural connection topology of a network, given an observation of one collective dynamical trajectory. The general theoretical framework is applicable to arbitrary network dynamical systems described by ordinary differential equations. No interference (external driving) is required and the type of dynamics is hardly restricted in any way. In particular, the observed dynamics may be arbitrarily complex; stationary, invariant or transient; synchronous or asynchronous and chaotic or periodic. Presupposing a knowledge of the functional form of the dynamical units and of the coupling functions between them, we present an analytical solution to the inverse problem of finding the network topology from observing a time series of state variables only. Robust reconstruction is achieved in any sufficiently long generic observation of the system. We extend our method to simultaneously reconstructing both the entire network topology and all parameters appearing linear in the system's equations of motion. Reconstruction of network topology and system parameters is viable even in the presence of external noise that distorts the original dynamics substantially. The method provides a conceptually new step towards reconstructing a variety of real-world networks, including gene and protein interaction networks and neuronal circuits.
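
    In the linear special case the reconstruction reduces to a least-squares problem, which conveys the spirit of the method (the paper's framework covers general nonlinear units with known functional forms). The network, trajectory, and noise level below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(8)

# Ground-truth coupling matrix of a stable linear network dx/dt = A x.
N = 5
A = -np.eye(N)
A[0, 3], A[2, 0], A[4, 1] = 0.8, 0.6, -0.7

# One observed trajectory (Euler simulation plus tiny measurement noise).
dt, T = 0.01, 300
X = np.zeros((T, N))
X[0] = rng.normal(size=N)
for t in range(T - 1):
    X[t + 1] = X[t] + dt * (A @ X[t])
X += 1e-4 * rng.normal(size=X.shape)

# Inverse problem: regress finite-difference derivatives on the states.
dXdt = (X[1:] - X[:-1]) / dt
A_hat = np.linalg.lstsq(X[:-1], dXdt, rcond=None)[0].T
print("max entrywise reconstruction error:", np.abs(A_hat - A).max())
```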

  12. PREMER: a Tool to Infer Biological Networks.

    Science.gov (United States)

    Villaverde, Alejandro F; Becker, Kolja; Banga, Julio R

    2017-10-04

    Inferring the structure of unknown cellular networks is a main challenge in computational biology. Data-driven approaches based on information theory can determine the existence of interactions among network nodes automatically. However, the elucidation of certain features - such as distinguishing between direct and indirect interactions or determining the direction of a causal link - requires estimating information-theoretic quantities in a multidimensional space. This can be a computationally demanding task, which acts as a bottleneck for the application of elaborate algorithms to large-scale network inference problems. The computational cost of such calculations can be alleviated by the use of compiled programs and parallelization. To this end we have developed PREMER (Parallel Reverse Engineering with Mutual information & Entropy Reduction), a software toolbox that can run in parallel and sequential environments. It uses information theoretic criteria to recover network topology and determine the strength and causality of interactions, and allows incorporating prior knowledge, imputing missing data, and correcting outliers. PREMER is a free, open source software tool that does not require any commercial software. Its core algorithms are programmed in FORTRAN 90 and implement OpenMP directives. It has user interfaces in Python and MATLAB/Octave, and runs on Windows, Linux and OSX (https://sites.google.com/site/premertoolbox/).

  13. Automated adaptive inference of phenomenological dynamical models

    Science.gov (United States)

    Daniels, Bryan

    Understanding the dynamics of biochemical systems can seem impossibly complicated at the microscopic level: detailed properties of every molecular species, including those that have not yet been discovered, could be important for producing macroscopic behavior. The profusion of data in this area has raised the hope that microscopic dynamics might be recovered in an automated search over possible models, yet the combinatorial growth of this space has limited these techniques to systems that contain only a few interacting species. We take a different approach inspired by coarse-grained, phenomenological models in physics. Akin to a Taylor series producing Hooke's Law, forgoing microscopic accuracy allows us to constrain the search over dynamical models to a single dimension. This makes it feasible to infer dynamics with very limited data, including cases in which important dynamical variables are unobserved. We name our method Sir Isaac after its ability to infer the dynamical structure of the law of gravitation given simulated planetary motion data. Applying the method to output from a microscopically complicated but macroscopically simple biological signaling model, it is able to adapt the level of detail to the amount of available data. Finally, using nematode behavioral time series data, the method discovers an effective switch between behavioral attractors after the application of a painful stimulus.

  14. Inference with the Median of a Prior

    Directory of Open Access Journals (Sweden)

    Ali Mohammad-Djafari

    2006-06-01

    Full Text Available We consider the problem of inference on one of the two parameters of a probability distribution when we have some prior information on a nuisance parameter. When a prior probability distribution on this nuisance parameter is given, the marginal distribution is the classical tool to account for it. If the prior distribution is not given, but we have partial knowledge such as a fixed number of moments, we can use the maximum entropy principle to assign a prior law and thus go back to the previous case. In this work, we consider the case where we only know the median of the prior and propose a new tool for this case. This new inference tool looks like a marginal distribution. It is obtained by first remarking that the marginal distribution can be considered as the mean value of the original distribution with respect to the prior probability law of the nuisance parameter, and then, by using the median in place of the mean.
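
    The construction described above can be stated compactly: classical marginalization is a mean of the sampling distribution with respect to the prior on the nuisance parameter, and the proposed tool replaces that mean by a median. The notation below is schematic, ours rather than the paper's.

```latex
% Classical marginalization over the nuisance parameter \theta_2:
\begin{equation}
  m(x \mid \theta_1)
    = \int p(x \mid \theta_1, \theta_2)\,\pi(\theta_2)\,d\theta_2
    = \mathbb{E}_{\pi}\!\left[\,p(x \mid \theta_1, \theta_2)\,\right].
\end{equation}
% Median-based analogue when only the median of the prior is known:
\begin{equation}
  \tilde{m}(x \mid \theta_1)
    = \operatorname{med}_{\pi}\!\left[\,p(x \mid \theta_1, \theta_2)\,\right].
\end{equation}
```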

  15. Graphical models for inferring single molecule dynamics

    Directory of Open Access Journals (Sweden)

    Gonzalez Ruben L

    2010-10-01

    Full Text Available Abstract Background The recent explosion of experimental techniques in single molecule biophysics has generated a variety of novel time series data requiring equally novel computational tools for analysis and inference. This article describes in general terms how graphical modeling may be used to learn from biophysical time series data using the variational Bayesian expectation maximization algorithm (VBEM). The discussion is illustrated by the example of single-molecule fluorescence resonance energy transfer (smFRET) versus time data, where the smFRET time series is modeled as a hidden Markov model (HMM) with Gaussian observables. A detailed description of smFRET is provided as well. Results The VBEM algorithm returns the model’s evidence and an approximating posterior parameter distribution given the data. The former provides a metric for model selection via maximum evidence (ME), and the latter a description of the model’s parameters learned from the data. ME/VBEM provide several advantages over the more commonly used approach of maximum likelihood (ML) optimized by the expectation maximization (EM) algorithm, the most important being a natural form of model selection and a well-posed (non-divergent) optimization problem. Conclusions The results demonstrate the utility of graphical modeling for inference of dynamic processes in single molecule biophysics.
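
    For orientation, here is the ML/EM baseline that the article argues ME/VBEM improves upon, sketched with the hmmlearn package (assumed installed; the trace parameters are hypothetical). The article's own approach would replace the fit below with variational Bayesian EM and use the evidence for model selection.

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(9)

# Toy smFRET-like trace: two hidden conformational states emitting Gaussian
# FRET efficiencies around 0.3 and 0.8 (values hypothetical).
states = np.zeros(1000, dtype=int)
for t in range(1, 1000):
    states[t] = states[t - 1] if rng.random() < 0.95 else 1 - states[t - 1]
obs = np.where(states == 0, 0.3, 0.8) + 0.05 * rng.normal(size=1000)

# ML/EM fit of a 2-state Gaussian HMM (the baseline the paper improves on).
model = hmm.GaussianHMM(n_components=2, n_iter=100)
model.fit(obs.reshape(-1, 1))
print("inferred state means:", np.sort(model.means_.ravel()))
```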

  16. Accelerating Bayesian inference for evolutionary biology models.

    Science.gov (United States)

    Meyer, Xavier; Chopard, Bastien; Salamin, Nicolas

    2017-03-01

    Bayesian inference is widely used nowadays and relies largely on Markov chain Monte Carlo (MCMC) methods. Evolutionary biology has greatly benefited from the development of MCMC methods, but the design of more complex and realistic models and the ever-growing availability of novel data are pushing the limits of the current use of these methods. We present a parallel Metropolis-Hastings (M-H) framework built with a novel combination of enhancements aimed at parameter-rich and complex models. On a parameter-rich macroevolutionary model, we show increases in sampling speed of up to 35 times with 32 processors when compared to a sequential M-H process. More importantly, our framework achieves up to twentyfold faster convergence in estimating the posterior probability of phylogenetic trees using 32 processors when compared to the well-known software MrBayes for Bayesian inference of phylogenetic trees. https://bitbucket.org/XavMeyer/hogan. nicolas.salamin@unil.ch. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
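
    For context, the sequential baseline being accelerated is ordinary Metropolis-Hastings; a minimal sketch with a toy target (a standard normal stand-in, not a phylogenetic likelihood) is below.

```python
import numpy as np

rng = np.random.default_rng(10)

def log_post(theta):
    """Toy log-posterior: standard normal (stand-in for a real model)."""
    return -0.5 * theta ** 2

theta, chain = 0.0, []
for _ in range(10000):
    prop = theta + rng.normal(scale=0.5)          # symmetric random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop                               # accept
    chain.append(theta)
print("posterior mean ~ %.3f, sd ~ %.3f" % (np.mean(chain), np.std(chain)))
```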

  17. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference, for the following two reasons. First, probability theory is generally more flexible than causal graphs: besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for an instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
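
    The BK-Plot's subject, Simpson's paradox with a binary confounder, can be made concrete with the classic kidney-stone-treatment numbers: each stratum favors treatment A while the aggregate favors B, because treatment assignment is confounded with severity.

```python
import numpy as np

# counts[stratum][treatment] = (successes, trials); classic textbook numbers.
counts = {
    "low severity":  {"A": (81, 87),   "B": (234, 270)},
    "high severity": {"A": (192, 263), "B": (55, 80)},
}

for s, arms in counts.items():
    ra = arms["A"][0] / arms["A"][1]
    rb = arms["B"][0] / arms["B"][1]
    print(f"{s}: A {ra:.2f} vs B {rb:.2f}")        # A wins in each stratum

tot = {t: np.sum([counts[s][t] for s in counts], axis=0) for t in ("A", "B")}
print("aggregate: A %.2f vs B %.2f" %
      (tot["A"][0] / tot["A"][1], tot["B"][0] / tot["B"][1]))  # B wins overall
```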

  18. Causal Inference in the Perception of Verticality.

    Science.gov (United States)

    de Winkel, Ksander N; Katliar, Mikhail; Diers, Daniel; Bülthoff, Heinrich H

    2018-04-03

    The perceptual upright is thought to be constructed by the central nervous system (CNS) as a vector sum, combining estimates of the upright provided by the visual system and the body's inertial sensors with prior knowledge that upright is usually above the head. Recent findings furthermore show that the weighting of the respective sensory signals is proportional to their reliability, consistent with a Bayesian interpretation of a vector sum (Forced Fusion, FF). However, violations of FF have also been reported, suggesting that the CNS may rely on a single sensory system (Cue Capture, CC), or choose to process sensory signals based on inferred signal causality (Causal Inference, CI). We developed a novel alternative-reality system to manipulate visual and physical tilt independently. We tasked participants (n = 36) to indicate the perceived upright for various (in-)congruent combinations of visual-inertial stimuli, and compared models based on their agreement with the data. The results favor the CI model over FF, although this effect became unambiguous only for large discrepancies (±60°). We conclude that the notion of a vector sum does not provide a comprehensive explanation of the perception of the upright, and that CI offers a better alternative.
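
    A numerical sketch of the CI observer class compared in the study (Kording-style model averaging over causal structures): the observer weighs a fused, common-cause estimate against a segregated one by the posterior probability that the two cues share a cause. All parameters, the prior width, the assumption that the inertial cue alone informs the upright under segregation, and the grid are our illustrative choices, not the paper's fits.

```python
import numpy as np

s_grid = np.linspace(-90, 90, 2001)                 # candidate upright [deg]
prior = np.exp(-0.5 * (s_grid / 10.0) ** 2)         # "upright is above the head"
prior /= prior.sum()

def norm(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / sd

def ci_estimate(x_vis, x_inr, sd_vis=8.0, sd_inr=4.0, p_common=0.5):
    lv = norm(x_vis, s_grid, sd_vis)
    li = norm(x_inr, s_grid, sd_inr)
    # C = 1 (common cause): both cues share one source s -> forced fusion.
    joint1 = prior * lv * li
    ev1 = joint1.sum()
    est1 = (s_grid * joint1).sum() / ev1
    # C = 2 (independent causes): only the inertial cue informs the upright;
    # the visual cue is explained by a separate source.
    joint2 = prior * li
    ev2 = joint2.sum() * (prior * lv).sum()
    est2 = (s_grid * joint2).sum() / joint2.sum()
    # Posterior over causal structure, then model averaging.
    pc1 = p_common * ev1 / (p_common * ev1 + (1 - p_common) * ev2)
    return pc1 * est1 + (1 - pc1) * est2, pc1

est, pc1 = ci_estimate(x_vis=40.0, x_inr=0.0)       # large visual-inertial conflict
print(f"estimate {est:.1f} deg, P(common cause) = {pc1:.2f}")
```

    Under small conflicts this model behaves like FF; under large conflicts the common-cause posterior collapses and the visual cue is discounted, which is the qualitative signature the study reports.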

  19. Science in Science Fiction.

    Science.gov (United States)

    Allday, Jonathan

    2003-01-01

    Offers some suggestions as to how science fiction, especially television science fiction programs such as "Star Trek" and "Star Wars", can be drawn into physics lessons to illuminate some interesting issues. (Author/KHR)

  20. Directed partial correlation: inferring large-scale gene regulatory network through induced topology disruptions.

    Directory of Open Access Journals (Sweden)

    Yinyin Yuan

    Full Text Available Inferring regulatory relationships among many genes based on their temporal variation in transcript abundance has been a popular research topic. Due to the nature of microarray experiments, classical tools for time series analysis lose power since the number of variables far exceeds the number of the samples. In this paper, we describe some of the existing multivariate inference techniques that are applicable to hundreds of variables and show the potential challenges for small-sample, large-scale data. We propose a directed partial correlation (DPC) method as an efficient and effective solution to regulatory network inference using these data. Specifically for genomic data, the proposed method is designed to deal with large-scale datasets. It combines the efficiency of partial correlation for setting up network topology by testing conditional independence, and the concept of Granger causality to assess topology change with induced interruptions. The idea is that when a transcription factor is induced artificially within a gene network, the disruption of the network by the induction signifies a gene's role in transcriptional regulation. The benchmarking results using GeneNetWeaver, the simulator for the DREAM challenges, provide strong evidence of the outstanding performance of the proposed DPC method. When applied to real biological data, the inferred starch metabolism network in Arabidopsis reveals many biologically meaningful network modules worthy of further investigation. These results collectively suggest DPC is a versatile tool for genomics research. The R package DPC is available for download (http://code.google.com/p/dpcnet/).
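
    The partial-correlation half of DPC can be sketched from the precision matrix (the Granger-style interruption analysis is the part not shown here): for a chain A -> B -> C, the raw A-C correlation is high but the partial correlation given B collapses toward zero, exposing the link as indirect. Data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)

# Chain network A -> B -> C: A and C correlate, but only through B.
T = 2000
A = rng.normal(size=T)
B = 0.9 * A + 0.4 * rng.normal(size=T)
C = 0.9 * B + 0.4 * rng.normal(size=T)
X = np.column_stack([A, B, C])

# Partial correlations from the inverse correlation (precision) matrix:
# rho_ij|rest = -P_ij / sqrt(P_ii * P_jj).
P = np.linalg.inv(np.corrcoef(X, rowvar=False))
pcor = -P / np.sqrt(np.outer(np.diag(P), np.diag(P)))
np.fill_diagonal(pcor, 1.0)

print("corr(A,C)   = %.2f" % np.corrcoef(X, rowvar=False)[0, 2])
print("pcor(A,C|B) = %.2f" % pcor[0, 2])   # ~0: the A-C link is indirect
```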

  1. Climate-induced changes in lake ecosystem structure inferred from coupled neo- and paleoecological approaches

    Science.gov (United States)

    Saros, Jasmine E.; Stone, Jeffery R.; Pederson, Gregory T.; Slemmons, Krista; Spanbauer, Trisha; Schliep, Anna; Cahl, Douglas; Williamson, Craig E.; Engstrom, Daniel R.

    2015-01-01

    Over the 20th century, surface water temperatures have increased in many lake ecosystems around the world, but long-term trends in the vertical thermal structure of lakes remain unclear, despite the strong control that thermal stratification exerts on the biological response of lakes to climate change. Here we used both neo- and paleoecological approaches to develop a fossil-based inference model for lake mixing depths and thereby refine understanding of lake thermal structure change. We focused on three common planktonic diatom taxa, the distributions of which previous research suggests might be affected by mixing depth. Comparative lake surveys and growth rate experiments revealed that these species respond to lake thermal structure when nitrogen is sufficient, with species optima ranging from shallower to deeper mixing depths. The diatom-based mixing depth model was applied to sedimentary diatom profiles extending back to 1750 AD in two lakes with moderate nitrate concentrations but differing climate settings. Thermal reconstructions were consistent with expected changes, with shallower mixing depths inferred for an alpine lake where treeline has advanced, and deeper mixing depths inferred for a boreal lake where wind strength has increased. The inference model developed here provides a new tool to expand and refine understanding of climate-induced changes in lake ecosystems.

  2. Inference generation and story comprehension among children with ADHD.

    Science.gov (United States)

    Van Neste, Jessica; Hayden, Angela; Lorch, Elizabeth P; Milich, Richard

    2015-02-01

    Academic difficulties are well-documented among children with ADHD. Exploring these difficulties through story comprehension research has revealed deficits among children with ADHD in making causal connections between events and in using causal structure and thematic importance to guide recall of stories. Important to theories of story comprehension and implied in these deficits is the ability to make inferences. Often, characters' goals are implicit and explanations of events must be inferred. The purpose of the present study was to compare the inferences generated during story comprehension by 23 7- to 11-year-old children with ADHD (16 males) and 35 comparison peers (19 males). Children watched two televised stories, each paused at five points. In the experimental condition, at each pause children told what they were thinking about the story, whereas in the control condition no responses were made during pauses. After viewing, children recalled the story. Several types of inferences and inference plausibility were coded. Children with ADHD generated fewer of the most essential inferences, plausible explanatory inferences, than did comparison children, both during story processing and during story recall. The groups did not differ on production of other types of inferences. Group differences in generating inferences during the think-aloud task significantly mediated group differences in patterns of recall. Both groups recalled more of the most important story information after completing the think-aloud task. Generating fewer explanatory inferences has important implications for story comprehension deficits in children with ADHD.

  3. Children's inference generation: The role of vocabulary and working memory.

    Science.gov (United States)

    Currie, Nicola Kate; Cain, Kate

    2015-09-01

    Inferences are crucial to successful discourse comprehension. We assessed the contributions of vocabulary and working memory to inference making in children aged 5 and 6 years (n=44), 7 and 8 years (n=43), and 9 and 10 years (n=43). Children listened to short narratives and answered questions to assess local and global coherence inferences after each one. Analysis of variance (ANOVA) confirmed developmental improvements on both types of inference. Although standardized measures of both vocabulary and working memory were correlated with inference making, multiple regression analyses determined that vocabulary was the key predictor. For local coherence inferences, only vocabulary predicted unique variance for the 6- and 8-year-olds; in contrast, none of the variables predicted performance for the 10-year-olds. For global coherence inferences, vocabulary was the only unique predictor for each age group. Mediation analysis confirmed that although working memory was associated with the ability to generate local and global coherence inferences in 6- to 10-year-olds, the effect was mediated by vocabulary. We conclude that vocabulary knowledge supports inference making in two ways: through knowledge of word meanings required to generate inferences and through its contribution to memory processes. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Inferring the temperature dependence of population parameters: the effects of experimental design and inference algorithm.

    Science.gov (United States)

    Palamara, Gian Marco; Childs, Dylan Z; Clements, Christopher F; Petchey, Owen L; Plebani, Marco; Smith, Matthew J

    2014-12-01

    Understanding and quantifying the temperature dependence of population parameters, such as intrinsic growth rate and carrying capacity, is critical for predicting the ecological responses to environmental change. Many studies provide empirical estimates of such temperature dependencies, but a thorough investigation of the methods used to infer them has not been performed yet. We created artificial population time series using a stochastic logistic model parameterized with the Arrhenius equation, so that activation energy drives the temperature dependence of population parameters. We simulated different experimental designs and used different inference methods, varying the likelihood functions and other aspects of the parameter estimation methods. Finally, we applied the best performing inference methods to real data for the species Paramecium caudatum. The relative error of the estimates of activation energy varied between 5% and 30%. The fraction of habitat sampled played the most important role in determining the relative error; sampling at least 1% of the habitat kept it below 50%. We found that methods that simultaneously use all time series data (direct methods) and methods that estimate population parameters separately for each temperature (indirect methods) are complementary. Indirect methods provide a clearer insight into the shape of the functional form describing the temperature dependence of population parameters; direct methods enable a more accurate estimation of the parameters of such functional forms. Using both methods, we found that growth rate and carrying capacity of Paramecium caudatum scale with temperature according to different activation energies. Our study shows how careful choice of experimental design and inference methods can increase the accuracy of the inferred relationships between temperature and population parameters. The comparison of estimation methods provided here can increase the accuracy of model predictions, with important
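
    The 'indirect' route described above can be sketched in a few lines: simulate noisy logistic growth at several temperatures with an Arrhenius-scaled growth rate, estimate a rate at each temperature from the near-exponential phase, then regress log(rate) on -1/(k_B T) to recover the activation energy. All parameter values below are assumptions for illustration.

    import numpy as np

    k_B = 8.617e-5                                 # Boltzmann constant (eV/K)
    E_true, r_ref, T_ref = 0.65, 1.0, 293.15       # assumed 'true' values

    def arrhenius(T, E):
        """Arrhenius scaling of the intrinsic growth rate."""
        return r_ref * np.exp(-(E / k_B) * (1.0 / T - 1.0 / T_ref))

    rng = np.random.default_rng(1)
    temps = np.array([285.0, 290.0, 295.0, 300.0, 305.0])
    dt, steps, K = 0.1, 60, 1e5
    rates = []
    for T in temps:
        r, N, traj = arrhenius(T, E_true), 10.0, [10.0]
        for _ in range(steps):                     # noisy logistic growth
            N = max(N + r * N * (1.0 - N / K) * dt + rng.normal(0.0, 0.5), 1.0)
            traj.append(N)
        t = np.arange(16) * dt                     # near-exponential early phase
        rates.append(np.polyfit(t, np.log(traj[:16]), 1)[0])

    # 'Indirect' step: log(rate) is linear in -1/(k_B * T) with slope E.
    x = -1.0 / (k_B * temps)
    E_hat = np.polyfit(x, np.log(rates), 1)[0]
    print(f"true E = {E_true} eV, estimated E = {E_hat:.2f} eV")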

  5. Human brain lesion-deficit inference remapped.

    Science.gov (United States)

    Mah, Yee-Haur; Husain, Masud; Rees, Geraint; Nachev, Parashkev

    2014-09-01

    Our knowledge of the anatomical organization of the human brain in health and disease draws heavily on the study of patients with focal brain lesions. Historically the first method of mapping brain function, it is still potentially the most powerful, establishing the necessity of any putative neural substrate for a given function or deficit. Great inferential power, however, carries a crucial vulnerability: without stronger alternatives any consistent error cannot be easily detected. A hitherto unexamined source of such error is the structure of the high-dimensional distribution of patterns of focal damage, especially in ischaemic injury-the commonest aetiology in lesion-deficit studies-where the anatomy is naturally shaped by the architecture of the vascular tree. This distribution is so complex that analysis of lesion data sets of conventional size cannot illuminate its structure, leaving us in the dark about the presence or absence of such error. To examine this crucial question we assembled the largest known set of focal brain lesions (n = 581), derived from unselected patients with acute ischaemic injury (mean age = 62.3 years, standard deviation = 17.8, male:female ratio = 0.547), visualized with diffusion-weighted magnetic resonance imaging, and processed with validated automated lesion segmentation routines. High-dimensional analysis of this data revealed a hidden bias within the multivariate patterns of damage that will consistently distort lesion-deficit maps, displacing inferred critical regions from their true locations, in a manner opaque to replication. Quantifying the size of this mislocalization demonstrates that past lesion-deficit relationships estimated with conventional inferential methodology are likely to be significantly displaced, by a magnitude dependent on the unknown underlying lesion-deficit relationship itself. Past studies therefore cannot be retrospectively corrected, except by new knowledge that would render them redundant

  6. Species tree inference by minimizing deep coalescences.

    Science.gov (United States)

    Than, Cuong; Nakhleh, Luay

    2009-09-01

    In a 1997 seminal paper, W. Maddison proposed minimizing deep coalescences, or MDC, as an optimization criterion for inferring the species tree from a set of incongruent gene trees, assuming the incongruence is exclusively due to lineage sorting. In a subsequent paper, Maddison and Knowles provided and implemented a search heuristic for optimizing the MDC criterion, given a set of gene trees. However, the heuristic is not guaranteed to compute optimal solutions, and its hill-climbing search makes it slow in practice. In this paper, we provide two exact solutions to the problem of inferring the species tree from a set of gene trees under the MDC criterion. In other words, our solutions are guaranteed to find the tree that minimizes the total number of deep coalescences from a set of gene trees. One solution is based on a novel integer linear programming (ILP) formulation, and another is based on a simple dynamic programming (DP) approach. Powerful ILP solvers, such as CPLEX, make the first solution appealing, particularly for very large-scale instances of the problem, whereas the DP-based solution eliminates dependence on proprietary tools, and its simplicity makes it easy to integrate with other genomic events that may cause gene tree incongruence. Using the exact solutions, we analyze a data set of 106 loci from eight yeast species, a data set of 268 loci from eight Apicomplexan species, and several simulated data sets. We show that the MDC criterion provides very accurate estimates of the species tree topologies, and that our solutions are very fast, thus allowing for the accurate analysis of genome-scale data sets. Further, the efficiency of the solutions allows for quick exploration of sub-optimal solutions, which is important for a parsimony-based criterion such as MDC, as we show. We show that searching for the species tree in the compatibility graph of the clusters induced by the gene trees may be sufficient in practice, a finding that helps ameliorate the

  7. Species tree inference by minimizing deep coalescences.

    Directory of Open Access Journals (Sweden)

    Cuong Than

    2009-09-01

    Full Text Available In a 1997 seminal paper, W. Maddison proposed minimizing deep coalescences, or MDC, as an optimization criterion for inferring the species tree from a set of incongruent gene trees, assuming the incongruence is exclusively due to lineage sorting. In a subsequent paper, Maddison and Knowles provided and implemented a search heuristic for optimizing the MDC criterion, given a set of gene trees. However, the heuristic is not guaranteed to compute optimal solutions, and its hill-climbing search makes it slow in practice. In this paper, we provide two exact solutions to the problem of inferring the species tree from a set of gene trees under the MDC criterion. In other words, our solutions are guaranteed to find the tree that minimizes the total number of deep coalescences from a set of gene trees. One solution is based on a novel integer linear programming (ILP) formulation, and another is based on a simple dynamic programming (DP) approach. Powerful ILP solvers, such as CPLEX, make the first solution appealing, particularly for very large-scale instances of the problem, whereas the DP-based solution eliminates dependence on proprietary tools, and its simplicity makes it easy to integrate with other genomic events that may cause gene tree incongruence. Using the exact solutions, we analyze a data set of 106 loci from eight yeast species, a data set of 268 loci from eight Apicomplexan species, and several simulated data sets. We show that the MDC criterion provides very accurate estimates of the species tree topologies, and that our solutions are very fast, thus allowing for the accurate analysis of genome-scale data sets. Further, the efficiency of the solutions allows for quick exploration of sub-optimal solutions, which is important for a parsimony-based criterion such as MDC, as we show. We show that searching for the species tree in the compatibility graph of the clusters induced by the gene trees may be sufficient in practice, a finding that helps

  8. Ten-month-old infants infer the value of goals from the costs of actions.

    Science.gov (United States)

    Liu, Shari; Ullman, Tomer D; Tenenbaum, Joshua B; Spelke, Elizabeth S

    2017-11-24

    Infants understand that people pursue goals, but how do they learn which goals people prefer? We tested whether infants solve this problem by inverting a mental model of action planning, trading off the costs of acting against the rewards actions bring. After seeing an agent attain two goals equally often at varying costs, infants expected the agent to prefer the goal it attained through costlier actions. These expectations held across three experiments that conveyed cost through different physical path features (height, width, and incline angle), suggesting that an abstract variable-such as "force," "work," or "effort"-supported infants' inferences. We modeled infants' expectations as Bayesian inferences over utility-theoretic calculations, providing a bridge to recent quantitative accounts of action understanding in older children and adults. Copyright © 2017, American Association for the Advancement of Science.
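
    A toy version of inverting a model of action planning can be written as grid-based Bayesian inference: assume utility = reward - cost and a softmax choice rule, then update a posterior over each goal's reward from the agent's accept/decline pattern. The costs, reward grid, and choice rule below are illustrative assumptions, not the authors' model.

    import numpy as np

    costs = [1.0, 3.0]                        # assumed effort costs (low, high)
    observed = {"A": [1, 1],                  # pursued goal A at both costs
                "B": [1, 0]}                  # pursued goal B only when cheap

    rewards = np.linspace(0.0, 6.0, 121)      # grid over candidate reward values

    def p_accept(reward, cost, beta=2.0):
        """Softmax choice rule on utility = reward - cost."""
        return 1.0 / (1.0 + np.exp(-beta * (reward - cost)))

    for goal, actions in observed.items():
        like = np.ones_like(rewards)
        for c, a in zip(costs, actions):
            p = p_accept(rewards, c)
            like *= p if a else 1.0 - p
        post = like / like.sum()              # flat prior over the grid
        mean_reward = float((rewards * post).sum())
        print(f"goal {goal}: posterior mean reward = {mean_reward:.2f}")
    # The goal attained through costlier actions (A) earns the higher inferred
    # reward, mirroring the infants' expectations.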

  9. Conflicting Epistemologies and Inference in Coupled Human and Natural Systems

    Science.gov (United States)

    Garcia, M. E.

    2017-12-01

    Last year, I presented a model that projects per capita water consumption based on changes in price, population, building codes, and water stress salience. This model applied methods from hydrological science and engineering to relationships both within and beyond their traditional scope. Epistemologically, the development of mathematical models of natural or engineered systems is objectivist, while research examining relationships between observations, perceptions and action is commonly constructivist or subjectivist. Drawing on multiple epistemologies is common in, and perhaps central to, the growing fields of coupled human and natural systems and socio-hydrology. Critically, these philosophical perspectives vary in their view of the nature of the system as mechanistic, adaptive or constructed, and the split between aleatory and epistemic uncertainty. Interdisciplinary research is commonly cited as a way to address the critical and domain-crossing challenge of sustainability, as synthesis across perspectives can offer a more comprehensive view of system dynamics. However, combining methods and concepts from multiple ontologies and epistemologies can introduce contradictions into the logic of inference. These contradictions challenge the evaluation of research products, and the implications for the practical application of research findings are not fully understood. Reflections on the evaluation, application, and generalization of the water consumption model described above are used to ground these broader questions and offer thoughts on the way forward.

  10. Modeling and notation of DEA with strong and weak disposable outputs.

    Science.gov (United States)

    Kuntz, Ludwig; Sülz, Sandra

    2011-12-01

    Recent articles published in Health Care Management Science have described DEA applications under the assumption of strong and weak disposable outputs. As we can show that these papers include some methodological deficiencies, we aim to illustrate a revised approach.

  11. Bayesian inference: data evaluation and decisions

    CERN Document Server

    Harney, Hanns Ludwig

    2016-01-01

    This new edition offers a comprehensive introduction to the analysis of data using Bayes rule. It generalizes Gaussian error intervals to situations in which the data follow distributions other than Gaussian. This is particularly useful when the observed parameter is barely above the background or the histogram of multiparametric data contains many empty bins, so that the determination of the validity of a theory cannot be based on the chi-squared criterion. In addition to the solutions of practical problems, this approach provides an epistemic insight: the logic of quantum mechanics is obtained as the logic of unbiased inference from counting data. New sections feature factorizing parameters, commuting parameters, observables in quantum mechanics, the art of fitting with coherent and with incoherent alternatives, and fitting with the multinomial distribution. Additional problems and examples help deepen the knowledge. Requiring no knowledge of quantum mechanics, the book is written at an introductory level, with man...
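
    The book's central setting, counting data where the signal is barely above background, is easy to sketch: Bayes' rule applied to a Poisson likelihood yields an asymmetric posterior that respects the physical boundary, where a Gaussian (chi-squared) interval would not. The counts and background below are assumed values.

    import numpy as np

    observed_n = 5        # counts in the signal region (assumed)
    background = 3.2      # known expected background counts (assumed)

    s = np.linspace(0.0, 15.0, 1501)          # grid over the signal rate
    ds = s[1] - s[0]
    log_like = observed_n * np.log(s + background) - (s + background)
    post = np.exp(log_like - log_like.max())  # flat prior on s >= 0
    post /= post.sum() * ds                   # normalize the density

    cdf = np.cumsum(post) * ds
    upper95 = s[np.searchsorted(cdf, 0.95)]
    print(f"posterior mode: {s[np.argmax(post)]:.2f}, "
          f"95% upper limit: {upper95:.2f}")
    # The interval is asymmetric and respects s >= 0, unlike a Gaussian
    # (chi-squared-based) interval at such low counts.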

  12. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  13. Progression inference for somatic mutations in cancer

    Directory of Open Access Journals (Sweden)

    Leif E. Peterson

    2017-04-01

    Full Text Available Computational methods were employed to determine progression inference of genomic alterations in commonly occurring cancers. Using cross-sectional TCGA data, we computed evolutionary trajectories involving selectivity relationships among pairs of gene-specific genomic alterations such as somatic mutations, deletions, amplifications, downregulation, and upregulation among the top 20 driver genes associated with each cancer. Results indicate that the majority of hierarchies involved TP53, PIK3CA, ERBB2, APC, KRAS, EGFR, IDH1, VHL, etc. Research into the order and accumulation of genomic alterations among cancer driver genes will only increase as the costs of next-generation sequencing decline and personalized/precision medicine incorporates whole-genome scans into the diagnosis and treatment of cancer. Keywords: Oncology, Cancer research, Genetics, Computational biology

  14. Inferring human intentions from the brain data

    DEFF Research Database (Denmark)

    Stanek, Konrad

    The human brain is a massively complex organ composed of approximately a hundred billion densely interconnected, interacting neural cells. The neurons are not wired randomly - instead, they are organized in local functional assemblies. It is believed that the complex patterns of dynamic electric discharges across the neural tissue are responsible for emergence of high cognitive function, conscious perception and voluntary action. The brain's capacity to exercise free will, or internally generated free choice, has long been investigated by philosophers, psychologists and neuroscientists. Rather than assuming a causal power of conscious will, the neuroscience of volition is based on the premise that "mental states rest on brain processes", and hence by measuring spatial and temporal correlates of volition in carefully controlled experiments we can infer about their underlying mind processes, including...

  15. Cancer Evolution: Mathematical Models and Computational Inference

    Science.gov (United States)

    Beerenwinkel, Niko; Schwarz, Roland F.; Gerstung, Moritz; Markowetz, Florian

    2015-01-01

    Cancer is a somatic evolutionary process characterized by the accumulation of mutations, which contribute to tumor growth, clinical progression, immune escape, and drug resistance development. Evolutionary theory can be used to analyze the dynamics of tumor cell populations and to make inference about the evolutionary history of a tumor from molecular data. We review recent approaches to modeling the evolution of cancer, including population dynamics models of tumor initiation and progression, phylogenetic methods to model the evolutionary relationship between tumor subclones, and probabilistic graphical models to describe dependencies among mutations. Evolutionary modeling helps to understand how tumors arise and will also play an increasingly important prognostic role in predicting disease progression and the outcome of medical interventions, such as targeted therapy. PMID:25293804

  16. Supplier Selection Using Fuzzy Inference System

    Directory of Open Access Journals (Sweden)

    hamidreza kadhodazadeh

    2014-01-01

    Full Text Available Suppliers are one of the most vital parts of the supply chain, whose operation has a significant indirect effect on customer satisfaction. Since customers' expectations of organizations differ, organizations should weigh different criteria accordingly. There have been many studies in this field using different criteria and methods in recent years. The purpose of this study is to propose an approach for choosing a supplier in a food manufacturing company considering cost, quality, service, type of relationship, and the organizational structure of the supplier. To evaluate suppliers against these criteria, a fuzzy inference system has been used. The input data of this system comprise each supplier's score on every criterion, obtained with the AHP approach, and the output is the final score of each supplier. Finally, a supplier was selected that, although not the best on price and quality, achieved a good score across all of the criteria.
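
    A minimal Mamdani-style fuzzy inference system of the kind described can be written directly in Python: fuzzify the criterion scores, fire a small rule base with min/max operators, and defuzzify by centroid. The membership functions, rules, and inputs below are illustrative assumptions (the study additionally feeds AHP-derived scores into the system).

    import numpy as np

    def up(x, a, b):      # ramp 0 -> 1 between a and b
        return float(np.clip((x - a) / (b - a), 0.0, 1.0))

    def down(x, a, b):    # ramp 1 -> 0 between a and b
        return float(np.clip((b - x) / (b - a), 0.0, 1.0))

    def tri(x, a, b, c):  # triangular membership on an array
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def score_supplier(cost, quality):
        """Inputs on a 0-10 scale (cost pre-inverted: higher = cheaper)."""
        low_c, high_c = down(cost, 0, 6), up(cost, 4, 10)
        low_q, high_q = down(quality, 0, 6), up(quality, 4, 10)

        out = np.linspace(0, 10, 201)             # output universe: final score
        poor = np.clip((5 - out) / 5, 0, 1)
        good = tri(out, 2, 5, 8)
        excellent = np.clip((out - 5) / 5, 0, 1)

        # Mamdani rules: min for AND, clip each consequent, max-aggregate.
        agg = np.maximum.reduce([
            np.minimum(min(high_c, high_q), excellent),
            np.minimum(max(min(high_c, low_q), min(low_c, high_q)), good),
            np.minimum(min(low_c, low_q), poor),
        ])
        return (out * agg).sum() / agg.sum()      # centroid defuzzification

    print(f"supplier score: {score_supplier(cost=7.5, quality=8.0):.2f}")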

  17. Field dynamics inference via spectral density estimation

    Science.gov (United States)

    Frank, Philipp; Steininger, Theo; Enßlin, Torsten A.

    2017-11-01

    Stochastic differential equations are of utmost importance in various scientific and industrial areas. They are the natural description of dynamical processes whose precise equations of motion are either not known or too expensive to solve, e.g., when modeling Brownian motion. In some cases, the equations governing the dynamics of a physical system on macroscopic scales turn out to be unknown, since they typically cannot be deduced from general principles. In this work, we describe how the underlying laws of a stochastic process can be approximated by the spectral density of the corresponding process. Furthermore, we show how the density can be inferred from possibly very noisy and incomplete measurements of the dynamical field. Generally, inverse problems like these can be tackled with the help of Information Field Theory. For now, we restrict ourselves to linear and autonomous processes. To demonstrate its applicability, we employ our reconstruction algorithm on time series and spatiotemporal processes.
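
    For the linear, autonomous case the paper restricts itself to, the idea can be demonstrated with an Ornstein-Uhlenbeck process: its spectral density is a Lorentzian whose plateau and knee encode the process parameters, and a periodogram of a single noisy realization already recovers the plateau. The parameter values below are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    gamma, sigma, dt, n = 0.5, 1.0, 0.1, 2**14    # assumed process parameters

    # Euler-Maruyama simulation of dx = -gamma * x dt + sigma dW.
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = x[i - 1] - gamma * x[i - 1] * dt \
               + sigma * np.sqrt(dt) * rng.normal()

    # Periodogram estimate of the power spectral density.
    psd = (dt / n) * np.abs(np.fft.rfft(x)) ** 2

    # For an OU process the PSD is sigma^2 / ((2*pi*f)^2 + gamma^2): the
    # low-frequency plateau sigma^2 / gamma^2 and the knee at f ~ gamma/(2*pi)
    # together encode the underlying dynamical law.
    plateau_est = psd[1:40].mean()
    print(f"plateau: estimated {plateau_est:.2f}, "
          f"analytic {sigma**2 / gamma**2:.2f}")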

  18. BOOTSTRAP-BASED INFERENCE FOR GROUPED DATA

    Directory of Open Access Journals (Sweden)

    Jorge Iván Vélez

    2015-07-01

    Full Text Available Grouped data refers to continuous variables that are partitioned into intervals, not necessarily of the same length, to facilitate its interpretation. Unlike in ungrouped data, estimating simple summary statistics such as the mean and mode, or more complex ones such as a percentile or the coefficient of variation, is a difficult endeavour in grouped data. When the probability distribution generating the data is unknown, inference in ungrouped data is carried out using parametric or nonparametric resampling methods. However, there are no equivalent methods in the case of grouped data. Here, a bootstrap-based procedure to estimate the parameters of an unknown distribution based on grouped data is proposed, described and illustrated.
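
    The proposed procedure is straightforward to sketch: resample bins according to their observed frequencies, jitter each draw uniformly within its interval, and compute the statistic of interest on every pseudo-sample. The bin edges and counts below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(42)
    edges = np.array([0.0, 10.0, 20.0, 30.0, 50.0])   # unequal interval widths
    counts = np.array([12, 30, 41, 17])               # observed frequencies
    n = counts.sum()

    def resample_grouped():
        """One pseudo-sample: pick bins by frequency, jitter uniformly within."""
        bins = rng.choice(len(counts), size=n, p=counts / n)
        return rng.uniform(edges[bins], edges[bins + 1])

    stats = np.array([np.mean(resample_grouped()) for _ in range(5000)])
    lo, hi = np.percentile(stats, [2.5, 97.5])
    print(f"bootstrap mean: {stats.mean():.2f}, 95% CI: ({lo:.2f}, {hi:.2f})")
    # The same loop yields intervals for a percentile or the coefficient of
    # variation, statistics that are awkward to derive analytically from
    # grouped data.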

  19. Inferring Past Environments from Ancient Epigenomes.

    Science.gov (United States)

    Gokhman, David; Malul, Anat; Carmel, Liran

    2017-10-01

    Analyzing the conditions in which past individuals lived is key to understanding the environments and cultural transitions to which humans had to adapt. Here, we suggest a methodology to probe into past environments, using reconstructed premortem DNA methylation maps of ancient individuals. We review a large body of research showing that differential DNA methylation is associated with changes in various external and internal factors, and propose that loci whose DNA methylation level is environmentally responsive could serve as markers to infer about ancient daily life, diseases, nutrition, exposure to toxins, and more. We demonstrate this approach by showing that hunger-related DNA methylation changes are found in ancient hunter-gatherers. The strategy we present here opens a window to reconstruct previously inaccessible aspects of the lives of past individuals. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  20. Automatic inference of indexing rules for MEDLINE.

    Science.gov (United States)

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent

    2008-11-19

    Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  1. MISTIC: mutual information server to infer coevolution

    DEFF Research Database (Denmark)

    Simonetti, Franco L.; Teppa, Elin; Chernomoretz, Ariel

    2013-01-01

    MISTIC (mutual information server to infer coevolution) is a web server for graphical representation of the information contained within a MSA (multiple sequence alignment) and a complete analysis tool for Mutual Information networks in protein families. The server outputs a graphical visualization of several information-related quantities using a circos representation. This provides an integrated view of the MSA in terms of (i) the mutual information (MI) between residue pairs, (ii) sequence conservation and (iii) the residue cumulative and proximity MI scores. Further, an interactive interface ... containing all results can be downloaded. The server is available at http://mistic.leloir.org.ar. In summary, MISTIC allows for a comprehensive, compact, visually rich view of the information contained within an MSA in a manner unique to any other publicly available web server. In particular, the use...

  2. Active inference and the anatomy of oculomotion.

    Science.gov (United States)

    Parr, Thomas; Friston, Karl J

    2018-03-01

    Given that eye movement control can be framed as an inferential process, how are the requisite forces generated to produce anticipated or desired fixation? Starting from a generative model based on simple Newtonian equations of motion, we derive a variational solution to this problem and illustrate the plausibility of its implementation in the oculomotor brainstem. We show, through simulation, that the Bayesian filtering equations that implement 'planning as inference' can generate both saccadic and smooth pursuit eye movements. Crucially, the associated message passing maps well onto the known connectivity and neuroanatomy of the brainstem - and the changes in these messages over time are strikingly similar to single unit recordings of neurons in the corresponding nuclei. Furthermore, we show that simulated lesions to axonal pathways reproduce eye movement patterns of neurological patients with damage to these tracts. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Inferring Phylogenetic Networks from Gene Order Data

    Directory of Open Access Journals (Sweden)

    Alexey Anatolievich Morozov

    2013-01-01

    Full Text Available Existing algorithms allow us to infer phylogenetic networks from sequences (DNA, protein or binary), sets of trees, and distance matrices, but there are no methods to build them using the gene order data as an input. Here we describe several methods to build split networks from the gene order data, perform simulation studies, and use our methods for analyzing and interpreting different real gene order datasets. All proposed methods are based on intermediate data, which can be generated from genome structures under study and used as an input for network construction algorithms. Three intermediates are used: set of jackknife trees, distance matrix, and binary encoding. According to simulations and case studies, the best intermediates are jackknife trees and distance matrix (when used with the Neighbor-Net algorithm). Binary encoding can also be useful, but only when the methods mentioned above cannot be used.

  4. Population inference from contemporary American craniometrics.

    Science.gov (United States)

    Algee-Hewitt, Bridget F B

    2016-08-01

    This analysis delivers a composite picture of population structure, admixture, ancestry variation, and personal identity in the United States, as observed through the lens of forensic anthropological casework and modern skeletal collections. It tests the applicability of the probabilistic clustering methods commonly used in human population genetics for the analysis of continuous, cranial measurement data, to improve population inference for admixed individuals without prior knowledge of sample origins. The unsupervised model-based clustering methods of finite mixture analysis are used here to reveal latent population structure and generate admixture proportions for craniofacial measurements from the Forensic Anthropology Data Bank (FDB). Craniometric estimates of ancestry are also generated under a three contributor model, sourcing parental reference populations from the Howells Craniometric Dataset. Tests of association are made among the coefficients of cluster memberships and the demographic information documented for each individual in the FDB. Clustering results are contextualized within the framework of conventional approaches to population structure analysis and individual ancestry estimation to discuss method compatibility. The findings reported here for contemporary American craniometrics are in agreement with the expected patterns of intergroup relationships, geographic origins and results from published genetic analyses. Population inference methods that allow for the model-bound estimation of admixture and ancestry proportions from craniometric data not only enable parallel-skeletal and genetic-analyses but they are also shown to be more informative than those methods that perform hard classifications using externally-imposed categories or seek to explain gross variation by low-dimensional projections. Am J Phys Anthropol 160:604-624, 2016. © 2016 Wiley Periodicals, Inc.
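
    The unsupervised finite-mixture step can be sketched with scikit-learn's GaussianMixture: soft cluster memberships from predict_proba play the role of admixture-like proportions, so an individual lying between clusters receives fractional memberships rather than a hard label. The synthetic measurements below stand in for the FDB/Howells data, which are not reproduced here.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(7)
    # Three synthetic 'parental' clusters in a 4-measurement space.
    means = np.array([[130, 95, 70, 100],
                      [140, 90, 75, 105],
                      [135, 100, 65, 95]], dtype=float)
    X = np.vstack([rng.normal(m, 4.0, size=(60, 4)) for m in means])

    gm = GaussianMixture(n_components=3, covariance_type="full",
                         random_state=0).fit(X)

    # Soft memberships act as admixture-like proportions: a point midway
    # between two clusters gets fractional memberships, not a hard label.
    mixed = (means[0] + means[1]) / 2.0
    print(np.round(gm.predict_proba(mixed.reshape(1, -1)), 2))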

  5. Network geometry inference using common neighbors

    Science.gov (United States)

    Papadopoulos, Fragkiskos; Aldecoa, Rodrigo; Krioukov, Dmitri

    2015-08-01

    We introduce and explore a method for inferring hidden geometric coordinates of nodes in complex networks based on the number of common neighbors between the nodes. We compare this approach to the HyperMap method, which is based only on the connections (and disconnections) between the nodes, i.e., on the links that the nodes have (or do not have). We find that for high degree nodes, the common-neighbors approach yields a more accurate inference than the link-based method, unless heuristic periodic adjustments (or "correction steps") are used in the latter. The common-neighbors approach is computationally intensive, requiring O(t^4) running time to map a network of t nodes, versus O(t^3) in the link-based method. But we also develop a hybrid method with O(t^3) running time, which combines the common-neighbors and link-based approaches, and we explore a heuristic that reduces its running time further to O(t^2), without significant reduction in the mapping accuracy. We apply this method to the autonomous systems (ASs) Internet, and we reveal how soft communities of ASs evolve over time in the similarity space. We further demonstrate the method's predictive power by forecasting future links between ASs. Taken altogether, our results advance our understanding of how to efficiently and accurately map real networks to their latent geometric spaces, which is an important necessary step toward understanding the laws that govern the dynamics of nodes in these spaces, and the fine-grained dynamics of network connections.
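
    The core signal the method exploits is easy to compute: nodes that are close in the latent space share many common neighbors, so common-neighbor counts both rank candidate links and constrain latent coordinates. The toy graph below is an illustrative assumption, not the AS-level Internet data analyzed in the paper.

    from itertools import combinations

    edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"),
             ("b", "d"), ("c", "d"), ("d", "e"), ("e", "f")]

    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    def common_neighbors(u, v):
        return len(adj[u] & adj[v])

    # Rank non-adjacent pairs by common-neighbor count: high counts suggest
    # small latent distance, which also makes them natural link predictions.
    pairs = [(u, v) for u, v in combinations(sorted(adj), 2) if v not in adj[u]]
    for u, v in sorted(pairs, key=lambda p: -common_neighbors(*p)):
        print(u, v, common_neighbors(u, v))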

  6. Strong Bisimilarity of Simple Process Algebras

    DEFF Research Database (Denmark)

    Srba, Jirí

    2003-01-01

    We study bisimilarity and regularity problems of simple process algebras. In particular, we show PSPACE-hardness of the following problems: (i) strong bisimilarity of Basic Parallel Processes (BPP), (ii) strong bisimilarity of Basic Process Algebra (BPA), (iii) strong regularity of BPP, and (iv) strong regularity of BPA.

  7. [Inferences and verbal comprehension in children with developmental language disorders].

    Science.gov (United States)

    Monfort, Isabelle; Monfort, Marc

    2013-02-22

    We review the concept of inference in language comprehension, both oral and written, recalling the different classifications that have been proposed. We analyze the types of difficulties that children might encounter in applying inferences, depending on the type of language or developmental pathology. Finally, we describe the intervention proposals that have been made to enhance the ability to apply inferences in language comprehension.

  8. VINE: A Variational Inference-Based Bayesian Neural Network Engine

    Science.gov (United States)

    2018-01-01

    ... functions and learning rates. The Python implementation that will be turned in is a parameterized implementation of the EASI algorithm in the sense that ... a Variational Inference (VI) engine to perform inference and learning (statically and on-the-fly) under uncertain or incomplete input and output features. A secondary ... realization, and that can not only do inference but also can be retrained on-the-fly based on incoming data.

  9. Science in Computational Sciences

    Directory of Open Access Journals (Sweden)

    Jameson Cerrosen

    2012-12-01

    Full Text Available Existing theory about science presents physics as the ideal, although many sciences do not resemble it, so the current philosophy of science (the theory of science) is of little help when it comes to analyzing computer science, an emerging field of knowledge that investigates computers, which materialize ideas that attempt to structure knowledge and information about the world. Computer science is based on logic and mathematics, but both its theoretical and experimental research methods follow the patterns of the classical scientific fields. Modeling and computer simulation, as a method, are specific to the discipline and will be further developed in the near future, applied not only to computers but also to other scientific fields. This article analyzes the aspects of science within computer science, presents an approach to the definition of science and the scientific method in general, and describes the relationships between science, research, development and technology.

  10. Probabilistic logic networks: a comprehensive framework for uncertain inference

    CERN Document Server

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad range of reasoning types is considered.

  11. Parametric statistical inference: basic theory and modern approaches

    CERN Document Server

    Zacks, Shelemyahu; Tsokos, C P

    1981-01-01

    Parametric Statistical Inference: Basic Theory and Modern Approaches presents the developments and modern trends in statistical inference to students who do not have advanced mathematical and statistical preparation. The topics discussed in the book are basic and common to many fields of statistical inference and thus serve as a jumping board for in-depth study. The book is organized into eight chapters. Chapter 1 provides an overview of how the theory of statistical inference is presented in subsequent chapters. Chapter 2 briefly discusses statistical distributions and their properties. Chapt

  12. Multi-Modal Inference in Animacy Perception for Artificial Object

    Directory of Open Access Journals (Sweden)

    Kohske Takahashi

    2011-10-01

    Full Text Available Sometimes we feel animacy for artificial objects and their motion. Animals usually interact with environments through multiple sensory modalities. Here we investigated how the sensory responsiveness of artificial objects to the environment would contribute to animacy judgment for them. In a 90-s trial, observers freely viewed four objects moving in a virtual 3D space. The objects, whose position and motion were determined following Perlin-noise series, kept drifting independently in the space. Visual flashes, auditory bursts, or synchronous flashes and bursts appeared with 1–2 s intervals. The first object abruptly accelerated its motion just after visual flashes, giving an impression of responding to the flash. The second object responded to bursts. The third object responded to synchronous flashes and bursts. The fourth object accelerated at random times, independent of flashes and bursts. The observers rated how strongly they felt animacy for each object. The results showed that the object responding to the auditory bursts was rated as having weaker animacy compared to the other objects. This implies that the sensory modality through which an object interacts with the environment may be a factor for animacy perception in the object and may serve as the basis of multi-modal and cross-modal inference of animacy.

  13. The origins of probabilistic inference in human infants.

    Science.gov (United States)

    Denison, Stephanie; Xu, Fei

    2014-03-01

    Reasoning under uncertainty is the bread and butter of everyday life. Many areas of psychology, from cognitive, developmental, social, to clinical, are interested in how individuals make inferences and decisions with incomplete information. The ability to reason under uncertainty necessarily involves probability computations, be they exact calculations or estimations. What are the developmental origins of probabilistic reasoning? Recent work has begun to examine whether infants and toddlers can compute probabilities; however, previous experiments have confounded quantity and probability: in most cases young human learners could have relied on simple comparisons of absolute quantities, as opposed to proportions, to succeed in these tasks. We present four experiments providing evidence that infants younger than 12 months show sensitivity to probabilities based on proportions. Furthermore, infants use this sensitivity to make predictions and fulfill their own desires, providing the first demonstration that even preverbal learners use probabilistic information to navigate the world. These results provide strong evidence for a rich quantitative and statistical reasoning system in infants. Copyright © 2013 Elsevier B.V. All rights reserved.
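
    The confound the authors remove can be made concrete with two jars whose contents dissociate absolute quantity from proportion; the jar compositions below are illustrative assumptions, not the experimental stimuli.

    # A quantity-based learner and a probability-based learner prefer
    # different jars here, which is what lets the experiments tell them apart.
    jar_A = {"pink": 12, "black": 36}   # more pink objects in absolute terms
    jar_B = {"pink": 4,  "black": 4}    # fewer pink objects, higher proportion

    for name, jar in (("A", jar_A), ("B", jar_B)):
        total = sum(jar.values())
        print(f"jar {name}: {jar['pink']} pink of {total} "
              f"-> P(pink) = {jar['pink'] / total:.2f}")

    # Quantity favors jar A (12 > 4 pink objects); proportion favors jar B
    # (0.50 > 0.25). Infant choices matching jar B indicate reasoning over
    # proportions rather than raw counts.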

  14. Strong interaction effects in high-Z K⁻ atoms

    Energy Technology Data Exchange (ETDEWEB)

    Batty, C.J.; Eckhause, M.; Gall, K.P.; Guss, P.P.; Hertzog, D.W.; Kane, J.R.; Kunselman, A.R.; Miller, J.P.; O'Brien, F.; Phillips, W.C.; Powers, R.J.; Roberts, B.L.; Sutton, R.B.; Vulcan, W.F.; Welsh, R.E.; Whyley, R.J.; Winter, R.G. (Rutherford-Appleton Laboratory, Chilton, Didcot OX11 0QX, United Kingdom; College of William and Mary, Williamsburg, Virginia 23185; Boston University, Boston, Massachusetts 02215; University of Wyoming, Laramie, Wyoming 82071; California Institute of Technology, Pasadena, California 91125; Carnegie-Mellon University, Pittsburgh, Pennsylvania 15213)

    1989-11-01

    A systematic experimental study of strong interaction shifts, widths, and yields from high-Z kaonic atoms is reported. Strong interaction effects for the K⁻(8→7) transition were measured in U, Pb, and W, and the K⁻(7→6) transition in W was also observed. This is the first observation of two measurably broadened and shifted kaonic transitions in a single target and thus permitted the width of the upper state to be determined directly, rather than being inferred from yield data. The results are compared with optical-model calculations.

  15. Inferring learning rules from distribution of firing rates in cortical neurons

    Science.gov (United States)

    Lim, Sukbin; McKee, Jillian L.; Woloszyn, Luke; Amit, Yali; Freedman, David J.; Sheinberg, David L.; Brunel, Nicolas

    2015-01-01

    Information about external stimuli is thought to be stored in cortical circuits through experience-dependent modifications of synaptic connectivity. These modifications of network connectivity should lead to changes in neuronal activity, as a particular stimulus is repeatedly encountered. Here, we ask what plasticity rules are consistent with the differences in the statistics of the visual response to novel and familiar stimuli in inferior temporal cortex, an area underlying visual object recognition. We introduce a method that allows inferring the dependence of the ‘learning rule’ on post-synaptic firing rate, and show that the inferred learning rule exhibits depression for low post-synaptic rates and potentiation for high rates. The threshold separating depression from potentiation is strongly correlated with both mean and standard deviation of the firing rate distribution. Finally, we show that network models implementing a rule extracted from data show stable learning dynamics, and lead to sparser representations of stimuli. PMID:26523643

  16. Inference of purifying and positive selection in three subspecies of chimpanzees (Pan troglodytes) from exome sequencing

    DEFF Research Database (Denmark)

    Bataillon, Thomas; Duan, Jinjie; Hvilsom, Christina

    2015-01-01

    of recent gene flow from Western into Eastern chimpanzees. The striking contrast in X-linked vs. autosomal polymorphism and divergence previously reported in Central chimpanzees is also found in Eastern and Western chimpanzees. We show that the direction of selection (DoS) statistic exhibits a strong non-monotonic relationship with the strength of purifying selection S, making it inappropriate for estimating S. We instead use counts in synonymous vs. non-synonymous frequency classes to infer the distribution of S coefficients acting on non-synonymous mutations in each subspecies. The strength of purifying selection we infer is congruent with the differences in effective sizes of each subspecies: Central chimpanzees are undergoing the strongest purifying selection, followed by Eastern and Western chimpanzees. Coding indels show stronger selection against indels changing the reading frame than observed in human...

  17. Application of strong phosphoric acid to radiochemistry

    International Nuclear Information System (INIS)

    Terada, Kikuo

    1977-01-01

    Not only inorganic and organic compounds but also natural substances, such as deposits in soil, are completely decomposed and distilled by heating with strong phosphoric acid for 30 to 50 minutes. As applications of strong phosphoric acid to radiochemistry, the determination of uranium and boron by use of the solubilization effect of this substance, the titration of uranyl ion by use of the iron(II) sulfate contained in this substance, applications to tracer experiments, and the determination of radioactive ruthenium in environmental samples are reviewed. Strong phosphoric acid is also applied to activation analysis, for example, the determination of N in pyrographite with the potassium iodate-strong phosphoric acid method, the separation of Os and Ru with the cerium(IV) sulfate-strong phosphoric acid method or the potassium dichromate-strong phosphoric acid method, and the analysis of Se, As and Sb in rocks and sediments with the ammonium bromide, sodium chloride and sodium bromide-strong phosphoric acid methods. (Kanao, N.)

  18. A bias-corrected covariance estimate for improved inference with quadratic inference functions.

    Science.gov (United States)

    Westgate, Philip M

    2012-12-20

    The method of quadratic inference functions (QIF) is an increasingly popular method for the analysis of correlated data because of its multiple advantages over generalized estimating equations (GEE). One advantage is that it is more efficient for parameter estimation when the working covariance structure for the data is misspecified. In the QIF literature, the asymptotic covariance formula is used to obtain standard errors. We show that in small to moderately sized samples, these standard error estimates can be severely biased downward, therefore inflating test size and decreasing coverage probability. We propose adjustments to the asymptotic covariance formula that eliminate finite-sample biases and, as shown via simulation, lead to substantial improvements in standard error estimates, inference, and coverage. The proposed method is illustrated in application to a cluster randomized trial and a longitudinal study. Furthermore, QIF and GEE are contrasted via simulation and these applications. Copyright © 2012 John Wiley & Sons, Ltd.

  19. Properties of the subglacial till inferred from supraglacial lake drainage

    Science.gov (United States)

    Neufeld, J. A.; Hewitt, D.

    2017-12-01

    The buildup and drainage of supraglacial lakes along the margins of the Greenland ice sheet has been previously observed using detailed GPS campaigns which show that rapid drainage events are often preceded by localised, transient uplift followed by rapid, and much broader scale, uplift and flexure associated with the main drainage event [1,2]. Previous models of these events have focused on fracturing during rapid lake drainage from an impermeable bedrock [3] or a thin subglacial film [4]. We present a new model of supraglacial drainage that couples the water flux from rapid lake drainage events to a simplified model of the pore-pressure in a porous, subglacial till along with a simplified model of the flexure of glacial ice. Using a hybrid mathematical model we explore the internal transitions between turbulent and laminar flow throughout the evolving subglacial cavity and porous till. The model predicts that an initially small water flux may locally increase pore-pressure in the till leading to uplift and a local divergence in the ice velocity that may ultimately be responsible for large hydro-fracturing and full-scale drainage events. Furthermore, we find that during rapid drainage while the presence of a porous, subglacial till is crucial for propagation, the manner of spreading is remarkably insensitive to the properties of the subglacial till. This is in stark contrast to the post-drainage relaxation of the pore pressure, and hence sliding velocity, which is highly sensitive to the permeability, compressibility and thickness of subglacial till. We use our model, and the inferred sensitivity to the properties of the subglacial till after the main drainage event, to infer the properties of the subglacial till. The results suggest that a detailed interpretation of supraglacial lake drainage may provide important insights into the hydrology of the subglacial till along the margins of the Greenland ice sheet, and the coupling of pore pressure in subglacial till

  20. Limitations of the strong field approximation in ionization of the hydrogen atom by ultrashort pulses

    International Nuclear Information System (INIS)

    Arbo, D.G.; Toekesi, K.; Miraglia, J.E.; FCEN, University of Buenos Aires

    2008-01-01

    Complete text of publication follows. We presented a theoretical study of the ionization of hydrogen atoms as a result of the interaction with an ultrashort external electric field. Doubly-differential momentum distributions and angular momentum distributions of ejected electrons calculated in the framework of the Coulomb-Volkov and strong field approximations, as well as classical calculations, are compared with the exact solution of the time dependent Schroedinger equation. We have shown that the Coulomb-Volkov approximation (CVA) describes the quantum atomic ionization probabilities exactly when the external field is described by a sudden momentum transfer [1]. The velocity distribution of emitted electrons right after ionization by a sudden momentum transfer is given through the strong field approximation (SFA) within both the CVA and CTMC methods. In this case, the classical and quantum time dependent evolutions of an atom subject to a sudden momentum transfer are identical. The difference between the classical and quantum final momentum distributions resides in the time evolution of the escaping electron under the subsequent action of the Coulomb field. Furthermore, classical mechanics is incapable of reproducing the quantum angular momentum distribution due to the improper initial radial distribution used in the CTMC calculations, i.e., the microcanonical ensemble. We find that in the limit of high momentum transfer, based on the SFA, there is a direct relation between the cylindrical radial distribution dP/dρ and the final angular momentum distribution dP/dL. This leads to a closed analytical expression for the partial wave populations (dP/dL)^SFA-Q, given by dP^SFA-Q/dL = 4Z^3 L^2/(Δp)^3 K_1(2ZL/Δp), which, together with the prescription L = l + 1/2, reproduces quite accurately the quantum (CVA) results. Considering the inverse problem, knowing the final angular momentum distribution can lead to the inference of the initial probability distribution
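
    The closed-form distribution quoted above can be checked numerically; under the stated form it integrates to one and shifts toward higher L as the momentum transfer grows. The Z and Δp values below are illustrative.

    import numpy as np
    from scipy.special import k1   # modified Bessel function K_1

    def dPdL(L, Z=1.0, dp=2.0):
        """dP/dL = 4 Z^3 L^2 / (dp)^3 * K_1(2 Z L / dp)."""
        return 4.0 * Z**3 * L**2 / dp**3 * k1(2.0 * Z * L / dp)

    L = np.linspace(1e-6, 60.0, 60000)
    dL = L[1] - L[0]
    for dp in (1.0, 2.0, 4.0):
        p = dPdL(L, dp=dp)
        print(f"dp={dp}: integral={np.sum(p) * dL:.4f}, "
              f"peak near L={L[np.argmax(p)]:.2f}")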

  1. The Probabilistic Convolution Tree: Efficient Exact Bayesian Inference for Faster LC-MS/MS Protein Inference

    Science.gov (United States)

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called "causal independence"). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustration example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we reduce the runtime to ... and the space to ..., where ... is the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions. PMID:24626234
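
    The core operation is easy to sketch: the distribution of a sum of independent count variables is built by pairwise convolutions arranged as a balanced tree, rather than folding the variables in one at a time. The input distributions below are illustrative (e.g., presence/absence of splice forms).

    import numpy as np

    def convolve_tree(dists):
        """Combine count distributions pairwise until one remains."""
        layer = [np.asarray(d, dtype=float) for d in dists]
        while len(layer) > 1:
            nxt = []
            for i in range(0, len(layer) - 1, 2):
                nxt.append(np.convolve(layer[i], layer[i + 1]))
            if len(layer) % 2:          # odd one out passes up unchanged
                nxt.append(layer[-1])
            layer = nxt
        return layer[0]

    # Four Bernoulli-like variables: P(value = 0), P(value = 1) for each.
    dists = [[0.5, 0.5], [0.8, 0.2], [0.3, 0.7], [0.9, 0.1]]
    total = convolve_tree(dists)
    print("P(sum = k):", np.round(total, 4))   # k = 0..4, sums to 1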

  2. Making inference from wildlife collision data: inferring predator absence from prey strikes.

    Science.gov (United States)

    Caley, Peter; Hosack, Geoffrey R; Barry, Simon C

    2017-01-01

    Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using biologically realistic numerical response functions, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.

  3. Making inference from wildlife collision data: inferring predator absence from prey strikes

    Directory of Open Access Journals (Sweden)

    Peter Caley

    2017-02-01

    Full Text Available Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using biologically realistic numerical response functions, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.
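
    The headline number can be reproduced with a deliberately simplified model: if independent strike processes imply a fox-to-lagomorph rate ratio r, the probability of observing zero fox strikes before the 15th lagomorph strike is (1/(1+r))^15. This is a toy stand-in for the paper's conditional numerical-response analysis; the value of r below is an assumption chosen for illustration.

    # Competing Poisson strike processes: each strike is a fox strike with
    # probability r / (1 + r), so zero fox strikes among the events occurring
    # before the 15th lagomorph strike has probability (1 / (1 + r))^15.
    r = 0.58                   # assumed fox:lagomorph strike-rate ratio
    n_lagomorph = 15           # observed lagomorph strikes in Tasmania

    p_zero_fox = (1.0 / (1.0 + r)) ** n_lagomorph
    print(f"P(0 fox strikes | {n_lagomorph} lagomorph strikes) "
          f"= {p_zero_fox:.4f}")
    # Values this small are what underpin rejecting a widespread fox population.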

  4. Balkanization and Unification of Probabilistic Inferences

    Science.gov (United States)

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  5. Time clustered sampling can inflate the inferred substitution rate in foot-and-mouth disease virus analyses

    DEFF Research Database (Denmark)

    Pedersen, Casper-Emil Tingskov; Frandsen, Peter; Wekesa, Sabenzia N.

    2015-01-01

    With the emergence of analytical software for the inference of viral evolution, a number of studies have focused on estimating important parameters such as the substitution rate and the time to the most recent common ancestor (tMRCA) for rapidly evolving viruses. Coupled with an increasing ... through a study of the foot-and-mouth (FMD) disease virus serotypes SAT 1 and SAT 2. Our study shows that clustered temporal sampling in phylogenetic analyses of FMD viruses will strongly bias the inferences of substitution rates and tMRCA because the inferred rates in such data sets reflect a rate closer to the mutation rate rather than the substitution rate. Estimating evolutionary parameters from viral sequences should be performed with due consideration of the differences in short-term and longer-term evolutionary processes occurring within sets of temporally sampled viruses, and studies should carefully...

  6. Nuclear Forensic Inferences Using Iterative Multidimensional Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Robel, M; Kristo, M J; Heller, M A

    2009-06-09

    Nuclear forensics involves the analysis of interdicted nuclear material for specific material characteristics (referred to as 'signatures') that imply specific geographical locations, production processes, culprit intentions, etc. Predictive signatures rely on expert knowledge of physics, chemistry, and engineering to develop inferences from these material characteristics. Comparative signatures, on the other hand, rely on comparison of the material characteristics of the interdicted sample (the 'questioned sample' in FBI parlance) with those of a set of known samples. In the ideal case, the set of known samples would be a comprehensive nuclear forensics database, a database which does not currently exist. In fact, our ability to analyze interdicted samples and produce an extensive list of precise materials characteristics far exceeds our ability to interpret the results. Therefore, as we seek to develop the extensive databases necessary for nuclear forensics, we must also develop the methods necessary to produce the necessary inferences from comparison of our analytical results with these large, multidimensional sets of data. In the work reported here, we used a large, multidimensional dataset of results from quality control analyses of uranium ore concentrate (UOC, sometimes called 'yellowcake'). We have found that traditional multidimensional techniques, such as principal components analysis (PCA), are especially useful for understanding such datasets and drawing relevant conclusions. In particular, we have developed an iterative partial least squares-discriminant analysis (PLS-DA) procedure that has proven especially adept at identifying the production location of unknown UOC samples. By removing classes which fell far outside the initial decision boundary, and then rebuilding the PLS-DA model, we have consistently produced better and more definitive attributions than with a single pass classification approach. Performance of the
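
    The iterative PLS-DA loop can be sketched with scikit-learn, using PLSRegression on one-hot class targets as the discriminant step: classify the questioned sample, drop classes far outside the decision boundary, and rebuild the model until the attribution stabilizes. The data below are synthetic stand-ins for the UOC quality-control measurements, and the drop threshold is an assumption.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(3)
    # Five hypothetical 'production sites', 12 analytes each (synthetic).
    centers = rng.normal(0.0, 3.0, size=(5, 12))
    X = np.vstack([rng.normal(c, 1.0, size=(40, 12)) for c in centers])
    y = np.repeat(np.arange(5), 40)
    questioned = centers[2] + rng.normal(0.0, 1.0, 12)   # unknown sample

    classes = list(range(5))
    while len(classes) > 1:
        mask = np.isin(y, classes)
        Y = np.eye(5)[y[mask]][:, classes]               # one-hot targets
        pls = PLSRegression(n_components=2).fit(X[mask], Y)
        scores = pls.predict(questioned.reshape(1, -1))[0]
        # Drop classes far outside the decision boundary, then rebuild.
        keep = [c for c, s in zip(classes, scores) if s > scores.max() - 0.5]
        if keep == classes:
            break
        classes = keep

    print("attributed class(es):", classes)   # converges on class 2 here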

  7. Design, science and naturalism

    Science.gov (United States)

    Deming, David

    2008-09-01

    The Design Argument is the proposition that the presence of order in the universe is evidence for the existence of God. The Argument dates at least to the presocratic Greek philosophers, and is largely based on analogical reasoning. Following the appearance of Aquinas' Summa Theologica in the 13th century, the Christian Church in Europe embraced a Natural Theology based on observation and reason that allowed it to dominate the entire world of knowledge. Science in turn advanced itself by demonstrating that it could be of service to theology, the recognized queen of the sciences. During the heyday of British Natural Theology in the 17th and 18th centuries, the watchmaker, shipbuilder, and architect analogies were invoked reflexively by philosophers, theologians, and scientists. The Design Argument was not systematically and analytically criticized until David Hume wrote Dialogues Concerning Natural Religion in the 1750s. After Darwin published Origin of Species in 1859, Design withered on the vine. But in recent years, the Argument has been resurrected under the appellation "intelligent design," and has been the subject of political and legal controversy in the United States. Design advocates have argued that intelligent design can be formulated as a scientific hypothesis, that new scientific discoveries validate a design inference, and that naturalism must be removed as a methodological requirement in science. If science is defined by a model of concentric epistemological zonation, design cannot be construed as a scientific hypothesis because it is inconsistent with the core aspects of scientific methodology: naturalism, uniformity, induction, and efficient causation. An analytical examination of claims by design advocates finds no evidence of any type to support either scientific or philosophical claims that design can be unambiguously inferred from nature. The apparent irreducible complexity of biological mechanisms may be explained by exaptation or scaffolding. The argument…

  8. SUNYAEV-ZEL'DOVICH EFFECT OBSERVATIONS OF STRONG LENSING GALAXY CLUSTERS: PROBING THE OVERCONCENTRATION PROBLEM

    International Nuclear Information System (INIS)

    Gralla, Megan B.; Gladders, Michael D.; Marrone, Daniel P.; Bayliss, Matthew; Carlstrom, John E.; Greer, Christopher; Hennessy, Ryan; Koester, Benjamin; Leitch, Erik; Sharon, Keren; Barrientos, L. Felipe; Bonamente, Massimiliano; Bulbul, Esra; Hasler, Nicole; Culverhouse, Thomas; Hawkins, David; Lamb, James; Gilbank, David G.; Joy, Marshall; Miller, Amber

    2011-01-01

    We have measured the Sunyaev-Zel'dovich (SZ) effect for a sample of 10 strong lensing selected galaxy clusters using the Sunyaev-Zel'dovich Array (SZA). The SZA is sensitive to structures on spatial scales of a few arcminutes, while the strong lensing mass modeling constrains the mass at small scales (typically <30''). Combining the two provides information about the projected concentrations of the strong lensing clusters. The Einstein radii we measure are twice as large as expected given the masses inferred from SZ scaling relations. A Monte Carlo simulation indicates that a sample randomly drawn from the expected distribution would have a larger median Einstein radius than the observed clusters about 3% of the time. The implied overconcentration has been noted in previous studies and persists for this sample, even when we take into account that we are selecting large Einstein radius systems, suggesting that the theoretical models still do not fully describe the observed properties of strong lensing clusters.
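
    The quoted "about 3% of the time" is the kind of number a simple Monte Carlo over the expected Einstein-radius distribution produces. The sketch below illustrates only the logic; the lognormal stand-in distribution and all numbers are invented, not the paper's.

      # How often does a random 10-cluster sample from an assumed distribution
      # have a median Einstein radius at least as large as the observed one?
      import numpy as np

      rng = np.random.default_rng(0)
      observed_median = 10.0            # arcsec, hypothetical observed median
      n_clusters, n_trials = 10, 100_000

      # Stand-in for the theoretically expected Einstein-radius distribution.
      draws = rng.lognormal(mean=np.log(6.0), sigma=0.5,
                            size=(n_trials, n_clusters))
      medians = np.median(draws, axis=1)
      p = (medians >= observed_median).mean()
      print(f"fraction of random samples with median >= observed: {p:.3f}")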

  9. Stochastic Collapsed Variational Bayesian Inference for Latent Dirichlet Allocation

    NARCIS (Netherlands)

    Foulds, J.; Boyles, L.; DuBois, C.; Smyth, P.; Welling, M.; Dhillon, I.S.; Koren, Y.; Ghani, R.; Senator, T.E.; Bradley, P.; Parekh, R.; He, J.; Grossman, R.L.; Uthurusamy, R.

    2013-01-01

    There has been an explosion in the amount of digital text information available in recent years, leading to challenges of scale for traditional inference algorithms for topic models. Recent advances in stochastic variational inference algorithms for latent Dirichlet allocation (LDA) have made it…

  10. A Comparative Analysis of Fuzzy Inference Engines in Context of ...

    African Journals Online (AJOL)

    Fuzzy inference engine has found successful applications in a wide variety of fields, such as automatic control, data classification, decision analysis, expert engines, time series prediction, robotics, pattern recognition, etc. This paper presents a comparative analysis of three fuzzy inference engines, max-product, max-min ...

  11. A Comparative Analysis of Fuzzy Inference Engines in Context of ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    The horizontal coordinate of the "fuzzy centroid" of the area under that function is taken as the output. This method does not combine the effects of all applicable rules but does produce a continuous output function and is easy to implement. The product inference engine and the minimum inference engine are the most ...
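
    The contrast between the two engines named in these records can be made concrete in a few lines of NumPy: max-min clips each rule's consequent at its firing strength, max-product scales it, and both are defuzzified by the centroid of the aggregated area. The triangular output sets and firing strengths below are toy assumptions.

      # Toy comparison of max-min vs. max-product inference with centroid
      # defuzzification; output sets and rule activations are invented.
      import numpy as np

      x = np.linspace(0.0, 10.0, 1001)                 # output universe

      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      out_sets = [tri(x, 0, 2, 5), tri(x, 4, 7, 10)]   # consequents of two rules
      firing = [0.8, 0.4]                              # rule firing strengths

      for name, combine in [("max-min", np.minimum), ("max-product", np.multiply)]:
          # Clip (min) or scale (product) each consequent, aggregate with max.
          agg = np.maximum.reduce([combine(w, s) for w, s in zip(firing, out_sets)])
          centroid = (agg * x).sum() / agg.sum()       # centroid defuzzification
          print(f"{name:12s} crisp output = {centroid:.3f}")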

  12. Application of adaptive neuro-fuzzy inference system technique in ...

    African Journals Online (AJOL)

    In this paper, an adaptive neuro-fuzzy inference system (ANFIS) technique is used in the design of a rectangular microstrip patch antenna (MPA). This artificial intelligence (AI) technique is used in determining the parameters of the antenna design. The ANFIS has the advantages of the expert knowledge of a fuzzy inference system ...

  13. Developing Measures and Predictors of Observation and Inference Abilities

    Science.gov (United States)

    1976-05-01

    Key words: observation; multiple measurement; inference. …(SI group observation) to twenty-three out of thirty-nine (Table V-3, Film inference - Bob). The data indicate modest improvement in item charac…

  14. Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers.

    Science.gov (United States)

    Dowd, Jason E; Thompson, Robert J; Schiff, Leslie A; Reynolds, Julie A

    2018-01-01

    Critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students' development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference, while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students' writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference) can actually improve students' scientific reasoning in their writing. © 2018 J. E. Dowd et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  15. Strongly correlating liquids and their isomorphs

    OpenAIRE

    Pedersen, Ulf R.; Gnan, Nicoletta; Bailey, Nicholas P.; Schröder, Thomas B.; Dyre, Jeppe C.

    2010-01-01

    This paper summarizes the properties of strongly correlating liquids, i.e., liquids with strong correlations between virial and potential energy equilibrium fluctuations at constant volume. We proceed to focus on the experimental predictions for strongly correlating glass-forming liquids. These predictions include i) density scaling, ii) isochronal superposition, iii) that there is a single function from which all frequency-dependent viscoelastic response functions may be calculated, iv) that...

  16. Atom collisions in a strong electromagnetic field

    International Nuclear Information System (INIS)

    Smirnov, V.S.; Chaplik, A.V.

    1976-01-01

    It is shown that the long-range part of the interatomic interaction is considerably altered in a strong electromagnetic field. Instead of the van der Waals law, the potential asymptote can best be described by a dipole-dipole R^-3 law. Impact broadening and the line shift in a strong nonresonant field are calculated. The possibility of bound states of two atoms being formed in a strong light field is discussed.

  17. Science of science.

    Science.gov (United States)

    Fortunato, Santo; Bergstrom, Carl T; Börner, Katy; Evans, James A; Helbing, Dirk; Milojević, Staša; Petersen, Alexander M; Radicchi, Filippo; Sinatra, Roberta; Uzzi, Brian; Vespignani, Alessandro; Waltman, Ludo; Wang, Dashun; Barabási, Albert-László

    2018-03-02

    Identifying fundamental drivers of science and developing predictive models to capture its evolution are instrumental for the design of policies that can improve the scientific enterprise: for example, through enhanced career paths for scientists, better performance evaluation for organizations hosting research, discovery of novel effective funding vehicles, and even identification of promising regions along the scientific frontier. The science of science uses large-scale data on the production of science to search for universal and domain-specific patterns. Here, we review recent developments in this transdisciplinary field. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  18. Primate diversification inferred from phylogenies and fossils.

    Science.gov (United States)

    Herrera, James P

    2017-12-01

    Biodiversity arises from the balance between speciation and extinction. Fossils record the origins and disappearance of organisms, and the branching patterns of molecular phylogenies allow estimation of speciation and extinction rates, but the patterns of diversification are frequently incongruent between these two data sources. I tested two hypotheses about the diversification of primates based on ∼600 fossil species and 90% complete phylogenies of living species: (1) diversification rates increased through time; (2) a significant extinction event occurred in the Oligocene. Consistent with the first hypothesis, analyses of phylogenies supported increasing speciation rates and negligible extinction rates. In contrast, fossils showed that while speciation rates increased, speciation and extinction rates tended to be nearly equal, resulting in zero net diversification. Partially supporting the second hypothesis, the fossil data recorded a clear pattern of diversity decline in the Oligocene, although diversification rates were near zero. The phylogeny supported increased extinction ∼34 Ma, but also elevated extinction ∼10 Ma, coinciding with diversity declines in some fossil clades. The results demonstrated that estimates of speciation and extinction ignoring fossils are insufficient to infer diversification and information on extinct lineages should be incorporated into phylogenetic analyses. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.

  19. Functional network inference of the suprachiasmatic nucleus

    Energy Technology Data Exchange (ETDEWEB)

    Abel, John H.; Meeker, Kirsten; Granados-Fuentes, Daniel; St. John, Peter C.; Wang, Thomas J.; Bales, Benjamin B.; Doyle, Francis J.; Herzog, Erik D.; Petzold, Linda R.

    2016-04-04

    In the mammalian suprachiasmatic nucleus (SCN), noisy cellular oscillators communicate within a neuronal network to generate precise system-wide circadian rhythms. Although the intracellular genetic oscillator and intercellular biochemical coupling mechanisms have been examined previously, the network topology driving synchronization of the SCN has not been elucidated. This network has been particularly challenging to probe, due to its oscillatory components and slow coupling timescale. In this work, we investigated the SCN network at a single-cell resolution through a chemically induced desynchronization. We then inferred functional connections in the SCN by applying the maximal information coefficient statistic to bioluminescence reporter data from individual neurons while they resynchronized their circadian cycling. Our results demonstrate that the functional network of circadian cells associated with resynchronization has small-world characteristics, with a node degree distribution that is exponential. We show that hubs of this small-world network are preferentially located in the central SCN, with sparsely connected shells surrounding these cores. Finally, we used two computational models of circadian neurons to validate our predictions of network structure.
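
    The pipeline in this record (score all cell pairs, keep the strongest functional links, inspect the degree distribution) can be sketched compactly. The authors use the maximal information coefficient; the binned mutual information below is a simpler stand-in, and the synthetic "bioluminescence" traces are invented.

      # Functional-network sketch: pairwise dependence scores -> thresholded
      # adjacency matrix -> node-degree distribution. Plain MI stands in for MIC.
      import numpy as np

      def mutual_info(a, b, bins=8):
          """Histogram-based mutual information between two 1-D series."""
          pxy, _, _ = np.histogram2d(a, b, bins=bins)
          pxy /= pxy.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

      rng = np.random.default_rng(1)
      n_cells, n_times = 20, 500
      traces = rng.standard_normal((n_cells, n_times))   # stand-in reporter data
      traces[5] = traces[3] + 0.3 * rng.standard_normal(n_times)  # coupled pair

      mi = np.array([[mutual_info(traces[i], traces[j]) for j in range(n_cells)]
                     for i in range(n_cells)])
      np.fill_diagonal(mi, 0.0)
      adj = mi > np.quantile(mi[mi > 0], 0.95)           # keep strongest edges
      print("node degrees:", adj.sum(axis=1))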

  20. Statistical Inference Based on L-Moments

    Directory of Open Access Journals (Sweden)

    Tereza Šimková

    2017-03-01

    To overcome drawbacks of central moments and comoment matrices usually used to characterize univariate and multivariate distributions, respectively, their generalization, termed L-moments, has been proposed. L-moments of all orders are defined for any random variable or vector with finite mean. L-moments have been widely employed in the past 20 years in statistical inference. The aim of the paper is to present a review of the theory of L-moments and to illustrate their application in parameter estimation and hypothesis testing. The problem of estimating the parameters of the three-parameter generalized Pareto distribution (GPD), which is generally used in modelling extreme events, is considered. A small simulation study is performed to show the superiority of the L-moment method in some cases. Because nowadays L-moments are often employed in estimating extreme events by regional approaches, the focus is on the key assumption of index-flood based regional frequency analysis (RFA), that is, homogeneity testing. The benefits of the nonparametric L-moment homogeneity test are demonstrated on extreme meteorological events observed in the Czech Republic.
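
    As a concrete companion to this record, the first four sample L-moments can be computed from probability-weighted moments in a few lines; the estimators below are the standard unbiased ones, and the Gumbel test sample is only an illustration.

      # Sample L-moments via probability-weighted moments (PWMs), using
      # l1 = b0, l2 = 2*b1 - b0, l3 = 6*b2 - 6*b1 + b0, l4 = 20*b3 - 30*b2 + 12*b1 - b0.
      import numpy as np

      def sample_lmoments(x):
          """Return (l1, l2, L-skewness t3, L-kurtosis t4) for a 1-D sample."""
          x = np.sort(np.asarray(x, dtype=float))
          n = len(x)
          i = np.arange(1, n + 1)                    # ranks of order statistics
          b0 = x.mean()
          b1 = (((i - 1) / (n - 1)) * x).sum() / n
          b2 = (((i - 1) * (i - 2)) / ((n - 1) * (n - 2)) * x).sum() / n
          b3 = (((i - 1) * (i - 2) * (i - 3))
                / ((n - 1) * (n - 2) * (n - 3)) * x).sum() / n
          l1, l2 = b0, 2 * b1 - b0
          l3 = 6 * b2 - 6 * b1 + b0
          l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
          return l1, l2, l3 / l2, l4 / l2

      rng = np.random.default_rng(2)
      # For a Gumbel distribution, t3 and t4 should come out near 0.17 and 0.15.
      print(sample_lmoments(rng.gumbel(size=100_000)))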

  1. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
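
    The matrix-logarithm trick in this record is easy to demonstrate: the unique elements of log(Sigma) form an unconstrained vector on which a multivariate normal prior can sit, and the matrix exponential of any symmetric matrix is a valid covariance. A minimal round trip with SciPy (the example matrix is invented):

      # Unconstrained parameterization of a covariance via its matrix logarithm.
      import numpy as np
      from scipy.linalg import expm, logm

      sigma = np.array([[2.0, 0.6],
                        [0.6, 1.0]])                 # example covariance matrix
      A = logm(sigma).real                           # symmetric matrix logarithm
      theta = A[np.triu_indices(2)]                  # unique elements; a normal
                                                     # prior would live on theta
      B = np.zeros((2, 2))                           # rebuild Sigma from theta
      B[np.triu_indices(2)] = theta
      B = B + np.triu(B, 1).T                        # symmetrize
      print(np.allclose(expm(B), sigma))             # True: exp undoes log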

  2. Aesthetic quality inference for online fashion shopping

    Science.gov (United States)

    Chen, Ming; Allebach, Jan

    2014-03-01

    On-line fashion communities in which participants post photos of personal fashion items for viewing and possible purchase by others are becoming increasingly popular. Generally, these photos are taken by individuals who have no training in photography with low-cost mobile phone cameras. It is desired that photos of the products have high aesthetic quality to improve the users' online shopping experience. In this work, we design features for aesthetic quality inference in the context of online fashion shopping. Psychophysical experiments are conducted to construct a database of the photos' aesthetic evaluation, specifically for photos from an online fashion shopping website. We then extract both generic low-level features and high-level image attributes to represent the aesthetic quality. Using a support vector machine framework, we train a predictor of the aesthetic quality rating based on the feature vector. Experimental results validate the efficacy of our approach. Metadata such as the product type are also used to further improve the result.

  3. Inferring gene networks from discrete expression data

    KAUST Repository

    Zhang, L.

    2013-07-18

    The modeling of gene networks from transcriptional expression data is an important tool in biomedical research to reveal signaling pathways and to identify treatment targets. Current gene network modeling is primarily based on the use of Gaussian graphical models applied to continuous data, which give a closed-form marginal likelihood. In this paper, we extend network modeling to discrete data, specifically data from serial analysis of gene expression, and RNA-sequencing experiments, both of which generate counts of mRNA transcripts in cell samples. We propose a generalized linear model to fit the discrete gene expression data and assume that the log ratios of the mean expression levels follow a Gaussian distribution. We restrict the gene network structures to decomposable graphs and derive the graphs by selecting the covariance matrix of the Gaussian distribution with the hyper-inverse Wishart priors. Furthermore, we incorporate prior network models based on gene ontology information, which makes available existing biological information on the genes of interest. We conduct simulation studies to examine the performance of our discrete graphical model and apply the method to two real datasets for gene network inference. © The Author 2013. Published by Oxford University Press. All rights reserved.

  4. Probabilistic phylogenetic inference with insertions and deletions.

    Directory of Open Access Journals (Sweden)

    Elena Rivas

    2008-09-01

    A fundamental task in sequence analysis is to calculate the probability of a multiple alignment given a phylogenetic tree relating the sequences and an evolutionary model describing how sequences change over time. However, the most widely used phylogenetic models only account for residue substitution events. We describe a probabilistic model of a multiple sequence alignment that accounts for insertion and deletion events in addition to substitutions, given a phylogenetic tree, using a rate matrix augmented by the gap character. Starting from a continuous Markov process, we construct a non-reversible generative (birth-death) evolutionary model for insertions and deletions. The model assumes that insertion and deletion events occur one residue at a time. We apply this model to phylogenetic tree inference by extending the program dnaml in phylip. Using standard benchmarking methods on simulated data and a new "concordance test" benchmark on real ribosomal RNA alignments, we show that the extended program dnamlε improves accuracy relative to the usual approach of ignoring gaps, while retaining the computational efficiency of the Felsenstein peeling algorithm.

  5. Information-Theoretic Inference of Common Ancestors

    Directory of Open Access Journals (Sweden)

    Bastian Steudel

    2015-04-01

    A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is, if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach's principle of common cause to more than two variables. Our conclusions are valid also for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of "mutual information" that includes the stochastic as well as the algorithmic version.

  6. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  7. Logical inference techniques for loop parallelization

    KAUST Repository

    Oancea, Cosmin E.

    2012-01-01

    This paper presents a fully automatic approach to loop parallelization that integrates the use of static and run-time analysis and thus overcomes many known difficulties such as nonlinear and indirect array indexing and complex control flow. Our hybrid analysis framework validates the parallelization transformation by verifying the independence of the loop's memory references. To this end it represents array references using the USR (uniform set representation) language and expresses the independence condition as an equation, S = Ø, where S is a set expression representing array indexes. Using a language instead of an array-abstraction representation for S results in a smaller number of conservative approximations but exhibits a potentially-high runtime cost. To alleviate this cost we introduce a language translation F from the USR set-expression language to an equally rich language of predicates (F(S) ⇒ S = Ø). Loop parallelization is then validated using a novel logic inference algorithm that factorizes the obtained complex predicates (F(S)) into a sequence of sufficient-independence conditions that are evaluated first statically and, when needed, dynamically, in increasing order of their estimated complexities. We evaluate our automated solution on 26 benchmarks from the PERFECT Club and SPEC suites and show that our approach is effective in parallelizing large, complex loops and obtains much better full program speedups than the Intel and IBM Fortran compilers. Copyright © 2012 ACM.
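
    The independence condition S = Ø is the heart of this record (and of its duplicate below). A toy run-time check, not the USR machinery of the paper, makes it concrete: a loop is safe to parallelize under this test when no iteration's writes collide with another iteration's reads or writes. The index functions are invented examples.

      # Toy run-time test of the independence equation S = {} (empty set).
      def independent(n, write_idx, read_idx):
          """True iff the cross-iteration conflict set S is empty."""
          writes = [set(write_idx(i)) for i in range(n)]
          reads = [set(read_idx(i)) for i in range(n)]
          for i in range(n):
              for j in range(n):
                  if i != j and writes[i] & (writes[j] | reads[j]):
                      return False          # S != {}: run the loop sequentially
          return True                       # S == {}: safe to parallelize

      # a[2*i] += a[2*i+1]: writes never collide across iterations -> True.
      print(independent(100, lambda i: [2 * i], lambda i: [2 * i, 2 * i + 1]))
      # a[i] = a[i+1]: neighbouring iterations conflict -> False.
      print(independent(100, lambda i: [i], lambda i: [i + 1]))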

  8. Virtual reality and consciousness inference in dreaming.

    Science.gov (United States)

    Hobson, J Allan; Hong, Charles C-H; Friston, Karl J

    2014-01-01

    This article explores the notion that the brain is genetically endowed with an innate virtual reality generator that - through experience-dependent plasticity - becomes a generative or predictive model of the world. This model, which is most clearly revealed in rapid eye movement (REM) sleep dreaming, may provide the theater for conscious experience. Functional neuroimaging evidence for brain activations that are time-locked to rapid eye movements (REMs) endorses the view that waking consciousness emerges from REM sleep - and dreaming lays the foundations for waking perception. In this view, the brain is equipped with a virtual model of the world that generates predictions of its sensations. This model is continually updated and entrained by sensory prediction errors in wakefulness to ensure veridical perception, but not in dreaming. In contrast, dreaming plays an essential role in maintaining and enhancing the capacity to model the world by minimizing model complexity and thereby maximizing both statistical and thermodynamic efficiency. This perspective suggests that consciousness corresponds to the embodied process of inference, realized through the generation of virtual realities (in both sleep and wakefulness). In short, our premise or hypothesis is that the waking brain engages with the world to predict the causes of sensations, while in sleep the brain's generative model is actively refined so that it generates more efficient predictions during waking. We review the evidence in support of this hypothesis - evidence that grounds consciousness in biophysical computations whose neuronal and neurochemical infrastructure has been disclosed by sleep research.

  9. Multiple sequence alignment accuracy and phylogenetic inference.

    Science.gov (United States)

    Ogden, T Heath; Rosenberg, Michael S

    2006-04-01

    Phylogenies are often thought to be more dependent upon the specifics of the sequence alignment rather than on the method of reconstruction. Simulation of sequences containing insertion and deletion events was performed in order to determine the role that alignment accuracy plays during phylogenetic inference. Data sets were simulated for pectinate, balanced, and random tree shapes under different conditions (ultrametric equal branch length, ultrametric random branch length, nonultrametric random branch length). Comparisons between hypothesized alignments and true alignments enabled determination of two measures of alignment accuracy, that of the total data set and that of individual branches. In general, our results indicate that as alignment error increases, topological accuracy decreases. This trend was much more pronounced for data sets derived from more pectinate topologies. In contrast, for balanced, ultrametric, equal branch length tree shapes, alignment inaccuracy had little average effect on tree reconstruction. These conclusions are based on average trends of many analyses under different conditions, and any one specific analysis, independent of the alignment accuracy, may recover very accurate or inaccurate topologies. Maximum likelihood and Bayesian, in general, outperformed neighbor joining and maximum parsimony in terms of tree reconstruction accuracy. Results also indicated that as the length of the branch and of the neighboring branches increase, alignment accuracy decreases, and the length of the neighboring branches is the major factor in topological accuracy. Thus, multiple-sequence alignment can be an important factor in downstream effects on topological reconstruction.

  10. Phylogenetic inference with weighted codon evolutionary distances.

    Science.gov (United States)

    Criscuolo, Alexis; Michel, Christian J

    2009-04-01

    We develop a new approach to estimate a matrix of pairwise evolutionary distances from a codon-based alignment based on a codon evolutionary model. The method first computes a standard distance matrix for each of the three codon positions. Then these three distance matrices are weighted according to an estimate of the global evolutionary rate of each codon position and averaged into a unique distance matrix. Using a large set of both real and simulated codon-based alignments of nucleotide sequences, we show that this approach leads to distance matrices that have a significantly better treelikeness compared to those obtained by standard nucleotide evolutionary distances. We also propose an alternative weighting to eliminate the part of the noise often associated with some codon positions, particularly the third position, which is known to induce a fast evolutionary rate. Simulation results show that fast distance-based tree reconstruction algorithms on distance matrices based on this codon position weighting can lead to phylogenetic trees that are at least as accurate as, if not better, than those inferred by maximum likelihood. Finally, a well-known multigene dataset composed of eight yeast species and 106 codon-based alignments is reanalyzed and shows that our codon evolutionary distances allow building a phylogenetic tree which is similar to those obtained by non-distance-based methods (e.g., maximum parsimony and maximum likelihood) and also significantly improved compared to standard nucleotide evolutionary distance estimates.
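
    The weighting scheme described above can be sketched directly: build one distance matrix per codon position, estimate a relative rate for each position, and average. Plain p-distances and a mean-divergence rate proxy replace the model-based quantities of the paper, and the three-sequence alignment is invented.

      # Weighted average of per-codon-position distance matrices (toy data).
      import numpy as np

      seqs = ["ATGGCCAAA",            # three aligned coding sequences
              "ATGGCTAAG",
              "ATAGCCAAA"]

      def p_distance(a, b, pos):
          """Proportion of mismatches at codon position pos (0, 1 or 2)."""
          cols = range(pos, len(a), 3)
          return sum(a[k] != b[k] for k in cols) / len(cols)

      n = len(seqs)
      D = np.zeros((3, n, n))
      for p in range(3):
          for i in range(n):
              for j in range(n):
                  D[p, i, j] = p_distance(seqs[i], seqs[j], p)

      # Crude rate proxy: mean pairwise divergence of each position,
      # normalized to weights (the paper derives rates from a codon model).
      rates = D.reshape(3, -1).mean(axis=1)
      w = rates / rates.sum()
      D_avg = np.tensordot(w, D, axes=1)       # weighted average distance matrix
      print(np.round(D_avg, 3))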

  11. Towards Inferring Protein Interactions: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Ji Xiang

    2006-01-01

    Discovering interacting proteins has been an essential part of functional genomics. However, existing experimental techniques only uncover a small portion of any interactome. Furthermore, these data often have a very high false-positive rate. By conceptualizing the interactions at the domain level, we provide a more abstract representation of the interactome, which also facilitates the discovery of unobserved protein-protein interactions. Although several domain-based approaches have been proposed to predict protein-protein interactions, they usually assume that domain interactions are independent of each other for the convenience of computational modeling. A new framework to predict protein interactions is proposed in this paper, where no assumption is made about domain interactions. Protein interactions may be the result of multiple domain interactions which are dependent on each other. A conjunctive normal form representation is used to capture the relationships between protein interactions and domain interactions. The problem of interaction inference is then modeled as a constraint satisfiability problem and solved via linear programming. Experimental results on a combined yeast data set have demonstrated the robustness and the accuracy of the proposed algorithm. Moreover, we also map some predicted interacting domains to three-dimensional structures of protein complexes to show the validity of our predictions.
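
    A parsimony-flavoured toy version of the inference step can be written with scipy.optimize.linprog: each observed protein interaction must be covered by at least one of its candidate domain pairs (the disjunctions of the CNF representation), and the total number of active domain interactions is minimized. The three-domain example is invented and far simpler than the paper's formulation.

      # LP relaxation: cover every protein interaction by >= 1 domain pair.
      import numpy as np
      from scipy.optimize import linprog

      domain_pairs = [("d1", "d2"), ("d1", "d3"), ("d2", "d3")]
      # Candidate domain pairs behind each observed protein-protein interaction.
      ppi_candidates = [[0, 1],   # A-B could be explained by d1-d2 or d1-d3
                        [1],      # A-C only by d1-d3
                        [1, 2]]   # B-C by d1-d3 or d2-d3

      m = len(domain_pairs)
      A_ub, b_ub = [], []
      for cand in ppi_candidates:            # sum >= 1 rewritten as -sum <= -1
          row = np.zeros(m)
          row[cand] = -1.0
          A_ub.append(row)
          b_ub.append(-1.0)

      res = linprog(np.ones(m), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                    bounds=[(0.0, 1.0)] * m)
      for dp, v in zip(domain_pairs, res.x):
          print(dp, round(float(v), 2))      # d1-d3 alone covers everything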

  12. Bayesian inference of radiation belt loss timescales.

    Science.gov (United States)

    Camporeale, E.; Chandorkar, M.

    2017-12-01

    Electron fluxes in the Earth's radiation belts are routinely studied using the classical quasi-linear radial diffusion model. Although this simplified linear equation has proven to be an indispensable tool in understanding the dynamics of the radiation belt, it requires specification of quantities such as the diffusion coefficient and electron loss timescales that are never directly measured. Researchers have so far assumed a priori parameterisations for radiation belt quantities and derived the best fit using satellite data. The state of the art in this domain lacks a coherent formulation of this problem in a probabilistic framework. We present some recent progress that we have made in performing Bayesian inference of radial diffusion parameters. We achieve this by making extensive use of the theory connecting Gaussian processes and linear partial differential equations, and performing Markov chain Monte Carlo sampling of radial diffusion parameters. These results are important for understanding the role and the propagation of uncertainties in radiation belt simulations and, eventually, for providing a probabilistic forecast of energetic electron fluxes in a Space Weather context.
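
    The Bayesian machinery in this record can be illustrated on a deliberately stripped-down problem: infer a single loss timescale tau from noisy exponential-decay data with a random-walk Metropolis-Hastings sampler. The model, prior and data below are invented; the real analysis couples the loss term to radial diffusion and Gaussian-process theory.

      # Toy Metropolis-Hastings inference of a loss timescale tau.
      import numpy as np

      rng = np.random.default_rng(3)
      t = np.linspace(0.0, 10.0, 40)                 # days
      tau_true, noise = 3.0, 0.2
      obs = -t / tau_true + noise * rng.standard_normal(t.size)  # log-flux data

      def log_post(tau):
          if tau <= 0.0:
              return -np.inf                         # prior support: tau > 0
          resid = obs + t / tau                      # data minus model -t/tau
          return -0.5 * np.sum((resid / noise) ** 2) - np.log(tau)

      tau, lp, samples = 1.0, log_post(1.0), []
      for _ in range(20_000):
          prop = tau + 0.3 * rng.standard_normal()   # random-walk proposal
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept step
              tau, lp = prop, lp_prop
          samples.append(tau)

      post = np.array(samples[5_000:])               # discard burn-in
      print(f"tau: posterior mean {post.mean():.2f} +/- {post.std():.2f}")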

  13. Inferring Molecular Processes Heterogeneity from Transcriptional Data.

    Science.gov (United States)

    Gogolewski, Krzysztof; Wronowska, Weronika; Lech, Agnieszka; Lesyng, Bogdan; Gambin, Anna

    2017-01-01

    RNA microarrays and RNA-seq are nowadays standard technologies to study the transcriptional activity of cells. Most studies focus on tracking transcriptional changes caused by specific experimental conditions. Information on gene up- and downregulation is evaluated by analyzing the behaviour of a relatively large population of cells, averaging its properties. However, even assuming perfect sample homogeneity, different subpopulations of cells can exhibit diverse transcriptomic profiles, as they may follow different regulatory/signaling pathways. The purpose of this study is to provide a novel methodological scheme to account for possible internal, functional heterogeneity in homogeneous cell lines, including cancer cell lines. We propose a novel computational method to infer the proportions of subpopulations of cells that manifest various functional behaviour in a given sample. Our method was validated using two datasets from RNA microarray experiments. Both experiments aimed to examine cell viability in specific experimental conditions. The presented methodology can be easily extended to RNA-seq data as well as other molecular processes. Moreover, it complements standard tools to indicate the most important networks from transcriptomic data and in particular could be useful in the analysis of cancer cell lines affected by biologically active compounds or drugs.

  14. Scalable inference for stochastic block models

    KAUST Repository

    Peng, Chengbin

    2017-12-08

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of "big data," traditional inference algorithms for such a model are increasingly limited due to their high time complexity and poor scalability. In this paper, we propose a multi-stage maximum likelihood approach to recover the latent parameters of the stochastic block model, in time linear with respect to the number of edges. We also propose a parallel algorithm based on message passing. Our algorithm can overlap communication and computation, providing speedup without compromising accuracy as the number of processors grows. For example, to process a real-world graph with about 1.3 million nodes and 10 million edges, our algorithm requires about 6 seconds on 64 cores of a contemporary commodity Linux cluster. Experiments demonstrate that the algorithm can produce high quality results on both benchmark and real-world graphs. An example of finding more meaningful communities is illustrated consequently in comparison with a popular modularity maximization algorithm.

  15. MISTIC: Mutual information server to infer coevolution.

    Science.gov (United States)

    Simonetti, Franco L; Teppa, Elin; Chernomoretz, Ariel; Nielsen, Morten; Marino Buslje, Cristina

    2013-07-01

    MISTIC (mutual information server to infer coevolution) is a web server for graphical representation of the information contained within a MSA (multiple sequence alignment) and a complete analysis tool for Mutual Information networks in protein families. The server outputs a graphical visualization of several information-related quantities using a circos representation. This provides an integrated view of the MSA in terms of (i) the mutual information (MI) between residue pairs, (ii) sequence conservation and (iii) the residue cumulative and proximity MI scores. Further, an interactive interface to explore and characterize the MI network is provided. Several tools are offered for selecting subsets of nodes from the network for visualization. Node coloring can be set to match different attributes, such as conservation, cumulative MI, proximity MI and secondary structure. Finally, a zip file containing all results can be downloaded. The server is available at http://mistic.leloir.org.ar. In summary, MISTIC allows for a comprehensive, compact, visually rich view of the information contained within an MSA in a manner unique to any other publicly available web server. In particular, the use of circos representation of MI networks and the visualization of the cumulative MI and proximity MI concepts is novel.

  16. Attention as a Bayesian inference process

    Science.gov (United States)

    Chikkerur, Sharat; Serre, Thomas; Tan, Cheston; Poggio, Tomaso

    2011-03-01

    David Marr famously defined vision as "knowing what is where by seeing". In the framework described here, attention is the inference process that solves the visual recognition problem of what is where. The theory proposes a computational role for attention and leads to a model that performs well in recognition tasks and that predicts some of the main properties of attention at the level of psychophysics and physiology. We propose an algorithmic implementation, a Bayesian network, that can be mapped into the basic functional anatomy of attention involving the ventral stream and the dorsal stream. This description integrates bottom-up, feature-based as well as spatial (context-based) attentional mechanisms. We show that the Bayesian model predicts human eye fixations (considered as a proxy for shifts of attention) in natural scenes well, and can improve accuracy in object recognition tasks involving cluttered real-world images. In both cases, we found that the proposed model can predict human performance better than existing bottom-up and top-down computational models.

  17. Logical inference techniques for loop parallelization

    DEFF Research Database (Denmark)

    Oancea, Cosmin Eugen; Rauchwerger, Lawrence

    2012-01-01

    This paper presents a fully automatic approach to loop parallelization that integrates the use of static and run-time analysis and thus overcomes many known difficulties such as nonlinear and indirect array indexing and complex control flow. Our hybrid analysis framework validates the parallelization transformation by verifying the independence of the loop's memory references. To this end it represents array references using the USR (uniform set representation) language and expresses the independence condition as an equation, S = {}, where S is a set expression representing array indexes. Using… (F(S) => S = {}). Loop parallelization is then validated using a novel logic inference algorithm that factorizes the obtained complex predicates F(S) into a sequence of sufficient-independence conditions that are evaluated first statically and, when needed, dynamically, in increasing order…

  18. Inference by replication in densely connected systems.

    Science.gov (United States)

    Neirotti, Juan P; Saad, David

    2007-10-01

    An efficient Bayesian inference method for problems that can be mapped onto dense graphs is presented. The approach is based on message passing where messages are averaged over a large number of replicated variable systems exposed to the same evidential nodes. An assumption about the symmetry of the solutions is required for carrying out the averages; here we extend the previous derivation based on a replica-symmetric (RS)-like structure to include a more complex one-step replica-symmetry-breaking-like (1RSB-like) ansatz. To demonstrate the potential of the approach it is employed for studying critical properties of the Ising linear perceptron and for multiuser detection in code division multiple access (CDMA) under different noise models. Results obtained under the RS assumption in the noncritical regime give rise to a highly efficient signal detection algorithm in the context of CDMA; while in the critical regime one observes a first-order transition line that ends in a continuous phase transition point. Finite size effects are also observed. While the 1RSB ansatz is not required for the original problems, it was applied to the CDMA signal detection problem with a more complex noise model that exhibits RSB behavior, resulting in an improvement in performance.

  19. Statistical causal inferences and their applications in public health research

    CERN Document Server

    Wu, Pan; Chen, Ding-Geng

    2016-01-01

    This book compiles and presents new developments in statistical causal inference. The accompanying data and computer programs are publicly available so readers may replicate the model development and data analysis presented in each chapter. In this way, methodology is taught so that readers may implement it directly. The book brings together experts engaged in causal inference research to present and discuss recent issues in causal inference methodological development. This is also a timely look at causal inference applied to scenarios that range from clinical trials to mediation and public health research more broadly. In an academic setting, this book will serve as a reference and guide to a course in causal inference at the graduate level (Master's or Doctorate). It is particularly relevant for students pursuing degrees in Statistics, Biostatistics and Computational Biology. Researchers and data analysts in public health and biomedical research will also find this book to be an important reference.

  20. An Integrated Procedure for Bayesian Reliability Inference Using MCMC

    Directory of Open Access Journals (Sweden)

    Jing Lin

    2014-01-01

    The recent proliferation of Markov chain Monte Carlo (MCMC) approaches has led to the use of Bayesian inference in a wide variety of fields. To facilitate MCMC applications, this paper proposes an integrated procedure for Bayesian inference using MCMC methods, from a reliability perspective. The goal is to build a framework for related academic research and engineering applications to implement modern computational-based Bayesian approaches, especially for reliability inferences. The procedure developed here is a continuous improvement process with four stages (Plan, Do, Study, and Action) and 11 steps, including: (1) data preparation; (2) prior inspection and integration; (3) prior selection; (4) model selection; (5) posterior sampling; (6) MCMC convergence diagnostic; (7) Monte Carlo error diagnostic; (8) model improvement; (9) model comparison; (10) inference making; (11) data updating and inference improvement. The paper illustrates the proposed procedure using a case study.
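
    Step (6), the MCMC convergence diagnostic, is commonly carried out with the Gelman-Rubin potential scale reduction factor, which the sketch below implements; the simulated chains are stand-ins for real posterior draws.

      # Gelman-Rubin R-hat over parallel chains (values near 1.0 ~ converged).
      import numpy as np

      def gelman_rubin(chains):
          """chains: (m, n) array, m chains of n draws each. Returns R-hat."""
          m, n = chains.shape
          B = n * chains.mean(axis=1).var(ddof=1)    # between-chain variance
          W = chains.var(axis=1, ddof=1).mean()      # within-chain variance
          var_hat = (n - 1) / n * W + B / n          # pooled variance estimate
          return np.sqrt(var_hat / W)

      rng = np.random.default_rng(4)
      # Four well-mixed chains with small random offsets between them.
      chains = rng.standard_normal((4, 2_000)) + 0.05 * rng.standard_normal((4, 1))
      print(f"R-hat = {gelman_rubin(chains):.3f}")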

  1. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

    The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
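
    A minimal version of the model described above needs only leaky transition counts, where the leak is the single free parameter, and a surprise readout, -log p, for each new observation. Everything below (the binary stimulus stream, the leak value, the Laplace prior) is an illustrative assumption, not the paper's exact estimator.

      # Leaky estimation of a 2x2 transition-probability matrix plus surprise.
      import numpy as np

      rng = np.random.default_rng(5)
      seq = rng.integers(0, 2, size=500)           # fully unpredictable stimuli
      omega = 0.95                                 # forgetting (leak) parameter

      counts = np.ones((2, 2))                     # Laplace prior on transitions
      surprise = []
      for prev, cur in zip(seq[:-1], seq[1:]):
          p = counts[prev, cur] / counts[prev].sum()   # predictive probability
          surprise.append(-np.log(p))
          counts *= omega                          # leak: discount old evidence
          counts[prev, cur] += 1.0                 # assimilate new observation

      surprise = np.array(surprise)
      print(f"mean surprise {surprise.mean():.3f}, max {surprise.max():.3f}")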

  2. In-depth analysis of protein inference algorithms using multiple search engines and well-defined metrics.

    Science.gov (United States)

    Audain, Enrique; Uszkoreit, Julian; Sachsenberg, Timo; Pfeuffer, Julianus; Liang, Xiao; Hermjakob, Henning; Sanchez, Aniel; Eisenacher, Martin; Reinert, Knut; Tabb, David L; Kohlbacher, Oliver; Perez-Riverol, Yasset

    2017-01-06

    In mass spectrometry-based shotgun proteomics, protein identifications are usually the desired result. However, most of the analytical methods are based on the identification of reliable peptides and not the direct identification of intact proteins. Thus, assembling peptides identified from tandem mass spectra into a list of proteins, referred to as protein inference, is a critical step in proteomics research. Currently, different protein inference algorithms and tools are available for the proteomics community. Here, we evaluated five software tools for protein inference (PIA, ProteinProphet, Fido, ProteinLP, MSBayesPro) using three popular database search engines: Mascot, X!Tandem, and MS-GF+. All the algorithms were evaluated using a highly customizable KNIME workflow using four different public datasets with varying complexities (different sample preparation, species and analytical instruments). We defined a set of quality control metrics to evaluate the performance of each combination of search engines, protein inference algorithm, and parameters on each dataset. We show that the results for complex samples vary not only regarding the actual numbers of reported protein groups but also concerning the actual composition of groups. Furthermore, the robustness of reported proteins when using databases of differing complexities is strongly dependent on the applied inference algorithm. Finally, merging the identifications of multiple search engines does not necessarily increase the number of reported proteins, but does increase the number of peptides per protein and thus can generally be recommended. Protein inference is one of the major challenges in MS-based proteomics nowadays. Currently, there is a vast number of protein inference algorithms and implementations available for the proteomics community. Protein assembly impacts the final results of the research, the quantitation values and the final claims in the research manuscript. Even though protein…

  3. Science Smiles

    Indian Academy of Sciences (India)

    Articles in Resonance – Journal of Science Education: Volume 1, Issue 4, April 1996, p. 4: Science Smiles (Chief Editor's column, R K Laxman); Volume 1, Issue 5, May 1996, p. 3: Science Smiles.

  4. Modelling and Inference Strategies for Biological Systems

    OpenAIRE

    Palmisano, Alida

    2010-01-01

    For many years, computers have played an important role in helping scientists to store, manipulate, and analyze data coming from many different disciplines. In recent years, however, new technological capabilities and new ways of thinking about the usefulness of computer science are extending the reach of computers from simple analysis of collected data to hypothesis generation. The aim of this work is to provide a contribution in the field of Computational Systems Biology. The main purpose of…

  5. Science at a crossroads.

    Science.gov (United States)

    Bonvillian, William B

    2002-07-01

    Science is entering an alliance with the economy that will speed the effect of innovation through society. Despite the slowdown of the 'new economy', a cascade paradigm of innovation appears key to increasing the rate of economic growth. Yet for science to continue to thrive and make this contribution to innovation, it must traverse at least three key crossroads. First, while life sciences have built a strong advocacy model to secure growing federal research funding, the physical sciences (including mathematics and engineering) have not and must now do so to thrive. Second, the drop in the numbers of physical scientists and engineers must be reversed if we are to have the talent to maintain a strong trend of scientific advance. Third, although science advances are increasingly interdisciplinary and occurring in the space between the historic science stovepipes, the organization of federal science support is largely unchanged since the beginning of the cold war. While a decentralized model has value, we must also consider new approaches that encourage deeper cooperation across science sectors and agencies.

  6. Strong ideal convergence in probabilistic metric spaces

    Indian Academy of Sciences (India)

    …sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and … also important applications in nonlinear analysis [2]. The theory was brought to … for each t > 0, since each set on the right-hand side of the relation (3.1) belongs to I. Thus, by Definition 2.11 and the …

  7. Large N baryons, strong coupling theory, quarks

    International Nuclear Information System (INIS)

    Sakita, B.

    1984-01-01

    It is shown that in QCD the large N limit is the same as the static strong coupling limit. By using the static strong coupling techniques some of the results of large N baryons are derived. The results are consistent with the large N SU(6) static quark model. (author)

  8. Optimization of strong and weak coordinates

    NARCIS (Netherlands)

    Swart, M.; Bickelhaupt, F.M.

    2006-01-01

    We present a new scheme for the geometry optimization of equilibrium and transition state structures that can be used for both strong and weak coordinates. We use a screening function that depends on atom-pair distances to differentiate strong coordinates from weak coordinates. This differentiation…

  9. Strong decays of nucleon and delta resonances

    International Nuclear Information System (INIS)

    Bijker, R.; Leviatan, A.

    1996-01-01

    We study the strong couplings of the nucleon and delta resonances in a collective model. In the ensuing algebraic treatment we derive closed expressions for decay widths which are used to analyze the experimental data for strong decays into the pion and eta channels. (Author)

  10. Theoretical studies of strongly correlated fermions

    Energy Technology Data Exchange (ETDEWEB)

    Logan, D. [Institut Max von Laue - Paul Langevin (ILL), 38 - Grenoble (France)

    1997-04-01

    Strongly correlated fermions are investigated. An understanding of strongly correlated fermions underpins a diverse range of phenomena such as metal-insulator transitions, high-temperature superconductivity, magnetic impurity problems and the properties of heavy-fermion systems, in all of which local moments play an important role. (author).

  11. Science or Science Fiction?

    DEFF Research Database (Denmark)

    Lefsrud, Lianne M.; Meyer, Renate

    2012-01-01

    This paper examines the framings and identity work associated with professionals' discursive construction of climate change science, their legitimation of themselves as experts on 'the truth', and their attitudes towards regulatory measures. Drawing from survey responses of 1077 professional…, legitimation strategies, and use of emotionality and metaphor. By linking notions of the science or science fiction of climate change to the assessment of the adequacy of global and local policies and of potential organizational responses, we contribute to the understanding of 'defensive institutional work'…

  12. Water, law, science

    Science.gov (United States)

    Narasimhan, T. N.

    2008-01-01

    Summary: In a world with water resources severely impacted by technology, science must actively contribute to water law. To this end, this paper is an earth scientist's attempt to comprehend essential elements of water law, and to examine their connections to science. Science and law share a common logical framework of starting with a priori prescribed tenets, and drawing consistent inferences. In science, observationally established physical laws constitute the tenets, while in law, they stem from social values. The foundations of modern water law in Europe and the New World were formulated nearly two thousand years ago by Roman jurists who were inspired by Greek philosophy of reason. Recognizing that vital natural elements such as water, air, and the sea were governed by immutable natural laws, they reasoned that these elements belonged to all humans, and therefore cannot be owned as private property. Legally, such public property was to be governed by jus gentium, the law of all people or the law of all nations. In contrast, jus civile or civil law governed private property. Remarkably, jus gentium continues to be relevant in our contemporary society in which science plays a pivotal role in exploiting vital resources common to all. This paper examines the historical roots of modern water law, follows their evolution through the centuries, and examines how the spirit of science inherent in jus gentium is profoundly influencing evolving water and environmental laws in Europe, the United States and elsewhere. In a technological world, scientific knowledge has to lie at the core of water law. Yet, science cannot formulate law. It is hoped that a philosophical understanding of the relationships between science and law will contribute to their constructively coming together in the service of society.

  13. Seismic switch for strong motion measurement

    Science.gov (United States)

    Harben, P.E.; Rodgers, P.W.; Ewert, D.W.

    1995-05-30

    A seismic switching device is described that has an input signal from an existing microseismic station seismometer and a signal from a strong motion measuring instrument. The seismic switch monitors the signal level of the strong motion instrument and passes the seismometer signal to the station data telemetry and recording systems. When the strong motion instrument signal level exceeds a user set threshold level, the seismometer signal is switched out and the strong motion signal is passed to the telemetry system. The amount of time the strong motion signal is passed before switching back to the seismometer signal is user controlled between 1 and 15 seconds. If the threshold level is exceeded during a switch time period, the length of time is extended from that instant by one user set time period. 11 figs.
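
    The switching rule in this record is a small state machine: pass the seismometer channel by default, and while a hold window (restarted whenever the strong-motion channel exceeds the threshold) is open, pass the strong-motion channel instead. A sketch with invented sample values:

      # Threshold-triggered channel switch with an extendable hold window.
      def seismic_switch(seis, strong, threshold, hold_samples):
          """Merge two equally sampled channels into one telemetered trace."""
          out, hold = [], 0
          for s_weak, s_strong in zip(seis, strong):
              if abs(s_strong) > threshold:
                  hold = hold_samples        # (re)start the hold window
              out.append(s_strong if hold > 0 else s_weak)
              hold = max(0, hold - 1)
          return out

      seis = [0.01, 0.02, 0.01, 0.00, 0.01, 0.02, 0.01, 0.00]
      strong = [0.0, 0.0, 5.0, 0.1, 0.2, 0.0, 0.0, 0.0]
      # The strong-motion channel is passed for 3 samples after the trigger.
      print(seismic_switch(seis, strong, threshold=1.0, hold_samples=3))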

  14. Inferring tie strength from online directed behavior.

    Directory of Open Access Journals (Sweden)

    Jason J Jones

    Some social connections are stronger than others. People have not only friends, but also best friends. Social scientists have long recognized this characteristic of social connections and researchers frequently use the term tie strength to refer to this concept. We used online interaction data (specifically, Facebook interactions) to successfully identify real-world strong ties. Ground truth was established by asking users themselves to name their closest friends in real life. We found the frequency of online interaction was diagnostic of strong ties, and interaction frequency was much more useful diagnostically than were attributes of the user or the user's friends. More private communications (messages) were not necessarily more informative than public communications (comments, wall posts, and other interactions).

  15. Inferring mental states from neuroimaging data: From reverse inference to large-scale decoding

    OpenAIRE

    Poldrack, Russell A.

    2011-01-01

    A common goal of neuroimaging research is to use imaging data to identify the mental processes that are engaged when a subject performs a mental task. The use of reasoning from activation to mental functions, known as “reverse inference”, has been previously criticized on the basis that it does not take into account how selectively the area is activated by the mental process in question. In this Perspective, I outline the critique of informal reverse inference, and describe a number of new de...
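
    The Bayesian form of the argument can be made concrete: the evidential value of an activation depends on how selectively the region responds, i.e. on the probability of activation when the process is not engaged. A small sketch with made-up numbers:

        def reverse_inference(p_act_given_proc, p_act_given_not, prior=0.5):
            """P(process | activation) by Bayes' rule; inputs are illustrative."""
            num = p_act_given_proc * prior
            return num / (num + p_act_given_not * (1.0 - prior))

        # A region active in 80% of studies engaging the process but also in 60%
        # of other studies gives weak evidence; a selective region gives strong:
        print(reverse_inference(0.8, 0.6))   # ~0.57
        print(reverse_inference(0.8, 0.1))   # ~0.89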

  16. On the Hardness of Topology Inference

    Science.gov (United States)

    Acharya, H. B.; Gouda, M. G.

    Many systems require information about the topology of networks on the Internet, for purposes like management, efficiency, testing of new protocols and so on. However, ISPs usually do not share the actual topology maps with outsiders; thus, in order to obtain the topology of a network on the Internet, a system must reconstruct it from publicly observable data. The standard method employs traceroute to obtain paths between nodes; next, a topology is generated such that the observed paths occur in the graph. However, traceroute has the problem that some routers refuse to reveal their addresses, and appear as anonymous nodes in traces. Previous research on the problem of topology inference with anonymous nodes has demonstrated that it is at best NP-complete. In this paper, we improve upon this result. In our previous research, we showed that in the special case where nodes may be anonymous in some traces but not in all traces (so all node identifiers are known), there exist trace sets that are generable from multiple topologies. This paper extends our theory of network tracing to the general case (with strictly anonymous nodes), and shows that the problem of computing the network that generated a trace set, given the trace set, has no general solution. The weak version of the problem, which allows an algorithm to output a "small" set of networks (any one of which is the correct one), is also not solvable. Any algorithm guaranteed to output the correct topology outputs at least an exponential number of networks. Our results are surprisingly robust: they hold even when the network is known to have exactly two anonymous nodes, and every node as well as every edge in the network is guaranteed to occur in some trace. On the basis of this result, we suggest that exact reconstruction of network topology requires more powerful tools than traceroute.

  17. Inferring pathway activity toward precise disease classification.

    Directory of Open Access Journals (Sweden)

    Eunjung Lee

    2008-11-01

    Full Text Available The advent of microarray technology has made it possible to classify disease states based on gene expression profiles of patients. Typically, marker genes are selected by measuring the power of their expression profiles to discriminate among patients of different disease states. However, expression-based classification can be challenging in complex diseases due to factors such as cellular heterogeneity within a tissue sample and genetic heterogeneity across patients. A promising technique for coping with these challenges is to incorporate pathway information into the disease classification procedure in order to classify disease based on the activity of entire signaling pathways or protein complexes rather than on the expression levels of individual genes or proteins. We propose a new classification method based on pathway activities inferred for each patient. For each pathway, an activity level is summarized from the gene expression levels of its condition-responsive genes (CORGs), defined as the subset of genes in the pathway whose combined expression delivers optimal discriminative power for the disease phenotype. We show that classifiers using pathway activity achieve better performance than classifiers based on individual gene expression, for both simple and complex case-control studies including differentiation of perturbed from non-perturbed cells and subtyping of several different kinds of cancer. Moreover, the new method outperforms several previous approaches that use a static (i.e., non-conditional) definition of pathways. Within a pathway, the identified CORGs may facilitate the development of better diagnostic markers and the discovery of core alterations in human disease.
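
    A hedged sketch of the CORG idea: score a pathway per sample by combining member-gene z-scores, and grow the member set greedily while the class-discriminative power of the combined score keeps improving. The function name and the stopping rule are illustrative simplifications of the published search:

        import numpy as np
        from scipy.stats import ttest_ind, zscore

        def corg_activity(expr, labels, pathway_genes):
            """expr: (genes x samples) array; labels: 0/1 class per sample;
            pathway_genes: row indices of the pathway's member genes."""
            Z = zscore(expr, axis=1)                   # normalize each gene
            best, members = -np.inf, []
            candidates = list(pathway_genes)
            while candidates:
                scored = []
                for g in candidates:                   # try adding each gene
                    act = Z[members + [g]].sum(axis=0) / np.sqrt(len(members) + 1)
                    t, _ = ttest_ind(act[labels == 1], act[labels == 0])
                    scored.append((abs(t), g))
                t_new, g_new = max(scored)
                if t_new <= best:                      # no improvement: stop
                    break
                best, members = t_new, members + [g_new]
                candidates.remove(g_new)
            activity = Z[members].sum(axis=0) / np.sqrt(len(members))
            return members, activity                   # CORGs and per-sample score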

  18. Communicating Science

    Science.gov (United States)

    Holland, G. J.; McCaffrey, M. S.; Kiehl, J. T.; Schmidt, C.

    2010-12-01

    We are in an era of rapidly changing communication media, which is driving a major evolution in the modes of communicating science. In the past, a mainstay of scientific communication in popular media was through science “translators”: science journalists and presenters. These have now nearly disappeared and are being replaced by widespread dissemination through, e.g., the internet, blogs, YouTube and journalists who often have little scientific background and sharp deadlines. Thus, scientists are required to assume increasing responsibility for translating their scientific findings and calibrating their communications to non-technical audiences, a task for which they are often ill-prepared, especially when it comes to controversial societal issues such as tobacco, evolution, and most recently climate change (Oreskes and Conway 2010). Such issues have been politicized and hijacked by ideological belief systems to such an extent that constructive dialogue is often impossible. Many scientists are excellent communicators, to their peers. But this requires careful attention to detail and logical explanation, open acknowledgement of uncertainties, and dispassionate delivery. These qualities become liabilities when communicating to a non-scientific audience where entertainment, attention grabbing, 15-second sound bites, and self-assuredness reign (e.g. Olson 2009). Here we report on a program initiated by NCAR and UCAR to develop new approaches to science communication and to equip present and future scientists with the requisite skills. If we start from a sound scientific finding with general scientific consensus, such as the warming of the planet by greenhouse gases, then the primary emphasis moves from the “science” to the “art” of communication. The art cannot have free rein, however, as there remains a strong requirement for objectivity, honesty, consistency, and above all a resistance to advocating particular policy positions. Targeting audience

  19. Inferring structural connectivity using Ising couplings in models of neuronal networks.

    Science.gov (United States)

    Kadirvelu, Balasundaram; Hayashi, Yoshikatsu; Nasuto, Slawomir J

    2017-08-15

    Functional connectivity metrics have been widely used to infer the underlying structural connectivity in neuronal networks. Maximum entropy based Ising models have been suggested to discount the effect of indirect interactions and give good results in inferring the true anatomical connections. However, no benchmarking is currently available to assess the performance of Ising couplings against other functional connectivity metrics in the microscopic scale of neuronal networks through a wide set of network conditions and network structures. In this paper, we study the performance of the Ising model couplings to infer the synaptic connectivity in in silico networks of neurons and compare its performance against partial and cross-correlations for different correlation levels, firing rates, network sizes, network densities, and topologies. Our results show that the relative performance amongst the three functional connectivity metrics depends primarily on the network correlation levels. Ising couplings detected the most structural links at very weak network correlation levels, and partial correlations outperformed Ising couplings and cross-correlations at strong correlation levels. The result was consistent across varying firing rates, network sizes, and topologies. The findings of this paper serve as a guide in choosing the right functional connectivity tool to reconstruct the structural connectivity.
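
    For intuition, a common fast approximation to the Ising couplings is the mean-field (naive) inversion J = -inv(C) of the activity covariance matrix; the paper fits full maximum-entropy Ising models, which this sketch only approximates:

        import numpy as np

        def mean_field_couplings(spikes):
            """spikes: (time x neurons) binary array -> coupling estimate J."""
            C = np.cov(spikes, rowvar=False)
            J = -np.linalg.inv(C + 1e-6 * np.eye(C.shape[0]))  # regularized inverse
            np.fill_diagonal(J, 0.0)       # diagonal (self) terms are not couplings
            return J

        def cross_correlations(spikes):
            """The baseline functional-connectivity metric compared in the paper."""
            return np.corrcoef(spikes, rowvar=False)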

  20. Indexing the Environmental Quality Performance Based on A Fuzzy Inference Approach

    Science.gov (United States)

    Iswari, Lizda

    2018-03-01

    Environmental performance strongly affects the quality of human life. In Indonesia, this performance is quantified through the Environmental Quality Index (EQI), which consists of three indicators: a river quality index, an air quality index, and land cover coverage. Currently, the instrument's data processing is done by averaging and weighting each index to represent the EQI at the provincial level. However, we found that EQI interpretations may contain uncertainties and cover a range of circumstances that are less appropriately handled by a common statistical approach. In this research, we aim to manage the EQI indicators with a more intuitive computation technique and make inferences about the environmental performance of the 33 provinces of Indonesia. The research was conducted in the three stages of a Mamdani Fuzzy Inference System (MAFIS): fuzzification, data inference, and defuzzification. The data input consists of 10 environmental parameters, and the output is an index of Environmental Quality Performance (EQP). The approach was applied to the 2015 environmental condition data set, quantifying the results on a scale of 0 to 100: 10 provinces, dominated by the eastern part of Indonesia, performed well with an EQP above 80; 22 provinces had an EQP between 50 and 80; and one province on Java Island had an EQP below 20. This research shows that environmental quality performance can be quantified without eliminating the nature of the data set, and is simultaneously able to show environmental behavior along with its spatial distribution pattern.
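
    A minimal Mamdani-style sketch of the three stages (fuzzification, inference, defuzzification) for a toy two-input version; the membership functions, rules, and inputs are invented for illustration (the paper uses 10 environmental parameters):

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def eqp(river, air):
            u = np.linspace(0.0, 100.0, 1001)          # output universe (EQP score)
            # 1) Fuzzification of two illustrative 0-100 input scores.
            river_good, river_poor = tri(river, 50, 100, 150), tri(river, -50, 0, 50)
            air_good, air_poor = tri(air, 50, 100, 150), tri(air, -50, 0, 50)
            # 2) Inference: min for AND, output sets clipped, aggregated by max.
            w_good = min(river_good, air_good)         # IF river good AND air good
            w_poor = min(river_poor, air_poor)         # IF river poor AND air poor
            agg = np.maximum(np.minimum(w_good, tri(u, 60, 90, 120)),
                             np.minimum(w_poor, tri(u, -20, 10, 40)))
            # 3) Defuzzification by centroid of the aggregated output set.
            return (u * agg).sum() / (agg.sum() + 1e-12)

        print(round(eqp(river=85, air=70), 1))         # one "good-performance" case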

  1. Gene Systems Network Inferred from Expression Profiles in Hepatocellular Carcinogenesis by Graphical Gaussian Model

    Directory of Open Access Journals (Sweden)

    Saito Shigeru

    2007-01-01

    Full Text Available Hepatocellular carcinoma (HCC) in a liver with advanced-stage chronic hepatitis C (CHC) is induced by hepatitis C virus, which chronically infects about 170 million people worldwide. To elucidate the associations between gene groups in hepatocellular carcinogenesis, we analyzed the profiles of the genes characteristically expressed in the CHC and HCC cell stages by a statistical method for inferring the network between gene systems based on the graphical Gaussian model. A systematic evaluation of the inferred network in terms of the biological knowledge revealed that the inferred network was strongly involved in the known gene-gene interactions with high significance, and that the clusters characterized by different cancer-related responses were associated with those of the gene groups related to metabolic pathways and morphological events. Although some relationships in the network remain to be interpreted, the analyses revealed a snapshot of the orchestrated expression of cancer-related groups and some pathways related to metabolism and morphological events in hepatocellular carcinogenesis, and thus provide possible clues on the disease mechanism and insights that address the gap between molecular and clinical assessments.
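
    A hedged sketch of the inference step: in a graphical Gaussian model, edges correspond to nonzero partial correlations, obtainable from a (sparse) estimate of the precision matrix. Here via scikit-learn's graphical lasso on random placeholder data; the paper's exact estimator and preprocessing may differ:

        import numpy as np
        from sklearn.covariance import GraphicalLassoCV

        X = np.random.default_rng(1).normal(size=(120, 15))  # samples x gene systems
        model = GraphicalLassoCV().fit(X)
        P = model.precision_
        d = np.sqrt(np.diag(P))
        partial_corr = -P / np.outer(d, d)     # partial correlations from precision
        np.fill_diagonal(partial_corr, 1.0)
        edges = np.abs(partial_corr) > 0.05    # candidate gene-system network edges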

  2. Dynamic probabilistic threshold networks to infer signaling pathways from time-course perturbation data.

    Science.gov (United States)

    Kiani, Narsis A; Kaderali, Lars

    2014-07-22

    Network inference deals with the reconstruction of molecular networks from experimental data. Given N molecular species, the challenge is to find the underlying network. Due to data limitations, this typically is an ill-posed problem, and requires the integration of prior biological knowledge or strong regularization. We here focus on the situation when time-resolved measurements of a system's response after systematic perturbations are available. We present a novel method to infer signaling networks from time-course perturbation data. We utilize dynamic Bayesian networks with probabilistic Boolean threshold functions to describe protein activation. The model posterior distribution is analyzed using evolutionary MCMC sampling and subsequent clustering, resulting in probability distributions over alternative networks. We evaluate our method on simulated data, and study its performance with respect to data set size and levels of noise. We then use our method to study EGF-mediated signaling in the ERBB pathway. Dynamic Probabilistic Threshold Networks is a new method to infer signaling networks from time-series perturbation data. It exploits the dynamic response of a system after external perturbation for network reconstruction. On simulated data, we show that the approach outperforms current state of the art methods. On the ERBB data, our approach recovers a significant fraction of the known interactions, and predicts novel mechanisms in the ERBB pathway.
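
    The building block can be made concrete: a probabilistic Boolean threshold function activates node i with a probability that depends on a weighted sum of its parents crossing a threshold. A one-step simulation sketch (weights, thresholds, and the sigmoid softness are illustrative):

        import numpy as np

        def step(state, W, theta, beta=5.0, rng=np.random.default_rng(0)):
            """state: 0/1 vector; W: signed interaction matrix; theta: thresholds."""
            drive = W @ state - theta                    # summed parental input
            p_on = 1.0 / (1.0 + np.exp(-beta * drive))   # soft probabilistic threshold
            return (rng.random(state.size) < p_on).astype(int)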

  3. Active inference and robot control: a case study.

    Science.gov (United States)

    Pio-Lopez, Léo; Nizard, Ange; Friston, Karl; Pezzulo, Giovanni

    2016-09-01

    Active inference is a general framework for perception and action that is gaining prominence in computational and systems neuroscience but is less known outside these fields. Here, we discuss a proof-of-principle implementation of the active inference scheme for the control of the 7-DoF arm of a (simulated) PR2 robot. By manipulating visual and proprioceptive noise levels, we show under which conditions robot control under the active inference scheme is accurate. Besides accurate control, our analysis of the internal system dynamics (e.g. the dynamics of the hidden states that are inferred during inference) sheds light on key aspects of the framework such as the quintessentially multimodal nature of control and the differential roles of proprioception and vision. In the discussion, we consider the potential importance of being able to implement active inference in robots. In particular, we briefly review the opportunities for modelling psychophysiological phenomena such as sensory attenuation and related failures of gain control, of the sort seen in Parkinson's disease. We also consider the fundamental difference between active inference and optimal control formulations, showing that in the former the heavy lifting shifts from solving a dynamical inverse problem to creating deep forward or generative models with dynamics, whose attracting sets prescribe desired behaviours. © 2016 The Authors.
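
    A drastically reduced 1D sketch of the scheme's two loops: perception updates an internal estimate by gradient descent on free energy, while action changes the world so that sensations come to match the (goal-encoding) prior. All dynamics and constants are illustrative, not the paper's 7-DoF implementation:

        import numpy as np

        rng = np.random.default_rng(0)
        mu_prior, s_prior, s_obs = 1.0, 1.0, 0.1  # prior (desired state) and variances
        x_world, x_hat = 0.0, 0.0                 # true state and internal estimate
        for _ in range(500):
            y = x_world + rng.normal(0.0, 0.01)   # noisy (proprioceptive) observation
            # Perception: descend the free-energy gradient w.r.t. the estimate.
            x_hat -= 0.05 * ((x_hat - y) / s_obs + (x_hat - mu_prior) / s_prior)
            # Action: change the world to reduce sensory prediction error.
            x_world -= 0.05 * (y - x_hat) / s_obs
        print(round(x_world, 2))                  # drifts to ~1.0, fulfilling the prior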

  4. Principle and Uncertainty Quantification of an Experiment Designed to Infer Actinide Neutron Capture Cross-Sections

    International Nuclear Information System (INIS)

    Youinou, G.; Palmiotti, G.; Salvatorre, M.; Imel, G.; Pardo, R.; Kondev, F.; Paul, M.

    2010-01-01

    An integral reactor physics experiment devoted to inferring higher actinide (Am, Cm, Bk, Cf) neutron cross sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aiming at quantifying the uncertainties related to the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor, and, after a given time, determine the amount of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy integrated neutron cross-sections to be inferred, since the relation between the two is given by the well-known neutron-induced transmutation equations. This approach has been used in the past and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectroscopy (AMS) facility located at ANL. While AMS facilities traditionally have been limited to the assay of low-to-medium atomic mass materials, i.e., A 200. The detection limit of AMS being orders of magnitude lower than that of standard mass spectroscopy techniques, more transmutation products could be measured and, potentially, more cross-sections could be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
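
    The underlying relation can be written down for a simple capture chain. With one-group (energy-integrated) cross-sections \sigma_i, flux \phi, and decay constants \lambda_i, the nuclide densities obey transmutation equations of the form (a hedged, simplified sketch omitting fission and multiple reaction channels):

        \frac{dN_i}{dt} \;=\; \phi\,\sigma_{i-1}\,N_{i-1} \;-\; \phi\,\sigma_i\,N_i \;-\; \lambda_i\,N_i ,

    so precise measurements of the N_i before and after irradiation, together with the known flux history, allow the \sigma_i to be inferred.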

  5. Accounting for the Effect of Earth's Rotation in Magnetotelluric Inference

    Science.gov (United States)

    Riegert, D. L.; Thomson, D. J.

    2017-12-01

    The study of geomagnetism has been documented as far back as 1722, when the watchmaker G. Graham constructed a more sensitive compass and showed that geomagnetic direction varied with an irregular daily pattern. Increased interest in geomagnetism began at the end of the 19th century (Lamb, Schuster, Chapman, and Price). The Magnetotelluric Method was first introduced in the 1950's (Cagniard and Tikhonov), and, at its core, is simply a regression problem. The result of this method is a transfer function estimate which describes the earth's response to magnetic field variations. This estimate can then be used to infer the earth's subsurface structure, useful for applications such as natural resource exploration. The statistical problem of estimating a transfer function between geomagnetic and induced current measurements has evolved since the 1950's due to a variety of problems: non-stationarity, outliers, and violation of Gaussian assumptions. To address some of these issues, robust regression methods (Chave and Thomson, 2004) and the remote reference method (Gambel, 1979) have been proposed and used. The current method seems to provide reasonable estimates, but still requires a large amount of data. Using the multitaper method of spectral analysis (Thomson, 1982), taking long (greater than 4 months) blocks of geomagnetic data, and concentrating on frequencies below 1000 microhertz to avoid ultraviolet effects, one finds that: (1) the cross-spectra are dominated by many offset frequencies, including plus and minus 1 and 2 cycles per day; (2) the coherence at these offset frequencies is often stronger than at zero offset; (3) there are strong couplings from the "quasi two-day" cycle; (4) the offset frequencies are usually not symmetric; (5) the spectra are dominated by the normal modes of the Sun. This talk will discuss the method of incorporating these observations into the transfer function estimation model, some of the difficulties that arose, their
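
    At its core the regression can be sketched as a single-input transfer function estimate from cross- and auto-spectra (the H1 estimator); real magnetotelluric practice adds a second magnetic input, robust weighting, and remote references. The signals below are synthetic placeholders:

        import numpy as np
        from scipy.signal import csd, welch

        rng = np.random.default_rng(0)
        fs, n = 1.0, 2**14
        h = rng.normal(size=n)                      # magnetic-field variation (input)
        e = np.convolve(h, [0.5, 0.3], "same") + 0.1 * rng.normal(size=n)  # induced E
        f, S_he = csd(h, e, fs=fs, nperseg=1024)    # cross-spectrum input -> output
        _, S_hh = welch(h, fs=fs, nperseg=1024)     # input auto-spectrum
        Z = S_he / S_hh                             # transfer function estimate Z(f)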

  6. Some problems in inference from time series of geophysical processes

    Science.gov (United States)

    Koutsoyiannis, Demetris

    2010-05-01

    Due to the complexity of geophysical processes, their modelling and the conducting of typical tasks, such as estimation, prediction and hypothesis testing, heavily rely on available data series and their statistical processing. The classical statistical approaches, which are often used in geophysical modelling, are based upon several simplifying assumptions, which are invalidated in natural processes. Central among these is the (usually tacit) time independence assumption, which is regarded as simplifying modelling and statistical testing at no substantial cost for the validity of results. Moreover, the perception of the general behaviour of the natural processes and the implied uncertainty is heavily affected by the classical statistical paradigm that is in common use. However, the study of natural behaviours reveals the dominance of change at a multitude of time scales, which in statistical terms translates into strong time dependence, decaying very slowly with lag time. In its simplest form, this dependence, and equivalently the multi-scale change, can be described by a Hurst-Kolmogorov process using a single parameter additional to those of the marginal distribution. Remarkably, the Hurst-Kolmogorov stochastic dynamics results in much higher uncertainty in comparison to either nonstationary descriptions, or to typical stationary descriptions with independent random processes and common Markov-type processes. In addition, as far as typical statistical estimation is concerned, the Hurst-Kolmogorov dynamics implies dramatically wider uncertainty intervals in the estimation of location parameters (e.g., the mean) and highly negative bias in the estimation of dispersion parameters (e.g., the standard deviation), not to mention the bias and uncertainty in higher order moments. Surprisingly, all these differences are commonly unaccounted for in most studies of geophysical processes, which may result in inappropriate modelling, wrong inferences and false claims about the
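
    The quantitative core of the uncertainty claim admits a one-line statement. For a Hurst-Kolmogorov process with Hurst coefficient H (1/2 < H < 1), the standard deviation of the sample mean over n values scales as (a standard result for this process class):

        \mathrm{StD}[\bar{x}_n] \;=\; \frac{\sigma}{n^{1-H}} \quad\text{versus}\quad \frac{\sigma}{\sqrt{n}} \;\text{ under independence},

    so for H near 1 the effective information content of a record is far smaller than its nominal length suggests.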

  7. Evolving approaches toward science based forest management.

    Science.gov (United States)

    Robert C. Szaro; Charles E. Peterson

    2004-01-01

    The scale, scope, and complexity of natural resource and environmental issues have dramatically increased, yet the urgency to solve these issues often requires immediate information that spans disciplinary boundaries, synthesizes material from a variety of sources, draws inferences, and identifies levels of confidence. Although science information and knowledge are only...

  8. "The art of scientific investigation" and "The logic of scientific inference"

    Directory of Open Access Journals (Sweden)

    SN Arseculeratne

    2013-10-01

    Full Text Available The contemporary modes of “medical Education” in Sri Lankan faculties of medicine are briefly reviewed. What are missing are discussions on the rational practice of scientific research, including the use of proper controls, an understanding of the role of logical inference from research results, and a knowledge of the basic philosophy of modern science; this lack results in a state of scientific illiteracy that leads to misconceptions in the interpretation of research findings. A further desideratum is an awareness of the preceding, valid, published literature. Examples from the research literature are provided to illustrate the consequences of these deficiencies. None of the Sri Lankan universities, except the Open University as far as the author is aware, includes discussions on the Philosophy of Modern Science in their curricula.

  9. Strong and superstrong pulsed magnetic fields generation

    CERN Document Server

    Shneerson, German A; Krivosheev, Sergey I

    2014-01-01

    Strong pulsed magnetic fields are important for several fields in physics and engineering, such as power generation and accelerator facilities. Basic aspects of the techniques for generating strong and superstrong pulsed magnetic fields are given, including the physics and hydrodynamics of the conductors interacting with the field, as well as an account of the significant progress in the generation of strong magnetic fields using the magnetic accumulation technique. Results of computer simulations, together with a survey of available field technology, complete the volume.

  10. Impurity screening in strongly coupled plasma systems

    CERN Document Server

    Kyrkos, S

    2003-01-01

    We present an overview of the problem of screening of an impurity in a strongly coupled one-component plasma within the framework of the linear response (LR) theory. We consider 3D, 2D and quasi-2D layered systems. For a strongly coupled plasma the LR can be determined by way of the known S(k) structure functions. In general, an oscillating screening potential with local overscreening and antiscreening regions emerges. In the case of the bilayer, this phenomenon becomes global, as overscreening develops in the layer of the impurity and antiscreening in the adjacent layer. We comment on the limitations of the LR theory in the strong coupling situation.

  11. The lambda sigma calculus and strong normalization

    DEFF Research Database (Denmark)

    Schack-Nielsen, Anders; Schürmann, Carsten

    Explicit substitution calculi can be classified into several distinct categories depending on whether they are confluent, meta-confluent, strong normalization preserving, strongly normalizing, simulating, fully compositional, and/or local. In this paper we present a variant of the λσ-calculus, which satisfies all seven conditions. In particular, we show how to circumvent Mellies' counter-example to strong normalization by a slight restriction of the congruence rules. The calculus is implemented as the core data structure of the Celf logical framework. All meta-theoretic aspects of this work

  12. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. BEDARTHA GOSWAMI. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short ...

  13. Resonance – Journal of Science Education | Indian Academy of ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education. V R Padmawar. Articles written in Resonance – Journal of Science Education. Volume 1 Issue 5 May 1996 pp 49-58 General Article. Sampling, Probability Models and Statistical Reasoning Statistical Inference · Mohan Delampady V R Padmawar · More Details ...

  14. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. NORBERT MARWAN. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short time ...

  15. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. PAUL SCHULTZ. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short time ...

  16. One-tailed asymptotic inferences for the difference of proportions: Analysis of 97 methods of inference.

    Science.gov (United States)

    Hernández, María Álvarez; Andrés, Antonio Martín; Tejedor, Inmaculada Herranz

    2018-04-02

    Two-tailed asymptotic inferences for the difference d = p2 - p1 with independent proportions have been widely studied in the literature. Nevertheless, the case of one tail has received less attention, despite its great practical importance (superiority studies and noninferiority studies). This paper assesses 97 methods to make these inferences (test and confidence intervals [CIs]), although it also alludes to many others. The conclusions obtained are (1) the optimal method in general (and particularly for errors α = 1% and 5%) is based on arcsine transformation, with the maximum likelihood estimator restricted to the null hypothesis and increasing the successes and failures by 3/8; (2) the optimal method for α = 10% is a modification of the classic model of Peskun; (3) a more simple and acceptable option for large sample sizes and values of d not near to ±1 is the classic method of Peskun; and (4) in the particular case of the superiority and inferiority tests, the optimal method is the classic Wald method (with continuity correction) when the successes and failures are increased by one.
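
    A sketch of the ingredients of the recommended method: the arcsine (variance-stabilizing) transformation arcsin(sqrt(p)) has approximate variance 1/(4n), and "increasing the successes and failures by 3/8" corresponds to the adjusted estimator (x + 3/8)/(n + 3/4), giving a one-tailed statistic of roughly the form

        z \;=\; \frac{\arcsin\sqrt{\tilde{p}_2} - \arcsin\sqrt{\tilde{p}_1}}{\sqrt{1/(4 n_2) + 1/(4 n_1)}}, \qquad \tilde{p}_i = \frac{x_i + 3/8}{n_i + 3/4},

    with the paper's optimal variant additionally restricting the maximum likelihood estimator to the null hypothesis.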

  17. International Conference on Trends and Perspectives in Linear Statistical Inference

    CERN Document Server

    Rosen, Dietrich

    2018-01-01

    This volume features selected contributions on a variety of topics related to linear statistical inference. The peer-reviewed papers from the International Conference on Trends and Perspectives in Linear Statistical Inference (LinStat 2016) held in Istanbul, Turkey, 22-25 August 2016, cover topics in both theoretical and applied statistics, such as linear models, high-dimensional statistics, computational statistics, the design of experiments, and multivariate analysis. The book is intended for statisticians, Ph.D. students, and professionals who are interested in statistical inference.

  18. Working memory supports inference learning just like classification learning.

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan

    2013-08-01

    Recent research has found a positive relationship between people's working memory capacity (WMC) and their speed of category learning. To date, only classification-learning tasks have been considered, in which people learn to assign category labels to objects. It is unknown whether learning to make inferences about category features might also be related to WMC. We report data from a study in which 119 participants undertook classification learning and inference learning, and completed a series of WMC tasks. Working memory capacity was positively related to people's classification and inference learning performance.

  19. Surrogate based approaches to parameter inference in ocean models

    KAUST Repository

    Knio, Omar

    2016-01-06

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.
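
    A hedged toy sketch of the workflow: replace an expensive forward model with a cheap surrogate fitted to a few runs, then do the Bayesian update on the surrogate. The talk's actual machinery (spectral projections, regularized regression, MCMC or adjoint-based optimization) is substituted here by a polynomial fit and a grid posterior:

        import numpy as np

        def expensive_model(theta):                 # stand-in for the ocean model
            return np.sin(theta) + 0.5 * theta

        train_theta = np.linspace(0.0, 3.0, 12)     # a few expensive evaluations
        surrogate = np.polynomial.Polynomial.fit(
            train_theta, expensive_model(train_theta), deg=4)

        obs, sigma = 1.8, 0.05                      # observation and noise level
        grid = np.linspace(0.0, 3.0, 1000)          # uniform prior over the grid
        like = np.exp(-0.5 * ((obs - surrogate(grid)) / sigma) ** 2)
        post = like / like.sum()                    # normalized grid posterior
        print("posterior mean:", (grid * post).sum())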

  20. Fast and scalable inference of multi-sample cancer lineages.

    KAUST Repository

    Popic, Victoria

    2015-05-06

    Somatic variants can be used as lineage markers for the phylogenetic reconstruction of cancer evolution. Since somatic phylogenetics is complicated by sample heterogeneity, novel specialized tree-building methods are required for cancer phylogeny reconstruction. We present LICHeE (Lineage Inference for Cancer Heterogeneity and Evolution), a novel method that automates the phylogenetic inference of cancer progression from multiple somatic samples. LICHeE uses variant allele frequencies of somatic single nucleotide variants obtained by deep sequencing to reconstruct multi-sample cell lineage trees and infer the subclonal composition of the samples. LICHeE is open source and available at http://viq854.github.io/lichee .

  1. Correlated Fluctuations in Strongly Coupled Binary Networks Beyond Equilibrium

    Directory of Open Access Journals (Sweden)

    David Dahmen

    2016-08-01

    Full Text Available Randomly coupled Ising spins constitute the classical model of collective phenomena in disordered systems, with applications covering glassy magnetism and frustration, combinatorial optimization, protein folding, stock market dynamics, and social dynamics. The phase diagram of these systems is obtained in the thermodynamic limit by averaging over the quenched randomness of the couplings. However, many applications require the statistics of activity for a single realization of the possibly asymmetric couplings in finite-sized networks. Examples include reconstruction of couplings from the observed dynamics, representation of probability distributions for sampling-based inference, and learning in the central nervous system based on the dynamic and correlation-dependent modification of synaptic connections. The systematic cumulant expansion for kinetic binary (Ising) threshold units with strong, random, and asymmetric couplings presented here goes beyond mean-field theory and is applicable outside thermodynamic equilibrium; a system of approximate nonlinear equations predicts average activities and pairwise covariances in quantitative agreement with full simulations down to hundreds of units. The linearized theory yields an expansion of the correlation and response functions in collective eigenmodes, leads to an efficient algorithm solving the inverse problem, and shows that correlations are invariant under scaling of the interaction strengths.

  2. Strong and strategic conformity understanding by 3- and 5-year-old children.

    Science.gov (United States)

    Cordonier, Laurent; Nettles, Theresa; Rochat, Philippe

    2017-12-18

    'Strong conformity' corresponds to the public endorsement of majority opinions that are in blatant contradiction to one's own correct perceptual judgements of the situation. We tested strong conformity inference by 3- and 5-year-old children using a third-person perspective paradigm. Results show that at neither age do children spontaneously expect that an ostracized third-party individual who wants to affiliate with the majority group will show strong conformity. However, when questioned as to what the ostracized individual should do to befriend others, from 5 years of age children explicitly demonstrate that they construe strong conformity as a strategic means of social affiliation. Additional data suggest that strong and strategic conformity understanding from an observer's third-person perspective is linked to the passing of the language-mediated false belief theory of mind task, an index of children's emerging 'meta' ability to construe the mental state of others. Statement of contribution What is already known on this subject? 'Strong conformity' corresponds to the public endorsement of majority opinions that are in blatant contradiction to one's own correct perceptual judgements of the situation. Asch's (1956, Psychological Monographs: General and Applied, 70, 1) classic demonstration of strong conformity with adults has been replicated with preschool children: 3- to 4-year-olds manifest signs of strong conformity by reversing their correct perceptual judgements about thirty to forty per cent of the time to fit with contradictory statements held unanimously by other individuals (Corriveau & Harris, 2010, Developmental Psychology, 46, 437; Corriveau et al., 2013, Journal of Cognition and Culture, 13, 367; Haun & Tomasello, 2011, Child Development, 82, 1759). As for adults, strong conformity does not obliterate children's own private, accurate knowledge of the situation. It is in essence a public expression to fit the group and alleviate social dissonance.

  3. Color inference in visual communication: the meaning of colors in recycling.

    Science.gov (United States)

    Schloss, Karen B; Lessard, Laurent; Walmsley, Charlotte S; Foley, Kathleen

    2018-01-01

    People interpret abstract meanings from colors, which makes color a useful perceptual feature for visual communication. This process is complicated, however, because there is seldom a one-to-one correspondence between colors and meanings. One color can be associated with many different concepts (one-to-many mapping) and many colors can be associated with the same concept (many-to-one mapping). We propose that to interpret color-coding systems, people perform assignment inference to determine how colors map onto concepts. We studied assignment inference in the domain of recycling. Participants saw images of colored but unlabeled bins and were asked to indicate which bins they would use to discard different kinds of recyclables and trash. In Experiment 1, we tested two hypotheses for how people perform assignment inference. The local assignment hypothesis predicts that people simply match objects with their most strongly associated color. The global assignment hypothesis predicts that people also account for the association strengths between all other objects and colors within the scope of the color-coding system. Participants discarded objects in bins that optimized the color-object associations of the entire set, which is consistent with the global assignment hypothesis. This sometimes resulted in discarding objects in bins whose colors were weakly associated with the object, even when there was a stronger associated option available. In Experiment 2, we tested different methods for encoding color-coding systems and found that people were better at assignment inference when color sets simultaneously maximized the association strength between assigned color-object pairings while minimizing associations between unassigned pairings. Our study provides an approach for designing intuitive color-coding systems that facilitate communication through visual media such as graphs, maps, signs, and artifacts.
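
    Global assignment inference has a natural computational reading as an optimal assignment problem: pick the bin-to-category mapping that maximizes total association strength over the whole set, rather than matching each object to its locally strongest color. A toy sketch with an invented association matrix (the study measured these strengths empirically):

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        objects = ["paper", "glass", "trash"]
        colors = ["white", "green", "black"]
        assoc = np.array([[0.9, 0.2, 0.3],    # object-color association strengths
                          [0.3, 0.8, 0.2],
                          [0.4, 0.3, 0.7]])
        rows, cols = linear_sum_assignment(-assoc)   # negate to maximize the total
        for r, c in zip(rows, cols):
            print(objects[r], "->", colors[c])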

  4. No Evidence for Phylostratigraphic Bias Impacting Inferences on Patterns of Gene Emergence and Evolution.

    Science.gov (United States)

    Domazet-Lošo, Tomislav; Carvunis, Anne-Ruxandra; Albà, M Mar; Šestak, Martin Sebastijan; Bakaric, Robert; Neme, Rafik; Tautz, Diethard

    2017-04-01

    Phylostratigraphy is a computational framework for dating the emergence of DNA and protein sequences in a phylogeny. It has been extensively applied to make inferences on patterns of genome evolution, including patterns of disease gene evolution, ontogeny and de novo gene origination. Phylostratigraphy typically relies on BLAST searches along a species tree, but new simulation studies have raised concerns about the ability of BLAST to detect remote homologues and its impact on phylostratigraphic inferences. Here, we re-assessed these simulations. We found that, even with a possible overall BLAST false negative rate between 11% and 15%, the large majority of sequences assigned to a recent evolutionary origin by phylostratigraphy is unaffected by technical concerns about BLAST. Where the results of the simulations did cast doubt on previously reported findings, we repeated the original analyses but now excluded all questionable sequences. The originally described patterns remained essentially unchanged. These new analyses strongly support phylostratigraphic inferences, including: genes that emerged after the origin of eukaryotes are more likely to be expressed in the ectoderm than in the endoderm or mesoderm in Drosophila, and the de novo emergence of protein-coding genes from non-genic sequences occurs through proto-gene intermediates in yeast. We conclude that BLAST is an appropriate and sufficiently sensitive tool in phylostratigraphic analysis that does not appear to introduce significant biases into evolutionary pattern inferences. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  5. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Date of birth: 15 September 1955. Specialization: Strongly Interacting Electron Systems, Disordered Systems, Nanomaterials and Energy Materials Address: Professor, Solid State & Structural Chemistry Unit, Indian Institute of Science, Bengaluru 560 012, Karnataka Contact: Office: (080) 2360 7576, (080) 2293 2945

  6. Inferring species interactions through joint mark–recapture analysis

    Science.gov (United States)

    Yackulic, Charles B.; Korman, Josh; Yard, Michael D.; Dzul, Maria C.

    2018-01-01

    Introduced species are frequently implicated in declines of native species. In many cases, however, evidence linking introduced species to native declines is weak. Failure to make strong inferences regarding the role of introduced species can hamper attempts to predict population viability and delay effective management responses. For many species, mark–recapture analysis is the more rigorous form of demographic analysis. However, to our knowledge, there are no mark–recapture models that allow for joint modeling of interacting species. Here, we introduce a two‐species mark–recapture population model in which the vital rates (and capture probabilities) of one species are allowed to vary in response to the abundance of the other species. We use a simulation study to explore bias and choose an approach to model selection. We then use the model to investigate species interactions between endangered humpback chub (Gila cypha) and introduced rainbow trout (Oncorhynchus mykiss) in the Colorado River between 2009 and 2016. In particular, we test hypotheses about how two environmental factors (turbidity and temperature), intraspecific density dependence, and rainbow trout abundance are related to survival, growth, and capture of juvenile humpback chub. We also project the long‐term effects of different rainbow trout abundances on adult humpback chub abundances. Our simulation study suggests this approach has minimal bias under potentially challenging circumstances (i.e., low capture probabilities) that characterized our application and that model selection using indicator variables could reliably identify the true generating model even when process error was high. When the model was applied to rainbow trout and humpback chub, we identified negative relationships between rainbow trout abundance and the survival, growth, and capture probability of juvenile humpback chub. Effects on interspecific interactions on survival and capture probability were strongly

  7. Models and Inference for Multivariate Spatial Extremes

    KAUST Repository

    Vettori, Sabrina

    2017-12-07

    The development of flexible and interpretable statistical methods is necessary in order to provide appropriate risk assessment measures for extreme events and natural disasters. In this thesis, we address this challenge by contributing to the developing research field of Extreme-Value Theory. We initially study the performance of existing parametric and non-parametric estimators of extremal dependence for multivariate maxima. As the dimensionality increases, non-parametric estimators are more flexible than parametric methods but present some loss in efficiency that we quantify under various scenarios. We introduce a statistical tool which imposes the required shape constraints on non-parametric estimators in high dimensions, significantly improving their performance. Furthermore, by embedding the tree-based max-stable nested logistic distribution in the Bayesian framework, we develop a statistical algorithm that identifies the most likely tree structures representing the data's extremal dependence using the reversible jump Monte Carlo Markov Chain method. A mixture of these trees is then used for uncertainty assessment in prediction through Bayesian model averaging. The computational complexity of full likelihood inference is significantly decreased by deriving a recursive formula for the nested logistic model likelihood. The algorithm performance is verified through simulation experiments which also compare different likelihood procedures. Finally, we extend the nested logistic representation to the spatial framework in order to jointly model multivariate variables collected across a spatial region. This situation emerges often in environmental applications but is not often considered in the current literature. Simulation experiments show that the new class of multivariate max-stable processes is able to detect both the cross and inner spatial dependence of a number of extreme variables at a relatively low computational cost, thanks to its Bayesian hierarchical

  8. PREFACE: ELC International Meeting on Inference, Computation, and Spin Glasses (ICSG2013)

    Science.gov (United States)

    Kabashima, Yoshiyuki; Hukushima, Koji; Inoue, Jun-ichi; Tanaka, Toshiyuki; Watanabe, Osamu

    2013-12-01

    The close relationship between probability-based inference and statistical mechanics of disordered systems has been noted for some time. This relationship has provided researchers with a theoretical foundation in various fields of information processing for analytical performance evaluation and construction of efficient algorithms based on message-passing or Monte Carlo sampling schemes. The ELC International Meeting on 'Inference, Computation, and Spin Glasses' (ICSG2013) was held in Sapporo, 28-30 July 2013. The meeting was organized as a satellite meeting of STATPHYS25 in order to offer a forum where concerned researchers can assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies between statistical mechanics and information sciences. Financial support from Grant-in-Aid for Scientific Research on Innovative Areas, MEXT, Japan 'Exploring the Limits of Computation (ELC)' is gratefully acknowledged. We are pleased to publish 23 papers contributed by invited speakers of ICSG2013 in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of this highly vigorous interdisciplinary field between statistical mechanics and information/computer science. Editors and ICSG2013 Organizing Committee: Koji Hukushima, Jun-ichi Inoue (Local Chair of ICSG2013), Yoshiyuki Kabashima (Editor-in-Chief), Toshiyuki Tanaka, Osamu Watanabe (General Chair of ICSG2013)

  9. Strong Coupling Corrections in Quantum Thermodynamics

    Science.gov (United States)

    Perarnau-Llobet, M.; Wilming, H.; Riera, A.; Gallego, R.; Eisert, J.

    2018-03-01

    Quantum systems strongly coupled to many-body systems equilibrate to the reduced state of a global thermal state, deviating from the local thermal state of the system as it occurs in the weak-coupling limit. Taking this insight as a starting point, we study the thermodynamics of systems strongly coupled to thermal baths. First, we provide strong-coupling corrections to the second law applicable to general systems in three of its different readings: As a statement of maximal extractable work, on heat dissipation, and bound to the Carnot efficiency. These corrections become relevant for small quantum systems and vanish in first order in the interaction strength. We then move to the question of power of heat engines, obtaining a bound on the power enhancement due to strong coupling. Our results are exemplified on the paradigmatic non-Markovian quantum Brownian motion.

  10. Finding quantum effects in strong classical potentials

    Science.gov (United States)

    Hegelich, B. Manuel; Labun, Lance; Labun, Ou Z.

    2017-06-01

    The long-standing challenge to describing charged particle dynamics in strong classical electromagnetic fields is how to incorporate classical radiation, classical radiation reaction and quantized photon emission into a consistent unified framework. The current, semiclassical methods to describe the dynamics of quantum particles in strong classical fields also provide the theoretical framework for fundamental questions in gravity and hadron-hadron collisions, including Hawking radiation, cosmological particle production and thermalization of particles created in heavy-ion collisions. However, as we show, these methods break down for highly relativistic particles propagating in strong fields. They must therefore be improved and adapted for the description of laser-plasma experiments that typically involve the acceleration of electrons. Theory developed from quantum electrodynamics, together with dedicated experimental efforts, offer the best controllable context to establish a robust, experimentally validated foundation for the fundamental theory of quantum effects in strong classical potentials.

  11. The Charm and Beauty of Strong Interactions

    Science.gov (United States)

    El-Bennich, Bruno

    2018-01-01

    We briefly review common features and overlapping issues in hadron and flavor physics focussing on continuum QCD approaches to heavy bound states, their mass spectrum and weak decay constants in different strong interaction models.

  12. Atomic ionization by strong coherent radiation

    International Nuclear Information System (INIS)

    Brandi, H.S.; Davidovich, L.

    1979-07-01

    The relation among the three most frequently used non-perturbative methods proposed to study the ionization of atoms by strong electromagnetic fields is established. Their range of validity is also determined. (Author) [pt

  13. Perturbation of an exact strong gravity solution

    International Nuclear Information System (INIS)

    Baran, S.A.

    1982-10-01

    Perturbations of an exact strong gravity solution are investigated. It is shown, by using the new multipole expansions previously presented, that this exact and static spherically symmetric solution is stable under odd parity perturbations. (author)

  14. Strong-force theorists scoop Nobel Prize

    CERN Multimedia

    Durrani, Matin

    2004-01-01

    Three US theorists have shared the 2004 Nobel Prize in Physics "for the discovery of asymptotic freedom in the theory of the strong interaction". Their theoretical work explains why quarks behave almost as free particles at high energies (½ page)

  15. Calculating hadronic properties in strong QCD

    International Nuclear Information System (INIS)

    Pennington, M.R.

    1996-01-01

    This talk gives a brief review of the progress that has been made in calculating the properties of hadrons in strong QCD. In keeping with this meeting I will concentrate on those properties that can be studied with electromagnetic probes. Though perturbative QCD is highly successful, it only applies in a limited kinematic regime, where hard scatterings occur, and the quarks move in the interaction region as if they are free, pointlike objects. However, the bulk of strong interactions are governed by the long distance regime, where the strong interaction is strong. It is this regime of length scales of the order of a Fermi that determines the spectrum of light hadrons and their properties. The calculation of these properties requires an understanding of non-perturbative QCD, of confinement and chiral symmetry breaking. (author)

  16. Philosophy for the rest of cognitive science.

    Science.gov (United States)

    Stepp, Nigel; Chemero, Anthony; Turvey, Michael T

    2011-04-01

    Cognitive science has always included multiple methodologies and theoretical commitments. The philosophy of cognitive science should embrace, or at least acknowledge, this diversity. Bechtel's (2009a) proposed philosophy of cognitive science, however, applies only to representationalist and mechanist cognitive science, ignoring the substantial minority of dynamically oriented cognitive scientists. As an example of nonrepresentational, dynamical cognitive science, we describe strong anticipation as a model for circadian systems (Stepp & Turvey, 2009). We then propose a philosophy of science appropriate to nonrepresentational, dynamical cognitive science. Copyright © 2011 Cognitive Science Society, Inc.

  17. Building strong brands – does it matter?

    OpenAIRE

    Aure, Kristin Gaaseide; Nervik, Kristine Dybvik

    2014-01-01

    Brand equity has proven, through several decades of research, to be a primary source of competitive advantage and future earnings (Yoo & Donthu, 2001). Building strong brands has therefore become a priority for many organizations, with the presumption that building strong brands yields these advantages (Yasin et al., 2007). A quantitative survey was conducted at Sunnmøre in Norway in order to answer the two developed research questions. - Does the brand equity dimensions; brand...

  18. Algebra of strong and electroweak interactions

    International Nuclear Information System (INIS)

    Bolokhov, S.V.; Vladimirov, Yu.S.

    2004-01-01

    The algebraic approach to describing the electroweak and strong interactions is considered within the framework of binary geometrophysics, based on the principles of the Fokker-Feynman direct interparticle interaction theories, of the Kaluza-Klein multidimensional geometrical models, and of the physical structures theory. It is shown that in this approach the electroweak and strong interactions of elementary particles through the intermediate vector bosons are characterized by the subtypes of the algebraic classification of the complex 3 x 3-matrices [ru

  19. Problem Solving as Probabilistic Inference with Subgoaling: Explaining Human Successes and Pitfalls in the Tower of Hanoi.

    Science.gov (United States)

    Donnarumma, Francesco; Maisto, Domenico; Pezzulo, Giovanni

    2016-04-01

    How do humans and other animals face novel problems for which predefined solutions are not available? Human problem solving links to flexible reasoning and inference rather than to slow trial-and-error learning. It has received considerable attention since the early days of cognitive science, giving rise to well known cognitive architectures such as SOAR and ACT-R, but its computational and brain mechanisms remain incompletely known. Furthermore, it is still unclear whether problem solving is a "specialized" domain or module of cognition, in the sense that it requires computations that are fundamentally different from those supporting perception and action systems. Here we advance a novel view of human problem solving as probabilistic inference with subgoaling. In this perspective, key insights from cognitive architectures are retained such as the importance of using subgoals to split problems into subproblems. However, here the underlying computations use probabilistic inference methods analogous to those that are increasingly popular in the study of perception and action systems. To test our model we focus on the widely used Tower of Hanoi (ToH) task, and show that our proposed method can reproduce characteristic idiosyncrasies of human problem solvers: their sensitivity to the "community structure" of the ToH and their difficulties in executing so-called "counterintuitive" movements. Our analysis reveals that subgoals have two key roles in probabilistic inference and problem solving. First, prior beliefs on (likely) useful subgoals carve the problem space and define an implicit metric for the problem at hand, a metric to which humans are sensitive. Second, subgoals are used as waypoints in the probabilistic problem-solving inference and permit finding effective solutions that, when subgoals are unavailable, lead to problem-solving deficits. Our study thus suggests that a probabilistic inference scheme enhanced with subgoals provides a comprehensive framework to study problem

  20. Manipulating light with strongly modulated photonic crystals

    International Nuclear Information System (INIS)

    Notomi, Masaya

    2010-01-01

    Recently, strongly modulated photonic crystals, fabricated by the state-of-the-art semiconductor nanofabrication process, have realized various novel optical properties. This paper describes the way in which they differ from other optical media, and clarifies what they can do. In particular, three important issues are considered: light confinement, frequency dispersion and spatial dispersion. First, I describe the latest status and impact of ultra-strong light confinement in a wavelength-cubic volume achieved in photonic crystals. Second, the extreme reduction in the speed of light is reported, which was achieved as a result of frequency dispersion management. Third, strange negative refraction in photonic crystals is introduced, which results from their unique spatial dispersion, and it is clarified how this leads to perfect imaging. The last two sections are devoted to applications of these novel properties. First, I report the fact that strong light confinement and huge light-matter interaction enhancement make strongly modulated photonic crystals promising for on-chip all-optical processing, and present several examples including all-optical switches/memories and optical logics. As a second application, it is shown that the strong light confinement and slow light in strongly modulated photonic crystals enable the adiabatic tuning of light, which leads to various novel ways of controlling light, such as adiabatic frequency conversion, efficient optomechanics systems, photon memories and photons pinning.