WorldWideScience

Sample records for grn inference schemes

  1. Inference of Gene Regulatory Network Based on Local Bayesian Networks.

    Science.gov (United States)

    Liu, Fei; Zhang, Shao-Wu; Guo, Wei-Feng; Wei, Ze-Gang; Chen, Luonan

    2016-08-01

    The inference of gene regulatory networks (GRNs) from expression data can mine the direct regulations among genes and gain deep insights into biological processes at a network level. During the past decades, numerous computational approaches have been introduced for inferring the GRNs. However, many of them still suffer from various problems, e.g., Bayesian network (BN) methods cannot handle large-scale networks due to their high computational complexity, while information theory-based methods cannot identify the directions of regulatory interactions and also suffer from false positive/negative problems. To overcome these limitations, in this work we present a novel algorithm, namely local Bayesian network (LBN), to infer GRNs from gene expression data by using a network decomposition strategy and a false-positive edge elimination scheme. Specifically, the LBN algorithm first uses conditional mutual information (CMI) to construct an initial network or GRN, which is decomposed into a number of local networks or GRNs. Then, a BN method is employed to generate a series of local BNs by selecting the k-nearest neighbors of each gene as its candidate regulatory genes, which significantly reduces the exponential search space over all possible GRN structures. Integrating these local BNs forms a tentative network or GRN by performing CMI, which reduces redundant regulations in the GRN and thus alleviates the false positive problem. The final network or GRN can be obtained by iteratively performing CMI and local BN on the tentative network. In the iterative process, the false or redundant regulations are gradually removed. When tested on the benchmark GRN datasets from the DREAM challenge as well as the SOS DNA repair network in E. coli, our results suggest that LBN significantly outperforms other state-of-the-art methods (ARACNE, GENIE3 and NARROMI), with more accurate and robust performance. In particular, the decomposition strategy with local Bayesian networks not only effectively reduce...
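    The CMI-based false-positive elimination step that this abstract describes can be illustrated with a small sketch. This is a minimal illustration under a Gaussian assumption, so that mutual information and conditional mutual information reduce to functions of correlation and partial correlation; the function names and the single-conditioning-gene pruning rule below are our own simplifications, not the published LBN algorithm.

```python
import numpy as np

def gaussian_mi(x, y):
    # MI between two variables under a Gaussian assumption:
    # I(X;Y) = -0.5 * log(1 - rho^2)
    r = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1 - r**2 + 1e-12)

def gaussian_cmi(x, y, z):
    # Conditional MI I(X;Y|Z) from the partial correlation of X and Y given Z.
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    p = (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))
    return -0.5 * np.log(1 - p**2 + 1e-12)

def initial_network(X, thresh=0.05):
    # X: samples x genes. Keep edge (i, j) if pairwise MI exceeds thresh.
    n = X.shape[1]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if gaussian_mi(X[:, i], X[:, j]) > thresh:
                edges.add((i, j))
    return edges

def prune_false_positives(X, edges, thresh=0.05):
    # Remove edge (i, j) if some other gene k explains it away, i.e.
    # I(i;j|k) drops below thresh (the CMI elimination idea).
    kept = set()
    for (i, j) in edges:
        neighbours = {k for (a, b) in edges for k in (a, b)} - {i, j}
        cmis = [gaussian_cmi(X[:, i], X[:, j], X[:, k]) for k in neighbours]
        if not cmis or min(cmis) > thresh:
            kept.add((i, j))
    return kept
```

    On data where gene 1 and gene 2 are both driven by gene 0, the indirect edge (1, 2) passes the pairwise-MI screen but is removed by the CMI step, which is exactly the false-positive behaviour the abstract targets.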

  2. Inference of Gene Regulatory Network Based on Local Bayesian Networks.

    Directory of Open Access Journals (Sweden)

    Fei Liu

    2016-08-01

    Full Text Available The inference of gene regulatory networks (GRNs) from expression data can mine the direct regulations among genes and gain deep insights into biological processes at a network level. During the past decades, numerous computational approaches have been introduced for inferring the GRNs. However, many of them still suffer from various problems, e.g., Bayesian network (BN) methods cannot handle large-scale networks due to their high computational complexity, while information theory-based methods cannot identify the directions of regulatory interactions and also suffer from false positive/negative problems. To overcome these limitations, in this work we present a novel algorithm, namely local Bayesian network (LBN), to infer GRNs from gene expression data by using a network decomposition strategy and a false-positive edge elimination scheme. Specifically, the LBN algorithm first uses conditional mutual information (CMI) to construct an initial network or GRN, which is decomposed into a number of local networks or GRNs. Then, a BN method is employed to generate a series of local BNs by selecting the k-nearest neighbors of each gene as its candidate regulatory genes, which significantly reduces the exponential search space over all possible GRN structures. Integrating these local BNs forms a tentative network or GRN by performing CMI, which reduces redundant regulations in the GRN and thus alleviates the false positive problem. The final network or GRN can be obtained by iteratively performing CMI and local BN on the tentative network. In the iterative process, the false or redundant regulations are gradually removed. When tested on the benchmark GRN datasets from the DREAM challenge as well as the SOS DNA repair network in E. coli, our results suggest that LBN significantly outperforms other state-of-the-art methods (ARACNE, GENIE3 and NARROMI), with more accurate and robust performance. In particular, the decomposition strategy with local Bayesian networks not only...

  3. Standing Sausage Modes in Nonuniform Magnetic Tubes: An Inversion Scheme for Inferring Flare Loop Parameters

    Science.gov (United States)

    Chen, Shao-Xia; Li, Bo; Xiong, Ming; Yu, Hui; Guo, Ming-Zhe

    2015-10-01

    Standing sausage modes in flare loops are important for interpreting quasi-periodic pulsations (QPPs) in solar flare light curves. We propose an inversion scheme that consistently uses their periods P and damping times τ to diagnose flare loop parameters. We derive a generic dispersion relation governing linear sausage waves in pressure-less straight tubes, for which the transverse density inhomogeneity takes place in a layer of arbitrary width l and is of arbitrary form. We find that P and τ depend on the combination of [R/v_Ai, L/R, l/R, ρ_i/ρ_e], where R is the loop radius, L is the loop length, v_Ai is the internal Alfvén speed, and ρ_i/ρ_e is the density contrast. For all the density profiles examined, P and τ experience saturation when L/R ≫ 1, yielding an inversion curve in the [R/v_Ai, l/R, ρ_i/ρ_e] space with a specific density profile when L/R is sufficiently large. When applied to a spatially unresolved QPP event, the scheme yields that R/v_Ai is the best constrained, whereas l/R corresponds to the other extreme. For spatially resolved QPPs, while L/R ≫ 1 cannot be assumed beforehand, an inversion curve remains possible due to additional geometrical constraints. When a spatially resolved QPP event involves another mode, as is the case for a recent event, the full set of [v_Ai, l, ρ_i/ρ_e] can be inferred. We conclude that the proposed scheme provides a useful tool for magneto-seismologically exploiting QPPs.

  4. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    .1 with the title 'Inference'.) This contribution concerns statistical inference for parametric models used in stochastic geometry and based on quick and simple simulation-free procedures as well as more comprehensive methods using Markov chain Monte Carlo (MCMC) simulations. Due to space limitations the focus...

  5. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    2010-01-01

    Chapter 9: This contribution concerns statistical inference for parametric models used in stochastic geometry and based on quick and simple simulation-free procedures as well as more comprehensive methods based on a maximum likelihood or Bayesian approach combined with Markov chain Monte Carlo...

  6. Imetelstat (GRN163L)--telomerase-based cancer therapy.

    Science.gov (United States)

    Röth, Alexander; Harley, Calvin B; Baerlocher, Gabriela M

    2010-01-01

    Telomeres and telomerase play essential roles in the regulation of the lifespan of human cells. While normal human somatic cells do not or only transiently express telomerase and therefore shorten their telomeres with each cell division, most human cancer cells typically express high levels of telomerase and show unlimited cell proliferation. High telomerase expression allows cells to proliferate and expand long-term and therefore supports tumor growth. Owing to its high expression and its role, telomerase has become an attractive diagnostic and therapeutic cancer target. Imetelstat (GRN163L) is a potent and specific telomerase inhibitor and so far the only drug of its class in clinical trials. Here, we report on the structure and the mechanism of action of imetelstat, as well as on the preclinical and clinical data and future prospects of using imetelstat in cancer therapy.

  7. Practical aspects of gene regulatory inference via conditional inference forests from expression data.

    Science.gov (United States)

    Bessonov, Kyrylo; Van Steen, Kristel

    2016-12-01

    Gene regulatory network (GRN) inference is an active area of research that facilitates understanding the complex interplays between biological molecules. We propose a novel framework to create such GRNs, based on Conditional Inference Forests (CIFs) as proposed by Strobl et al. Our framework consists of using ensembles of Conditional Inference Trees (CITs) and selecting an appropriate aggregation scheme for variable selection prior to network construction. We show on synthetic microarray data that taking the original implementation of CIFs with the conditional permutation scheme (CIFcond) may lead to improved performance compared to Breiman's implementation of Random Forests (RF). Among all newly introduced CIF-based methods and five network scenarios obtained from the DREAM4 challenge, CIFcond performed best. Networks derived from well-tuned CIFs, obtained by simply averaging P-values over tree ensembles (CIFmean), are particularly attractive, because they combine adequate performance with computational efficiency. Moreover, thresholds for variable selection are based on significance levels for P-values and, hence, do not need to be tuned. From a practical point of view, our extensive simulations show the potential advantages of CIFmean-based methods. Although more work is needed to improve on speed, especially when fully exploiting the advantages of CITs in the context of heterogeneous and correlated data, we have shown that CIF methodology can be flexibly inserted in a framework to infer biological interactions. Notably, we confirmed a biologically relevant interaction between IL2RA and FOXP1, linked to the IL-2 signaling pathway and to type 1 diabetes.
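    As a hedged sketch of the general tree-ensemble approach to GRN scoring that this record describes: the paper itself uses Conditional Inference Forests from the R party package, which we substitute here with scikit-learn's Random Forests, closer to the GENIE3-style baseline the authors compare against. For each target gene, the other genes' importances for predicting it become directed edge scores.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def grn_scores(X, n_trees=100, seed=0):
    """Score directed edges regulator -> target by tree-ensemble importance.

    X: samples x genes expression matrix. Returns a genes x genes matrix W
    where W[i, j] is the importance of gene i for predicting gene j.
    (Sketch only: the published framework aggregates P-values from
    Conditional Inference Trees instead of impurity importances.)
    """
    n = X.shape[1]
    W = np.zeros((n, n))
    for j in range(n):
        others = [i for i in range(n) if i != j]
        rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
        rf.fit(X[:, others], X[:, j])       # regress target on all other genes
        W[others, j] = rf.feature_importances_
    return W
```

    Ranking the entries of W (excluding the diagonal) then yields a candidate edge list, with a user-chosen cutoff playing the role that P-value thresholds play for the CIF-based variants.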

  8. Likelihood Inference under Generalized Hybrid Censoring Scheme with Competing Risks

    Institute of Scientific and Technical Information of China (English)

    MAO Song; SHI Yi-min

    2016-01-01

    Statistical inference is developed for the analysis of generalized type-II hybrid censoring data under an exponential competing risks model. To address the unsatisfactory performance of approximate methods for small sample sizes, we establish the exact conditional distributions of the parameter estimators via the conditional moment generating function (CMGF). Furthermore, confidence intervals (CIs) are constructed from the exact distributions, the approximate distributions, and the bootstrap method, respectively, and their performances are evaluated by Monte Carlo simulations. Finally, a real data set is analyzed to illustrate all the methods developed here.
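    For the exponential competing-risks model referenced above, the maximum-likelihood estimates of the cause-specific rates have a simple closed form: λ̂_j = d_j / Σᵢ tᵢ, the number of failures from cause j divided by the total time on test (failures plus censoring times). A minimal sketch of that point estimate only; the paper's exact conditional distributions via the CMGF are beyond this illustration.

```python
import numpy as np

def exp_competing_risks_mle(times, causes):
    """MLE of cause-specific hazard rates for exponential competing risks.

    times:  observed times (failure or censoring) for each unit.
    causes: 0 for a censored unit, j >= 1 for failure from cause j.
    Returns {j: lambda_j_hat} with
        lambda_j_hat = (# failures from cause j) / (total time on test).
    """
    times = np.asarray(times, dtype=float)
    causes = np.asarray(causes)
    total_time = times.sum()                      # total time on test
    ks = sorted(set(int(c) for c in causes if c > 0))
    return {j: (causes == j).sum() / total_time for j in ks}
```

    On simulated data with two independent exponential failure causes, the estimates recover the true rates, which is the behaviour the exact and approximate CIs in the paper are then built around.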

  9. DMAC-AN INTEGRATED ENCRYPTION SCHEME WITH RSA FOR AC TO OBSTRUCT INFERENCE ATTACKS

    Directory of Open Access Journals (Sweden)

    R. Jeeva

    2012-12-01

    Full Text Available The indistinguishable-encryption proposal in Randomized Arithmetic Coding (RAC) is inefficient because it does not retain the messages it encrypts: it recomputes the cipher form of every message it sends, which increases both the computational cost and the response time. Floating-point representation of the cipher also complicates the decryption side because of loss of precision, and RAC does not handle inference attacks such as man-in-the-middle and third-party attacks. In our system, Dynamic Matrix Arithmetic Coding (DMAC) uses a dynamic session matrix to encrypt the messages. The size of the matrix is deduced from the session key, which contains the IDs of the end users and thereby proves server authentication. Nonce values, represented as the public key of the opposite party encrypted by the session key, are exchanged between the end users to provide mutual authentication. If an adversary tries to compromise either the server or an end user, the other system will not respond and the intrusion is easily detected. By integrating AC with RSA we have increased the hacking complexity by up to 99%.

  10. Standing Sausage Modes In Nonuniform Magnetic Tubes: An Inversion Scheme For Inferring Flare Loop Parameters

    CERN Document Server

    Chen, Shao-Xia; Xiong, Ming; Yu, Hui; Guo, Ming-Zhe

    2015-01-01

    Standing sausage modes in flare loops are important for interpreting quasi-periodic pulsations (QPPs) in solar flare light curves. We propose an inversion scheme that consistently uses their periods $P$ and damping times $\tau$ to diagnose flare loop parameters. We derive a generic dispersion relation governing linear sausage waves in pressure-less straight tubes, for which the transverse density inhomogeneity takes place in a layer of arbitrary width $l$ and is of arbitrary form. We find that $P$ and $\tau$ depend on the combination of $[R/v_{\rm Ai}, L/R, l/R, \rho_{\rm i}/\rho_{\rm e}]$, where $R$ is the loop radius, $L$ is the loop length, $v_{\rm Ai}$ is the internal Alfvén speed, and $\rho_{\rm i}/\rho_{\rm e}$ is the density contrast. For all the density profiles examined, $P$ and $\tau$ experience saturation when $L/R \gg 1$, yielding an inversion curve in the $[R/v_{\rm Ai}, l/R, \rho_{\rm i}/\rho_{\rm e}]$ space with a specific density profile when $L/R$ is sufficiently large. When applied to a spat...

  11. Ancestral state reconstruction by comparative analysis of a GRN kernel operating in echinoderms.

    Science.gov (United States)

    Erkenbrack, Eric M; Ako-Asare, Kayla; Miller, Emily; Tekelenburg, Saira; Thompson, Jeffrey R; Romano, Laura

    2016-01-01

    Diverse sampling of organisms across the five major classes in the phylum Echinodermata is beginning to reveal much about the structure and function of gene regulatory networks (GRNs) in development and evolution. Sea urchins are the most studied clade within this phylum, and recent work suggests there has been dramatic rewiring at the top of the skeletogenic GRN along the lineage leading to extant members of the euechinoid sea urchins. Such rewiring likely accounts for some of the observed developmental differences between the two major subclasses of sea urchins-cidaroids and euechinoids. To address effects of topmost rewiring on downstream GRN events, we cloned four downstream regulatory genes within the skeletogenic GRN and surveyed their spatiotemporal expression patterns in the cidaroid Eucidaris tribuloides. We performed phylogenetic analyses with homologs from other non-vertebrate deuterostomes and characterized their spatiotemporal expression by quantitative polymerase chain reaction (qPCR) and whole-mount in situ hybridization (WMISH). Our data suggest the erg-hex-tgif subcircuit, a putative GRN kernel, exhibits a mesoderm-specific expression pattern early in Eucidaris development that is directly downstream of the initial mesodermal GRN circuitry. Comparative analysis of the expression of this subcircuit in four echinoderm taxa allowed robust ancestral state reconstruction, supporting hypotheses that its ancestral function was to stabilize the mesodermal regulatory state and that it has been co-opted and deployed as a unit in mesodermal subdomains in distantly diverged echinoderms. Importantly, our study supports the notion that GRN kernels exhibit structural and functional modularity, locking down and stabilizing clade-specific, embryonic regulatory states.

  12. Telomerase inhibitor Imetelstat (GRN163L) limits the lifespan of human pancreatic cancer cells.

    Directory of Open Access Journals (Sweden)

    Katrina M Burchett

    Full Text Available Telomerase is required for the unlimited lifespan of cancer cells. The vast majority of pancreatic adenocarcinomas overexpress telomerase activity and blocking telomerase could limit their lifespan. GRN163L (Imetelstat) is a lipid-conjugated N3'→P5' thio-phosphoramidate oligonucleotide that blocks the template region of telomerase. The aim of this study was to define the effects of long-term GRN163L exposure on the maintenance of telomeres and lifespan of pancreatic cancer cells. Telomere size, telomerase activity, and telomerase inhibition response to GRN163L were measured in a panel of 10 pancreatic cancer cell lines. The cell lines exhibited large differences in levels of telomerase activity (46-fold variation), but most lines had very short telomeres (2-3 kb in size). GRN163L inhibited telomerase in all 10 pancreatic cancer cell lines, with IC50 ranging from 50 nM to 200 nM. Continuous GRN163L exposure of CAPAN1 (IC50 = 75 nM) and CD18 cells (IC50 = 204 nM) resulted in an initial rapid shortening of the telomeres followed by the maintenance of extremely short but stable telomeres. Continuous exposure to the drug eventually led to crisis and to a complete loss of viability after 47 (CAPAN1) and 69 (CD18) doublings. Crisis in these cells was accompanied by activation of a DNA damage response (γ-H2AX) and evidence of both senescence (SA-β-galactosidase activity) and apoptosis (sub-G1 DNA content, PARP cleavage). Removal of the drug after long-term GRN163L exposure led to a reactivation of telomerase and re-elongation of telomeres in the third week of cultivation without GRN163L. These findings show that the lifespan of pancreatic cancer cells can be limited by continuous telomerase inhibition. These results should facilitate the design of future clinical trials of GRN163L in patients with pancreatic cancer.

  13. Telomerase inhibitor Imetelstat (GRN163L) limits the lifespan of human pancreatic cancer cells.

    Science.gov (United States)

    Burchett, Katrina M; Yan, Ying; Ouellette, Michel M

    2014-01-01

    Telomerase is required for the unlimited lifespan of cancer cells. The vast majority of pancreatic adenocarcinomas overexpress telomerase activity and blocking telomerase could limit their lifespan. GRN163L (Imetelstat) is a lipid-conjugated N3'→P5' thio-phosphoramidate oligonucleotide that blocks the template region of telomerase. The aim of this study was to define the effects of long-term GRN163L exposure on the maintenance of telomeres and lifespan of pancreatic cancer cells. Telomere size, telomerase activity, and telomerase inhibition response to GRN163L were measured in a panel of 10 pancreatic cancer cell lines. The cell lines exhibited large differences in levels of telomerase activity (46-fold variation), but most lines had very short telomeres (2-3 kb in size). GRN163L inhibited telomerase in all 10 pancreatic cancer cell lines, with IC50 ranging from 50 nM to 200 nM. Continuous GRN163L exposure of CAPAN1 (IC50 = 75 nM) and CD18 cells (IC50 = 204 nM) resulted in an initial rapid shortening of the telomeres followed by the maintenance of extremely short but stable telomeres. Continuous exposure to the drug eventually led to crisis and to a complete loss of viability after 47 (CAPAN1) and 69 (CD18) doublings. Crisis in these cells was accompanied by activation of a DNA damage response (γ-H2AX) and evidence of both senescence (SA-β-galactosidase activity) and apoptosis (sub-G1 DNA content, PARP cleavage). Removal of the drug after long-term GRN163L exposure led to a reactivation of telomerase and re-elongation of telomeres in the third week of cultivation without GRN163L. These findings show that the lifespan of pancreatic cancer cells can be limited by continuous telomerase inhibition. These results should facilitate the design of future clinical trials of GRN163L in patients with pancreatic cancer.

  14. Progress in the Telomerase Inhibitor GRN163L for the Treatment of Cancer

    Institute of Scientific and Technical Information of China (English)

    喻嫦娥; 余忠华

    2013-01-01

    GRN163L (Imetelstat) is an antisense oligonucleotide drug targeting hTR (telomerase RNA). It inhibits telomerase activity, causing progressive shortening of the telomeres at the ends of tumor chromosomes and inducing tumor cells to stop dividing. As a front-line drug for therapy targeting the hTR gene, GRN163L has gradually become a focus of clinical research. This paper reviews telomeres and telomerase, the physical and chemical properties of GRN163L, its biological functions, and the progress of related research in tumor therapy.

  15. Suppression of Ov-grn-1 encoding granulin of Opisthorchis viverrini inhibits proliferation of biliary epithelial cells.

    Science.gov (United States)

    Papatpremsiri, Atiroch; Smout, Michael J; Loukas, Alex; Brindley, Paul J; Sripa, Banchob; Laha, Thewarach

    2015-01-01

    Multistep processes likely underlie cholangiocarcinogenesis induced by chronic infection with the fish-borne liver fluke, Opisthorchis viverrini. One process appears to be cellular proliferation of the host bile duct epithelia driven by excretory-secretory (ES) products of this pathogen. Specifically, the secreted growth factor Ov-GRN-1, a liver fluke granulin, is a prominent component of ES and a known driver of hyper-proliferation of cultured human and mouse cells in vitro. We show potent hyper-proliferation of human cholangiocytes induced by low nanomolar levels of recombinant Ov-GRN-1 and similar growth produced by low microgram concentrations of ES products and soluble lysates of the adult worm. To further explore the influence of Ov-GRN-1 on the flukes and the host cells, expression of Ov-grn-1 was repressed using RNA interference. Expression of Ov-grn-1 was suppressed by 95% by day 3 and by ~100% by day 7. Co-culture of Ov-grn-1-suppressed flukes with human cholangiocyte (H-69) or human cholangiocarcinoma (KKU-M214) cell lines retarded cell hyper-proliferation by 25% and 92%, respectively. Intriguingly, flukes in which expression of Ov-grn-1 was repressed were less viable in culture, suggesting that Ov-GRN-1 is an essential growth factor for survival of the adult stage of O. viverrini, at least in vitro. To summarize, specific knock-down of Ov-grn-1 reduced in vitro survival and the capacity of ES products to drive host cell proliferation. These findings may contribute to a deeper understanding of liver fluke-induced cholangiocarcinogenesis.

  16. How difficult is inference of mammalian causal gene regulatory networks?

    Directory of Open Access Journals (Sweden)

    Djordje Djordjevic

    Full Text Available Gene regulatory networks (GRNs) play a central role in systems biology, especially in the study of mammalian organ development. One key question remains largely unanswered: Is it possible to infer mammalian causal GRNs using observable gene co-expression patterns alone? We assembled two mouse GRN datasets (embryonic tooth and heart) and matching microarray gene expression profiles to systematically investigate the difficulties of mammalian causal GRN inference. The GRNs were assembled based on > 2,000 pieces of experimental genetic perturbation evidence from manually reading > 150 primary research articles. Each piece of perturbation evidence records the qualitative change of the expression of one gene following knock-down or over-expression of another gene. Our data have thorough annotation of tissue types and embryonic stages, as well as the type of regulation (activation, inhibition and no effect), which uniquely allows us to estimate both sensitivity and specificity of the inference of tissue specific causal GRN edges. Using these unprecedented datasets, we found that gene co-expression does not reliably distinguish true positive from false positive interactions, making inference of GRN in mammalian development very difficult. Nonetheless, if we have expression profiling data from genetic or molecular perturbation experiments, such as gene knock-out or signalling stimulation, it is possible to use the set of differentially expressed genes to recover causal regulatory relationships with good sensitivity and specificity. Our result supports the importance of using perturbation experimental data in causal network reconstruction. Furthermore, we showed that causal gene regulatory relationship can be highly cell type or developmental stage specific, suggesting the importance of employing expression profiles from homogeneous cell populations. This study provides essential datasets and empirical evidence to guide the development of new GRN inference...

  17. How difficult is inference of mammalian causal gene regulatory networks?

    Science.gov (United States)

    Djordjevic, Djordje; Yang, Andrian; Zadoorian, Armella; Rungrugeecharoen, Kevin; Ho, Joshua W K

    2014-01-01

    Gene regulatory networks (GRNs) play a central role in systems biology, especially in the study of mammalian organ development. One key question remains largely unanswered: Is it possible to infer mammalian causal GRNs using observable gene co-expression patterns alone? We assembled two mouse GRN datasets (embryonic tooth and heart) and matching microarray gene expression profiles to systematically investigate the difficulties of mammalian causal GRN inference. The GRNs were assembled based on > 2,000 pieces of experimental genetic perturbation evidence from manually reading > 150 primary research articles. Each piece of perturbation evidence records the qualitative change of the expression of one gene following knock-down or over-expression of another gene. Our data have thorough annotation of tissue types and embryonic stages, as well as the type of regulation (activation, inhibition and no effect), which uniquely allows us to estimate both sensitivity and specificity of the inference of tissue specific causal GRN edges. Using these unprecedented datasets, we found that gene co-expression does not reliably distinguish true positive from false positive interactions, making inference of GRN in mammalian development very difficult. Nonetheless, if we have expression profiling data from genetic or molecular perturbation experiments, such as gene knock-out or signalling stimulation, it is possible to use the set of differentially expressed genes to recover causal regulatory relationships with good sensitivity and specificity. Our result supports the importance of using perturbation experimental data in causal network reconstruction. Furthermore, we showed that causal gene regulatory relationship can be highly cell type or developmental stage specific, suggesting the importance of employing expression profiles from homogeneous cell populations. This study provides essential datasets and empirical evidence to guide the development of new GRN inference methods for...
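    The perturbation-based recovery strategy the authors advocate can be sketched as a simple differential-expression test: given replicated expression profiles before and after knocking out a gene, the differentially expressed genes are candidate (direct or indirect) targets of the perturbed gene. This is a minimal illustration; the test, cutoff, and function names are our own choices, not the paper's pipeline.

```python
import numpy as np
from scipy.stats import ttest_ind

def candidate_targets(wt, ko, alpha=0.01):
    """Infer candidate targets of a perturbed gene from expression profiles.

    wt, ko: replicates x genes matrices for wild-type and knock-out samples.
    Returns indices of genes differentially expressed at level alpha; these
    are candidate (direct or indirect) targets of the knocked-out gene.
    """
    n_genes = wt.shape[1]
    hits = []
    for g in range(n_genes):
        # Welch's t-test per gene; no multiple-testing correction in this sketch.
        t, p = ttest_ind(wt[:, g], ko[:, g], equal_var=False)
        if p < alpha:
            hits.append(g)
    return hits
```

    In practice one would add a multiple-testing correction (e.g. Benjamini-Hochberg) and a fold-change threshold, but even this bare version captures the paper's point: perturbation contrasts, unlike raw co-expression, carry directional causal information.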

  18. "Antelope": a hybrid-logic model checker for branching-time Boolean GRN analysis

    Directory of Open Access Journals (Sweden)

    Arellano Gustavo

    2011-12-01

    Full Text Available Abstract Background In Thomas' formalism for modeling gene regulatory networks (GRNs), branching time, where a state can have more than one possible future, plays a prominent role. By representing a certain degree of unpredictability, branching time can model several important phenomena, such as (a) asynchrony, (b) incompletely specified behavior, and (c) interaction with the environment. Introducing more than one possible future for a state, however, creates a difficulty for ordinary simulators, because infinitely many paths may appear, limiting ordinary simulators to statistical conclusions. Model checkers for branching time, by contrast, are able to prove properties in the presence of infinitely many paths. Results We have developed Antelope ("Analysis of Networks through TEmporal-LOgic sPEcifications", http://turing.iimas.unam.mx:8080/AntelopeWEB/), a model checker for analyzing and constructing Boolean GRNs. Currently, software systems for Boolean GRNs use branching time almost exclusively for asynchrony. Antelope, by contrast, also uses branching time for incompletely specified behavior and environment interaction. We show the usefulness of modeling these two phenomena in the development of a Boolean GRN of the Arabidopsis thaliana root stem cell niche. There are two obstacles to a direct approach when applying model checking to Boolean GRN analysis. First, ordinary model checkers normally only verify whether or not a given set of model states has a given property. In comparison, a model checker for Boolean GRNs is preferable if it reports the set of states having a desired property. Second, for efficiency, the expressiveness of many model checkers is limited, resulting in the inability to express some interesting properties of Boolean GRNs. Antelope tries to overcome these two drawbacks: Apart from reporting the set of all states having a given property, our model checker can express, at the expense of efficiency, some properties that ordinary...
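    The branching-time semantics this record refers to can be made concrete with a toy sketch: under asynchronous updating a Boolean state can have several successors, and a branching-time model checker computes the set of states satisfying a temporal property (here CTL's EF, "some path eventually reaches") as a backward-reachability fixpoint over the state space. This is a from-scratch illustration of the idea, not Antelope's implementation.

```python
from itertools import product

def async_successors(state, rules):
    # Asynchronous semantics: update one gene at a time -> branching time.
    # rules[i] maps the full state tuple to gene i's next Boolean value.
    succs = set()
    for i, rule in enumerate(rules):
        v = rule(state)
        if v != state[i]:
            s = list(state)
            s[i] = v
            succs.add(tuple(s))
    return succs or {state}   # fixed point: only successor is itself

def ef(prop_states, rules, n):
    # CTL operator EF: the set of states from which SOME path reaches
    # prop_states, computed as a backward-reachability fixpoint.
    states = list(product((0, 1), repeat=n))
    reach = set(prop_states)
    changed = True
    while changed:
        changed = False
        for s in states:
            if s not in reach and async_successors(s, rules) & reach:
                reach.add(s)
                changed = True
    return reach
```

    Note that, as the abstract argues, the checker returns the whole set of states with the property rather than a yes/no answer for one state, which is the behaviour Antelope aims for.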

  19. Cloning and Expression of Human Grn Gene%重组人Grn的克隆与表达

    Institute of Scientific and Technical Information of China (English)

    谌思; 粟艳林; 李悦; 彭小宁

    2015-01-01

    Objective: To construct a prokaryotic expression vector containing the human Grn gene and observe its expression. Methods: The Grn gene was amplified by PCR from THP-1 cDNA, digested with restriction endonucleases and subcloned into the prokaryotic expression vector pGEX-4T-2. The recombinant plasmid was transformed into competent TOP10 cells, and positive clones were identified by colony PCR and sequencing. The correctly sequenced recombinant plasmid was then transformed into competent DE3 cells for protein expression, and the expression of GRN was analyzed by Western blot. Results: PCR and double restriction digestion verified that the constructed pGEX-4T-2 vector carried the complete target fragment, and the sequence of the insert was fully consistent with the designed Grn sequence. After transformation into DE3, Western blot detected a GST fusion protein of 91 kD in the supernatant of the bacterial lysate. Conclusion: A prokaryotic expression plasmid containing the human Grn gene was successfully constructed and expresses the target protein, laying a foundation for further study of the function of Grn.

  20. Emergent adaptive behaviour of GRN-controlled simulated robots in a changing environment

    Directory of Open Access Journals (Sweden)

    Yao Yao

    2016-12-01

    Full Text Available We developed a bio-inspired robot controller combining an artificial genome with an agent-based control system. The genome encodes a gene regulatory network (GRN that is switched on by environmental cues and, following the rules of transcriptional regulation, provides output signals to actuators. Whereas the genome represents the full encoding of the transcriptional network, the agent-based system mimics the active regulatory network and signal transduction system also present in naturally occurring biological systems. Using such a design that separates the static from the conditionally active part of the gene regulatory network contributes to a better general adaptive behaviour. Here, we have explored the potential of our platform with respect to the evolution of adaptive behaviour, such as preying when food becomes scarce, in a complex and changing environment and show through simulations of swarm robots in an A-life environment that evolution of collective behaviour likely can be attributed to bio-inspired evolutionary processes acting at different levels, from the gene and the genome to the individual robot and robot population.

  1. myGRN: a database and visualisation system for the storage and analysis of developmental genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Bacha Jamil

    2009-06-01

    Full Text Available Abstract Background Biological processes are regulated by complex interactions between transcription factors and signalling molecules, collectively described as Genetic Regulatory Networks (GRNs). The characterisation of these networks to reveal regulatory mechanisms is a long-term goal of many laboratories. However compiling, visualising and interacting with such networks is non-trivial. Current tools and databases typically focus on GRNs within simple, single-celled organisms. However, data is available within the literature describing regulatory interactions in multi-cellular organisms, although not in any systematic form. This is particularly true within the field of developmental biology, where regulatory interactions should also be tagged with information about the time and anatomical location of development in which they occur. Description We have developed myGRN (http://www.myGRN.org), a web application for storing and interrogating interaction data, with an emphasis on developmental processes. Users can submit interaction and gene expression data, either curated from published sources or derived from their own unpublished data. All interactions associated with publications are publicly visible, and unpublished interactions can only be shared between collaborating labs prior to publication. Users can group interactions into discrete networks based on specific biological processes. Various filters allow dynamic production of network diagrams based on a range of information including tissue location, developmental stage or basic topology. Individual networks can be viewed using myGRV, a tool focused on displaying developmental networks, or exported in a range of formats compatible with third party tools. Networks can also be analysed for the presence of common network motifs. We demonstrate the capabilities of myGRN using a network of zebrafish interactions integrated with expression data from the zebrafish database, ZFIN. Conclusion Here we...

  2. Bagging statistical network inference from large-scale gene expression data.

    OpenAIRE

    Ricardo de Matos Simoes; Frank Emmert-Streib

    2012-01-01

    Modern biology and medicine aim at identifying the molecular and cellular causes of biological functions and diseases. Gene regulatory networks (GRNs) inferred from gene expression data are considered an important aid for this research by providing a map of molecular interactions. Hence, GRNs have the potential to enable and enhance basic as well as applied research in the life sciences. In this paper, we introduce a new method called BC3NET for inferring causal gene regulatory networks from large-sc...
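The bagging strategy this abstract describes, inferring a network on each bootstrap resample of the expression data and keeping edges that recur, can be illustrated generically. In the toy sketch below, thresholded absolute Pearson correlation stands in for the mutual-information estimator that BC3NET actually uses; the threshold, vote fraction and synthetic data are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def infer_edges(expr, threshold=0.7):
    """Toy single-network inference: threshold absolute Pearson correlation.
    (A stand-in for the mutual-information estimator used by BC3NET.)"""
    corr = np.corrcoef(expr, rowvar=False)       # genes are columns
    np.fill_diagonal(corr, 0.0)
    return np.abs(corr) > threshold              # boolean adjacency matrix

def bagged_network(expr, n_bootstrap=100, threshold=0.7, vote=0.5, seed=0):
    """Aggregate edge votes over bootstrap resamples of the samples (rows)."""
    rng = np.random.default_rng(seed)
    n_samples, n_genes = expr.shape
    votes = np.zeros((n_genes, n_genes))
    for _ in range(n_bootstrap):
        idx = rng.integers(0, n_samples, size=n_samples)  # resample with replacement
        votes += infer_edges(expr[idx], threshold)
    return votes / n_bootstrap >= vote           # keep edges winning >= vote fraction

# Synthetic example: gene 1 is driven by gene 0, gene 2 is independent noise.
rng = np.random.default_rng(42)
g0 = rng.normal(size=200)
expr = np.column_stack([g0, g0 + 0.1 * rng.normal(size=200), rng.normal(size=200)])
net = bagged_network(expr)
```

Averaging edge votes over resamples is what makes the aggregate network more stable than any single inferred network.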

  3. Variability of the clinical phenotype in an Italian family with dementia associated with an intronic deletion in the GRN gene.

    Science.gov (United States)

    Marcon, Gabriella; Rossi, Giacomina; Giaccone, Giorgio; Giovagnoli, Anna Rita; Piccoli, Elena; Zanini, Sergio; Geatti, Onelio; Toso, Vito; Grisoli, Marina; Tagliavini, Fabrizio

    2011-01-01

    Mutations in the progranulin gene (GRN) were recently identified as an important cause of familial frontotemporal dementia (FTD). More than 60 pathogenic mutations have been reported up to now and prominent phenotypic variability within and among affected kindreds has been described. We have studied an Italian family with clinical evidence of dementia, and here we report detailed clinical records, imaging, sequential neurological examinations, cognitive assessments, and genetic analysis of three affected members of the same generation. Genetic analysis revealed the presence of the null mutation IVS6 + 5_8delGTGA in GRN, leading to haploinsufficiency, as documented by mRNA analysis. The mutation is associated with wide variation of the clinical phenotype, ranging from FTD to Alzheimer's disease and to a rapidly-progressive dementia. In summary, the patients of this kindred showed highly variable clinical features that do not have a close correspondence with the pattern of the cerebral atrophy. Our data extend the phenotypic spectrum and the complexity of neurodegenerative diseases linked to GRN mutations.

  4. Single-cell and coupled GRN models of cell patterning in the Arabidopsis thaliana root stem cell niche

    Directory of Open Access Journals (Sweden)

    Alvarez-Buylla Elena R

    2010-10-01

    Background: Recent experimental work has uncovered some of the genetic components required to maintain the Arabidopsis thaliana root stem cell niche (SCN) and its structure. Two main pathways are involved. One pathway depends on the genes SHORTROOT and SCARECROW and the other depends on the PLETHORA genes, which have been proposed to constitute the auxin readouts. Recent evidence suggests that a regulatory circuit, composed of WOX5 and CLE40, also contributes to the SCN maintenance. Yet, we still do not understand how the niche is dynamically maintained and patterned or if the uncovered molecular components are sufficient to recover the observed gene expression configurations that characterize the cell types within the root SCN. Mathematical and computational tools have proven useful in understanding the dynamics of cell differentiation. Hence, to further explore root SCN patterning, we integrated available experimental data into dynamic Gene Regulatory Network (GRN) models and addressed if these are sufficient to attain observed gene expression configurations in the root SCN in a robust and autonomous manner. Results: We found that an SCN GRN model based only on experimental data did not reproduce the configurations observed within the root SCN. We developed several alternative GRN models that recover these expected stable gene configurations. Such models incorporate a few additional components and interactions in addition to those that have been uncovered. The recovered configurations are stable to perturbations, and the models are able to recover the observed gene expression profiles of almost all the mutants described so far. However, the robustness of the postulated GRNs is not as high as that of other previously studied networks. Conclusions: These models are the first published approximations for a dynamic mechanism of the A. thaliana root SCN cellular patterning.
Our model is useful to formally show that the data now available are not

  5. Expression, refolding and purification of Ov-GRN-1, a granulin-like growth factor from the carcinogenic liver fluke, that causes proliferation of mammalian host cells.

    Science.gov (United States)

    Smout, Michael J; Mulvenna, Jason P; Jones, Malcolm K; Loukas, Alex

    2011-10-01

    Granulins (GRNs) are potent growth factors that are upregulated in many aggressive cancers from a wide range of organs. GRNs form tight, disulphide bonded, beta hairpin stacks, making them difficult to express in recombinant form. We recently described Ov-GRN-1, a GRN family member secreted by the carcinogenic liver fluke of humans, Opisthorchis viverrini, and showed that recombinant Ov-GRN-1 expressed and refolded from Escherichia coli caused proliferation of mammalian cell lines at nanomolar concentrations. We now report on an optimized method to express and purify monomeric Ov-GRN-1 in E. coli using a straightforward and scalable purification and refolding process. Purified monomeric protein caused proliferation at nanomolar concentrations of cancerous and non-cancerous cell lines derived from human bile duct tissue. The expression and purification method we describe herein will serve as a backbone upon which to develop expression and purification processes for recombinant GRNs from other organisms, accelerating research on this intriguing family of proteins.

  6. Profiling of Ubiquitination Pathway Genes in Peripheral Cells from Patients with Frontotemporal Dementia due to C9ORF72 and GRN Mutations

    Directory of Open Access Journals (Sweden)

    Maria Serpente

    2015-01-01

    Full Text Available We analysed the expression levels of 84 key genes involved in the regulated degradation of cellular protein by the ubiquitin-proteasome system in peripheral cells from patients with frontotemporal dementia (FTD due to C9ORF72 and GRN mutations, as compared with sporadic FTD and age-matched controls. A SABiosciences PCR array was used to investigate the transcription profile in a discovery population consisting of six patients each in C9ORF72, GRN, sporadic FTD and age-matched control groups. A generalized down-regulation of gene expression compared with controls was observed in C9ORF72 expansion carriers and sporadic FTD patients. In particular, in both groups, four genes, UBE2I, UBE2Q1, UBE2E1 and UBE2N, were down-regulated at a statistically significant (p < 0.05 level. All of them encode for members of the E2 ubiquitin-conjugating enzyme family. In GRN mutation carriers, no statistically significant deregulation of ubiquitination pathway genes was observed, except for the UBE2Z gene, which displays E2 ubiquitin conjugating enzyme activity, and was found to be statistically significant up-regulated (p = 0.006. These preliminary results suggest that the proteasomal degradation pathway plays a role in the pathogenesis of FTD associated with TDP-43 pathology, although different proteins are altered in carriers of GRN mutations as compared with carriers of the C9ORF72 expansion.

  7. Defining the association of TMEM106B variants among frontotemporal lobar degeneration patients with GRN mutations and C9orf72 repeat expansions.

    Science.gov (United States)

    Lattante, Serena; Le Ber, Isabelle; Galimberti, Daniela; Serpente, Maria; Rivaud-Péchoux, Sophie; Camuzat, Agnès; Clot, Fabienne; Fenoglio, Chiara; Scarpini, Elio; Brice, Alexis; Kabashi, Edor

    2014-11-01

    TMEM106B was identified as a risk factor for frontotemporal lobar degeneration (FTD) with TAR DNA-binding protein 43 kDa inclusions. It has been reported that variants in this gene are genetic modifiers of the disease and that this association is stronger in patients carrying a GRN mutation or a pathogenic expansion in chromosome 9 open reading frame 72 (C9orf72) gene. Here, we investigated the contribution of TMEM106B polymorphisms in cohorts of FTD and FTD with amyotrophic lateral sclerosis patients from France and Italy. Patients carrying the C9orf72 expansion (n = 145) and patients with GRN mutations (n = 76) were compared with a group of FTD patients (n = 384) negative for mutations and to a group of healthy controls (n = 552). In our cohorts, the presence of the C9orf72 expansion did not correlate with TMEM106B genotypes but the association was very strong in individuals with pathogenic GRN mutations (p = 9.54 × 10(-6)). Our data suggest that TMEM106B genotypes differ in FTD patient cohorts and strengthen the protective role of TMEM106B in GRN carriers. Further studies are needed to determine whether TMEM106B polymorphisms are associated with other genetic causes for FTD, including C9orf72 repeat expansions.

  8. Variation at GRN 3′-UTR rs5848 is not associated with a risk of frontotemporal lobar degeneration in Dutch population

    NARCIS (Netherlands)

    J. Simón-Sánchez (Javier); H. Seelaar (Harro); Z. Bochdanovits (Zoltan); D.J.H. Deeg (Dorly); J.C. van Swieten (John); P. Heutink (Peter)

    2009-01-01

    Background: A single nucleotide polymorphism (rs5848), located in the 3′-untranslated region of GRN, has recently been associated with a risk of frontotemporal lobar degeneration (FTLD) in a North American population, particularly in pathologically confirmed cases with neural inclusions

  9. New regulatory circuit controlling spatial and temporal gene expression in the sea urchin embryo oral ectoderm GRN.

    Science.gov (United States)

    Li, Enhu; Materna, Stefan C; Davidson, Eric H

    2013-10-01

    The sea urchin oral ectoderm gene regulatory network (GRN) model has increased in complexity as additional genes are added to it, revealing its multiple spatial regulatory state domains. The formation of the oral ectoderm begins with an oral-aboral redox gradient, which is interpreted by the cis-regulatory system of the nodal gene to cause its expression on the oral side of the embryo. Nodal signaling drives cohorts of regulatory genes within the oral ectoderm and its derived subdomains. Activation of these genes occurs sequentially, spanning the entire blastula stage. During this process the stomodeal subdomain emerges inside of the oral ectoderm, and bilateral subdomains defining the lateral portions of the future ciliary band emerge adjacent to the central oral ectoderm. Here we examine two regulatory genes encoding repressors, sip1 and ets4, which selectively prevent transcription of oral ectoderm genes until their expression is cleared from the oral ectoderm as an indirect consequence of Nodal signaling. We show that the timing of transcriptional de-repression of sip1 and ets4 targets which occurs upon their clearance explains the dynamics of oral ectoderm gene expression. In addition two other repressors, the direct Nodal target not, and the feed forward Nodal target goosecoid, repress expression of regulatory genes in the central animal oral ectoderm thereby confining their expression to the lateral domains of the animal ectoderm. These results have permitted construction of an enhanced animal ectoderm GRN model highlighting the repressive interactions providing precise temporal and spatial control of regulatory gene expression. © 2013 Elsevier Inc. All rights reserved.

  10. Mutation Frequency of the Major Frontotemporal Dementia Genes, MAPT, GRN and C9ORF72 in a Turkish Cohort of Dementia Patients

    Science.gov (United States)

    Guven, Gamze; Lohmann, Ebba; Bras, Jose; Gibbs, J. Raphael; Gurvit, Hakan; Bilgic, Basar; Hanagasi, Hasmet; Rizzu, Patrizia; Heutink, Peter; Emre, Murat; Erginel-Unaltuna, Nihan; Just, Walter; Hardy, John; Singleton, Andrew; Guerreiro, Rita

    2016-01-01

    ‘Microtubule-associated protein tau’ (MAPT), ‘granulin’ (GRN) and ‘chromosome 9 open reading frame 72’ (C9ORF72) gene mutations are the major known genetic causes of frontotemporal dementia (FTD). Recent studies suggest that mutations in these genes may also be associated with other forms of dementia. Therefore, we investigated whether MAPT, GRN and C9ORF72 gene mutations are major contributors to dementia in a random, unselected Turkish cohort of dementia patients. A combination of whole-exome sequencing, Sanger sequencing and fragment analysis/Southern blot was performed in order to identify pathogenic mutations and novel variants in these genes as well as other FTD-related genes such as the ‘charged multivesicular body protein 2B’ (CHMP2B), the ‘FUS RNA binding protein’ (FUS), the ‘TAR DNA binding protein’ (TARDBP), the ‘sequestosome 1’ (SQSTM1), and the ‘valosin containing protein’ (VCP). We determined one pathogenic MAPT mutation (c.1906C>T, p.P636L) and one novel missense variant (c.38A>G, p.D13G). In GRN we identified a probably pathogenic TGAG deletion in the splice donor site of exon 6. Three patients were found to carry the GGGGCC expansions in the non-coding region of the C9ORF72 gene. In summary, a complete screening for mutations in MAPT, GRN and C9ORF72 genes revealed a frequency of 5.4% of pathogenic mutations in a random cohort of 93 Turkish index patients with dementia. PMID:27632209

  11. The homologous putative GTPases Grn1p from fission yeast and the human GNL3L are required for growth and play a role in processing of nucleolar pre-rRNA.

    Science.gov (United States)

    Du, Xianming; Rao, Malireddi R K Subba; Chen, Xue Qin; Wu, Wei; Mahalingam, Sundarasamy; Balasundaram, David

    2006-01-01

    Grn1p from fission yeast and GNL3L from human cells, two putative GTPases from the novel HSR1_MMR1 GTP-binding protein subfamily with circularly permuted G-motifs, play a critical role in maintaining normal cell growth. Deletion of Grn1 resulted in a severe growth defect, a marked reduction in mature rRNA species with a concomitant accumulation of the 35S pre-rRNA transcript, and failure to export the ribosomal protein Rpl25a from the nucleolus. Deleting any of the Grn1p G-domain motifs resulted in a null phenotype and nuclear/nucleolar localization consistent with the lack of nucleolar export of preribosomes accompanied by a distortion of nucleolar structure. Heterologous expression of GNL3L in a Δgrn1 mutant restored processing of 35S pre-rRNA, nuclear export of Rpl25a and cell growth to wild-type levels. Genetic complementation in yeast and siRNA knockdown in HeLa cells confirmed that the homologous proteins Grn1p and GNL3L are required for growth. Failure of two similar HSR1_MMR1 putative nucleolar GTPases, Nucleostemin (NS), or the dose-dependent response of breast tumor autoantigen NGP-1, to rescue Δgrn1 implied the highly specific roles of Grn1p or GNL3L in nucleolar events. Our analysis uncovers an important role for Grn1p/GNL3L within this unique group of nucleolar GTPases.

  12. Ecological Inference

    Science.gov (United States)

    King, Gary; Rosen, Ori; Tanner, Martin A.

    2004-09-01

    This collection of essays brings together a diverse group of scholars to survey the latest strategies for solving ecological inference problems in various fields. The last half-decade has witnessed an explosion of research in ecological inference--the process of trying to infer individual behavior from aggregate data. Although uncertainties and information lost in aggregation make ecological inference one of the most problematic types of research to rely on, these inferences are required in many academic fields, as well as by legislatures and the Courts in redistricting, by business in marketing research, and by governments in policy analysis.

  13. Statistical Inference and String Theory

    CERN Document Server

    Heckman, Jonathan J

    2013-01-01

    In this note we expose some surprising connections between string theory and statistical inference. We consider a large collective of agents sweeping out a family of nearby statistical models for an M-dimensional manifold of statistical fitting parameters. When the agents making nearby inferences align along a d-dimensional grid, we find that the pooled probability that the collective reaches a correct inference is the partition function of a non-linear sigma model in d dimensions. Stability under perturbations to the original inference scheme requires the agents of the collective to distribute along two dimensions. Conformal invariance of the sigma model corresponds to the condition of a stable inference scheme, directly leading to the Einstein field equations for classical gravity. By summing over all possible arrangements of the agents in the collective, we reach a string theory. We also use this perspective to quantify how much an observer can hope to learn about the internal geometry of a superstring com...

  14. Assessment of network inference methods: how to cope with an underdetermined problem.

    Directory of Open Access Journals (Sweden)

    Caroline Siegenthaler

    The inference of biological networks is an active research area in the field of systems biology. The number of network inference algorithms has grown tremendously in the last decade, underlining the importance of a fair assessment and comparison among these methods. Current assessments of the performance of an inference method typically involve the application of the algorithm to benchmark datasets and the comparison of the network predictions against the gold standard or reference networks. While the network inference problem is often deemed underdetermined, implying that the inference problem does not have a (unique) solution, the consequences of such an attribute have not been rigorously taken into consideration. Here, we propose a new procedure for assessing the performance of gene regulatory network (GRN) inference methods. The procedure takes into account the underdetermined nature of the inference problem, in which gene regulatory interactions that are inferable or non-inferable are determined based on causal inference. The assessment relies on a new definition of the confusion matrix, which excludes errors associated with non-inferable gene regulations. For demonstration purposes, the proposed assessment procedure is applied to the DREAM 4 In Silico Network Challenge. The results show a marked change in the ranking of participating methods when taking network inferability into account.
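The modified confusion matrix described in this abstract can be sketched in a few lines: gene pairs flagged as non-inferable are simply excluded before counting true/false positives and negatives. The edge sets and the inferability set below are hypothetical placeholders, not data from the paper:

```python
def assess(predicted, gold, all_pairs, inferable):
    """Confusion matrix over directed gene pairs, counting only inferable ones."""
    tp = fp = fn = tn = 0
    for pair in all_pairs:
        if pair not in inferable:        # exclude errors on non-inferable regulations
            continue
        pred, real = pair in predicted, pair in gold
        if pred and real:
            tp += 1
        elif pred and not real:
            fp += 1
        elif not pred and real:
            fn += 1
        else:
            tn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"TP": tp, "FP": fp, "FN": fn, "TN": tn,
            "precision": precision, "recall": recall}

# Hypothetical three-gene example: the false positive (C, A) is not counted
# because that interaction is deemed non-inferable from the data.
pairs = {(a, b) for a in "ABC" for b in "ABC" if a != b}
gold = {("A", "B"), ("B", "C")}
predicted = {("A", "B"), ("C", "A")}
inferable = pairs - {("C", "A")}
m = assess(predicted, gold, pairs, inferable)
```

Excluding the non-inferable pair leaves precision 1.0 and recall 0.5 here, whereas the standard confusion matrix would have charged the method for an edge no algorithm could recover.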

  15. Improving gene regulatory network inference using network topology information.

    Science.gov (United States)

    Nair, Ajay; Chetty, Madhu; Wangikar, Pramod P

    2015-09-01

    Inferring the gene regulatory network (GRN) structure from data is an important problem in computational biology. However, it is a computationally complex problem and approximate methods such as heuristic search techniques, restriction of the maximum-number-of-parents (maxP) for a gene, or an optimal search under special conditions are required. The limitations of a heuristic search are well known but literature on the detailed analysis of the widely used maxP technique is lacking. The optimal search methods require large computational time. We report the theoretical analysis and experimental results of the strengths and limitations of the maxP technique. Further, using an optimal search method, we combine the strengths of the maxP technique and the known GRN topology to propose two novel algorithms. These algorithms are implemented in a Bayesian network framework and tested on biological, realistic, and in silico networks of different sizes and topologies. They overcome the limitations of the maxP technique and show superior computational speed when compared to the current optimal search algorithms.
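The maxP technique analysed in this record, bounding the number of parents per gene during an otherwise exhaustive structure search, can be sketched for one gene with a linear-Gaussian BIC local score. The scoring convention and the synthetic data are illustrative assumptions, not the authors' implementation:

```python
import itertools
import math
import numpy as np

def bic_score(X, y):
    """BIC of a linear-Gaussian model y ~ X, a common local score for Bayesian networks."""
    n, k = X.shape[0], X.shape[1] + 1            # +1 for the intercept
    Xd = np.column_stack([np.ones(n), X])
    resid = y - Xd @ np.linalg.lstsq(Xd, y, rcond=None)[0]
    rss = float(resid @ resid) + 1e-12
    return n * math.log(rss / n) + k * math.log(n)  # lower is better

def best_parents(expr, target, max_p=2):
    """Exhaustively score every candidate parent set of size <= max_p for one gene."""
    genes = [g for g in range(expr.shape[1]) if g != target]
    best = (math.inf, ())
    for size in range(max_p + 1):
        for parents in itertools.combinations(genes, size):
            X = expr[:, list(parents)] if parents else np.empty((expr.shape[0], 0))
            best = min(best, (bic_score(X, expr[:, target]), parents))
    return best[1]

# Synthetic example: gene 2 = gene 0 + gene 1 + noise; gene 3 is irrelevant.
rng = np.random.default_rng(7)
g0, g1 = rng.normal(size=(2, 300))
expr = np.column_stack([g0, g1, g0 + g1 + 0.05 * rng.normal(size=300),
                        rng.normal(size=300)])
parents = best_parents(expr, target=2, max_p=2)
```

The sketch also shows the limitation the paper analyses: with `max_p=1` the search cannot represent the true two-parent regulation of gene 2 and must settle for a single regulator.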

  16. Causal inference

    Directory of Open Access Journals (Sweden)

    Richard Shoemaker

    2014-04-01

    Establishing causality has been a problem throughout the history of the philosophy of science. This paper discusses the philosophy of causal inference across the different schools of thought and methods, rationalism, empiricism, the inductive method, and the hypothetico-deductive method, with their pros and cons. Starting from the problem of Hume, the article also draws on the positions of Russell, Carnap, Popper and Kuhn to better understand the modern interpretation and implications of causal inference in epidemiological research.

  17. Investigating the role of rare coding variability in Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP) in late-onset Alzheimer's disease

    Science.gov (United States)

    Sassi, Celeste; Guerreiro, Rita; Gibbs, Raphael; Ding, Jinhui; Lupton, Michelle K.; Troakes, Claire; Al-Sarraj, Safa; Niblock, Michael; Gallo, Jean-Marc; Adnan, Jihad; Killick, Richard; Brown, Kristelle S.; Medway, Christopher; Lord, Jenny; Turton, James; Bras, Jose; Morgan, Kevin; Powell, John F.; Singleton, Andrew; Hardy, John

    2014-01-01

    The overlapping clinical and neuropathologic features between late-onset apparently sporadic Alzheimer's disease (LOAD), familial Alzheimer's disease (FAD), and other neurodegenerative dementias (frontotemporal dementia, corticobasal degeneration, progressive supranuclear palsy, and Creutzfeldt-Jakob disease) raise the question of whether shared genetic risk factors may explain the similar phenotype among these disparate disorders. To investigate this intriguing hypothesis, we analyzed rare coding variability in 6 Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP), in 141 LOAD patients and 179 elderly controls, neuropathologically proven, from the UK. In our cohort, 14 LOAD cases (10%) and 11 controls (6%) carry at least 1 rare variant in the genes studied. We report a novel variant in PSEN1 (p.I168T) and a rare variant in PSEN2 (p.A237V), absent in controls and both likely pathogenic. Our findings support previous studies, suggesting that (1) rare coding variability in PSEN1 and PSEN2 may influence the susceptibility for LOAD and (2) GRN, MAPT, and PRNP are not major contributors to LOAD. Thus, genetic screening is pivotal for the clinical differential diagnosis of these neurodegenerative dementias. PMID:25104557

  18. Labeling Schemes with Queries

    OpenAIRE

    2006-01-01

    We study the question of how robust the known lower bounds of labeling schemes are when one increases the number of consulted labels. Let $f$ be a function on pairs of vertices. An $f$-labeling scheme for a family of graphs $\mathcal{F}$ labels the vertices of all graphs in $\mathcal{F}$ such that for every graph $G\in\mathcal{F}$ and every two vertices $u,v\in G$, the value $f(u,v)$ can be inferred by merely inspecting the labels of $u$ and $v$. This paper introduces a natural generalization: the notion of $f$-...

  19. Urothelial cancer gene regulatory networks inferred from large-scale RNAseq, Bead and Oligo gene expression data.

    Science.gov (United States)

    de Matos Simoes, Ricardo; Dalleau, Sabine; Williamson, Kate E; Emmert-Streib, Frank

    2015-05-14

    Urothelial pathogenesis is a complex process driven by an underlying network of interconnected genes. The identification of novel genomic target regions and gene targets that drive urothelial carcinogenesis is crucial in order to improve our current limited understanding of urothelial cancer (UC) on the molecular level. The inference of genome-wide gene regulatory networks (GRN) from large-scale gene expression data provides a promising approach for a detailed investigation of the underlying network structure associated with urothelial carcinogenesis. In our study we inferred and compared three GRNs by the application of the BC3Net inference algorithm to large-scale transitional cell carcinoma gene expression data sets from Illumina RNAseq (179 samples), Illumina Bead arrays (165 samples) and Affymetrix Oligo microarrays (188 samples). We investigated the structural and functional properties of the GRNs for the identification of molecular targets associated with urothelial cancer. We found that the urothelial cancer (UC) GRNs show a significant enrichment of subnetworks that are associated with known cancer hallmarks including cell cycle, immune response, signaling, differentiation and translation. Interestingly, the most prominent subnetworks of co-located genes were found on chromosome regions 5q31.3 (RNAseq), 8q24.3 (Oligo) and 1q23.3 (Bead), which all represent known genomic regions frequently deregulated or aberrant in urothelial cancer and other cancer types. Furthermore, the identified hub genes of the individual GRNs, e.g., HID1/DMC1 (tumor development), RNF17/TDRD4 (cancer antigen) and CYP4A11 (angiogenesis/metastasis), are known cancer-associated markers. The GRNs were highly dataset specific on the interaction level between individual genes, but showed large similarities on the biological function level represented by subnetworks. Remarkably, the RNAseq UC GRN showed twice the proportion of significant functional subnetworks.
Based on our analysis of inferential

  20. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  1. A bayesian framework that integrates heterogeneous data for inferring gene regulatory networks.

    Science.gov (United States)

    Santra, Tapesh

    2014-01-01

    Reconstruction of gene regulatory networks (GRNs) from experimental data is a fundamental challenge in systems biology. A number of computational approaches have been developed to infer GRNs from mRNA expression profiles. However, expression profiles alone are proving to be insufficient for inferring GRN topologies with reasonable accuracy. Recently, it has been shown that integration of external data sources (such as gene and protein sequence information, gene ontology data, protein-protein interactions) with mRNA expression profiles may increase the reliability of the inference process. Here, I propose a new approach that incorporates transcription factor binding sites (TFBS) and physical protein interactions (PPI) among transcription factors (TFs) in a Bayesian variable selection (BVS) algorithm which can infer GRNs from mRNA expression profiles subjected to genetic perturbations. Using real experimental data, I show that the integration of TFBS and PPI data with mRNA expression profiles leads to significantly more accurate networks than those inferred from expression profiles alone. Additionally, the performance of the proposed algorithm is compared with a series of least absolute shrinkage and selection operator (LASSO) regression-based network inference methods that can also incorporate prior knowledge in the inference framework. The results of this comparison suggest that BVS can outperform LASSO regression-based methods in some circumstances.

  2. A Bayesian Framework that integrates heterogeneous data for inferring gene regulatory networks

    Directory of Open Access Journals (Sweden)

    Tapesh eSantra

    2014-05-01

    Reconstruction of gene regulatory networks (GRNs) from experimental data is a fundamental challenge in systems biology. A number of computational approaches have been developed to infer GRNs from mRNA expression profiles. However, expression profiles alone are proving to be insufficient for inferring GRN topologies with reasonable accuracy. Recently, it has been shown that integration of external data sources (such as gene and protein sequence information, gene ontology data, and protein-protein interactions) with mRNA expression profiles may increase the reliability of the inference process. Here, I propose a new approach that incorporates transcription factor binding sites (TFBS) and physical protein interactions (PPI) among transcription factors (TFs) in a Bayesian Variable Selection (BVS) algorithm which can infer GRNs from mRNA expression profiles subjected to genetic perturbations. Using real experimental data, I show that the integration of TFBS and PPI data with mRNA expression profiles leads to significantly more accurate networks than those inferred from expression profiles alone. Additionally, the performance of the proposed algorithm is compared with a series of LASSO regression-based network inference methods that can also incorporate prior knowledge in the inference framework. The results of this comparison suggest that BVS can outperform LASSO regression-based methods in some circumstances.

  3. Colour schemes

    DEFF Research Database (Denmark)

    van Leeuwen, Theo

    2013-01-01

    This chapter presents a framework for analysing colour schemes based on a parametric approach that includes not only hue, value and saturation, but also purity, transparency, luminosity, luminescence, lustre, modulation and differentiation.

  4. Isolation and Characterization of Two Lytic Bacteriophages, φSt2 and φGrn1; Phage Therapy Application for Biological Control of Vibrio alginolyticus in Aquaculture Live Feeds.

    Directory of Open Access Journals (Sweden)

    Panos G Kalatzis

    Bacterial infections are a serious problem in aquaculture since they can result in massive mortalities in farmed fish and invertebrates. Vibriosis is one of the most common diseases in marine aquaculture hatcheries and its causative agents are bacteria of the genus Vibrio mostly entering larval rearing water through live feeds, such as Artemia and rotifers. The pathogenic Vibrio alginolyticus strain V1, isolated during a vibriosis outbreak in cultured seabream, Sparus aurata, was used as host to isolate and characterize the two novel bacteriophages φSt2 and φGrn1 for phage therapy application. In vitro cell lysis experiments were performed against the bacterial host V. alginolyticus strain V1 but also against 12 presumptive Vibrio strains originating from live prey Artemia salina cultures, indicating the strong lytic efficacy of the 2 phages. In vivo administration of the phage cocktail, φSt2 and φGrn1, at MOI = 100 directly on live prey A. salina cultures, led to a 93% decrease of the presumptive Vibrio population after 4 h of treatment. The current study suggests that administration of φSt2 and φGrn1 to live preys could selectively reduce Vibrio load in fish hatcheries. Innovative and environmentally friendly solutions against bacterial diseases are more than necessary and phage therapy is one of them.

  5. Isolation and Characterization of Two Lytic Bacteriophages, φSt2 and φGrn1; Phage Therapy Application for Biological Control of Vibrio alginolyticus in Aquaculture Live Feeds.

    Science.gov (United States)

    Kalatzis, Panos G; Bastías, Roberto; Kokkari, Constantina; Katharios, Pantelis

    2016-01-01

    Bacterial infections are a serious problem in aquaculture since they can result in massive mortalities in farmed fish and invertebrates. Vibriosis is one of the most common diseases in marine aquaculture hatcheries and its causative agents are bacteria of the genus Vibrio mostly entering larval rearing water through live feeds, such as Artemia and rotifers. The pathogenic Vibrio alginolyticus strain V1, isolated during a vibriosis outbreak in cultured seabream, Sparus aurata, was used as host to isolate and characterize the two novel bacteriophages φSt2 and φGrn1 for phage therapy application. In vitro cell lysis experiments were performed against the bacterial host V. alginolyticus strain V1 but also against 12 presumptive Vibrio strains originating from live prey Artemia salina cultures, indicating the strong lytic efficacy of the 2 phages. In vivo administration of the phage cocktail, φSt2 and φGrn1, at MOI = 100 directly on live prey A. salina cultures, led to a 93% decrease of the presumptive Vibrio population after 4 h of treatment. The current study suggests that administration of φSt2 and φGrn1 to live preys could selectively reduce Vibrio load in fish hatcheries. Innovative and environmentally friendly solutions against bacterial diseases are more than necessary and phage therapy is one of them.

  6. Isolation and Characterization of Two Lytic Bacteriophages, φSt2 and φGrn1; Phage Therapy Application for Biological Control of Vibrio alginolyticus in Aquaculture Live Feeds

    Science.gov (United States)

    Kalatzis, Panos G.; Bastías, Roberto; Kokkari, Constantina; Katharios, Pantelis

    2016-01-01

    Bacterial infections are a serious problem in aquaculture, since they can result in massive mortalities in farmed fish and invertebrates. Vibriosis is one of the most common diseases in marine aquaculture hatcheries, and its causative agents are bacteria of the genus Vibrio, mostly entering larval rearing water through live feeds such as Artemia and rotifers. The pathogenic Vibrio alginolyticus strain V1, isolated during a vibriosis outbreak in cultured seabream, Sparus aurata, was used as host to isolate and characterize the two novel bacteriophages φSt2 and φGrn1 for phage therapy application. In vitro cell lysis experiments were performed against the bacterial host V. alginolyticus strain V1, but also against 12 presumptive Vibrio strains originating from live prey Artemia salina cultures, indicating the strong lytic efficacy of the two phages. In vivo administration of the phage cocktail, φSt2 and φGrn1, at MOI = 100 directly to live prey A. salina cultures led to a 93% decrease of the presumptive Vibrio population after 4 h of treatment. The current study suggests that administration of φSt2 and φGrn1 to live prey could selectively reduce the Vibrio load in fish hatcheries. Innovative and environmentally friendly solutions against bacterial diseases are more than necessary, and phage therapy is one of them. PMID:26950336

  7. Inferring regulatory networks from expression data using tree-based methods.

    Directory of Open Access Journals (Sweden)

    Vân Anh Huynh-Thu

    Full Text Available One of the pressing open problems of computational systems biology is the elucidation of the topology of genetic regulatory networks (GRNs) using high-throughput genomic data, in particular microarray gene expression data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) challenge aims to evaluate the success of GRN inference algorithms on benchmarks of simulated data. In this article, we present GENIE3, a new algorithm for the inference of GRNs that was the best performer in the DREAM4 In Silico Multifactorial challenge. GENIE3 decomposes the prediction of a regulatory network between p genes into p different regression problems. In each of the regression problems, the expression pattern of one of the genes (the target gene) is predicted from the expression patterns of all the other genes (the input genes), using tree-based ensemble methods (Random Forests or Extra-Trees). The importance of an input gene in the prediction of the target gene expression pattern is taken as an indication of a putative regulatory link. Putative regulatory links are then aggregated over all genes to provide a ranking of interactions from which the whole network is reconstructed. In addition to performing well on the DREAM4 In Silico Multifactorial challenge simulated data, we show that GENIE3 compares favorably with existing algorithms in deciphering the genetic regulatory network of Escherichia coli. It makes no assumption about the nature of gene regulation, can deal with combinatorial and non-linear interactions, produces directed GRNs, and is fast and scalable. In conclusion, we propose a new algorithm for GRN inference that performs well on both synthetic and real gene expression data. The algorithm, based on feature selection with tree-based ensemble methods, is simple and generic, making it adaptable to other types of genomic data and interactions.
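    GENIE3's decompose-then-aggregate strategy can be sketched in a few lines. As an illustration only, a ridge regression stands in for the Random Forest/Extra-Trees importance scores of the actual algorithm, and the three-gene toy data are invented:

```python
import numpy as np

def genie3_like_ranking(expr, ridge=1e-2):
    """Rank putative regulatory links by decomposing network inference into
    one regression problem per target gene (GENIE3's strategy). A ridge
    regression stands in for the Random Forests of the original algorithm;
    |coefficient| plays the role of feature importance."""
    n_samples, p = expr.shape
    X = (expr - expr.mean(0)) / expr.std(0)      # standardize each gene
    scores = np.zeros((p, p))                    # scores[i, j]: evidence that gene i regulates gene j
    for j in range(p):
        inputs = [i for i in range(p) if i != j]
        A = X[:, inputs]
        w = np.linalg.solve(A.T @ A + ridge * np.eye(p - 1), A.T @ X[:, j])
        for k, i in enumerate(inputs):
            scores[i, j] = abs(w[k])
    # aggregate all target-wise problems into one global ranking of directed edges
    edges = [(i, j, scores[i, j]) for i in range(p) for j in range(p) if i != j]
    return sorted(edges, key=lambda e: -e[2])

# toy data: gene 0 drives gene 1 (x1 ~ 0.9 * x0); gene 2 is independent noise
rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
x1 = 0.9 * x0 + 0.1 * rng.normal(size=200)
x2 = rng.normal(size=200)
ranking = genie3_like_ranking(np.column_stack([x0, x1, x2]))
print(ranking[0][:2])   # the strongest link involves genes 0 and 1
```

    Note that a linear stand-in cannot separate the two directions of the 0-1 link; the tree ensembles of the real algorithm, combined with time-series or knockout designs, are what make the inferred edges directed.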

  8. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Directory of Open Access Journals (Sweden)

    Xiangyun Xiao

    Full Text Available The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we propose a novel asynchronous parallel framework that improves the accuracy and lowers the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from the Dialogue for Reverse Engineering Assessments and Methods (DREAM) challenge, the experimentally determined GRN of Escherichia coli, and one published dataset that contains more than 10,000 genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.
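    The splitting step can be illustrated on a single modular subnetwork: given expression trajectories, a linear ODE model dx/dt = Wx is fitted by least squares on finite differences. This is only a minimal stand-in for the paper's ODE-based optimization; the linear model, the finite differencing, and the hand-picked module are all assumptions for illustration:

```python
import numpy as np

def fit_module(traj, dt, module, candidates):
    """Estimate ODE parameters dx_g/dt = sum_h W[g, h] * x_h for the genes in
    one module, using only that module's candidate regulators (the splitting
    idea). Forward differences stand in for a full ODE optimization."""
    dxdt = (traj[1:] - traj[:-1]) / dt           # finite-difference derivatives
    X = traj[:-1][:, candidates]
    W = np.zeros((len(module), len(candidates)))
    for r, g in enumerate(module):
        W[r], *_ = np.linalg.lstsq(X, dxdt[:, g], rcond=None)
    return W

# toy two-gene module with known dynamics: dx0/dt = -x0, dx1/dt = x0 - x1
dt, T = 0.01, 500
traj = np.zeros((T, 2))
traj[0] = [1.0, 0.0]
for t in range(T - 1):                           # explicit Euler integration
    x0, x1 = traj[t]
    traj[t + 1] = traj[t] + dt * np.array([-x0, x0 - x1])

W = fit_module(traj, dt, module=[0, 1], candidates=[0, 1])
print(np.round(W, 2))                            # recovers [[-1, 0], [1, -1]]
```

    In the full scheme each subnetwork is fitted in parallel, and the asynchronously exchanged boundary genes couple the modules back into one whole-network parameter set.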

  9. Optimal probabilistic dense coding schemes

    Science.gov (United States)

    Kögler, Roger A.; Neves, Leonardo

    2017-04-01

    Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) The message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade-off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only for qudits (d-level quantum systems with d > 2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.
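    The two endpoint strategies (i) and (ii) have textbook closed forms for a pair of pure qubit states with real overlap s: the Helstrom minimum-error probability (1 - sqrt(1 - s^2))/2 and the unambiguous-discrimination success probability 1 - s. The interpolating scheme of the paper trades between these endpoints; a quick numeric check (the overlap value below is arbitrary):

```python
import numpy as np

# Two messages carried by non-orthogonal pure states |a>, |b> with overlap s = <a|b>.
s = 0.6

# Strategy (i), minimum-error (Helstrom) decoding: always infer, with error probability
p_err_helstrom = 0.5 * (1 - np.sqrt(1 - s**2))

# Strategy (ii), unambiguous decoding: never err, but succeed only with probability
p_succ_unambiguous = 1 - s

print(round(p_err_helstrom, 3), round(p_succ_unambiguous, 3))
```

    For s = 0.6 the decoder either accepts a 10% error rate or identifies the message with certainty only 40% of the time; the separation-based scheme lets it choose any confidence level between the two.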

  10. Gauging Variational Inference

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ahn, Sungsoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of); Shin, Jinwoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2017-05-25

    Computing the partition function is the most important statistical inference task arising in applications of Graphical Models (GMs). Since it is computationally intractable, approximate methods are used to resolve the issue in practice, where mean-field (MF) and belief propagation (BP) are arguably the most popular and successful approaches of a variational type. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving MF and BP, respectively. Both provide lower bounds for the partition function by utilizing the so-called gauge transformation, which modifies factors of the GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs with a single loop of a special structure, even though the bare MF and BP perform badly in this case. Our extensive experiments, on complete GMs of relatively small size and on large GMs (up to 300 variables), confirm that the newly proposed algorithms outperform and generalize MF and BP.
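    The claim that mean-field yields a lower bound on the partition function can be verified directly on a tiny single-loop model. The 3-spin example and couplings below are illustrative, not taken from the paper:

```python
import numpy as np
from itertools import product

# Exact log-partition function of a 3-spin single-loop pairwise model, versus
# the naive mean-field lower bound  log Z >= E_q[E(s)] + H(q)  for factorized q.
J = {(0, 1): 0.4, (1, 2): 0.4, (0, 2): 0.4}

def energy(s):
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

logZ = np.log(sum(np.exp(energy(s)) for s in product([-1, 1], repeat=3)))

def mf_bound(m):
    """Mean-field free-energy bound for a product distribution q with
    magnetizations m (Gibbs inequality guarantees mf_bound(m) <= log Z)."""
    expect = sum(Jij * m[i] * m[j] for (i, j), Jij in J.items())
    p = (1 + np.asarray(m)) / 2
    H = -np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    return expect + H

# brute-force the best factorized bound over a magnetization grid
grid = np.linspace(-0.95, 0.95, 21)
best = max(mf_bound(m) for m in product(grid, repeat=3))
print(best <= logZ)   # the MF bound never exceeds the true log Z
```

    The gauge transformation of G-MF tightens exactly this kind of bound, without ever changing logZ itself.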

  11. SEMANTIC PATCH INFERENCE

    DEFF Research Database (Denmark)

    Andersen, Jesper

    2009-01-01

    Collateral evolution is the problem of updating several library-using programs in response to API changes in the used library. In this dissertation we address the issue of understanding collateral evolutions by automatically inferring a high-level specification of the changes evident in a given set ... specifications inferred by spdiff in Linux are shown. We find that the inferred specifications concisely capture the actual collateral evolution performed in the examples.

  12. Inference in `poor` languages

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules (`poor` languages) are considered. The problem of the existence of a finite, complete, and consistent inference rule system for a `poor` language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  13. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
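    The surrogate-accelerated posterior evaluation described above can be sketched in one dimension: an offline polynomial surrogate replaces the expensive forward model inside a Metropolis sampler. All model details below (the toy forward map, the priors, the step sizes) are assumptions for illustration, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(theta):
    """Stand-in for an expensive forward model (e.g. a PDE solve)."""
    return np.sin(theta) + 0.5 * theta

theta_true, sigma = 0.8, 0.05
y_obs = forward(theta_true)            # noise-free observation, for simplicity

# offline stage: fit a cheap polynomial surrogate over the prior's support
ts = np.linspace(-2.0, 2.0, 41)
coef = np.polyfit(ts, forward(ts), deg=7)

def log_post(t):                       # Gaussian likelihood x standard-normal prior
    return -0.5 * ((y_obs - np.polyval(coef, t)) / sigma) ** 2 - 0.5 * t ** 2

# online stage: the Metropolis walk evaluates only the cheap surrogate posterior
t, chain = 0.0, []
for _ in range(20000):
    prop = t + 0.3 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(t):
        t = prop
    chain.append(t)

post_mean = float(np.mean(chain[5000:]))
print(round(post_mean, 2))             # concentrates near theta_true = 0.8
```

    The MCMC chain above calls the surrogate thousands of times but the true forward model only 41 times, which is the source of the acceleration; spectral (polynomial chaos) surrogates play the same role for multidimensional fields.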

  14. Knowledge and inference

    CERN Document Server

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in artificial intelligence...

  15. Analysis of the GRNs Inference by Using Tsallis Entropy and a Feature Selection Approach

    Science.gov (United States)

    Lopes, Fabrício M.; de Oliveira, Evaldo A.; Cesar, Roberto M.

    An important problem in the bioinformatics field is to understand how genes are regulated and interact through gene networks. This knowledge can be helpful for many applications, such as disease treatment design and drug development. For this reason, it is very important to uncover the functional relationships among genes and then to construct the gene regulatory network (GRN) from temporal expression data. However, this task usually involves data with a large number of variables and a small number of observations. There is therefore a strong motivation to use pattern recognition and dimensionality reduction approaches. In particular, feature selection is especially important in order to select the most important predictor genes that can explain some phenomena associated with the target genes. This work presents a first study of the sensitivity of entropy-based methods to the entropy functional form, applied to the problem of topology recovery of GRNs. The generalized entropy proposed by Tsallis is used to study this sensitivity. The inference process is based on a feature selection approach, which is applied to simulated temporal expression data generated by an artificial gene network (AGN) model. The inferred GRNs are validated in terms of global network measures. Some interesting conclusions can be drawn from the experimental results, as reported for the first time in the present paper.
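    The Tsallis entropy S_q = (1 - Σ_i p_i^q)/(q - 1) and its use as a feature-selection criterion can be sketched as follows. The binarized toy time series and the choice q = 2 are assumptions for illustration, not the paper's AGN setup:

```python
import numpy as np
from collections import Counter

def tsallis(labels, q=2.0):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); recovers the
    Shannon entropy in the limit q -> 1."""
    n = len(labels)
    p = np.array([c / n for c in Counter(labels).values()])
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def cond_tsallis(target, predictor, q=2.0):
    """Mean Tsallis entropy of the target within each predictor state; a low
    value means the predictor explains the target well."""
    target, predictor = np.asarray(target), np.asarray(predictor)
    return sum((predictor == v).mean() * tsallis(target[predictor == v], q)
               for v in set(predictor.tolist()))

# toy binarized time series: the target at time t+1 copies gene A at time t,
# while gene B is unrelated noise, so feature selection should prefer A
rng = np.random.default_rng(3)
A = rng.integers(0, 2, 300)
B = rng.integers(0, 2, 300)
target = np.roll(A, 1)                  # target(t+1) = A(t)

score_A = cond_tsallis(target[1:], A[:-1])
score_B = cond_tsallis(target[1:], B[:-1])
print(score_A < score_B)                # lower residual entropy: better predictor
```

    Sweeping q in the functions above is precisely the kind of sensitivity experiment the abstract describes: the ranking of candidate predictors may change with the entropic form.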

  16. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  17. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques. Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of distributions...

  18. Surrogate based approaches to parameter inference in ocean models

    KAUST Repository

    Knio, Omar

    2016-01-06

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer wind drag, bottom drag, and internal mixing coefficients.

  19. Active inference and robot control: a case study

    Science.gov (United States)

    Nizard, Ange; Friston, Karl; Pezzulo, Giovanni

    2016-01-01

    Active inference is a general framework for perception and action that is gaining prominence in computational and systems neuroscience but is less known outside these fields. Here, we discuss a proof-of-principle implementation of the active inference scheme for the control of the 7-DoF arm of a (simulated) PR2 robot. By manipulating visual and proprioceptive noise levels, we show under which conditions robot control under the active inference scheme is accurate. Besides accurate control, our analysis of the internal system dynamics (e.g. the dynamics of the hidden states that are inferred during the inference) sheds light on key aspects of the framework such as the quintessentially multimodal nature of control and the differential roles of proprioception and vision. In the discussion, we consider the potential importance of being able to implement active inference in robots. In particular, we briefly review the opportunities for modelling psychophysiological phenomena such as sensory attenuation and related failures of gain control, of the sort seen in Parkinson's disease. We also consider the fundamental difference between active inference and optimal control formulations, showing that in the former the heavy lifting shifts from solving a dynamical inverse problem to creating deep forward or generative models with dynamics, whose attracting sets prescribe desired behaviours. PMID:27683002

  20. Active inference and robot control: a case study.

    Science.gov (United States)

    Pio-Lopez, Léo; Nizard, Ange; Friston, Karl; Pezzulo, Giovanni

    2016-09-01

    Active inference is a general framework for perception and action that is gaining prominence in computational and systems neuroscience but is less known outside these fields. Here, we discuss a proof-of-principle implementation of the active inference scheme for the control of the 7-DoF arm of a (simulated) PR2 robot. By manipulating visual and proprioceptive noise levels, we show under which conditions robot control under the active inference scheme is accurate. Besides accurate control, our analysis of the internal system dynamics (e.g. the dynamics of the hidden states that are inferred during the inference) sheds light on key aspects of the framework such as the quintessentially multimodal nature of control and the differential roles of proprioception and vision. In the discussion, we consider the potential importance of being able to implement active inference in robots. In particular, we briefly review the opportunities for modelling psychophysiological phenomena such as sensory attenuation and related failures of gain control, of the sort seen in Parkinson's disease. We also consider the fundamental difference between active inference and optimal control formulations, showing that in the former the heavy lifting shifts from solving a dynamical inverse problem to creating deep forward or generative models with dynamics, whose attracting sets prescribe desired behaviours.

  1. Generalized Group Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The concept of a generalized group signature scheme is presented. Based on the generalized secret sharing scheme proposed by Lin and Harn, a non-interactive approach is designed for realizing such a generalized group signature scheme. Using the new scheme, the authorized subsets of the group, in which the group members can cooperate to produce a valid signature for any message, can be randomly specified.
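    The secret-sharing mechanism underlying such schemes can be illustrated with plain Shamir (t, n) threshold sharing over a prime field; the Lin-Harn generalized scheme extends this so that arbitrary authorized subsets, not just any t members, can cooperate. The prime, the parameters, and the toy secret below are illustrative only:

```python
import random

# Minimal Shamir (t, n) threshold sharing over a prime field: any t shareholders
# can reconstruct the group secret (e.g. a signing key); fewer learn nothing.
P = 2 ** 127 - 1                       # a Mersenne prime field modulus

def share(secret, t, n, rng=random.Random(7)):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P   # Fermat inverse
    return secret

shares = share(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)   # any 3 shares suffice
print(reconstruct(shares[2:]) == 123456789)
```

    In a group signature built on this idea, the reconstructed value is never revealed directly; the authorized subset instead combines partial signatures derived from their shares.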

  2. Finite Boltzmann schemes

    NARCIS (Netherlands)

    Sman, van der R.G.M.

    2006-01-01

    In the special case of relaxation parameter equal to 1, lattice Boltzmann schemes for (convection) diffusion and fluid flow are equivalent to finite difference/volume (FD) schemes, and are thus coined finite Boltzmann (FB) schemes. We show that the equivalence is inherent to the homology of the ...
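    The equivalence at relaxation parameter 1 is easy to check numerically for pure diffusion on a periodic 1-D lattice. This D1Q2 sketch, with assumed initial data, is an illustration of the equivalence rather than the paper's derivation:

```python
import numpy as np

# For relaxation parameter 1, the D1Q2 lattice Boltzmann scheme for pure
# diffusion collapses to the explicit finite-difference update
#   rho_new[x] = (rho[x-1] + rho[x+1]) / 2,
# i.e. a finite Boltzmann (FB) scheme, on a periodic 1-D lattice.
N = 64
rho0 = np.exp(-0.5 * ((np.arange(N) - N / 2) / 4.0) ** 2)   # Gaussian pulse

# lattice Boltzmann: two populations moving right/left; full relaxation
# replaces each f_i entirely by its equilibrium f_i^eq = rho / 2
f_plus = f_minus = rho0 / 2
rho_lb = rho0.copy()
for _ in range(10):
    f_plus, f_minus = np.roll(f_plus, 1), np.roll(f_minus, -1)   # streaming
    rho_lb = f_plus + f_minus
    f_plus = f_minus = rho_lb / 2                                # collision

# the equivalent finite-difference scheme
rho_fd = rho0.copy()
for _ in range(10):
    rho_fd = 0.5 * (np.roll(rho_fd, 1) + np.roll(rho_fd, -1))

print(np.allclose(rho_lb, rho_fd))   # identical evolutions
```

    Both updates conserve total mass exactly, and step for step they produce the same density field, which is the FB equivalence in its simplest setting.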

  3. MIDI Programming in Scheme

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2010-01-01

    A Scheme representation of Standard MIDI Files is proposed. The Scheme expressions are defined and constrained by an XML-language, which in the starting point is inspired by a MIDI XML event language made by the MIDI Manufactures Association. The representation of Standard MIDI Files in Scheme ma...

  5. The Bayes Inference Engine

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.

    1996-04-01

    The authors are developing a computer application, called the Bayes Inference Engine, to provide the means to make inferences about models of physical reality within a Bayesian framework. The construction of complex nonlinear models is achieved by a fully object-oriented design. The models are represented by a data-flow diagram that may be manipulated by the analyst through a graphical programming environment. Maximum a posteriori solutions are achieved using a general, gradient-based optimization algorithm. The application incorporates a new technique of estimating and visualizing the uncertainties in specific aspects of the model.
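    The gradient-based MAP optimization at the core of such an engine can be sketched for a linear-Gaussian model, where a closed-form solution is available as a check. The model sizes, prior precision, and step size below are arbitrary assumptions:

```python
import numpy as np

# Maximum a posteriori (MAP) estimation by gradient descent for a
# linear-Gaussian model: y = A x + noise, with a Gaussian prior on x.
rng = np.random.default_rng(5)
m, n = 30, 20
A = rng.normal(size=(m, n))
x_true = np.sin(np.linspace(0, np.pi, n))
y = A @ x_true + 0.01 * rng.normal(size=m)
lam = 1.0                                   # Gaussian prior precision

def grad(x):                                # gradient of the negative log-posterior
    return A.T @ (A @ x - y) + lam * x

x = np.zeros(n)
for _ in range(5000):                       # plain gradient descent to the MAP point
    x -= 0.005 * grad(x)

# closed-form MAP solution of the linear-Gaussian model, as a check
x_map = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
print(np.max(np.abs(x - x_map)) < 1e-6)     # gradient descent reached the MAP
```

    The engine described in the abstract applies the same idea to nonlinear, object-oriented forward models assembled from a data-flow diagram, where no closed form exists and the gradient-based optimizer does all the work.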

  6. Foundations of Inference

    CERN Document Server

    Knuth, Kevin H

    2010-01-01

    We present a foundation for inference that unites and significantly extends the approaches of Kolmogorov and Cox. Our approach is based on quantifying finite lattices of logical statements in a way that satisfies general lattice symmetries. With other applications in mind, our derivations assume minimal symmetries, relying on neither complementarity nor continuity or differentiability. Each relevant symmetry corresponds to an axiom of quantification, and these axioms are used to derive a unique set of rules governing quantification of the lattice. These rules form the familiar probability calculus. We also derive a unique quantification of divergence and information. Taken together these results form a simple and clear foundation for the quantification of inference.

  7. Making Type Inference Practical

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Oxhøj, Nicholas; Palsberg, Jens

    1992-01-01

    We present the implementation of a type inference algorithm for untyped object-oriented programs with inheritance, assignments, and late binding. The algorithm significantly improves our previous one, presented at OOPSLA'91, since it can handle collection classes, such as List, in a useful way. ... Experiments indicate that the implementation type checks as much as 100 lines per second. This results in a mature product, on which a number of tools can be based, for example a safety tool, an image compression tool, a code optimization tool, and an annotation tool. This may make type inference for object...

  8. Mathematical inference and control of molecular networks from perturbation experiments

    Science.gov (United States)

    Mohammed-Rasheed, Mohammed

    One of the main challenges facing biologists and mathematicians in the post genomic era is to understand the behavior of molecular networks and harness this understanding into an educated intervention of the cell. The cell maintains its function via an elaborate network of interconnecting positive and negative feedback loops of genes, RNA and proteins that send different signals to a large number of pathways and molecules. These structures are referred to as genetic regulatory networks (GRNs) or molecular networks. GRNs can be viewed as dynamical systems with inherent properties and mechanisms, such as steady-state equilibriums and stability, that determine the behavior of the cell. The biological relevance of the mathematical concepts are important as they may predict the differentiation of a stem cell, the maintenance of a normal cell, the development of cancer and its aberrant behavior, and the design of drugs and response to therapy. Uncovering the underlying GRN structure from gene/protein expression data, e.g., microarrays or perturbation experiments, is called inference or reverse engineering of the molecular network. Because of the high cost and time consuming nature of biological experiments, the number of available measurements or experiments is very small compared to the number of molecules (genes, RNA and proteins). In addition, the observations are noisy, where the noise is due to the measurements imperfections as well as the inherent stochasticity of genetic expression levels. Intra-cellular activities and extra-cellular environmental attributes are also another source of variability. Thus, the inference of GRNs is, in general, an under-determined problem with a highly noisy set of observations. The ultimate goal of GRN inference and analysis is to be able to intervene within the network, in order to force it away from undesirable cellular states and into desirable ones. 
However, it remains a major challenge to design optimal intervention strategies.

  9. Dopamine, reward learning, and active inference

    Directory of Open Access Journals (Sweden)

    Thomas eFitzgerald

    2015-11-01

    Full Text Available Temporal difference learning models propose that phasic dopamine signalling encodes reward prediction errors that drive learning. This is supported by studies where optogenetic stimulation of dopamine neurons can stand in lieu of actual reward. Nevertheless, a large body of data also shows that dopamine is not necessary for learning, and that dopamine depletion primarily affects task performance. We offer a resolution to this paradox based on the hypothesis that dopamine encodes the precision of beliefs about alternative actions, and thus controls the outcome-sensitivity of behaviour. We extend an active inference scheme for solving Markov decision processes to include learning, and show that simulated dopamine dynamics strongly resemble those actually observed during instrumental conditioning. Furthermore, simulated dopamine depletion impairs performance but spares learning, while simulated excitation of dopamine neurons drives reward learning, through aberrant inference about outcome states. Our formal approach provides a novel and parsimonious reconciliation of apparently divergent experimental findings.

  10. The anatomy of choice: active inference and agency

    Directory of Open Access Journals (Sweden)

    Karl eFriston

    2013-09-01

    Full Text Available This paper considers agency in the setting of embodied or active inference. In brief, we associate a sense of agency with prior beliefs about action and ask what sorts of beliefs underlie optimal behaviour. In particular, we consider prior beliefs that action minimises the Kullback-Leibler divergence between desired states and attainable states in the future. This allows one to formulate bounded rationality as approximate Bayesian inference that optimises a free energy bound on model evidence. We show that constructs like expected utility, exploration bonuses, softmax choice rules and optimism bias emerge as natural consequences of this formulation. Previous accounts of active inference have focused on predictive coding and Bayesian filtering schemes for minimising free energy. Here, we consider variational Bayes as an alternative scheme that provides formal constraints on the computational anatomy of inference and action – constraints that are remarkably consistent with neuroanatomy. Furthermore, this scheme contextualises optimal decision theory and economic (utilitarian) formulations as pure inference problems. For example, expected utility theory emerges as a special case of free energy minimisation, where the sensitivity, or inverse temperature, of softmax functions and quantal response equilibria has a unique and Bayes-optimal solution that minimises free energy. This sensitivity corresponds to the precision of beliefs about behaviour, such that attainable goals are afforded a higher precision or confidence. In turn, this means that optimal behaviour entails a representation of confidence about outcomes that are under an agent's control.
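    The role of the sensitivity (inverse temperature) of the softmax choice rule is easy to see directly: it controls how confidently differences in expected value are turned into action probabilities. The utilities and precision values below are arbitrary illustrations:

```python
import numpy as np

def softmax_choice(utilities, gamma):
    """Softmax (quantal response) action probabilities with precision gamma."""
    z = gamma * np.asarray(utilities, dtype=float)
    z -= z.max()                        # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

u = [1.0, 0.5, 0.0]                     # expected utilities of three actions
low = softmax_choice(u, gamma=0.5)      # low precision: near-random behaviour
high = softmax_choice(u, gamma=8.0)     # high precision: near-deterministic
print(np.round(low, 2), np.round(high, 2))
```

    On the paper's account, this single scalar is itself optimised by free-energy minimisation rather than fixed by hand, so confidence in behaviour becomes a quantity the agent infers.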

  11. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools ... are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  12. Inference as Prediction

    Science.gov (United States)

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  13. Inferring maps of forces inside cell membrane microdomains

    CERN Document Server

    Masson, J -B; Tuerkcan, S; Voisinne, G; Popoff, M R; Vergassola, M; Alexandrou, A

    2015-01-01

    Mapping the forces on biomolecules in cell membranes has spurred the development of effective labels, e.g. organic fluorophores and nanoparticles, to track trajectories of single biomolecules. Standard methods use particular statistics, namely the mean square displacement, to analyze the underlying dynamics. Here, we introduce general inference methods to fully exploit the information in the experimental trajectories, providing sharp estimates of the forces and the diffusion coefficients in membrane microdomains. Rapid and reliable convergence of the inference scheme is demonstrated on trajectories generated numerically. The method is then applied to infer the forces and potentials acting on the receptor of the $\epsilon$-toxin labeled by lanthanide-ion nanoparticles. Our scheme is applicable to any labeled biomolecule, and the results show its general relevance for membrane compartmentation.
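    The increment-likelihood idea behind such trajectory-based inference can be sketched for a constant drift and diffusion coefficient; the full method estimates spatial maps of both, and the parameters below are assumptions for illustration:

```python
import numpy as np

# Infer drift F (force over friction) and diffusion coefficient D from one
# trajectory of the overdamped Langevin equation  dx = F dt + sqrt(2 D) dW,
# using the Gaussian likelihood of the increments.
rng = np.random.default_rng(2)
F_true, D_true, dt, n = 1.5, 0.4, 1e-3, 200_000
steps = F_true * dt + np.sqrt(2 * D_true * dt) * rng.normal(size=n)

# maximum-likelihood estimators for constant drift and diffusion:
# increments are i.i.d. Gaussian with mean F*dt and variance 2*D*dt
F_hat = steps.mean() / dt
D_hat = steps.var() / (2 * dt)
print(round(F_hat, 1), round(D_hat, 2))
```

    Binning the membrane into microdomains and applying the same increment likelihood per bin yields the force and diffusion maps the abstract refers to; the mean-square-displacement statistic discards exactly the drift information these estimators use.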

  14. Message-Passing Algorithms for Channel Estimation and Decoding Using Approximate Inference

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Kirkelund, Gunvor Elisabeth; Manchón, Carles Navarro

    2012-01-01

    We design iterative receiver schemes for a generic communication system by treating channel estimation and information decoding as an inference problem in graphical models. We introduce a recently proposed inference framework that combines belief propagation (BP) and the mean field (MF) approximation...

  15. Convertible Proxy Signcryption Scheme

    Institute of Scientific and Technical Information of China (English)

    李继国; 李建中; 曹珍富; 张亦辰

    2004-01-01

    In 1996, Mambo et al. introduced the concept of the proxy signature. However, a proxy signature can only provide delegated authenticity; it cannot provide confidentiality. Recently, Gamage et al. and Chan and Wei proposed different proxy signcryption schemes, which extended the concept of the proxy signature. However, only the specified receiver can decrypt and verify the validity of the proxy signcryption in their schemes. To protect the receiver's benefit in case of a later dispute, Wu and Hsu proposed a convertible authenticated encryption scheme, which can enable the receiver to convert the signature into an ordinary one that can be verified by anyone. Based on Wu and Hsu's scheme and an improved Kim's scheme, we propose a convertible proxy signcryption scheme. The security of the proposed scheme is based on the intractability of reversing the one-way hash function and solving the discrete logarithm problem. The proposed scheme satisfies all properties of a strong proxy signature, withstands the public key substitution attack, and does not use a secure channel. In addition, the proposed scheme can be extended to a convertible threshold proxy signcryption scheme.

  16. Causal inference in econometrics

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak

    2016-01-01

    This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.

  17. Russell and Humean Inferences

    Directory of Open Access Journals (Sweden)

    João Paulo Monteiro

    2001-12-01

    Full Text Available Russell's The Problems of Philosophy tries to establish a new theory of induction, at the same time that Hume is there accused of an "irrational scepticism about induction". But a careful analysis of the theory of knowledge explicitly acknowledged by Hume reveals that, contrary to the standard interpretation in the twentieth century, possibly influenced by Russell, Hume deals exclusively with causal inference (which he never classifies as "causal induction", although now we are entitled to do so), never with inductive inference in general, mainly generalizations about sensible qualities of objects (whether, e.g., "all crows are black" or not is not among Hume's concerns). Russell's theories are thus only false alternatives to Hume's, in 1912 or in 1948.

  18. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  19. Difference Schemes and Applications

    Science.gov (United States)

    2015-02-06

    of the shallow water equations that is well suited for complex geometries and moving boundaries. Another (similar) regularization of ... the solid wall extrapolation followed by the interpolation in the phase space (by solving the Riemann problem between the internal cell averages and ... scheme. This Godunov-type scheme enjoys all major advantages of Riemann-problem-solver-free, non-oscillatory central schemes and, at the same time, has

  20. Efficient Threshold Signature Scheme

    Directory of Open Access Journals (Sweden)

    Sattar J Aboud

    2012-01-01

    Full Text Available In this paper, we introduce a new RSA-type threshold signature scheme. The proposed scheme is unforgeable and robust in the random oracle model, and signature generation and verification are entirely non-interactive. In addition, the length of each participant's signature share is bounded by a constant multiple of the length of the RSA signature modulus. The signing process of the proposed scheme is also more efficient in terms of time complexity and interaction.

  1. Stateless Transitive Signature Schemes

    Institute of Scientific and Technical Information of China (English)

    MA Chun-guang; CAI Man-chun; YANG Yi-xian

    2004-01-01

    A new practical method is introduced to transform the stateful transitive signature scheme to stateless one without the loss of security. According to the approach, two concrete stateless transitive signature schemes based on Factoring and RSA are presented respectively. Under the assumption of the hardness of factoring and one-more- RSA-inversion problem, both two schemes are secure under the adaptive chosen-message attacks in random oracle model.

  2. INFERENCES FROM ROSSI TRACES

    Energy Technology Data Exchange (ETDEWEB)

    KENNETH M. HANSON; JANE M. BOOKER

    2000-09-08

    The authors present an uncertainty analysis of data taken using the Rossi technique, in which the horizontal oscilloscope sweep is driven sinusoidally in time, while the vertical axis follows the signal amplitude. The analysis is done within a Bayesian framework. Complete inferences are obtained by using the Markov chain Monte Carlo technique, which produces random samples from the posterior probability distribution expressed in terms of the parameters.
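As a rough illustration of the posterior-sampling step (the model and all numbers here are hypothetical, not the authors' Rossi analysis), a random-walk Metropolis sampler for a single sweep-amplitude parameter might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: signal amplitude over one sinusoidal sweep, with noise.
t = np.linspace(0, 1, 100)
true_amp = 3.0
y = true_amp * np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(t.size)

def log_post(amp, sigma=0.3):
    # Gaussian likelihood, flat prior on the amplitude.
    r = y - amp * np.sin(2 * np.pi * t)
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis: propose a step, accept with probability
# min(1, posterior ratio); accepted or not, record the current state.
samples, amp = [], 0.0
lp = log_post(amp)
for _ in range(5000):
    prop = amp + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        amp, lp = prop, lp_prop
    samples.append(amp)
post_mean = np.mean(samples[1000:])  # discard burn-in
print(post_mean)
```

The retained samples are draws from the posterior, so summaries such as the mean and credible intervals come directly from the sample set, as in the Rossi analysis.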

  3. Inferring Microbial Fitness Landscapes

    Science.gov (United States)

    2016-02-25

    experiments on evolving microbial populations. Although these experiments have produced examples of remarkable phenomena – e.g. the emergence of mutator ... what specific mutations, avian influenza viruses will adapt to novel human hosts; or how readily infectious bacteria will escape antibiotics or the ... infer from data the determinants of microbial evolution with sufficient resolution that we can quantify

  4. Continuous Integrated Invariant Inference Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project will develop a new technique for invariant inference and embed this and other current invariant inference and checking techniques in an...

  5. Probabilistic Inferences in Bayesian Networks

    OpenAIRE

    Ding, Jianguo

    2010-01-01

    This chapter summarizes the popular inference methods in Bayesian networks. The results demonstrate that evidence can be propagated across a Bayesian network along any links, whether in a forward, backward, or intercausal style. The belief updating of Bayesian networks can be obtained by various available inference techniques. Theoretically, exact inference in Bayesian networks is feasible and manageable. However, the computation and inference are NP-hard. That means, in applications, in ...
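Exact inference by enumeration, the baseline such surveys start from, can be sketched on the classic sprinkler network (all probabilities here are illustrative); conditioning on wet grass propagates evidence backward to Rain:

```python
# Tiny Bayesian network: Rain and Sprinkler are independent parents of
# WetGrass. Enumeration shows backward ("diagnostic") propagation of
# evidence: P(Rain | WetGrass=True).
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet = {  # P(WetGrass=True | Sprinkler, Rain)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.0,
}

def joint(r, s, w):
    pw = P_wet[(s, r)]
    return P_rain[r] * P_sprinkler[s] * (pw if w else 1 - pw)

# Sum out Sprinkler to get the conditional of interest.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(num / den)  # belief in Rain rises above its prior of 0.2
```

Enumeration like this is exponential in the number of variables, which is the practical face of the NP-hardness the abstract mentions.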

  6. Multimodel inference and adaptive management

    Science.gov (United States)

    Rehme, S.E.; Powell, L.A.; Allen, C.R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
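The model-weighting step behind multimodel inference is commonly implemented with Akaike weights; a minimal sketch with hypothetical AIC scores (not taken from the reviewed papers) shows how equivocal weights signal weak inference:

```python
import math

# Hypothetical AIC scores for three competing models of the same data set.
aic = {"habitat": 210.4, "habitat+rainfall": 208.9, "global": 214.1}

best = min(aic.values())
delta = {m: a - best for m, a in aic.items()}          # AIC differences
rel = {m: math.exp(-d / 2) for m, d in delta.items()}  # relative likelihoods
total = sum(rel.values())
weights = {m: r / total for m, r in rel.items()}       # Akaike weights
# Weak inference: no single model carries most of the weight.
print(weights)
```

When, as here, the top model's weight is well below ~0.9, reporting the full weight distribution (or model-averaging) is more honest than a single "best model" recommendation.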

  7. Multiresolution signal decomposition schemes

    NARCIS (Netherlands)

    J. Goutsias (John); H.J.A.M. Heijmans (Henk)

    1998-01-01

    textabstract[PNA-R9810] Interest in multiresolution techniques for signal processing and analysis is increasing steadily. An important instance of such a technique is the so-called pyramid decomposition scheme. This report proposes a general axiomatic pyramid decomposition scheme for signal analysis

  9. Nanotechnology and statistical inference

    Science.gov (United States)

    Vesely, Sara; Vesely, Leonardo; Vesely, Alessandro

    2017-08-01

    We discuss some problems that arise when applying statistical inference to data with the aim of disclosing new functionalities. A predictive model analyzes the data taken from experiments on a specific material to assess the likelihood that another product, with similar structure and properties, will exhibit the same functionality. It doesn't have much predictive power if variability occurs as a consequence of a specific, non-linear behavior. We exemplify our discussion on some experiments with biased dice.

  10. Foundations of Inference

    Directory of Open Access Journals (Sweden)

    Kevin H. Knuth

    2012-06-01

    Full Text Available We present a simple and clear foundation for finite inference that unites and significantly extends the approaches of Kolmogorov and Cox. Our approach is based on quantifying lattices of logical statements in a way that satisfies general lattice symmetries. With other applications such as measure theory in mind, our derivations assume minimal symmetries, relying on neither negation nor continuity nor differentiability. Each relevant symmetry corresponds to an axiom of quantification, and these axioms are used to derive a unique set of quantifying rules that form the familiar probability calculus. We also derive a unique quantification of divergence, entropy and information.

  11. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  12. Generic patch inference

    DEFF Research Database (Denmark)

    Andersen, Jesper; Lawall, Julia

    2010-01-01

    A key issue in maintaining Linux device drivers is the need to keep them up to date with respect to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spdiff, that identifies common changes...... developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect...

  13. Statistical inferences in phylogeography

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Beaumont, Mark A

    2009-01-01

    In conventional phylogeographic studies, historical demographic processes are elucidated from the geographical distribution of individuals represented on an inferred gene tree. However, the interpretation of gene trees in this context can be difficult, as the same demographic/geographical process can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods may also be challenged by computational problems or poor model choice. In this review, we will describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods.

  14. Moment inference from tomograms

    Science.gov (United States)

    Day-Lewis, F. D.; Chen, Y.; Singha, K.

    2007-01-01

    Time-lapse geophysical tomography can provide valuable qualitative insights into hydrologic transport phenomena associated with aquifer dynamics, tracer experiments, and engineered remediation. Increasingly, tomograms are used to infer the spatial and/or temporal moments of solute plumes; these moments provide quantitative information about transport processes (e.g., advection, dispersion, and rate-limited mass transfer) and controlling parameters (e.g., permeability, dispersivity, and rate coefficients). The reliability of moments calculated from tomograms is, however, poorly understood because classic approaches to image appraisal (e.g., the model resolution matrix) are not directly applicable to moment inference. Here, we present a semi-analytical approach to construct a moment resolution matrix based on (1) the classic model resolution matrix and (2) image reconstruction from orthogonal moments. Numerical results for radar and electrical-resistivity imaging of solute plumes demonstrate that moment values calculated from tomograms depend strongly on plume location within the tomogram, survey geometry, regularization criteria, and measurement error. Copyright 2007 by the American Geophysical Union.
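Spatial-moment inference from a tomogram reduces to weighted sums over the imaged concentration field; a minimal sketch on a synthetic Gaussian plume (illustrative, not the authors' radar or resistivity data):

```python
import numpy as np

# Hypothetical tomogram: concentration on a 2-D grid (rows = z, cols = x).
x = np.linspace(0, 10, 101)
z = np.linspace(0, 5, 51)
X, Z = np.meshgrid(x, z)
conc = np.exp(-((X - 4.0) ** 2) / (2 * 1.5**2)
              - ((Z - 2.0) ** 2) / (2 * 0.5**2))

m0 = conc.sum()                      # zeroth moment: total mass (arb. units)
x_c = (conc * X).sum() / m0          # first moments: centre of mass
z_c = (conc * Z).sum() / m0
var_x = (conc * (X - x_c) ** 2).sum() / m0  # second central moment: spread
print(x_c, z_c, var_x)
```

On real tomograms the same sums inherit the smoothing and artifacts of the inversion, which is exactly why the moment resolution matrix proposed in the record is needed to appraise them.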

  15. Active Inference and Learning in the Cerebellum.

    Science.gov (United States)

    Friston, Karl; Herreros, Ivan

    2016-09-01

    This letter offers a computational account of Pavlovian conditioning in the cerebellum based on active inference and predictive coding. Using eyeblink conditioning as a canonical paradigm, we formulate a minimal generative model that can account for spontaneous blinking, startle responses, and (delay or trace) conditioning. We then establish the face validity of the model using simulated responses to unconditioned and conditioned stimuli to reproduce the sorts of behavior that are observed empirically. The scheme's anatomical validity is then addressed by associating variables in the predictive coding scheme with nuclei and neuronal populations to match the (extrinsic and intrinsic) connectivity of the cerebellar (eyeblink conditioning) system. Finally, we try to establish predictive validity by reproducing selective failures of delay conditioning, trace conditioning, and extinction using (simulated and reversible) focal lesions. Although rather metaphorical, the ensuing scheme can account for a remarkable range of anatomical and neurophysiological aspects of cerebellar circuitry, and the specificity of lesion-deficit mappings that have been established experimentally. From a computational perspective, this work shows how conditioning or learning can be formulated in terms of minimizing variational free energy (or maximizing Bayesian model evidence) using exactly the same principles that underlie predictive coding in perception.

  16. A Fuzzy Commitment Scheme

    CERN Document Server

    Al-saggaf, Alawi A

    2008-01-01

    In this paper, an attempt has been made to explain a fuzzy commitment scheme. In conventional commitment schemes, both the committed string m and a valid opening key are required to enable the sender to prove the commitment. However, there could be many instances where the transmission involves noise or minor errors arising purely from factors over which neither the sender nor the receiver has any control. The fuzzy commitment scheme presented in this paper accepts an opening key that is close to the original one in a suitable distance metric, but not necessarily identical. The concept is illustrated with the help of a simple situation.
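A toy version of the accept-if-close idea (a deliberate simplification: real fuzzy commitment schemes bind the key with an error-correcting code rather than a bare distance check) can be sketched as:

```python
import hashlib
import secrets

# Toy fuzzy commitment: the verifier accepts an opening key within Hamming
# distance t of the committed one instead of requiring exact equality.
def commit(key: bytes) -> bytes:
    return hashlib.sha256(key).hexdigest().encode()

def hamming(a: bytes, b: bytes) -> int:
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def open_fuzzy(key_claimed, key_original, commitment, t=3):
    # The sender reveals the original key; the verifier checks the hash
    # exactly, then accepts any claimed key close to it in Hamming distance.
    if commit(key_original) != commitment:
        return False
    return hamming(key_claimed, key_original) <= t

key = secrets.token_bytes(16)
c = commit(key)
noisy = bytearray(key)
noisy[0] ^= 0b00000101  # two bit flips, e.g. from a noisy channel
print(open_fuzzy(bytes(noisy), key, c))  # accepted: distance 2 <= t
```

A production scheme would hide the key inside the commitment itself (e.g. the Juels-Wattenberg construction XORs the key with a random codeword), so this sketch only shows the tolerance behaviour, not the hiding property.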

  17. CSR schemes in agribusiness

    DEFF Research Database (Denmark)

    Pötz, Katharina Anna; Haas, Rainer; Balzarova, Michaela

    2013-01-01

    Purpose – The rise of CSR followed a demand for CSR standards and guidelines. In a sector already characterized by a large number of standards, the authors seek to ask what CSR schemes apply to agribusiness, and how they can be systematically compared and analysed. Design/methodology/approach – Following a deductive-inductive approach the authors develop a model to compare and analyse CSR schemes based on existing studies and on coding qualitative data on 216 CSR schemes. Findings – The authors confirm that CSR standards and guidelines have entered agribusiness and identify a complex landscape of schemes that can be categorized on focus areas, scales, mechanisms, origins, types and commitment levels. Research limitations/implications – The findings contribute to conceptual and empirical research on existing models to compare and analyse CSR standards. Sampling technique and depth of analysis limit

  18. Tabled Execution in Scheme

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, J J; Lumsdaine, A; Quinlan, D J

    2008-08-19

    Tabled execution is a generalization of memoization developed by the logic programming community. It not only saves results from tabled predicates, but also stores the set of currently active calls to them; tabled execution can thus provide meaningful semantics for programs that seemingly contain infinite recursions with the same arguments. In logic programming, tabled execution is used for many purposes, both for improving the efficiency of programs, and for making tasks simpler and more direct to express than with normal logic programs. However, tabled execution is only infrequently applied in mainstream functional languages such as Scheme. We demonstrate an elegant implementation of tabled execution in Scheme, using a mix of continuation-passing style and mutable data. We also show the use of tabled execution in Scheme for a problem in formal language and automata theory, demonstrating that tabled execution can be a valuable tool for Scheme users.
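The active-call bookkeeping that distinguishes tabling from plain memoization can be sketched in Python (a simplified sketch; production tabling engines iterate answers to a fixpoint rather than treating a repeated active call as an immediate failure):

```python
# Tabled reachability in a directed graph with a cycle. Naive recursion on
# this graph loops forever (a -> b -> a); tracking the set of currently
# active calls gives the query well-defined semantics, as in tabling.
graph = {"a": ["b"], "b": ["a", "c"], "c": []}

table = {}      # completed answers
active = set()  # calls currently being evaluated

def reaches(src, dst):
    if (src, dst) in table:
        return table[(src, dst)]
    if (src, dst) in active:
        return False  # a repeated active call contributes no new answer
    active.add((src, dst))
    result = src == dst or any(reaches(n, dst) for n in graph[src])
    active.discard((src, dst))
    table[(src, dst)] = result
    return result

print(reaches("a", "c"), reaches("c", "a"))  # True False
```

The `active` set plays the role of the call table's in-progress entries, while `table` caches completed answers, the two ingredients the abstract attributes to tabled predicates.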

  19. Inferring attitudes from mindwandering.

    Science.gov (United States)

    Critcher, Clayton R; Gilovich, Thomas

    2010-09-01

    Self-perception theory posits that people understand their own attitudes and preferences much as they understand others', by interpreting the meaning of their behavior in light of the context in which it occurs. Four studies tested whether people also rely on unobservable "behavior," their mindwandering, when making such inferences. It is proposed here that people rely on the content of their mindwandering to decide whether it reflects boredom with an ongoing task or a reverie's irresistible pull. Having the mind wander to positive events, to concurrent as opposed to past activities, and to many events rather than just one tends to be attributed to boredom and therefore leads to perceived dissatisfaction with an ongoing task. Participants appeared to rely spontaneously on the content of their wandering minds as a cue to their attitudes, but not when an alternative cause for their mindwandering was made salient.

  20. Bayesian inference in geomagnetism

    Science.gov (United States)

    Backus, George E.

    1988-01-01

    The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.

  1. Inferring the eccentricity distribution

    CERN Document Server

    Hogg, David W; Bovy, Jo

    2010-01-01

    Standard maximum-likelihood estimators for binary-star and exoplanet eccentricities are biased high, in the sense that the estimated eccentricity tends to be larger than the true eccentricity. As with most non-trivial observables, a simple histogram of estimated eccentricities is not a good estimate of the true eccentricity distribution. Here we develop and test a hierarchical probabilistic method for performing the relevant meta-analysis, that is, inferring the true eccentricity distribution, taking as input the likelihood functions for the individual-star eccentricities, or samplings of the posterior probability distributions for the eccentricities (under a given, uninformative prior). The method is a simple implementation of a hierarchical Bayesian model; it can also be seen as a kind of heteroscedastic deconvolution. It can be applied to any quantity measured with finite precision--other orbital parameters, or indeed any astronomical measurements of any kind, including magnitudes, parallaxes, or photometr...
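The hierarchical reweighting idea, multiplying across stars the population pdf averaged over each star's interim-prior posterior samples, can be sketched with a Beta population model (the model and all numbers are hypothetical, not the authors' exoplanet data):

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(2)

# Hypothetical setup: true eccentricities drawn from Beta(2, 5); each star
# yields a noisy measurement y_i, and "posterior samples" for e_i under a
# uniform interim prior (truncated-normal rejection sampling on [0, 1]).
a_true, b_true, sigma, n_stars, n_samp = 2.0, 5.0, 0.05, 150, 100
e_true = rng.beta(a_true, b_true, n_stars)
y = e_true + sigma * rng.standard_normal(n_stars)

post = np.empty((n_stars, n_samp))
for i in range(n_stars):
    s = np.empty(0)
    while s.size < n_samp:
        draws = y[i] + sigma * rng.standard_normal(1000)
        s = np.concatenate([s, draws[(draws > 0) & (draws < 1)]])
    post[i] = s[:n_samp]

def log_beta_pdf(e, a, b):
    norm = lgamma(a + b) - lgamma(a) - lgamma(b)
    return norm + (a - 1) * np.log(e) + (b - 1) * np.log(1 - e)

# Hierarchical likelihood: with a uniform interim prior, the marginal
# likelihood of (a, b) is approximated by averaging the population pdf
# over each star's posterior samples, then multiplying across stars.
def log_lik(a, b):
    pdf = np.exp(log_beta_pdf(post, a, b))
    return np.log(pdf.mean(axis=1)).sum()

grid = np.arange(0.5, 8.1, 0.5)
best = max(((log_lik(a, b), a, b) for a in grid for b in grid))
_, a_hat, b_hat = best
print(a_hat, b_hat, a_hat / (a_hat + b_hat))
```

A simple histogram of the noisy point estimates would be broadened by the measurement error; the reweighting above deconvolves that broadening, which is the point of the hierarchical approach.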

  2. Inferring deterministic causal relations

    CERN Document Server

    Daniusis, Povilas; Mooij, Joris; Zscheischler, Jakob; Steudel, Bastian; Zhang, Kun; Schoelkopf, Bernhard

    2012-01-01

    We consider two variables that are related to each other by an invertible function. While it has previously been shown that the dependence structure of the noise can provide hints to determine which of the two variables is the cause, we presently show that even in the deterministic (noise-free) case, there are asymmetries that can be exploited for causal inference. Our method is based on the idea that if the function and the probability density of the cause are chosen independently, then the distribution of the effect will, in a certain sense, depend on the function. We provide a theoretical analysis of this method, showing that it also works in the low noise regime, and link it to information geometry. We report strong empirical results on various real-world data sets from different domains.
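A slope-based score in the spirit of this information-geometric approach (an illustrative variant, not necessarily the authors' exact estimator) compares the mean log-slope in both directions and prefers the smaller:

```python
import numpy as np

rng = np.random.default_rng(3)

# IGCI-style slope score: normalize both variables to [0, 1], sort by the
# putative cause, and average log |dy/dx| over consecutive sample pairs.
# The direction with the smaller score is inferred as causal.
def igci_score(a, b):
    a = (a - a.min()) / (a.max() - a.min())
    b = (b - b.min()) / (b.max() - b.min())
    order = np.argsort(a)
    a, b = a[order], b[order]
    da, db = np.diff(a), np.diff(b)
    keep = (da != 0) & (db != 0)
    return np.mean(np.log(np.abs(db[keep] / da[keep])))

x = rng.uniform(0, 1, 2000)
y = x**3  # invertible, nonlinear, noise-free mechanism
direction = "X->Y" if igci_score(x, y) < igci_score(y, x) else "Y->X"
print(direction)
```

The asymmetry exists precisely because the input density (uniform) was chosen independently of the function x³, so the distribution of y "remembers" the mechanism, as the abstract describes.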

  3. Admissibility of logical inference rules

    CERN Document Server

    Rybakov, VV

    1997-01-01

    The aim of this book is to present the fundamental theoretical results concerning inference rules in deductive formal systems. Primary attention is focused on: admissible or permissible inference rules; the derivability of admissible inference rules; the structural completeness of logics; and the bases for admissible and valid inference rules. There is particular emphasis on propositional non-standard logics (primarily superintuitionistic and modal logics) but general logical consequence relations and classical first-order theories are also considered. The book is basically self-contained and

  4. XTR-Kurosawa-Desmedt Scheme

    Institute of Scientific and Technical Information of China (English)

    DING XIU-HUAN; FU ZHI-GUO; ZHANG SHU-GONG

    2009-01-01

    This paper proposes an XTR version of the Kurosawa-Desmedt scheme. Our scheme is secure against adaptive chosen-ciphertext attack under the XTR version of the Decisional Diffie-Hellman assumption in the standard model. Comparing the efficiency of the Kurosawa-Desmedt scheme and the proposed XTR-Kurosawa-Desmedt scheme, we find that the proposed scheme is more efficient both in communication and computation without compromising security.

  5. Signal inference with unknown response: calibration uncertainty renormalized estimator

    CERN Document Server

    Dorn, Sebastian; Greiner, Maksim; Selig, Marco; Böhm, Vanessa

    2014-01-01

    The calibration of a measurement device is crucial for every scientific experiment, where a signal has to be inferred from data. We present CURE, the calibration uncertainty renormalized estimator, to reconstruct a signal and simultaneously the instrument's calibration from the same data without knowing the exact calibration, but only its covariance structure. The idea of CURE is to start with an assumed calibration, successively include more and more portions of calibration uncertainty into the signal inference equations, and absorb the resulting corrections into renormalized signal (and calibration) solutions. Thereby, the signal inference and calibration problem turns into solving a single system of ordinary differential equations and can be identified with common resummation techniques used in field theories. We verify CURE by applying it to a simplistic toy example and compare it against existent self-calibration schemes, Wiener filter solutions, and Markov Chain Monte Carlo sampling. We conclude that the...

  6. Successful labelling schemes

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn; Stacey, Julia

    2001-01-01

    It is usual practice to evaluate the success of a labelling scheme by looking at the awareness percentage, but in many cases this is not sufficient. The awareness percentage gives no indication of which consumer segments are aware of and use labelling schemes and which do not. In the spring of 2001 MAPP carried out an extensive consumer study with special emphasis on the Nordic environmentally friendly label 'the swan'. The purpose was to find out how much consumers actually know and use various labelling schemes. 869 households were contacted and asked to fill in a questionnaire ... it into consideration when I go shopping. The respondent was asked to pick the most suitable answer, which described her use of each label. 29% - also called 'the labelling blind' - responded that they basically only knew the recycling label and the Government controlled organic label 'Ø-mærket'. Another segment of 6...

  7. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framework...

  8. Interactive Instruction in Bayesian Inference

    DEFF Research Database (Denmark)

    Khan, Azam; Breslav, Simon; Hornbæk, Kasper

    2017-01-01

    An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer’s principles of instruction...... that an instructional approach to improving human performance in Bayesian inference is a promising direction....
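The Mammography Problem itself is a short calculation once restated in natural frequencies, one of the instructional devices such studies draw on (the rates below are the commonly used illustrative ones, not necessarily those of this record):

```python
# Classic Mammography Problem in natural frequencies: 1% prevalence,
# 80% sensitivity, 9.6% false-positive rate, over 10,000 women.
population = 10_000
sick = population * 0.01                 # 100 women have cancer
healthy = population - sick              # 9,900 do not
true_pos = sick * 0.80                   # 80 positives with cancer
false_pos = healthy * 0.096              # 950.4 positives without cancer
p_cancer_given_pos = true_pos / (true_pos + false_pos)
print(round(p_cancer_given_pos, 3))      # ≈ 0.078, far below most intuitions
```

Counting whole people instead of manipulating conditional probabilities is exactly the reformulation that instruction based on natural frequencies exploits.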

  9. Causal Inference and Developmental Psychology

    Science.gov (United States)

    Foster, E. Michael

    2010-01-01

    Causal inference is of central importance to developmental psychology. Many key questions in the field revolve around improving the lives of children and their families. These include identifying risk factors that if manipulated in some way would foster child development. Such a task inherently involves causal inference: One wants to know whether…

  11. Compact Spreader Schemes

    Energy Technology Data Exchange (ETDEWEB)

    Placidi, M.; Jung, J. -Y.; Ratti, A.; Sun, C.

    2014-07-25

    This paper describes beam distribution schemes adopting a novel implementation based on low amplitude vertical deflections combined with horizontal ones generated by Lambertson-type septum magnets. This scheme offers substantial compactness in the longitudinal layouts of the beam lines and increased flexibility for beam delivery of multiple beam lines on a shot-to-shot basis. Fast kickers (FK) or transverse electric field RF Deflectors (RFD) provide the low amplitude deflections. Initially proposed at the Stanford Linear Accelerator Center (SLAC) as tools for beam diagnostics and more recently adopted for multiline beam pattern schemes, RFDs offer repetition capabilities and a likely better amplitude reproducibility when compared to FKs, which, in turn, offer more modest financial involvements both in construction and operation. Both solutions represent an ideal approach for the design of compact beam distribution systems resulting in space and cost savings while preserving flexibility and beam quality.

  12. Towards Symbolic Encryption Schemes

    DEFF Research Database (Denmark)

    Ahmed, Naveed; Jensen, Christian D.; Zenner, Erik

    2012-01-01

    Symbolic encryption, in the style of Dolev-Yao models, is ubiquitous in formal security models. In its common use, encryption on a whole message is specified as a single monolithic block. From a cryptographic perspective, however, this may require a resource-intensive cryptographic algorithm......, namely an authenticated encryption scheme that is secure under chosen ciphertext attack. Therefore, many reasonable encryption schemes, such as AES in the CBC or CFB mode, are not among the implementation options. In this paper, we report new attacks on CBC and CFB based implementations of the well......-known Needham-Schroeder and Denning-Sacco protocols. To avoid such problems, we advocate the use of refined notions of symbolic encryption that have natural correspondence to standard cryptographic encryption schemes....

  13. Variational Program Inference

    CERN Document Server

    Harik, Georges

    2010-01-01

    We introduce a framework for representing a variety of interesting problems as inference over the execution of probabilistic model programs. We represent a "solution" to such a problem as a guide program which runs alongside the model program and influences the model program's random choices, leading the model program to sample from a different distribution than from its priors. Ideally the guide program influences the model program to sample from the posteriors given the evidence. We show how the KL-divergence between the true posterior distribution and the distribution induced by the guided model program can be efficiently estimated (up to an additive constant) by sampling multiple executions of the guided model program. In addition, we show how to use the guide program as a proposal distribution in importance sampling to statistically prove lower bounds on the probability of the evidence and on the probability of a hypothesis and the evidence. We can use the quotient of these two bounds as an estimate of ...
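
    The evidence bound described above can be reproduced in a few lines. In this sketch the model, likelihood and guide are invented toy distributions; it contrasts the unbiased importance-sampling estimate of the evidence with the Jensen (mean-log-weight) lower bound:

    ```python
    import math
    import random

    random.seed(0)

    # Toy model program: z ~ Bernoulli(0.5); evidence x is observed with
    # likelihood p(x|z) = 0.9 if z == 1 else 0.2. All numbers are invented.
    PRIOR = {1: 0.5, 0: 0.5}
    LIK = {1: 0.9, 0: 0.2}

    # Hypothetical guide distribution q(z): steers sampling toward z = 1.
    GUIDE = {1: 0.8, 0: 0.2}

    n = 100_000
    log_weights = []
    for _ in range(n):
        z = 1 if random.random() < GUIDE[1] else 0
        w = PRIOR[z] * LIK[z] / GUIDE[z]      # importance weight p(x, z) / q(z)
        log_weights.append(math.log(w))

    # Unbiased importance-sampling estimate of the evidence p(x):
    est = sum(math.exp(lw) for lw in log_weights) / n
    # Jensen lower bound: exp(mean log-weight) <= p(x).
    lower = math.exp(sum(log_weights) / n)

    true_evidence = 0.5 * 0.9 + 0.5 * 0.2     # = 0.55, exact for this toy model
    print(round(est, 3), round(lower, 3), true_evidence)
    ```

    The gap between `lower` and `est` shrinks as the guide approaches the true posterior, which is the sense in which the bound scores guide quality.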

  14. Neural model of gene regulatory network: a survey on supportive meta-heuristics.

    Science.gov (United States)

    Biswas, Surama; Acharyya, Sriyankar

    2016-06-01

    A gene regulatory network (GRN) is produced as a result of regulatory interactions between different genes through their coded proteins in a cellular context. Having immense importance in disease detection and drug discovery, GRNs have been modelled through various mathematical and computational schemes reported in survey articles. Neural and neuro-fuzzy models have been a focus of attention in bioinformatics, and the predominant use of meta-heuristic algorithms in training neural models has proved their effectiveness. Considering these facts, this paper surveys neural modelling schemes for GRNs and the efficacy of meta-heuristic algorithms for parameter learning (i.e. weighting connections) within the model. The survey covers two structure-related approaches to inferring GRNs, a global structure approach and a substructure approach, and describes two neural modelling schemes: artificial neural network/recurrent neural network based modelling and neuro-fuzzy modelling. The meta-heuristic algorithms applied so far to learn the structure and parameters of neurally modelled GRNs are also reviewed.
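
    As a minimal illustration of the survey's theme (a neural GRN model whose weights are learned by a meta-heuristic), the sketch below fits a two-gene recurrent sigmoid model by stochastic hill climbing; the network, data and hyper-parameters are all invented, and hill climbing stands in for the GA/PSO-style methods the survey actually covers:

    ```python
    import math
    import random

    random.seed(11)

    # Two-gene recurrent model: x(t+1)_i = sigmoid(sum_j W[i][j] * x(t)_j);
    # the weights W play the role of regulatory strengths.
    def sigmoid(u):
        return 1 / (1 + math.exp(-u))

    def simulate(W, x0, steps):
        traj, x = [x0], x0
        for _ in range(steps):
            x = tuple(sigmoid(sum(W[i][j] * x[j] for j in range(2))) for i in range(2))
            traj.append(x)
        return traj

    W_true = [[2.0, -3.0], [1.5, 0.0]]          # hidden "true" regulation
    target = simulate(W_true, (0.2, 0.8), 8)     # synthetic expression time course

    def loss(W):
        sim = simulate(W, (0.2, 0.8), 8)
        return sum((a - b) ** 2 for s, t in zip(sim, target) for a, b in zip(s, t))

    # Meta-heuristic parameter learning: stochastic hill climbing over W.
    W = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    best = loss(W)
    for _ in range(5000):
        i, j = random.randrange(2), random.randrange(2)
        old = W[i][j]
        W[i][j] += random.gauss(0, 0.3)          # perturb one weight
        new = loss(W)
        if new < best:
            best = new                           # keep improving moves
        else:
            W[i][j] = old                        # revert otherwise
    print(round(best, 4))
    ```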

  15. Bayesian inference for a wavefront model of the Neolithisation of Europe

    CERN Document Server

    Baggaley, Andrew W; Shukurov, Anvar; Boys, Richard J; Golightly, Andrew

    2012-01-01

    We consider a wavefront model for the spread of Neolithic culture across Europe, and use Bayesian inference techniques to provide estimates for the parameters within this model, as constrained by radiocarbon data from Southern and Western Europe. Our wavefront model allows for both an isotropic background spread (incorporating the effects of local geography), and a localized anisotropic spread associated with major waterways. We introduce an innovative numerical scheme to track the wavefront, allowing us to simulate the times of the first arrival at any site orders of magnitude more efficiently than traditional PDE approaches. We adopt a Bayesian approach to inference and use Gaussian process emulators to facilitate further increases in efficiency in the inference scheme, thereby making Markov chain Monte Carlo methods practical. We allow for uncertainty in the fit of our model, and also infer a parameter specifying the magnitude of this uncertainty. We obtain a magnitude for the background spread of order 1 ...
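
    The Bayesian machinery in this kind of study can be sketched with a much simpler stand-in: a single front-speed parameter inferred from synthetic first-arrival times by random-walk Metropolis (no emulator; all numbers invented):

    ```python
    import math
    import random

    random.seed(1)

    # Synthetic "first-arrival" data: sites at known distances from an origin,
    # arrival time = distance / v_true + Gaussian noise (arbitrary units).
    v_true, noise_sd = 1.2, 0.5
    dists = [2.0, 5.0, 8.0, 11.0, 15.0]
    times = [d / v_true + random.gauss(0, noise_sd) for d in dists]

    def log_post(v):
        if v <= 0:
            return -math.inf                      # flat prior on v > 0
        return -sum((t - d / v) ** 2 for d, t in zip(dists, times)) / (2 * noise_sd ** 2)

    # Random-walk Metropolis over the front speed v.
    v, samples = 1.0, []
    for i in range(20000):
        prop = v + random.gauss(0, 0.1)
        delta = log_post(prop) - log_post(v)
        if delta >= 0 or random.random() < math.exp(delta):
            v = prop
        if i >= 5000:                             # discard burn-in
            samples.append(v)

    print(round(sum(samples) / len(samples), 2))  # posterior mean near v_true
    ```

    The paper's emulator enters exactly where `log_post` is evaluated: when the forward simulation is expensive, a Gaussian process surrogate replaces it inside the MCMC loop.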

  16. Alternative health insurance schemes

    DEFF Research Database (Denmark)

    Keiding, Hans; Hansen, Bodil O.

    2002-01-01

    In this paper, we present a simple model of health insurance with asymmetric information, where we compare two alternative ways of organizing the insurance market: either as a competitive insurance market, where some risks remain uninsured, or as a compulsory scheme where, however, the level ... competitive insurance; this situation turns out to be at least as good as either of the alternatives...

  17. Optimization methods for logical inference

    CERN Document Server

    Chandru, Vijay

    2011-01-01

    Merging logic and mathematics in deductive inference: an innovative, cutting-edge approach. Optimization methods for logical inference? Absolutely, say Vijay Chandru and John Hooker, two major contributors to this rapidly expanding field. And even though "solving logical inference problems with optimization methods may seem a bit like eating sauerkraut with chopsticks... it is the mathematical structure of a problem that determines whether an optimization model can help solve it, not the context in which the problem occurs." Presenting powerful, proven optimization techniques for logic in...
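
    The book's central idea, casting logical inference as 0-1 optimization, can be illustrated by a toy entailment check (invented example, brute force rather than a real integer-programming solver): a clause becomes a linear inequality over 0-1 variables, and KB |= q holds iff minimising q subject to the clause inequalities gives 1.

    ```python
    from itertools import product

    # Each clause is a list of signed literals over variables 0..n-1;
    # as a 0-1 inequality, clause (x0 or not x1) reads x0 + (1 - x1) >= 1.
    n = 3
    kb = [[(0, True)],                     # x0
          [(0, False), (1, True)],         # x0 -> x1
          [(1, False), (2, True)]]         # x1 -> x2

    def clause_value(clause, assign):
        return sum(assign[v] if pos else 1 - assign[v] for v, pos in clause)

    def entails(kb, var):
        # KB |= x_var  iff  min x_var over all assignments satisfying every
        # clause inequality is 1. (An inconsistent KB entails everything,
        # which the brute-force minimum handles correctly.)
        best = 1
        for assign in product([0, 1], repeat=n):
            if all(clause_value(c, assign) >= 1 for c in kb):
                best = min(best, assign[var])
        return best == 1

    print(entails(kb, 2))   # True: x0, x0->x1, x1->x2 |= x2
    ```

    A real system would hand the same inequalities to an LP/IP solver instead of enumerating assignments; the encoding is the point.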

  18. Dynamic droop scheme considering effect of intermittent renewable energy source

    DEFF Research Database (Denmark)

    Wang, Yanbo; Chen, Zhe; Deng, Fujin

    2016-01-01

    ... generator are calculated in different wind speed and insolation ranges. Then, piecewise droop relationships between distributed generators are built. Finally, the dynamic droop control is proposed to perform power sharing according to wind speed and sunlight information from local sensors. The dynamic droop controller of each DG unit is activated through a local logic variable inferred from wind speed and solar insolation information. Simulation results are given for validating the droop control scheme. The proposed dynamic droop scheme preserves the advantage of the conventional droop control method, and provides...

  19. Statistical inference via fiducial methods

    NARCIS (Netherlands)

    Salomé, Diemer

    1998-01-01

    In this thesis the attention is restricted to inductive reasoning using a mathematical probability model. A statistical procedure prescribes, for every theoretically possible set of data, the inference about the unknown of interest.

  20. On principles of inductive inference

    OpenAIRE

    Kostecki, Ryszard Paweł

    2011-01-01

    We propose an intersubjective epistemic approach to foundations of probability theory and statistical inference, based on relative entropy and category theory, and aimed to bypass the mathematical and conceptual problems of existing foundational approaches.

  1. On Converting Secret Sharing Scheme to Visual Secret Sharing Scheme

    Directory of Open Access Journals (Sweden)

    Wang Daoshun

    2010-01-01

    Full Text Available Abstract Traditional Secret Sharing (SS) schemes reconstruct the secret exactly the same as the original one but involve complex computation. Visual Secret Sharing (VSS) schemes decode the secret without computation, but each share is m times as big as the original and the quality of the reconstructed secret image is reduced. Probabilistic visual secret sharing (Prob. VSS) schemes for a binary image use only one subpixel to share the secret image; however, the probability of white pixels in a white area is higher than that in a black area in the reconstructed secret image. SS schemes, VSS schemes, and Prob. VSS schemes have various construction methods and advantages. This paper first presents an approach to convert (transform) a -SS scheme to a -VSS scheme for greyscale images. The generation of the shadow images (shares) is based on the Boolean XOR operation. The secret image can be reconstructed directly by performing the Boolean OR operation, as in most conventional VSS schemes. Its pixel expansion is significantly smaller than that of VSS schemes. The quality of the reconstructed images, measured by average contrast, is the same as for VSS schemes. Then a novel matrix-concatenation approach is used to extend the greyscale -SS scheme to the more general case of a greyscale -VSS scheme.
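
    The XOR-based share generation mentioned above can be illustrated with a plain (n, n) XOR secret-sharing toy. This is ordinary byte-level sharing, not the paper's visual construction: every share is individually random, and only XOR-ing all n shares recovers the secret.

    ```python
    import random

    random.seed(42)

    def xor_all(blocks):
        """Byte-wise XOR of equal-length byte strings."""
        out = bytes(len(blocks[0]))
        for blk in blocks:
            out = bytes(a ^ b for a, b in zip(out, blk))
        return out

    def make_shares(secret, n):
        """(n, n) XOR sharing: n - 1 random shares, last = secret XOR the rest."""
        shares = [bytes(random.randrange(256) for _ in secret) for _ in range(n - 1)]
        shares.append(xor_all(shares + [secret]))
        return shares

    def reconstruct(shares):
        return xor_all(shares)

    secret = b"GRN"
    shares = make_shares(secret, 4)
    print(reconstruct(shares))   # b'GRN'
    ```

    Any n - 1 shares are jointly uniform and reveal nothing about the secret; the visual schemes in the paper trade this exact reconstruction for computation-free stacking.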

  2. Type Inference for Guarded Recursive Data Types

    OpenAIRE

    Stuckey, Peter J.; Sulzmann, Martin

    2005-01-01

    We consider type inference for guarded recursive data types (GRDTs) -- a recent generalization of algebraic data types. We reduce type inference for GRDTs to unification under a mixed prefix. Thus, we obtain efficient type inference. Inference is incomplete because the set of type constraints allowed to appear in the type system is only a subset of those type constraints generated by type inference. Hence, inference only succeeds if the program is sufficiently type annotated. We present refin...

  3. Statistical Inference in Graphical Models

    Science.gov (United States)

    2008-06-17

    Probabilistic Network Library (PNL). While not fully mature, PNL does provide the most commonly-used algorithms for inference and learning with the efficiency... of C++, and also offers interfaces for calling the library from MATLAB and R. Notably, both BNT and PNL provide learning and inference algorithms... mature and has been used for research purposes for several years, it is written in MATLAB and thus is not suitable to be used in real-time settings. PNL

  4. Implementing Deep Inference in Tom

    OpenAIRE

    Kahramanogullari, Ozan; Moreau, Pierre-Etienne; Reilles, Antoine

    2005-01-01

    ISSN 1430-211X; The calculus of structures is a proof theoretical formalism which generalizes sequent calculus with the feature of deep inference: in contrast to sequent calculus, the calculus of structures does not rely on the notion of main connective and, like in term rewriting, it permits the application of the inference rules at any depth inside a formula. Tom is a pattern matching processor that integrates term rewriting facilities into imperative languages. In this paper, relying on th...

  5. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framework is composed of a set of language primitives and of an inference engine based on a message-passing system that integrates cutting-edge computational tools, including proximal algorithms and high performance Hamiltonian Markov Chain Monte Carlo techniques. A set of domain-specific highly optimized GPU-accelerated primitives specializes iLang to the spatial data-structures that arise in imaging applications. We illustrate the framework through a challenging application: spatio-temporal tomographic reconstruction with compressive sensing.

  6. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  7. Statistical Inference: The Big Picture.

    Science.gov (United States)

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  8. Abductive inference and delusional belief.

    Science.gov (United States)

    Coltheart, Max; Menzies, Peter; Sutton, John

    2010-01-01

    Delusional beliefs have sometimes been considered as rational inferences from abnormal experiences. We explore this idea in more detail, making the following points. First, the abnormalities of cognition that initially prompt the entertaining of a delusional belief are not always conscious and since we prefer to restrict the term "experience" to consciousness we refer to "abnormal data" rather than "abnormal experience". Second, we argue that in relation to many delusions (we consider seven) one can clearly identify what the abnormal cognitive data are which prompted the delusion and what the neuropsychological impairment is which is responsible for the occurrence of these data; but one can equally clearly point to cases where this impairment is present but delusion is not. So the impairment is not sufficient for delusion to occur: a second cognitive impairment, one that affects the ability to evaluate beliefs, must also be present. Third (and this is the main thrust of our paper), we consider in detail what the nature of the inference is that leads from the abnormal data to the belief. This is not deductive inference and it is not inference by enumerative induction; it is abductive inference. We offer a Bayesian account of abductive inference and apply it to the explanation of delusional belief.
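
    The Bayesian account of abductive inference amounts to scoring candidate explanations of the abnormal data by prior times likelihood and adopting the highest-posterior one. A toy version, with invented hypotheses and numbers:

    ```python
    # Candidate explanations for an abnormal datum D (e.g. an absent autonomic
    # response to familiar faces). Priors and likelihoods are illustrative only.
    hypotheses = {
        "face-processing impairment": {"prior": 0.01, "lik": 0.95},
        "no impairment":              {"prior": 0.99, "lik": 0.001},
    }

    # Posterior over explanations: P(H|D) = P(H) * P(D|H) / P(D).
    evidence = sum(h["prior"] * h["lik"] for h in hypotheses.values())
    posterior = {name: h["prior"] * h["lik"] / evidence
                 for name, h in hypotheses.items()}

    best = max(posterior, key=posterior.get)
    print(best, round(posterior[best], 3))
    ```

    A low-prior hypothesis wins only when it explains the data overwhelmingly better; the paper's second-factor account concerns why belief evaluation fails to apply exactly this kind of check.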

  9. Active inference, communication and hermeneutics.

    Science.gov (United States)

    Friston, Karl J; Frith, Christopher D

    2015-07-01

    Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Reverse flood routing with the inverted Muskingum storage routing scheme

    OpenAIRE

    A. D. Koussis; K. Mazi; S. Lykoudis; Argiriou, A. A.

    2012-01-01

    This work treats reverse flood routing aiming at signal identification: inflows are inferred from observed outflows by orienting the Muskingum scheme against the wave propagation direction. Routing against the wave propagation is an ill-posed, inverse problem (small errors amplify, leading to large spurious responses); therefore, the reverse solution must be smoothness-constrained towards stability and uniqueness (regularised). Theoretical constraints on the coefficients of the reverse routing...
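
    For context, forward Muskingum routing and its naive inversion look as follows. The hydrograph and parameters are invented; with noise-free data the inversion recovers the inflow exactly, and the division by the small coefficient C0 is precisely what amplifies noise and motivates the paper's regularisation.

    ```python
    # Muskingum coefficients from storage constant K, weighting X, time step dt.
    K, X, dt = 2.0, 0.2, 1.0
    den = K * (1 - X) + dt / 2
    C0 = (dt / 2 - K * X) / den
    C1 = (dt / 2 + K * X) / den
    C2 = (K * (1 - X) - dt / 2) / den      # C0 + C1 + C2 == 1

    def route_forward(inflow):
        """Forward routing: O2 = C0*I2 + C1*I1 + C2*O1."""
        out = [inflow[0]]                  # assume an initial steady state
        for I1, I2 in zip(inflow, inflow[1:]):
            out.append(C0 * I2 + C1 * I1 + C2 * out[-1])
        return out

    def route_reverse(outflow):
        """Naive inversion: solve the routing equation for the unknown inflow.
        Dividing by the small C0 amplifies any noise in the outflows."""
        inflow = [outflow[0]]
        for O1, O2 in zip(outflow, outflow[1:]):
            inflow.append((O2 - C1 * inflow[-1] - C2 * O1) / C0)
        return inflow

    inflow = [10.0, 15.0, 25.0, 20.0, 15.0, 12.0, 10.0]
    outflow = route_forward(inflow)
    recovered = route_reverse(outflow)
    print([round(q, 6) for q in recovered])
    ```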

  11. A New Deferred Sentencing Scheme

    Directory of Open Access Journals (Sweden)

    N. K. Chakravarti

    1968-10-01

    Full Text Available A new deferred sentencing scheme resembling a double sampling scheme has been suggested from an operational and administrative viewpoint. It is recommended particularly when the inspection is destructive. The O.C. curves of the scheme for two sample sizes, 5 and 10, have been given.

  12. Bonus schemes and trading activity

    NARCIS (Netherlands)

    Pikulina, E.S.; Renneboog, L.D.R.; ter Horst, J.R.; Tobler, P.N.

    2014-01-01

    Little is known about how different bonus schemes affect traders' propensity to trade and which bonus schemes improve traders' performance. We study the effects of linear versus threshold bonus schemes on traders' behavior. Traders buy and sell shares in an experimental stock market on the basis of

  13. Bonus Schemes and Trading Activity

    NARCIS (Netherlands)

    Pikulina, E.S.; Renneboog, L.D.R.; Ter Horst, J.R.; Tobler, P.N.

    2013-01-01

    Abstract: Little is known about how different bonus schemes affect traders’ propensity to trade and which bonus schemes improve traders’ performance. We study the effects of linear versus threshold (convex) bonus schemes on traders’ behavior. Traders purchase and sell shares in an experimental stock

  15. Two Improved Digital Signature Schemes

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this paper, two improved digital signature schemes are presented based on the design of the directed signature scheme [3]. The peculiarity of the system is that the signature can be authenticated only by the specified recipient. Since the scheme screens some information parameters, the difficulty of deciphering the keys and the security of the digital signature system are increased.

  16. CONSIDERATIONS CONCERNING GUARANTEE SCHEMES

    Directory of Open Access Journals (Sweden)

    EMILIA CLIPICI

    2013-05-01

    Full Text Available When a large withdrawal from banks occurs, customers withdraw their deposits, so banks are likely to go bankrupt because of liquidity problems. There are several mechanisms that allow the banking system to avoid the phenomenon of massive withdrawals from banks. The most effective one is deposit insurance. Deposit insurance is seen primarily as a means of protecting the depositors of credit institutions, and secondly as a means of ensuring the stability of the banking system. This article describes the deposit guarantee scheme in Romania and in other countries.

  18. Locative inferences in medical texts.

    Science.gov (United States)

    Mayer, P S; Bailey, G H; Mayer, R J; Hillis, A; Dvoracek, J E

    1987-06-01

    Medical research relies on epidemiological studies conducted on a large set of clinical records that have been collected from physicians recording individual patient observations. These clinical records are recorded for the purpose of individual care of the patient with little consideration for their use by a biostatistician interested in studying a disease over a large population. Natural language processing of clinical records for epidemiological studies must deal with temporal, locative, and conceptual issues. This makes text understanding and data extraction of clinical records an excellent area for applied research. While much has been done in making temporal or conceptual inferences in medical texts, parallel work in locative inferences has not been done. This paper examines the locative inferences as well as the integration of temporal, locative, and conceptual issues in the clinical record understanding domain by presenting an application that utilizes two key concepts in its parsing strategy--a knowledge-based parsing strategy and a minimal lexicon.

  19. Sick, the spectroscopic inference crank

    CERN Document Server

    Casey, Andrew R

    2016-01-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives which remain severely under-utilised. The lack of reliable open-source tools for analysing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this Article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick can be used to provide a nearest-neighbour estimate of model parameters, a numerically optimised point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalise on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-di...
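
    The nearest-neighbour estimate that such a tool provides can be mimicked on a toy grid: choose the grid spectrum minimising chi-square against the observation. The grid, the parameter name `teff` and the Gaussian-bump "spectra" below are all invented for illustration:

    ```python
    import math

    # Toy grid of model spectra: flux(wavelength) depends on one parameter.
    wavelengths = [i / 10 for i in range(50)]

    def model_flux(teff):
        return [math.exp(-((w - teff / 2000) ** 2)) for w in wavelengths]

    grid = {teff: model_flux(teff) for teff in range(4000, 7001, 250)}

    # "Observed" spectrum drawn from teff = 5600, between two grid nodes.
    obs = model_flux(5600)

    def chi2(model, data):
        return sum((m - d) ** 2 for m, d in zip(model, data))

    # Nearest-neighbour estimate: grid point with minimum chi-square.
    best = min(grid, key=lambda t: chi2(grid[t], obs))
    print(best)   # 5500, the nearest grid node to 5600
    ```

    In practice such a point estimate only seeds the analysis; numerical optimisation or MCMC over an interpolated grid then refines it.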

  20. Secure mediated certificateless signature scheme

    Institute of Scientific and Technical Information of China (English)

    YANG Chen; MA Wen-ping; WANG Xin-mei

    2007-01-01

    Ju et al. proposed a certificateless signature scheme with instantaneous revocation by introducing a security mediator (SEM) mechanism. This article presents a detailed cryptanalysis of this scheme and shows that, in their proposed scheme, once a valid signature has been produced, the signer can recover his private key information and the instantaneous revocation property will be damaged. Furthermore, an improved mediated signature scheme, which can eliminate these disadvantages, is proposed, and a security proof of the improved scheme under the elliptic curve factorization problem (ECFP) assumption and the bilinear computational Diffie-Hellman problem (BCDH) assumption is also provided.

  1. Eight challenges in phylodynamic inference

    Directory of Open Access Journals (Sweden)

    Simon D.W. Frost

    2015-03-01

    Full Text Available The field of phylodynamics, which attempts to enhance our understanding of infectious disease dynamics using pathogen phylogenies, has made great strides in the past decade. Basic epidemiological and evolutionary models are now well characterized with inferential frameworks in place. However, significant challenges remain in extending phylodynamic inference to more complex systems. These challenges include accounting for evolutionary complexities such as changing mutation rates, selection, reassortment, and recombination, as well as epidemiological complexities such as stochastic population dynamics, host population structure, and different patterns at the within-host and between-host scales. An additional challenge exists in making efficient inferences from an ever increasing corpus of sequence data.

  2. Automatic Inference of DATR Theories

    CERN Document Server

    Barg, P

    1996-01-01

    This paper presents an approach for the automatic acquisition of linguistic knowledge from unstructured data. The acquired knowledge is represented in the lexical knowledge representation language DATR. A set of transformation rules that establish inheritance relationships and a default-inference algorithm make up the basic components of the system. Since the overall approach is not restricted to a special domain, the heuristic inference strategy uses criteria to evaluate the quality of a DATR theory, where different domains may require different criteria. The system is applied to the linguistic learning task of German noun inflection.

  3. Perception, illusions and Bayesian inference.

    Science.gov (United States)

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
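
    In the Gaussian case, the prior-plus-likelihood combination at the heart of Bayesian perceptual inference reduces to a precision-weighted average; with a tight prior and noisy data the percept is pulled toward the prior, which is one route to illusion. The "light-from-above" numbers below are purely illustrative:

    ```python
    # Conjugate-Gaussian cue combination: the percept is the precision-weighted
    # average of the prior expectation and the sensory signal.
    def posterior(prior_mean, prior_sd, obs, obs_sd):
        wp = 1 / prior_sd ** 2          # prior precision
        wo = 1 / obs_sd ** 2            # sensory precision
        mean = (wp * prior_mean + wo * obs) / (wp + wo)
        sd = (wp + wo) ** -0.5
        return mean, sd

    # Prior: light comes from above (illumination angle near 90 deg, tight prior).
    # Noisy sensory evidence weakly suggests 40 deg.
    mean, sd = posterior(90.0, 10.0, 40.0, 30.0)
    print(round(mean, 1))   # 85.0: percept pulled strongly toward the prior
    ```

    The same arithmetic solves the well-posed cases "veridically": with precise sensory data (`obs_sd` small) the posterior tracks the stimulus instead of the prior.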

  4. Object-Oriented Type Inference

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

    1991-01-01

    We present a new approach to inferring types in untyped object-oriented programs with inheritance, assignments, and late binding. It guarantees that all messages are understood, annotates the program with type information, allows polymorphic methods, and can be used as the basis of an optimizing...

  5. Inference of asynchronous Boolean network from biological pathways.

    Science.gov (United States)

    Das, Haimabati; Layek, Ritwik Kumar

    2015-01-01

    Gene regulation is a complex process with multiple levels of interactions. In order to describe this complex dynamical system with tractable parameterization, the choice of the dynamical system model is of paramount importance. The right abstraction of the modeling scheme can reduce the complexity in the inference and intervention design, both computationally and experimentally. This article proposes an asynchronous Boolean network framework to capture the transcriptional regulation as well as the protein-protein interactions in a genetic regulatory system. The inference of an asynchronous Boolean network from biological pathway information and experimental evidence is explained using an algorithm. The suitability of this paradigm for the variability of several reaction rates is also discussed. This methodology and model selection open up new research challenges in understanding gene-protein interactive systems in a coherent way and can be beneficial for designing effective therapeutic intervention strategies.
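
    An asynchronous Boolean network update is simple to state in code: one randomly chosen gene updates per step, so different update orders explore different trajectories. The three-gene rules below are invented, not taken from the article:

    ```python
    import random

    random.seed(3)

    # Toy 3-gene network: each gene's next state is a Boolean function of the
    # current state. Genes and rules are hypothetical.
    rules = {
        "A": lambda s: s["C"],                  # C activates A
        "B": lambda s: s["A"] and not s["C"],   # A activates B, C represses it
        "C": lambda s: not s["B"],              # B represses C
    }

    def async_step(state):
        """Asynchronous update: one randomly chosen gene is updated per step."""
        g = random.choice(list(rules))
        nxt = dict(state)
        nxt[g] = rules[g](state)
        return nxt

    state = {"A": False, "B": True, "C": False}
    trajectory = [state]
    for _ in range(12):
        state = async_step(state)
        trajectory.append(state)
    print(state)
    ```

    For these rules {A: True, B: False, C: True} is a fixed point (each rule returns the gene's current value there); synchronous updating of the same rules can instead produce cycles, which is the modelling difference the article exploits.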

  6. Inferring modules from human protein interactome classes

    Directory of Open Access Journals (Sweden)

    Chaurasia Gautam

    2010-07-01

    Full Text Available Abstract Background The integration of protein-protein interaction networks derived from high-throughput screening approaches and complementary sources is a key topic in systems biology. Although integration of protein interaction data is conventionally performed, the effects of this procedure on the results of network analyses have not been examined yet. In particular, in order to optimize the fusion of heterogeneous interaction datasets, it is crucial to consider not only their degree of coverage and accuracy, but also their mutual dependencies and additional salient features. Results We examined this issue based on the analysis of modules detected by network clustering methods applied to both integrated and individual (disaggregated) data sources, which we call interactome classes. Due to class diversity, we deal with variable dependencies of data features arising from structural specificities and biases, but also from possible overlaps. Since highly connected regions of the human interactome may point to potential protein complexes, we have focused on the concept of modularity, and elucidated the detection power of module extraction algorithms by independent validations based on GO, MIPS and KEGG. From the combination of protein interactions with gene expressions, a confidence scoring scheme has been proposed before proceeding via GO with further classification into permanent and transient modules. Conclusions Disaggregated interactomes are shown to be informative for inferring modularity, thus contributing to perform an effective integrative analysis. Validation of the extracted modules by multiple annotation allows for the assessment of confidence measures assigned to the modules in a protein pathway context. Notably, the proposed multilayer confidence scheme can be used for network calibration by enabling a transition from unweighted to weighted interactomes based on biological evidence.

  7. Pretzel scheme for CEPC

    Science.gov (United States)

    Geng, Huiping

    2016-11-01

    CEPC was proposed as an electron and positron collider ring with a circumference of 50-100 km to study the Higgs boson. Since the proposal was made, the lattice design for CEPC has been carried out and a preliminary conceptual design report was written at the end of 2014. In this paper, we will describe the principles of the pretzel scheme design, which is one of the most important issues in the CEPC lattice design. Then, we will show the modification of the lattice based on the lattice design shown in the Pre-CDR. The latest pretzel orbit design result will also be shown. The issues remaining to be solved in the present design will be discussed and a brief summary will be given at the end.

  8. Bayesian Inference for Structured Spike and Slab Priors

    DEFF Research Database (Denmark)

    Andersen, Michael Riis; Winther, Ole; Hansen, Lars Kai

    2014-01-01

    Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint. We propose a novel prior formulation, the structured spike and slab prior, which allows to incorporate a priori knowledge of the sparsity pattern by imposing a spatial...... Gaussian process on the spike and slab probabilities. Thus, prior information on the structure of the sparsity pattern can be encoded using generic covariance functions. Furthermore, we provide a Bayesian inference scheme for the proposed model based on the expectation propagation framework. Using...
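
    The idea of structuring the spike probabilities spatially can be caricatured in a few lines: a smoothed latent field, thresholded, yields a support pattern whose active entries cluster. Here a moving average of white noise stands in for the Gaussian process, and all sizes and thresholds are invented:

    ```python
    import math
    import random

    random.seed(7)

    n, width = 30, 4
    # Smooth latent field: a moving average of white noise, standing in for a
    # draw from a Gaussian process over the spike/slab probabilities.
    white = [random.gauss(0, 1) for _ in range(n + width)]
    latent = [sum(white[i:i + width]) / math.sqrt(width) for i in range(n)]

    # Spike/slab: entry i is nonzero ("slab") when the latent field exceeds a
    # threshold, so active entries appear in contiguous clusters rather than
    # independently at random.
    thresh = 0.8
    support = [g > thresh for g in latent]
    signal = [random.gauss(0, 1) if s else 0.0 for s in support]
    print(sum(support), "active of", n)
    ```

    The paper's contribution is doing inference in the other direction, recovering such structured supports from underdetermined linear measurements via expectation propagation.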

  10. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for Boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....

  12. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  14. On principles of inductive inference

    CERN Document Server

    Kostecki, Ryszard Paweł

    2011-01-01

    We discuss the mathematical and conceptual problems of the main approaches to the foundations of probability theory and statistical inference, and propose a new foundational approach, aimed at improving the mathematical structure of the theory and bypassing the old conceptual problems. In particular, we introduce the intersubjective interpretation of probability, which is designed to deal with the troubles of the `subjective' and `objective' Bayesian interpretations.

  15. Regular inference as vertex coloring

    NARCIS (Netherlands)

    Costa Florêncio, C.; Verwer, S.

    2012-01-01

    This paper is concerned with the problem of supervised learning of deterministic finite state automata, in the technical sense of identification in the limit from complete data, by finding a minimal DFA consistent with the data (regular inference). We solve this problem by translating it in its enti

  16. Type inference for COBOL systems

    NARCIS (Netherlands)

    Deursen, A. van; Moonen, L.M.F.

    1998-01-01

    Types are a good starting point for various software reengineering tasks. Unfortunately, programs requiring reengineering most desperately are written in languages without an adequate type system (such as COBOL). To solve this problem, we propose a method of automated type inference for these lang

  18. Statistical inference on variance components

    NARCIS (Netherlands)

    Verdooren, L.R.

    1988-01-01

    In several sciences but especially in animal and plant breeding, the general mixed model with fixed and random effects plays a great role. Statistical inference on variance components means tests of hypotheses about variance components, constructing confidence intervals for them, estimating them,

  19. Covering, Packing and Logical Inference

    Science.gov (United States)

    1993-10-01

    of Operations Research 43 (1993). [34] *Hooker, J. N., Generalized resolution for 0-1 linear inequalities, Annals of Mathematics and A 16 271-286. [35...Hooker, J. N. and C. Fedjki, Branch-and-cut solution of inference prob- lems in propositional logic, Annals of Mathematics and AI 1 (1990) 123-140. [40

  20. Mathematical Programming and Logical Inference

    Science.gov (United States)

    1990-12-01

    solution of inference problems in propositional logic, to appear in Annals of Mathematics and Al. (271 Howard, R. A., and J. E. Matheson, Influence...1981). (281 Jeroslow, R., and J. Wang, Solving propositional satisfiability problems, to appear in Annals of Mathematics and Al. [29] Nilsson, N. J

  1. An Introduction to Causal Inference

    Science.gov (United States)

    2009-11-02

    legitimize causal inference, has removed causation from its natural habitat, and distorted its face beyond recognition. This exclusivist attitude is...In contrast, when the mediation problem is approached from an exclusivist potential-outcome viewpoint, void of the structural guidance of Eq. (28

  2. Spontaneous evaluative inferences and their relationship to spontaneous trait inferences.

    Science.gov (United States)

    Schneid, Erica D; Carlston, Donal E; Skowronski, John J

    2015-05-01

    Three experiments are reported that explore affectively based spontaneous evaluative impressions (SEIs) of stimulus persons. Experiments 1 and 2 used modified versions of the savings in relearning paradigm (Carlston & Skowronski, 1994) to confirm the occurrence of SEIs, indicating that they are equivalent whether participants are instructed to form trait impressions, evaluative impressions, or neither. These experiments also show that SEIs occur independently of explicit recall for the trait implications of the stimuli. Experiment 3 provides a single dissociation test to distinguish SEIs from spontaneous trait inferences (STIs), showing that disrupting cognitive processing interferes with a trait-based prediction task that presumably reflects STIs, but not with an affectively based social approach task that presumably reflects SEIs. Implications of these findings for the potential independence of spontaneous trait and evaluative inferences, as well as limitations and important steps for future study are discussed. (c) 2015 APA, all rights reserved).

  3. Improved Ternary Subdivision Interpolation Scheme

    Institute of Scientific and Technical Information of China (English)

    WANG Huawei; QIN Kaihuai

    2005-01-01

    An improved ternary subdivision interpolation scheme was developed for computer graphics applications that, unlike the previous ternary scheme, can manipulate open control polygons, with the resulting curve proved to be still C2-continuous. Parameterizations of the limit curve near the two endpoints are given, with expressions for the boundary derivatives. The split-joint problem is handled with the interpolating ternary subdivision scheme. The improved scheme can be used for modeling interpolation curves in computer-aided geometric design systems, and provides a method for joining two limit curves of interpolating ternary subdivisions.

  4. Formal Verification of NTRUEncrypt Scheme

    Directory of Open Access Journals (Sweden)

    Gholam Reza Moghissi

    2016-04-01

    In this paper we explore a mechanized verification of the NTRUEncrypt scheme, with the formal proof system Isabelle/HOL. More precisely, the functional correctness of this algorithm, in its reduced form, is formally verified with computer support. We show that this scheme is correct, which is a necessary condition for the usefulness of any cryptographic encryption scheme. Besides, we present a convenient and application-specific formalization of the NTRUEncrypt scheme in the Isabelle/HOL system that can be used in further study of the functional and security analysis of the NTRUEncrypt family.

  5. AUTISTIC CHILDREN PROTECTION SCHEME

    Directory of Open Access Journals (Sweden)

    Dragan LUKIC

    1998-09-01

    The present article sets forth the theoretical grounds which make the basis for the organizational scheme of social protection for autistic persons. This protection consists of the following forms of work: health services, with the role of early detection and participation in the creation of rehabilitation programs; and social protection, with its programs of work from diagnostics, where the defectologist makes a team together with the physician and the psychologists, to the systems of rehabilitation institutions, where the defectologist carries the main responsibility. The present article underlines two facts, namely: that an autistic person requires to be followed, and every spare moment used, to promote and advance the activities the person commenced himself, instead of having him carry out programs which are beyond his internal motivations and which he finds emotionally inaccessible; and that any form of work organization with autistic persons must subordinate its administrative part to the basic professional requirements this kind of disorder (handicap) sets in front of each professional.

  6. Statistical inference on residual life

    CERN Document Server

    Jeong, Jong-Hyeon

    2014-01-01

    This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.

  7. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability, as has been observed.

  8. Bayesian Inference for Radio Observations

    CERN Document Server

    Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin

    2015-01-01

    (Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...

  9. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  10. Network Inference from Grouped Data

    CERN Document Server

    Zhao, Yunpeng

    2016-01-01

    In medical research, economics, and the social sciences data frequently appear as subsets of a set of objects. Over the past century a number of descriptive statistics have been developed to construct network structure from such data. However, these measures lack a generating mechanism that links the inferred network structure to the observed groups. To address this issue, we propose a model-based approach called the Hub Model which assumes that every observed group has a leader and that the leader has brought together the other members of the group. The performance of Hub Models is demonstrated by simulation studies. We apply this model to infer the relationships among Senators serving in the 110th United States Congress, the characters in a famous 18th century Chinese novel, and the distribution of flora in North America.

  11. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject. Examples drawn from ecology and wildlife research. An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference. Companion website with analyt...

  12. Inferring Centrality from Network Snapshots

    Science.gov (United States)

    Shao, Haibin; Mesbahi, Mehran; Li, Dewei; Xi, Yugeng

    2017-01-01

    The topology and dynamics of a complex network shape its functionality. However, the topologies of many large-scale networks are either unavailable or incomplete. Without the explicit knowledge of network topology, we show how the data generated from the network dynamics can be utilised to infer the tempo centrality, which is proposed to quantify the influence of nodes in a consensus network. We show that the tempo centrality can be used to construct an accurate estimate of both the propagation rate of influence exerted on consensus networks and the Kirchhoff index of the underlying graph. Moreover, the tempo centrality also encodes the disturbance rejection of nodes in a consensus network. Our findings provide an approach to infer the performance of a consensus network from its temporal data. PMID:28098166
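
    The setting can be illustrated with a minimal sketch (my illustration, not the paper's tempo centrality): simulate a consensus network, keep only the temporal snapshots of the node states, and infer the propagation rate from the decay of disagreement, which matches the second-largest eigenvalue of the hidden weight matrix. The path-graph topology and the step size 0.25 are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hidden consensus network: W = I - 0.25*L for a 6-node path graph
n = 6
A = np.diag(np.ones(n - 1), 1)
A = A + A.T                                      # path-graph adjacency
W = np.eye(n) - 0.25 * (np.diag(A.sum(1)) - A)   # x_{k+1} = W @ x_k

# Observe only temporal snapshots of node states; the topology stays hidden
x = rng.normal(size=n)
snaps = []
for _ in range(200):
    x = W @ x
    snaps.append(x)

# Disagreement decays like lambda_2**k, so its ratio reveals the rate
d = [np.linalg.norm(s - s.mean()) for s in snaps]
rate = (d[150] / d[50]) ** (1.0 / 100.0)

lam2 = np.linalg.eigvalsh(W)[-2]   # ground truth, unseen by the estimator
print("rate inferred from snapshots:", rate)
print("second eigenvalue of W:      ", lam2)
```

    The estimator never touches W, only the snapshot sequence, which is the spirit of inferring network performance from temporal data alone.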

  13. Statistical learning and selective inference.

    Science.gov (United States)

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
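
    A tiny simulation (my illustration, not the authors' methodology) makes the cherry-picking point concrete: among many pure-noise features, the single strongest association routinely clears a naive 5% threshold, so the selected effect needs a higher bar.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_features, n_sims = 100, 50, 2000

max_z = np.empty(n_sims)
for s in range(n_sims):
    # Pure noise: no feature is truly associated with the response
    X = rng.normal(size=(n_obs, n_features))
    y = rng.normal(size=n_obs)
    z = X.T @ y / np.sqrt(n_obs)     # approx N(0,1) per feature under the null
    max_z[s] = np.abs(z).max()       # "cherry-pick" the strongest association

print("mean |z| of the selected feature:", max_z.mean())
print("fraction clearing naive |z| > 1.96:", (max_z > 1.96).mean())
```

    With 50 null features the selected statistic exceeds the per-feature 5% cutoff in roughly 1 - 0.95**50, i.e. over 90% of simulations, which is the inflation selective inference corrects for.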

  14. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to estimate the probability that a given party will have representation in the chamber of deputies was developed. Inferences were made in a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
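
    A hedged sketch of this kind of approach: a Dirichlet posterior over true vote shares (from hypothetical poll counts) combined with Monte Carlo simulation of a seat-allocation rule. The D'Hondt divisor method below is an illustrative stand-in for the actual Brazilian quotient rule, and all counts are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

poll = {"A": 420, "B": 330, "C": 180, "D": 70}   # hypothetical poll counts
parties = list(poll)
counts = np.array([poll[p] for p in parties])
seats_total, n_sims = 10, 5000

def dhondt(votes, seats):
    """Allocate seats with D'Hondt divisors (an illustrative stand-in rule)."""
    alloc = np.zeros(len(votes), dtype=int)
    for _ in range(seats):
        alloc[np.argmax(votes / (alloc + 1))] += 1
    return alloc

# Dirichlet(1 + counts): posterior over true vote shares under a uniform prior
got_seat = np.zeros(len(parties))
for _ in range(n_sims):
    shares = rng.dirichlet(1 + counts)
    got_seat += dhondt(shares, seats_total) > 0

for p, pr in zip(parties, got_seat / n_sims):
    print(f"P({p} wins at least one seat) ~ {pr:.3f}")
```

    The point of the Monte Carlo step is that "probability of at least one seat" is a nonlinear function of the vote shares, so it cannot be read off the share estimates directly; it has to be propagated through the allocation rule.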

  15. Causal inference based on counterfactuals

    Directory of Open Access Journals (Sweden)

    Höfler M

    2005-09-01

    Background: The counterfactual or potential outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion: This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in health sciences and relates to many statistical procedures. Summary: Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept.
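
    A toy potential-outcome simulation (my illustration) shows the central point: with a confounder, the naive treated-versus-untreated contrast is biased, while back-door adjustment over the confounder recovers the true causal effect. All parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Confounder U raises both the chance of treatment and the baseline outcome
u = rng.random(n) < 0.5
t = rng.random(n) < np.where(u, 0.8, 0.2)

# Potential outcomes: the true causal effect is exactly +1.0 for everyone
y0 = 2.0 * u + rng.normal(0.0, 0.1, n)
y1 = y0 + 1.0
y = np.where(t, y1, y0)          # only one potential outcome is ever observed

naive = y[t].mean() - y[~t].mean()

# Back-door adjustment: weight stratum-specific contrasts by P(U = v)
adj = sum((y[t & (u == v)].mean() - y[~t & (u == v)].mean()) * (u == v).mean()
          for v in (False, True))

print(f"naive contrast:    {naive:.2f}  (biased upward by confounding)")
print(f"adjusted estimate: {adj:.2f}  (true effect is 1.00)")
```

    The simulation encodes consistency (we observe y1 for the treated, y0 otherwise) and exchangeability within strata of U, which is exactly what the back-door adjustment formula requires.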

  16. Applied statistical inference with MINITAB

    CERN Document Server

    Lesik, Sally

    2009-01-01

    Through clear, step-by-step mathematical calculations, Applied Statistical Inference with MINITAB enables students to gain a solid understanding of how to apply statistical techniques using a statistical software program. It focuses on the concepts of confidence intervals, hypothesis testing, validating model assumptions, and power analysis.Illustrates the techniques and methods using MINITABAfter introducing some common terminology, the author explains how to create simple graphs using MINITAB and how to calculate descriptive statistics using both traditional hand computations and MINITAB. Sh

  17. Security Inference from Noisy Data

    Science.gov (United States)

    2008-04-08

    Junk Mail Samples (JMS)” later) is collected from Hotmail using a different method. JMS is collected from email in inboxes that is reported as spam (or...The data consist of side channel traces from attackers: spam email messages received by Hotmail, one of the largest Web mail services. The basic...similar content and determining the senders of these email messages, one can infer the composition of the botnet. This approach can analyze botnets re

  18. Optimal Inference in Cointegrated Systems

    OpenAIRE

    1988-01-01

    This paper studies the properties of maximum likelihood estimates of co-integrated systems. Alternative formulations of such models are considered including a new triangular system error correction mechanism. It is shown that full system maximum likelihood brings the problem of inference within the family that is covered by the locally asymptotically mixed normal asymptotic theory provided that all unit roots in the system have been eliminated by specification and data transformation. This re...

  20. On Quantum Statistical Inference, II

    OpenAIRE

    Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes and inte...

  1. An introduction to causal inference.

    Science.gov (United States)

    Pearl, Judea

    2010-02-26

    This paper summarizes recent advances in causal inference and underscores the paradigmatic shifts that must be undertaken in moving from traditional statistical analysis to causal analysis of multivariate data. Special emphasis is placed on the assumptions that underlie all causal inferences, the languages used in formulating those assumptions, the conditional nature of all causal and counterfactual claims, and the methods that have been developed for the assessment of such claims. These advances are illustrated using a general theory of causation based on the Structural Causal Model (SCM) described in Pearl (2000a), which subsumes and unifies other approaches to causation, and provides a coherent mathematical foundation for the analysis of causes and counterfactuals. In particular, the paper surveys the development of mathematical tools for inferring (from a combination of data and assumptions) answers to three types of causal queries: those about (1) the effects of potential interventions, (2) probabilities of counterfactuals, and (3) direct and indirect effects (also known as "mediation"). Finally, the paper defines the formal and conceptual relationships between the structural and potential-outcome frameworks and presents tools for a symbiotic analysis that uses the strong features of both. The tools are demonstrated in the analyses of mediation, causes of effects, and probabilities of causation.

  2. EI: A Program for Ecological Inference

    OpenAIRE

    King, Gary

    2004-01-01

    The program EI provides a method of inferring individual behavior from aggregate data. It implements the statistical procedures, diagnostics, and graphics from the book A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data (King 1997). Ecological inference, as traditionally defined, is the process of using aggregate (i.e., “ecological”) data to infer discrete individual-level relationships of interest when individual-level data are not avai...

  4. Chaotic communication scheme with multiplication

    Science.gov (United States)

    Bobreshov, A. M.; Karavaev, A. A.

    2007-05-01

    A new scheme of data transmission with nonlinear admixing is described, in which the two mutually inverse operations (multiplication and division) ensure multiplicative mixing of the informative and chaotic signals that provides a potentially higher degree of security. A special feature of the proposed scheme is the absence of limitations (related to the division by zero) imposed on the types of informative signals.
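
    The multiplicative mixing idea can be sketched as follows, under the idealized assumption that the receiver runs a perfectly synchronized copy of the transmitter's chaos generator (a real scheme must establish synchronization through the channel); the logistic map is an arbitrary illustrative carrier, shifted away from zero so the inverse operation (division) is always defined.

```python
import numpy as np

def logistic_chaos(n, x0=0.4, r=3.99):
    """Chaotic carrier from the logistic map (illustrative generator)."""
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = r * x[k - 1] * (1 - x[k - 1])
    return x + 0.5        # shift into (0.5, 1.5): safe to divide by

n = 500
message = 1.0 + 0.3 * np.sin(2 * np.pi * np.arange(n) / 50)  # informative signal

# Transmitter: multiplicative mixing hides the message in the chaotic carrier
carrier = logistic_chaos(n)
transmitted = carrier * message

# Receiver: an identical, synchronized chaos generator inverts by division
recovered = transmitted / logistic_chaos(n)

print("max recovery error:", np.abs(recovered - message).max())
```

    Multiplication and division are the mutually inverse pair the abstract refers to; an eavesdropper without the synchronized carrier sees only the product, in which the message spectrum is smeared by the broadband chaos.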

  5. Homographic scheme for Riccati equation

    CERN Document Server

    Dubois, François

    2011-01-01

    In this paper we present a numerical scheme for the resolution of the matrix Riccati equation, usually used in control problems. The scheme is unconditionally stable and the solution is positive definite at each time step of the resolution. We prove convergence in the scalar case and present several numerical experiments for classical test cases.
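
    For the scalar Riccati equation, the homographic idea can be illustrated as follows (my sketch, not the paper's matrix construction): the substitution x = -u'/(a*u) linearises the equation, the linear system is propagated exactly over each step, and the induced update on x is a Möbius (homographic) map. On a constant-coefficient test problem the sketch reproduces the exact tanh solution.

```python
import numpy as np

def expm2(A):
    """2x2 matrix exponential via eigendecomposition (assumes A diagonalizable)."""
    w, V = np.linalg.eig(A)
    return ((V * np.exp(w)) @ np.linalg.inv(V)).real

def homographic_step(x, h, a, b, c):
    """One homographic (Moebius) step for the scalar Riccati equation
    x' = a*x**2 + b*x + c.

    Substituting x = -u'/(a*u) yields the linear equation
    u'' - b*u' + a*c*u = 0, which is propagated exactly over the step h;
    the induced update on x is therefore a Moebius transformation.
    """
    M = np.array([[0.0, 1.0], [-a * c, b]])
    u, up = expm2(h * M) @ np.array([1.0, -a * x])
    return -up / (a * u)

# Verification problem: x' = 1 - x**2, x(0) = 0 has exact solution tanh(t)
a, b, c = -1.0, 0.0, 1.0
h, steps = 0.1, 20
x = 0.0
for _ in range(steps):
    x = homographic_step(x, h, a, b, c)
print("scheme:", x, "  exact:", np.tanh(h * steps))
```

    Because the step is a Möbius map induced by a linear flow, qualitative properties of the continuous Riccati flow (here, staying on the exact tanh trajectory) are inherited rather than destroyed, which is the kind of structure preservation the paper's matrix scheme targets.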

  6. Differential operators and automorphism schemes

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The ring of global differential operators of a variety is in a close and deep relation with its automorphism scheme. This relation can be applied to the study of homogeneous schemes, giving some criteria of homogeneity, a generalization of the Serre-Lang theorem, and some consequences about abelian varieties.

  7. Children's schemes for anticipating the validity of nets for solids

    Science.gov (United States)

    Wright, Vince; Smith, Ken

    2017-06-01

    There is growing acknowledgement of the importance of spatial abilities to student achievement across a broad range of domains and disciplines. Nets are one way to connect three-dimensional shapes and their two-dimensional representations and are a common focus of geometry curricula. Thirty-four students at year 6 (upper primary school) were interviewed on two occasions about their anticipation of whether or not given nets for the cube- and square-based pyramid would fold to form the target solid. Vergnaud's (Journal of Mathematical Behavior, 17(2), 167-181, 1998, Human Development, 52, 83-94, 2009) four characteristics of schemes were used as a theoretical lens to analyse the data. Successful schemes depended on the interaction of operational invariants, such as strategic choice of the base, rules for action, particularly rotation of shapes, and anticipations of composites of polygons in the net forming arrangements of faces in the solid. Inferences were rare. These data suggest that students need teacher support to make inferences, in order to create transferable schemes.

  9. Evidence and Inference in Educational Assessment.

    Science.gov (United States)

    1995-02-01

    Educational assessment concerns inference about students’ knowledge, skills, and accomplishments. Because data are never so comprehensive and...techniques can be viewed as applications of more general principles for inference in the presence of uncertainty. Issues of evidence and inference in educational assessment are discussed from this perspective. (AN)

  10. A Self-adaptive Scope Allocation Scheme for Labeling Dynamic XML Documents

    NARCIS (Netherlands)

    Shen, Y.; Feng, L.; Shen, T.; Wang, B.

    2004-01-01

    This paper proposes a self-adaptive scope allocation scheme for labeling dynamic XML documents. It is general, light-weight and can be built upon existing data retrieval mechanisms. Bayesian inference is used to compute the actual scope allocated for labeling a certain node based on both the prior i

  11. The Occupational Pension Schemes Survey 2006

    OpenAIRE

    Sarah Levy; David Miller

    2008-01-01

    Presents findings on the number of schemes, their membership and contributions to schemes by employers and employees. This article presents findings on the number of occupational pension schemes in 2006, their membership and contributions to schemes by employers and employees. It is based on the Occupational Pension Schemes Annual Report (2006 edition). The findings distinguish between public and private sector schemes and include breakdowns by scheme status (open, closed, frozen or winding up)...

  12. Perceptual inference and autistic traits

    DEFF Research Database (Denmark)

    Skewes, Joshua; Jegindø, Else-Marie Elmholdt; Gebauer, Line

    2015-01-01

    Autistic people are better at perceiving details. Major theories explain this in terms of bottom-up sensory mechanisms, or in terms of top-down cognitive biases. Recently, it has become possible to link these theories within a common framework. This framework assumes that perception is implicit...... neural inference, combining sensory evidence with prior perceptual knowledge. Within this framework, perceptual differences may occur because of enhanced precision in how sensory evidence is represented, or because sensory evidence is weighted much higher than prior perceptual knowledge...

  13. Logical inferences in discourse analysis

    Institute of Scientific and Technical Information of China (English)

    刘峰廷

    2014-01-01

    Cohesion and coherence are two important characteristics of discourse. Halliday and Hasan have pointed out that cohesion is the basis of coherence and coherence is the premise of forming discourse. The commonly used cohesive devices are reference, ellipsis, substitution, etc. Discourse coherence is mainly manifested across sentences and paragraphs. However, in real discourse-analysis settings, traditional accounts of cohesion and coherence are not enough. This article first introduces the concept of discourse analysis, then reviews some of the traditional cohesive devices and their uses, then presents a corpus analysis, and finally identifies a further device in textual analysis: discourse logical inference.

  14. SICK: THE SPECTROSCOPIC INFERENCE CRANK

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Andrew R., E-mail: arc@ast.cam.ac.uk [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge, CB3 0HA (United Kingdom)

    2016-03-15

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  15. Universum Inference and Corpus Homogeneity

    Science.gov (United States)

    Vogel, Carl; Lynch, Gerard; Janssen, Jerom

    Universum Inference is re-interpreted for assessment of corpus homogeneity in computational stylometry. Recent stylometric research quantifies strength of characterization within dramatic works by assessing the homogeneity of corpora associated with dramatic personas. A methodological advance is suggested to mitigate the potential for the assessment of homogeneity to be achieved by chance. Baseline comparison analysis is constructed for contributions to debates by nonfictional participants: the corpus analyzed consists of transcripts of US Presidential and Vice-Presidential debates from the 2000 election cycle. The corpus is also analyzed in translation to Italian, Spanish and Portuguese. Adding randomized categories makes assessments of homogeneity more conservative.

  16. Inferring Network Structure from Cascades

    CERN Document Server

    Ghonge, Sushrut

    2016-01-01

    Many physical, biological and social phenomena can be described by cascades taking place on a network. Often, the activity can be empirically observed, but not the underlying network of interactions. In this paper we solve the dynamics of general cascade processes. We then offer three topological inversion methods to infer the structure of any directed network given a set of cascade arrival times. Our forward and inverse formulas hold for a very general class of models where the activation probability of a node is a generic function of its degree and the number of its active neighbors. We report high success rates for synthetic and real networks, for 5 different cascade models.

  17. Schemes for Deterministic Polynomial Factoring

    CERN Document Server

    Ivanyos, Gábor; Saxena, Nitin

    2008-01-01

    In this work we relate the deterministic complexity of factoring polynomials (over finite fields) to certain combinatorial objects we call m-schemes. We extend the known conditional deterministic subexponential time polynomial factoring algorithm for finite fields to get an underlying m-scheme. We demonstrate how the properties of m-schemes relate to improvements in the deterministic complexity of factoring polynomials over finite fields assuming the generalized Riemann Hypothesis (GRH). In particular, we give the first deterministic polynomial time algorithm (assuming GRH) to find a nontrivial factor of a polynomial of prime degree n where (n-1) is a smooth number.

  18. Coordinated renewable energy support schemes

    DEFF Research Database (Denmark)

    Morthorst, P.E.; Jensen, S.G.

    2006-01-01

    This paper illustrates the effect that can be observed when support schemes for renewable energy are regionalised. Two theoretical examples are used to explain interactive effects on, e.g., the price of power, conditions for conventional power producers, and changes in import and export of power...... RES-E support schemes already has a common liberalised power market. In this case the introduction of a common support scheme for renewable technologies will lead to more efficient sitings of renewable plants, improving economic and environmental performance of the total power system...

  19. Provable Secure Identity Based Generalized Signcryption Scheme

    CERN Document Server

    Yu, Gang; Shen, Yong; Han, Wenbao

    2010-01-01

    According to actual needs, a generalized signcryption scheme can flexibly work as an encryption scheme, a signature scheme or a signcryption scheme. In this paper, firstly, we give a security model for identity-based generalized signcryption which is more complete than the existing model. Secondly, we propose an identity-based generalized signcryption scheme. Thirdly, we give the security proof of the new scheme in this complete model. Compared with existing identity-based generalized signcryption schemes, the new scheme has lower implementation complexity. Moreover, the new scheme has computation complexity comparable to that of existing normal signcryption schemes.

  20. Bayesian inference for OPC modeling

    Science.gov (United States)

    Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.

    2016-03-01

    The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI) to reveal champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not and outline continued experiments to vet the method.
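
    The affine invariant ensemble sampler can be sketched with Goodman-Weare stretch moves. The Gaussian log-posterior below is an illustrative stand-in for the lithographic model (whose likelihood and priors are not reproduced in the abstract); all names and values are assumptions for the sketch.

```python
import numpy as np

def log_post(theta):
    # Toy posterior: standard normal in 2D (a stand-in for the real
    # lithographic-model likelihood and priors).
    return -0.5 * np.sum(theta**2)

def aies_sweep(walkers, log_post, a=2.0, rng=None):
    """One stretch-move sweep of an affine-invariant ensemble sampler."""
    if rng is None:
        rng = np.random.default_rng()
    n, d = walkers.shape
    lp = np.array([log_post(w) for w in walkers])
    for k in range(n):
        j = rng.integers(n - 1)
        if j >= k:
            j += 1                                 # complementary walker, j != k
        z = (1 + (a - 1) * rng.random())**2 / a    # z ~ g(z) proportional to 1/sqrt(z)
        prop = walkers[j] + z * (walkers[k] - walkers[j])
        lp_prop = log_post(prop)
        # Acceptance probability includes the z^(d-1) Jacobian factor
        if np.log(rng.random()) < (d - 1) * np.log(z) + lp_prop - lp[k]:
            walkers[k], lp[k] = prop, lp_prop
    return walkers

rng = np.random.default_rng(0)
walkers = rng.normal(size=(32, 2)) * 5             # over-dispersed initialization
for _ in range(500):
    walkers = aies_sweep(walkers, log_post, rng=rng)
print(walkers.mean(axis=0))                        # ensemble concentrates near 0
```

    The stretch move is affine invariant because the proposal is built only from differences between walkers, which is what makes the sampler robust to strongly correlated parameters.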

  1. Dopamine, Affordance and Active Inference

    Science.gov (United States)

    Friston, Karl J.; Shiner, Tamara; FitzGerald, Thomas; Galea, Joseph M.; Adams, Rick; Brown, Harriet; Dolan, Raymond J.; Moran, Rosalyn; Stephan, Klaas Enno; Bestmann, Sven

    2012-01-01

    The role of dopamine in behaviour and decision-making is often cast in terms of reinforcement learning and optimal decision theory. Here, we present an alternative view that frames the physiology of dopamine in terms of Bayes-optimal behaviour. In this account, dopamine controls the precision or salience of (external or internal) cues that engender action. In other words, dopamine balances bottom-up sensory information and top-down prior beliefs when making hierarchical inferences (predictions) about cues that have affordance. In this paper, we focus on the consequences of changing tonic levels of dopamine firing using simulations of cued sequential movements. Crucially, the predictions driving movements are based upon a hierarchical generative model that infers the context in which movements are made. This means that we can confuse agents by changing the context (order) in which cues are presented. These simulations provide a (Bayes-optimal) model of contextual uncertainty and set switching that can be quantified in terms of behavioural and electrophysiological responses. Furthermore, one can simulate dopaminergic lesions (by changing the precision of prediction errors) to produce pathological behaviours that are reminiscent of those seen in neurological disorders such as Parkinson's disease. We use these simulations to demonstrate how a single functional role for dopamine at the synaptic level can manifest in different ways at the behavioural level. PMID:22241972

  2. Dopamine, affordance and active inference.

    Directory of Open Access Journals (Sweden)

    Karl J Friston

    2012-01-01

    The role of dopamine in behaviour and decision-making is often cast in terms of reinforcement learning and optimal decision theory. Here, we present an alternative view that frames the physiology of dopamine in terms of Bayes-optimal behaviour. In this account, dopamine controls the precision or salience of (external or internal) cues that engender action. In other words, dopamine balances bottom-up sensory information and top-down prior beliefs when making hierarchical inferences (predictions) about cues that have affordance. In this paper, we focus on the consequences of changing tonic levels of dopamine firing using simulations of cued sequential movements. Crucially, the predictions driving movements are based upon a hierarchical generative model that infers the context in which movements are made. This means that we can confuse agents by changing the context (order) in which cues are presented. These simulations provide a (Bayes-optimal) model of contextual uncertainty and set switching that can be quantified in terms of behavioural and electrophysiological responses. Furthermore, one can simulate dopaminergic lesions (by changing the precision of prediction errors) to produce pathological behaviours that are reminiscent of those seen in neurological disorders such as Parkinson's disease. We use these simulations to demonstrate how a single functional role for dopamine at the synaptic level can manifest in different ways at the behavioural level.

  3. Underground hydro scheme for Ullapool

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Scottish and Southern Energy has awarded a contract for a new hydropower plant in Scotland, the company's first hydro project since the 1960s. The £6 million scheme will be built by Miller Civil Engineering Services Ltd.

  4. New Ideas on Labeling Schemes

    DEFF Research Database (Denmark)

    Rotbart, Noy Galil

    evaluation of fully dynamic labeling schemes. Due to a connection between adjacency labeling schemes and the graph theoretical study of induced universal graphs, we study these in depth and show novel results for bounded degree graphs and power-law graphs. We also survey and make progress on the related......With ever increasing size of graphs, many distributed graph systems emerged to store, preprocess and analyze them. While such systems ease up congestion on servers, they incur certain penalties compared to centralized data structure. First, the total storage required to store a graph...... in a distributed fashion increases. Second, attempting to answer queries on vertices of a graph stored in a distributed fashion can be significantly more complicated. In order to lay theoretical foundations to the first penalty mentioned a large body of work concentrated on labeling schemes. A labeling scheme...

  5. Capacity-achieving CPM schemes

    CERN Document Server

    Perotti, Alberto; Benedetto, Sergio; Montorsi, Guido

    2008-01-01

    The pragmatic approach to coded continuous-phase modulation (CPM) is proposed as a capacity-achieving low-complexity alternative to the serially-concatenated CPM (SC-CPM) coding scheme. In this paper, we first perform a selection of the best spectrally-efficient CPM modulations to be embedded into SC-CPM schemes. Then, we consider the pragmatic capacity (a.k.a. BICM capacity) of CPM modulations and optimize it through a careful design of the mapping between input bits and CPM waveforms. The schemes so obtained are cascaded with an outer serially-concatenated convolutional code to form a pragmatic coded-modulation system. The resulting schemes exhibit performance very close to the CPM capacity without requiring iterations between the outer decoder and the CPM demodulator. As a result, the receiver exhibits reduced complexity and increased flexibility due to the separation of the demodulation and decoding functions.

  6. Good governance for pension schemes

    CERN Document Server

    Thornton, Paul

    2011-01-01

    Regulatory and market developments have transformed the way in which UK private sector pension schemes operate. This has increased demands on trustees and advisors and the trusteeship governance model must evolve in order to remain fit for purpose. This volume brings together leading practitioners to provide an overview of what today constitutes good governance for pension schemes, from both a legal and a practical perspective. It provides the reader with an appreciation of the distinctive characteristics of UK occupational pension schemes, how they sit within the capital markets and their social and fiduciary responsibilities. Providing a holistic analysis of pension risk, both from the trustee and the corporate perspective, the essays cover the crucial role of the employer covenant, financing and investment risk, developments in longevity risk hedging and insurance de-risking, and best practice scheme administration.

  7. A Novel Iris Segmentation Scheme

    Directory of Open Access Journals (Sweden)

    Chen-Chung Liu

    2014-01-01

    One of the key steps in the iris recognition system is the accurate segmentation of the iris from its surrounding noises, including the pupil, sclera, eyelashes, and eyebrows of a captured eye-image. This paper presents a novel iris segmentation scheme which utilizes the orientation matching transform to outline the outer and inner iris boundaries initially. It then employs Delogne-Kåsa circle fitting (instead of the traditional Hough transform) to further eliminate the outlier points and extract a more precise iris area from an eye-image. In the extracted iris region, the proposed scheme further utilizes the differences in the intensity and positional characteristics of the iris, eyelid, and eyelashes to detect and delete these noises. The scheme is then applied on the iris image database UBIRIS.v1. The experimental results show that the presented scheme provides a more effective and efficient iris segmentation than other conventional methods.
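
    The Delogne-Kåsa step admits a compact sketch: the circle is fit by solving one linear least-squares system, with no Hough accumulator. The synthetic boundary points below are illustrative, not UBIRIS.v1 data.

```python
import numpy as np

def kasa_circle_fit(x, y):
    """Delogne-Kåsa algebraic circle fit: solve the linear least-squares
    system for x^2 + y^2 + a*x + b*y + c = 0, then read off center/radius."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    center = (-a / 2, -b / 2)
    radius = np.sqrt(a**2 / 4 + b**2 / 4 - c)
    return center, radius

# Points on a circle of radius 3 centered at (10, -4), mimicking sampled
# iris-boundary pixels (synthetic, noiseless data).
t = np.linspace(0, 2 * np.pi, 60, endpoint=False)
center, radius = kasa_circle_fit(10 + 3 * np.cos(t), -4 + 3 * np.sin(t))
print(center, radius)   # recovers center (10, -4) and radius 3
```

    Because the fit is linear in (a, b, c), it is far cheaper than a Hough transform, at the cost of a small bias toward smaller radii on noisy, partial arcs.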

  8. An arbitrated quantum signature scheme

    CERN Document Server

    Zeng, G; Zeng, Guihua; Keitel, Christoph H.

    2002-01-01

    The general principle for a quantum signature scheme is proposed and investigated based on ideas from classical signature schemes and quantum cryptography. The suggested algorithm is implemented by a symmetrical quantum key cryptosystem and Greenberger-Horne-Zeilinger (GHZ) triplet states and relies on the availability of an arbitrator. We can guarantee the unconditional security of the algorithm, mostly due to the correlation of the GHZ triplet states and the use of quantum one-time pads.

  9. Breeding schemes in reindeer husbandry

    Directory of Open Access Journals (Sweden)

    Lars Rönnegård

    2003-04-01

    The objective of the paper was to investigate annual genetic gain from selection (G) and the influence of selection on the inbreeding effective population size (Ne) for different possible breeding schemes within a reindeer herding district. The breeding schemes were analysed for different proportions of the population within a herding district included in the selection programme. Two different breeding schemes were analysed: an open nucleus scheme where males mix and mate between owner flocks, and a closed nucleus scheme where the males in non-selected owner flocks are culled to maximise G in the whole population. The theory of expected long-term genetic contributions was used and maternal effects were included in the analyses. Realistic parameter values were used for the population, modelled with 5000 reindeer in the population and a sex ratio of 14 adult females per male. The standard deviation of calf weights was 4.1 kg. Four different situations were explored and the results showed: 1. When the population was randomly culled, Ne equalled 2400. 2. When the whole population was selected on calf weights, Ne equalled 1700 and the total annual genetic gain (direct + maternal) in calf weight was 0.42 kg. 3. For the open nucleus scheme, G increased monotonically from 0 to 0.42 kg as the proportion of the population included in the selection programme increased from 0 to 1.0, and Ne decreased correspondingly from 2400 to 1700. 4. In the closed nucleus scheme the lowest value of Ne was 1300. For a given proportion of the population included in the selection programme, the difference in G between a closed nucleus scheme and an open one was up to 0.13 kg. We conclude that for mass selection based on calf weights in herding districts with 2000 animals or more, there are no risks of inbreeding effects caused by selection.

  10. Fuzzy Deductive Inference Scheme Application in Solving the Problem of Modelling Movements of the Hand Prosthesis

    Directory of Open Access Journals (Sweden)

    Bozhenyuk Alexander

    2015-12-01

    The decision-making model with the basic fuzzy rule modus ponens is suggested in this paper to control a hand prosthesis. The hand movements are described by angles of finger and wrist flexion. The electromyogram (EMG) of hand muscles was used as the source of the input data. Software was developed to implement the decision-making model with the fuzzy rule modus ponens. In particular, the software receives EMG data, executes calculations and visualises the output data. The key advantage of the model is the smoothness of output data changes; this way a maximum approach to natural hand movements is reached.
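
    Generalized (fuzzy) modus ponens is usually realized with Zadeh's compositional rule of inference, which can be sketched in a few lines. The membership grids, rule, and observed EMG values below are illustrative, not the paper's data.

```python
import numpy as np

# Rule: "IF emg IS High THEN flexion IS Large", encoded as a fuzzy
# relation R; an observed fuzzy input A' then yields B' = A' o R (max-min).
emg_high   = np.array([0.0, 0.2, 0.6, 1.0])   # membership of "High EMG"
flex_large = np.array([0.0, 0.5, 1.0])        # membership of "Large flexion"

# Mamdani implication: R(x, y) = min(A(x), B(y))
R = np.minimum.outer(emg_high, flex_large)

observed = np.array([0.1, 0.4, 1.0, 0.3])     # fuzzified EMG reading A'
inferred = np.max(np.minimum(observed[:, None], R), axis=0)  # B' = A' o R
print(inferred)   # → [0.  0.5 0.6]
```

    Because the inferred membership degrades gradually as the input drifts away from the rule antecedent, the output changes smoothly, which is the property the abstract highlights for natural-looking prosthesis movement.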

  11. Provable Secure Identity Based Generalized Signcryption Scheme

    OpenAIRE

    Yu, Gang; Ma, Xiaoxiao; Shen, Yong; Han, Wenbao

    2010-01-01

    According to actual needs, generalized signcryption scheme can flexibly work as an encryption scheme, a signature scheme or a signcryption scheme. In this paper, firstly, we give a security model for identity based generalized signcryption which is more complete than existing model. Secondly, we propose an identity based generalized signcryption scheme. Thirdly, we give the security proof of the new scheme in this complete model. Comparing with existing identity based generalized signcryption...

  12. Lower complexity bounds for lifted inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2015-01-01

    instances of the model. Numerous approaches for such “lifted inference” techniques have been proposed. While it has been demonstrated that these techniques will lead to significantly more efficient inference on some specific models, there are only very recent and still quite restricted results that show...... the feasibility of lifted inference on certain syntactically defined classes of models. Lower complexity bounds that imply some limitations for the feasibility of lifted inference on more expressive model classes were established earlier in Jaeger (2000; Jaeger, M. 2000. On the complexity of inference about...... that under the assumption that NETIME≠ETIME, there is no polynomial lifted inference algorithm for knowledge bases of weighted, quantifier-, and function-free formulas. Further strengthening earlier results, this is also shown to hold for approximate inference and for knowledge bases not containing...

  13. Spontaneous Trait Inferences on Social Media

    Science.gov (United States)

    Utz, Sonja

    2016-01-01

    The present research investigates whether spontaneous trait inferences occur under conditions characteristic of social media and networking sites: nonextreme, ostensibly self-generated content, simultaneous presentation of multiple cues, and self-paced browsing. We used an established measure of trait inferences (false recognition paradigm) and a direct assessment of impressions. Without being asked to do so, participants spontaneously formed impressions of people whose status updates they saw. Our results suggest that trait inferences occurred from nonextreme self-generated content, which is commonly found in social media updates (Experiment 1) and when nine status updates from different people were presented in parallel (Experiment 2). Although inferences did occur during free browsing, the results suggest that participants did not necessarily associate the traits with the corresponding status update authors (Experiment 3). Overall, the findings suggest that spontaneous trait inferences occur on social media. We discuss implications for online communication and research on spontaneous trait inferences. PMID:28123646

  14. Causal inference in public health.

    Science.gov (United States)

    Glass, Thomas A; Goodman, Steven N; Hernán, Miguel A; Samet, Jonathan M

    2013-01-01

    Causal inference has a central role in public health; the determination that an association is causal indicates the possibility for intervention. We review and comment on the long-used guidelines for interpreting evidence as supporting a causal association and contrast them with the potential outcomes framework that encourages thinking in terms of causes that are interventions. We argue that in public health this framework is more suitable, providing an estimate of an action's consequences rather than the less precise notion of a risk factor's causal effect. A variety of modern statistical methods adopt this approach. When an intervention cannot be specified, causal relations can still exist, but how to intervene to change the outcome will be unclear. In application, the often-complex structure of causal processes needs to be acknowledged and appropriate data collected to study them. These newer approaches need to be brought to bear on the increasingly complex public health challenges of our globalized world.

  15. Statistical inference for financial engineering

    CERN Document Server

    Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki

    2014-01-01

    This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.

  16. Polynomial Regressions and Nonsense Inference

    Directory of Open Access Journals (Sweden)

    Daniel Ventosa-Santaulària

    2013-11-01

    Polynomial specifications are widely used, not only in applied economics, but also in epidemiology, physics, political analysis and psychology, just to mention a few examples. In many cases, the data employed to estimate such specifications are time series that may exhibit stochastic nonstationary behavior. We extend Phillips' results (Phillips, P. Understanding spurious regressions in econometrics. J. Econom. 1986, 33, 311–340) by proving that an inference drawn from polynomial specifications, under stochastic nonstationarity, is misleading unless the variables cointegrate. We use a generalized polynomial specification as a vehicle to study its asymptotic and finite-sample properties. Our results, therefore, lead to a call to be cautious whenever practitioners estimate polynomial regressions.
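
    The hazard can be illustrated numerically: regressing one random walk on a polynomial of another, completely independent random walk yields R² values far from zero. This is a Monte Carlo sketch; the sample size, degree, and seeds are arbitrary choices, not the paper's design.

```python
import numpy as np

def spurious_r2(seed, n=500, deg=2):
    """R^2 from regressing one random walk on a polynomial of another,
    independent random walk (both I(1); no true relationship exists)."""
    rng = np.random.default_rng(seed)
    x = np.cumsum(rng.normal(size=n))           # nonstationary regressor
    y = np.cumsum(rng.normal(size=n))           # unrelated nonstationary series
    resid = y - np.polyval(np.polyfit(x, y, deg), x)
    return 1 - resid.var() / y.var()

r2s = [spurious_r2(s) for s in range(200)]
print(np.mean(r2s))   # substantially above zero despite independence
```

    Differencing the series, or testing for cointegration first, removes the effect; the point of the paper is that the polynomial terms do not.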

  17. Mod/Resc Parsimony Inference

    CERN Document Server

    Nor, Igor; Charlat, Sylvain; Engelstadter, Jan; Reuter, Max; Duron, Olivier; Sagot, Marie-France

    2010-01-01

    We address in this paper a new computational biology problem that aims at understanding a mechanism that could potentially be used to genetically manipulate natural insect populations infected by inherited, intra-cellular parasitic bacteria. In this problem, which we denote by Mod/Resc Parsimony Inference, we are given a boolean matrix and the goal is to find two other boolean matrices with a minimum number of columns such that an appropriately defined operation on these matrices gives back the input. We show that this is formally equivalent to the Bipartite Biclique Edge Cover problem and derive some complexity results for our problem using this equivalence. We provide a new fixed-parameter tractability approach for solving both problems that slightly improves upon a previously published algorithm for Bipartite Biclique Edge Cover. Finally, we present experimental results where we applied some of our techniques to a real-life data set.
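
    The "appropriately defined operation" is a boolean matrix product. A minimal sketch checks a hand-made decomposition (the matrices are illustrative); the hard part, minimizing the number of columns k, is exactly the biclique edge cover problem.

```python
import numpy as np

# With boolean matrices A (n x k) and B (m x k), the reconstruction is
# M_ij = OR over k of (A_ik AND B_jk), i.e. a boolean matrix product.
A = np.array([[1, 0],
              [1, 1],
              [0, 1]], dtype=bool)
B = np.array([[1, 0],
              [0, 1]], dtype=bool)

M = (A[:, None, :] & B[None, :, :]).any(axis=2)   # boolean product A . B^T
print(M.astype(int))
```

    Each shared column k marks a biclique (rows with A_ik = 1 versus rows with B_jk = 1), so covering all 1-entries of M with as few columns as possible is covering the edges of a bipartite graph with bicliques.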

  18. Bayesian Inference with Optimal Maps

    CERN Document Server

    Moselhy, Tarek A El

    2011-01-01

    We present a new approach to Bayesian inference that entirely avoids Markov chain simulation, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through solution of an optimization problem, exploiting gradient information from the forward model when possible. The resulting algorithm overcomes many of the computational bottlenecks associated with Markov chain Monte Carlo. Advantages of a map-based representation of the posterior include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent posterior samples without additional likelihood evaluations or forward solves. The optimization approach also provides clear convergence criteria for posterior approximation and facilitates model selectio...
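
    In one dimension the idea can be sketched by optimizing a monotone linear map T(x) = a + b·x that pushes a Gaussian prior toward the posterior. The conjugate toy problem and the grid search below are illustrative stand-ins for the paper's much richer map parameterizations and gradient-based optimization.

```python
import numpy as np

# Toy conjugate problem: prior theta ~ N(0,1), one observation y = 1.0 with
# y | theta ~ N(theta, 1).  The exact posterior is N(0.5, 0.5), so the optimal
# monotone map from prior to posterior is T(x) = 0.5 + sqrt(0.5) * x.
rng = np.random.default_rng(1)
x = rng.normal(size=4000)                  # samples from the prior
y = 1.0

def objective(a, b):
    """Map-based variational objective: E_prior[log p(y, T(x))] + E[log T'(x)].
    Maximizing it minimizes KL(T#prior || posterior)."""
    t = a + b * x                                 # mapped prior samples
    log_joint = -0.5 * t**2 - 0.5 * (t - y)**2    # log prior + log likelihood
    return log_joint.mean() + np.log(b)           # + log-Jacobian of the map

grid_a = np.linspace(0.0, 1.0, 51)
grid_b = np.linspace(0.3, 1.2, 46)
scores = [[objective(a, b) for b in grid_b] for a in grid_a]
ia, ib = np.unravel_index(np.argmax(scores), (51, 46))
print(grid_a[ia], grid_b[ib])              # close to 0.5 and sqrt(0.5) ~ 0.71
```

    Once the map is found, posterior samples are just T applied to fresh prior samples, which is the source of the "no additional likelihood evaluations" advantage described in the abstract.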

  19. Relevance-driven Pragmatic Inferences

    Institute of Scientific and Technical Information of China (English)

    王瑞彪

    2013-01-01

    Relevance theory, an inferential approach to pragmatics, claims that the hearer is expected to pick out the input of optimal relevance from a mass of alternative inputs produced by the speaker in order to interpret the speaker's intentions. The degree of the relevance of an input can be assessed in terms of cognitive effects and processing effort. The input of optimal relevance is the one yielding the greatest positive cognitive effect and requiring the least processing effort. This paper attempts to assess the degrees of relevance of a mass of alternative inputs produced by an imaginary speaker from the perspective of the corresponding hearer, in terms of cognitive effects and processing effort, with a view to justifying the feasibility of the principle of relevance in pragmatic inference.

  20. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose {\\primula}--generated propositional instances have thousands of variables, and whose jointrees have clusters...

  1. Definitive Consensus for Distributed Data Inference

    OpenAIRE

    2011-01-01

    Inference from data is of key importance in many applications of informatics. The current trend in performing such inference from data is to utilise machine learning algorithms. Moreover, in many applications it is either required or preferable to infer from the data in a distributed manner. Many practical difficulties arise from the fact that in many distributed applications we avoid transferring data, or parts of it, due to cost...

  2. Approximate inference on planar graphs using loop calculus and belief propagation

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Laboratory; Gomez, Vicenc [RADBOUD UNIV; Kappen, Hilbert [RADBOUD UNIV

    2009-01-01

    We introduce novel results for approximate inference on planar graphical models using the loop calculus framework. The loop calculus (Chertkov and Chernyak, 2006b) allows one to express the exact partition function Z of a graphical model as a finite sum of terms that can be evaluated once the belief propagation (BP) solution is known. In general, full summation over all correction terms is intractable. We develop an algorithm for the approach presented in Chertkov et al. (2008) which represents an efficient truncation scheme on planar graphs and a new representation of the series in terms of Pfaffians of matrices. We analyze in detail both the loop series and the Pfaffian series for models with binary variables and pairwise interactions, and show that the first term of the Pfaffian series can provide very accurate approximations. The algorithm outperforms previous truncation schemes of the loop series and is competitive with other state-of-the-art methods for approximate inference.

  3. Constraint Processing in Lifted Probabilistic Inference

    CERN Document Server

    Kisynski, Jacek

    2012-01-01

    First-order probabilistic models combine representational power of first-order logic with graphical models. There is an ongoing effort to design lifted inference algorithms for first-order probabilistic models. We analyze lifted inference from the perspective of constraint processing and, through this viewpoint, we analyze and compare existing approaches and expose their advantages and limitations. Our theoretical results show that the wrong choice of constraint processing method can lead to exponential increase in computational complexity. Our empirical tests confirm the importance of constraint processing in lifted inference. This is the first theoretical and empirical study of constraint processing in lifted inference.

  4. Inference Attacks and Control on Database Structures

    Directory of Open Access Journals (Sweden)

    Muhamed Turkanovic

    2015-02-01

    Full Text Available Today’s databases store information with sensitivity levels that range from public to highly sensitive; ensuring confidentiality can therefore be highly important, but it also requires costly control. This paper focuses on the inference problem on different database structures. It presents possible threats to privacy in relation to inference, and control methods for mitigating these threats. The paper shows that using only access control, without any inference control, is inadequate, since these models are unable to protect against indirect data access. Furthermore, it covers new inference problems which arise from the dimensions of new technologies like XML, semantics, etc.

  5. State Sampling Dependence of Hopfield Network Inference

    Institute of Scientific and Technical Information of China (English)

    黄海平

    2012-01-01

    The fully connected Hopfield network is inferred based on observed magnetizations and pairwise correlations. We consider the system in the glassy phase, with low temperature and high memory load. We find that the inference error is very sensitive to the form of state sampling. When a single state is sampled to compute magnetizations and correlations, the inference error is almost indistinguishable irrespective of the sampled state. However, the error can be greatly reduced if the data are collected with state transitions. Our result holds for different disorder samples and accounts for the previously observed large fluctuations of inference error at low temperatures.
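
    As a point of reference for this kind of inverse problem, the standard naive mean-field baseline (not the paper's procedure) reconstructs couplings from the inverse correlation matrix, J_ij ≈ -(C^{-1})_{ij} for i ≠ j. For two spins with zero fields the exact correlation is known in closed form, so the quality of the approximation can be checked directly (the coupling value is illustrative):

```python
import math

J_true = 0.1                  # weak ferromagnetic coupling, zero fields
c = math.tanh(J_true)         # exact pair correlation <s1 s2> for two spins

# Connected-correlation matrix C = [[1, c], [c, 1]].  Its inverse has
# off-diagonal entry -c / (1 - c**2), so naive mean field recovers:
J_nmf = c / (1 - c**2)

# accurate to O(J^3) in the weak-coupling regime
assert abs(J_nmf - J_true) < 1e-2
```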

  6. Multiuser switched diversity scheduling schemes

    KAUST Repository

    Shaqfeh, Mohammad

    2012-09-01

    Multiuser switched-diversity scheduling schemes were recently proposed in order to overcome the heavy feedback requirements of conventional opportunistic scheduling schemes by applying a threshold-based, distributed, and ordered scheduling mechanism. The main idea behind these schemes is that slight reduction in the prospected multiuser diversity gains is an acceptable trade-off for great savings in terms of required channel-state-information feedback messages. In this work, we characterize the achievable rate region of multiuser switched diversity systems and compare it with the rate region of full feedback multiuser diversity systems. We propose also a novel proportional fair multiuser switched-based scheduling scheme and we demonstrate that it can be optimized using a practical and distributed method to obtain the feedback thresholds. We finally demonstrate by numerical examples that switched-diversity scheduling schemes operate within 0.3 bits/sec/Hz from the ultimate network capacity of full feedback systems in Rayleigh fading conditions. © 2012 IEEE.
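
    The feedback-saving trade-off described above can be illustrated with a toy simulation: users are polled in order and the first whose SNR exceeds a threshold is scheduled, versus a full-feedback scheduler that always picks the best user. The user count, threshold, and fading model below are illustrative assumptions, not the paper's system parameters:

```python
import math
import random

random.seed(1)
N, T, thr = 8, 2000, 1.5      # users, time slots, SNR threshold (assumed)

full_rate = sw_rate = feedback = 0.0
for _ in range(T):
    # Rayleigh fading -> exponentially distributed SNR per user
    snr = [random.expovariate(1.0) for _ in range(N)]
    full_rate += math.log2(1 + max(snr))     # full feedback: N messages, best user
    # switched scheme: poll users in order, schedule first above threshold
    # (fall back to the last polled user if none qualifies)
    for k, s in enumerate(snr):
        if s >= thr or k == N - 1:
            sw_rate += math.log2(1 + s)
            feedback += k + 1                # only k+1 feedback messages used
            break

assert feedback / T < N        # far fewer feedback messages on average
assert sw_rate <= full_rate    # slight rate loss is the accepted trade-off
```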

  7. Energy partitioning schemes: a dilemma.

    Science.gov (United States)

    Mayer, I

    2007-01-01

    Two closely related energy partitioning schemes, in which the total energy is presented as a sum of atomic and diatomic contributions by using the "atomic decomposition of identity", are compared on the example of N,N-dimethylformamide, a simple but chemically rich molecule. Both schemes account for different intramolecular interactions, for instance they identify the weak C-H...O intramolecular interactions, but give completely different numbers. (The energy decomposition scheme based on the virial theorem is also considered.) The comparison of the two schemes resulted in a dilemma which is especially striking when these schemes are applied for molecules distorted from their equilibrium structures: one either gets numbers which are "on the chemical scale" and have quite appealing values at the equilibrium molecular geometries, but exhibiting a counter-intuitive distance dependence (the two-center energy components increase in absolute value with the increase of the interatomic distances)--or numbers with too large absolute values but "correct" distance behaviour. The problem is connected with the quick decay of the diatomic kinetic energy components.

  8. Weighting schemes in metabolic graphs for identifying biochemical routes.

    Science.gov (United States)

    Ghosh, S; Baloni, P; Vishveshwara, S; Chandra, N

    2014-03-01

    Metabolism forms an integral part of all cells and its study is important to understand the functioning of the system, to understand alterations that occur in disease state and hence for subsequent applications in drug discovery. Reconstruction of genome-scale metabolic graphs from genomics and other molecular or biochemical data is now feasible. Few methods have also been reported for inferring biochemical pathways from these networks. However, given the large scale and complex inter-connections in the networks, the problem of identifying biochemical routes is not trivial and some questions still remain open. In particular, how a given path is altered in perturbed conditions remains a difficult problem, warranting development of improved methods. Here we report a comparison of 6 different weighting schemes to derive node and edge weights for a metabolic graph, weights reflecting various kinetic, thermodynamic parameters as well as abundances inferred from transcriptome data. Using a network of 50 nodes and 107 edges of carbohydrate metabolism, we show that kinetic parameter derived weighting schemes [Formula: see text] fare best. However, these are limited by their extent of availability, highlighting the usefulness of omics data under such conditions. Interestingly, transcriptome derived weights yield paths with best scores, but are inadequate to discriminate the theoretical paths. The method is tested on a system of Escherichia coli stress response. The approach illustrated here is generic in nature and can be used in the analysis for metabolic network from any species and perhaps more importantly for comparing condition-specific networks.
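
    As a sketch of how the choice of weighting scheme changes the inferred biochemical route, here is Dijkstra's shortest-path search on a toy four-metabolite graph under two hypothetical weight assignments; the node names and weights are invented for illustration, not taken from the paper's carbohydrate network:

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra over a weighted metabolic-style graph given as
    {node: [(neighbor, weight), ...]}; returns (path, total weight)."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, u = [dst], dst
    while u != src:
        u = prev[u]
        path.append(u)
    return list(reversed(path)), dist[dst]

# Two routes A->D; a kinetics-derived scheme favors one, an
# expression-derived scheme favors the other (illustrative numbers).
kinetic_w = {"A": [("B", 1.0), ("C", 5.0)], "B": [("D", 1.0)], "C": [("D", 1.0)]}
expr_w    = {"A": [("B", 4.0), ("C", 1.0)], "B": [("D", 4.0)], "C": [("D", 1.0)]}

assert shortest_path(kinetic_w, "A", "D")[0] == ["A", "B", "D"]
assert shortest_path(expr_w,    "A", "D")[0] == ["A", "C", "D"]
```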

  9. Distance labeling schemes for trees

    DEFF Research Database (Denmark)

    Alstrup, Stephen; Gørtz, Inge Li; Bistrup Halvorsen, Esben

    2016-01-01

    We consider distance labeling schemes for trees: given a tree with n nodes, label the nodes with binary strings such that, given the labels of any two nodes, one can determine, by looking only at the labels, the distance in the tree between the two nodes. A lower bound by Gavoille et al. [Gavoille...... variants such as, for example, small distances in trees [Alstrup et al., SODA, 2003]. We improve the known upper and lower bounds of exact distance labeling by showing that 1/4 log2(n) bits are needed and that 1/2 log2(n) bits are sufficient. We also give (1 + ε)-stretch labeling schemes using Theta......(log(n)) bits for constant ε> 0. (1 + ε)-stretch labeling schemes with polylogarithmic label size have previously been established for doubling dimension graphs by Talwar [Talwar, STOC, 2004]. In addition, we present matching upper and lower bounds for distance labeling for caterpillars, showing that labels...
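
    The scheme itself can be illustrated with a deliberately naive construction: label every node with its root-to-node path, so the distance between two nodes follows from the depth of their common prefix (the lowest common ancestor). This costs O(depth) identifiers per label rather than the near-optimal label sizes discussed above; it is a concept sketch only:

```python
def make_labels(parent):
    """Naive distance labels for a rooted tree: each node's label is
    its root-to-node path of node ids."""
    labels = {}
    for v in parent:
        path, u = [], v
        while u is not None:
            path.append(u)
            u = parent[u]
        labels[v] = list(reversed(path))   # root ... v
    return labels

def dist(lu, lv):
    """Tree distance computed from the two labels alone."""
    k = 0                                   # length of common prefix = LCA depth + 1
    while k < min(len(lu), len(lv)) and lu[k] == lv[k]:
        k += 1
    return (len(lu) - k) + (len(lv) - k)

# tree rooted at 0; edges 0-1, 0-2, 1-3, 1-4
parent = {0: None, 1: 0, 2: 0, 3: 1, 4: 1}
L = make_labels(parent)
assert dist(L[3], L[4]) == 2   # siblings via node 1
assert dist(L[3], L[2]) == 3   # 3 -> 1 -> 0 -> 2
assert dist(L[0], L[4]) == 2   # root to leaf 4
```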

  10. Towards Confirming Neural Circuit Inference from Population Calcium Imaging. NIPS Workshop on Connectivity Inference in Neuroimaging

    OpenAIRE

    NeuroData; Mishchenko, Y.; Packer, A. M.; Machado, T. A.; Yuste, R.; Paninski, L.

    2015-01-01

    Vogelstein JT, Mishchenko Y, Packer AM, Machado TA, Yuste R, Paninski L. Towards Confirming Neural Circuit Inference from Population Calcium Imaging. NIPS Workshop on Connectivity Inference in Neuroimaging, 2009

  11. Protein inference: A protein quantification perspective.

    Science.gov (United States)

    He, Zengyou; Huang, Ting; Liu, Xiaoqing; Zhu, Peijun; Teng, Ben; Deng, Shengchun

    2016-08-01

    In mass spectrometry-based shotgun proteomics, protein quantification and protein identification are two major computational problems. To quantify the protein abundance, a list of proteins must first be inferred from the raw data. Then the relative or absolute protein abundance is estimated with quantification methods, such as spectral counting. Until now, most researchers have been dealing with these two processes separately. In fact, the protein inference problem can be regarded as a special protein quantification problem, in the sense that truly present proteins are those proteins whose abundance values are not zero. Some recently published papers have conceptually discussed this possibility. However, there is still a lack of rigorous experimental studies to test this hypothesis. In this paper, we investigate the feasibility of using protein quantification methods to solve the protein inference problem. Protein inference methods aim to determine whether each candidate protein is present in the sample or not. Protein quantification methods estimate the abundance value of each inferred protein. Naturally, the abundance value of an absent protein should be zero. Thus, we argue that the protein inference problem can be viewed as a special protein quantification problem in which one protein is considered to be present if its abundance is not zero. Based on this idea, our paper tries to use three simple protein quantification methods to solve the protein inference problem effectively. The experimental results on six data sets show that these three methods are competitive with previous protein inference algorithms. This demonstrates that it is plausible to model the protein inference problem as a special protein quantification task, which opens the door to devising more effective protein inference algorithms from a quantification perspective. The source codes of our methods are available at: http://code.google.com/p/protein-inference/.
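
    The paper's idea, reduced to a toy: quantify every candidate protein and declare present exactly those with nonzero abundance. The quantifier below is naive spectral counting with shared-peptide counts split evenly among mapped proteins, an illustrative assumption rather than one of the paper's three methods:

```python
from collections import defaultdict

def infer_by_quantification(peptide_spectra, peptide_to_proteins):
    """Quantify each candidate protein, then call 'present' exactly the
    proteins whose estimated abundance is nonzero."""
    abundance = defaultdict(float)
    for pep, count in peptide_spectra.items():
        prots = peptide_to_proteins[pep]
        for p in prots:
            abundance[p] += count / len(prots)   # split shared-peptide counts
    return {p: a for p, a in abundance.items() if a > 0}

# hypothetical peptide spectral counts and peptide-to-protein mapping
spectra = {"PEPA": 4, "PEPB": 0, "PEPC": 2}
mapping = {"PEPA": ["P1"], "PEPB": ["P2"], "PEPC": ["P1", "P3"]}

present = infer_by_quantification(spectra, mapping)
assert set(present) == {"P1", "P3"}      # P2 has zero abundance -> absent
assert abs(present["P1"] - 5.0) < 1e-9   # 4 unique + half of 2 shared
```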

  12. Learning an Astronomical Catalog of the Visible Universe through Scalable Bayesian Inference

    CERN Document Server

    Regier, Jeffrey; Giordano, Ryan; Thomas, Rollin; Schlegel, David; McAuliffe, Jon; Prabhat,

    2016-01-01

    Celeste is a procedure for inferring astronomical catalogs that attains state-of-the-art scientific results. To date, Celeste has been scaled to at most hundreds of megabytes of astronomical images: Bayesian posterior inference is notoriously demanding computationally. In this paper, we report on a scalable, parallel version of Celeste, suitable for learning catalogs from modern large-scale astronomical datasets. Our algorithmic innovations include a fast numerical optimization routine for Bayesian posterior inference and a statistically efficient scheme for decomposing astronomical optimization problems into subproblems. Our scalable implementation is written entirely in Julia, a new high-level dynamic programming language designed for scientific and numerical computing. We use Julia's high-level constructs for shared and distributed memory parallelism, and demonstrate effective load balancing and efficient scaling on up to 8192 Xeon cores on the NERSC Cori supercomputer.

  13. Electrical Injection Schemes for Nanolasers

    DEFF Research Database (Denmark)

    Lupi, Alexandra; Chung, Il-Sug; Yvind, Kresten

    2014-01-01

    Three electrical injection schemes based on recently demonstrated electrically pumped photonic crystal nanolasers have been numerically investigated: 1) a vertical p-i-n junction through a post structure; 2) a lateral p-i-n junction with a homostructure; and 3) a lateral p-i-n junction....... For this analysis, the properties of different schemes, i.e., electrical resistance, threshold voltage, threshold current, and internal efficiency as energy requirements for optical interconnects are compared and the physics behind the differences is discussed....

  14. Small-scale classification schemes

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2004-01-01

    . While coordination mechanisms focus on how classification schemes enable cooperation among people pursuing a common goal, boundary objects embrace the implicit consequences of classification schemes in situations involving conflicting goals. Moreover, the requirements specification focused on functional...... requirements and provided little information about why these requirements were considered relevant. This stands in contrast to the discussions at the project meetings where the software engineers made frequent use of both abstract goal descriptions and concrete examples to make sense of the requirements....... This difference between the written requirements specification and the oral discussions at the meetings may help explain software engineers’ general preference for people, rather than documents, as their information sources....

  15. Automatic control of biomass gasifiers using fuzzy inference systems

    Energy Technology Data Exchange (ETDEWEB)

    Sagues, C. [Universidad de Zaragoza (Spain). Dpto. de Informatica e Ingenieria de Sistemas; Garcia-Bacaicoa, P.; Serrano, S. [Universidad de Zaragoza (Spain). Dpto. de Ingenieria Quimica y Medio Ambiente

    2007-03-15

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need models to be tuned, a plant model is proposed which has turned out very useful to prove different combinations of membership functions and rules in the proposed fuzzy control. The global control scheme is shown, including the elements to generate the set points for the process variables automatically. There, the type of biomass and its moisture content are the only data which need to be introduced to the controller by a human operator at the beginning of operation to make it work autonomously. The advantages and good performance of the fuzzy controller with the automatic generation of set points, compared to controllers utilising fixed parameters, are demonstrated. (author)

  16. Automatic control of biomass gasifiers using fuzzy inference systems.

    Science.gov (United States)

    Sagüés, C; García-Bacaicoa, P; Serrano, S

    2007-03-01

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need models to be tuned, a plant model is proposed which has turned out very useful to prove different combinations of membership functions and rules in the proposed fuzzy control. The global control scheme is shown, including the elements to generate the set points for the process variables automatically. There, the type of biomass and its moisture content are the only data which need to be introduced to the controller by a human operator at the beginning of operation to make it work autonomously. The advantages and good performance of the fuzzy controller with the automatic generation of set points, compared to controllers utilising fixed parameters, are demonstrated.
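
    A minimal Mamdani-style fuzzy inference step, with triangular membership functions and weighted-average defuzzification, sketches the kind of controller described; the membership ranges, rule outputs, and variable names below are invented for illustration and are not the paper's tuned gasifier controller:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_air_adjust(temp_error):
    """Toy rule set: temperature error (K) -> air-flow change (%)."""
    mu = {
        "neg":  tri(temp_error, -100, -50, 0),
        "zero": tri(temp_error, -50, 0, 50),
        "pos":  tri(temp_error, 0, 50, 100),
    }
    out = {"neg": -10.0, "zero": 0.0, "pos": 10.0}  # rule consequents (singletons)
    num = sum(mu[k] * out[k] for k in mu)           # weighted-average defuzzification
    den = sum(mu.values()) or 1.0
    return num / den

assert fuzzy_air_adjust(0) == 0.0        # on target: no change
assert fuzzy_air_adjust(50) == 10.0      # fully 'pos': maximum correction
assert 0 < fuzzy_air_adjust(25) < 10     # partial firing blends the rules
```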

  17. Smart and easy: Co-occurring activation of spontaneous trait inferences and spontaneous situational inferences

    NARCIS (Netherlands)

    Ham, J.R.C.; Vonk, R.

    2003-01-01

    Social perceivers have been shown to draw spontaneous trait inferences (STI's) about the behavior of an actor as well as spontaneous situational inferences (SSI's) about the situation the actor is in. In two studies, we examined inferences about behaviors that allow for both an STI and an SSI. In

  18. Validating Inductive Hypotheses by Mode Inference

    Institute of Scientific and Technical Information of China (English)

    王志坚

    1993-01-01

    Some criteria based on mode inference for validating inductive hypotheses are presented in this paper. Mode inference is carried out mechanically; thus this kind of validation results in low overhead for consistency checking and high efficiency in performance.

  19. Causal inference in economics and marketing.

    Science.gov (United States)

    Varian, Hal R

    2016-07-05

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual: a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.
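
    A minimal sketch of counterfactual estimation in this spirit: fit separate outcome models for treated and control units, then compare the two predicted potential outcomes for a unit of interest (the "T-learner" pattern). The linear data below are synthetic and noise-free, so the true treatment effect is recovered exactly; real applications add noise, confounding, and richer learners:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# synthetic data: y = 1 + 2*x + 3*treatment, so the true effect is 3
xs = [float(i) for i in range(10)]
treated  = [(x, 1 + 2 * x + 3) for x in xs]
controls = [(x, 1 + 2 * x)     for x in xs]

a1, b1 = fit_line([x for x, _ in treated],  [y for _, y in treated])
a0, b0 = fit_line([x for x, _ in controls], [y for _, y in controls])

# counterfactual comparison at covariate value x = 5
x_new = 5.0
effect = (a1 + b1 * x_new) - (a0 + b0 * x_new)
assert abs(effect - 3.0) < 1e-9
```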

  20. Local and Global Thinking in Statistical Inference

    Science.gov (United States)

    Pratt, Dave; Johnston-Wilder, Peter; Ainley, Janet; Mason, John

    2008-01-01

    In this reflective paper, we explore students' local and global thinking about informal statistical inference through our observations of 10- to 11-year-olds, challenged to infer the unknown configuration of a virtual die, but able to use the die to generate as much data as they felt necessary. We report how they tended to focus on local changes…

  1. The Reasoning behind Informal Statistical Inference

    Science.gov (United States)

    Makar, Katie; Bakker, Arthur; Ben-Zvi, Dani

    2011-01-01

    Informal statistical inference (ISI) has been a frequent focus of recent research in statistics education. Considering the role that context plays in developing ISI calls into question the need to be more explicit about the reasoning that underpins ISI. This paper uses educational literature on informal statistical inference and philosophical…

  2. Forward and backward inference in spatial cognition.

    Directory of Open Access Journals (Sweden)

    Will D Penny

    Full Text Available This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
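
    The step of optimally combining a path-integration estimate with a sensory estimate is, for Gaussian beliefs, precision-weighted averaging; a sketch with made-up means and variances (not values from the paper):

```python
def fuse(mu1, var1, mu2, var2):
    """Optimal (precision-weighted) fusion of two Gaussian location estimates."""
    w1, w2 = 1 / var1, 1 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    return mu, 1 / (w1 + w2)

# path integration says x ~ N(10.0, 4.0); a landmark cue says x ~ N(12.0, 1.0)
mu, var = fuse(10.0, 4.0, 12.0, 1.0)

assert abs(mu - 11.6) < 1e-9   # pulled toward the more precise cue
assert var < 1.0               # fused estimate is more certain than either input
```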

  3. Fiducial inference - A Neyman-Pearson interpretation

    NARCIS (Netherlands)

    Salome, D; VonderLinden, W; Dose,; Fischer, R; Preuss, R

    1999-01-01

    Fisher's fiducial argument is a tool for deriving inferences in the form of a probability distribution on the parameter space, not based on Bayes's Theorem. Lindley established that in exceptional situations fiducial inferences coincide with posterior distributions; in the other situations fiducial

  4. The Design and Implementation of Typed Scheme: From Scripts to Programs

    CERN Document Server

    Tobin-Hochstadt, Sam

    2011-01-01

    When scripts in untyped languages grow into large programs, maintaining them becomes difficult. A lack of explicit type annotations in typical scripting languages forces programmers to (re)discover critical pieces of design information every time they wish to change a program. This analysis step both slows down the maintenance process and may even introduce mistakes due to the violation of undiscovered invariants. This paper presents Typed Scheme, an explicitly typed extension of PLT Scheme, an untyped scripting language. Its type system is based on the novel notion of occurrence typing, which we formalize and mechanically prove sound. The implementation of Typed Scheme additionally borrows elements from a range of approaches, including recursive types, true unions and subtyping, plus polymorphism combined with a modicum of local inference. The formulation of occurrence typing naturally leads to a simple and expressive version of predicates to describe refinement types. A Typed Scheme program can use the...
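
    Occurrence typing refines a value's type along control-flow branches guarded by predicates. The same idea, sketched here in Python with `isinstance` as the predicate (an analogy to illustrate the concept, not Typed Scheme syntax):

```python
from typing import Union

def double(v: Union[int, str]) -> Union[int, str]:
    # Occurrence typing: inside each branch the type of v is narrowed by
    # the guarding predicate, so the branch-specific operation is safe.
    if isinstance(v, int):
        return v * 2      # here v : int
    return v + v          # here v : str

assert double(21) == 42
assert double("ab") == "abab"
```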

  5. Active Inference: A Process Theory.

    Science.gov (United States)

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; Pezzulo, Giovanni

    2017-01-01

    This article describes a process theory based on active inference and belief propagation. Starting from the premise that all neuronal processing (and action selection) can be explained by maximizing Bayesian model evidence-or minimizing variational free energy-we ask whether neuronal responses can be described as a gradient descent on variational free energy. Using a standard (Markov decision process) generative model, we derive the neuronal dynamics implicit in this description and reproduce a remarkable range of well-characterized neuronal phenomena. These include repetition suppression, mismatch negativity, violation responses, place-cell activity, phase precession, theta sequences, theta-gamma coupling, evidence accumulation, race-to-bound dynamics, and transfer of dopamine responses. Furthermore, the (approximately Bayes' optimal) behavior prescribed by these dynamics has a degree of face validity, providing a formal explanation for reward seeking, context learning, and epistemic foraging. Technically, the fact that a gradient descent appears to be a valid description of neuronal activity means that variational free energy is a Lyapunov function for neuronal dynamics, which therefore conform to Hamilton's principle of least action.

  6. Redshift data and statistical inference

    Science.gov (United States)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than statistical theory advises, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.

  7. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  8. EI: A Program for Ecological Inference

    Directory of Open Access Journals (Sweden)

    Gary King

    2004-09-01

    Full Text Available The program EI provides a method of inferring individual behavior from aggregate data. It implements the statistical procedures, diagnostics, and graphics from the book A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data (King 1997). Ecological inference, as traditionally defined, is the process of using aggregate (i.e., "ecological") data to infer discrete individual-level relationships of interest when individual-level data are not available. Ecological inferences are required in political science research when individual-level surveys are unavailable (e.g., local or comparative electoral politics), unreliable (racial politics), insufficient (political geography), or infeasible (political history). They are also required in numerous areas of major significance in public policy (e.g., for applying the Voting Rights Act) and in other academic disciplines ranging from epidemiology and marketing to sociology and quantitative history.
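
    The deterministic core of ecological inference is the method of bounds (Duncan and Davis): the aggregate marginals alone constrain the unknown individual-level quantity. King's EI combines such bounds with a statistical model; the sketch below shows only the bounds step, with illustrative numbers:

```python
def duncan_davis_bounds(x, t):
    """Deterministic bounds on beta_b, the unknown fraction of group-X
    members who vote 'yes', given only aggregates: x = group-X share of
    the population, t = overall 'yes' share.  Follows from
    t = beta_b*x + beta_w*(1 - x) with 0 <= beta_w <= 1."""
    lo = max(0.0, (t - (1 - x)) / x)
    hi = min(1.0, t / x)
    return lo, hi

# a precinct where group X is 50% of voters and 60% voted 'yes'
lo, hi = duncan_davis_bounds(x=0.5, t=0.6)
assert abs(lo - 0.2) < 1e-9   # at least 20% of group X voted 'yes'
assert abs(hi - 1.0) < 1e-9   # the data cannot rule out 100%
```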

  9. On the criticality of inferred models

    CERN Document Server

    Mastromatteo, Iacopo

    2011-01-01

    Advanced inference techniques allow one to reconstruct the pattern of interaction from high-dimensional data sets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to a phase transition. On one hand, we show that the reparameterization-invariant metric in the space of probability distributions of these models (the Fisher information) is directly related to the model's susceptibility. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. On the other hand, this region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time-scales naturally yield models which are close to criticality.

  10. Efficient adaptive fuzzy control scheme

    NARCIS (Netherlands)

    Papp, Z.; Driessen, B.J.F.

    1995-01-01

    The paper presents an adaptive nonlinear (state-) feedback control structure, where the nonlinearities are implemented as smooth fuzzy mappings defined as rule sets. The fine tuning and adaption of the controller is realized by an indirect adaptive scheme, which modifies the parameters of the fuzzy

  11. Homogenization scheme for acoustic metamaterials

    KAUST Repository

    Yang, Min

    2014-02-26

    We present a homogenization scheme for acoustic metamaterials that is based on reproducing the lowest orders of scattering amplitudes from a finite volume of metamaterials. This approach is noted to differ significantly from that of coherent potential approximation, which is based on adjusting the effective-medium parameters to minimize scatterings in the long-wavelength limit. With the aid of metamaterials’ eigenstates, the effective parameters, such as mass density and elastic modulus, can be obtained by matching the surface responses of a metamaterial's structural unit cell with a piece of homogenized material. From Green's theorem applied to the exterior domain problem, matching the surface responses is noted to be the same as reproducing the scattering amplitudes. We verify our scheme by applying it to three different examples: a layered lattice, a two-dimensional hexagonal lattice, and a decorated-membrane system. It is shown that the predicted characteristics and wave fields agree almost exactly with numerical simulations and experiments, and the scheme's validity is constrained by the number of dominant surface multipoles instead of the usual long-wavelength assumption. In particular, the validity extends to the full band in one dimension and to regimes near the boundaries of the Brillouin zone in two dimensions.

  12. Distance labeling schemes for trees

    DEFF Research Database (Denmark)

    Alstrup, Stephen; Gørtz, Inge Li; Bistrup Halvorsen, Esben

    2016-01-01

    We consider distance labeling schemes for trees: given a tree with n nodes, label the nodes with binary strings such that, given the labels of any two nodes, one can determine, by looking only at the labels, the distance in the tree between the two nodes. A lower bound by Gavoille et al. [Gavoill...

  13. Distance labeling schemes for trees

    DEFF Research Database (Denmark)

    Alstrup, Stephen; Gørtz, Inge Li; Bistrup Halvorsen, Esben

    2016-01-01

    variants such as, for example, small distances in trees [Alstrup et al., SODA, 2003]. We improve the known upper and lower bounds of exact distance labeling by showing that 1/4 log2(n) bits are needed and that 1/2 log2(n) bits are sufficient. We also give (1 + ε)-stretch labeling schemes using Theta...

  14. Inference by Minimizing Size, Divergence, or their Sum

    CERN Document Server

    Riedel, Sebastian; McCallum, Andrew

    2012-01-01

    We speed up marginal inference by ignoring factors that do not significantly contribute to overall accuracy. In order to pick a suitable subset of factors to ignore, we propose three schemes: minimizing the number of model factors under a bound on the KL divergence between pruned and full models; minimizing the KL divergence under a bound on factor count; and minimizing the weighted sum of KL divergence and factor count. All three problems are solved using an approximation of the KL divergence that can be calculated in terms of marginals computed on a simple seed graph. Applied to synthetic image denoising and to three different types of NLP parsing models, this technique performs marginal inference up to 11 times faster than loopy BP, with graph sizes reduced by up to 98%, at comparable error in marginals and parsing accuracy. We also show that minimizing the weighted sum of divergence and size is substantially faster than minimizing either of the other objectives based on the approximation to divergence present...
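
    For a fully factorized model the KL divergence between pruned and full models decomposes into a sum of per-factor terms, which makes the "drop factors under a KL budget" idea easy to illustrate. The toy below uses independent binary factors, a deliberate simplification of the paper's graphical-model setting:

```python
import math

def kl_to_uniform(p):
    """KL( Bernoulli(p) || Bernoulli(0.5) ) in nats: the divergence
    incurred by replacing this factor with a uniform (pruned) one."""
    kl = 0.0
    for q in (p, 1 - p):
        if q > 0:
            kl += q * math.log(q / 0.5)
    return kl

def prune(factor_probs, kl_budget):
    """Greedily drop the least-informative independent factors while the
    total divergence between pruned and full models stays within budget."""
    order = sorted(range(len(factor_probs)),
                   key=lambda i: kl_to_uniform(factor_probs[i]))
    dropped, spent = [], 0.0
    for i in order:
        cost = kl_to_uniform(factor_probs[i])
        if spent + cost > kl_budget:
            break
        dropped.append(i)
        spent += cost
    return dropped

probs = [0.5, 0.55, 0.9, 0.99]       # illustrative factor marginals
dropped = prune(probs, kl_budget=0.01)
assert 0 in dropped and 1 in dropped  # near-uniform factors are cheap to drop
assert 3 not in dropped               # highly informative factor is kept
```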

  15. Feedback Message Passing for Inference in Gaussian Graphical Models

    CERN Document Server

    Liu, Ying; Anandkumar, Animashree; Willsky, Alan S

    2011-01-01

    While loopy belief propagation (LBP) performs reasonably well for inference in some Gaussian graphical models with cycles, its performance is unsatisfactory for many others. In particular for some models LBP does not converge, and in general when it does converge, the computed variances are incorrect (except for cycle-free graphs for which belief propagation (BP) is non-iterative and exact). In this paper we propose feedback message passing (FMP), a message-passing algorithm that makes use of a special set of vertices (called a feedback vertex set, or FVS) whose removal results in a cycle-free graph. In FMP, standard BP is employed several times on the cycle-free subgraph excluding the FVS while a special message-passing scheme is used for the nodes in the FVS. The computational complexity of exact inference is O(k^2 n), where k is the number of feedback nodes, and n is the total number of nodes. When the size of the FVS is very large, FMP is intractable. Hence we propose approximat...

  16. Post-model selection inference and model averaging

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2011-07-01

    Full Text Available Although model selection is routinely used in practice nowadays, little is known about its precise effects on any subsequent inference that is carried out. The same goes for the effects induced by the closely related technique of model averaging. This paper is concerned with the use of the same data first to select a model and then to carry out inference, in particular point estimation and point prediction. The properties of the resulting estimator, called a post-model-selection estimator (PMSE), are hard to derive. Using selection criteria such as hypothesis testing, AIC, BIC, HQ and Cp, we illustrate that, in terms of risk function, no single PMSE dominates the others. The same conclusion holds more generally for any penalised likelihood information criterion. We also compare various model averaging schemes and show that no single one dominates the others in terms of risk function. Since PMSEs can be regarded as a special case of model averaging, with 0-1 random weights, we propose a connection between the two theories, in the frequentist approach, by taking account of the selection procedure when performing model averaging. We illustrate the point by simulating a simple linear regression model.

  17. Quantum group blind signature scheme without entanglement

    Science.gov (United States)

    Xu, Rui; Huang, Liusheng; Yang, Wei; He, Libao

    2011-07-01

    In this paper we propose a quantum group blind signature scheme designed for a distributed e-voting system. Our scheme combines the properties of group signatures and blind signatures to provide anonymity of voters in an e-voting system. The unconditional security of our scheme is ensured by quantum mechanics. Without employing entanglement, the proposed scheme is easier to realize than other quantum signature schemes.

  18. Fair Electronic Payment Scheme Based on DSA

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-bin; HONG Fan; ZHU Xian

    2005-01-01

    We present a multi-signature scheme based on DSA and describe a fair electronic payment scheme based on improved DSA signatures. The scheme places both sides in equal positions during the course of an electronic transaction. A Trusted Third Party (TTP) is involved in the scheme to guarantee fairness for both sides; however, the TTP is needed only during registration and dispute resolution, and not during the normal payment stage.

  19. Causal inference in obesity research.

    Science.gov (United States)

    Franks, P W; Atabaki-Pasdar, N

    2017-03-01

    Obesity is a risk factor for a plethora of severe morbidities and premature death. Most supporting evidence comes from observational studies that are prone to chance, bias and confounding. Even data on the protective effects of weight loss from randomized controlled trials will be susceptible to confounding and bias if treatment assignment cannot be masked, which is usually the case with lifestyle and surgical interventions. Thus, whilst obesity is widely considered the major modifiable risk factor for many chronic diseases, its causes and consequences are often difficult to determine. Addressing this is important, as the prevention and treatment of any disease requires that interventions focus on causal risk factors. Disease prediction, although not dependent on knowing the causes, is nevertheless enhanced by such knowledge. Here, we provide an overview of some of the barriers to causal inference in obesity research and discuss analytical approaches, such as Mendelian randomization, that can help to overcome these obstacles. In a systematic review of the literature in this field, we found: (i) probable causal relationships between adiposity and bone health/disease, cancers (colorectal, lung and kidney cancers), cardiometabolic traits (blood pressure, fasting insulin, inflammatory markers and lipids), uric acid concentrations, coronary heart disease and venous thrombosis (in the presence of pulmonary embolism), (ii) possible causal relationships between adiposity and gray matter volume, depression and common mental disorders, oesophageal cancer, macroalbuminuria, end-stage renal disease, diabetic kidney disease, nuclear cataract and gall stone disease, and (iii) no evidence for causal relationships between adiposity and Alzheimer's disease, pancreatic cancer, venous thrombosis (in the absence of pulmonary embolism), liver function and periodontitis.

  20. Fuzzy inference game approach to uncertainty in business decisions and market competitions.

    Science.gov (United States)

    Oderanti, Festus Oluseyi

    2013-01-01

    The increasing challenges and complexity of business environments are making it more difficult for entrepreneurs to predict the outcomes of business decisions and operations. Therefore, we developed a decision support scheme that can be used and adapted for various business decision processes. These involve decisions made under uncertain situations, such as business competition in the market or wage negotiation within a firm. The scheme uses game strategies and fuzzy inference concepts to effectively grasp the variables in these uncertain situations. The games are played between human and fuzzy players. The accuracy of the fuzzy rule base and the game strategies helps to mitigate the adverse effects that a business may suffer from these uncertain factors. We also introduced learning, which enables the fuzzy player to adapt over time. We tested this scheme in different scenarios and found that it can be an invaluable tool in the hands of entrepreneurs operating in uncertain and competitive business environments.

  1. Bayesian inference and life testing plans for generalized exponential distribution

    Institute of Scientific and Technical Information of China (English)

    KUNDU; Debasis; PRADHAN; Biswabrata

    2009-01-01

    Recently, the generalized exponential distribution has received considerable attention. In this paper, we deal with the Bayesian inference of the unknown parameters of the progressively censored generalized exponential distribution. It is assumed that the scale and the shape parameters have independent gamma priors. The Bayes estimates of the unknown parameters cannot be obtained in closed form. Lindley's approximation and an importance sampling technique have been suggested to compute the approximate Bayes estimates. The Markov chain Monte Carlo method has been used to compute the approximate Bayes estimates and also to construct the highest posterior density credible intervals. We also provide different criteria to compare two different sampling schemes and hence to find the optimal sampling schemes. It is observed that finding the optimum censoring procedure is a computationally expensive process, and we have recommended using the sub-optimal censoring procedure, which can be obtained very easily. Monte Carlo simulations are performed to compare the performances of the different methods, and one data analysis has been performed for illustrative purposes.
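
A minimal sketch of the importance-sampling idea mentioned above: draw (shape, scale) pairs from the independent gamma priors, weight each draw by the generalized exponential likelihood, and form posterior means. The data and prior hyperparameters below are invented, and progressive censoring is omitted for brevity:

```python
import math
import random

random.seed(0)

# Generalized exponential density: f(x; a, l) = a*l*exp(-l*x)*(1 - exp(-l*x))**(a-1)
def loglik(data, a, l):
    return sum(math.log(a * l) - l * x + (a - 1) * math.log1p(-math.exp(-l * x))
               for x in data)

data = [0.4, 1.1, 0.7, 2.3, 0.9, 1.6]            # invented observations
prior = lambda: random.gammavariate(2.0, 1.0)    # independent gamma priors on a and l

draws = [(prior(), prior()) for _ in range(20000)]
logw = [loglik(data, a, l) for a, l in draws]
m = max(logw)
w = [math.exp(lw - m) for lw in logw]            # numerically stabilised weights
tot = sum(w)
post_a = sum(wi * a for wi, (a, _) in zip(w, draws)) / tot  # Bayes estimate of shape
post_l = sum(wi * l for wi, (_, l) in zip(w, draws)) / tot  # Bayes estimate of scale
```

Since the prior is the proposal, each weight is simply the likelihood; subtracting the maximum log-weight before exponentiating keeps the ratios stable.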

  2. Reverse flood routing with the inverted Muskingum storage routing scheme

    Directory of Open Access Journals (Sweden)

    A. D. Koussis

    2012-01-01

    Full Text Available This work treats reverse flood routing aiming at signal identification: inflows are inferred from observed outflows by orienting the Muskingum scheme against the wave propagation direction. Routing against the wave propagation is an ill-posed, inverse problem (small errors amplify, leading to large spurious responses); therefore, the reverse solution must be smoothness-constrained towards stability and uniqueness (regularised). Theoretical constraints on the coefficients of the reverse routing scheme assist in error control, but optimal grids are derived by numerical experimentation. Exact solutions of the convection-diffusion equation, for a single and a composite wave, are reverse-routed and in both instances the wave is backtracked well for a range of grid parameters. In the arduous test of a square pulse, the result is comparable to those of more complex methods. Seeding outflow data with random errors enhances instability; to cope with the spurious oscillations, the reversed solution is conditioned by smoothing via low-pass filtering or optimisation. Good-quality inflow hydrographs are recovered with either smoothing treatment, yet the computationally demanding optimisation is superior. Finally, the reverse Muskingum routing method is compared to a reverse-solution method of the St. Venant equations of flood wave motion and is found to perform equally well, at a fraction of the computing effort. This study leads us to conclude that the efficiently attained good inflow identification rests on the simplicity of the Muskingum reverse routing scheme that endows it with numerical robustness.
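
The inverted Muskingum recurrence can be sketched directly. The parameter values and hydrograph below are invented; with exact outflows the inversion is algebraically exact, while any noise in the outflows is amplified by roughly C1/C0 per step, which is why the paper regularises the reverse solution:

```python
# Muskingum coefficients from storage constant K, weighting X and time step dt (invented)
K, X, dt = 1.0, 0.1, 1.0
d = K - K * X + 0.5 * dt
C0 = (0.5 * dt - K * X) / d
C1 = (0.5 * dt + K * X) / d
C2 = (K - K * X - 0.5 * dt) / d

inflow = [10, 20, 50, 80, 60, 40, 25, 15, 10, 10]  # invented inflow hydrograph

# forward routing: O[j] = C0*I[j] + C1*I[j-1] + C2*O[j-1]
out = [float(inflow[0])]
for j in range(1, len(inflow)):
    out.append(C0 * inflow[j] + C1 * inflow[j - 1] + C2 * out[j - 1])

# reverse routing: invert the recurrence to recover the inflows from the outflows
rec = [float(inflow[0])]
for j in range(1, len(out)):
    rec.append((out[j] - C1 * rec[j - 1] - C2 * out[j - 1]) / C0)
```

Dividing by C0 at every step is where the ill-posedness enters: grid choices that make C0 small turn the reverse scheme into an error amplifier.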

  3. Reverse flood routing with the inverted Muskingum storage routing scheme

    Science.gov (United States)

    Koussis, A. D.; Mazi, K.; Lykoudis, S.; Argiriou, A. A.

    2012-01-01

    This work treats reverse flood routing aiming at signal identification: inflows are inferred from observed outflows by orienting the Muskingum scheme against the wave propagation direction. Routing against the wave propagation is an ill-posed, inverse problem (small errors amplify, leading to large spurious responses); therefore, the reverse solution must be smoothness-constrained towards stability and uniqueness (regularised). Theoretical constraints on the coefficients of the reverse routing scheme assist in error control, but optimal grids are derived by numerical experimentation. Exact solutions of the convection-diffusion equation, for a single and a composite wave, are reverse-routed and in both instances the wave is backtracked well for a range of grid parameters. In the arduous test of a square pulse, the result is comparable to those of more complex methods. Seeding outflow data with random errors enhances instability; to cope with the spurious oscillations, the reversed solution is conditioned by smoothing via low-pass filtering or optimisation. Good-quality inflow hydrographs are recovered with either smoothing treatment, yet the computationally demanding optimisation is superior. Finally, the reverse Muskingum routing method is compared to a reverse-solution method of the St. Venant equations of flood wave motion and is found to perform equally well, at a fraction of the computing effort. This study leads us to conclude that the efficiently attained good inflow identification rests on the simplicity of the Muskingum reverse routing scheme that endows it with numerical robustness.

  4. Linguistic Markers of Inference Generation While Reading.

    Science.gov (United States)

    Clinton, Virginia; Carlson, Sarah E; Seipel, Ben

    2016-06-01

    Words can be informative linguistic markers of psychological constructs. The purpose of this study is to examine associations between word use and the process of making meaningful connections to a text while reading (i.e., inference generation). To achieve this purpose, think-aloud data from third- to fifth-grade students ([Formula: see text]) reading narrative texts were hand-coded for inferences. These data were also processed with a computer text analysis tool, Linguistic Inquiry and Word Count, for percentages of word use in the following categories: cognitive mechanism words, nonfluencies, and nine types of function words. Findings indicate that cognitive mechanism words were an independent, positive predictor of connections to background knowledge (i.e., elaborative inference generation) and nonfluencies were an independent, negative predictor of connections within the text (i.e., bridging inference generation). Function words did not provide unique variance toward predicting inference generation. These findings are discussed in the context of a cognitive reflection model and the differences between bridging and elaborative inference generation. In addition, potential practical implications for intelligent tutoring systems and computer-based methods of inference identification are presented.

  5. WEIGHTED COMPACT SCHEME FOR SHOCK CAPTURING

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A new class of finite difference schemes, the weighted compact schemes, is proposed. Following the idea of WENO schemes, the weighted compact scheme is constructed as a combination of the approximations of derivatives on candidate stencils with properly assigned weights, so that the non-oscillatory property is achieved when discontinuities appear. The primitive function reconstruction method of ENO schemes is applied to obtain the conservative form of the weighted compact scheme. This new scheme not only preserves the characteristics of standard compact schemes and achieves high order accuracy and high resolution using a compact stencil, but also can accurately capture shock waves and discontinuities without oscillation. Numerical examples show that the new scheme is very promising and successful.

  6. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference ... by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  7. Inference and the introductory statistics course

    Science.gov (United States)

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-10-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its hypothetical probabilistic reasoning process is examined in some depth. We argue that the revolution in the teaching of inference must begin. We also discuss some perplexing issues, problematic areas and some new insights into language conundrums associated with introducing the logic of inference through randomization methods.
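
A minimal randomization test of the kind such computer-based introductory approaches build on: the group labels are repeatedly reshuffled, and the p-value is the fraction of shuffles that produce a difference in means at least as extreme as the observed one. The two groups' scores are invented for illustration:

```python
import random

random.seed(1)

# invented scores for two hypothetical groups
a = [24, 27, 30, 22, 29, 31, 26]
b = [20, 23, 21, 25, 19, 22, 24]
obs = sum(a) / len(a) - sum(b) / len(b)      # observed difference in means

pool = a + b
trials = 10000
count = 0
for _ in range(trials):
    random.shuffle(pool)                     # re-randomise the group labels
    ra, rb = pool[:len(a)], pool[len(a):]
    if abs(sum(ra) / len(ra) - sum(rb) / len(rb)) >= abs(obs):
        count += 1
p = count / trials                           # two-sided randomisation p-value
```

The inferential logic is visible in the loop itself: if group labels did not matter, a difference this large would arise from relabelling alone only rarely.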

  8. Dual Watermarking Scheme with Encryption

    CERN Document Server

    Dhanalakshmi, R

    2010-01-01

    Digital watermarking is used for copyright protection and authentication. In the proposed system, a dual watermarking scheme based on DWT-SVD with a chaos encryption algorithm is developed to improve robustness and protection along with security. DWT and SVD are used as mathematical tools to embed the watermark in the image. Two watermarks are embedded in the host image: the secondary watermark is embedded into the primary watermark, and the resultant watermarked image is encrypted using a chaos-based logistic map. This provides an efficient and secure way for image encryption and transmission. The watermarked image is decrypted, and a reliable watermark extraction scheme is developed for the extraction of both the primary and the secondary watermark from the distorted image.
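
The chaos-based encryption step can be sketched as a logistic-map keystream XORed over the watermarked image bytes. The map seed and parameter below stand in for the secret key and are purely illustrative:

```python
# keystream from the logistic map x -> r*x*(1-x); x0 and r act as the secret key (invented)
def logistic_keystream(x0, r, n):
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)  # quantise the chaotic orbit to a byte
    return out

msg = b"watermarked image bytes"         # stand-in for the watermarked pixel data
ks = logistic_keystream(0.3141592, 3.99, len(msg))
cipher = bytes(m ^ k for m, k in zip(msg, ks))
plain = bytes(c ^ k for c, k in zip(cipher, ks))  # decryption repeats the XOR
```

Because XOR is its own inverse, the receiver only needs the same (x0, r) key to regenerate the keystream and recover the image.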

  9. Ion Polarization Scheme for MEIC

    CERN Document Server

    Kondratenko, A M; Filatov, Yu N; Derbenev, Ya S; Lin, F; Morozov, V S; Zhang, Y

    2016-01-01

    The choice of a figure-8 shape for the booster and collider rings of MEIC opens wide possibilities for preservation of the ion polarization during beam acceleration as well as for control of the polarization at the collider's interaction points. As in the case of accelerators with Siberian snakes, the spin tune is energy-independent but is equal to zero instead of one half. The figure-8 topology eliminates the effect of the arcs on the spin motion. There appears a unique opportunity to control the polarization of any particle species, including deuterons, using longitudinal fields of small integrated strength (weak solenoids). Contrary to existing schemes, using weak solenoids in figure-8 colliders one can control the polarization at the interaction points without essentially any effect on the beam's orbital characteristics. A universal scheme for control of the polarization using weak solenoids provides an elegant solution to the problem of ion acceleration, completely eliminating resonant beam depolarization. It...

  10. Wrong way recollement for schemes

    OpenAIRE

    Jorgensen, Peter

    2005-01-01

    A recollement of triangulated categories makes it possible to view one such category as being glued together from two others. The prototypical example is that D(X), a suitable derived category of sheaves on the topological space X, has a recollement in terms of D(Z) and D(U) when Z is a closed subset of X and U is the open complement. This note gives a different, "wrong way" recollement in the scheme case.

  11. Parabolic sheaves on logarithmic schemes

    OpenAIRE

    Borne, Niels; Vistoli, Angelo

    2010-01-01

    We show how the natural context for the definition of parabolic sheaves on a scheme is that of logarithmic geometry. The key point is a reformulation of the concept of logarithmic structure in the language of symmetric monoidal categories, which might be of independent interest. Our main result states that parabolic sheaves can be interpreted as quasi-coherent sheaves on certain stacks of roots.

  12. Practical E-Payment Scheme

    Directory of Open Access Journals (Sweden)

    Mohammad Al-Fayoumi

    2010-05-01

    Full Text Available E-payment is now one of the most central research areas in e-commerce, mainly regarding online and offline payment scenarios. Customers are generally passive in e-commerce transactions. Relying on a blind signature, this paper introduces an e-payment protocol in which customers have more initiative and can terminate the transaction before possible cheating, so its security is enhanced. Moreover, the cost of workers and communications falls considerably, while the cost of the trusted authority and of protecting information increases. As there is no trusted authority in the proposed scheme, network overcrowding and conspiracy problems can be avoided. Furthermore, the protocol satisfies fairness and non-repudiation. This helps the merchant and bank to speed up the financial transaction process and to give users instant service at any time. Also, in this paper, we discuss an important e-payment protocol, namely the pay-word scheme, and examine its advantages and limitations, which encouraged the authors to improve the scheme so that it keeps all characteristics intact without compromising security robustness. The suggested protocol employs the idea of a blind signature together with hash chains. We compare the proposed protocol with the pay-word protocol and demonstrate that the proposed protocol offers more security and efficiency, which makes it workable for real-world services.

  13. Cambridge community Optometry Glaucoma Scheme.

    Science.gov (United States)

    Keenan, Jonathan; Shahid, Humma; Bourne, Rupert R; White, Andrew J; Martin, Keith R

    2015-04-01

    With a higher life expectancy, there is an increased demand for hospital glaucoma services in the United Kingdom. The Cambridge community Optometry Glaucoma Scheme (COGS) was initiated in 2010, where new referrals for suspected glaucoma are evaluated by community optometrists with a special interest in glaucoma, with virtual electronic review and validation by a consultant ophthalmologist with a special interest in glaucoma. 1733 patients were evaluated by this scheme between 2010 and 2013. Clinical assessment is performed by the optometrist at a remote site. Goldmann applanation tonometry, pachymetry, monoscopic colour optic disc photographs and automated Humphrey visual field testing are performed. A clinical decision is made as to whether a patient has glaucoma or is a suspect, and the patient is referred on or discharged as a false-positive referral. The clinical findings, optic disc photographs and visual field test results are transmitted electronically for virtual review by a consultant ophthalmologist. The main outcome measure was the number of false-positive referrals from initial referral into the scheme. Of the patients, 46.6% were discharged at assessment and a further 5.7% were discharged following virtual review. Of the patients initially discharged, 2.8% were recalled following virtual review. Following assessment at the hospital, a further 10.5% were discharged after a single visit. The COGS community-based glaucoma screening programme is a safe and effective way of evaluating glaucoma referrals in the community and reducing false-positive referrals for glaucoma into the hospital system. © 2014 Royal Australian and New Zealand College of Ophthalmologists.

  14. A biometric signcryption scheme without bilinear pairing

    Science.gov (United States)

    Wang, Mingwen; Ren, Zhiyuan; Cai, Jun; Zheng, Wentao

    2013-03-01

    How to apply the entropy in biometrics to encryption and remote authentication schemes, so as to simplify the management of keys, is a hot research area. Utilizing Dodis's fuzzy extractor method and Liu's original signcryption scheme, a biometric identity-based signcryption scheme is proposed in this paper. The proposed scheme is more efficient than most previously proposed biometric signcryption schemes because it needs neither bilinear pairing computation nor modular exponentiation computation, both of which are largely time-consuming. The analysis results show that, under the CDH and DL hard problem assumptions, the proposed scheme has the features of confidentiality and unforgeability simultaneously.

  15. Are Evaluations Inferred Directly From Overt Actions?

    Science.gov (United States)

    Brown, Donald; And Others

    1975-01-01

    The operation of a covert information processing mechanism was investigated in two experiments of the self-persuasion phenomena; i. e., making an inference about a stimulus on the basis of one's past behavior. (Editor)

  16. Autonomous forward inference via DNA computing

    Institute of Scientific and Technical Information of China (English)

    Fu Yan; Li Gen; Li Yin; Meng Dazhi

    2007-01-01

    Recent studies direct researchers toward building DNA computing machines with intelligence, which is measured by three main points: being autonomous, being programmable, and being able to learn and adapt. Logical inference plays an important role in programmable information processing or computing. Here we present a new method to perform autonomous molecular forward inference for expert systems. A novel repetitive recognition site (RRS) technique is invented to design rule-molecules in the knowledge base. The inference engine runs autonomously by digesting the rule-molecule, using a class IIB restriction enzyme, PpiI. A concentration model has been built to show the feasibility of the inference process under ideal chemical reaction conditions. Moreover, we extend the model to implement triggering communication between molecular automata, as a further application of the RRS technique.

  17. Inferring AS Relationships from BGP Attributes

    CERN Document Server

    Giotsas, Vasileios

    2011-01-01

    Business relationships between autonomous systems (AS) are crucial for Internet routing. Existing algorithms used heuristics to infer AS relationships from AS topology data. In this paper we propose a different approach to infer AS relationships from more informative data sources, namely the BGP Community and Local Preference attributes. These data contain rich information on AS routing policies and therefore closely reflect AS relationships. We accumulate the BGP data from RouteViews, RIPE RIS and route servers in August 2010 and February 2011. We infer the AS relationships for 39% of links that are visible in our BGP data. They cover the majority of links among the Tier-1 and Tier-2 ASes. The BGP data also allow us to discover special relationship types, namely hybrid relationship, partial-transit relationship, indirect peering relationship and backup links. Finally we evaluate and analyse the problems of the existing inference algorithms.

  18. Bayesian Cosmological inference beyond statistical isotropy

    Science.gov (United States)

    Souradeep, Tarun; Das, Santanu; Wandelt, Benjamin

    2016-10-01

    With the advent of rich data sets, the computational challenge of inference in cosmology has relied on stochastic sampling methods. First, I review the widely used MCMC approach for inferring cosmological parameters and present an adaptive, improved implementation, SCoPE, developed by our group. Next, I present a general method for Bayesian inference of the underlying covariance structure of random fields on a sphere. We employ the Bipolar Spherical Harmonic (BipoSH) representation of general covariance structures on the sphere. We illustrate the efficacy of the method with a principled approach to assess violation of statistical isotropy (SI) in sky maps of Cosmic Microwave Background (CMB) fluctuations. The general, principled approach to Bayesian inference of the covariance structure in a random field on a sphere presented here has huge potential for application to many other aspects of cosmology and astronomy, as well as more distant areas of research such as geosciences and climate modelling.

  19. Metacognitive inferences from other people's memory performance.

    Science.gov (United States)

    Smith, Robert W; Schwarz, Norbert

    2016-09-01

    Three studies show that people draw metacognitive inferences about events from how well others remember the event. Given that memory fades over time, detailed accounts of distant events suggest that the event must have been particularly memorable, for example, because it was extreme. Accordingly, participants inferred that a physical assault (Study 1) or a poor restaurant experience (Studies 2-3) were more extreme when they were well remembered one year rather than one week later. These inferences influence behavioral intentions. For example, participants recommended a more severe punishment for a well-remembered distant rather than recent assault (Study 1). These metacognitive inferences are eliminated when people attribute the reporter's good memory to an irrelevant cause (e.g., photographic memory), thus undermining the informational value of memory performance (Study 3). These studies illuminate how people use lay theories of memory to learn from others' memory performance about characteristics of the world.

  20. Artificial Hydrocarbon Networks Fuzzy Inference System

    Directory of Open Access Journals (Sweden)

    Hiram Ponce

    2013-01-01

    Full Text Available This paper presents a novel fuzzy inference model based on artificial hydrocarbon networks, a computational algorithm for modeling problems based on chemical hydrocarbon compounds. In particular, the proposed fuzzy-molecular inference model (FIM-model) uses molecular units of information to partition the output space in the defuzzification step. Moreover, these molecules are linguistic units that can be partially understood due to the organized structure of the topology and the metadata parameters involved in artificial hydrocarbon networks. In addition, a position controller for a direct current (DC) motor was implemented using the proposed FIM-model in type-1 and type-2 fuzzy inference systems. Experimental results demonstrate that the fuzzy-molecular inference model can be applied as an alternative to type-2 Mamdani fuzzy control systems, because the set of molecular units can deal with the dynamic uncertainties commonly present in real-world control applications.
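
For contrast with the molecular output partitioning described above, a plain type-1 Mamdani inference step (triangular memberships, min implication, max aggregation, centroid defuzzification) might look like the sketch below; the rule base and universes are invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# invented rule base mapping a control error to a control action
error_sets = {"neg": (-2, -1, 0), "zero": (-1, 0, 1), "pos": (0, 1, 2)}
out_sets = {"dec": (-2, -1, 0), "hold": (-1, 0, 1), "inc": (0, 1, 2)}
rules = [("neg", "inc"), ("zero", "hold"), ("pos", "dec")]

def infer(e, res=101):
    # clip each consequent at its rule's firing strength, aggregate by max,
    # then defuzzify with the centroid of the aggregated set
    xs = [-2.0 + 4.0 * i / (res - 1) for i in range(res)]
    agg = [0.0] * res
    for ante, cons in rules:
        fire = tri(e, *error_sets[ante])
        for i, x in enumerate(xs):
            agg[i] = max(agg[i], min(fire, tri(x, *out_sets[cons])))
    num = sum(x * m for x, m in zip(xs, agg))
    den = sum(agg)
    return num / den if den else 0.0
```

With this rule base, a positive error drives a negative (decreasing) action and vice versa, and a zero error defuzzifies to zero by symmetry.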

  1. Experimental evidence for circular inference in schizophrenia

    Science.gov (United States)

    Jardri, Renaud; Duverne, Sandrine; Litvinova, Alexandra S.; Denève, Sophie

    2017-01-01

    Schizophrenia (SCZ) is a complex mental disorder that may result in some combination of hallucinations, delusions and disorganized thinking. Here SCZ patients and healthy controls (CTLs) report their level of confidence on a forced-choice task that manipulated the strength of sensory evidence and prior information. Neither group's responses can be explained by simple Bayesian inference. Rather, individual responses are best captured by a model with different degrees of circular inference. Circular inference refers to a corruption of sensory data by prior information and vice versa, leading us to `see what we expect' (through descending loops), to `expect what we see' (through ascending loops) or both. Ascending loops are stronger for SCZ than CTLs and correlate with the severity of positive symptoms. Descending loops correlate with the severity of negative symptoms. Both loops correlate with disorganized symptoms. The findings suggest that circular inference might mediate the clinical manifestations of SCZ.

  2. An inference engine for embedded diagnostic systems

    Science.gov (United States)

    Fox, Barry R.; Brewster, Larry T.

    1987-01-01

    The implementation of an inference engine for embedded diagnostic systems is described. The system consists of two distinct parts. The first is an off-line compiler which accepts a propositional logic statement of the relationship between facts and conclusions and produces the data structures required by the on-line inference engine. The second part consists of the inference engine and interface routines which accept assertions of fact and return the conclusions that necessarily follow. Given a set of assertions, it will generate exactly the conclusions which logically follow. At the same time, it will detect any inconsistencies which may propagate from an inconsistent set of assertions or a poorly formulated set of rules. The memory requirements are fixed, and the worst-case execution times are bounded at compile time. The data structures and inference algorithms are simple and well understood, and are described in detail. The system has been implemented in Lisp, Pascal, and Modula-2.

  3. Composite likelihood method for inferring local pedigrees

    Science.gov (United States)

    Nielsen, Rasmus

    2017-01-01

    Pedigrees contain information about the genealogical relationships among individuals and are of fundamental importance in many areas of genetic studies. However, pedigrees are often unknown and must be inferred from genetic data. Despite the importance of pedigree inference, existing methods are limited to inferring only close relationships or analyzing a small number of individuals or loci. We present a simulated annealing method for estimating pedigrees in large samples of otherwise seemingly unrelated individuals using genome-wide SNP data. The method supports complex pedigree structures such as polygamous families, multi-generational families, and pedigrees in which many of the member individuals are missing. Computational speed is greatly enhanced by the use of a composite likelihood function which approximates the full likelihood. We validate our method on simulated data and show that it can infer distant relatives more accurately than existing methods. Furthermore, we illustrate the utility of the method on a sample of Greenlandic Inuit. PMID:28827797

  4. Operation of the Bayes Inference Engine

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.

    1998-07-27

    The authors have developed a computer application, called the Bayes Inference Engine (BIE), to enable one to make inferences about models of a physical object from radiographs taken of it. In the BIE, calculational models are represented by a data-flow diagram that can be manipulated by the analyst in a graphical-programming environment. The authors demonstrate the operation of the BIE in terms of examples of two-dimensional tomographic reconstruction, including uncertainty estimation.

  5. Causal inference in economics and marketing

    Science.gov (United States)

    Varian, Hal R.

    2016-01-01

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual—a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference. PMID:27382144

  6. A Quantile-Based Sequential Feedback Scheme via Overhearing in Multicarrier Access Networks

    CERN Document Server

    Baek, Seung Jun

    2010-01-01

    We propose a scheme to reduce the overhead associated with channel state information (CSI) feedback required for opportunistic scheduling in multicarrier access networks. We study the case where CSI is partially overheard by mobiles and one can suppress transmitting CSI reports for time-varying channels of inferior quality. As a means to assess channel quality and exploit multiuser diversity, we adopt maximum quantile (MQ) scheduling. We show that the problem of minimizing the average feedback overhead can be formulated as a Bayesian network problem. A greedy heuristic using probabilistic inference is proposed to deal with the NP-hardness of the problem. Leveraging properties of MQ scheduling, we first show that networks having tree-like overhearing graphs admit simple inference. We then present a class of more general network structures for which exact inference is computationally tractable. Simulation results are provided to demonstrate the improvements offered by the proposed heuristic.
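
    The core suppression idea can be sketched as follows. This is an assumed, simplified protocol, not the paper's exact one: each mobile maps its instantaneous rate to a quantile of its own rate history (the MQ statistic, which normalizes heterogeneous channels), and suppresses its CSI report when an overheard report already carries a higher quantile.

```python
# Quantile-based feedback suppression sketch: mobiles report in some
# fixed order, and each one stays silent if an overheard report already
# beats the quantile of its own current rate.
import bisect

def quantile(history, rate):
    """Empirical quantile of `rate` within this mobile's own rate history."""
    s = sorted(history)
    return bisect.bisect_right(s, rate) / len(s)

def reports(rates, histories):
    """Return the (mobile_id, quantile) reports actually transmitted."""
    best_q, sent = -1.0, []
    for uid, (rate, hist) in enumerate(zip(rates, histories)):
        q = quantile(hist, rate)
        if q > best_q:            # only report if it could win scheduling
            sent.append((uid, q))
            best_q = q
    return sent

# Mobile 0 has a weak channel but is near its own peak (quantile 0.9);
# mobile 1 has a strong channel but is near its own floor (quantile 0.3).
hist_a = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
hist_b = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
sent = reports([9, 30], [hist_a, hist_b])
```

    Note that the mobile with the higher absolute rate is the one suppressed: MQ scheduling compares each user against its own distribution, so overhearing a high quantile is enough to know a report would be wasted.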

  7. Polynomial Chaos Surrogates for Bayesian Inference

    KAUST Repository

    Le Maitre, Olivier

    2016-01-06

    Bayesian inference is a popular probabilistic method for solving inverse problems, such as the identification of a field parameter in a PDE model. The inference relies on Bayes' rule to update the prior density of the sought field from observations and derive its posterior distribution. In most cases the posterior distribution has no explicit form and has to be sampled, for instance using a Markov-chain Monte Carlo method. In practice the prior field parameter is decomposed and truncated (e.g. by means of a Karhunen-Loève decomposition) to recast the inference problem as the inference of a finite number of coordinates. Although proved effective in many situations, Bayesian inference as sketched above faces several difficulties requiring improvements. First, sampling the posterior can be an extremely costly task, as it requires multiple resolutions of the PDE model for different values of the field parameter. Second, when the observations are not very informative, the inferred parameter field can depend strongly on its prior, which can be somewhat arbitrary. These issues have motivated the introduction of reduced models or surrogates for the (approximate) determination of the parametrized PDE solution, and of hyperparameters in the description of the prior field. Our contribution focuses on recent developments in these two directions: the acceleration of posterior sampling by means of Polynomial Chaos expansions, and the efficient treatment of parametrized covariance functions for the prior field. We also discuss the possibility of making such an approach adaptive to further improve its efficiency.
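
    The acceleration idea can be illustrated in one dimension. Everything below is an assumed toy setup (the forward model, prior, and noise level are invented): an "expensive" forward model is fitted once by a cheap polynomial surrogate, and Metropolis sampling then evaluates only the surrogate.

```python
# Surrogate-accelerated Bayesian inference sketch: fit a polynomial
# surrogate to the forward model offline, then run Metropolis sampling
# against the surrogate posterior instead of the expensive model.
import numpy as np

rng = np.random.default_rng(0)

def forward(theta):                # stand-in for an expensive PDE solve
    return np.sin(theta) + 0.1 * theta ** 2

# Offline: fit a degree-5 polynomial surrogate over the prior's range.
train = np.linspace(-3, 3, 50)
coeffs = np.polyfit(train, forward(train), 5)

def surrogate(t):
    return np.polyval(coeffs, t)

# Online: Metropolis on the surrogate posterior for data y = forward(1.0),
# with a standard-normal prior on theta and noise sigma = 0.05.
y_obs, sigma = forward(1.0), 0.05

def log_post(t):
    return -0.5 * ((y_obs - surrogate(t)) / sigma) ** 2 - 0.5 * t ** 2

samples, t = [], 0.0
lp = log_post(t)
for _ in range(5000):
    cand = t + 0.3 * rng.standard_normal()
    lp_c = log_post(cand)
    if np.log(rng.random()) < lp_c - lp:   # Metropolis accept/reject
        t, lp = cand, lp_c
    samples.append(t)
posterior_mean = np.mean(samples[1000:])   # discard burn-in
```

    The surrogate is evaluated thousands of times while the true model is evaluated only on the 50 training points, which is the whole point of the Polynomial Chaos acceleration; the price is the surrogate's approximation error, which here shifts the recovered parameter slightly away from the true value 1.0.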

  8. Algebraic K-theory of generalized schemes

    DEFF Research Database (Denmark)

    Anevski, Stella Victoria Desiree

    Nikolai Durov has developed a generalization of conventional scheme theory in which commutative algebraic monads replace commutative unital rings as the basic algebraic objects. The resulting geometry is expressive enough to encompass conventional scheme theory, tropical algebraic geometry...

  9. Wireless Broadband Access and Accounting Schemes

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In this paper, we propose two wireless broadband access and accounting schemes. In both schemes, the accounting system adopts RADIUS protocol, but the access system adopts SSH and SSL protocols respectively.

  10. Elliptic Curve Blind Digital Signature Schemes

    Institute of Scientific and Technical Information of China (English)

    YOULin; YANGYixian; WENQiaoyan

    2003-01-01

    Blind signature schemes are important cryptographic protocols for guaranteeing the privacy or anonymity of users. Three new blind signature schemes and their corresponding generalizations are proposed, and their security is briefly analyzed.

  12. Secret sharing scheme with inherited characteristic

    Institute of Scientific and Technical Information of China (English)

    Ye Zhenjun; Meng Fanzhen

    2006-01-01

    To ensure that shareholders can appoint "legal" attorneys to renew the secret once a secret sharing scheme has been initialized, a secret sharing scheme with an inherited characteristic is constructed. In this scheme, each shareholder can produce a new share, equivalent to the primary one, using his own algorithm; together with the other shares, the primary secret can be renewed. Since the scheme does not work by replacing the primary share with a new share produced by the dealer in a primitive secret sharing scheme, no matter how many shares a shareholder produces, these shares cannot be gathered together to renew the secret. Compared with existing secret sharing schemes, this scheme gives the shareholders more agility by investing each of them with a function, without affecting security.

  13. Mood Inference Machine: Framework to Infer Affective Phenomena in ROODA Virtual Learning Environment

    Directory of Open Access Journals (Sweden)

    Magalí Teresinha Longhi

    2012-02-01

    This article presents a mechanism to infer mood states, aiming to provide virtual learning environments (VLEs) with a tool able to recognize the student's motivation. The inference model takes as its parameters personality traits, motivational factors obtained through behavioral patterns, and the affective subjectivity identified in texts made available in the communication functionalities of the VLE. In the inference machine, these variables are treated with probabilistic reasoning, more precisely Bayesian networks.
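
    The probabilistic-reasoning step can be illustrated with a toy Bayesian network queried by enumeration. The nodes and conditional probabilities below are invented for illustration; they are not ROODA's actual model.

```python
# Toy mood-inference Bayesian network: personality -> mood <- text_affect.
# Querying P(mood | text_affect) means summing the unobserved
# personality node out of the joint distribution.
P_personality = {'extrovert': 0.6, 'introvert': 0.4}

# Conditional probability table: P(mood='motivated' | personality, affect).
P_motivated = {('extrovert', 'positive'): 0.9,
               ('extrovert', 'negative'): 0.5,
               ('introvert', 'positive'): 0.7,
               ('introvert', 'negative'): 0.2}

def p_motivated_given_affect(affect):
    """P(mood=motivated | text_affect=affect), marginalizing personality."""
    return sum(P_personality[p] * P_motivated[(p, affect)]
               for p in P_personality)

p = p_motivated_given_affect('negative')   # 0.6*0.5 + 0.4*0.2 = 0.38
```

    A real deployment would condition on many more observed variables (forum posts, access patterns), but the inference machinery is the same sum-out-the-hidden-nodes computation.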

  14. Linear multi-secret sharing schemes

    Institute of Scientific and Technical Information of China (English)

    XIAO Liangliang; LIU Mulan

    2005-01-01

    In this paper, linear multi-secret sharing schemes are studied by using monotone span programs. A relation between computing monotone Boolean functions with monotone span programs and realizing multi-access structures with linear multi-secret sharing schemes is shown. Furthermore, the concept of an optimal linear multi-secret sharing scheme is presented, and several schemes are proved to be optimal.

  15. Improvement of publicly verifiable authenticated encryption scheme

    Institute of Scientific and Technical Information of China (English)

    LEI Fei-yu; CHEN Wen; MA Chang-she; CHEN Ke-fei

    2007-01-01

    A weakness in unforgeability is found in the Ma and Chen scheme, and the root cause is the susceptible linear design of the scheme. To avoid the weakness and the susceptible linear design, an improvement based on two mechanisms, quadratic residues and composite discrete logarithms, is proposed, which defeats the forgery attacks on the Ma and Chen scheme. The new scheme retains good confidentiality, public verifiability and efficiency.

  16. A massive momentum-subtraction scheme

    CERN Document Server

    Boyle, Peter; Khamseh, Ava

    2016-01-01

    A new renormalization scheme is defined for fermion bilinears in QCD at non-vanishing quark masses. This new scheme, denoted RI/mSMOM, preserves the benefits of the non-exceptional momenta introduced in the RI/SMOM scheme, and allows a definition of renormalized composite fields away from the chiral limit. Some properties of the scheme are investigated by performing an explicit one-loop computation in dimensional regularization.

  17. MIRD radionuclide data and decay schemes

    CERN Document Server

    Eckerman, Keith F

    2007-01-01

    For all physicians, scientists, and physicists working in the nuclear medicine field, the updated edition of MIRD: Radionuclide Data and Decay Schemes is an essential sourcebook for radiation dosimetry and understanding the properties of radionuclides. Contents: decay schemes listed by atomic number; radioactive decay processes; serial decay schemes; decay schemes and decay tables. This essential reference for nuclear medicine physicians, scientists and physicists also includes a CD with tabulations of the radionuclide data necessary for dosimetry calculations.

  18. Comparative study of numerical schemes of TVD3, UNO3-ACM and optimized compact scheme

    Science.gov (United States)

    Lee, Duck-Joo; Hwang, Chang-Jeon; Ko, Duck-Kon; Kim, Jae-Wook

    1995-01-01

    Three different schemes are employed to solve the benchmark problem. The first is a conventional TVD-MUSCL (Monotone Upwind Schemes for Conservation Laws) scheme. The second is a UNO3-ACM (Uniformly Non-Oscillatory Artificial Compression Method) scheme. The third is an optimized compact finite difference scheme modified by us: 4th-order Runge-Kutta time stepping, and 4th-order pentadiagonal compact spatial discretization with maximum resolution characteristics. The problems of category 1 are solved by using the second (UNO3-ACM) and third (Optimized Compact) schemes. The problems of category 2 are solved by using the first (TVD3) and second (UNO3-ACM) schemes. The problem of category 5 is solved by using the first (TVD3) scheme. It can be concluded from the present calculations that the Optimized Compact scheme and the UNO3-ACM scheme show good resolution for category 1 and category 2, respectively.

  19. Population Monotonic Path Schemes for Simple Games

    NARCIS (Netherlands)

    Ciftci, B.B.; Borm, P.E.M.; Hamers, H.J.M.

    2006-01-01

    A path scheme for a simple game is composed of a path, i.e., a sequence of coalitions that is formed during the coalition formation process and a scheme, i.e., a payoff vector for each coalition in the path.A path scheme is called population monotonic if a player's payoff does not decrease as the pa

  20. A new semi-Lagrangian difference scheme

    Institute of Scientific and Technical Information of China (English)

    季仲贞; 陈嘉滨

    2001-01-01

    A new completely energy-conserving semi-Lagrangian scheme is constructed. The numerical solution of the shallow water equations shows that this conservative scheme preserves the total energy to twelve significant digits, while the traditional scheme does so only to five.

  1. Arbitrated quantum signature scheme with message recovery

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hwayean; Hong, Changho; Kim, Hyunsang; Lim, Jongin; Yang, Hyung Jin

    2004-02-16

    Two quantum signature schemes with message recovery, relying on the availability of an arbitrator, are proposed. One scheme uses a public board and the other does not. Both schemes provide confidentiality of the message and higher efficiency in transmission.

  2. Quantum Signature Scheme with Weak Arbitrator

    Science.gov (United States)

    Luo, Ming-Xing; Chen, Xiu-Bo; Yun, Deng; Yang, Yi-Xian

    2012-07-01

    In this paper, we propose a quantum signature scheme with a weak arbitrator to sign classical messages. The scheme preserves the merits of the original arbitrated scheme with some entanglement resources, provides higher transmission efficiency, and reduces the complexity of implementation. The arbitrator is costless and is only involved in case of disagreement.

  3. Current terminology and diagnostic classification schemes.

    Science.gov (United States)

    Okeson, J P

    1997-01-01

    This article reviews the current terminology and classification schemes available for temporomandibular disorders. The origin of each term is presented, and the classification schemes that have been offered for temporomandibular disorders are briefly reviewed. Several important classifications are presented in more detail, with mention of advantages and disadvantages. Final recommendations are provided for future direction in the area of classification schemes.

  4. A Model of Hierarchical Key Assignment Scheme

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhigang; ZHAO Jing; XU Maozhi

    2006-01-01

    A model of hierarchical key assignment schemes is proposed in this paper, which can be used with any cryptographic algorithm. Besides, the optimal dynamic control property of a hierarchical key assignment scheme is defined, and our scheme model meets this property.

  5. Support Schemes and Ownership Structures

    DEFF Research Database (Denmark)

    Ropenus, Stephanie; Schröder, Sascha Thorsten; Costa, Ana

    In recent years, fuel cell based micro‐combined heat and power has received increasing attention due to its potential contribution to energy savings, efficiency gains, customer proximity and flexibility in operation and capacity size. The FC4Home project assesses technical and economic aspects...... for promoting combined heat and power and energy from renewable sources. These Directives are to be implemented at the national level by the Member States. Section 3 conceptually presents the spectrum of national support schemes, ranging from investment support to market‐based operational support. The choice...

  6. Update on the Pyramid Scheme

    Science.gov (United States)

    Banks, Tom; Torres, T. J.

    2012-10-01

    We summarize recent work in which we attempt to make consistent models of LHC physics, from the Pyramid Scheme. The models share much with the NMSSM, in particular, enhanced tree level contributions to the Higgs mass and a preference for small tan β. There are three different singlet fields, and a new strongly coupled gauge theory, so the constraints of perturbative unification are quite different. We outline our general approach to the model, which contains a Kähler potential for three of the low energy fields, which is hard to calculate. Detailed calculations, based on approximations to the Kähler potential, will be presented in a future publication.

  7. Multisensory oddity detection as Bayesian inference.

    Directory of Open Access Journals (Sweden)

    Timothy Hospedales

    A key goal for the perceptual system is to optimally combine information from all the senses that may be available in order to develop the most accurate and unified picture possible of the outside world. The contemporary theoretical framework of ideal observer maximum likelihood integration (MLI) has been highly successful in modelling how the human brain combines information from a variety of different sensory modalities. However, in various recent experiments involving multisensory stimuli of uncertain correspondence, MLI breaks down as a successful model of sensory combination. Within the paradigm of direct stimulus estimation, perceptual models which use Bayesian inference to resolve correspondence have recently been shown to generalize successfully to these cases where MLI fails. This approach has been known variously as model inference, causal inference or structure inference. In this paper, we examine causal uncertainty in another important class of multi-sensory perception paradigm--that of oddity detection and demonstrate how a Bayesian ideal observer also treats oddity detection as a structure inference problem. We validate this approach by showing that it provides an intuitive and quantitative explanation of an important pair of multi-sensory oddity detection experiments--involving cues across and within modalities--for which MLI previously failed dramatically, allowing a novel unifying treatment of within and cross modal multisensory perception. Our successful application of structure inference models to the new 'oddity detection' paradigm, and the resultant unified explanation of across and within modality cases provide further evidence to suggest that structure inference may be a commonly evolved principle for combining perceptual information in the brain.

  8. Multisensory oddity detection as Bayesian inference.

    Science.gov (United States)

    Hospedales, Timothy; Vijayakumar, Sethu

    2009-01-01

    A key goal for the perceptual system is to optimally combine information from all the senses that may be available in order to develop the most accurate and unified picture possible of the outside world. The contemporary theoretical framework of ideal observer maximum likelihood integration (MLI) has been highly successful in modelling how the human brain combines information from a variety of different sensory modalities. However, in various recent experiments involving multisensory stimuli of uncertain correspondence, MLI breaks down as a successful model of sensory combination. Within the paradigm of direct stimulus estimation, perceptual models which use Bayesian inference to resolve correspondence have recently been shown to generalize successfully to these cases where MLI fails. This approach has been known variously as model inference, causal inference or structure inference. In this paper, we examine causal uncertainty in another important class of multi-sensory perception paradigm--that of oddity detection and demonstrate how a Bayesian ideal observer also treats oddity detection as a structure inference problem. We validate this approach by showing that it provides an intuitive and quantitative explanation of an important pair of multi-sensory oddity detection experiments--involving cues across and within modalities--for which MLI previously failed dramatically, allowing a novel unifying treatment of within and cross modal multisensory perception. Our successful application of structure inference models to the new 'oddity detection' paradigm, and the resultant unified explanation of across and within modality cases provide further evidence to suggest that structure inference may be a commonly evolved principle for combining perceptual information in the brain.
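
    The structure-inference computation at the heart of this line of work can be sketched as a Bayesian model comparison between a common-source hypothesis and an independent-sources hypothesis for two noisy cues. The Gaussian toy model below (noise levels, prior width, and the grid integration) is an illustrative assumption, not the authors' experimental model.

```python
# Structure (causal) inference sketch: compare the evidence that two
# noisy cues x1, x2 share one source s against the evidence that they
# come from two independent sources, by marginalizing s numerically.
import numpy as np

def gauss(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

S = np.linspace(-20, 20, 4001)            # integration grid over sources
DS = S[1] - S[0]

def p_common(x1, x2, sd1=1.0, sd2=1.0, prior_sd=5.0):
    """Evidence for one shared source, integrated over its prior."""
    return np.sum(gauss(x1, S, sd1) * gauss(x2, S, sd2) *
                  gauss(S, 0, prior_sd)) * DS

def p_independent(x1, x2, sd1=1.0, sd2=1.0, prior_sd=5.0):
    """Evidence for two independent sources: the integral factorizes."""
    m1 = np.sum(gauss(x1, S, sd1) * gauss(S, 0, prior_sd)) * DS
    m2 = np.sum(gauss(x2, S, sd2) * gauss(S, 0, prior_sd)) * DS
    return m1 * m2

# Nearby cues favour a common source; distant cues favour independence.
bf_close = p_common(0.5, 0.7) / p_independent(0.5, 0.7)
bf_far = p_common(-4.0, 4.0) / p_independent(-4.0, 4.0)
```

    An oddity judgment then amounts to asking which stimulus is best explained by breaking the common-source structure, rather than by averaging cues as MLI would.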

  9. Inference of Isoforms from Short Sequence Reads

    Science.gov (United States)

    Feng, Jianxing; Li, Wei; Jiang, Tao

    Due to alternative splicing events in eukaryotic species, the identification of mRNA isoforms (or splicing variants) is a difficult problem. Traditional experimental methods for this purpose are time-consuming and cost-ineffective. The emerging RNA-Seq technology provides a possible effective method to address this problem. Although the advantages of RNA-Seq over traditional methods in transcriptome analysis have been confirmed by many studies, the inference of isoforms from millions of short sequence reads (e.g., Illumina/Solexa reads) has remained computationally challenging. In this work, we propose a method to calculate the expression levels of isoforms and infer isoforms from short RNA-Seq reads using exon-intron boundary, transcription start site (TSS) and poly-A site (PAS) information. We first formulate the relationship among exons, isoforms, and single-end reads as a convex quadratic program, and then use an efficient algorithm (called IsoInfer) to search for isoforms. IsoInfer can calculate the expression levels of isoforms accurately if all the isoforms are known and infer novel isoforms from scratch. Our experimental tests on known mouse isoforms with both simulated expression levels and reads demonstrate that IsoInfer is able to calculate the expression levels of isoforms with an accuracy comparable to the state-of-the-art statistical method and a speed roughly 60 times faster. Moreover, our tests on both simulated and real reads show that it achieves good precision and sensitivity in inferring isoforms when given accurate exon-intron boundary, TSS and PAS information, especially for isoforms whose expression levels are significantly high.
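
    The expression-level half of the problem can be illustrated with a tiny version of the convex quadratic program. The incidence matrix and solver below are an invented toy, not IsoInfer itself: exon read densities `b` are modelled as `A @ x`, where `A` marks which exons each isoform contains, and nonnegative expression levels `x` minimize the squared residual.

```python
# Toy isoform quantification as a nonnegative least-squares problem:
#   minimize ||A x - b||^2  subject to  x >= 0,
# solved by projected gradient descent (numpy only).
import numpy as np

def nnls_pg(A, b, iters=5000):
    """Projected-gradient solver for min ||Ax - b||^2 s.t. x >= 0."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A.T @ A, 2)   # 1/L step is always stable
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = np.maximum(0.0, x - step * grad)   # project onto x >= 0
    return x

# 3 exons, 2 isoforms: isoform 0 uses exons {1,2,3}, isoform 1 uses {1,3}.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
b = A @ x_true                       # noiseless per-exon read densities
x_hat = nnls_pg(A, b)
```

    The real formulation additionally weights rows by exon length and read counts and augments `A` with candidate isoforms consistent with the TSS/PAS boundaries, but the optimization core is this same convex program.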

  10. Adaptive Quantization Index Modulation Audio Watermarking based on Fuzzy Inference System

    Directory of Open Access Journals (Sweden)

    Sunita V. Dhavale

    2014-02-01

    Many of the adaptive watermarking schemes reported in the literature consider only local audio signal properties, and many require complex computation along with manual parameter settings. In this paper, we propose a novel fuzzy adaptive audio watermarking algorithm based on both global and local audio signal properties. The algorithm performs well over a dynamic range of audio signals without requiring manual initial parameter selection. Here, the mean value of energy (MVE) and the variance of spectral flux (VSF) of a given audio signal constitute the global components, while the energy of each audio frame acts as the local component. The Quantization Index Modulation (QIM) step size Δ is made adaptive to both the global and local features. The global component automates the initial selection of Δ using the fuzzy inference system, while the local component controls the variation in Δ based on the energy of the individual audio frame. Hence Δ adaptively controls the strength of the watermark to meet both the robustness and inaudibility requirements, making the system independent of the nature of the audio. Experimental results reveal that our adaptive scheme outperforms other fixed-step-size QIM schemes and adaptive schemes, and is highly robust against general attacks.
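
    The underlying QIM embed/extract step can be sketched in a few lines. This is the bare mechanism only: the paper's contribution, adapting Δ per frame via the fuzzy inference system, is elided here as a fixed Δ placeholder.

```python
# Bare-bones Quantization Index Modulation: bit 0 quantizes a sample
# onto the "even" grid (multiples of delta), bit 1 onto the grid offset
# by delta/2; decoding picks whichever grid is nearer.

def qim_embed(sample, bit, delta):
    """Quantize the sample onto the grid selected by the bit."""
    offset = bit * delta / 2.0
    return round((sample - offset) / delta) * delta + offset

def qim_extract(sample, delta):
    """Decode by choosing the grid whose nearest point is closer."""
    d0 = abs(sample - qim_embed(sample, 0, delta))
    d1 = abs(sample - qim_embed(sample, 1, delta))
    return 0 if d0 <= d1 else 1

delta = 0.4                          # in the paper, set adaptively per frame
marked = [qim_embed(s, b, delta)
          for s, b in zip([0.13, -0.55, 0.92, 0.31], [1, 0, 1, 0])]
decoded = [qim_extract(s, delta) for s in marked]
```

    A larger Δ makes decoding robust to bigger perturbations but distorts the host signal more, which is exactly the robustness/inaudibility trade-off the fuzzy controller balances.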

  11. REMINDER: Saved Leave Scheme (SLS)

    CERN Multimedia

    2003-01-01

    Transfer of leave to saved leave accounts Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days'* annual and compensatory leave (excluding saved leave accumulated in accordance with the provisions of Administrative Circular No 22B) can be transferred to the saved leave account at the end of the leave year (30 September). We remind you that unused leave of all those taking part in the saved leave scheme at the closure of the leave year accounts is transferred automatically to the saved leave account on that date. Therefore, staff members have no administrative steps to take. In addition, the transfer, which eliminates the risk of omitting to request leave transfers and rules out calculation errors in transfer requests, will be clearly shown in the list of leave transactions that can be consulted in EDH from October 2003 onwards. Furthermore, this automatic leave transfer optimizes staff members' chances of benefiting from a saved leave bonus provided that they ar...

  12. Pseudorandomness of Camellia-Like Scheme

    Institute of Scientific and Technical Information of China (English)

    Wen-Ling Wu

    2006-01-01

    Luby and Rackoff idealized DES by replacing each round function with one large random function. In this paper, the author idealizes Camellia by replacing each S-box with one small random function, which is named the Camellia-like scheme. It is then proved that the five-round Camellia-like scheme is pseudorandom and the eight-round Camellia-like scheme is super-pseudorandom for adaptive adversaries. Further, the paper considers more efficient constructions of the Camellia-like scheme, and discusses how to construct a pseudorandom Camellia-like scheme from fewer random functions.

  13. Modification of QUICK scheme by skew points

    Energy Technology Data Exchange (ETDEWEB)

    Mirzaei, M.; Mohammadi, R.; Malekzadeh, M. [K.N. Toosi Univ. of Technology, Aerospace Engineering Dept., Tehran (Iran, Islamic Republic of)]. E-mail: Mirzaei@kntu.ac.ir

    2005-07-01

    This paper presents a new method for convective flux approximation based on the inclusion of skew points. The scheme uses the truncated terms of the QUICK scheme and, with the aid of an equation extracted from the momentum equations, the skew points appear in the convective flux formula. The results show that the presented scheme has better accuracy than the other schemes. Diffusion fluxes are approximated using the power-law scheme, and to evaluate the performance of the presented method several test cases were carried out; the results are compared with the results of other numerical works and with experimental data. (author)
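
    For reference, the standard QUICK face interpolation that this record builds on (the skew-point extension itself is not reproduced here) combines the far-upstream (U), upstream (C) and downstream (D) node values as follows:

```python
# QUICK convective face value on a uniform grid:
#   phi_f = 6/8*phi_C + 3/8*phi_D - 1/8*phi_U
# i.e. quadratic upstream-biased interpolation, 3rd-order accurate.

def quick_face(phi_u, phi_c, phi_d):
    """Face value from far-upstream, upstream, and downstream nodes."""
    return 0.75 * phi_c + 0.375 * phi_d - 0.125 * phi_u

# Sanity check: for the quadratic field x**2 sampled at x = -1.5, -0.5,
# 0.5 (face at x = 0), QUICK reproduces the exact face value 0.
f = quick_face(2.25, 0.25, 0.25)
```

    Because the stencil fits a parabola through U, C and D, the scheme is exact for quadratic fields; the paper's modification re-injects the truncated higher-order terms via skew points.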

  14. A Provably Secure Asynchronous Proactive RSA Scheme

    Institute of Scientific and Technical Information of China (English)

    ZHANG Rui-shan; LI Qiang; CHEN Ke-fei

    2005-01-01

    The drawback of the first asynchronous proactive RSA scheme, presented by Zhou in 2001, is that the security definition and security proof do not follow the approach of provable security. This paper presents a provably secure asynchronous proactive RSA scheme, which includes three protocols: an initial key distribution protocol, a signature generation protocol and a share refreshing protocol. Taken together, these protocols form a complete provably secure proactive RSA scheme. The efficiency of the scheme is comparable to that of the scheme of Zhou.

  15. Hash function based secret sharing scheme designs

    CERN Document Server

    Chum, Chi Sing

    2011-01-01

    Secret sharing schemes create an effective method to safeguard a secret by dividing it among several participants. By using hash functions and the herding hashes technique, we first set up a (t+1, n) threshold scheme which is perfect and ideal, and then extend it to schemes for any general access structure. The schemes can be further set up as proactive or verifiable if necessary. The setup and recovery of the secret is efficient due to the fast calculation of the hash function. The proposed scheme is flexible because of the use of existing hash functions.
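
    To make the roles a hash function can play concrete, here is a minimal (n, n) XOR-sharing sketch with a hash-based verifiability check. This is deliberately not the paper's herding-hash threshold construction: it is a simpler, well-known scheme shown only to illustrate cheap recovery and cheap share verification.

```python
# Minimal (n, n) XOR secret sharing with SHA-256 share verification.
# All n shares are required to recover; any n-1 reveal nothing.
import hashlib
import secrets

def split(secret: bytes, n: int):
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:                      # last share = secret XOR others
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    # Publish a digest of each share so recovery can detect tampering.
    digests = [hashlib.sha256(s).hexdigest() for s in shares]
    return shares, digests

def recover(shares, digests):
    assert all(hashlib.sha256(s).hexdigest() == d
               for s, d in zip(shares, digests)), "tampered share"
    out = bytes(len(shares[0]))           # all-zero accumulator
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

shares, digests = split(b"launch code", 3)
restored = recover(shares, digests)
```

    A threshold (t+1, n) scheme such as the paper's requires more machinery so that any t+1 of the n shares suffice, but the appeal is the same: both recovery and verification reduce to fast hash and XOR operations rather than modular arithmetic.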

  16. A New Signature Scheme with Shared Verification

    Institute of Scientific and Technical Information of China (English)

    JIA Xiao-yun; LUO Shou-shan; YUAN Chao-wei

    2006-01-01

    With expanding user demands, digital signature techniques are also being extended greatly, from single-signature and single-verification techniques to techniques supporting multiple users. This paper presents a new digital signature scheme with shared verification based on the Fiat-Shamir signature scheme. The scheme is suitable not only for digital signatures under one public key, but also for situations where multiple public keys are required. In addition, the scheme can resist all kinds of collusion, making it more practical and safer. It is also more efficient than other schemes.

  17. Colluding attacks on a group signature scheme

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Xie and Yu (2005) proposed a group signature scheme and claimed that it is the most efficient group signature scheme so far and that it is secure. In this paper, we show that two dishonest group members can collude to launch two attacks on the scheme. In the first attack they can derive the group secret key and then generate untraceable group signatures. In the second attack, they can impersonate other group members once they see their signatures. Therefore we conclude that the signature scheme is not secure. We show that some parameters should be carefully selected in the scheme to resist our attacks.

  18. Nonrepudiable Proxy Multi-Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    LI JiGuo(李继国); CAO ZhenFu(曹珍富); ZHANG YiChen(张亦辰)

    2003-01-01

    The concept of proxy signature introduced by Mambo, Usuda, and Okamoto allows a designated person, called a proxy signer, to sign on behalf of an original signer. However, most existing proxy signature schemes do not support nonrepudiation. In this paper, two secure nonrepudiable proxy multi-signature schemes are proposed that overcome disadvantages of the existing schemes. The proposed schemes can withstand public key substitution attacks. In addition, the new schemes have some other advantages, such as proxy signature key generation and updating using insecure channels. This approach can also be applied to other ElGamal-like proxy signature schemes.

  19. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  20. Deep Learning for Population Genetic Inference.

    Directory of Open Access Journals (Sweden)

    Sara Sheehan

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  1. Deep Learning for Population Genetic Inference.

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  2. Deep Learning for Population Genetic Inference

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908
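The likelihood-free idea above reduces to regression: simulate data under known parameters, compute summary statistics, and train a network to map statistics back to parameters. A minimal sketch of that workflow on a toy model (a one-hidden-layer network in plain numpy; the model, statistics, and architecture are illustrative assumptions, not the authors' setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator: the parameter theta is the std of a Gaussian; each
# training instance is a data set of 100 draws reduced to two
# correlated summary statistics.
def simulate(n_sets):
    theta = rng.uniform(0.5, 2.0, size=n_sets)
    data = rng.normal(0.0, theta[:, None], size=(n_sets, 100))
    stats = np.column_stack([data.std(axis=1), np.abs(data).mean(axis=1)])
    return stats, theta

X, y = simulate(2000)

# One-hidden-layer network trained by full-batch gradient descent (MSE)
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
losses = []
for step in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden layer
    pred = (h @ W2 + b2).ravel()      # predicted parameter
    err = pred - y
    losses.append(np.mean(err ** 2))
    # Backpropagation of the mean-squared-error loss
    g_pred = 2 * err[:, None] / len(y)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = g_pred @ W2.T * (1 - h ** 2)
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The real application replaces the toy simulator with a population genetic simulator and the two statistics with hundreds of them; the regression principle is unchanged.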

  3. Generative Inferences Based on Learned Relations.

    Science.gov (United States)

    Chen, Dawn; Lu, Hongjing; Holyoak, Keith J

    2016-11-17

    A key property of relational representations is their generativity: From partial descriptions of relations between entities, additional inferences can be drawn about other entities. A major theoretical challenge is to demonstrate how the capacity to make generative inferences could arise as a result of learning relations from non-relational inputs. In the present paper, we show that a bottom-up model of relation learning, initially developed to discriminate between positive and negative examples of comparative relations (e.g., deciding whether a sheep is larger than a rabbit), can be extended to make generative inferences. The model is able to make quasi-deductive transitive inferences (e.g., "If A is larger than B and B is larger than C, then A is larger than C") and to qualitatively account for human responses to generative questions such as "What is an animal that is smaller than a dog?" These results provide evidence that relational models based on bottom-up learning mechanisms are capable of supporting generative inferences.
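The quasi-deductive transitive inference described above can be illustrated, independently of the model's learned representations, as closure over pairwise relational facts. A minimal sketch with hypothetical animal names:

```python
# Hypothetical learned pairwise facts (names are illustrative only).
larger_than = {("elephant", "dog"), ("dog", "rabbit"), ("rabbit", "mouse")}

# Transitive closure: if larger(A,B) and larger(B,C), infer larger(A,C).
closure = set(larger_than)
changed = True
while changed:
    changed = False
    for a, b in list(closure):
        for c, d in list(closure):
            if b == c and (a, d) not in closure:
                closure.add((a, d))
                changed = True

# Generative query: "what is an animal that is smaller than a dog?"
smaller_than_dog = sorted(d for (s, d) in closure if s == "dog")
print(smaller_than_dog)  # ['mouse', 'rabbit']
```

The model in the paper derives such inferences from learned magnitude representations rather than symbolic closure; the sketch only shows the logical structure of the inference being generated.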

  4. Hybrid Transmission Scheme for MIMO Relay Channels

    Directory of Open Access Journals (Sweden)

    Guangming Xu

    2009-11-01

    Full Text Available To improve the achievable rate of MIMO relay channels, we propose a hybrid transmission (HT) scheme that mixes half-duplex decode-and-forward cooperative relaying transmission (DFRH) with direct transmission (DT). In the HT scheme, the source message is divided into two parts: one is transmitted by the DFRH scheme and the other by the DT scheme. Precoding and decoding are considered to convert the original MIMO relay channel into several parallel subchannels so that resource allocation can be easily performed. We focus on the spatial subchannel and power allocation problem, whose objective is to maximize the total achievable rate under a joint total transmission power constraint. Simulation results show that significant capacity gain can be achieved by the HT scheme compared to the DT scheme and the pure DFRH scheme.
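Power allocation across parallel subchannels under a total power constraint is classically solved by water-filling. A generic sketch of that step (not the paper's exact joint DFRH/DT formulation; the gains and power budget are made up):

```python
import numpy as np

def water_filling(gains, total_power):
    """Allocate power across parallel subchannels with effective gains
    g_i (|h_i|^2 / noise) to maximize sum_i log2(1 + g_i * p_i)."""
    g = np.asarray(gains, dtype=float)
    # Bisection on the water level mu, with p_i = max(mu - 1/g_i, 0)
    lo, hi = 0.0, total_power + 1.0 / g.min()
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        p = np.maximum(mu - 1.0 / g, 0.0)
        if p.sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - 1.0 / g, 0.0)

gains = np.array([4.0, 1.0, 0.25])          # hypothetical subchannel gains
p = water_filling(gains, total_power=2.0)
rate = np.log2(1.0 + gains * p).sum()        # water-filling sum rate
equal = np.log2(1.0 + gains * (2.0 / 3.0)).sum()  # equal-power baseline
print(p, rate, equal)
```

Weak subchannels may receive zero power (here the third one), which is exactly the behavior that makes joint subchannel selection and allocation matter in hybrid schemes.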

  5. High-Order Energy Stable WENO Schemes

    Science.gov (United States)

    Yamaleev, Nail K.; Carpenter, Mark H.

    2008-01-01

    A new third-order Energy Stable Weighted Essentially NonOscillatory (ESWENO) finite difference scheme for scalar and vector linear hyperbolic equations with piecewise continuous initial conditions is developed. The new scheme is proven to be stable in the energy norm for both continuous and discontinuous solutions. In contrast to the existing high-resolution shock-capturing schemes, no assumption that the reconstruction should be total variation bounded (TVB) is explicitly required to prove stability of the new scheme. A rigorous truncation error analysis is presented showing that the accuracy of the 3rd-order ESWENO scheme is drastically improved if the tuning parameters of the weight functions satisfy certain criteria. Numerical results show that the new ESWENO scheme is stable and significantly outperforms the conventional third-order WENO finite difference scheme of Jiang and Shu in terms of accuracy, while providing essentially nonoscillatory solutions near strong discontinuities.
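For readers unfamiliar with the baseline being improved upon, here is a minimal third-order WENO advection solver using the classical Jiang-Shu weight functions (the ESWENO modification of the weights is not reproduced here; grid size and CFL number are illustrative choices):

```python
import numpy as np

def weno3_left(u):
    """Third-order WENO reconstruction of u at the cell face i+1/2,
    upwind-biased for a positive wave speed, on a periodic grid."""
    um, u0, up = np.roll(u, 1), u, np.roll(u, -1)
    p0 = -0.5 * um + 1.5 * u0            # candidate from stencil {i-1, i}
    p1 = 0.5 * u0 + 0.5 * up             # candidate from stencil {i, i+1}
    b0 = (u0 - um) ** 2                  # smoothness indicators
    b1 = (up - u0) ** 2
    eps = 1e-6
    a0 = (1.0 / 3.0) / (eps + b0) ** 2   # nonlinear weights built from the
    a1 = (2.0 / 3.0) / (eps + b1) ** 2   # linear weights d0=1/3, d1=2/3
    w0 = a0 / (a0 + a1)
    return w0 * p0 + (1.0 - w0) * p1

# Advect u_t + u_x = 0 on [0,1) with WENO3 fluxes and SSP-RK3 in time
n = 200
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = 1.0 / n
dt = 0.4 * dx

def rhs(u):
    f = weno3_left(u)                    # numerical flux at i+1/2
    return -(f - np.roll(f, 1)) / dx

u = np.sin(2 * np.pi * x)
steps = 100                              # integrate to t = steps * dt = 0.2
for _ in range(steps):
    u1 = u + dt * rhs(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1))
    u = u / 3.0 + (2.0 / 3.0) * (u2 + dt * rhs(u2))

err = np.abs(u - np.sin(2 * np.pi * (x - steps * dt))).max()
print(err)
```

The ESWENO scheme of the abstract changes how the nonlinear weights are constructed (to obtain provable energy stability), not the stencil structure shown here.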

  6. Benchmarking the next generation of homology inference tools.

    Science.gov (United States)

    Saripella, Ganapathi Varma; Sonnhammer, Erik L L; Forslund, Kristoffer

    2016-09-01

    Over the last decades, vast numbers of sequences were deposited in public databases. Bioinformatics tools allow homology and consequently functional inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate the 'next generation' of profile-based approaches, including CS-BLAST, HHSEARCH and PHMMER, in comparison with the non-profile based search tools NCBI-BLAST, USEARCH, UBLAST and FASTA. We generated challenging benchmark datasets based on protein domain architectures within either the PFAM + Clan, SCOP/Superfamily or CATH/Gene3D domain definition schemes. From each dataset, homologous and non-homologous protein pairs were aligned using each tool, and standard performance metrics calculated. We further measured congruence of domain architecture assignments in the three domain databases. CS-BLAST and PHMMER had the overall highest accuracy. FASTA, UBLAST and USEARCH showed large trade-offs of accuracy for speed optimization. Profile methods are superior at inferring remote homologs, but the difference in accuracy between methods is relatively small. PHMMER and CS-BLAST stand out with the highest accuracy, yet still at a reasonable computational cost. Additionally, we show that less than 0.1% of Swiss-Prot protein pairs considered homologous by one database are considered non-homologous by another, implying that these classifications represent equivalent underlying biological phenomena, differing mostly in coverage and granularity. Benchmark datasets and all scripts are available at http://sonnhammer.org/download/Homology_benchmark. Contact: forslund@embl.de. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  7. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
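The statistically optimal observer in such coin tasks is conjugate Beta-Binomial updating, which is what human responses are compared against. A minimal sketch with made-up counts:

```python
# Beta(a, b) prior over the coin's heads probability; observe k heads in n flips.
a, b = 1.0, 1.0          # uniform prior
k, n = 7, 10             # hypothetical observed data

# Conjugate update: the posterior is Beta(a + k, b + n - k)
a_post, b_post = a + k, b + n - k
posterior_mean = a_post / (a_post + b_post)

# The posterior mean is also the predictive probability of heads on the next flip
print(posterior_mean)    # 8/12, i.e. about 0.667
```

Individual-level comparisons of this kind then ask whether each participant's judgments track the posterior as evidence accumulates.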

  8. Genetics Home Reference: GRN-related frontotemporal dementia

    Science.gov (United States)

    ... Neumann M, Kwong LK, Trojanowski JQ, Lee VM, Grossman M. Clinical, genetic, and pathologic characteristics of patients ... Feldman H, Woltjer R, Miller CA, Wood EM, Grossman M, McCluskey L, Clark CM, Neumann M, Danek ...

  9. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
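Pearson's chi-square and the likelihood-ratio G statistic are both members of the Cressie-Read power-divergence family that underlies such phi-divergence tests. A sketch with an illustrative die-rolling table (the data are made up):

```python
import numpy as np

def power_divergence(obs, exp, lam):
    """Cressie-Read power-divergence statistic:
    2/(lam*(lam+1)) * sum obs*((obs/exp)**lam - 1).
    lam = 1 gives Pearson's chi-square; lam -> 0 gives the
    likelihood-ratio G statistic."""
    obs, exp = np.asarray(obs, float), np.asarray(exp, float)
    if abs(lam) < 1e-12:
        return 2.0 * np.sum(obs * np.log(obs / exp))
    return 2.0 / (lam * (lam + 1.0)) * np.sum(obs * ((obs / exp) ** lam - 1.0))

obs = np.array([16, 18, 16, 14, 12, 12])   # hypothetical die-roll counts
exp = np.full(6, obs.sum() / 6)            # uniform null hypothesis
pearson = power_divergence(obs, exp, 1.0)
g_stat = power_divergence(obs, exp, 0.0)
print(pearson, g_stat)
```

Both statistics are asymptotically chi-square distributed under the null; SciPy exposes the same family as `scipy.stats.power_divergence` with a `lambda_` argument.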

  10. Hierarchical probabilistic inference of cosmic shear

    CERN Document Server

    Schneider, Michael D; Marshall, Philip J; Dawson, William A; Meyers, Joshua; Bard, Deborah J; Lang, Dustin

    2014-01-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the glo...

  11. Lifted Inference for Relational Continuous Models

    CERN Document Server

    Choi, Jaesik; Hill, David J

    2012-01-01

    Relational Continuous Models (RCMs) represent joint probability densities over attributes of objects, when the attributes have continuous domains. With relational representations, they can model joint probability distributions over large numbers of variables compactly in a natural way. This paper presents a new exact lifted inference algorithm for RCMs that scales up to large models of real world applications. The algorithm applies to Relational Pairwise Models, which are (relational) products of potentials of arity 2. Our algorithm is unique in two ways. First, it substantially improves the efficiency of lifted inference with variables of continuous domains. When a relational model has Gaussian potentials, it takes only linear time compared to the cubic time of previous methods. Second, it is the first exact inference algorithm which handles RCMs in a lifted way. The algorithm is illustrated over an example from econometrics. Experimental results show that our algorithm outperforms both a groundlevel inferenc...

  12. On Tidal Inference in the Diurnal Band

    Science.gov (United States)

    Ray, R. D.

    2017-01-01

    Standard methods of tidal inference should be revised to account for a known resonance that occurs mostly within the K₁ tidal group in the diurnal band. The resonance arises from a free rotational mode of Earth caused by the fluid core. In a set of 110 bottom-pressure tide stations, the amplitude of the P₁ tidal constituent is shown to be suppressed relative to K₁, which is in good agreement with the resonance theory. Standard formulas for the K₁ nodal modulation remain essentially unaffected. Two examples are given of applications of the refined inference methodology: one with monthly tide gauge data and one with satellite altimetry. For some altimeter-constrained tide models, an inferred P₁ constituent is found to be more accurate than a directly determined one.

  13. Grammatical inference algorithms, routines and applications

    CERN Document Server

    Wieczorek, Wojciech

    2017-01-01

    This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.
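A standard first step in grammatical inference from positive samples is the prefix tree acceptor, which many state-merging algorithms then generalize. A minimal sketch (not taken from the book; the sample strings are made up):

```python
def build_pta(samples):
    """Build a prefix tree acceptor from positive sample strings.
    States are prefixes of the samples; returns the transition map
    and the set of accepting states."""
    transitions = {}          # (state, symbol) -> next state
    accepting = set()
    for word in samples:
        state = ""
        for symbol in word:
            nxt = state + symbol
            transitions[(state, symbol)] = nxt
            state = nxt
        accepting.add(state)
    return transitions, accepting

def accepts(transitions, accepting, word):
    """Run the acceptor on a word; reject on any missing transition."""
    state = ""
    for symbol in word:
        if (state, symbol) not in transitions:
            return False
        state = transitions[(state, symbol)]
    return state in accepting

trans, acc = build_pta(["ab", "abb", "b"])
print(accepts(trans, acc, "ab"), accepts(trans, acc, "a"))  # True False
```

By itself the PTA accepts exactly the sample set; inference algorithms such as RPNI proceed by merging its states to generalize beyond the samples.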

  14. Parameter inference with estimated covariance matrices

    CERN Document Server

    Sellentin, Elena

    2015-01-01

    When inferring parameters from a Gaussian-distributed data set by computing a likelihood, a covariance matrix is needed that describes the data errors and their correlations. If the covariance matrix is not known a priori, it may be estimated and thereby becomes a random object with some intrinsic uncertainty itself. We show how to infer parameters in the presence of such an estimated covariance matrix, by marginalising over the true covariance matrix, conditioned on its estimated value. This leads to a likelihood function that is no longer Gaussian, but rather an adapted version of a multivariate $t$-distribution, which has the same numerical complexity as the multivariate Gaussian. As expected, marginalisation over the true covariance matrix improves inference when compared with Hartlap et al.'s method, which uses an unbiased estimate of the inverse covariance matrix but still assumes that the likelihood is Gaussian.
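The chi-squared dependence of the adapted likelihood can be written in a few lines. A sketch following the abstract (normalization constants that do not depend on chi-squared are omitted, and the numbers are hypothetical):

```python
import numpy as np

def log_like_t(chi2, n_sims):
    """Log of the modified (multivariate t-like) likelihood, up to terms
    independent of chi2: marginalizing over the true covariance, given a
    covariance estimated from n_sims simulations, replaces the Gaussian
    exp(-chi2/2) by (1 + chi2/(n_sims - 1))**(-n_sims/2)."""
    return -0.5 * n_sims * np.log1p(chi2 / (n_sims - 1.0))

chi2 = 4.0   # hypothetical chi-squared of the data against the model
for n_sims in (20, 200, 20000):
    print(n_sims, log_like_t(chi2, n_sims))
print("Gaussian limit:", -0.5 * chi2)
```

As the number of simulations grows, the t-like likelihood converges to the Gaussian; for few simulations its heavier tails penalize large chi-squared less severely, which is the source of the corrected parameter inference.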

  15. Inferring epidemic network topology from surveillance data.

    Science.gov (United States)

    Wan, Xiang; Liu, Jiming; Cheung, William K; Tong, Tiejun

    2014-01-01

    The transmission of infectious diseases can be affected by many or even hidden factors, making it difficult to accurately predict when and where outbreaks may emerge. One approach at the moment is to develop and deploy surveillance systems in an effort to detect outbreaks as timely as possible. This enables policy makers to modify and implement strategies for the control of the transmission. The accumulated surveillance data including temporal, spatial, clinical, and demographic information, can provide valuable information with which to infer the underlying epidemic networks. Such networks can be quite informative and insightful as they characterize how infectious diseases transmit from one location to another. The aim of this work is to develop a computational model that allows inferences to be made regarding epidemic network topology in heterogeneous populations. We apply our model on the surveillance data from the 2009 H1N1 pandemic in Hong Kong. The inferred epidemic network displays significant effect on the propagation of infectious diseases.

  16. Examples in parametric inference with R

    CERN Document Server

    Dixit, Ulhas Jayram

    2016-01-01

    This book discusses examples in parametric inference with R. Combining basic theory with modern approaches, it presents the latest developments and trends in statistical inference for students who do not have an advanced mathematical and statistical background. The topics discussed in the book are fundamental and common to many fields of statistical inference and thus serve as a point of departure for in-depth study. The book is divided into eight chapters: Chapter 1 provides an overview of topics on sufficiency and completeness, while Chapter 2 briefly discusses unbiased estimation. Chapter 3 focuses on the study of moments and maximum likelihood estimators, and Chapter 4 presents bounds for the variance. In Chapter 5, topics on consistent estimator are discussed. Chapter 6 discusses Bayes, while Chapter 7 studies some more powerful tests. Lastly, Chapter 8 examines unbiased and other tests. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory cou...

  17. Picturing classical and quantum Bayesian inference

    CERN Document Server

    Coecke, Bob

    2011-01-01

    We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference wherein one considers density operators rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. We characterize classical Bayesian inference in terms of a graphical property and demonstrate that our approach eliminates some purely conventional elements that appear in common representations thereof, such as whether degrees of belief are represented by probabilities or entropic quantities. We also introduce a quantum-like calculus wherein the Frobenius structure is noncommutative and show that it can accommodate Leifer's calculus of `cond...

  18. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian;

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise injections in intermediate-to-strongly coupled systems could enable more accurate causal inferences. Given the inherent noisy nature of real-world systems, our findings enable a more accurate evaluation of CCM applicability and advance suggestions on how to overcome its weaknesses.
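A compact CCM sketch on coupled logistic maps, in the spirit of the setup described (the embedding dimension, coupling strength, and series lengths are illustrative choices, not the paper's):

```python
import numpy as np

def shadow_embed(x, E, tau):
    """Delay-coordinate embedding of a scalar series."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(E)])

def ccm_skill(x, y, E=2, tau=1, lib=800):
    """Cross-map y from the shadow manifold of x: if y drives x, the
    states of y are recoverable from nearest neighbours in x's embedding."""
    M = shadow_embed(x, E, tau)[:lib]
    target = y[(E - 1) * tau:][:lib]
    preds = np.empty(lib)
    for i in range(lib):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                      # exclude the point itself
        nn = np.argsort(d)[: E + 1]        # E+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds[i] = np.sum(w * target[nn]) / w.sum()
    return np.corrcoef(preds, target)[0, 1]

# Coupled logistic maps: y unidirectionally drives x with strength beta.
rng = np.random.default_rng(1)
n = 1200
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
beta = 0.15
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t] - beta * y[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t])

z = rng.random(n)                          # independent noise series
skill_xy = ccm_skill(x, y)                 # should be high: y drives x
skill_xz = ccm_skill(x, z)                 # should be near zero
print(skill_xy, skill_xz)
```

Convergence is diagnosed by repeating the estimate for increasing library sizes `lib`; the failure modes discussed above appear when coupling is strong enough to synchronize the maps or when observational noise corrupts the embedding.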

  19. A Learning Algorithm for Multimodal Grammar Inference.

    Science.gov (United States)

    D'Ulizia, A; Ferri, F; Grifoni, P

    2011-12-01

    The high costs of development and maintenance of multimodal grammars in integrating and understanding input in multimodal interfaces lead to the investigation of novel algorithmic solutions in automating grammar generation and in updating processes. Many algorithms for context-free grammar inference have been developed in the natural language processing literature. An extension of these algorithms toward the inference of multimodal grammars is necessary for multimodal input processing. In this paper, we propose a novel grammar inference mechanism that allows us to learn a multimodal grammar from its positive samples of multimodal sentences. The algorithm first generates the multimodal grammar that is able to parse the positive samples of sentences and, afterward, makes use of two learning operators and the minimum description length metrics in improving the grammar description and in avoiding the over-generalization problem. The experimental results highlight the acceptable performances of the algorithm proposed in this paper since it has a very high probability of parsing valid sentences.

  20. Generalized Collective Inference with Symmetric Clique Potentials

    CERN Document Server

    Gupta, Rahul; Dewan, Ajit A

    2009-01-01

    Collective graphical models exploit inter-instance associative dependence to output more accurate labelings. However, existing models support only a very limited kind of associativity, which restricts accuracy gains. This paper makes two major contributions. First, we propose a general collective inference framework that biases data instances to agree on a set of properties of their labelings. Agreement is encouraged through symmetric clique potentials. We show that rich properties lead to bigger gains, and present a systematic inference procedure for a large class of such properties. The procedure performs message passing on the cluster graph, where property-aware messages are computed with cluster-specific algorithms. This provides an inference-only solution for domain adaptation. Our experiments on bibliographic information extraction illustrate significant test error reduction over unseen domains. Our second major contribution consists of algorithms for computing outgoing messages from clique clusters with ...

  1. Advertisement and Female's Self-awareness: Taking the Example of the Sports Brand Grn-group Campaign "I Run My Happiness"

    Institute of Scientific and Technical Information of China (English)

    黎藜

    2012-01-01

    Sporting goods have traditionally been a man's world, but the Grn-group running-shoe campaign "I Run My Happiness" features a woman as its protagonist: it not only promotes the product but also conveys a healthy, accurate image of women. As one of the most important mass media, modern advertising strongly shapes popular culture and the formation of social attitudes, and the objectification of women in many commercial advertisements directly influences how society constructs its concept of women, to some degree reinforcing gender discrimination. If advertisers establish and communicate an accurate concept of women, advertising can guide public attitudes, promote equality between men and women, and advance social fairness, civility and progress.

  2. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point process with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analysing Peter Diggle's heather data set, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  3. Variational Bayesian Inference of Line Spectra

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri

    2017-01-01

    In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid, and the coefficients ...

  4. Statistical Inference for Partially Observed Diffusion Processes

    DEFF Research Database (Denmark)

    Jensen, Anders Christian

    One of the models considered is a two-dimensional Ornstein-Uhlenbeck process where one coordinate is completely unobserved. This model does not have the Markov property, which makes parameter inference more complicated. Next we take a Bayesian approach and introduce some basic Markov chain Monte Carlo methods. In chapters five and six we describe a Bayesian method to perform parameter inference in multivariate diffusion models that may be only partially observed. The methodology is applied to the stochastic FitzHugh-Nagumo model and the two-dimensional Ornstein-Uhlenbeck process. Chapter seven focuses on parameter identifiability in the partially observed Ornstein-Uhlenbeck process.
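For the fully observed one-dimensional case, exact likelihood inference for an Ornstein-Uhlenbeck process reduces to AR(1) estimation, which makes a useful baseline before the MCMC machinery needed under partial observation. A sketch with made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(7)
theta, sigma, dt, n = 1.5, 0.8, 0.01, 100_000

# Exact discretization of dX = -theta * X dt + sigma dW:
# X_{t+dt} = a X_t + eps, with a = exp(-theta dt) and
# Var(eps) = sigma^2 (1 - a^2) / (2 theta).
a = np.exp(-theta * dt)
sd = sigma * np.sqrt((1.0 - a * a) / (2.0 * theta))
x = np.empty(n)
x[0] = 0.0
noise = rng.normal(0.0, sd, n - 1)
for t in range(n - 1):
    x[t + 1] = a * x[t] + noise[t]

# MLE of the AR(1) coefficient (regression through the origin),
# then map back to the mean-reversion rate theta
a_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
theta_hat = -np.log(a_hat) / dt
print(theta_hat)
```

When one coordinate of a multivariate process is hidden, the observed coordinate is no longer Markov and this closed-form estimator is unavailable, which is what motivates the Bayesian data-augmentation methods described above.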

  5. Scheme of thinking quantum systems

    CERN Document Server

    Yukalov, V I

    2009-01-01

    A general approach describing quantum decision procedures is developed. The approach can be applied to quantum information processing, quantum computing, creation of artificial quantum intelligence, as well as to analyzing decision processes of human decision makers. Our basic point is to consider an active quantum system possessing its own strategic state. Processing information by such a system is analogous to the cognitive processes associated to decision making by humans. The algebra of probability operators, associated with the possible options available to the decision maker, plays the role of the algebra of observables in quantum theory of measurements. A scheme is advanced for a practical realization of decision procedures by thinking quantum systems. Such thinking quantum systems can be realized by using spin lattices, systems of magnetic molecules, cold atoms trapped in optical lattices, ensembles of quantum dots, or multilevel atomic systems interacting with electromagnetic field.

  6. Electrical injection schemes for nanolasers

    DEFF Research Database (Denmark)

    Lupi, Alexandra; Chung, Il-Sug; Yvind, Kresten

    2013-01-01

    The performance of injection schemes among recently demonstrated electrically pumped photonic crystal nanolasers has been investigated numerically. The computation has been carried out at room temperature using a commercial semiconductor simulation software. For the simulations two electrical ... of 3 InGaAsP QWs on an InP substrate has been chosen for the modeling. In the simulations the main focus is on the electrical and optical properties of the nanolasers, i.e. electrical resistance, threshold voltage, threshold current and wallplug efficiency. In the current flow evaluation the lowest threshold current has been achieved with the lateral electrical injection through the BH, while the lowest resistance has been obtained from the current post structure, even though this model shows a higher current threshold because of the lack of carrier confinement. Final scope of the simulations ...

  7. Fragment separator momentum compression schemes

    Energy Technology Data Exchange (ETDEWEB)

    Bandura, Laura, E-mail: bandura@anl.gov [Facility for Rare Isotope Beams (FRIB), 1 Cyclotron, East Lansing, MI 48824-1321 (United States); National Superconducting Cyclotron Lab, Michigan State University, 1 Cyclotron, East Lansing, MI 48824-1321 (United States); Erdelyi, Bela [Argonne National Laboratory, Argonne, IL 60439 (United States); Northern Illinois University, DeKalb, IL 60115 (United States); Hausmann, Marc [Facility for Rare Isotope Beams (FRIB), 1 Cyclotron, East Lansing, MI 48824-1321 (United States); Kubo, Toshiyuki [RIKEN Nishina Center, RIKEN, Wako (Japan); Nolen, Jerry [Argonne National Laboratory, Argonne, IL 60439 (United States); Portillo, Mauricio [Facility for Rare Isotope Beams (FRIB), 1 Cyclotron, East Lansing, MI 48824-1321 (United States); Sherrill, Bradley M. [National Superconducting Cyclotron Lab, Michigan State University, 1 Cyclotron, East Lansing, MI 48824-1321 (United States)

    2011-07-21

    We present a scheme to use a fragment separator and profiled energy degraders to transfer longitudinal phase space into transverse phase space while maintaining achromatic beam transport. The first order beam optics theory of the method is presented and the consequent enlargement of the transverse phase space is discussed. An interesting consequence of the technique is that the first order mass resolving power of the system is determined by the first dispersive section up to the energy degrader, independent of whether or not momentum compression is used. The fragment separator at the Facility for Rare Isotope Beams is a specific application of this technique and is described along with simulations by the code COSY INFINITY.

  8. Fragment separator momentum compression schemes.

    Energy Technology Data Exchange (ETDEWEB)

    Bandura, L.; Erdelyi, B.; Hausmann, M.; Kubo, T.; Nolen, J.; Portillo, M.; Sherrill, B.M. (Physics); (MSU); (Northern Illinois Univ.); (RIKEN)

    2011-07-21

    We present a scheme to use a fragment separator and profiled energy degraders to transfer longitudinal phase space into transverse phase space while maintaining achromatic beam transport. The first order beam optics theory of the method is presented and the consequent enlargement of the transverse phase space is discussed. An interesting consequence of the technique is that the first order mass resolving power of the system is determined by the first dispersive section up to the energy degrader, independent of whether or not momentum compression is used. The fragment separator at the Facility for Rare Isotope Beams is a specific application of this technique and is described along with simulations by the code COSY INFINITY.

  9. Network Regulation and Support Schemes

    DEFF Research Database (Denmark)

    Ropenus, Stephanie; Schröder, Sascha Thorsten; Jacobsen, Henrik

    2009-01-01

    -in tariffs to market-based quota systems, and network regulation approaches, comprising rate-of-return and incentive regulation. National regulation and the vertical structure of the electricity sector shape the incentives of market agents, notably of distributed generators and network operators....... This article seeks to investigate the interactions between the policy dimensions of support schemes and network regulation and how they affect the deployment of distributed generation. Firstly, a conceptual analysis examines how the incentives of the different market agents are affected. In particular......, it will be shown that there frequently exists a trade-off between the creation of incentives for distributed generators and for distribution system operators to facilitate the integration of distributed generation. Secondly, the interaction of these policy dimensions is analyzed, including case studies based...

  10. Scheme of thinking quantum systems

    Science.gov (United States)

    Yukalov, V. I.; Sornette, D.

    2009-11-01

    A general approach describing quantum decision procedures is developed. The approach can be applied to quantum information processing, quantum computing, creation of artificial quantum intelligence, as well as to analyzing decision processes of human decision makers. Our basic point is to consider an active quantum system possessing its own strategic state. Processing information by such a system is analogous to the cognitive processes associated to decision making by humans. The algebra of probability operators, associated with the possible options available to the decision maker, plays the role of the algebra of observables in quantum theory of measurements. A scheme is advanced for a practical realization of decision procedures by thinking quantum systems. Such thinking quantum systems can be realized by using spin lattices, systems of magnetic molecules, cold atoms trapped in optical lattices, ensembles of quantum dots, or multilevel atomic systems interacting with electromagnetic field.

  12. Timing and hamming weight attacks on minimal cost encryption scheme

    Institute of Scientific and Technical Information of China (English)

    YUAN Zheng; WANG Wei; ZHANG Hua; WEN Qiao-yan

    2009-01-01

Timing and Hamming weight attacks on the data encryption standard (DES) cryptosystem under the minimal cost encryption scheme are presented in this article. In the attack, timing information on encryption processing is used to select and collect effective plaintexts. The collected plaintexts are then used to infer the expanded key differences of the secret key, from which most bits of the expanded secret key are recovered. The remaining bits of the expanded secret key are deduced from the correlations between Hamming weight values of the inputs of the S-boxes in the first round. Finally, from the linear relation between the encryption time and the secret key's Hamming weight, all 56 bits of the secret key are recovered. Using the attack, the minimal cost encryption scheme can be broken with 2^23 known plaintexts and about 2^21 calculations at a success rate above 99%. The attack has lower computational complexity, and the method is more effective than previous methods.
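The leakage model underlying this kind of attack can be sketched in a few lines: the attacker exploits a measurable quantity (here, time) that correlates with the Hamming weight of an internal value. The sketch below is illustrative only; the timing constants and the noise-free 6-bit S-box input are hypothetical, not taken from the paper.

```python
def hamming_weight(x: int) -> int:
    """Number of set bits; the classic side-channel leakage model."""
    return bin(x).count("1")

def leakage(sbox_input: int, base_time: float = 100.0, per_bit: float = 1.5) -> float:
    """Toy model: encryption time grows linearly with the Hamming weight
    of a (hypothetical) 6-bit S-box input."""
    return base_time + per_bit * hamming_weight(sbox_input)

# An attacker who can observe leakage() recovers the Hamming weight, which
# partitions the 64 possible 6-bit S-box inputs into just 7 classes.
classes = {}
for v in range(64):
    classes.setdefault(hamming_weight(v), []).append(v)
```

The point of the partition is that each timing measurement narrows the candidate inputs to one of only seven Hamming-weight classes, which is what makes the key-recovery search tractable.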

  13. Bayesian electron density inference from JET lithium beam emission spectra using Gaussian processes

    CERN Document Server

Kwak, Sehyun; Brix, M.; Ghim, Y.-c.

    2016-01-01

A Bayesian model to infer edge electron density profiles is developed for the JET lithium beam emission spectroscopy system, measuring Li I line radiation using 26 channels with ~1 cm spatial resolution and 10~20 ms temporal resolution. The density profile is modelled using a Gaussian process prior, and the uncertainty of the density profile is calculated by a Markov Chain Monte Carlo (MCMC) scheme. From the spectra measured by the transmission grating spectrometer, the Li line intensities are extracted and modelled as a function of the plasma density by a multi-state model which describes the relevant processes between neutral lithium beam atoms and plasma particles. The spectral model fully takes into account interference filter and instrument effects, which are separately estimated, again using Gaussian processes. The line intensities are inferred based on a spectral model consistent with the measured spectra within their uncertainties, which includes photon statistics and electronic noise. Our newly devel...
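The Gaussian process prior in this kind of model can be sketched with a few lines of NumPy: draw candidate density profiles from a zero-mean GP with a squared-exponential covariance. The 26-point grid mirrors the channel count in the abstract, but the kernel choice and hyperparameters here are hypothetical; this is not the JET analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
r = np.linspace(0.0, 1.0, 26)           # 26 channels, as in the abstract
ell, sigma = 0.2, 1.0                    # hypothetical GP hyperparameters

# Squared-exponential covariance matrix over the channel positions
K = sigma**2 * np.exp(-0.5 * (r[:, None] - r[None, :]) ** 2 / ell**2)
K += 1e-8 * np.eye(len(r))               # jitter for numerical stability

# Draw three sample profiles from the zero-mean GP prior
samples = rng.multivariate_normal(np.zeros(len(r)), K, size=3)
```

In a full analysis, an MCMC scheme would score such profiles against the measured spectra; here the draws just show what the smoothness prior looks like before any data are seen.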

  14. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for the regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study.
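The core correction for outcome-dependent sampling is inverse-probability weighting: observations oversampled by design are down-weighted by their known sampling probabilities. The sketch below illustrates this with a simple Hájek-weighted mean rather than the paper's weighted partial likelihood for the Cox model; the sampling rule and probabilities are hypothetical.

```python
import random
random.seed(1)

# Population of outcomes; the sampling probability depends on the outcome.
population = [random.gauss(0.0, 1.0) for _ in range(100_000)]

def p_sample(y):
    """Hypothetical outcome-dependent design: oversample large outcomes."""
    return 0.9 if y > 1.0 else 0.1

sample = [(y, p_sample(y)) for y in population if random.random() < p_sample(y)]

# Naive mean of the biased sample vs. inverse-probability-weighted mean
naive = sum(y for y, _ in sample) / len(sample)
weighted = sum(y / p for y, p in sample) / sum(1.0 / p for _, p in sample)
```

The weighted estimate recovers (approximately) the true population mean of zero, while the naive sample mean is pulled upward by the oversampled large outcomes; the same weighting idea enters the estimating equation for the regression parameters.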

  15. Simple simulation of diffusion bridges with application to likelihood inference for diffusions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Sørensen, Michael

    With a view to likelihood inference for discretely observed diffusion type models, we propose a simple method of simulating approximations to diffusion bridges. The method is applicable to all one-dimensional diffusion processes and has the advantage that simple simulation methods like the Euler...... scheme can be applied to bridge simulation. Another advantage over other bridge simulation methods is that the proposed method works well when the diffusion bridge is defined in a long interval because the computational complexity of the method is linear in the length of the interval. In a simulation...... study we investigate the accuracy and efficiency of the new method and compare it to exact simulation methods. In the study the method provides a very good approximation to the distribution of a diffusion bridge for bridges that are likely to occur in applications to likelihood inference. To illustrate...
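The Euler scheme mentioned in the abstract can be sketched as a generic Euler-Maruyama discretisation. For contrast, the snippet pairs it with a crude rejection sampler for bridges (keep forward paths that end near the target); this baseline is exactly the inefficiency the paper's method is designed to avoid, and the OU drift, endpoints, and tolerance are hypothetical.

```python
import random
random.seed(2)

def euler_path(x0, drift, sigma, T=1.0, n=100):
    """Euler-Maruyama discretisation of dX = drift(X) dt + sigma(X) dW."""
    dt = T / n
    x, path = x0, [x0]
    for _ in range(n):
        x += drift(x) * dt + sigma(x) * (dt ** 0.5) * random.gauss(0, 1)
        path.append(x)
    return path

# Crude bridge sampler by rejection: keep forward paths ending near b.
drift = lambda x: -x        # Ornstein-Uhlenbeck drift (illustrative)
sigma = lambda x: 1.0
a, b, tol = 0.0, 0.5, 0.1
while True:
    path = euler_path(a, drift, sigma)
    if abs(path[-1] - b) < tol:
        break
```

Rejection sampling wastes most simulated paths and degrades badly on long intervals; the appeal of the proposed method is that ordinary forward schemes like this Euler step can be reused for bridge simulation with cost linear in the interval length.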

  16. Activity-Based Scene Decomposition for Topology Inference of Video Surveillance Network

    Directory of Open Access Journals (Sweden)

    Hongguang Zhang

    2014-01-01

The topology inference is the study of spatial and temporal relationships among cameras within a video surveillance network. We propose a novel approach to understand activities based on the visual coverage of a video surveillance network. In our approach, an optimal camera placement scheme is first presented by using a binary integer programming algorithm in order to maximize the surveillance coverage. Then, each camera view is decomposed into regions based on the Histograms of Color Optical Flow (HCOF), according to the spatial-temporal distribution of activity patterns observed in a training set of video sequences. We conduct experiments using hours of video sequences captured at an office building with seven camera views, all of which are sparse scenes with complex activities. The results of the real-scene experiment show that the features of histograms of color optical flow offer important contextual information for spatial and temporal topology inference of a camera network.
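The camera placement step is a coverage-maximization problem: choose at most k camera positions so that the union of covered cells is as large as possible. On a toy instance this binary integer program can be solved by exhaustive search, as sketched below; the coverage sets and budget are hypothetical, and a real deployment would use a proper BIP solver.

```python
from itertools import combinations

# Toy instance: cells each candidate camera position covers (hypothetical)
coverage = {
    "A": {1, 2, 3},
    "B": {3, 4},
    "C": {4, 5, 6},
    "D": {1, 6},
}
budget = 2  # number of cameras we may place

# Brute-force version of: maximize |union of covered cells| s.t. k <= budget
best = max(combinations(coverage, budget),
           key=lambda cams: len(set().union(*(coverage[c] for c in cams))))
covered = set().union(*(coverage[c] for c in best))
```

Here placing cameras at positions A and C covers all six cells; the binary integer programming formulation scales the same objective to realistic numbers of candidate positions.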

  17. Improvement of Some Proxy Signature Schemes

    Institute of Scientific and Technical Information of China (English)

    LIJiguo; LIANGZhenghe; ZHUYuelong; HANGYichen

    2005-01-01

In 1996, Mambo et al. introduced the concept of proxy signature. Proxy signatures can be applied to mobile agents, e-voting, etc. Recently, Sun and Hsieh showed that Lee et al.'s strong proxy signature scheme and its application to a multi-proxy signature scheme, Shum and Wei's privacy-protected strong proxy signature scheme, and Park and Lee's nominative proxy signature scheme were all insecure against the original signer's forgery attack. In this paper, we show that those proxy signature schemes do not withstand the public key substitution attack, and we give some slight but important modifications such that the resulting schemes are secure against both the original signer's forgery attack and the public key substitution attack. In addition, we show that Park and Lee's nominative proxy signature scheme does not satisfy strong non-repudiation and strong identifiability. The improved schemes satisfy all properties of a strong proxy signature scheme and do not require a secure channel between the original signer and the proxy signer.

  18. EVOLVING RETRIEVAL ALGORITHMS WITH A GENETIC PROGRAMMING SCHEME

    Energy Technology Data Exchange (ETDEWEB)

    J. THEILER; ET AL

    1999-06-01

The retrieval of scene properties (surface temperature, material type, vegetation health, etc.) from remotely sensed data is the ultimate goal of many earth observing satellites. The algorithms that have been developed for these retrievals are informed by physical models of how the raw data were generated. This includes models of radiation as emitted and/or reflected by the scene, propagated through the atmosphere, collected by the optics, detected by the sensor, and digitized by the electronics. To some extent, the retrieval is the inverse of this "forward" modeling problem. But in contrast to forward modeling, the practical task of making inferences about the original scene usually requires some ad hoc assumptions, good physical intuition, and a healthy dose of trial and error. The standard MTI data processing pipeline will employ algorithms developed with this traditional approach. But we will discuss some preliminary research on the use of a genetic programming scheme to "evolve" retrieval algorithms. Such a scheme cannot compete with the physical intuition of a remote sensing scientist, but it may be able to automate some of the trial and error. In this scenario, a training set is used, which consists of multispectral image data and the associated "ground truth," that is, a registered map of the desired retrieval quantity. The genetic programming scheme attempts to combine a core set of image processing primitives to produce an IDL (Interactive Data Language) program which estimates this retrieval quantity from the raw data.
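The combinatorial core of genetic programming, growing random expression trees over a fixed primitive set and evaluating them against training data, can be sketched briefly. The "primitives" below are toy arithmetic operations standing in for the image-processing primitives the abstract mentions; names and structure are illustrative only.

```python
import random
random.seed(3)

# Primitive set: toy operations standing in for image-processing primitives
PRIMITIVES = [
    ("add", lambda a, b: a + b),
    ("sub", lambda a, b: a - b),
    ("mul", lambda a, b: a * b),
]

def random_program(depth=2):
    """Grow a random expression tree over the primitive set."""
    if depth == 0:
        return ("x",)                      # terminal: the raw input
    name, _ = random.choice(PRIMITIVES)
    return (name, random_program(depth - 1), random_program(depth - 1))

def evaluate(tree, x):
    """Recursively evaluate an expression tree on input x."""
    if tree[0] == "x":
        return x
    fn = dict(PRIMITIVES)[tree[0]]
    return fn(evaluate(tree[1], x), evaluate(tree[2], x))

prog = random_program()
```

A full GP scheme would score many such programs against the ground-truth retrieval maps, then select, mutate, and recombine the best performers over generations.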

  19. Inference making ability and the function of inferences in reading comprehension

    Directory of Open Access Journals (Sweden)

    Salih Özenici

    2011-05-01

The aim of this study is to explain the relation between reading comprehension and inference. The main target of the reading process is to create a coherent mental representation of the text; it is therefore necessary to recognize relations between different parts of the text and to relate them to one another. During the reading process, it is necessary to complete missing information in the text or to add new information. All these processes require inference-making ability that goes beyond the information in the text. When readers use such active reading strategies as monitoring comprehension, prediction, inferring and background knowledge, they learn a lot more from the text and understand it better. In reading comprehension, making inferences is a constructive thinking process, because it is a cognitive process that forms meaning. When reading comprehension models are considered, it can easily be seen that linguistic elements cannot explain these processes by themselves; the ability of thinking and inference making is therefore needed. During the reading process, general world knowledge is necessary to form coherent relations between sentences. Information that comes from the context of the text alone will not be adequate to understand it. In order to overcome this deficiency and to integrate the meanings of different sentences with each other, it is necessary to make inferences. Readers make inferences in order to completely understand what the writer means, to interpret the sentences, and to form the combinations and relations between them.

  1. Rapid Parameterization Schemes for Aircraft Shape Optimization

    Science.gov (United States)

    Li, Wu

    2012-01-01

    A rapid shape parameterization tool called PROTEUS is developed for aircraft shape optimization. This tool can be applied directly to any aircraft geometry that has been defined in PLOT3D format, with the restriction that each aircraft component must be defined by only one data block. PROTEUS has eight types of parameterization schemes: planform, wing surface, twist, body surface, body scaling, body camber line, shifting/scaling, and linear morphing. These parametric schemes can be applied to two types of components: wing-type surfaces (e.g., wing, canard, horizontal tail, vertical tail, and pylon) and body-type surfaces (e.g., fuselage, pod, and nacelle). These schemes permit the easy setup of commonly used shape modification methods, and each customized parametric scheme can be applied to the same type of component for any configuration. This paper explains the mathematics for these parametric schemes and uses two supersonic configurations to demonstrate the application of these schemes.

  2. On Optimal Designs of Some Censoring Schemes

    Directory of Open Access Journals (Sweden)

    Dr. Adnan Mohammad Awad

    2016-03-01

The main objective of this paper is to explore the suitability of some entropy-information measures for introducing a new optimality censoring criterion and to apply it to some censoring schemes from some underlying lifetime models. In addition, the paper investigates four related issues, namely: the effect of the parameter of the parent distribution on the optimal scheme; the equivalence of schemes based on Shannon and Awad sup-entropy measures; the conjecture that the optimal scheme is a one-stage scheme; and a conjecture by Cramer and Bagh (2011) about Shannon minimum and maximum schemes when the parent distribution is reflected power. Guidelines for designing an optimal censoring plan are reported together with theoretical and numerical results and illustrations.

  3. A Signature Scheme with Non-Repudiation

    Institute of Scientific and Technical Information of China (English)

    XIN Xiangjun; GUO Xiaoli; XIAO Guozhen

    2006-01-01

Based on the Schnorr signature scheme, a new signature scheme with non-repudiation is proposed. In this scheme, only the signer and the designated receiver can verify the signature signed by the signer, and, if necessary, both the signer and the designated receiver can prove and show the validity of the signature. The proof of the validity of the signature is non-interactive and transferable. To verify and prove the validity of the signature, the signer and the designated receiver need not store extra information besides the signature itself. At the same time, neither the signer nor the designated receiver can deny a valid signature; there is thus no repudiation in this new signature scheme. According to the security analysis, the proposed scheme is secure against existential forgery under adaptive chosen-message attack.
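The underlying Schnorr scheme on which this construction builds can be sketched over deliberately tiny, insecure parameters (p = 23, q = 11, g = 2, where q divides p - 1 and g has order q). This is a textbook toy illustration of plain Schnorr signatures, not the paper's non-repudiation variant.

```python
import hashlib
import random
random.seed(4)

# Toy Schnorr parameters (insecure, illustration only): q | p - 1, ord(g) = q
p, q, g = 23, 11, 2

def H(r: int, m: bytes) -> int:
    """Hash the commitment and message down to Z_q."""
    return int.from_bytes(hashlib.sha256(str(r).encode() + m).digest(), "big") % q

x = random.randrange(1, q)      # private key
y = pow(g, x, p)                # public key

def sign(m: bytes):
    k = random.randrange(1, q)  # ephemeral nonce
    r = pow(g, k, p)            # commitment
    e = H(r, m)                 # challenge
    s = (k + x * e) % q         # response
    return e, s

def verify(m: bytes, e: int, s: int) -> bool:
    # Recompute r' = g^s * y^(-e) mod p; valid iff H(r', m) == e
    r_v = (pow(g, s, p) * pow(y, (-e) % q, p)) % p
    return H(r_v, m) == e
```

Verification works because g^s * y^(-e) = g^(k + xe) * g^(-xe) = g^k = r. In the paper's scheme this public verifiability is deliberately restricted so that only the designated receiver can check the signature.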

  4. A new Hedging algorithm and its application to inferring latent random variables

    CERN Document Server

    Freund, Yoav

    2008-01-01

    We present a new online learning algorithm for cumulative discounted gain. This learning algorithm does not use exponential weights on the experts. Instead, it uses a weighting scheme that depends on the regret of the master algorithm relative to the experts. In particular, experts whose discounted cumulative gain is smaller (worse) than that of the master algorithm receive zero weight. We also sketch how a regret-based algorithm can be used as an alternative to Bayesian averaging in the context of inferring latent random variables.
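The weighting idea described in the abstract, zero weight for experts trailing the master and weight proportional to how far ahead the others are, can be sketched directly. The normalisation and the uniform fallback below are hypothetical choices, not the paper's exact rule.

```python
def regret_weights(expert_gains, master_gain):
    """Weight each expert by its positive regret relative to the master.

    Experts whose cumulative (discounted) gain is no better than the
    master's receive zero weight; the rest are weighted by their lead.
    """
    raw = [max(g - master_gain, 0.0) for g in expert_gains]
    total = sum(raw)
    n = len(raw)
    # Hypothetical fallback: uniform weights when no expert beats the master.
    return [w / total for w in raw] if total > 0 else [1.0 / n] * n

w = regret_weights([3.0, 1.0, 5.0], master_gain=2.0)
```

With gains (3, 1, 5) against a master gain of 2, the second expert is zeroed out and the remaining weight splits 1:3 between the first and third experts.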

  5. Inferring the Composition of a Trader Population in a Financial Market

    CERN Document Server

    Gupta, Nachi; Johnson, Neil F

    2007-01-01

    We discuss a method for predicting financial movements and finding pockets of predictability in the price-series, which is built around inferring the heterogeneity of trading strategies in a multi-agent trader population. This work explores extensions to our previous framework (arXiv:physics/0506134). Here we allow for more intelligent agents possessing a richer strategy set, and we no longer constrain the estimate for the heterogeneity of the agents to a probability space. We also introduce a scheme which allows the incorporation of models with a wide variety of agent types, and discuss a mechanism for the removal of bias from relevant parameters.

  6. Understanding COBOL systems using inferred types

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon)

    1999-01-01

    textabstractIn a typical COBOL program, the data division consists of 50 of the lines of code. Automatic type inference can help to understand the large collections of variable declarations contained therein, showing how variables are related based on their actual usage. The most problematic aspect

  7. John Updike and Norman Mailer: Sport Inferences.

    Science.gov (United States)

    Upshaw, Kathryn Jane

    The phenomenon of writer use of sport inferences in the literary genre of the novel is examined in the works of Updike and Mailer. Novels of both authors were reviewed in order to study the pattern of usage in each novel. From these patterns, concepts which illustrated the sport philosophies of each author were used for general comparisons of the…

  8. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Michael D.; Dawson, William A. [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States); Hogg, David W. [Center for Cosmology and Particle Physics, New York University, New York, NY 10003 (United States); Marshall, Philip J.; Bard, Deborah J. [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); Meyers, Joshua [Kavli Institute for Particle Astrophysics and Cosmology, Stanford University, 452 Lomita Mall, Stanford, CA 94035 (United States); Lang, Dustin, E-mail: schneider42@llnl.gov [Department of Physics, Carnegie Mellon University, Pittsburgh, PA 15213 (United States)

    2015-07-01

Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.
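The importance sampling step, drawing from a convenient proposal and reweighting by target-to-proposal density ratios, can be illustrated with one-dimensional Gaussians. The target and proposal below are hypothetical stand-ins, not the shear posterior.

```python
import math
import random
random.seed(5)

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

target = lambda x: normal_pdf(x, 1.0, 0.5)     # stand-in "posterior"
proposal = lambda x: normal_pdf(x, 0.0, 2.0)   # broad, easy-to-sample proposal

# Self-normalized importance sampling estimate of E_target[x]
xs = [random.gauss(0.0, 2.0) for _ in range(50_000)]
ws = [target(x) / proposal(x) for x in xs]
estimate = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
```

The estimate converges to the target mean of 1.0 even though all samples came from the proposal; in the paper the same trick lets local per-patch models be reweighted under the global shear parameters without re-running the local inference.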

  9. Inferring Internet Denial-of-Service Activity

    Science.gov (United States)

    2007-11-02

Inferring Internet Denial-of-Service Activity. David Moore, CAIDA, San Diego Supercomputer Center, University of California, San Diego, dmoore@caida.org ...the local network topology. kc claffy and Colleen Shannon at CAIDA provided support and valuable feedback throughout the project. David Wetherall

  10. GAMBIT: Global And Modular BSM Inference Tool

    Science.gov (United States)

    GAMBIT Collaboration; Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-08-01

    GAMBIT (Global And Modular BSM Inference Tool) performs statistical global fits of generic physics models using a wide range of particle physics and astrophysics data. Modules provide native simulations of collider and astrophysics experiments, a flexible system for interfacing external codes (the backend system), a fully featured statistical and parameter scanning framework, and additional tools for implementing and using hierarchical models.

  11. Linguistic Markers of Inference Generation While Reading

    Science.gov (United States)

    Clinton, Virginia; Carlson, Sarah E.; Seipel, Ben

    2016-01-01

    Words can be informative linguistic markers of psychological constructs. The purpose of this study is to examine associations between word use and the process of making meaningful connections to a text while reading (i.e., inference generation). To achieve this purpose, think-aloud data from third-fifth grade students (N = 218) reading narrative…

  12. New Inference Rules for Max-SAT

    CERN Document Server

    Li, C M; Planes, J; 10.1613/jair.2215

    2011-01-01

Exact Max-SAT solvers, compared with SAT solvers, apply little inference at each node of the proof tree. Commonly used SAT inference rules like unit propagation produce a simplified formula that preserves satisfiability but, unfortunately, solving the Max-SAT problem for the simplified formula is not equivalent to solving it for the original formula. In this paper, we define a number of original inference rules that, besides being applied efficiently, transform Max-SAT instances into equivalent Max-SAT instances which are easier to solve. The soundness of the rules, which can be seen as refinements of unit resolution adapted to Max-SAT, is proved in a novel and simple way via an integer programming transformation. With the aim of finding out how powerful the inference rules are in practice, we have developed a new Max-SAT solver, called MaxSatz, which incorporates those rules, and performed an experimental investigation. The results provide empirical evidence that MaxSatz is very competitive, at least, on ran...
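The SAT inference rule the abstract contrasts against, unit propagation, can be sketched compactly with DIMACS-style integer literals. As the abstract notes, this simplification preserves satisfiability but not Max-SAT optimality, which is why the paper introduces Max-SAT-specific rules instead.

```python
def unit_propagate(clauses):
    """Repeatedly assign the literal of any unit clause and simplify.

    Clauses are lists of nonzero ints (DIMACS-style: -3 means NOT x3).
    Returns (simplified clauses, assignment), or (None, assignment) if a
    clause becomes empty (conflict).
    """
    assignment = {}
    clauses = [list(c) for c in clauses]
    while True:
        units = [c[0] for c in clauses if len(c) == 1]
        if not units:
            return clauses, assignment
        lit = units[0]
        assignment[abs(lit)] = lit > 0
        new = []
        for c in clauses:
            if lit in c:
                continue                    # clause satisfied, drop it
            reduced = [l for l in c if l != -lit]
            if not reduced:
                return None, assignment     # conflict: empty clause
            new.append(reduced)
        clauses = new

# (x1) AND (NOT x1 OR x2) AND (NOT x2 OR x3): propagation assigns all three
simplified, assign = unit_propagate([[1], [-1, 2], [-2, 3]])
```

On the example, the chain of unit clauses forces x1, x2, and x3 all true and leaves no clauses, i.e. the formula is satisfied by propagation alone.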

  13. Ignorability in Statistical and Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...

  14. Nonparametric Bayes inference for concave distribution functions

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Lauritzen, Steffen Lilholt

    2002-01-01

    Bayesian inference for concave distribution functions is investigated. This is made by transforming a mixture of Dirichlet processes on the space of distribution functions to the space of concave distribution functions. We give a method for sampling from the posterior distribution using a Pólya urn...

  15. Campbell's and Rubin's Perspectives on Causal Inference

    Science.gov (United States)

    West, Stephen G.; Thoemmes, Felix

    2010-01-01

    Donald Campbell's approach to causal inference (D. T. Campbell, 1957; W. R. Shadish, T. D. Cook, & D. T. Campbell, 2002) is widely used in psychology and education, whereas Donald Rubin's causal model (P. W. Holland, 1986; D. B. Rubin, 1974, 2005) is widely used in economics, statistics, medicine, and public health. Campbell's approach focuses on…

  16. Decision generation tools and Bayesian inference

    Science.gov (United States)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime ones.

  18. "Comments on Slavin": Synthesizing Causal Inferences

    Science.gov (United States)

    Briggs, Derek C.

    2008-01-01

    When causal inferences are to be synthesized across multiple studies, efforts to establish the magnitude of a causal effect should be balanced by an effort to evaluate the generalizability of the effect. The evaluation of generalizability depends on two factors that are given little attention in current syntheses: construct validity and external…

  19. On Measurement Bias in Causal Inference

    CERN Document Server

    Pearl, Judea

    2012-01-01

This paper addresses the problem of measurement errors in causal inference and highlights several algebraic and graphical methods for eliminating systematic bias induced by such errors. In particular, the paper discusses the control of partially observable confounders in parametric and nonparametric models and the computational problem of obtaining bias-free effect estimates in such models.

  20. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.
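The "Poisson process on the phylogeny" characterization means that the number of insertion events is Poisson with rate proportional to total branch length, and each event lands on a branch with probability proportional to that branch's length. The sketch below samples events this way on a toy tree; the topology, branch lengths, and rate are hypothetical.

```python
import math
import random
random.seed(6)

# Toy phylogeny: branch name -> branch length (hypothetical)
branches = {"root->A": 0.5, "root->B": 1.0, "B->C": 0.25, "B->D": 0.25}
lam = 4.0                                  # hypothetical insertion rate
total = sum(branches.values())             # total branch length = 2.0

def sample_poisson(rate):
    """Knuth's multiplication algorithm; fine for small rates."""
    l, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

# Number of insertion events on the whole tree, then their locations
n_events = sample_poisson(lam * total)
events = random.choices(list(branches), weights=branches.values(), k=n_events)
```

It is this decoupling, a single Poisson count for the whole tree plus independent uniform placement, that lets the PIP likelihood computations factorize and drop from exponential to linear cost in the number of taxa.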

  1. Making statistical inferences about software reliability

    Science.gov (United States)

    Miller, Douglas R.

    1988-01-01

Failure times of software undergoing random debugging can be modelled as order statistics of independent but non-identically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in the statistical verification of very highly reliable software, such as that used by digital avionics in commercial aircraft.
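The model in the abstract can be simulated in a few lines: each latent bug has its own exponential detection time, and the tester observes the sorted (order-statistic) sequence of those times. The number of bugs and the rates below are hypothetical.

```python
import random
random.seed(7)

# Hypothetical: 5 latent bugs, each with its own exponential detection rate
rates = [2.0, 1.5, 1.0, 0.5, 0.25]

# Individual detection times are independent but NOT identically distributed
detection_times = [random.expovariate(r) for r in rates]

# The observed failure times are the order statistics of the detection times
failure_times = sorted(detection_times)
```

After k failures have been observed, only the remaining (typically hard-to-find, low-rate) bugs drive future failures, which is why demonstrating ultra-high reliability from observed failure data alone is so difficult.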

  2. Spurious correlations and inference in landscape genetics

    Science.gov (United States)

    Samuel A. Cushman; Erin L. Landguth

    2010-01-01

Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causal modelling with partial...

  3. Understanding COBOL systems using inferred types

    NARCIS (Netherlands)

    Deursen, A. van; Moonen, L.M.F.

    1999-01-01

In a typical COBOL program, the data division consists of 50% of the lines of code. Automatic type inference can help to understand the large collections of variable declarations contained therein, showing how variables are related based on their actual usage. The most problematic aspect of type inference...

  4. Double jeopardy in inferring cognitive processes.

    Science.gov (United States)

    Fific, Mario

    2014-01-01

Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy of cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity to learn about n cognitive processes should be equal to 2^n. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) the saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross-combined the levels of the jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual subject analyses and the utilization of full-factorial (SFT) experimental designs.
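SFT's full-factorial requirement is easy to make concrete: with one two-level saliency factor per process, fully crossing the levels yields 2^n conditions for n processes. A minimal sketch (factor names are hypothetical):

```python
from itertools import product

# One two-level saliency factor per hypothetical cognitive process;
# the full cross gives 2**n experimental conditions.
def full_factorial(factors):
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*factors.values())]

design = full_factorial({"process_A": ["low", "high"],
                         "process_B": ["low", "high"]})
```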

  5. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process.
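Both predicted effects fall out of a one-dimensional Gaussian sketch of the observer: a zero-mean prior over implied speed shrinks the posterior length estimate, more so for shorter temporal separations and for noisier (weaker) taps. The parameter values below are illustrative, not fitted to the study.

```python
# Implied speed has a zero-mean Gaussian prior (sd sigma_v), so the
# implied length l = v * t has prior sd sigma_v * t; the measured
# length carries localization noise sigma_m. The posterior mean is
# the measured length shrunk toward zero.
def perceived_length(measured, t, sigma_m, sigma_v):
    prior_var = (sigma_v * t) ** 2
    weight = prior_var / (prior_var + sigma_m ** 2)  # weight on data
    return weight * measured

l_fast = perceived_length(10.0, t=0.2, sigma_m=1.0, sigma_v=5.0)
l_slow = perceived_length(10.0, t=1.0, sigma_m=1.0, sigma_v=5.0)
```

Shorter separation (`l_fast`) yields stronger contraction than longer separation (`l_slow`), and increasing `sigma_m` (weaker taps) deepens the contraction, matching the two psychophysical effects reported.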

  6. Colligation or, The Logical Inference of Interconnection

    DEFF Research Database (Denmark)

    Franksen, Ole Immanuel; Falster, Peter

    2000-01-01

laws or assumptions. Yet interconnection as an abstract concept seems to be without scientific underpinning in our logic. Adopting a historical viewpoint, our aim is to show that the reasoning of interconnection may be identified with a neglected kind of logical inference, called "colligation...

  7. Inferring comprehensible business/ICT alignment rules

    NARCIS (Netherlands)

    Cumps, B.; Martens, D.; De Backer, M.; Haesen, R.; Viaene, S.; Dedene, G.; Baesens, B.; Snoeck, M.

    2009-01-01

We inferred business rules for business/ICT alignment by applying a novel rule induction algorithm on a data set containing rich alignment information polled from 641 organisations in 7 European countries. The alignment rule set was created using AntMiner+, a rule induction technique with a reputation...

  8. Quasi-Experimental Designs for Causal Inference

    Science.gov (United States)

    Kim, Yongnam; Steiner, Peter

    2016-01-01

    When randomized experiments are infeasible, quasi-experimental designs can be exploited to evaluate causal treatment effects. The strongest quasi-experimental designs for causal inference are regression discontinuity designs, instrumental variable designs, matching and propensity score designs, and comparative interrupted time series designs. This…

  9. How to Make Inference in Reading

    Institute of Scientific and Technical Information of China (English)

    何少芳

    2013-01-01

    Students often have difficulties in reading comprehension because of too many new and unfamiliar words, too little background knowledge and different patterns of thinking among different countries. In this thesis, I recommend applying context clues, synonyms or antonyms, examples, definitions or explanations, cause/effect clues and background clues to make inference when we read texts.

  10. Investigating Mathematics Teachers' Thoughts of Statistical Inference

    Science.gov (United States)

    Yang, Kai-Lin

    2012-01-01

    Research on statistical cognition and application suggests that statistical inference concepts are commonly misunderstood by students and even misinterpreted by researchers. Although some research has been done on students' misunderstanding or misconceptions of confidence intervals (CIs), few studies explore either students' or mathematics…

  11. Non-Parametric Inference in Astrophysics

    CERN Document Server

    Wasserman, L H; Nichol, R C; Genovese, C; Jang, W; Connolly, A J; Moore, A W; Schneider, J; Wasserman, Larry; Miller, Christopher J.; Nichol, Robert C.; Genovese, Chris; Jang, Woncheol; Connolly, Andrew J.; Moore, Andrew W.; Schneider, Jeff; group, the PICA

    2001-01-01

    We discuss non-parametric density estimation and regression for astrophysics problems. In particular, we show how to compute non-parametric confidence intervals for the location and size of peaks of a function. We illustrate these ideas with recent data on the Cosmic Microwave Background. We also briefly discuss non-parametric Bayesian inference.

  12. The importance of learning when making inferences

    Directory of Open Access Journals (Sweden)

    Jorg Rieskamp

    2008-03-01

Full Text Available The assumption that people possess a repertoire of strategies to solve the inference problems they face has been made repeatedly. The experimental findings of two previous studies on strategy selection are reexamined from a learning perspective, which argues that people learn to select strategies for making probabilistic inferences. This learning process is modeled with the strategy selection learning (SSL) theory, which assumes that people develop subjective expectancies for the strategies they have. They select strategies proportional to their expectancies, which are updated on the basis of experience. For the study by Newell, Weston, and Shanks (2003) it can be shown that people did not anticipate the success of a strategy from the beginning of the experiment. Instead, the behavior observed at the end of the experiment was the result of a learning process that can be described by the SSL theory. For the second study, by Bröder and Schiffer (2006), the SSL theory is able to provide an explanation for why participants only slowly adapted to new environments in a dynamic inference situation. The reanalysis of the previous studies illustrates the importance of learning for probabilistic inferences.
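The expectancy-based selection-and-update loop that SSL describes can be sketched in a few lines. The update rule, learning values, and strategy names below are simplified stand-ins for the published parameterization.

```python
# Strategies are chosen with probability proportional to subjective
# expectancies; the chosen strategy's expectancy is reinforced by
# the payoff it earns.
def choice_probs(expectancies):
    total = sum(expectancies.values())
    return {s: v / total for s, v in expectancies.items()}

def update(expectancies, chosen, payoff):
    new = dict(expectancies)
    new[chosen] += payoff            # reinforce the applied strategy
    return new

q = {"take-the-best": 1.0, "weighted-additive": 1.0}   # flat start
for _ in range(10):                  # take-the-best keeps paying off
    q = update(q, "take-the-best", payoff=0.5)
probs = choice_probs(q)
```

After repeated success, selection probability drifts toward the rewarded strategy, which is the gradual adaptation the reanalysis highlights.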

  13. Active interoceptive inference and the emotional brain

    Science.gov (United States)

    Friston, Karl J.

    2016-01-01

    We review a recent shift in conceptions of interoception and its relationship to hierarchical inference in the brain. The notion of interoceptive inference means that bodily states are regulated by autonomic reflexes that are enslaved by descending predictions from deep generative models of our internal and external milieu. This re-conceptualization illuminates several issues in cognitive and clinical neuroscience with implications for experiences of selfhood and emotion. We first contextualize interoception in terms of active (Bayesian) inference in the brain, highlighting its enactivist (embodied) aspects. We then consider the key role of uncertainty or precision and how this might translate into neuromodulation. We next examine the implications for understanding the functional anatomy of the emotional brain, surveying recent observations on agranular cortex. Finally, we turn to theoretical issues, namely, the role of interoception in shaping a sense of embodied self and feelings. We will draw links between physiological homoeostasis and allostasis, early cybernetic ideas of predictive control and hierarchical generative models in predictive processing. The explanatory scope of interoceptive inference ranges from explanations for autism and depression, through to consciousness. We offer a brief survey of these exciting developments. This article is part of the themed issue ‘Interoception beyond homeostasis: affect, cognition and mental health’. PMID:28080966

  14. Bayesian structural inference for hidden processes.

    Science.gov (United States)

    Strelioff, Christopher C; Crutchfield, James P

    2014-04-01

We introduce a Bayesian approach to discovering patterns in structurally complex processes. The proposed method of Bayesian structural inference (BSI) relies on a set of candidate unifilar hidden Markov model (uHMM) topologies for inference of process structure from a data series. We employ a recently developed exact enumeration of topological ε-machines. (A sequel then removes the topological restriction.) This subset of the uHMM topologies has the added benefit that inferred models are guaranteed to be ε-machines, irrespective of estimated transition probabilities. Properties of ε-machines and uHMMs allow for the derivation of analytic expressions for estimating transition probabilities, inferring start states, and comparing the posterior probability of candidate model topologies, despite process internal structure being only indirectly present in data. We demonstrate BSI's effectiveness in estimating a process's randomness, as reflected by the Shannon entropy rate, and its structure, as quantified by the statistical complexity. We also compare using the posterior distribution over candidate models and the single, maximum a posteriori model for point estimation and show that the former more accurately reflects uncertainty in estimated values. We apply BSI to in-class examples of finite- and infinite-order Markov processes, as well as to an out-of-class, infinite-state hidden process.

  15. Inference and the Introductory Statistics Course

    Science.gov (United States)

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-01-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its…

  16. Multi-object quantum traveling ballot scheme

    Institute of Scientific and Technical Information of China (English)

    Yuan Li; Guihua Zeng

    2009-01-01

Based on quantum mechanics, a traveling ballot scheme with anonymity and secrecy is introduced to realize voting. By searching objects in a large number of databases, every voter may cast votes for his desired candidates. The proposed scheme may therefore be applied to voting with a great number of candidates, such as network voting. A security analysis of the present scheme is also performed.

  17. SYNCHRONIZATION RECOVERY SCHEME IN WATERMARKING DETECTION

    Institute of Scientific and Technical Information of China (English)

    Xiao Weiwei; Zhang Li; Ji Zhen; Zhang Jihong

    2003-01-01

Most proposed digital watermarking algorithms are sensitive to geometric attacks because the synchronization information of watermark embedding and detection is destroyed. In this letter a novel synchronization recovery scheme based on image normalization is proposed. The presented scheme does not require the original image and can be applied to various watermark systems. A wavelet-based watermarking scheme is proposed as an example, and experimental results show that it is robust to geometric attacks.

  18. Blind Signature Scheme Based on Chebyshev Polynomials

    Directory of Open Access Journals (Sweden)

    Maheswara Rao Valluri

    2011-12-01

Full Text Available A blind signature scheme is a cryptographic protocol to obtain a valid signature for a message from a signer such that the signer's view of the protocol cannot be linked to the resulting message-signature pair. This paper presents a blind signature scheme using Chebyshev polynomials. The security of the given scheme depends upon the intractability of the integer factorization problem and discrete logarithms of Chebyshev polynomials.
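The algebraic ingredient behind Chebyshev-based cryptography is the semigroup property T_r(T_s(x)) = T_rs(x), which lets Chebyshev polynomials play the role that modular exponentiation plays in discrete-logarithm-based schemes. A quick numerical check:

```python
# Evaluate T_n(x) via the recurrence T_n = 2*x*T_{n-1} - T_{n-2},
# then verify the semigroup (commutativity) property numerically.
def chebyshev(n, x):
    t0, t1 = 1.0, x
    if n == 0:
        return t0
    for _ in range(n - 1):
        t0, t1 = t1, 2 * x * t1 - t0
    return t1

x = 0.4
lhs = chebyshev(3, chebyshev(5, x))   # T_3(T_5(x))
rhs = chebyshev(15, x)                # T_15(x)
```

Because the composition commutes, two parties can apply secret degrees in either order and arrive at the same value, exactly as with commuting exponents.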

  19. Blind Signature Scheme Based on Chebyshev Polynomials

    OpenAIRE

    Maheswara Rao Valluri

    2011-01-01

A blind signature scheme is a cryptographic protocol to obtain a valid signature for a message from a signer such that the signer's view of the protocol cannot be linked to the resulting message-signature pair. This paper presents a blind signature scheme using Chebyshev polynomials. The security of the given scheme depends upon the intractability of the integer factorization problem and discrete logarithms of Chebyshev polynomials.

  20. Pyramid Schemes on the Tibetan Plateau

    OpenAIRE

    Devin Gonier; Rgyal yum sgrol ma

    2012-01-01

    The unique features of pyramid schemes and certain underlying causes for their development on the Tibetan Plateau are analyzed. Research was conducted by analyzing 521 surveys, allowing estimation of pyramid scheme activity on the Plateau and an identification of related cultural and social specificities. Firsthand accounts were collected revealing details of personal involvement. Survey data and similarities in the accounts were studied to suggest how involvement in pyramid schemes might be ...

  1. Resonance ionization scheme development for europium

    CERN Document Server

    Chrysalidis, K; Fedosseev, V N; Marsh, B A; Naubereit, P; Rothe, S; Seiffert, C; Kron, T; Wendt, K

    2017-01-01

    Odd-parity autoionizing states of europium have been investigated by resonance ionization spectroscopy via two-step, two-resonance excitations. The aim of this work was to establish ionization schemes specifically suited for europium ion beam production using the ISOLDE Resonance Ionization Laser Ion Source (RILIS). 13 new RILIS-compatible ionization schemes are proposed. The scheme development was the first application of the Photo Ionization Spectroscopy Apparatus (PISA) which has recently been integrated into the RILIS setup.

  2. Signcryption scheme based on schnorr digital signature

    CERN Document Server

    Savu, Laura

    2012-01-01

This article presents a new signcryption scheme which is based on the Schnorr digital signature algorithm. The new scheme represents my personal contribution to the signcryption area. I have implemented the algorithm in a program; the steps of the algorithm, the results, and some examples are provided here. The paper also contains a presentation of the original Signcryption scheme, based on the ElGamal digital signature, and discusses the practical applications of Signcryption in real life.
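For background, the underlying Schnorr signature itself (not the signcryption construction) can be sketched in a few lines. The group parameters below (p = 2039, q = 1019, g = 4) are deliberately tiny and insecure; they only illustrate the algebra.

```python
import hashlib
import random

# Toy Schnorr signature over a Schnorr group p = 2q + 1, where g
# generates the order-q subgroup. Illustrative parameters only.
p, q, g = 2039, 1019, 4

def H(r, msg):
    digest = hashlib.sha256(f"{r}|{msg}".encode()).digest()
    return int.from_bytes(digest, "big") % q

def keygen(rng):
    x = rng.randrange(1, q)               # private key
    return x, pow(g, x, p)                # public key y = g^x mod p

def sign(msg, x, rng):
    k = rng.randrange(1, q)               # fresh per-signature nonce
    r = pow(g, k, p)
    e = H(r, msg)
    s = (k + x * e) % q
    return e, s

def verify(msg, sig, y):
    e, s = sig
    r = (pow(g, s, p) * pow(y, -e, p)) % p    # g^s * y^-e recovers g^k
    return H(r, msg) == e

rng = random.Random(7)
x, y = keygen(rng)
sig = sign("hello", x, rng)
```

Signcryption then merges this signing computation with key agreement, so a single pass both signs and encrypts.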

  3. SIGNCRYPTION BASED ON DIFFERENT DIGITAL SIGNATURE SCHEMES

    OpenAIRE

    Adrian Atanasiu; Laura Savu

    2012-01-01

This article presents two new signcryption schemes. The first one is based on the Schnorr digital signature algorithm and the second one uses the Proxy Signature scheme introduced by Mambo. Schnorr Signcryption has been implemented in a program, and the steps of the algorithm, the results, and some examples are provided here. Mambo's Proxy Signature is adapted for the Shortened Digital Signature Standard, becoming part of a new Proxy Signcryption scheme.

  4. Resonance ionization scheme development for europium

    Energy Technology Data Exchange (ETDEWEB)

Chrysalidis, K., E-mail: katerina.chrysalidis@cern.ch; Goodacre, T. Day; Fedosseev, V. N.; Marsh, B. A. [CERN (Switzerland); Naubereit, P. [Johannes Gutenberg-Universität, Institut für Physik (Germany); Rothe, S.; Seiffert, C. [CERN (Switzerland); Kron, T.; Wendt, K. [Johannes Gutenberg-Universität, Institut für Physik (Germany)

    2017-11-15

    Odd-parity autoionizing states of europium have been investigated by resonance ionization spectroscopy via two-step, two-resonance excitations. The aim of this work was to establish ionization schemes specifically suited for europium ion beam production using the ISOLDE Resonance Ionization Laser Ion Source (RILIS). 13 new RILIS-compatible ionization schemes are proposed. The scheme development was the first application of the Photo Ionization Spectroscopy Apparatus (PISA) which has recently been integrated into the RILIS setup.

  5. General Compact Labeling Schemes for Dynamic Trees

    OpenAIRE

    2006-01-01

Let $F$ be a function on pairs of vertices. An {\em $F$-labeling scheme} is composed of a {\em marker} algorithm for labeling the vertices of a graph with short labels, coupled with a {\em decoder} algorithm allowing one to compute $F(u,v)$ of any two vertices $u$ and $v$ directly from their labels. As applications for labeling schemes concern mainly large and dynamically changing networks, it is of interest to study {\em distributed dynamic} labeling schemes. This paper investigates labeling...

  6. HyFIS: adaptive neuro-fuzzy inference systems and their application to nonlinear dynamical systems.

    Science.gov (United States)

    Kim, J; Kasabov, N

    1999-11-01

This paper proposes an adaptive neuro-fuzzy system, HyFIS (Hybrid neural Fuzzy Inference System), for building and optimising fuzzy models. The proposed model introduces the learning power of neural networks to fuzzy logic systems and provides linguistic meaning to the connectionist architectures. Heuristic fuzzy logic rules and input-output fuzzy membership functions can be optimally tuned from training examples by a hybrid learning scheme comprising two phases: a rule-generation phase from data, and a rule-tuning phase using an error backpropagation learning scheme for a neural fuzzy system. To illustrate the performance and applicability of the proposed neuro-fuzzy hybrid model, extensive simulation studies of nonlinear complex dynamic systems are carried out. The proposed method can be applied to on-line incremental adaptive learning for the prediction and control of nonlinear dynamical systems. Two benchmark case studies are used to demonstrate that the proposed HyFIS system is a superior neuro-fuzzy modelling technique.
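The rule-tuning phase can be sketched as gradient descent on a tiny fuzzy model. The membership functions, rules, and training data below are illustrative assumptions, not the HyFIS implementation.

```python
import math

# Two fixed Gaussian membership functions; gradient descent (error
# backpropagation) tunes only the rule consequents w.
def gauss(x, c, s):
    return math.exp(-((x - c) / s) ** 2)

centers, sigma = [0.0, 1.0], 0.5   # fixed antecedent memberships
w = [0.0, 0.0]                     # tunable rule consequents

def predict(x):
    mu = [gauss(x, c, sigma) for c in centers]
    z = sum(mu)
    return sum(wi * mi for wi, mi in zip(w, mu)) / z

data = [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)]   # target map: y = x
for _ in range(200):               # rule-tuning phase
    for x, y in data:
        mu = [gauss(x, c, sigma) for c in centers]
        z = sum(mu)
        err = predict(x) - y
        for i in range(2):         # dE/dw_i = err * mu_i / z
            w[i] -= 0.5 * err * mu[i] / z
```

The same chain-rule machinery extends to tuning the membership centers and widths themselves, which is what gives the hybrid model its adaptivity.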

  7. Inference and Assumption in Historical Seismology

    Science.gov (United States)

    Musson, R. M. W.

The principal aim in studies of historical earthquakes is usually to be able to derive parameters for past earthquakes from macroseismic or other data and thus extend back in time parametric earthquake catalogues, often with improved seismic hazard studies as the ultimate goal. In cases of relatively recent historical earthquakes, for example, those of the 18th and 19th centuries, it is often the case that there is such an abundance of available macroseismic data that estimating earthquake parameters is relatively straightforward. For earlier historical periods, especially medieval and earlier, and also for areas where settlement or documentation are sparse, the situation is much harder. The seismologist often finds that he has only a few data points (or even one) for an earthquake that nevertheless appears to be regionally significant. In such cases, it is natural that the investigator will attempt to make the most of the available data, expanding it by making working assumptions, and from these deriving conclusions by inference (i.e. the process of proceeding logically from some premise). This can be seen in a number of existing studies; in some cases extremely slight data are so magnified by the use of inference that one must regard the results as tentative in the extreme. Two main types of inference can be distinguished. The first type is inference from documentation. This is where assumptions are made such as: the absence of a report of the earthquake from this monastic chronicle indicates that at this locality the earthquake was not felt. The second type is inference from seismicity. Here one deals with arguments such as: all recent earthquakes felt at town X are events occurring in seismic zone Y, therefore this ancient earthquake which is only reported at town X probably also occurred in this zone.

  8. Optimizing Decision Tree Attack on CAS Scheme

    Directory of Open Access Journals (Sweden)

    PERKOVIC, T.

    2016-05-01

Full Text Available In this paper we show a successful side-channel timing attack on a well-known high-complexity cognitive authentication (CAS) scheme. We exploit a weakness of the CAS scheme that comes from the asymmetry of the virtual interface and graphical layout, which results in nonuniform human behavior during the login procedure, leading to detectable variations in users' response times. We optimized a well-known probabilistic decision tree attack on the CAS scheme by introducing this timing information into the attack. We show that the developed classifier could be used to significantly reduce the number of login sessions required to break the CAS scheme.

  9. Key Predistribution Schemes for Distributed Sensor Networks

    CERN Document Server

    Bose, Mausumi; Mukerjee, Rahul

    2011-01-01

    Key predistribution schemes for distributed sensor networks have received significant attention in the recent literature. In this paper we propose a new construction method for these schemes based on combinations of duals of standard block designs. Our method is a broad spectrum one which works for any intersection threshold. By varying the initial designs, we can generate various schemes and this makes the method quite flexible. We also obtain explicit algebraic expressions for the metrics for local connectivity and resiliency. These schemes are quite efficient with regard to connectivity and resiliency and at the same time they allow a straightforward shared-key discovery.
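The paper's construction is combinatorial (duals of block designs); as a simpler stand-in that exposes the same interface (predistribute key rings, then run shared-key discovery), here is a random-ring sketch in the style of Eschenauer and Gligor, not the authors' scheme.

```python
import random

# Each sensor node receives a random ring of key ids drawn from a
# common pool; two nodes can secure a link iff their rings intersect.
def predistribute(n_nodes, pool_size, ring_size, seed=1):
    rng = random.Random(seed)
    return [frozenset(rng.sample(range(pool_size), ring_size))
            for _ in range(n_nodes)]

def shared_key(ring_a, ring_b):
    """Shared-key discovery: any common key id can secure the link."""
    common = ring_a & ring_b
    return min(common) if common else None

rings = predistribute(n_nodes=20, pool_size=100, ring_size=25)
links = sum(1 for i in range(20) for j in range(i + 1, 20)
            if shared_key(rings[i], rings[j]) is not None)
```

Local connectivity is the fraction of the 190 node pairs with a secure link; resiliency asks how many other links an adversary learns by compromising a few rings, the two metrics for which the paper derives explicit expressions.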

  10. A new access scheme in OFDMA systems

    Institute of Scientific and Technical Information of China (English)

    GU Xue-lin; YAN Wei; TIAN Hui; ZHANG Ping

    2006-01-01

This article presents a dynamic random access scheme for orthogonal frequency division multiple access (OFDMA) systems. The key features of the proposed scheme are: it is a combination of both the distributed and the centralized schemes, it can accommodate several delay sensitivity classes, and it can adjust the number of random access channels in a media access control (MAC) frame and the access probability according to the outcome of Mobile Terminals' access attempts in previous MAC frames. For floating populated packet-based networks, the proposed scheme possibly leads to high average user satisfaction.

  11. Efficient Certificateless Signcryption Scheme from Weil Pairing

    Directory of Open Access Journals (Sweden)

    Gang Yu

    2011-08-01

Full Text Available Certificateless signcryption has both the advantages of certificateless public key cryptography, which overcomes the escrow problem inherited from identity-based cryptography without using certificates as in traditional public key cryptography, and of signcryption, which can fulfill the functions of both signature and encryption in a single logical step. In this paper, we make the security model for certificateless signcryption explicit and propose an efficient certificateless signcryption scheme from Weil pairings. The new scheme not only can be proved secure in our model but also simultaneously provides public verifiability and forward security. Furthermore, compared with existing schemes, the new scheme is more efficient.

  12. Hybrid scheme for Brownian semistationary processes

    DEFF Research Database (Denmark)

    Bennedsen, Mikkel; Lunde, Asger; Pakkanen, Mikko S.

the asymptotics of the mean square error of the hybrid scheme, and we observe that the scheme leads to a substantial improvement of accuracy compared to the ordinary forward Riemann-sum scheme, while having the same computational complexity. We exemplify the use of the hybrid scheme by two numerical experiments, where we examine the finite-sample properties of an estimator of the roughness parameter of a Brownian semistationary process and study Monte Carlo option pricing in the rough Bergomi model of Bayer et al. (2015), respectively.

  13. An Improved Proxy Multi-Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    GU Li-ze; ZHANG Sheng; YANG Yi-xian

    2005-01-01

Based on the Kim-like proxy multi-signature scheme [1], an improved proxy multi-signature scheme is proposed. The new scheme overcomes two problems in the Kim-like proxy multi-signature scheme: (1) a security issue (every original signer can forge a valid proxy multi-signature for any message); (2) an efficiency issue (both the size of the proxy multi-signature and the efficiency of signature checking depend on the number of original signers).

  14. Sampling scheme optimization from hyperspectral data

    NARCIS (Netherlands)

    Debba, P.

    2006-01-01

This thesis presents statistical sampling scheme optimization for geo-environmental purposes on the basis of hyperspectral data. It integrates derived products of the hyperspectral remote sensing data into individual sampling schemes. Five different issues are being dealt with. First, the optimized...

  15. Anonymous Credential Schemes with Encrypted Attributes

    NARCIS (Netherlands)

    Guajardo Merchan, J.; Mennink, B.; Schoenmakers, B.

    2011-01-01

In anonymous credential schemes, users obtain credentials on certain attributes from an issuer, and later show these credentials to a relying party anonymously and without fully disclosing the attributes. In this paper, we introduce the notion of (anonymous) credential schemes with encrypted attributes...

  16. Mixed ultrasoft/norm-conserved pseudopotential scheme

    DEFF Research Database (Denmark)

    Stokbro, Kurt

    1996-01-01

    A variant of the Vanderbilt ultrasoft pseudopotential scheme, where the norm conservation is released for only one or a few angular channels, is presented. Within this scheme some difficulties of the truly ultrasoft pseudopotentials are overcome without sacrificing the pseudopotential softness. (...

  17. Finite volume renormalization scheme for fermionic operators

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher; Orginos, Kostas [JLAB

    2013-11-01

We propose a new finite volume renormalization scheme. Our scheme is based on the Gradient Flow applied to both fermion and gauge fields and, much like the Schrödinger functional method, allows for a nonperturbative determination of the scale dependence of operators using a step-scaling approach. We give some preliminary results for the pseudo-scalar density in the quenched approximation.

  18. Nonstandard finite difference schemes for differential equations

    Directory of Open Access Journals (Sweden)

    Mohammad Mehdizadeh Khalsaraei

    2014-12-01

Full Text Available In this paper, the reorganization of the denominator of the discrete derivative and nonlocal approximation of nonlinear terms are used in the design of nonstandard finite difference schemes (NSFDs). Numerical examples confirming the efficiency of the schemes for some differential equations are provided. In order to illustrate the accuracy of the new NSFDs, the numerical results are compared with standard methods.
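A classic instance of the denominator reorganization (a textbook Mickens-style example, not taken from the paper): for the decay equation u' = -λu, replacing the step size h in the discrete derivative's denominator by φ(h) = (1 - e^{-λh})/λ makes the explicit scheme exact at every step size.

```python
import math

# (u_next - u) / phi(h) = -lam * u with the nonstandard denominator
# phi(h) = (1 - exp(-lam*h)) / lam reproduces the exact decay factor
# exp(-lam*h), so the scheme is exact for any h.
def nsfd_decay(u0, lam, h, steps):
    phi = (1.0 - math.exp(-lam * h)) / lam
    u = u0
    for _ in range(steps):
        u = u - lam * u * phi
    return u

u = nsfd_decay(u0=1.0, lam=2.0, h=0.5, steps=4)   # exact value: exp(-4)
```

By contrast, the standard forward Euler scheme with the same h = 0.5 and λ = 2 has factor 1 - λh = 0 and collapses to zero in one step.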

  19. Consolidation of the health insurance scheme

    CERN Multimedia

    Association du personnel

    2009-01-01

    In the last issue of Echo, we highlighted CERN’s obligation to guarantee a social security scheme for all employees, pensioners and their families. In that issue we talked about the first component: pensions. This time we shall discuss the other component: the CERN Health Insurance Scheme (CHIS).

  20. Unconditionally stable scheme for Riccati equation

    CERN Document Server

    Dubois, François; 10.1051/proc:2000003

    2011-01-01

We present a numerical scheme for the resolution of the matrix Riccati equation used in control problems. The scheme is unconditionally stable, and the solution is positive definite at each time step of the resolution. We prove convergence in the scalar case and present several numerical experiments for classical test cases.
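The positivity-preserving idea can be shown on a scalar Riccati equation by treating the quadratic term semi-implicitly; this is a sketch of the principle, not the authors' matrix scheme.

```python
# For x' = q - b*x**2 (q, b > 0), discretize the quadratic loss as
# -b * x_next * x: solving (x_next - x)/h = q - b*x_next*x for
# x_next keeps every iterate positive for any step size h, and the
# fixed point is the exact steady state sqrt(q/b).
def riccati_step(x, h, q, b):
    return (x + h * q) / (1.0 + h * b * x)

x = 1e-6                         # nearly-zero positive start
for _ in range(300):
    x = riccati_step(x, h=10.0, q=4.0, b=1.0)   # huge step, still stable
```

With q = 4 and b = 1 the iteration settles on the steady state x = 2 even at step size h = 10, where an explicit scheme would blow up.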

  1. Modified Mean-Pyramid Coding Scheme

    Science.gov (United States)

    Cheung, Kar-Ming; Romer, Richard

    1996-01-01

The modified mean-pyramid coding scheme requires transmission of slightly less data, with the data-expansion factor reduced from 1/3 to 1/12. Schemes of this kind provide progressive transmission of image data in a sequence of frames, such that a coarse version of the image is reconstructed after receipt of the first frame and increasingly refined versions are reconstructed after receipt of each subsequent frame.
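A one-dimensional mean pyramid illustrates the progressive idea: coarser levels store means of adjacent pairs, so the coarsest level can be displayed first and refined as later levels arrive. The 1-D pairing and nearest-neighbour refinement are simplifications of the image-domain scheme.

```python
# Build a mean pyramid over a 1-D signal (length assumed to be a
# power of two): each coarser level holds the means of adjacent
# pairs of the level below.
def build_pyramid(signal):
    levels = [list(signal)]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([(prev[i] + prev[i + 1]) / 2
                       for i in range(0, len(prev), 2)])
    return levels[::-1]              # coarsest level first

def refine(coarse):
    out = []
    for v in coarse:
        out += [v, v]                # nearest-neighbour upsampling
    return out

pyr = build_pyramid([1.0, 3.0, 5.0, 7.0])
```

Transmitting level by level, the receiver shows `refine`-upsampled reconstructions that sharpen with each frame; the modified scheme's saving comes from encoding each level's redundancy against the coarser one.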

  2. A New Public-Key Encryption Scheme

    Institute of Scientific and Technical Information of China (English)

    Hai-Bo Tian; Xi Sun; Yu-Min Wang

    2007-01-01

This paper proposes a new public-key encryption scheme which removes one element from the public-key tuple of the original Cramer-Shoup scheme. As a result, a ciphertext is not a quadruple but a triple, at the cost of a strong assumption, the third version of the knowledge of exponent assumption (KEA3). Under the KEA3, decisional Diffie-Hellman (DDH), and a variant of target collision resistance (TCRv) assumptions, the new scheme is proved secure against indistinguishable adaptive chosen ciphertext attack (IND-CCA2). This scheme is as efficient as the Damgård ElGamal (DEG) scheme when it makes use of a well-known algorithm for products of exponentiations. The DEG scheme was recently proved IND-CCA1 secure by Bellare and Palacio at ASIACRYPT 2004 under another strong assumption. In addition to our IND-CCA2-secure scheme, we also believe that the security proof procedure itself provides good insight into ElGamal-based encryption schemes that are secure in the real world.

  3. Novel Link Adaptation Schemes for OFDM System

    Institute of Scientific and Technical Information of China (English)

    LEI Ming; CAI Peng; XU Yue-shan; ZHANG Ping

    2003-01-01

    Orthogonal Frequency Division Multiplexing (OFDM) is the most promising technique supporting high-data-rate transmission. Combining link adaptation with OFDM can further increase spectral efficiency. In this paper, we put forward two link adaptation schemes for OFDM systems which are both flexible and practical. Both novel schemes are based on an iterative mechanism that allocates bits and power to subcarriers according to their channel gains and noise levels, which are assumed to be known at the transmitter. The candidate modulation modes are chosen freely before the link adaptation schemes are performed. The distinction between the two schemes is that in novel scheme A the modulation mode is upgraded to the neighboring higher-order mode, while in novel scheme B the modulation is upgraded to the genuinely optimal mode. Novel scheme A therefore has the advantage of lower complexity, and novel scheme B the advantage of higher spectral efficiency.
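
    The iterative bit-and-power allocation that both schemes build on can be sketched as a greedy loop (a generic Hughes-Hartogs-style sketch, not the paper's exact algorithms; the SNR-gap parameter `gamma` is an assumption):

```python
import math

def greedy_bit_loading(gains, noise, total_bits, gamma=1.0, max_bits=8):
    # gains: |H_k|^2 per subcarrier; noise: noise power per subcarrier.
    # Repeatedly add one bit on the subcarrier where it costs the least power;
    # for QAM the incremental power of bit b -> b+1 scales as 2**b / gain.
    bits = [0] * len(gains)
    power = [0.0] * len(gains)
    for _ in range(total_bits):
        costs = [gamma * noise[k] * (2**bits[k]) / gains[k]
                 if bits[k] < max_bits else math.inf
                 for k in range(len(gains))]
        k = min(range(len(gains)), key=costs.__getitem__)
        power[k] += costs[k]
        bits[k] += 1
    return bits, power

bits, power = greedy_bit_loading([4.0, 1.0], [1.0, 1.0], 3)
```

    With one subcarrier four times stronger than the other, the first three bits all land on the strong subcarrier before the weak one becomes competitive.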

  4. Sampling scheme optimization from hyperspectral data

    NARCIS (Netherlands)

    Debba, P.

    2006-01-01

    This thesis presents statistical sampling scheme optimization for geo-environmental purposes on the basis of hyperspectral data. It integrates derived products of the hyperspectral remote sensing data into individual sampling schemes. Five different issues are dealt with. First, the optimized

  6. Phase calibration scheme for a ``T'' array

    Science.gov (United States)

    Ramesh, R.; Subramanian, K. R.; Sastry, Ch. V.

    1999-10-01

    A calibration scheme based on closure and redundancy techniques is described for correcting the phase errors in the complex visibilities observed with a T-shaped radio interferometer array. Practical details of the scheme are illustrated with reference to the Gauribidanur radioheliograph (GRH).
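
    The core identity behind closure-phase calibration, i.e. antenna-based phase errors cancelling around any triangle of baselines, can be verified numerically (a generic interferometry illustration, not GRH-specific):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                           # number of antennas
true_phase = rng.uniform(-np.pi, np.pi, (n, n))
true_phase = true_phase - true_phase.T          # antisymmetric visibility phases
ant_err = rng.uniform(-np.pi, np.pi, n)         # per-antenna phase errors
# observed baseline phase picks up the difference of the two antenna errors
obs = true_phase + ant_err[:, None] - ant_err[None, :]

def closure(phi, i, j, k):
    # sum of phases around the triangle i -> j -> k -> i, wrapped to [-pi, pi)
    return (phi[i, j] + phi[j, k] + phi[k, i] + np.pi) % (2*np.pi) - np.pi

err = abs(closure(obs, 0, 1, 2) - closure(true_phase, 0, 1, 2))
err = min(err, 2*np.pi - err)                   # account for phase wrapping
```

    Around the triangle, each antenna error appears once with a plus sign and once with a minus sign, so the observed closure phase equals the true one; redundancy constraints then pin down the remaining antenna-based unknowns.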

  7. Privacy Preserving Mapping Schemes Supporting Comparison

    NARCIS (Netherlands)

    Tang, Qiang

    2010-01-01

    To cater to the privacy requirements in cloud computing, we introduce a new primitive, namely Privacy Preserving Mapping (PPM) schemes supporting comparison. A PPM scheme enables a user to map data items into images in such a way that, with a set of images, any entity can determine the <, =, > re

  9. Modeling Students' Mathematics Using Steffe's Fraction Schemes

    Science.gov (United States)

    Norton, Anderson H.; McCloskey, Andrea V.

    2008-01-01

    Each year, more teachers learn about the successful intervention program known as Math Recovery (USMRC 2008; Wright 2003). The program uses Steffe's whole-number schemes to model, understand, and support children's development of whole-number reasoning. Readers are probably less familiar with Steffe's fraction schemes, which have proven similarly…

  10. A Secure Threshold Group Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    Wang Xiaoming; Fu Fangwei

    2003-01-01

    The threshold group signature is an important kind of signature. So far, many threshold group signature schemes have been proposed, but most of them suffer from conspiracy attacks and are insecure. In this paper, a secure threshold group signature scheme is proposed. It can not only satisfy the properties of a threshold group signature, but also withstand the conspiracy attack.

  11. Cornering (3+1) sterile neutrino schemes

    CERN Document Server

    Maltoni, M; Valle, José W F

    2001-01-01

    Using the most recent atmospheric neutrino data, as well as short-baseline and tritium β-decay data, we show that (3+1) sterile neutrino schemes are severely disfavored, in contrast to the theoretically favored (2+2) schemes.

  12. An Adaptive Handover Prediction Scheme for Seamless Mobility Based Wireless Networks

    Directory of Open Access Journals (Sweden)

    Ali Safa Sadiq

    2014-01-01

    We propose an adaptive handover prediction (AHP) scheme for seamless-mobility-based wireless networks. The AHP scheme incorporates fuzzy logic into the AP prediction process in order to lend cognitive capability to handover decision making. Selection metrics, including received signal strength, the mobile node's relative direction towards the access points in the vicinity, and access point load, are collected and used as inputs of the fuzzy decision-making system in order to select the preferred AP among surrounding WLANs. The handover decision, which is based on a quality cost calculated by the fuzzy inference system, uses adaptable rather than fixed coefficients: the mean and standard deviation of the normalized network prediction metrics collected from the available WLANs are obtained adaptively and applied as statistical information to adjust the coefficients of the membership functions. In addition, we propose an adjustable weight vector for the input metrics in order to cope with continuous, unpredictable variation in their membership degrees. Furthermore, handover decisions are performed in each MN independently once the RSS, direction towards APs, and AP load are known. Finally, performance evaluation of the proposed scheme shows its superiority over representative prediction approaches.
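
    The flavor of fuzzy AP scoring can be sketched with triangular membership functions and a weighted quality score (an illustrative toy with assumed breakpoints and weights, not the paper's full adaptive FIS):

```python
def tri(x, a, b, c):
    # triangular membership function peaking at b, zero outside (a, c)
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ap_quality(rss_dbm, load_frac, w=(0.5, 0.5)):
    # fuzzy degrees of "good signal" and "lightly loaded", combined by weights
    good_signal = tri(rss_dbm, -90.0, -55.0, -30.0)
    low_load = tri(load_frac, -1.0, 0.0, 1.0)     # peaks at an empty AP
    return w[0] * good_signal + w[1] * low_load

# hypothetical candidate APs: (RSS in dBm, load fraction)
aps = {"AP1": (-60.0, 0.8), "AP2": (-65.0, 0.1)}
best = max(aps, key=lambda a: ap_quality(*aps[a]))
```

    Here AP1 has the stronger signal but is heavily loaded, so the fuzzy score prefers the lightly loaded AP2; the paper's adaptive step would additionally retune the membership breakpoints from observed metric statistics.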

  13. Autonomous droop scheme with reduced generation cost

    DEFF Research Database (Denmark)

    Nutkani, Inam Ullah; Loh, Poh Chiang; Blaabjerg, Frede

    2013-01-01

    The droop scheme has been widely applied to the control of Distributed Generators (DGs) in microgrids for proportional power sharing based on their ratings. For a standalone microgrid, where a centralized management system is not viable, proportional power sharing based on droop might not suit well, since DGs are usually of different types, unlike synchronous generators. This paper presents an autonomous droop scheme that takes into consideration the operating cost, efficiency and emission penalty of each DG, since all these factors directly or indirectly contribute to the Total Generation Cost (TGC) of the overall microgrid. Compared with the traditional scheme, the proposed scheme retains its simplicity, which certainly is a feature preferred by industry. The overall performance of the proposed scheme has been verified through simulation and experiment.
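
    Steady-state sharing under a conventional P-f droop, the baseline the proposed scheme modifies, can be sketched as follows (cost-aware variants would, roughly speaking, scale each DG's droop gain; gains and load values here are illustrative):

```python
def droop_dispatch(p_load, droop_gains, f0=50.0):
    # P-f droop: each DG follows f = f0 - m_i * P_i, so at the common
    # steady-state frequency f, P_i = (f0 - f) / m_i and the shares sum
    # to the load: (f0 - f) * sum(1/m_i) = p_load.
    inv = [1.0 / m for m in droop_gains]
    f = f0 - p_load / sum(inv)
    return f, [(f0 - f) / m for m in droop_gains]

# DG1 has half the droop gain of DG2, i.e. twice the effective rating
f, shares = droop_dispatch(30.0, [0.01, 0.02])
```

    The DG with the smaller gain automatically picks up proportionally more load, with no communication between units; that autonomy is what the droop family is prized for.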

  14. Two level scheme solvers for nuclear spectroscopy

    Science.gov (United States)

    Jansson, Kaj; DiJulio, Douglas; Cederkäll, Joakim

    2011-10-01

    A program for building level schemes from γ-spectroscopy coincidence data has been developed. The scheme builder was equipped with two different algorithms: a statistical one based on the Metropolis method and a more logical one, called REMP (REcurse, Merge and Permute), developed from scratch. These two methods are compared both on ideal cases and on experimental γ-ray data sets. The REMP algorithm is based on coincidences and transition energies. Using correct and complete coincidence data, it has solved approximately half a million schemes without failures. Also, for incomplete data and data with minor errors, the algorithm produces consistent sub-schemes when it is not possible to obtain a complete scheme from the provided data.
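
    The coincidence-and-energy logic such scheme builders rely on can be illustrated with a toy cascade check (hypothetical level energies; this is not the REMP algorithm itself):

```python
def in_cascade(e1, e2, levels, tol=0.5):
    # Two gamma rays (energies in keV) can appear in prompt coincidence if
    # some level sequence a > b > c reproduces them as the a->b and b->c
    # transitions; level-scheme builders search for such consistent placements.
    for a in levels:
        for b in levels:
            for c in levels:
                if a > b > c and abs((a - b) - e1) < tol and abs((b - c) - e2) < tol:
                    return True
    return False

levels = [0.0, 300.0, 800.0, 1500.0]   # hypothetical level energies in keV
```

    A scheme builder runs this kind of test in reverse: given many observed coincident pairs, it searches for a set of levels that makes every pair explainable as a cascade.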

  15. An Efficient Forward Secure Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    YU Jia; KONG Fan-yu; LI Da-xing

    2006-01-01

    A new efficient forward secure signature scheme based on bilinear pairings is presented in this paper.Each complexity of key generation, key update, signing and verifying algorithms in this scheme is O(1) in terms of the total number of time periods T. Because a new structure in node secret key storage and a unique strategy in key update are employed, the signing and verifying costs don't grow when T increases. At the same time, the key generation and key update algorithms are efficiently constructed thanks to using the pre-order traversal technique of binary trees. Compared with other schemes based on bilinear pairings, the signature size in this scheme is very short, which doesn't change with T increasing. The scheme is forward secure in random oracle model assuming CDH problem is hard.

  16. A DRM Scheme Using File Physical Information

    Directory of Open Access Journals (Sweden)

    Cheng Qu

    2015-05-01

    A digital file has both content and physical information; however, the latter was not fully exploited in previous digital rights management (DRM) systems. This paper introduces the idea of using file physical information to improve system security and provides a scheme based on this idea to resist replay attacks in DRM systems. Compared to commonly used schemes, our scheme removes the dependency on a continuous online connection from the client side to the server side, and on tamper-proof hardware such as a Trusted Platform Module (TPM). The scheme is appropriate for offline digital content usage. Preliminary experiments demonstrate that our scheme is secure enough to be put into practical use.

  17. Comparison among sea surface roughness schemes

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Based on measurements from the US National Data Buoy Center 3-m discus buoy site No. 44004 (38.5°N, 70.47°W) from January 1 to March 31 of 2003, and using the COARE algorithm (Version 3.0), the results from four recently developed parameterization schemes for sea surface aerodynamic roughness length were compared with each other. Calculations of friction velocity u*, drag coefficient Cd and wind stress τ indicate that the computed friction velocities (8.50%-16.20% normalized standard error estimate, or NSEE), drag coefficients and wind stresses (15.08%-28.67% and 17.26%-50.59% NSEE, respectively) are reasonable. Schemes YT96 and GW03 are consistent. The O02 scheme overestimates u* and Cd. Schemes TY01 and GW03 display discontinuous characteristics in handling young wave data.
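
    A minimal example of one such roughness closure, the classical Charnock relation iterated with the neutral logarithmic wind profile (a generic textbook scheme, not one of the four compared above; stability effects are ignored):

```python
import math

def drag_coefficient(u10, alpha=0.011, kappa=0.4, g=9.81, z=10.0):
    # Neutral log law: Cd = [kappa / ln(z/z0)]^2 with the Charnock roughness
    # length z0 = alpha * ustar^2 / g; iterate to self-consistency since
    # ustar = sqrt(Cd) * u10 itself depends on Cd.
    ustar = 0.036 * u10            # crude initial guess
    for _ in range(50):
        z0 = alpha * ustar**2 / g
        cd = (kappa / math.log(z / z0))**2
        ustar = math.sqrt(cd) * u10
    return cd, ustar

cd, ustar = drag_coefficient(10.0)   # 10 m/s wind at 10 m height
```

    For a 10 m/s wind this lands near Cd ≈ 1.3e-3 and u* ≈ 0.36 m/s, the right order for open-ocean conditions; the schemes compared in the record differ mainly in how z0 responds to wave age.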

  18. A novel key management scheme using biometrics

    Science.gov (United States)

    Sui, Yan; Yang, Kai; Du, Yingzi; Orr, Scott; Zou, Xukai

    2010-04-01

    Key management is one of the most important issues in cryptographic systems. Several important challenges in this context are secure and efficient key generation, key distribution, and key revocation. Addressing these challenges requires a comprehensive solution which is robust, secure and efficient. Compared to traditional key management schemes, key management using biometrics requires the presence of the user, which can reduce fraud and protect the key better. In this paper, we propose a novel key management scheme using iris-based biometrics. Our newly proposed scheme outperforms traditional key management schemes as well as some existing key-binding biometric schemes in terms of security, diversity and/or efficiency.

  19. Terrorism Event Classification Using Fuzzy Inference Systems

    CERN Document Server

    Inyaem, Uraiwan; Meesad, Phayung; Tran, Dat

    2010-01-01

    Terrorism has led to many problems in Thai society, not only property damage but also civilian casualties. Predicting terrorist activities in advance can help prepare for and manage the risk of sabotage. This paper proposes a framework for event classification in the terrorism domain using fuzzy inference systems (FISs). Each FIS is a decision-making model combining fuzzy logic and approximate reasoning, built from five main parts: the input interface, the fuzzification interface, the knowledge base unit, the decision-making unit and the output defuzzification interface. The adaptive neuro-fuzzy inference system (ANFIS) is a FIS model adapted by combining fuzzy logic and a neural network. ANFIS utilizes automatic identification of fuzzy logic rules and adjustment of membership functions (MFs). Moreover, the neural network can learn directly from the data set to construct the fuzzy logic rules and MFs implemented in various applications. FIS settings are evaluated based on two comparisons. The first evaluat...

  20. Inferring Planetary Obliquity Using Rotational & Orbital Photometry

    CERN Document Server

    Schwartz, Joel C; Haggard, Hal M; Pallé, Eric; Cowan, Nicolas B

    2015-01-01

    The obliquity of a terrestrial planet is an important clue about its formation and critical to its climate. Previous studies using simulated photometry of Earth show that continuous observations over most of a planet's orbit can be inverted to infer obliquity. We extend this approach to single-epoch observations for planets with arbitrary albedo maps. For diffuse reflection, the flux seen by a distant observer is the product of the planet's albedo map, the host star's illumination, and the observer's visibility of different planet regions. It is useful to treat the product of illumination and visibility as the kernel of a convolution; this kernel is unimodal and symmetric. For planets with unknown obliquity, the kernel is not known a priori, but could be inferred by fitting a rotational light curve. We analyze this kernel under different viewing geometries, finding it well described by its longitudinal width and latitudinal position. We use Monte Carlo simulation to estimate uncertainties on these kernel char...

  1. Human collective intelligence as distributed Bayesian inference

    CERN Document Server

    Krafft, Peter M; Pan, Wei; Della Penna, Nicolás; Altshuler, Yaniv; Shmueli, Erez; Tenenbaum, Joshua B; Pentland, Alex

    2016-01-01

    Collective intelligence is believed to underlie the remarkable success of human society. The formation of accurate shared beliefs is one of the key components of human collective intelligence. How are accurate shared beliefs formed in groups of fallible individuals? Answering this question requires a multiscale analysis. We must understand both the individual decision mechanisms people use, and the properties and dynamics of those mechanisms in the aggregate. As of yet, mathematical tools for such an approach have been lacking. To address this gap, we introduce a new analytical framework: We propose that groups arrive at accurate shared beliefs via distributed Bayesian inference. Distributed inference occurs through information processing at the individual level, and yields rational belief formation at the group level. We instantiate this framework in a new model of human social decision-making, which we validate using a dataset we collected of over 50,000 users of an online social trading platform where inves...
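
    The core mechanism, individual-level Bayesian updates aggregating into an accurate group belief, can be sketched for a binary hidden state (an assumed toy model, not the paper's social-sampling model):

```python
import math
import random

random.seed(1)
# Binary hidden state; each of 500 individuals contributes one noisy private
# signal. Pooling log-likelihood ratios is exact Bayesian inference here, and
# even weakly informative signals (60% accurate) yield near-certain consensus.
true_state = 1
p_correct = 0.6
llr = math.log(p_correct / (1.0 - p_correct))   # evidence per signal
log_odds = 0.0                                  # shared belief, starts at 50/50
for _ in range(500):
    signal = true_state if random.random() < p_correct else 1 - true_state
    log_odds += llr if signal == 1 else -llr
group_belief = 1.0 / (1.0 + math.exp(-log_odds))
```

    The point of the toy: individually unreliable agents, combined by a simple aggregation rule, produce a group posterior far sharper than any single member's.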

  2. Bayesianism and inference to the best explanation

    Directory of Open Access Journals (Sweden)

    Valeriano IRANZO

    2008-01-01

    Bayesianism and Inference to the Best Explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of "bayesianizing" IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes's Theorem. Then I distinguish two different interpretations of prior probabilities: "IBE-Bayesianism" (IBE-Bay) and "frequentist-Bayesianism" (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.

  3. Inference on power law spatial trends

    CERN Document Server

    Robinson, Peter M

    2012-01-01

    Power law or generalized polynomial regressions with unknown real-valued exponents and coefficients, and weakly dependent errors, are considered for observations over time, space or space-time. Consistency and asymptotic normality of nonlinear least-squares estimates of the parameters are established. The joint limit distribution is singular, but can be used as a basis for inference on either exponents or coefficients. We discuss issues of implementation, efficiency, potential for improved estimation, and possibilities of extension to more general or alternative trending models to allow for irregularly spaced data or heteroscedastic errors. Though it focuses on a particular model to fix ideas, the paper can be viewed as offering machinery useful in developing inference for a variety of models in which power law trends are a component. Indeed, the paper also makes a contribution that is potentially relevant to many other statistical models: our problem is one of many in which consistency of a vector of parame...
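
    The estimation problem can be illustrated by profiling out the coefficient and grid-searching the exponent in y_t = β t^θ + error (a simplified one-regressor sketch with i.i.d. simulated errors, not the paper's weakly dependent setting):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1.0, 201.0)
y = 2.0 * t**0.5 + rng.normal(0.0, 0.5, t.size)   # true beta = 2, theta = 0.5

def fit_power_trend(t, y, thetas=np.linspace(0.1, 1.5, 281)):
    # Nonlinear least squares: for each candidate exponent theta, beta has a
    # closed-form OLS solution, so we profile beta out and pick the theta
    # (on a grid) with the smallest residual sum of squares.
    best = None
    for th in thetas:
        x = t**th
        beta = float(x @ y) / float(x @ x)
        rss = float(((y - beta * x)**2).sum())
        if best is None or rss < best[0]:
            best = (rss, beta, th)
    return best[1], best[2]

beta_hat, theta_hat = fit_power_trend(t, y)
```

    The exponent enters the regressor nonlinearly, which is why the joint limit distribution discussed in the record is nonstandard even though beta is linear given theta.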

  4. The NIFTY way of Bayesian signal inference

    Energy Technology Data Exchange (ETDEWEB)

    Selig, Marco, E-mail: mselig@mpa-Garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany, and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D³PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.

  5. Inferences on Children’s Reading Groups

    Directory of Open Access Journals (Sweden)

    Javier González García

    2009-05-01

    This article focuses on the non-literal information of a text, which can be inferred from key elements or clues offered by the text itself. This kind of text is called implicit text or inference, due to the thinking process that it stimulates. The explicit resources that lead to information retrieval are related to others of implicit information, which have increased their relevance. In this study, carried out over two school years, we analyzed how two teachers interpret three stories and how they establish a debate by dividing the class into three student groups. The sample was formed by two classes from two urban public schools of Burgos (Spain) and two from public schools of Tampico (Mexico). This allowed us to observe an increasing percentage of the group focused on text comprehension, and a smaller percentage of the group perceiving comprehension as a secondary objective.

  6. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process, modelled with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  7. Introductory statistical inference with the likelihood function

    CERN Document Server

    Rohde, Charles A

    2014-01-01

    This textbook covers the fundamentals of statistical inference and statistical theory, including Bayesian and frequentist approaches, with the methodology presented without excessive emphasis on the underlying mathematics. This book is about some of the basic principles of statistics that are necessary to understand and evaluate methods for analyzing complex data sets. The likelihood function is used for pure likelihood inference throughout the book. There is also coverage of severity and finite population sampling. The material was developed from an introductory statistical theory course taught by the author at the Johns Hopkins University's Department of Biostatistics. Students and instructors in public health programs will benefit from the likelihood modeling approach that is used throughout the text. It will also appeal to epidemiologists and psychometricians. After a brief introduction, there are chapters on estimation, hypothesis testing, and maximum likelihood modeling. The book concludes with secti...
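
    Pure likelihood inference of the kind the book describes can be sketched for a binomial proportion, reporting the MLE and a 1/8 relative-likelihood interval (an illustrative example, not taken from the book; the 1/8 cutoff is one conventional choice):

```python
import math

def binom_loglik(p, k, n):
    # log-likelihood of success probability p given k successes in n trials
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

k, n = 7, 20
phat = k / n                               # maximum likelihood estimate
# relative likelihood L(p)/L(phat) on a grid; keep p with relative
# likelihood >= 1/8 (a pure-likelihood interval, no sampling distribution)
grid = [i / 1000 for i in range(1, 1000)]
rel = [math.exp(binom_loglik(p, k, n) - binom_loglik(phat, k, n)) for p in grid]
supported = [p for p, r in zip(grid, rel) if r >= 1.0 / 8.0]
lo, hi = supported[0], supported[-1]
```

    The interval is read directly off the likelihood function as "values of p not implausible relative to the best-supported value", which is the inferential stance the text's pure-likelihood chapters take.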

  8. Variational Bayesian Inference of Line Spectra

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri

    2016-01-01

    In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid; and the coefficients are governed by a Bernoulli-Gaussian prior model, turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs...

  9. Improved testing inference in mixed linear models

    CERN Document Server

    Melo, Tatiane F N; Cribari-Neto, Francisco; 10.1016/j.csda.2008.12.007

    2011-01-01

    Mixed linear models are commonly used in repeated measures studies. They account for the dependence amongst observations obtained from the same experimental unit. Oftentimes, the number of observations is small, and it is thus important to use inference strategies that incorporate small sample corrections. In this paper, we develop modified versions of the likelihood ratio test for fixed effects inference in mixed linear models. In particular, we derive a Bartlett correction to such a test and also to a test obtained from a modified profile likelihood function. Our results generalize those in Zucker et al. (Journal of the Royal Statistical Society B, 2000, 62, 827-838) by allowing the parameter of interest to be vector-valued. Additionally, our Bartlett corrections allow for random effects nonlinear covariance matrix structure. We report numerical evidence which shows that the proposed tests display superior finite sample behavior relative to the standard likelihood ratio test. An application is also presente...

  10. Towards Stratification Learning through Homology Inference

    CERN Document Server

    Bendich, Paul; Wang, Bei

    2010-01-01

    A topological approach to stratification learning is developed for point cloud data drawn from a stratified space. Given such data, our objective is to infer which points belong to the same strata. First we define a multi-scale notion of a stratified space, giving a stratification for each radius level. We then use methods derived from kernel and cokernel persistent homology to cluster the data points into different strata, and we prove a result which guarantees the correctness of our clustering, given certain topological conditions; some geometric intuition for these topological conditions is also provided. Our correctness result is then given a probabilistic flavor: we give bounds on the minimum number of sample points required to infer, with probability, which points belong to the same strata. Finally, we give an explicit algorithm for the clustering, prove its correctness, and apply it to some simulated data.

  11. Bayesian inference of structural brain networks.

    Science.gov (United States)

    Hinne, Max; Heskes, Tom; Beckmann, Christian F; van Gerven, Marcel A J

    2013-02-01

    Structural brain networks are used to model white-matter connectivity between spatially segregated brain regions. The presence, location and orientation of these white matter tracts can be derived using diffusion-weighted magnetic resonance imaging in combination with probabilistic tractography. Unfortunately, as of yet, none of the existing approaches provide an undisputed way of inferring brain networks from the streamline distributions which tractography produces. State-of-the-art methods rely on an arbitrary threshold or, alternatively, yield weighted results that are difficult to interpret. In this paper, we provide a generative model that explicitly describes how structural brain networks lead to observed streamline distributions. This allows us to draw principled conclusions about brain networks, which we validate using simultaneously acquired resting-state functional MRI data. Inference may be further informed by means of a prior which combines connectivity estimates from multiple subjects. Based on this prior, we obtain networks that significantly improve on the conventional approach.

  12. Statistical Methods in Phylogenetic and Evolutionary Inferences

    Directory of Open Access Journals (Sweden)

    Luigi Bertolotti

    2013-05-01

    Molecular instruments are the most accurate methods for organisms' identification and characterization. Biologists are often involved in studies where the main goal is to identify relationships among individuals. In this framework, it is very important to know and apply the most robust approaches to infer these relationships correctly, allowing the right conclusions about phylogeny. In this review, we introduce the reader to the statistical methods most used in phylogenetic analyses, the Maximum Likelihood and the Bayesian approaches, considering for simplicity only analyses regarding DNA sequences. Several studies will be shown as examples in order to demonstrate how correct phylogenetic inference can lead scientists to highlight very peculiar features in pathogen biology and evolution.

  13. Inferring network topology via the propagation process

    CERN Document Server

    Zeng, An

    2013-01-01

    Inferring the network topology from the dynamics is a fundamental problem with wide applications in geology, biology and even counter-terrorism. Based on the propagation process, we present a simple method to uncover the network topology. Numerical simulation on artificial networks shows that our method enjoys high accuracy in inferring the network topology. We find that the infection rate in the propagation process significantly influences the accuracy, and each network corresponds to an optimal infection rate. Moreover, the method generally works better in large networks. These findings are confirmed in both real social and nonsocial networks. Finally, the method is extended to directed networks, and a similarity measure specific to directed networks is designed.
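
    The idea can be demonstrated on a toy chain network: repeatedly simulate spreading, then score node pairs that tend to be infected one time step apart (an illustrative score, not the paper's exact estimator):

```python
import random

random.seed(0)
# SI-style spreading on a known 4-node chain 0-1-2-3; we then try to recover
# the edges purely from infection times. Pairs infected one step apart are
# usually directly connected, so their score accumulates across cascades.
neigh = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
beta, n_nodes = 0.9, 4
score = {}
for _ in range(2000):
    seed = random.randrange(n_nodes)
    t = {seed: 0}
    step = 0
    while len(t) < n_nodes and step < 50:
        newly = {}
        for i in t:                      # every infected node keeps trying
            for j in neigh[i]:
                if j not in t and j not in newly and random.random() < beta:
                    newly[j] = step + 1
        step += 1
        t.update(newly)
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if i in t and j in t and abs(t[i] - t[j]) == 1:
                score[(i, j)] = score.get((i, j), 0) + 1
top3 = set(sorted(score, key=score.get, reverse=True)[:3])
```

    With enough cascades the three highest-scoring pairs are exactly the true edges; lowering the infection rate `beta` blurs the timing signal, which mirrors the record's observation that accuracy depends on the infection rate.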

  14. Inference for ordered parameters in multinomial distributions

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    This paper discusses inference for ordered parameters of multinomial distributions. We first show that the asymptotic distributions of their maximum likelihood estimators (MLEs) are not always normal, and that the bootstrap distribution estimators of the MLEs can be inconsistent. Then a class of weighted sum estimators (WSEs) of the ordered parameters is proposed. Properties of the WSEs are studied, including their asymptotic normality. Based on those results, large sample inferences for smooth functions of the ordered parameters can be made. In particular, confidence intervals of the maximum cell probabilities are constructed. Simulation results indicate that this interval estimation performs much better than the bootstrap approaches in the literature. Finally, the above results for ordered parameters of multinomial distributions are extended to more general distribution models.
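
    Order-restricted point estimation in this setting is often implemented with the pool-adjacent-violators algorithm; a sketch follows (isotonic least squares on the raw frequencies, which is the classical order-restricted MLE and not the same construction as the paper's weighted sum estimators):

```python
def pava(y):
    # Pool-adjacent-violators: isotonic (non-decreasing) least-squares fit.
    # Adjacent blocks whose means violate the ordering are merged and
    # replaced by their common mean.
    blocks = []                     # each block: [sum, count]
    for v in y:
        blocks.append([v, 1])
        while len(blocks) > 1 and blocks[-2][0]/blocks[-2][1] > blocks[-1][0]/blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit

# order-restricted estimates of cell probabilities under p1 <= p2 <= p3 <= p4;
# the raw frequencies 0.30, 0.20, 0.25, 0.25 violate the ordering
counts = [30, 20, 25, 25]
phat = pava([c / 100 for c in counts])
```

    Pooling the violating cells yields a monotone estimate that still sums to one; near ties like this are exactly where the record notes the MLE's asymptotics become nonstandard.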

  15. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard which provides an additional layer of abstraction, enabling end-users to easily perform inferences over Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.

  16. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation.

  17. Towards an improved land surface scheme for prairie landscapes

    Science.gov (United States)

    Mekonnen, M. A.; Wheater, H. S.; Ireson, A. M.; Spence, C.; Davison, B.; Pietroniro, A.

    2014-04-01

    The prairie region of Canada and the United States is characterized by millions of small depressions of glacial origin called prairie potholes. The transfer of surface runoff in this landscape is mainly through a “fill and spill” mechanism among neighboring potholes. While non-contributing areas, that is, small internally drained basins, are common in this landscape, during wet periods these areas can become hydrologically connected to larger regional drainage systems. Accurate prediction of prairie surface runoff generation and streamflow thus requires realistic representation of the dynamic, threshold-mediated nature of these contributing areas. This paper presents a new prairie surface runoff generation algorithm for land surface schemes and large-scale hydrological models that conceptualizes a hydrologic unit as a combination of variable and interacting storage elements. The proposed surface runoff generation algorithm uses a probability density function to represent the spatial variation of pothole storages and assumes a unique relationship between storage and the fractional contributing area for runoff (and hence the amount of direct runoff generated) within a grid cell. In this paper the parameters that define this relationship are obtained by calibration against streamflow. The model was compared to an existing hydrology-land surface scheme (HLSS) applied to a typical Canadian prairie catchment, the Assiniboine River. The existing configuration is based on the Canadian Land Surface Scheme (CLASS) and WATROF (a physically based overland and interflow scheme). The new configuration consists of CLASS coupled with the new PDMROF model. Results showed that the proposed surface runoff generation algorithm performed better at simulating streamflow, and appears to capture the dynamic nature of contributing areas in an effective and parsimonious manner. A pilot evaluation based on 1 m LiDAR data from a small (10 km²) experimental area suggests that the shape of the
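
    The abstract does not reproduce PDMROF's exact storage-area relationship, so the sketch below shows the generic probability-distributed-storage idea it builds on, assuming a reflected power-law (Pareto-type) distribution of pothole storage capacities; the shape parameter b plays the role of the calibrated quantity.

```python
def contributing_fraction(c_star, c_max, b):
    """Fraction of a grid cell that contributes runoff when all depressions
    with capacity below the critical level c_star are full, for capacities
    distributed as F(c) = 1 - (1 - c/c_max)**b on [0, c_max]."""
    c = min(max(c_star, 0.0), c_max)      # clamp to the physical range
    return 1.0 - (1.0 - c / c_max) ** b

def direct_runoff(rainfall, c_star, c_max, b):
    # Only the currently contributing fraction converts rain into runoff.
    return rainfall * contributing_fraction(c_star, c_max, b)
```

    The fraction rises smoothly from 0 (all potholes empty, fully non-contributing) to 1 (all storage filled), which is the threshold-mediated contributing-area behaviour the abstract describes.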

  18. Data analysis recipes: Probability calculus for inference

    OpenAIRE

    Hogg, David W.

    2012-01-01

    In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods,...

  19. Data analysis recipes: Probability calculus for inference

    CERN Document Server

    Hogg, David W

    2012-01-01

    In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods, posterior probabilities, and posterior predictions are all discussed.

  20. Analysis of KATRIN data using Bayesian inference

    DEFF Research Database (Denmark)

    Riis, Anna Sejersen; Hannestad, Steen; Weinheimer, Christian

    2011-01-01

    The KATRIN (KArlsruhe TRItium Neutrino) experiment will be analyzing the tritium beta-spectrum to determine the mass of the neutrino with a sensitivity of 0.2 eV (90% C.L.). This approach to a measurement of the absolute value of the neutrino mass relies only on the principle of energy conservati...... the KATRIN chi squared function in the COSMOMC package - an MCMC code using Bayesian parameter inference - solved the task at hand very nicely....

  1. Inferring Trust Based on Similarity with TILLIT

    Science.gov (United States)

    Tavakolifard, Mozhgan; Herrmann, Peter; Knapskog, Svein J.

    A network of people having established trust relations and a model for propagation of related trust scores are fundamental building blocks in many of today’s most successful e-commerce and recommendation systems. However, the web of trust is often too sparse to predict trust values between non-familiar people with high accuracy. Trust inferences are transitive associations among users in the context of an underlying social network and may provide additional information to alleviate the consequences of the sparsity and possible cold-start problems. Such approaches are helpful, provided that a complete trust path exists between the two users. An alternative approach to the problem is advocated in this paper. Based on collaborative filtering, one can exploit the like-mindedness, or similarity, of individuals to infer trust in yet-unknown parties, which increases the number of trust relations in the web. For instance, if one knows that, with respect to a specific property, two parties are trusted alike by a large number of different trusters, one can assume that they are similar. Thus, if one has a certain degree of trust in the one party, one can safely assume a very similar trustworthiness of the other. In an attempt to provide high-quality recommendations and proper initial trust values even when no complete trust propagation path or user profile exists, we propose TILLIT — a model based on a combination of trust inferences and user similarity. The similarity is derived from the structure of the trust graph and users’ trust behavior, as opposed to other collaborative-filtering-based approaches which use ratings of items or users’ profiles. We describe an algorithm realizing the approach, and validate the algorithm using a real large-scale data-set.
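
    A minimal sketch of the similarity-based step, under our own simplifying assumption (not the paper's exact algorithm) that trustee similarity is the cosine similarity of their columns in a truster-by-trustee matrix, restricted to common trusters:

```python
import numpy as np

def trustee_similarity(T, a, b):
    """Similarity of trustees a and b, derived from how alike they are
    trusted by the trusters who rated both (columns of trust matrix T)."""
    u, v = T[:, a], T[:, b]
    mask = (u > 0) & (v > 0)          # trusters who rated both parties
    if not mask.any():
        return 0.0
    u, v = u[mask], v[mask]
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def infer_trust(T, truster, known_trustee, new_trustee):
    # Scale the existing trust value by the structural similarity.
    sim = trustee_similarity(T, known_trustee, new_trustee)
    return sim * T[truster, known_trustee]

# Rows: trusters; columns: trustees; 0 means "no rating".
T = np.array([[0.9, 0.9, 0.0],
              [0.8, 0.8, 0.0],
              [0.7, 0.0, 0.5]])
```

    Here trustees 0 and 1 are trusted alike by two common trusters, so truster 2's trust in trustee 0 carries over almost unchanged to the yet-unknown trustee 1.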

  2. Towards a Faster Randomized Parcellation Based Inference

    OpenAIRE

    Hoyos-Idrobo, Andrés; Varoquaux, Gaël; Thirion, Bertrand

    2017-01-01

    International audience; In neuroimaging, multi-subject statistical analysis is an essential step, as it makes it possible to draw conclusions for the population under study. However, the lack of power in neuroimaging studies combined with the lack of stability and sensitivity of voxel-based methods may lead to non-reproducible results. A method designed to tackle this problem is Randomized Parcellation-Based Inference (RPBI), which has shown good empirical performance. Nevertheless, the use o...

  3. Thermodynamics of statistical inference by cells.

    Science.gov (United States)

    Lang, Alex H; Fisher, Charles K; Mora, Thierry; Mehta, Pankaj

    2014-10-03

    The deep connection between thermodynamics, computation, and information is now well established both theoretically and experimentally. Here, we extend these ideas to show that thermodynamics also places fundamental constraints on statistical estimation and learning. To do so, we investigate the constraints placed by (nonequilibrium) thermodynamics on the ability of biochemical signaling networks to estimate the concentration of an external signal. We show that accuracy is limited by energy consumption, suggesting that there are fundamental thermodynamic constraints on statistical inference.

  4. Unified Theory of Inference for Text Understanding

    Science.gov (United States)

    1986-11-25

    restaurant script is recognized, script application would lead to inferences such as identifying the waiter as ''the waiter who is employed by the...relations between the objects. Objects have names as a convenience for the system modeler, but the names are not used for purposes other than...intent is that we can consider talking to be a frame with a talker slot which must be filled by a person. This is just a convenient notation; the

  5. Inferring sparse networks for noisy transient processes

    Science.gov (United States)

    Tran, Hoang M.; Bukkapatnam, Satish T. S.

    2016-02-01

    Inferring causal structures of real world complex networks from measured time series signals remains an open issue. Current approaches are inadequate to discern direct from indirect influences (i.e., the presence or absence of a directed arc connecting two nodes) in the presence of noise, sparse interactions, and the nonlinear and transient dynamics of real world processes. We report a sparse regression (referred to as the -min) approach with theoretical bounds on the constraints on the allowable perturbation to recover the network structure that guarantees sparsity and robustness to noise. We also introduce averaging and perturbation procedures to further enhance prediction scores (i.e., reduce inference errors) and the numerical stability of the -min approach. Extensive investigations have been conducted with multiple benchmark simulated genetic regulatory networks and Michaelis-Menten dynamics, as well as real-world data sets from the DREAM5 challenge. These investigations suggest that our approach can significantly improve, oftentimes by 5 orders of magnitude, over previously reported methods for inferring the structure of dynamic networks, such as Bayesian network, network deconvolution, silencing and modular response analysis methods, by optimizing for sparsity, transients, noise and high-dimensionality issues.
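
    The abstract does not spell out the -min formulation, so the sketch below uses a generic L1-regularised regression (iterative soft-thresholding, ISTA) to recover one node's sparse incoming edges from simulated one-step linear dynamics. The network, noise level, and regularisation weight are all illustrative assumptions.

```python
import numpy as np

def ista(X, y, lam=0.1, n_iter=500):
    """L1-regularised least squares:  min_w 0.5*||Xw - y||^2 + lam*||w||_1,
    solved by gradient steps followed by soft-thresholding."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2     # 1/L for the smooth part
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w = w - step * (X.T @ (X @ w - y))     # gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

rng = np.random.default_rng(0)
states = rng.standard_normal((200, 4))         # observed states of 4 nodes
true_row = np.array([0.9, 0.0, 0.0, 0.4])      # node 0 has two incoming edges
next_state = states @ true_row + 0.05 * rng.standard_normal(200)
w = ista(states, next_state)                   # recovered incoming weights
```

    Repeating this per node yields a candidate sparse adjacency matrix; the paper's averaging and perturbation procedures would then be layered on top to stabilise the estimates.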

  6. Intuitive Mechanics: Inferences of Vertical Projectile Motion

    Directory of Open Access Journals (Sweden)

    Milana Damjenić

    2016-07-01

    Full Text Available Our intuitive knowledge of mechanics, i.e., knowledge defined through personal experience of velocity, acceleration, causes of motion, etc., is often wrong. This research examined whether similar misconceptions occur systematically in the case of vertical projectiles launched upwards. The first experiment examined inferences of the velocity and acceleration of a ball moving vertically upwards, while the second experiment examined whether the mass of the thrown ball and the force of the throw have an impact on the inference. The results showed that more than three quarters of the participants wrongly assumed that maximum velocity and peak acceleration did not occur at the initial launch of the projectile. There was no effect of object mass or of the force of the throw on inferences relating to the velocity and acceleration of the ball. The results exceed the explanatory reach of the impetus theory, most commonly used to explain the naive understanding of the mechanics of object motion. This research suggests that the actions-on-objects approach and the property transmission heuristics may more aptly explain the discrepancy between perceived and actual implications in projectile motion.

  7. Combinatorics of distance-based tree inference.

    Science.gov (United States)

    Pardi, Fabio; Gascuel, Olivier

    2012-10-01

    Several popular methods for phylogenetic inference (or hierarchical clustering) are based on a matrix of pairwise distances between taxa (or any kind of objects): The objective is to construct a tree with branch lengths so that the distances between the leaves in that tree are as close as possible to the input distances. If we hold the structure (topology) of the tree fixed, in some relevant cases (e.g., ordinary least squares) the optimal values for the branch lengths can be expressed using simple combinatorial formulae. Here we define a general form for these formulae and show that they all have two desirable properties: First, the common tree reconstruction approaches (least squares, minimum evolution), when used in combination with these formulae, are guaranteed to infer the correct tree when given enough data (consistency); second, the branch lengths of all the simple (nearest neighbor interchange) rearrangements of a tree can be calculated, optimally, in quadratic time in the size of the tree, thus allowing the efficient application of hill climbing heuristics. The study presented here is a continuation of that by Mihaescu and Pachter on branch length estimation [Mihaescu R, Pachter L (2008) Proc Natl Acad Sci USA 105:13206-13211]. The focus here is on the inference of the tree itself and on providing a basis for novel algorithms to reconstruct trees from distances.
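
    For the smallest instance, three taxa joined at a single internal node, the combinatorial formulae reduce to solving the three path-length equations d_ab = a + b, d_ac = a + c, d_bc = b + c exactly; a minimal sketch:

```python
def three_taxon_branch_lengths(d_ab, d_ac, d_bc):
    """Branch lengths a, b, c of the unique unrooted tree on leaves
    a, b, c, chosen so leaf-to-leaf path lengths match the input
    distances exactly (the OLS solution for this trivial topology)."""
    a = (d_ab + d_ac - d_bc) / 2.0
    b = (d_ab + d_bc - d_ac) / 2.0
    c = (d_ac + d_bc - d_ab) / 2.0
    return a, b, c
```

    For four or more taxa the distances generally cannot all be matched exactly, which is where the least-squares and minimum-evolution criteria discussed in the paper come in.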

  8. Inference of magnetic fields in inhomogeneous prominences

    Science.gov (United States)

    Milić, I.; Faurobert, M.; Atanacković, O.

    2017-01-01

    Context. Most of the quantitative information about the magnetic field vector in solar prominences comes from the analysis of the Hanle effect acting on lines formed by scattering. As these lines can be of non-negligible optical thickness, it is of interest to study the line formation process further. Aims: We investigate the multidimensional effects on the interpretation of spectropolarimetric observations, particularly on the inference of the magnetic field vector. We do this by analyzing the differences between multidimensional models, which involve fully self-consistent radiative transfer computations in the presence of spatial inhomogeneities and velocity fields, and those which rely on simple one-dimensional geometry. Methods: We study the formation of a prototype line in ad hoc inhomogeneous, isothermal 2D prominence models. We solve the NLTE polarized line formation problem in the presence of a large-scale oriented magnetic field. The resulting polarized line profiles are then interpreted (i.e. inverted) assuming a simple 1D slab model. Results: We find that differences between input and the inferred magnetic field vector are non-negligible. Namely, we almost universally find that the inferred field is weaker and more horizontal than the input field. Conclusions: Spatial inhomogeneities and radiative transfer have a strong effect on scattering line polarization in the optically thick lines. In real-life situations, ignoring these effects could lead to a serious misinterpretation of spectropolarimetric observations of chromospheric objects such as prominences.

  9. Inferring Pedigree Graphs from Genetic Distances

    Science.gov (United States)

    Tamura, Takeyuki; Ito, Hiro

    In this paper, we study the problem of inferring blood relationships which satisfy a given matrix of genetic distances between all pairs of n nodes. Blood relationships are represented by our proposed graph class, called a pedigree graph. A pedigree graph is a directed acyclic graph in which the maximum indegree is at most two. We show that the number of pedigree graphs which satisfy the condition of given genetic distances may be exponential, but they can be represented by one directed acyclic graph with n nodes. Moreover, an O(n^3) time algorithm which solves the problem is also given. Although phylogenetic trees and phylogenetic networks are similar data structures to pedigree graphs, it seems that inference methods for phylogenetic trees and networks cannot be applied to infer pedigree graphs, since nodes of phylogenetic trees and networks represent species whereas nodes of pedigree graphs represent individuals. We also show an O(n^2) time algorithm which detects a contradiction between a given pedigree graph and a distance matrix of genetic distances.
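
    The defining properties of the graph class can be checked directly; a sketch using Kahn's algorithm for the acyclicity test (the paper's distance-consistency algorithms are not reproduced here):

```python
from collections import deque

def is_pedigree_graph(n, edges):
    """Check the paper's definition: nodes 0..n-1 must form a DAG in which
    every node has indegree at most two (at most two parents)."""
    indeg = [0] * n
    adj = [[] for _ in range(n)]
    for u, v in edges:                 # edge u -> v: u is a parent of v
        adj[u].append(v)
        indeg[v] += 1
    if any(d > 2 for d in indeg):
        return False
    q = deque(i for i in range(n) if indeg[i] == 0)
    seen = 0
    while q:                           # Kahn's algorithm: peel zero-indegree nodes
        u = q.popleft()
        seen += 1
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    return seen == n                   # all nodes peeled <=> acyclic
```
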

  10. Bootstrapping phylogenies inferred from rearrangement data

    Directory of Open Access Journals (Sweden)

    Lin Yu

    2012-08-01

    Full Text Available Abstract Background Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. Results We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its

  11. Mediational Inferences in the Process of Counselor Judgment.

    Science.gov (United States)

    Haase, Richard F.; And Others

    1983-01-01

    Replicates research on the process of moving from observations to clinical judgments. Counselors (N=20) made status inferences, attributional inferences, and diagnostic classification of clients based on case folders. Results suggest the clinical judgment process was stagewise mediated, and attributional inferences had little direct impact on…

  12. Type Inference for Session Types in the Pi-Calculus

    DEFF Research Database (Denmark)

    Graversen, Eva Fajstrup; Harbo, Jacob Buchreitz; Huttel, Hans

    2014-01-01

    In this paper we present a direct algorithm for session type inference for the π-calculus. Type inference for session types has previously been achieved by either imposing limitations and restriction on the π-calculus, or by reducing the type inference problem to that for linear types. Our approach...

  13. Type Inference for Session Types in the Pi-Calculus

    DEFF Research Database (Denmark)

    Huttel, Hans; Graversen, Eva Fajstrup; Wahl, Sebastian

    2014-01-01

    In this paper we present a direct algorithm for session type inference for the π-calculus. Type inference for session types has previously been achieved by either imposing limitations and restriction on the π-calculus, or by reducing the type inference problem to that for linear types. Our approa...

  14. Classical and Bayesian aspects of robust unit root inference

    NARCIS (Netherlands)

    H. Hoek (Henk); H.K. van Dijk (Herman)

    1995-01-01

    This paper has two themes. First, we classify some effects which outliers in the data have on unit root inference. We show that, both in a classical and a Bayesian framework, the presence of additive outliers moves ‘standard’ inference towards stationarity. Second, we base inference on a

  15. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  16. Reasoning about Informal Statistical Inference: One Statistician's View

    Science.gov (United States)

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  17. Secure Electronic Cash Scheme with Anonymity Revocation

    Directory of Open Access Journals (Sweden)

    Baoyuan Kang

    2016-01-01

    Full Text Available In a popular electronic cash scheme, there are three participants: the bank, the customer, and the merchant. First, a customer opens an account in a bank. Then, he withdraws an e-cash from his account and pays it to a merchant. After checking the electronic cash’s validity, the merchant accepts it and deposits it to the bank. There are a number of requirements for an electronic cash scheme, such as anonymity, unforgeability, unreusability, divisibility, transferability, and portability. The anonymity property of electronic cash schemes can ensure the privacy of payers. However, this anonymity property is easily abused by criminals. In 2011, Chen et al. proposed a novel electronic cash system with trustee-based anonymity revocation from pairing. On demand, the trustee can disclose the identity for an e-cash. But in this paper we point out that Chen et al.’s scheme is subject to some drawbacks. To contribute a secure electronic cash scheme, we propose a new offline electronic cash scheme with anonymity revocation. We also provide formal security proofs of unlinkability and unforgeability. Furthermore, the proposed scheme ensures the property of avoiding merchant fraud.

  18. Ponzi scheme diffusion in complex networks

    Science.gov (United States)

    Zhu, Anding; Fu, Peihua; Zhang, Qinghe; Chen, Zhenyue

    2017-08-01

    Ponzi schemes taking the form of Internet-based financial schemes have been negatively affecting China's economy for the last two years. Because there is currently a lack of modeling research on Ponzi scheme diffusion within social networks, we develop a potential-investor-divestor (PID) model to investigate the diffusion dynamics of Ponzi schemes in both homogeneous and inhomogeneous networks. Our simulation study of artificial and real Facebook social networks shows that the structure of investor networks does indeed affect the characteristics of the dynamics. Both the average degree of the distribution and the power-law degree of the distribution will reduce the spreading critical threshold and will speed up the rate of diffusion. A high speed of diffusion is the key to alleviating the interest burden and improving the financial outcomes for the Ponzi scheme operator. The zero-crossing point of the fund flux function we introduce proves to be a feasible index for reflecting the fast-worsening situation of fiscal instability and predicting the forthcoming collapse. The faster the scheme diffuses, the higher a peak it will reach and the sooner it will collapse. We should keep a vigilant eye on the harm of Ponzi scheme diffusion through modern social networks.
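
    The paper's network simulations are not reproduced here; a mean-field sketch of the three compartments, with a net fund-flux proxy and purely illustrative parameters, shows the qualitative story the abstract describes: positive flux during the growth phase, a zero-crossing, then collapse.

```python
def simulate_pid(beta=0.6, gamma=0.1, promised_return=0.3,
                 p0=0.999, i0=0.001, dt=0.1, steps=600):
    """Mean-field potential-investor (P) -> investor (I) -> divestor (D)
    dynamics with contagion-style recruitment, plus a net fund-flux proxy
    for the operator: inflow from new investors minus payouts (principal
    plus the promised return) to divestors.  Unit investment per person."""
    p, i, d = p0, i0, 0.0
    flux = []
    for _ in range(steps):
        new_inv = beta * p * i * dt        # potential investors recruited
        new_div = gamma * i * dt           # investors cashing out
        p -= new_inv
        i += new_inv - new_div
        d += new_div
        flux.append(new_inv - (1.0 + promised_return) * new_div)
    return flux

flux = simulate_pid()
```

    Under these assumed parameters the flux starts positive, crosses zero once recruitment dries up, and stays negative thereafter, which is the collapse signature the zero-crossing index is meant to capture.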

  19. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Science.gov (United States)

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

    People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences, from intentionality and desire to belief to personality, that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research.

  20. A Spatial Domain Quantum Watermarking Scheme

    Science.gov (United States)

    Wei, Zhan-Hong; Chen, Xiu-Bo; Xu, Shu-Jiang; Niu, Xin-Xin; Yang, Yi-Xian

    2016-07-01

    This paper presents a spatial domain quantum watermarking scheme. For a quantum watermarking scheme, a feasible quantum circuit is key to achieving it. This paper gives a feasible quantum circuit for the presented scheme. In order to give the quantum circuit, a new quantum multi-control rotation gate, which can be achieved with quantum basic gates, is designed. With this quantum circuit, our scheme can arbitrarily control the embedding position of watermark images on carrier images with the aid of auxiliary qubits. Besides running the given quantum circuit in reverse, the paper gives another watermark-extraction algorithm based on quantum measurements. Moreover, this paper also gives a new quantum image scrambling method and its quantum circuit. Unlike other quantum watermarking schemes, all given quantum circuits can be implemented with basic quantum gates. Moreover, the scheme is a spatial domain watermarking scheme, not based on any transform algorithm on quantum images. Meanwhile, it keeps the watermark secure even if its presence is discovered. With the given quantum circuit, this paper implements simulation experiments for the presented scheme. The experimental results show that the scheme does well in visual quality and embedding capacity. Supported by the National Natural Science Foundation of China under Grant Nos. 61272514, 61170272, 61373131, 61121061, 61411146001, the program for New Century Excellent Talents under Grant No. NCET-13-0681, the National Development Foundation for Cryptological Research (Grant No. MMJJ201401012) and the Fok Ying Tung Education Foundation under Grant No. 131067, and the Shandong Provincial Natural Science Foundation of China under Grant No. ZR2013FM025

  1. Exploiting Same Scale Similarity in Fisher's Scheme

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yao

    2001-01-01

    The method proposed by Y. Fisher is the most popular fractal image coding scheme. In his scheme, domain blocks are constrained to be twice as large as range blocks in order to ensure the convergence of the iterative decoding stage. However, this constraint limits the fractal encoder's ability to exploit the self-similarity of the original image. To overcome this shortcoming, a novel scheme using same-sized range and domain blocks is proposed in this paper. Experimental results show improvements in compression performance.

  2. Mission Mangalam scheme: Exploring the opportunities.

    Directory of Open Access Journals (Sweden)

    Dr. Pallavi A. Upadhyay

    2015-01-01

    Full Text Available Background: Mission Mangalam was launched by the Gujarat Government in 2010. It is an integrated poverty alleviation approach and an initiative to empower women. Mission Mangalam helps women to earn their livelihood and become independent. These Sakhimandals are linked to banks to fulfill their funding requirements and receive financial assistance from banks. Some of the core benefits of the scheme can be linked with the health sector as well. Objectives: (1) To review the scheme of Mission Mangalam; (2) to explore the possibility of health linkage with the scheme; (3) to study the perception of beneficiaries and their socio-demographic profile. Methodology: A cross-sectional study. Sample size: 152 women members of Sakhimandals in Saraspur ward. The health of all members of the Mandals of Saraspur was checked by the Ahmedabad Municipal Corporation. Women of the Sakhimandals were interviewed to understand their perception of the scheme as well as any other health benefit they had experienced for themselves or their family members. Additionally, a community-based survey of 50 BPL families was carried out to assess the proportion of families covered under the scheme of Mission Mangalam. Results: Mean age of the women (n=152) was 31.81 years (SD=6.74). A maximum of 71 (47%) women were educated up to secondary level. Mean income was 5460 Rs/month (SD=1840). The mean number of family members was 5.4. 109 (72%) women reside in chali areas. 98 (64%) women were told about this scheme by a social worker; the others were told about it by a friend or UCD official. A paired t-test was carried out to find the increase in Hb levels of the beneficiary women; it was found to be significant (p=0.007, t=15.64). Age of women is associated with the habit of saving money (p=0.003). Only 22 (44%) of the 50 families visited have at least one member enrolled under the scheme. More stringent efforts for universal coverage have to be made by

  3. A SUBDIVISION SCHEME FOR VOLUMETRIC MODELS

    Institute of Scientific and Technical Information of China (English)

    Ghulam Mustafa; Liu Xuefeng

    2005-01-01

    In this paper, a subdivision scheme which generalizes a surface scheme from previous papers to volume meshes is designed. The scheme exhibits significant control over the shrinkage/size of volumetric models. It also has the ability to conveniently incorporate boundaries and creases into the smooth limit shape of models. The method presented here is much simpler and easier than MacCracken and Joy's. It places no restrictions on the local topology of meshes. In particular, it can be applied without any change to meshes of non-manifold topology.

  4. Consistency of non-minimal renormalisation schemes

    CERN Document Server

    Jack, I

    2016-01-01

    Non-minimal renormalisation schemes such as the momentum subtraction scheme (MOM) have frequently been used for physical computations. The consistency of such a scheme relies on the existence of a coupling redefinition linking it to MSbar. We discuss the implementation of this procedure in detail for a general theory and show how to construct the relevant redefinition up to three-loop order, for the case of a general theory of fermions and scalars in four dimensions and a general scalar theory in six dimensions.

  5. Optimal Sales Schemes for Network Goods

    DEFF Research Database (Denmark)

    Parakhonyak, Alexei; Vikander, Nick

    This paper examines the optimal sequencing of sales in the presence of network externalities. A firm sells a good to a group of consumers whose payoff from buying is increasing in total quantity sold. The firm selects the order to serve consumers so as to maximize expected sales. It can serve all...... consumers simultaneously, serve them all sequentially, or employ any intermediate scheme. We show that the optimal sales scheme is purely sequential, where each consumer observes all previous sales before choosing whether to buy himself. A sequential scheme maximizes the amount of information available...
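
    A toy comparison of the two extreme schemes (ours, not the paper's equilibrium analysis): consumers buy when their standalone value plus the observed network benefit covers the price. The valuations, externality strength alpha, and price below are assumed for illustration.

```python
def simultaneous_sales(values, price):
    """Pessimistic benchmark: nobody observes prior sales, so each consumer
    counts only their standalone value."""
    return sum(v >= price for v in values)

def sequential_sales(values, alpha, price):
    """Serve consumers one by one; each observes all previous sales, so the
    network benefit alpha * (units already sold) enters the payoff."""
    sold = 0
    for v in sorted(values, reverse=True):   # serve keenest consumers first
        if v + alpha * sold >= price:
            sold += 1
    return sold

values = [1.0, 0.8, 0.6, 0.4]                # standalone valuations
```

    With an externality, early sales make later consumers willing to buy, so the purely sequential scheme can sell to everyone while the simultaneous benchmark sells only to the single high-value consumer.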

  6. Chaotic cryptographic scheme and its randomness evaluation

    Science.gov (United States)

    Stoyanov, B. P.

    2012-10-01

    We propose a new cryptographic scheme based on the Lorenz chaos attractor and a 32-bit bent Boolean function. We evaluated the keystream generated by the scheme with batteries of the NIST statistical tests. We also applied a number of statistical analysis techniques, such as calculating histograms, correlations between two adjacent pixels, information entropy, and differential resistance, all applied to images encrypted by the proposed system. The results of the analysis show that the new cryptographic scheme ensures a secure way of sending digital data, with potential applications in real-time image encryption.
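
    A minimal sketch of a Lorenz-based keystream generator; the paper's 32-bit bent Boolean function component is not reproduced, and the integration step, quantisation rule, and initial conditions below are illustrative assumptions.

```python
def lorenz_keystream(n_bytes, x=0.1, y=0.0, z=0.0,
                     sigma=10.0, rho=28.0, beta=8.0 / 3.0,
                     dt=0.01, skip=1000):
    """Derive a byte keystream from the Lorenz attractor via Euler steps;
    the initial state (x, y, z) acts as the secret key, and a transient
    of `skip` steps is discarded before emitting bytes."""
    out = bytearray()
    for k in range(skip + n_bytes):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if k >= skip:
            out.append(int(abs(x) * 1e6) % 256)   # quantise one state variable
    return bytes(out)

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same call decrypts.
    return bytes(a ^ b for a, b in zip(data, key))
```

    In a real design the raw chaotic bytes would be post-processed (e.g. by the bent Boolean function) before use, which is what the NIST test batteries in the paper are assessing.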

  7. EFFICIENT IMAGE TRANSMISSION SCHEME IN FMT SYSTEM

    Institute of Scientific and Technical Information of China (English)

    Qi Zhongrui; Gao Zhenming

    2005-01-01

An efficient image transmission scheme is proposed based on byte partition and an adaptive sub-channel distribution technique in a Filtered MultiTone (FMT) system over a frequency-selective slow-fading channel. Simulation results and analysis of a typical image in MATLAB demonstrate a remarkable improvement in Peak Signal to Noise Ratio (PSNR) of the received image with low equalization complexity. Compared with a scheme using neither adaptation nor actual channel equalization, the proposed scheme saves over 6 dB at PSNR = 40 dB.
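The PSNR metric used to compare the schemes can be computed as follows; this is a standard definition for 8-bit images, sketched here over flat pixel lists rather than the paper's MATLAB pipeline:

```python
# PSNR between a reference image and a received image (8-bit peak value),
# the quality metric used in the abstract; a standard textbook definition.
import math

def psnr(ref, recv, max_val=255.0):
    # Mean squared error over flattened pixel values.
    mse = sum((a - b) ** 2 for a, b in zip(ref, recv)) / len(ref)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

print(round(psnr([100, 120, 130], [101, 119, 131]), 2))  # 48.13
```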

  8. Improved Quantum Signature Scheme with Weak Arbitrator

    Science.gov (United States)

    Su, Qi; Li, Wen-Min

    2013-09-01

In this paper, we present a man-in-the-middle attack on the quantum signature scheme with a weak arbitrator (Luo et al., Int. J. Theor. Phys., 51:2135, 2012). In that scheme, the authors proposed a quantum signature based on a quantum one-way function which contains both a phase for verifying the signer and a phase for verifying the signed message. However, our analysis shows that Eve can adopt different strategies in the respective phases to forge the signature without being detected. We then present an improved scheme to increase the security.

  9. Practical Coding Schemes for Cognitive Overlay Radios

    CERN Document Server

    Kurniawan, Ernest; Rini, Stefano

    2012-01-01

    We develop practical coding schemes for the cognitive overlay radios as modeled by the cognitive interference channel, a variation of the classical two user interference channel where one of the transmitters has knowledge of both messages. Inspired by information theoretical results, we develop a coding strategy for each of the three parameter regimes where capacity is known. A key feature of the capacity achieving schemes in these regimes is the joint decoding of both users' codewords, which we accomplish by performing a posteriori probability calculation over a combined trellis. The schemes are shown to perform close to the capacity limit with low error rate.

  10. Cognitive radio networks dynamic resource allocation schemes

    CERN Document Server

    Wang, Shaowei

    2014-01-01

    This SpringerBrief presents a survey of dynamic resource allocation schemes in Cognitive Radio (CR) Systems, focusing on the spectral-efficiency and energy-efficiency in wireless networks. It also introduces a variety of dynamic resource allocation schemes for CR networks and provides a concise introduction of the landscape of CR technology. The author covers in detail the dynamic resource allocation problem for the motivations and challenges in CR systems. The Spectral- and Energy-Efficient resource allocation schemes are comprehensively investigated, including new insights into the trade-off

  11. Finite-volume scheme for anisotropic diffusion

    Energy Technology Data Exchange (ETDEWEB)

    Es, Bram van, E-mail: bramiozo@gmail.com [Centrum Wiskunde & Informatica, P.O. Box 94079, 1090GB Amsterdam (Netherlands); FOM Institute DIFFER, Dutch Institute for Fundamental Energy Research (Netherlands)]; Koren, Barry [Eindhoven University of Technology (Netherlands)]; Blank, Hugo J. de [FOM Institute DIFFER, Dutch Institute for Fundamental Energy Research (Netherlands)]

    2016-02-01

    In this paper, we apply a special finite-volume scheme, limited to smooth temperature distributions and Cartesian grids, to test the importance of connectivity of the finite volumes. The area of application is nuclear fusion plasma with field line aligned temperature gradients and extreme anisotropy. We apply the scheme to the anisotropic heat-conduction equation, and compare its results with those of existing finite-volume schemes for anisotropic diffusion. Also, we introduce a general model adaptation of the steady diffusion equation for extremely anisotropic diffusion problems with closed field lines.
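The anisotropic heat-conduction equation referred to above can be written as follows, with $\hat{b}$ the unit vector along the magnetic field and $\kappa_\parallel \gg \kappa_\perp$ in the fusion-plasma regime the paper targets:

```latex
\frac{\partial T}{\partial t}
  = \nabla \cdot \left[
      \left( \kappa_\parallel\, \hat{b}\hat{b}^{\mathsf{T}}
           + \kappa_\perp \left( \mathbf{I} - \hat{b}\hat{b}^{\mathsf{T}} \right)
      \right) \nabla T
    \right]
```

The tensor in brackets conducts heat with coefficient $\kappa_\parallel$ along field lines and $\kappa_\perp$ across them; the extreme ratio between the two is what stresses the connectivity of the finite volumes.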

  12. An Optimal Labeling Scheme for Ancestry Queries

    OpenAIRE

    2009-01-01

    An ancestry labeling scheme assigns labels (bit strings) to the nodes of rooted trees such that ancestry queries between any two nodes in a tree can be answered merely by looking at their corresponding labels. The quality of an ancestry labeling scheme is measured by its label size, that is the maximal number of bits in a label of a tree node. In addition to its theoretical appeal, the design of efficient ancestry labeling schemes is motivated by applications in web search engines. For this p...
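To illustrate the query model, here is a simple interval-based ancestry labeling: each node is labeled with its DFS entry/exit times, and ancestry reduces to interval containment. This sketch shows the idea of answering queries from labels alone; it is not the optimal label-size scheme the abstract refers to.

```python
# Interval-based ancestry labeling: each node gets a half-open DFS
# interval (pre, post); u is an ancestor of v iff u's interval
# contains v's. Labels of size O(log n) bits per endpoint.

def label_tree(children, root):
    labels, clock = {}, [0]
    def dfs(u):
        pre = clock[0]; clock[0] += 1
        for c in children.get(u, []):
            dfs(c)
        labels[u] = (pre, clock[0])  # half-open interval [pre, post)
    dfs(root)
    return labels

def is_ancestor(labels, u, v):
    # Answered purely from the two labels, as in the abstract's model.
    return labels[u][0] <= labels[v][0] and labels[v][1] <= labels[u][1]

tree = {"r": ["a", "b"], "a": ["c"]}
L = label_tree(tree, "r")
print(is_ancestor(L, "r", "c"), is_ancestor(L, "b", "c"))  # True False
```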

  13. A sequential Monte Carlo framework for haplotype inference in CNV/SNP genotype data.

    Science.gov (United States)

    Iliadis, Alexandros; Anastassiou, Dimitris; Wang, Xiaodong

    2014-01-01

    Copy number variations (CNVs) are abundant in the human genome. They have been associated with complex traits in genome-wide association studies (GWAS) and are expected to continue playing an important role in identifying the etiology of disease phenotypes. As a result of current high-throughput whole-genome single-nucleotide polymorphism (SNP) arrays, we now have datasets that simultaneously contain integer copy numbers in CNV regions as well as SNP genotypes. At the same time, haplotypes, which have been shown to offer advantages over genotypes in identifying disease traits, are available for SNP genotypes but remain largely unavailable for CNV/SNP data due to insufficient computational tools. We introduce a new framework for inferring haplotypes in CNV/SNP data using a sequential Monte Carlo sampling scheme, 'Tree-Based Deterministic Sampling CNV' (TDSCNV). We compare our method with polyHap(v2.0), the only currently available software able to perform inference in CNV/SNP genotypes, on datasets with varying numbers of markers. We have found that both algorithms show similar accuracy, but TDSCNV is an order of magnitude faster while scaling linearly with the number of markers and the number of individuals, and thus could be the method of choice for haplotype inference in such datasets. Our method is implemented in the TDSCNV package, which is available for download at http://www.ee.columbia.edu/~anastas/tdscnv.

  14. Assessing colour-dependent occupation statistics inferred from galaxy group catalogues

    Science.gov (United States)

    Campbell, Duncan; van den Bosch, Frank C.; Hearin, Andrew; Padmanabhan, Nikhil; Berlind, Andreas; Mo, H. J.; Tinker, Jeremy; Yang, Xiaohu

    2015-09-01

    We investigate the ability of current implementations of galaxy group finders to recover colour-dependent halo occupation statistics. To test the fidelity of group catalogue inferred statistics, we run three different group finders used in the literature over a mock that includes galaxy colours in a realistic manner. Overall, the resulting mock group catalogues are remarkably similar, and most colour-dependent statistics are recovered with reasonable accuracy. However, it is also clear that certain systematic errors arise as a consequence of correlated errors in group membership determination, central/satellite designation, and halo mass assignment. We introduce a new statistic, the halo transition probability (HTP), which captures the combined impact of all these errors. As a rule of thumb, errors tend to equalize the properties of distinct galaxy populations (i.e. red versus blue galaxies or centrals versus satellites), and to result in inferred occupation statistics that are more accurate for red galaxies than for blue galaxies. A statistic that is particularly poorly recovered from the group catalogues is the red fraction of central galaxies as a function of halo mass. Group finders do a good job in recovering galactic conformity, but also have a tendency to introduce weak conformity when none is present. We conclude that proper inference of colour-dependent statistics from group catalogues is best achieved using forward modelling (i.e. running group finders over mock data) or by implementing a correction scheme based on the HTP, as long as the latter is not too strongly model dependent.

  15. Cosmological parameters, shear maps and power spectra from CFHTLenS using Bayesian hierarchical inference

    CERN Document Server

    Alsing, Justin; Jaffe, Andrew H

    2016-01-01

    We apply two Bayesian hierarchical inference schemes to infer shear power spectra, shear maps and cosmological parameters from the CFHTLenS weak lensing survey - the first application of this method to data. In the first approach, we sample the joint posterior distribution of the shear maps and power spectra by Gibbs sampling, with minimal model assumptions. In the second approach, we sample the joint posterior of the shear maps and cosmological parameters, providing a new, accurate and principled approach to cosmological parameter inference from cosmic shear data. As a first demonstration on data we perform a 2-bin tomographic analysis to constrain cosmological parameters and investigate the possibility of photometric redshift bias in the CFHTLenS data. Under the baseline $\Lambda$CDM model we constrain $S_8 = \sigma_8(\Omega_\mathrm{m}/0.3)^{0.5} = 0.67^{+0.03}_{-0.03}$ ($68\%$), consistent with previous CFHTLenS analysis but in tension with Planck. Adding neutrino m...
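The Gibbs-sampling pattern used in the first approach (alternately drawing each block of unknowns conditioned on the others) can be illustrated on a toy bivariate normal; this is a generic textbook sketch, not the CFHTLenS map/power-spectrum sampler:

```python
# Toy Gibbs sampler for a bivariate normal with correlation rho:
# alternately draw x | y and y | x from their exact conditionals.
# Illustrates the blockwise-conditional pattern only.
import random

def gibbs_bivariate_normal(n_samples, rho=0.8, seed=1):
    rng = random.Random(seed)
    x, y, samples = 0.0, 0.0, []
    sd = (1.0 - rho * rho) ** 0.5  # conditional standard deviation
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.gauss(rho * x, sd)  # y | x ~ N(rho*x, 1 - rho^2)
        samples.append((x, y))
    return samples

s = gibbs_bivariate_normal(5000)
mean_x = sum(p[0] for p in s) / len(s)
print(len(s))  # 5000
```

In the survey application, the role of `x` and `y` is played by the shear maps and the power spectra, each drawn from its conditional posterior in turn.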

  16. Inference of gene regulatory networks from genetic perturbations with linear regression model.

    Directory of Open Access Journals (Sweden)

    Zijian Dong

    Full Text Available Using both genetic perturbation data and gene expression data to infer regulatory networks is an effective strategy that aims to improve the detection accuracy of regulatory relationships among genes. Based on both types of data, genetic regulatory networks can be accurately modeled by Structural Equation Modeling (SEM). In this paper, a linear regression (LR) model is formulated based on the SEM, and a novel iterative scheme using Bayesian inference is proposed to estimate the parameters of the LR model (LRBI). Comparative evaluations of LRBI with two other algorithms, the Adaptive Lasso (AL-Based) and the Sparsity-aware Maximum Likelihood (SML), are also presented. Simulations show that LRBI performs significantly better than AL-Based, and outperforms SML in terms of power of detection. Applying the LRBI algorithm to experimental data, we inferred the interactions in a network of 35 yeast genes. An open-source program of the LRBI algorithm is freely available upon request.
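A minimal Bayesian linear-regression estimate (a conjugate Gaussian prior, equivalent to ridge regression in closed form) conveys the flavor of estimating one regulatory weight from data; the function name, parameters, and one-regulator simplification are illustrative, not the LRBI algorithm itself:

```python
# Posterior-mean weight estimate for y = w*x + noise under a Gaussian
# prior on w (precision prior_prec) and Gaussian noise (precision
# noise_prec). Scalar (one regulator) for clarity; a sketch, not LRBI.

def bayes_linreg(X, y, prior_prec=1.0, noise_prec=1.0):
    # Posterior mean: noise_prec * X^T y / (prior_prec + noise_prec * X^T X)
    xtx = sum(x * x for x in X)
    xty = sum(x * t for x, t in zip(X, y))
    return noise_prec * xty / (prior_prec + noise_prec * xtx)

X = [1.0, 2.0, 3.0]          # expression of a putative regulator
y = [2.1, 3.9, 6.0]          # expression of the target gene
print(round(bayes_linreg(X, y), 2))  # 1.86
```

The prior precision shrinks the estimate toward zero, which is how sparsity-favoring schemes suppress spurious regulatory edges.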

  17. Inferring Acceptance and Rejection in Dialogue by Default Rules of Inference

    CERN Document Server

    Walker, M A

    1996-01-01

    This paper discusses the processes by which conversants in a dialogue can infer whether their assertions and proposals have been accepted or rejected by their conversational partners. It expands on previous work by showing that logical consistency is a necessary indicator of acceptance, but that it is not sufficient, and that logical inconsistency is sufficient as an indicator of rejection, but it is not necessary. I show how conversants can use information structure and prosody as well as logical reasoning in distinguishing between acceptances and logically consistent rejections, and relate this work to previous work on implicature and default reasoning by introducing three new classes of rejection: {\\sc implicature rejections}, {\\sc epistemic rejections} and {\\sc deliberation rejections}. I show how these rejections are inferred as a result of default inferences, which, by other analyses, would have been blocked by the context. In order to account for these facts, I propose a model of the common ground that...

  18. DEVELOPMENT AND APPLICATIONS OF WENO SCHEMES IN CONTINUUM PHYSICS

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper briefly presents the general ideas of high order accurate weighted essentially non-oscillatory (WENO) schemes, and describes the similarities and differences of the two classes of WENO schemes: finite volume schemes and finite difference schemes. We also briefly mention a recent development of WENO schemes, namely an adaptive approach within the finite difference framework using smooth time-dependent curvilinear coordinates.
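The core WENO idea (blending candidate stencils with nonlinear weights driven by smoothness indicators) can be sketched at third order; this minimal reconstruction of the interface value from three cell averages illustrates the mechanism, not a full solver:

```python
# Third-order WENO reconstruction of u_{i+1/2} from cell averages
# (u[i-1], u[i], u[i+1]): two 2-point candidate stencils are blended
# with weights that shut off the stencil crossing a discontinuity.

def weno3(u, eps=1e-6):
    um1, u0, up1 = u
    # Candidate stencil reconstructions.
    p0 = -0.5 * um1 + 1.5 * u0   # left-biased stencil {i-1, i}
    p1 = 0.5 * u0 + 0.5 * up1    # centered stencil {i, i+1}
    # Smoothness indicators (squared first differences).
    b0 = (u0 - um1) ** 2
    b1 = (up1 - u0) ** 2
    # Nonlinear weights from linear weights (1/3, 2/3).
    a0 = (1.0 / 3.0) / (eps + b0) ** 2
    a1 = (2.0 / 3.0) / (eps + b1) ** 2
    w0 = a0 / (a0 + a1)
    return w0 * p0 + (1.0 - w0) * p1

print(weno3([1.0, 2.0, 3.0]))  # 2.5 (exact on linear data)
```

On smooth data the nonlinear weights revert to the linear ones, recovering full order; near a jump the oscillatory stencil's weight collapses toward zero.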

  19. SUPPLEMENT AND IMPROVEMENT OF HOLLY-PREISSMANN SCHEME

    Institute of Scientific and Technical Information of China (English)

    XIE Zuo-tao; ZHANG Xiao-feng; TAN Guang-ming

    2004-01-01

    Using the undetermined coefficient method, the Holly-Preissmann scheme is improved effectively. A variant of the scheme for negative velocity is added, and a new conservative scheme is also presented on the basis of the original scheme. Simulations with the new scheme accord with the exact result, which enhances its applicability in engineering.

  20. A two-stage scheme for multi-view human pose estimation

    Science.gov (United States)

    Yan, Junchi; Sun, Bing; Liu, Yuncai

    2010-08-01

    We present a two-stage scheme integrating voxel reconstruction and human motion tracking. By combining voxel reconstruction with human motion tracking interactively, our method can work in a cluttered background where perfect foreground silhouettes are hardly available. For each frame, a silhouette-based 3D volume reconstruction method and a hierarchical tracking algorithm are applied in two stages. In the first stage, coarse reconstruction and tracking results are obtained; the reconstruction is then refined in the second stage. The experimental results demonstrate that our approach is promising. Although this paper focuses on human body voxel reconstruction and motion tracking, our scheme can be used to reconstruct voxel data and infer the pose of many specified rigid and articulated objects.