WorldWideScience

Sample records for nonparametric belief propagation

  1. Indoor Positioning Using Nonparametric Belief Propagation Based on Spanning Trees

    Directory of Open Access Journals (Sweden)

    Savic Vladimir

    2010-01-01

    Full Text Available Nonparametric belief propagation (NBP) is one of the best-known methods for cooperative localization in sensor networks. It can provide location estimates with appropriate uncertainty and accommodate non-Gaussian distance measurement errors. However, the accuracy of NBP is questionable in loopy networks. Therefore, in this paper, we propose a novel approach, NBP based on spanning trees (NBP-ST) created by the breadth-first search (BFS) method. In addition, we propose a reliable indoor model based on measurements obtained in our lab. According to our simulation results, NBP-ST performs better than NBP in terms of accuracy and communication cost in networks with high connectivity (i.e., highly loopy networks). Furthermore, the computational and communication costs are nearly constant with respect to the transmission radius. However, the drawbacks of the proposed method are a slightly higher computational cost and poor performance in sparsely connected networks.
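
    The spanning-tree construction that NBP-ST relies on is plain breadth-first search from a chosen root. A minimal sketch of that step (plain Python, hypothetical data layout, not the authors' implementation):

```python
from collections import deque

def bfs_spanning_tree(adjacency, root):
    """Return the edge list of a BFS spanning tree of an undirected graph.

    adjacency: dict mapping each node to an iterable of its neighbours.
    root: the node at which the breadth-first search starts.
    """
    visited = {root}
    tree_edges = []
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for neighbour in adjacency[node]:
            if neighbour not in visited:
                visited.add(neighbour)
                tree_edges.append((node, neighbour))
                queue.append(neighbour)
    return tree_edges

# Toy 5-node network containing a loop; NBP-ST would then run NBP on the loop-free tree.
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1], 4: [2]}
print(bfs_spanning_tree(graph, root=0))   # [(0, 1), (0, 2), (1, 3), (2, 4)]
```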

  2. Counting Belief Propagation

    CERN Document Server

    Kersting, Kristian; Natarajan, Sriraam

    2012-01-01

    A major benefit of graphical models is that most knowledge is captured in the model structure. Many models, however, produce inference problems with many symmetries not reflected in the graphical structure and hence not exploitable by efficient inference techniques such as belief propagation (BP). In this paper, we present a new and simple BP algorithm, called counting BP, that exploits such additional symmetries. Starting from a given factor graph, counting BP first constructs a compressed factor graph of clusternodes and clusterfactors, corresponding to sets of nodes and factors that are indistinguishable given the evidence. Then it runs a modified BP algorithm on the compressed graph that is equivalent to running BP on the original factor graph. Our experiments show that counting BP is applicable to a variety of important AI tasks such as (dynamic) relational models and Boolean model counting, and that significant efficiency gains are obtainable, often by orders of magnitude.
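
    The compression step described above (grouping nodes and factors that are indistinguishable given the evidence) can be sketched as iterated signature refinement on the bipartite factor graph. The sketch below is a hedged illustration of that idea, not the authors' implementation; the data layout and helper names are assumptions.

```python
from collections import defaultdict

def compress_factor_graph(var_neighbors, fac_neighbors, evidence, max_iters=20):
    """Group variables and factors that are indistinguishable given the evidence
    by iterated colour (signature) refinement on the bipartite factor graph.

    var_neighbors: dict variable -> list of factor ids
    fac_neighbors: dict factor   -> list of variable ids
    evidence:      dict variable -> observed value (unobserved variables are absent)
    Returns (variable clusters, factor clusters), each a dict colour -> members.
    """
    # Initial colours: variables are distinguished only by their evidence; hashes
    # serve as compact colour ids (collisions are ignored in this sketch).
    var_col = {v: hash(('var', evidence.get(v))) for v in var_neighbors}
    fac_col = {f: hash('fac') for f in fac_neighbors}
    for _ in range(max_iters):
        new_fac = {f: hash((fac_col[f], tuple(sorted(var_col[v] for v in fac_neighbors[f]))))
                   for f in fac_neighbors}
        new_var = {v: hash((var_col[v], tuple(sorted(new_fac[f] for f in var_neighbors[v]))))
                   for v in var_neighbors}
        stable = (len(set(new_var.values())) == len(set(var_col.values())) and
                  len(set(new_fac.values())) == len(set(fac_col.values())))
        var_col, fac_col = new_var, new_fac
        if stable:      # no colour class was split, so the partition is stable
            break
    var_clusters, fac_clusters = defaultdict(list), defaultdict(list)
    for v, c in var_col.items():
        var_clusters[c].append(v)
    for f, c in fac_col.items():
        fac_clusters[c].append(f)
    return dict(var_clusters), dict(fac_clusters)

# Factors f1=(A,B) and f2=(A,C) with B, C unobserved are indistinguishable and collapse.
variables = {'A': ['f1', 'f2'], 'B': ['f1'], 'C': ['f2']}
factors = {'f1': ['A', 'B'], 'f2': ['A', 'C']}
print(compress_factor_graph(variables, factors, evidence={}))
```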

  3. Negative Tree Reweighted Belief Propagation

    CERN Document Server

    Liu, Qiang

    2012-01-01

    We introduce a new class of lower bounds on the log partition function of a Markov random field which makes use of a reversed Jensen's inequality. In particular, our method approximates the intractable distribution using a linear combination of spanning trees with negative weights. This technique is a lower-bound counterpart to the tree-reweighted belief propagation algorithm, which uses a convex combination of spanning trees with positive weights to provide corresponding upper bounds. We develop algorithms to optimize and tighten the lower bounds over the non-convex set of valid parameter values. Our algorithm generalizes mean field approaches (including naive and structured mean field approximations), which it includes as a limiting case.

  4. Improved Generalized Belief Propagation for Vision Processing

    Directory of Open Access Journals (Sweden)

    S. Y. Chen

    2011-01-01

    Full Text Available Generalized belief propagation (GBP) is a region-based belief propagation algorithm which can achieve good convergence in Markov random fields. However, its computation time is too high for practical engineering applications. This paper proposes a method to accelerate GBP. A caching technique and a chessboard passing strategy are used to speed up the algorithm. Then, the direction set method is used to reduce the complexity of computing clique messages from quartic to cubic. With such a strategy the processing speed can be greatly increased. Besides, it is the first attempt to apply GBP to the stereo matching problem. Experiments show that the proposed algorithm speeds up computation by more than 15 times for a typical stereo matching problem and infers a more plausible result.

  5. Belief Propagation Algorithm for Portfolio Optimization Problems.

    Science.gov (United States)

    Shinzato, Takashi; Yasuda, Muneki

    2015-01-01

    The typical behavior of optimal solutions to portfolio optimization problems with absolute deviation and expected shortfall models was first estimated using replica analysis in a pioneering study by S. Ciliberti et al. [Eur. Phys. J. B 57, 175 (2007)]; however, they did not develop an approximate derivation method for finding the optimal portfolio with respect to a given return set. In this study, an approximation algorithm based on belief propagation for the portfolio optimization problem is presented using the Bethe free energy formalism, and the consistency of the numerical experimental results of the proposed algorithm with those of replica analysis is confirmed. Furthermore, the conjecture of H. Konno and H. Yamazaki, that the optimal solutions with the absolute deviation model and with the mean-variance model have the same typical behavior, is verified using replica analysis and the belief propagation algorithm.

  6. Belief Propagation Algorithm for Portfolio Optimization Problems.

    Directory of Open Access Journals (Sweden)

    Takashi Shinzato

    Full Text Available The typical behavior of optimal solutions to portfolio optimization problems with absolute deviation and expected shortfall models was first estimated using replica analysis in a pioneering study by S. Ciliberti et al. [Eur. Phys. J. B 57, 175 (2007)]; however, they did not develop an approximate derivation method for finding the optimal portfolio with respect to a given return set. In this study, an approximation algorithm based on belief propagation for the portfolio optimization problem is presented using the Bethe free energy formalism, and the consistency of the numerical experimental results of the proposed algorithm with those of replica analysis is confirmed. Furthermore, the conjecture of H. Konno and H. Yamazaki, that the optimal solutions with the absolute deviation model and with the mean-variance model have the same typical behavior, is verified using replica analysis and the belief propagation algorithm.

  7. Belief Propagation Methods for Intercell Interference Coordination

    CERN Document Server

    Rangan, Sundeep

    2010-01-01

    We consider a broad class of interference coordination and resource allocation problems for wireless links where the goal is to maximize the sum of functions of individual link rates. Such problems arise in the context of, for example, fractional frequency reuse (FFR) for macro-cellular networks and dynamic interference management in femtocells. The resulting optimization problems are typically hard to solve optimally even using centralized algorithms but are an essential computational step in implementing rate-fair and queue stabilizing scheduling policies in wireless networks. We consider a belief propagation framework to solve such problems approximately. In particular, we construct approximations to the belief propagation iterations to obtain computationally simple and distributed algorithms with low communication overhead. Notably, our methods are very general and apply to, for example, the optimization of transmit powers, transmit beamforming vectors, and sub-band allocation to maximize the above object...

  8. DNA motif elucidation using belief propagation.

    Science.gov (United States)

    Wong, Ka-Chun; Chan, Tak-Ming; Peng, Chengbin; Li, Yue; Zhang, Zhaolei

    2013-09-01

    Protein-binding microarray (PBM) is a high-throughput platform that can measure the DNA-binding preference of a protein in a comprehensive and unbiased manner. A typical PBM experiment can measure binding signal intensities of a protein to all the possible DNA k-mers (k=8∼10); such comprehensive binding affinity data usually need to be reduced and represented as motif models before they can be further analyzed and applied. Since proteins can often bind to DNA in multiple modes, one of the major challenges is to decompose the comprehensive affinity data into multimodal motif representations. Here, we describe a new algorithm that uses Hidden Markov Models (HMMs) and can derive precise and multimodal motifs using belief propagation. We describe an HMM-based approach using belief propagation (kmerHMM), which accepts and preprocesses PBM probe raw data into median binding intensities of individual k-mers. The k-mers are ranked and aligned for training an HMM as the underlying motif representation. Multiple motifs are then extracted from the HMM using belief propagation. Comparisons of kmerHMM with other leading methods on several data sets demonstrated its effectiveness and uniqueness. In particular, it achieved the best performance on more than half of the data sets. In addition, the multiple binding modes derived by kmerHMM are biologically meaningful and will be useful in interpreting other genome-wide data such as those generated from ChIP-seq. The executables and source code are available at the authors' websites: e.g. http://www.cs.toronto.edu/∼wkc/kmerHMM.

  9. DNA motif elucidation using belief propagation

    KAUST Repository

    Wong, Ka-Chun

    2013-06-29

    Protein-binding microarray (PBM) is a high-throughput platform that can measure the DNA-binding preference of a protein in a comprehensive and unbiased manner. A typical PBM experiment can measure binding signal intensities of a protein to all the possible DNA k-mers (k = 8∼10); such comprehensive binding affinity data usually need to be reduced and represented as motif models before they can be further analyzed and applied. Since proteins can often bind to DNA in multiple modes, one of the major challenges is to decompose the comprehensive affinity data into multimodal motif representations. Here, we describe a new algorithm that uses Hidden Markov Models (HMMs) and can derive precise and multimodal motifs using belief propagation. We describe an HMM-based approach using belief propagation (kmerHMM), which accepts and preprocesses PBM probe raw data into median binding intensities of individual k-mers. The k-mers are ranked and aligned for training an HMM as the underlying motif representation. Multiple motifs are then extracted from the HMM using belief propagation. Comparisons of kmerHMM with other leading methods on several data sets demonstrated its effectiveness and uniqueness. In particular, it achieved the best performance on more than half of the data sets. In addition, the multiple binding modes derived by kmerHMM are biologically meaningful and will be useful in interpreting other genome-wide data such as those generated from ChIP-seq. The executables and source code are available at the authors' websites: e.g. http://www.cs.toronto.edu/∼wkc/kmerHMM. © 2013 The Author(s).

  10. Gaussian Belief Propagation Based Multiuser Detection

    CERN Document Server

    Bickson, Danny; Shental, Ori; Siegel, Paul H; Wolf, Jack K

    2008-01-01

    In this work, we present a novel construction for solving the linear multiuser detection problem using the Gaussian Belief Propagation algorithm. Our algorithm yields an efficient, iterative and distributed implementation of the MMSE detector. We compare our algorithm's performance to a recent result and show reduced memory consumption, fewer computation steps and a reduction in the number of sent messages. We prove that recent work by Montanari et al. is an instance of our general algorithm, providing new convergence results for both algorithms.
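
    Computing the MMSE estimate ultimately amounts to solving a linear system A x = b, which is exactly the task GaBP addresses. Below is a minimal, hedged sketch of a synchronous GaBP solver for a symmetric, diagonally dominant system; it illustrates the algorithm family, not the authors' specific construction.

```python
import numpy as np

def gabp_solve(A, b, iters=100, tol=1e-9):
    """Solve A x = b with Gaussian belief propagation (sum-product on the Gaussian
    model p(x) ~ exp(-0.5 x'Ax + b'x)).  A must be symmetric and, for guaranteed
    convergence, walk-summable (e.g. diagonally dominant)."""
    n = len(b)
    # Messages from i to j, stored as (precision, precision-mean) pairs.
    P = np.zeros((n, n))   # P[i, j] = precision of message i -> j
    h = np.zeros((n, n))   # h[i, j] = precision-mean of message i -> j
    neighbors = [np.flatnonzero((A[i] != 0) & (np.arange(n) != i)) for i in range(n)]
    for _ in range(iters):
        P_new, h_new = np.zeros_like(P), np.zeros_like(h)
        for i in range(n):
            for j in neighbors[i]:
                # Node potential plus all incoming messages except the one from j.
                P_i = A[i, i] + P[:, i].sum() - P[j, i]
                h_i = b[i] + h[:, i].sum() - h[j, i]
                P_new[i, j] = -A[i, j] ** 2 / P_i
                h_new[i, j] = -A[i, j] * h_i / P_i
        converged = (np.max(np.abs(P_new - P)) < tol and np.max(np.abs(h_new - h)) < tol)
        P, h = P_new, h_new
        if converged:
            break
    precision = np.diag(A) + P.sum(axis=0)
    return (b + h.sum(axis=0)) / precision   # posterior means = solution of A x = b

# Sanity check on a small diagonally dominant system.
A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(gabp_solve(A, b), np.linalg.solve(A, b))   # the two solutions agree
```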

  11. Relaxed Half-Stochastic Belief Propagation

    CERN Document Server

    Leduc-Primeau, François; Mannor, Shie; Gross, Warren J

    2012-01-01

    Low-density parity-check codes are attractive for high-throughput applications not only because of their low decoding complexity per bit, but also because all the codeword bits can be decoded in parallel. However, achieving this in a circuit implementation is complicated by the number of wires required to exchange messages between processing nodes. Decoding algorithms that exchange binary messages are interesting for fully parallel implementations because they can reduce the number and the length of the wires, and increase logic density. This paper introduces the Relaxed Half-Stochastic (RHS) decoding algorithm, a binary-message belief propagation (BP) algorithm that achieves a coding gain comparable to the best known BP algorithms that use real-valued messages. We derive the RHS algorithm by starting from the well-known Sum-Product algorithm, and then derive a low-complexity version suitable for circuit implementation. We present extensive simulation results on two standardized codes having different rates and constr...

  12. Fixing convergence of Gaussian belief propagation

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jason K [Los Alamos National Laboratory]; Bickson, Danny [IBM RESEARCH LAB]; Dolev, Danny [HEBREW UNIV]

    2009-01-01

    Gaussian belief propagation (GaBP) is an iterative message-passing algorithm for inference in Gaussian graphical models. It is known that when GaBP converges it converges to the correct MAP estimate of the Gaussian random vector, and simple sufficient conditions for its convergence have been established. In this paper we develop a double-loop algorithm for forcing convergence of GaBP. Our method computes the correct MAP estimate even in cases where standard GaBP would not converge. We further extend this construction to compute least-squares solutions of over-constrained linear systems. We believe that our construction has numerous applications, since the GaBP algorithm is linked to the solution of linear systems of equations, which is a fundamental problem in computer science and engineering. As a case study, we discuss the linear detection problem. We show that using our new construction, we are able to force convergence of Montanari's linear detection algorithm in cases where it would originally fail. As a consequence, we are able to significantly increase the number of users that can transmit concurrently.
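
    The double-loop idea can be illustrated with simple diagonal loading: the inner loop solves a better-conditioned system on which GaBP is guaranteed to converge, and the outer loop compensates for the added diagonal. A hedged sketch (np.linalg.solve stands in for the inner GaBP solver; this is not the paper's exact construction):

```python
import numpy as np

def double_loop_solve(A, b, gamma=1.0, outer_iters=200, tol=1e-10):
    """Solve A x = b for symmetric positive definite A via the fixed point
    x = (A + gamma*I)^{-1} (b + gamma*x).  The loaded matrix A + gamma*I can be
    made diagonally dominant so that an inner GaBP solver would converge;
    np.linalg.solve is used here as a stand-in for that inner solver."""
    n = len(b)
    A_loaded = A + gamma * np.eye(n)
    x = np.zeros(n)
    for _ in range(outer_iters):
        x_new = np.linalg.solve(A_loaded, b + gamma * x)   # inner solve (GaBP in the paper)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Symmetric positive definite but not diagonally dominant, so plain GaBP may fail.
A = np.array([[1.0, 0.6, 0.6], [0.6, 1.0, 0.6], [0.6, 0.6, 1.0]])
b = np.array([1.0, 0.0, -1.0])
print(double_loop_solve(A, b), np.linalg.solve(A, b))   # the two solutions agree
```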

  13. Polynomial Linear Programming with Gaussian Belief Propagation

    CERN Document Server

    Bickson, Danny; Shental, Ori; Dolev, Danny

    2008-01-01

    Interior-point methods are state-of-the-art algorithms for solving linear programming (LP) problems with polynomial complexity. Specifically, the Karmarkar algorithm typically solves LP problems in time O(n^3.5), where n is the number of unknown variables. Karmarkar's celebrated algorithm is known to be an instance of the log-barrier method using the Newton iteration. The main computational overhead of this method is in inverting the Hessian matrix of the Newton iteration. In this contribution, we propose the application of the Gaussian belief propagation (GaBP) algorithm as part of an efficient and distributed LP solver that exploits the sparse and symmetric structure of the Hessian matrix and avoids the need for direct matrix inversion. This approach shifts the computation from the realm of linear algebra to that of probabilistic inference on graphical models, thus applying GaBP as an efficient inference engine. Our construction is general and can be used for any interior-point algorithm which uses the Newt...

  14. Synthesis of MCMC and Belief Propagation

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sungsoo [Korea Advanced Institute of Science and Technology, Daejeon (South Korea); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shin, Jinwoo [Korea Advanced Institute of Science and Technology, Daejeon (South Korea)

    2016-05-27

    Markov Chain Monte Carlo (MCMC) and Belief Propagation (BP) are the most popular algorithms for computational inference in Graphical Models (GM). In principle, MCMC is an exact probabilistic method which, however, often suffers from exponentially slow mixing. In contrast, BP is a deterministic method which is typically fast and empirically very successful, but in general lacks control of accuracy over loopy graphs. In this paper, we introduce MCMC algorithms correcting the approximation error of BP, i.e., we provide a way to compensate for BP errors via a consecutive BP-aware MCMC. Our framework is based on the Loop Calculus (LC) approach, which allows one to express the BP error as a sum of weighted generalized loops. Although the full series is computationally intractable, it is known that a truncated series, summing up all 2-regular loops, is computable in polynomial time for planar pair-wise binary GMs, and it also provides a highly accurate approximation empirically. Motivated by this, we first propose a polynomial-time approximation MCMC scheme for the truncated series of general (non-planar) pair-wise binary models. Our main idea here is to use the Worm algorithm, known to provide fast mixing in other (related) problems, and then design an appropriate rejection scheme to sample 2-regular loops. Furthermore, we also design an efficient rejection-free MCMC scheme for approximating the full series. The main novelty underlying our design is in utilizing the concept of cycle basis, which provides an efficient decomposition of the generalized loops. In essence, the proposed MCMC schemes run on a transformed GM built upon the non-trivial BP solution, and our experiments show that this synthesis of BP and MCMC outperforms both direct MCMC and bare BP schemes.

  15. Merging Belief Propagation and the Mean Field Approximation

    DEFF Research Database (Denmark)

    Riegler, Erwin; Kirkelund, Gunvor Elisabeth; Manchón, Carles Navarro

    2010-01-01

    We present a joint message passing approach that combines belief propagation and the mean field approximation. Our analysis is based on the region-based free energy approximation method proposed by Yedidia et al., which allows the use of the same objective function (Kullback-Leibler divergence...

  16. Efficient Decoding of Turbo Codes with Nonbinary Belief Propagation

    Directory of Open Access Journals (Sweden)

    Thierry Lestable

    2008-05-01

    Full Text Available This paper presents a new approach to decoding turbo codes using a nonbinary belief propagation decoder. The proposed approach can be decomposed into two main steps. First, a nonbinary Tanner graph representation of the turbo code is derived by clustering the binary parity-check matrix of the turbo code. Then, a group belief propagation decoder runs several iterations on the obtained nonbinary Tanner graph. We show in particular that it is necessary to add a preprocessing step on the parity-check matrix of the turbo code in order to ensure good topological properties of the Tanner graph and hence good iterative decoding performance. Finally, by capitalizing on the diversity which comes from the existence of distinct efficient preprocessings, we propose a new decoding strategy, called decoder diversity, that exploits this diversity through collaborative decoding schemes.

  17. Bayesian Hypothesis Test for Sparse Support Recovery using Belief Propagation

    CERN Document Server

    Kang, Jaewook; Kim, Kiseon

    2012-01-01

    In this paper, we introduce a new support recovery algorithm from noisy measurements called Bayesian hypothesis test via belief propagation (BHT-BP). BHT-BP focuses on sparse support recovery rather than sparse signal estimation. The key idea behind BHT-BP is to detect the support set of a sparse vector using a hypothesis test in which the posterior densities used in the test are obtained with the aid of belief propagation (BP). Since BP provides precise posterior information using the noise statistics, BHT-BP can recover the support with robustness against the measurement noise. In addition, BHT-BP has low computational cost compared to other algorithms owing to the use of BP. We show the support recovery performance of BHT-BP as a function of the parameters (N, M, K, SNR) and compare the performance of BHT-BP to OMP and Lasso via numerical results.

  18. Spatially Coupled Ensembles Universally Achieve Capacity under Belief Propagation

    CERN Document Server

    Kudekar, Shrinivas; Urbanke, Ruediger

    2012-01-01

    We investigate spatially coupled code ensembles. For transmission over the binary erasure channel, it was recently shown that spatial coupling increases the belief propagation threshold of the ensemble to essentially the maximum a posteriori (MAP) threshold of the underlying component ensemble. This explains why convolutional LDPC ensembles, originally introduced by Felstrom and Zigangirov, perform so well over this channel. We show that the equivalent result holds true for transmission over general binary-input memoryless output-symmetric channels. More precisely, given a desired error probability and a gap to capacity, we can construct a spatially coupled ensemble which fulfills these constraints universally on this class of channels under belief propagation decoding. In fact, most codes in that ensemble have that property. The quantifier universal refers to the single ensemble/code which is good for all channels, but we assume that the channel is known at the receiver. The key technical result is a proof that under b...

  19. Structured sublinear compressive sensing via dense belief propagation

    CERN Document Server

    Dai, Wei; Pham, Hoa Vin

    2011-01-01

    Compressive sensing (CS) is a sampling technique designed for reducing the complexity of sparse data acquisition. Major obstacles to practical deployment of CS techniques are the signal reconstruction time and the high storage cost of random sensing matrices. We propose a new structured compressive sensing scheme, based on codes on graphs, that allows for a joint design of structured sensing matrices and logarithmic-complexity reconstruction algorithms. The compressive sensing matrices can be shown to offer asymptotically optimal performance when used in combination with Orthogonal Matching Pursuit (OMP) methods. For more elaborate greedy reconstruction schemes, we propose a new family of dense list-decoding belief propagation algorithms, as well as reinforced and multiple-basis belief propagation algorithms. Our simulation results indicate that reinforced BP CS schemes offer very good complexity-performance tradeoffs for very sparse signal vectors.

  20. Sequential stratified sampling belief propagation for multiple targets tracking

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Beyond the difficulties posed by the highly non-linear and non-Gaussian observation process and state distribution in single-target tracking, the presence of a large, varying number of targets and their interactions places additional challenges on visual tracking. To overcome these difficulties, we formulate the multiple-target tracking problem in a dynamic Markov network consisting of three coupled Markov random fields that model the joint state of the targets, a binary process for the existence of each individual target, and a binary process for occlusion between pairs of adjacent targets. By introducing two robust functions, we eliminate the two binary processes, and then apply a novel version of belief propagation, called the sequential stratified sampling belief propagation algorithm, to obtain the maximum a posteriori (MAP) estimate in the dynamic Markov network. By using a stratified sampler, we incorporate bottom-up information provided by a learned detector (e.g. an SVM classifier) and belief information into the message updates. Other low-level visual cues (e.g. color and shape) can easily be incorporated into our multi-target tracking model to obtain better tracking results. Experimental results suggest that our method is comparable to the state-of-the-art multiple-target tracking methods in several test cases.

  1. Orbit-product representation and correction of Gaussian belief propagation

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jason K [Los Alamos National Laboratory]; Chertkov, Michael [Los Alamos National Laboratory]; Chernyak, Vladimir [WAYNE STATE UNIV]

    2009-01-01

    We present a new interpretation of Gaussian belief propagation (GaBP) based on the 'zeta function' representation of the determinant as a product over orbits of a graph. We show that GaBP captures back-tracking orbits of the graph and consider how to correct this estimate by accounting for non-backtracking orbits. We show that the product over non-backtracking orbits may be interpreted as the determinant of the non-backtracking adjacency matrix of the graph with edge weights based on the solution of GaBP. An efficient method is proposed to compute a truncated correction factor including all non-backtracking orbits up to a specified length.
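
    The non-backtracking (directed-edge) adjacency matrix referred to here is straightforward to build explicitly. The sketch below constructs it for a small unweighted loopy graph and numerically checks the classical Ihara-Bass determinant identity it enters; the paper's actual correction uses GaBP-derived edge weights, which are not reproduced here.

```python
import itertools
import numpy as np

def non_backtracking_matrix(edges):
    """Build the 2|E| x 2|E| non-backtracking matrix B of an undirected graph:
    B[(a->b), (c->d)] = 1 iff b == c and d != a (a walk may not immediately reverse)."""
    directed = [(a, b) for a, b in edges] + [(b, a) for a, b in edges]
    index = {e: k for k, e in enumerate(directed)}
    B = np.zeros((len(directed), len(directed)))
    for (a, b), (c, d) in itertools.product(directed, repeat=2):
        if b == c and d != a:
            B[index[(a, b)], index[(c, d)]] = 1.0
    return B

# A 4-cycle with a chord: a small graph with several loops.
n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
A = np.zeros((n, n))
for a, b in edges:
    A[a, b] = A[b, a] = 1.0
D = np.diag(A.sum(axis=1))
B = non_backtracking_matrix(edges)

# Ihara-Bass identity: det(I - uB) = (1 - u^2)^(|E|-|V|) det(I - uA + u^2 (D - I)).
u = 0.3
lhs = np.linalg.det(np.eye(len(B)) - u * B)
rhs = (1 - u**2) ** (len(edges) - n) * np.linalg.det(np.eye(n) - u * A + u**2 * (D - np.eye(n)))
print(lhs, rhs)   # the two determinants agree
```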

  2. Belief propagation decoding of quantum channels by passing quantum messages

    Science.gov (United States)

    Renes, Joseph M.

    2017-07-01

    The belief propagation (BP) algorithm is a powerful tool in a wide range of disciplines from statistical physics to machine learning to computational biology, and is ubiquitous in decoding classical error-correcting codes. The algorithm works by passing messages between nodes of the factor graph associated with the code and enables efficient decoding of the channel, in some cases even up to the Shannon capacity. Here we construct the first BP algorithm which passes quantum messages on the factor graph and is capable of decoding the classical-quantum channel with pure state outputs. This gives explicit decoding circuits whose number of gates is quadratic in the code length. We also show that this decoder can be modified to work with polar codes for the pure state channel and as part of a decoder for transmitting quantum information over the amplitude damping channel. These represent the first explicit capacity-achieving decoders for non-Pauli channels.

  3. Markov entropy decomposition: a variational dual for quantum belief propagation

    CERN Document Server

    Poulin, David

    2010-01-01

    We present a lower bound for the free energy of a quantum many-body system at finite temperature. This lower bound is expressed as a convex optimization problem with linear constraints, and is derived using strong subadditivity of von Neumann entropy and a relaxation of the consistency condition of local density operators. The dual to this minimization problem leads to a set of quantum belief propagation equations, thus providing a firm theoretical foundation to that approach. The minimization problem is numerically tractable, and we find good agreement with quantum Monte Carlo for the spin-half Heisenberg anti-ferromagnet in two dimensions. This lower bound complements other variational upper bounds. We discuss applications to Hamiltonian complexity theory and give a generalization of the structure theorem of Hayden, Jozsa, Petz and Winter to trees in an appendix.

  4. Multilevel Decoders Surpassing Belief Propagation on the Binary Symmetric Channel

    CERN Document Server

    Planjery, Shiva Kumar; Chilappagari, Shashi Kiran; Vasić, Bane

    2010-01-01

    In this paper, we propose a new class of quantized message-passing decoders for LDPC codes over the BSC. The messages take values (or levels) from a finite set. The update rules do not mimic belief propagation but instead are derived using the knowledge of trapping sets. We show that the update rules can be derived to correct certain error patterns that are uncorrectable by algorithms such as BP and min-sum. In some cases even with a small message set, these decoders can guarantee correction of a higher number of errors than BP and min-sum. We provide particularly good 3-bit decoders for 3-left-regular LDPC codes. They significantly outperform the BP and min-sum decoders, but more importantly, they achieve this at only a fraction of the complexity of the BP and min-sum decoders.

  5. Identifying optimal targets of network attack by belief propagation

    CERN Document Server

    Mugisha, Salomon

    2016-01-01

    For a network formed by nodes and undirected links between pairs of nodes, the network optimal attack problem aims at deleting a minimum number of target nodes to break the network down into many small components. This problem is intrinsically related to the feedback vertex set problem that was successfully tackled by spin glass theory and an associated belief propagation-guided decimation (BPD) algorithm [H.-J. Zhou, Eur. Phys. J. B 86, 455 (2013)]. In the present work we apply a slightly adjusted version of the BPD algorithm to the network optimal attack problem, and demonstrate that it has much better performance than a recently proposed Collective Information algorithm [F. Morone and H. A. Makse, Nature 524, 65-68 (2015)] for different types of random networks and real-world network instances. The BPD-guided attack scheme often induces an abrupt collapse of the whole network, which may make it very difficult to defend.

  6. TMBP: A Topic Modeling Toolbox Using Belief Propagation

    CERN Document Server

    Zeng, Jia

    2012-01-01

    Latent Dirichlet allocation (LDA) is an important class of hierarchical Bayesian models for probabilistic topic modeling, which attracts worldwide interest and touches on many important applications in text mining, computer vision and computational biology. This paper introduces a topic modeling toolbox (TMBP) based on belief propagation (BP) algorithms. The toolbox is implemented on the MEX C++/MATLAB platform for either Windows or Linux. The current version includes various learning algorithms for latent Dirichlet allocation (LDA), author-topic models (ATM), relational topic models (RTM), and labeled LDA (LaLDA). The toolbox is an ongoing project, and more BP-based learning algorithms for various LDA-based topic models will be added in the near future. Interested readers may also extend this toolbox for solving more complicated topic modeling problems. The source code is freely available under the GNU General Public Licence, Version 1.0 at http://code.google.com/p/tmbp-topicmodel-beliefpropagatio...

  7. Belief Propagation based MIMO Detection Operating on Quantized Channel Output

    CERN Document Server

    Mezghani, Amine

    2010-01-01

    In multiple-antenna communications, as bandwidth and modulation order increase, system components must work with demanding tolerances. In particular, high resolution and high sampling rate analog-to-digital converters (ADCs) are often prohibitively challenging to design. Therefore ADCs for such applications should be low-resolution. This paper provides new insights into the problem of optimal signal detection based on quantized received signals for multiple-input multiple-output (MIMO) channels. It capitalizes on previous works which extensively analyzed the unquantized linear vector channel using graphical inference methods. In particular, a "loopy" belief propagation-like (BP) MIMO detection algorithm, operating on quantized data with low complexity, is proposed. In addition, we study the impact of finite receiver resolution in fading channels in the large-system limit by means of a state evolution analysis of the BP algorithm, which refers to the limit where the number of transmit and receive antennas go t...

  8. Metric Ranking of Invariant Networks with Belief Propagation

    Energy Technology Data Exchange (ETDEWEB)

    Tao, Changxia [Xi'an Jiaotong University, China]; Ge, Yong [University of North Carolina, Charlotte]; Song, Qinbao [Xi'an Jiaotong University, China]; Ge, Yuan [Anhui Polytechnic University, China]; Omitaomu, Olufemi A [ORNL]

    2014-01-01

    The management of large-scale distributed information systems relies on the effective use and modeling of monitoring data collected at various points in the distributed information systems. A promising approach is to discover invariant relationships among the monitoring data and generate invariant networks, where a node is a monitoring data source (metric) and a link indicates an invariant relationship between two monitoring data sources. Such an invariant network representation can help system experts to localize and diagnose system faults by examining the broken invariant relationships and their related metrics, because system faults usually propagate among the monitoring data and eventually lead to some broken invariant relationships. However, at any given time there are usually many broken links (invariant relationships) within an invariant network. Without proper guidance, it is difficult for system experts to manually inspect this large number of broken links. Thus, a critical challenge is how to effectively and efficiently rank the metrics (nodes) of invariant networks according to their anomaly levels. The ranked list of metrics will provide system experts with useful guidance for localizing and diagnosing system faults. To this end, we propose to model the nodes and the broken links as a Markov Random Field (MRF), and develop an iterative algorithm to infer the anomaly of each node based on belief propagation (BP). Finally, we validate the proposed algorithm on both real-world and synthetic data sets to illustrate its effectiveness.

  9. Identifying optimal targets of network attack by belief propagation

    Science.gov (United States)

    Mugisha, Salomon; Zhou, Hai-Jun

    2016-07-01

    For a network formed by nodes and undirected links between pairs of nodes, the network optimal attack problem aims at deleting a minimum number of target nodes to break the network down into many small components. This problem is intrinsically related to the feedback vertex set problem that was successfully tackled by spin-glass theory and an associated belief propagation-guided decimation (BPD) algorithm [Zhou, Eur. Phys. J. B 86, 455 (2013), 10.1140/epjb/e2013-40690-1]. In the present work we apply the BPD algorithm (which has approximately linear time complexity) to the network optimal attack problem and demonstrate that it has much better performance than a recently proposed collective information algorithm [Morone and Makse, Nature 524, 65 (2015), 10.1038/nature14604] for different types of random networks and real-world network instances. The BPD-guided attack scheme often induces an abrupt collapse of the whole network, which may make it very difficult to defend.

  10. Belief Propagation and Bethe approximation for Traffic Prediction

    CERN Document Server

    Furtlehner, Cyril; Lasgouttes, Jean-Marc; De La Fortelle, Arnaud

    2007-01-01

    We define and study an inference algorithm based on "belief propagation" (BP) and the Bethe approximation. The idea is to encode into a graph a priori information composed of correlations or marginal probabilities of variables, and to use a message passing procedure to estimate the actual state from some extra real-time information. This method is originally designed for traffic prediction and is particularly suitable in settings where the only information available is floating car data. We propose a discretized traffic description, based on the Ising model of statistical physics, in order to both reconstruct and predict the traffic in real time. General properties of BP are addressed in this context. In particular, a detailed study of stability is proposed with respect to the a priori data and the graph topology. The behavior of the algorithm is illustrated by numerical studies on a simple traffic toy model. How this approach can be generalized to encode the superposition of many traffic patterns is discussed...
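
    For readers who want to see the bare message-passing loop such a method builds on, here is a minimal sum-product BP sketch for a pairwise Ising model on an arbitrary graph (synchronous updates, marginals only). It illustrates the generic scheme, not the traffic model of the paper.

```python
import numpy as np

def ising_bp(J, h, iters=100, tol=1e-8):
    """Loopy sum-product BP for p(s) ~ exp(sum_ij J_ij s_i s_j + sum_i h_i s_i),
    with spins s_i in {-1, +1}.  Returns the approximate marginals P(s_i = +1)."""
    n = len(h)
    spins = np.array([-1.0, 1.0])
    # m[(i, j)] is the message from i to j, a distribution over s_j.
    m = {(i, j): np.ones(2) / 2 for i in range(n) for j in range(n) if J[i, j] != 0}
    for _ in range(iters):
        new = {}
        for (i, j) in m:
            # Local field of i times all incoming messages except the one from j.
            belief = np.exp(h[i] * spins)
            for k in range(n):
                if J[k, i] != 0 and k != j:
                    belief *= m[(k, i)]
            # Sum over s_i of the pairwise factor exp(J_ij s_i s_j) times that belief.
            pair = np.exp(J[i, j] * np.outer(spins, spins))   # rows: s_i, cols: s_j
            msg = pair.T @ belief
            new[(i, j)] = msg / msg.sum()
        delta = max(np.abs(new[e] - m[e]).max() for e in m)
        m = new
        if delta < tol:
            break
    marginals = np.zeros(n)
    for i in range(n):
        belief = np.exp(h[i] * spins)
        for k in range(n):
            if J[k, i] != 0:
                belief *= m[(k, i)]
        marginals[i] = belief[1] / belief.sum()   # P(s_i = +1)
    return marginals

# Three spins on a triangle with weak couplings and an external field on spin 0.
J = np.array([[0, 0.2, 0.2], [0.2, 0, 0.2], [0.2, 0.2, 0]])
h = np.array([0.5, 0.0, 0.0])
print(ising_bp(J, h))
```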

  11. Vehicle detection from very-high-resolution (VHR) aerial imagery using attribute belief propagation (ABP)

    Science.gov (United States)

    Wang, Yanli; Li, Ying; Zhang, Li; Huang, Yuchun

    2016-10-01

    With the popularity of very-high-resolution (VHR) aerial imagery, the shape, color, and context attributes of vehicles are better characterized. Due to varying road surroundings and imaging conditions, vehicle attributes can be adversely affected, so that vehicles are mistakenly detected or missed. This paper is motivated by the need to robustly extract rich attribute features for detecting vehicles in VHR imagery under different scenarios. Based on a hierarchical component tree of the vehicle context, attribute belief propagation (ABP) is proposed to detect salient vehicles from a statistical perspective. With the Max-tree data structure, the multi-level component tree around the road network is efficiently created. The spatial relationship between a vehicle and its surrounding context is established with the belief definition of the vehicle attribute. To effectively correct single-level belief errors, inter-level belief linkages enforce consistency of belief assignment between corresponding components at different levels. ABP starts from an initial set of vehicle beliefs calculated from vehicle attributes, and then iterates through each component by applying inter-level belief passing until convergence. The optimal vehicle belief of each component is obtained by iteratively minimizing its belief function. The proposed algorithm is tested on a diverse set of VHR imagery acquired in city and inter-city areas of western and southern China. Experimental results show that the proposed algorithm can detect vehicles efficiently while effectively suppressing erroneous detections. The proposed ABP framework is promising for robustly classifying vehicles in VHR aerial imagery.

  12. Relative-Residual-Based Dynamic Schedule for Belief Propagation Decoding of LDPC Codes

    Institute of Scientific and Technical Information of China (English)

    Huang Jie; Zhang Lijun

    2011-01-01

    Two Relative-Residual-based Dynamic Schedules (RRDS) for Belief Propagation (BP) decoding of Low-Density Parity-Check (LDPC) codes are proposed, in which the Variable node RRDS (VN-RRDS) is a greediness-reduced version of the Check node RRDS (CN-RRDS). The RRDS processes only the variable (or check) node with the maximum relative residual among all the variable (or check) nodes in each decoding iteration, thus being less greedy and less complex than the edge-based Variable-to-Check Residual Belief Propagation (VC-RBP) algorithm. Moreover, VN-RRDS first propagates the message with the largest residual over all check equations. For different types of LDPC codes, simulation results show that the convergence rate of RRDS is higher than that of VC-RBP while keeping very low computational complexity. Furthermore, VN-RRDS achieves faster convergence as well as better performance than CN-RRDS.

  13. Linear CDMA Detection Algorithm Based on Statistical Neurodynamics and Belief Propagation and the Stability Conditions

    Science.gov (United States)

    Shojiguchi, A.; Tanaka, T.; Okada, M.

    Recently, a modified algorithm for the code-division multiple-access (CDMA) parallel interference canceler (PIC) was proposed by Tanaka based on statistical neurodynamics. In this paper we apply the modified algorithm to the linear PIC (LPIC) and investigate its stability. We show that the stable (unstable) fixed points of the modified algorithm correspond to the stable (unstable) replica symmetric solutions with the Gaussian prior. We also show that the modified algorithm is a special case of Kabashima's belief-propagation algorithm with a Gaussian prior.

  14. Graph Zeta Function in the Bethe Free Energy and Loopy Belief Propagation

    CERN Document Server

    Watanabe, Yusuke

    2010-01-01

    We propose a new approach to the analysis of Loopy Belief Propagation (LBP) by establishing a formula that connects the Hessian of the Bethe free energy with the edge zeta function. The formula has a number of theoretical implications for LBP. It is applied to give a sufficient condition for the Hessian of the Bethe free energy to be positive definite, which shows non-convexity for graphs with multiple cycles. The formula clarifies the relation between the local stability of a fixed point of LBP and local minima of the Bethe free energy. We also propose a new approach to the uniqueness of the LBP fixed point, and show various conditions for uniqueness.
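
    For reference, the (unweighted) graph zeta function in question admits the following standard product and determinant forms; this is a reminder of textbook definitions, not the paper's specific weighted edge zeta function.

```latex
\zeta_G(u)^{-1}
  = \prod_{p \in \mathcal{P}} \bigl(1 - u^{|p|}\bigr)
  = \det\bigl(I - u B\bigr)
  = (1 - u^2)^{|E| - |V|}\,\det\bigl(I - u A + u^2 (D - I)\bigr)
```

    Here the product runs over the prime cycles of G (equivalence classes of closed non-backtracking walks), B is the non-backtracking edge adjacency matrix, A the adjacency matrix, and D the diagonal degree matrix; the last equality is the Ihara-Bass formula.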

  15. Belief-Propagation-Approximated Decoding of Low-Density Parity-Check Codes

    Institute of Scientific and Technical Information of China (English)

    SONG Hui-shi; ZHANG Ping

    2004-01-01

    In this paper, we propose a new reduced-complexity decoding algorithm for Low-Density Parity-Check (LDPC) codes, called the Belief-Propagation-Approximated (BPA) algorithm, which utilizes the idea of normalization and approximately replaces the intricate nonlinear operation at the check nodes of the original BP algorithm with a single table lookup. The normalization factors can be obtained by simulation or theoretically. Simulation results demonstrate that the BPA algorithm exhibits fairly satisfactory bit error performance on the Additive White Gaussian Noise (AWGN) channel.
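
    The flavour of this check-node simplification can be sketched as follows: the sum-product check-node update uses phi(x) = -ln tanh(x/2) (which is its own inverse), and the transcendental evaluations are replaced by indexing a precomputed table, with an optional normalization factor. This is a hedged illustration only; the paper's exact table construction and normalization factors are obtained by simulation.

```python
import numpy as np

# Precomputed lookup table for phi(x) = -ln(tanh(x/2)) on a uniform grid.
GRID = np.linspace(1e-3, 20.0, 2048)
PHI_TABLE = -np.log(np.tanh(GRID / 2.0))

def phi_lookup(x):
    """Evaluate phi via table lookup (nearest grid point) instead of transcendental calls."""
    idx = np.clip(np.searchsorted(GRID, x), 0, len(GRID) - 1)
    return PHI_TABLE[idx]

def check_node_update(incoming_llrs, alpha=0.95):
    """Sum-product style check-node update using the phi table, scaled by a
    normalization factor alpha.  incoming_llrs are the LLRs on all other edges of
    the check node; the return value is the outgoing LLR on the remaining edge."""
    mags = np.abs(np.asarray(incoming_llrs, dtype=float))
    sign = np.prod(np.sign(incoming_llrs))
    return alpha * sign * phi_lookup(np.sum(phi_lookup(mags)))

print(check_node_update([1.2, -0.7, 2.5]))
```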

  16. Generalized Belief Propagation for the Noiseless Capacity and Information Rates of Run-Length Limited Constraints

    CERN Document Server

    Sabato, Giovanni

    2011-01-01

    The performance of the generalized belief propagation algorithm for computing the noiseless capacity and mutual information rates of finite-size two-dimensional and three-dimensional run-length limited constraints is investigated. For each constraint, a method is proposed to choose the basic regions and to construct the region graph. Simulation results for the capacity of different constraints as a function of the size of the channel and mutual information rates of different constraints as a function of signal-to-noise ratio are reported. Convergence to the Shannon capacity is also discussed.
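
    As a point of reference for the numbers such region-graph methods should reproduce, the noiseless capacity of a one-dimensional (d, k) run-length-limited constraint follows exactly from the largest eigenvalue of its transfer matrix. A small sketch (exact transfer-matrix computation, not GBP, and only for the 1-D case):

```python
import numpy as np

def rll_capacity(d, k):
    """Noiseless capacity (bits/symbol) of the 1-D (d, k) run-length-limited constraint,
    i.e. log2 of the largest eigenvalue of the constraint's transfer matrix."""
    # State s = number of 0s emitted since the last 1 (s = 0..k).
    T = np.zeros((k + 1, k + 1))
    for s in range(k + 1):
        if s + 1 <= k:      # emit a 0: the zero-run grows by one
            T[s, s + 1] = 1.0
        if s >= d:          # emit a 1: allowed only after at least d zeros
            T[s, 0] = 1.0
    return float(np.log2(max(abs(np.linalg.eigvals(T)))))

# For d = 1 and large k this approaches log2 of the golden ratio, about 0.6942.
print(rll_capacity(1, 30))
```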

  17. Merging Belief Propagation and the Mean Field Approximation: A Free Energy Approach

    DEFF Research Database (Denmark)

    Riegler, Erwin; Kirkelund, Gunvor Elisabeth; Manchón, Carles Navarro

    2013-01-01

    We present a joint message passing approach that combines belief propagation and the mean field approximation. Our analysis is based on the region-based free energy approximation method proposed by Yedidia et al. We show that the message passing fixed-point equations obtained with this combination correspond to stationary points of a constrained region-based free energy approximation. Moreover, we present a convergent implementation of these message passing fixed-point equations provided that the underlying factor graph fulfills certain technical conditions. In addition, we show how to include hard......

  18. Combination of Annealing Particle Filter and Belief Propagation for 3D Upper Body Tracking

    Directory of Open Access Journals (Sweden)

    Ilaria Renna

    2012-01-01

    Full Text Available 3D upper body pose estimation is a topic widely studied by the computer vision community because it is useful in a great number of applications, mainly for human-robot interaction, including communication with companion robots. However, there is a challenging problem: the complexity of classical algorithms, which increases exponentially with the dimension of the state vector, becomes too difficult to handle. To tackle this problem, we propose a new approach that combines several annealing particle filters, defined independently for each limb, with a belief propagation method that adds geometrical constraints between the individual filters. Experimental results on a real human gesture sequence show that this combined approach leads to reliable results.

  19. A Gaussian Belief Propagation Solver for Large Scale Support Vector Machines

    CERN Document Server

    Bickson, Danny; Dolev, Danny

    2008-01-01

    Support vector machines (SVMs) are an extremely successful type of classification and regression algorithm. Building an SVM entails solving a constrained convex quadratic programming problem, which is quadratic in the number of training samples. We introduce an efficient parallel implementation of a support vector regression solver, based on the Gaussian Belief Propagation algorithm (GaBP). In this paper, we demonstrate that methods from the complex systems domain can be utilized for performing efficient distributed computation. We compare the proposed algorithm to previously proposed distributed and single-node SVM solvers. Our comparison shows that the proposed algorithm is just as accurate as these solvers, while being significantly faster, especially for large datasets. We demonstrate scalability of the proposed algorithm to up to 1,024 computing nodes and hundreds of thousands of data points using an IBM Blue Gene supercomputer. As far as we know, our work is the largest parallel implementation of bel...

  20. Belief Propagation for Min-cost Network Flow: Convergence and Correctness

    CERN Document Server

    Gamarnik, David; Wei, Yehua

    2010-01-01

    Message passing algorithms such as the so-called Belief Propagation algorithm have recently gained a lot of attention in the statistics, signal processing and machine learning communities as attractive algorithms for solving a variety of optimization and inference problems. As a decentralized, easy-to-implement and empirically successful algorithm, BP deserves attention from the theoretical standpoint, yet not much is known at the present stage. In order to fill this gap we consider the performance of the BP algorithm in the context of the capacitated minimum-cost network flow problem, a classical problem in operations research. We prove that BP converges to the optimal solution in pseudo-polynomial time, provided that the optimal solution of the underlying problem is unique and the problem input is integral. Moreover, we present a simple modification of the BP algorithm which gives a fully polynomial-time randomized approximation scheme (FPRAS) for the same problem, which no longer...

  1. Low Complexity Approach for High Throughput Belief-Propagation based Decoding of LDPC Codes

    Directory of Open Access Journals (Sweden)

    BOT, A.

    2013-11-01

    Full Text Available The paper proposes a low-complexity belief propagation (BP) based decoding algorithm for LDPC codes. In spite of the iterative nature of the decoding process, the proposed algorithm provides both reduced complexity and improved BER performance compared with the classic min-sum (MS) algorithm generally used for hardware implementations. Linear approximations of the check-node update function are used in order to reduce the complexity of the BP algorithm. Considering this decoding approach, an FPGA-based hardware architecture is proposed for implementing the decoding algorithm, aiming to increase the decoder throughput. FPGA technology was chosen for the LDPC decoder implementation due to its parallel computation and reconfiguration capabilities. The obtained results show improvements in decoding throughput and BER performance compared with state-of-the-art approaches.

  2. The Directed Dominating Set Problem: Generalized Leaf Removal and Belief Propagation

    CERN Document Server

    Habibulla, Yusupjan; Zhou, Hai-Jun

    2015-01-01

    A minimum dominating set for a digraph (directed graph) is a smallest set of vertices such that each vertex either belongs to this set or has at least one parent vertex in this set. We solve this hard combinatorial optimization problem approximately by a local algorithm of generalized leaf removal and by a message-passing algorithm of belief propagation. These algorithms can construct near-optimal dominating sets or even exact minimum dominating sets for random digraphs and also for real-world digraph instances. We further develop a core percolation theory and a replica-symmetric spin glass theory for this problem. Our algorithmic and theoretical results may facilitate applications of dominating sets to various network problems involving directed interactions.

  3. Design of Belief Propagation Based on FPGA for the Multistereo CAFADIS Camera

    Directory of Open Access Journals (Sweden)

    José Manuel Rodríguez-Ramos

    2010-10-01

    Full Text Available In this paper we describe a fast, specialized hardware implementation of the belief propagation algorithm for the CAFADIS camera, a new plenoptic sensor patented by the University of La Laguna. This camera captures the lightfield of the scene and can be used to find out at which depth each pixel is in focus. The algorithm has been designed for FPGA devices using VHDL. We propose a parallel and pipelined architecture to implement the algorithm without external memory. Although the BRAM resources of the device increase considerably, we can maintain real-time constraints by using the extremely high-performance signal processing capability of the device through parallelism and by accessing several memories simultaneously. Quantitative results with 16-bit precision show that performance is very close to that of the original Matlab implementation of the algorithm.

  4. CURRENT STATUS OF NONPARAMETRIC STATISTICS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-02-01

    Full Text Available Nonparametric statistics is one of the five points of growth of applied mathematical statistics. Despite the large number of publications on specific issues of nonparametric statistics, the internal structure of this research direction has remained undeveloped. The purpose of this article is to consider the division of nonparametric statistics into sub-areas, based on the existing practice of scientific activity, and to classify investigations of nonparametric statistical methods. Nonparametric statistics allows one to make statistical inferences, in particular to estimate characteristics of a distribution and to test statistical hypotheses, without the usually weakly justified assumption that the distribution functions of the samples belong to a particular parametric family. For example, there is a widespread belief that statistical data often follow the normal distribution. Meanwhile, analysis of observation results, in particular of measurement errors, always leads to the same conclusion: in most cases the actual distribution differs significantly from normal. Uncritical use of the hypothesis of normality often leads to significant errors, for example in the rejection of outlying observations (outliers), in statistical quality control, and in other cases. Therefore, it is advisable to use nonparametric methods, in which only weak requirements are imposed on the distribution functions of the observations. Usually only their continuity is assumed. On the basis of a generalization of numerous studies, it can be stated that to date nonparametric methods can solve almost the same range of problems as parametric methods. Certain statements in the literature, namely that nonparametric methods have less power or require larger sample sizes than parametric methods, are incorrect. Note that in nonparametric statistics, as in mathematical statistics in general, there remain a number of unresolved problems.
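
    The practical point of the abstract can be illustrated in a few lines: instead of relying on a normality assumption, one can use a rank-based test that only assumes continuous distributions. A generic SciPy sketch (an illustration, not taken from the article):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Heavy-tailed (Cauchy) samples, the second shifted by 0.5: clearly non-normal data.
x = rng.standard_cauchy(200)
y = rng.standard_cauchy(200) + 0.5

# The t-test leans on normality / finite-variance assumptions that fail here,
# while the Mann-Whitney U test only assumes continuous distributions.
print(stats.ttest_ind(x, y))       # parametric, unreliable for Cauchy data
print(stats.mannwhitneyu(x, y))    # nonparametric rank-based alternative
```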

  5. A Line-Based Adaptive-Weight Matching Algorithm Using Loopy Belief Propagation

    Directory of Open Access Journals (Sweden)

    Hui Li

    2015-01-01

    Full Text Available In traditional adaptive-weight stereo matching, the rectangular support region leads to excessive memory consumption and computation time. We propose a novel line-based stereo matching algorithm for obtaining a more accurate disparity map with low computational complexity. The algorithm can be divided into two steps: disparity map initialization and disparity map refinement. In the initialization step, a new adaptive-weight model based on a linear support region is put forward for cost aggregation. In this model, a neural network is used to evaluate spatial proximity, and the mean-shift segmentation method is used to improve the accuracy of color similarity; the Birchfield pixel dissimilarity function and the census transform are adopted to establish the dissimilarity measurement function. The initial disparity map is then obtained by loopy belief propagation. In the refinement step, the disparity map is optimized by an iterative left-right consistency checking method and a segmentation voting method. The parameter values involved in the algorithm are determined through extensive simulation experiments to further improve the matching quality. Simulation results indicate that the new matching method performs well on standard stereo benchmarks and that its running time is remarkably lower than that of algorithms with rectangular support regions.

  6. A Generalization Belief Propagation Decoding Algorithm for Polar Codes Based on Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Yingxian Zhang

    2014-01-01

    Full Text Available We propose a generalized belief propagation (BP) decoding algorithm based on particle swarm optimization (PSO) to improve the performance of polar codes. Through an analysis of the existing BP decoding algorithm, we first introduce a probability modifying factor at each node of the BP decoder, so as to enhance the error-correcting capability of the decoding. Then, we generalize the BP decoding algorithm based on these modifying factors and derive the probability update equations for the proposed decoding. Based on the new probability update equations, we show the intrinsic relationship among the existing decoding algorithms. Finally, in order to achieve the best performance, we formulate an optimization problem to find the optimal probability modifying factors for the proposed decoding algorithm. Furthermore, a method based on a modified PSO algorithm is introduced to solve this optimization problem. Numerical results show that the proposed generalized BP decoding algorithm achieves better performance than the existing BP decoding, which demonstrates the effectiveness of the proposed decoding algorithm.

  7. The role of idiotypic interactions in the adaptive immune system: a belief-propagation approach

    Science.gov (United States)

    Bartolucci, Silvia; Mozeika, Alexander; Annibale, Alessia

    2016-08-01

    In this work we use belief-propagation techniques to study the equilibrium behaviour of a minimal model for the immune system comprising interacting T and B clones. We investigate the effect of the so-called idiotypic interactions among complementary B clones on the system’s activation. Our results show that B-B interactions increase the system’s resilience to noise, making clonal activation more stable, while increasing the cross-talk between different clones. We derive analytically the noise level at which a B clone gets activated, in the absence of cross-talk, and find that this increases with the strength of idiotypic interactions and with the number of T cells sending signals to the B clones. We also derive, analytically and numerically, via population dynamics, the critical line where clonal cross-talk arises. Our approach allows us to derive the B clone size distribution, which can be experimentally measured and gives important information about the adaptive immune system response to antigens and vaccination.

  8. Automatic segmentation of kidneys from non-contrast CT images using efficient belief propagation

    Science.gov (United States)

    Liu, Jianfei; Linguraru, Marius George; Wang, Shijun; Summers, Ronald M.

    2013-03-01

    CT colonography (CTC) can increase the chance of detecting high-risk lesions not only within the colon but anywhere in the abdomen, at a low cost. Extracolonic findings such as calculi and masses are frequently found in the kidneys on CTC. Accurate kidney segmentation is an important step in detecting extracolonic findings in the kidneys. However, non-contrast CTC images make the task of kidney segmentation substantially challenging because the intensity values of kidney parenchyma are similar to those of adjacent structures. In this paper, we present a fully automatic kidney segmentation algorithm to support extracolonic diagnosis from CTC data. It is built upon three major contributions: 1) localizing kidney search regions by exploiting the segmented liver and spleen as well as body symmetry; 2) constructing a probabilistic shape prior that handles the issue of kidneys touching other organs; 3) employing efficient belief propagation on the shape prior to extract the kidneys. We evaluated the accuracy of our algorithm on five non-contrast CTC datasets with manual kidney segmentation as the ground truth. The Dice volume overlaps were 88%/89%, the root-mean-squared errors were 3.4 mm/2.8 mm, and the average surface distances were 2.1 mm/1.9 mm for the left/right kidney, respectively. We also validated the robustness on 27 additional CTC cases, and 23 datasets were successfully segmented. In four problematic cases, the segmentation of the left kidney failed due to problems with the spleen segmentation. The results demonstrate that the proposed algorithm can automatically and accurately segment kidneys from CTC images, given prior correct segmentation of the liver and spleen.

  9. Iterative Receiver Design for ISI Channels Using Combined Belief- and Expectation-Propagation

    DEFF Research Database (Denmark)

    Sun, Peng; Zhang, Chuanzong; Wang, Zhongyong

    2015-01-01

    the underlying probabilistic model. We also present a simple but efficient method to cope with the "negative variance" problem of expectation propagation. Simulation results show that the proposed algorithm outperforms, in terms of bit-error-rate and convergence rate, an LMMSE turbo-equalizer based on Gaussian...
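
    The record above mentions a simple remedy for the "negative variance" problem of expectation propagation. The sketch below shows one common safeguard, skipping a Gaussian division whose result would have non-positive precision; this particular guard is an assumption for illustration, not necessarily the method proposed in the paper.

        def divide_gaussians(mean_num, var_num, mean_den, var_den, min_prec=1e-8):
            """Divide two Gaussians N(mean_num, var_num) / N(mean_den, var_den).

            Such divisions arise when forming EP cavity/site approximations and
            can yield a negative variance.  If the resulting precision is not
            safely positive, the update is skipped and the numerator is kept.
            """
            prec = 1.0 / var_num - 1.0 / var_den           # precision of the quotient
            lin = mean_num / var_num - mean_den / var_den  # precision-weighted mean
            if prec < min_prec:                            # "negative variance" case
                return mean_num, var_num                   # skip the problematic update
            return lin / prec, 1.0 / prec

        print(divide_gaussians(0.3, 0.5, 0.1, 0.4))   # var_num > var_den: guarded
        print(divide_gaussians(0.3, 0.4, 0.1, 0.5))   # well-defined division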

  10. VARIABLE NON-UNIFORM QUANTIZED BELIEF PROPAGATION ALGORITHM FOR LDPC DECODING

    Institute of Scientific and Technical Information of China (English)

    Liu Binbin; Bai Dong; Mei Shunliang

    2008-01-01

    Non-uniform quantization for messages in Low-Density Parity-Check (LDPC) decoding can reduce implementation complexity and mitigate performance loss. But the distribution of messages varies in the iterative decoding. This letter proposes a variable non-uniform quantized Belief Propagation (BP) algorithm. The BP decoding is analyzed by density evolution with Gaussian approximation. Since the probability density of messages can be well approximated by Gaussian distribution, by the unbiased estimation of variance, the distribution of messages can be tracked during the iteration. Thus the non-uniform quantization scheme can be optimized to minimize the distortion. Simulation results show that the variable non-uniform quantization scheme can achieve better error rate performance and faster decoding convergence than the conventional non-uniform quantization and uniform quantization schemes.
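
    Since the messages are modelled as approximately Gaussian with a tracked variance, a distortion-minimising non-uniform quantizer can be designed with the classical Lloyd-Max iteration. The sketch below is a generic illustration under that Gaussian assumption; the paper's specific quantizer design is not reproduced.

        import numpy as np

        def lloyd_max_levels(samples, n_levels=8, n_iter=50):
            """Lloyd-Max design of a non-uniform scalar quantizer.

            Alternates nearest-level assignment and centroid updates to minimise
            mean squared quantisation distortion for the given sample distribution.
            """
            levels = np.quantile(samples, np.linspace(0.05, 0.95, n_levels))
            for _ in range(n_iter):
                thresholds = (levels[:-1] + levels[1:]) / 2.0   # cell boundaries
                idx = np.searchsorted(thresholds, samples)
                for k in range(n_levels):
                    cell = samples[idx == k]
                    if cell.size:
                        levels[k] = cell.mean()                 # centroid update
            return levels

        rng = np.random.default_rng(0)
        sigma = 2.0                                   # variance tracked during decoding
        messages = rng.normal(0.0, sigma, 20000)      # Gaussian-approximated messages
        print(np.round(lloyd_max_levels(messages), 3))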

  11. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  12. Quantal Response: Nonparametric Modeling

    Science.gov (United States)

    2017-01-01

    Approved for public release; distribution is unlimited. The report develops nonparametric quantal-response (QR) models, including spline-based fits compared with logistic regression, for the relationship between stimulus and probability of response. The Generalized Linear Model approach does not make use of the limit distribution but allows arbitrary functional forms. Appendices treat the linear model and the generalized linear model.

  13. Nonparametric statistical methods

    CERN Document Server

    Hollander, Myles; Chicken, Eric

    2013-01-01

    Praise for the Second Edition: "This book should be an essential part of the personal library of every practicing statistician."-Technometrics. Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given situation.

  14. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  15. A belief-propagation-based decoding method for two-dimensional barcodes with monochrome auxiliary lines robust against non-uniform geometric distortion

    Science.gov (United States)

    Kamizuru, Kohei; Nakamura, Kazuya; Kawasaki, Hiroshi; Ono, Satoshi

    2017-03-01

    Two-dimensional (2D) codes are widely used in various fields such as production, logistics, and marketing thanks to their larger capacity compared with one-dimensional barcodes. However, they are subject to distortion when printed on non-rigid materials, such as paper and cloth. Although general 2D code decoders correct uniform distortion such as perspective distortion, it is difficult to correct non-uniform and irregular distortion of the 2D code itself. This paper proposes a decoding method for the 2D code which models monochrome auxiliary line recognition as a Markov random field and solves it using belief propagation.
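
    For readers unfamiliar with the inference step, the following sketch runs sum-product belief propagation on a small chain-structured Markov random field; the auxiliary-line model in the paper is a more general MRF, and the potentials below are invented for illustration.

        import numpy as np

        def chain_bp_marginals(unary, pairwise):
            """Exact sum-product BP on a chain MRF.

            unary    : (n, k) non-negative potentials for n nodes and k labels
            pairwise : (k, k) potential shared by neighbouring nodes
            Returns the per-node marginal distributions.
            """
            n, k = unary.shape
            fwd = np.ones((n, k))                     # left-to-right messages
            bwd = np.ones((n, k))                     # right-to-left messages
            for i in range(1, n):
                m = (unary[i - 1] * fwd[i - 1]) @ pairwise
                fwd[i] = m / m.sum()
            for i in range(n - 2, -1, -1):
                m = pairwise @ (unary[i + 1] * bwd[i + 1])
                bwd[i] = m / m.sum()
            beliefs = unary * fwd * bwd
            return beliefs / beliefs.sum(axis=1, keepdims=True)

        unary = np.array([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]])   # noisy label evidence
        pairwise = np.array([[0.8, 0.2], [0.2, 0.8]])            # smoothness prior
        print(chain_bp_marginals(unary, pairwise))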

  16. Nonparametric Predictive Regression

    OpenAIRE

    Ioannis Kasparis; Elena Andreou; Phillips, Peter C.B.

    2012-01-01

    A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors including stationary as well as non-stationary fractional and near-unit-root processes.

  17. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  18. Top-down feedback in an HMAX-like cortical model of object perception based on hierarchical Bayesian networks and belief propagation.

    Directory of Open Access Journals (Sweden)

    Salvador Dura-Bernal

    Full Text Available Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward recognition and feedback modulation. The framework explains both psychophysical and physiological experimental data and maps well onto the hierarchical distributed cortical anatomy. However, the complexity required to model cortical processes makes inference, even using approximate methods, very computationally expensive. Thus, existing object perception models based on this approach are typically limited to tree-structured networks with no loops, use small toy examples or fail to account for certain perceptual aspects such as invariance to transformations or feedback reconstruction. In this study we develop a Bayesian network with an architecture similar to that of HMAX, a biologically-inspired hierarchical model of object recognition, and use loopy belief propagation to approximate the model operations (selectivity and invariance). Crucially, the resulting Bayesian network extends the functionality of HMAX by including top-down recursive feedback. Thus, the proposed model not only achieves successful feedforward recognition invariant to noise, occlusions, and changes in position and size, but is also able to reproduce modulatory effects such as illusory contour completion and attention. Our novel and rigorous methodology covers key aspects such as learning using a layerwise greedy algorithm, combining feedback information from multiple parents and reducing the number of operations required. Overall, this work extends an established model of object recognition to include high-level feedback modulation, based on state-of-the-art probabilistic approaches. The methodology employed, consistent with evidence from the visual cortex, can be potentially generalized to build models of hierarchical perceptual organization that include top-down and bottom-up processing.

  19. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved and real applications are illustrated using examples. Theories and exercises are provided. The incorrect use of many tests applying most statistical software is highlighted and discussed.

  20. Nonparametric statistical methods using R

    CERN Document Server

    Kloke, John

    2014-01-01

    A Practical Guide to Implementing Nonparametric and Rank-Based ProceduresNonparametric Statistical Methods Using R covers traditional nonparametric methods and rank-based analyses, including estimation and inference for models ranging from simple location models to general linear and nonlinear models for uncorrelated and correlated responses. The authors emphasize applications and statistical computation. They illustrate the methods with many real and simulated data examples using R, including the packages Rfit and npsm.The book first gives an overview of the R language and basic statistical c

  1. Belief Consensus Algorithms for Distributed Target Tracking in Wireless Sensor Networks

    CERN Document Server

    Savic, Vladimir; Zazo, Santiago

    2012-01-01

    In distributed target tracking in wireless sensor networks (WSN), agreement on the target state is usually achieved by the construction and maintenance of a communication path. Such an approach lacks robustness to failures and is not applicable to asynchronous networks. Recently, methods have been proposed that can solve these problems using consensus algorithms. However, these methods suffer from at least one of the following problems: i) they do not use the fastest consensus methods, and ii) they cannot handle all parametric and nonparametric likelihood functions. In this paper, we propose a general framework for target tracking using distributed particle filtering (DPF) based on three asynchronous belief consensus (BC) algorithms: standard belief consensus (SBC), broadcast gossip (BG), and belief propagation (BP). Since DPF can also be solved (without consensus) by exchanging the observed data, we determine under which conditions BC-based methods are preferred. Finally, we perform extensive simulations to analyze...
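
    A minimal sketch of the standard belief consensus idea used in such frameworks: each sensor repeatedly averages its local log-likelihood with its neighbours' values using Metropolis weights, so every node converges to the network-wide average. The network topology and likelihood values below are hypothetical.

        import numpy as np

        def metropolis_weights(adj):
            """Metropolis-Hastings consensus weights for an undirected graph."""
            n = adj.shape[0]
            deg = adj.sum(axis=1)
            W = np.zeros((n, n))
            for i in range(n):
                for j in range(n):
                    if adj[i, j]:
                        W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
            np.fill_diagonal(W, 1.0 - W.sum(axis=1))
            return W

        def belief_consensus(local_log_liks, adj, n_iter=50):
            """Iterative neighbour averaging of per-sensor log-likelihoods."""
            W = metropolis_weights(adj)
            x = np.array(local_log_liks, dtype=float)
            for _ in range(n_iter):
                x = W @ x
            return x

        adj = np.array([[0, 1, 0, 1],      # 4-sensor ring network
                        [1, 0, 1, 0],
                        [0, 1, 0, 1],
                        [1, 0, 1, 0]])
        print(belief_consensus([-1.2, -0.4, -2.0, -0.9], adj))   # entries approach the mean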

  2. Semi- and Nonparametric ARCH Processes

    Directory of Open Access Journals (Sweden)

    Oliver B. Linton

    2011-01-01

    Full Text Available ARCH/GARCH modelling has been successfully applied in empirical finance for many years. This paper surveys the semiparametric and nonparametric methods in univariate and multivariate ARCH/GARCH models. First, we introduce some specific semiparametric models and investigate the semiparametric and nonparametric estimation techniques applied to the error density, the functional form of the volatility function, the relationship between mean and variance, long memory processes, locally stationary processes, continuous time processes and multivariate models. The second part of the paper concerns the general properties of such processes, including stationarity conditions, ergodicity conditions and mixing conditions. The last part is on the estimation methods for ARCH/GARCH processes.

  3. Nonparametric estimation of ultrasound pulses

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Leeman, Sidney

    1994-01-01

    An algorithm for nonparametric estimation of 1D ultrasound pulses in echo sequences from human tissues is derived. The technique is a variation of the homomorphic filtering technique using the real cepstrum, and the underlying basis of the method is explained. The algorithm exploits a priori...
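
    The homomorphic/real-cepstrum idea underlying the record above can be sketched as follows: convolution of the pulse with the tissue reflection sequence becomes addition in the log-spectral domain, so low-quefrency liftering gives a crude pulse estimate. The toy echo sequence is invented for illustration and the sketch omits the refinements of the actual algorithm.

        import numpy as np

        def real_cepstrum(signal):
            """Real cepstrum: inverse FFT of the log magnitude spectrum."""
            log_mag = np.log(np.abs(np.fft.fft(signal)) + 1e-12)   # avoid log(0)
            return np.real(np.fft.ifft(log_mag))

        def lowpass_lifter(cepstrum, cutoff):
            """Keep only the low-quefrency part (a crude pulse contribution)."""
            liftered = np.zeros_like(cepstrum)
            liftered[:cutoff] = cepstrum[:cutoff]
            liftered[-cutoff + 1:] = cepstrum[-cutoff + 1:]        # keep symmetry
            return liftered

        # Toy echo: a short pulse convolved with a sparse reflection train
        rng = np.random.default_rng(1)
        pulse = np.sin(2 * np.pi * 0.2 * np.arange(16)) * np.hanning(16)
        reflections = np.zeros(256)
        reflections[rng.choice(256, 20)] = rng.normal(size=20)
        echo = np.convolve(reflections, pulse, mode="same")
        print(lowpass_lifter(real_cepstrum(echo), 16)[:5])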

  4. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of H.-G. Müller and U. Stadtmüller [Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100].

  5. Non-Parametric Inference in Astrophysics

    CERN Document Server

    Wasserman, L H; Nichol, R C; Genovese, C; Jang, W; Connolly, A J; Moore, A W; Schneider, J; Wasserman, Larry; Miller, Christopher J.; Nichol, Robert C.; Genovese, Chris; Jang, Woncheol; Connolly, Andrew J.; Moore, Andrew W.; Schneider, Jeff; group, the PICA

    2001-01-01

    We discuss non-parametric density estimation and regression for astrophysics problems. In particular, we show how to compute non-parametric confidence intervals for the location and size of peaks of a function. We illustrate these ideas with recent data on the Cosmic Microwave Background. We also briefly discuss non-parametric Bayesian inference.

  6. Probabilistic Volcanic Multi-Hazard Assessment at Somma-Vesuvius (Italy): coupling Bayesian Belief Networks with a physical model for lahar propagation

    Science.gov (United States)

    Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry

    2017-04-01

    Volcanoes are extremely complex physico-chemical systems where magma formed at depth breaks into the planet's surface resulting in major hazards from local to global scales. Volcano physics are dominated by non-linearities, and complicated spatio-temporal interrelationships which make volcanic hazards stochastic (i.e. not deterministic) by nature. In this context, probabilistic assessments are required to quantify the large uncertainties related to volcanic hazards. Moreover, volcanoes are typically multi-hazard environments where different hazardous processes can occur whether simultaneously or in succession. In particular, explosive volcanoes are able to accumulate, through tephra fallout and Pyroclastic Density Currents (PDCs), large amounts of pyroclastic material into the drainage basins surrounding the volcano. This addition of fresh particulate material alters the local/regional hydrogeological equilibrium and increases the frequency and magnitude of sediment-rich aqueous flows, commonly known as lahars. The initiation and volume of rain-triggered lahars may depend on: rainfall intensity and duration; antecedent rainfall; terrain slope; thickness, permeability and hydraulic diffusivity of the tephra deposit; etc. Quantifying these complex interrelationships (and their uncertainties), in a tractable manner, requires a structured but flexible probabilistic approach. A Bayesian Belief Network (BBN) is a directed acyclic graph that allows the representation of the joint probability distribution for a set of uncertain variables in a compact and efficient way, by exploiting unconditional and conditional independences between these variables. Once constructed and parametrized, the BBN uses Bayesian inference to perform causal (e.g. forecast) and/or evidential reasoning (e.g. explanation) about query variables, given some evidence. In this work, we illustrate how BBNs can be used to model the influence of several variables on the generation of rain-triggered lahars

  7. Nonparametric Inference for Periodic Sequences

    KAUST Repository

    Sun, Ying

    2012-02-01

    This article proposes a nonparametric method for estimating the period and values of a periodic sequence when the data are evenly spaced in time. The period is estimated by a "leave-out-one-cycle" version of cross-validation (CV) and complements the periodogram, a widely used tool for period estimation. The CV method is computationally simple and implicitly penalizes multiples of the smallest period, leading to a "virtually" consistent estimator of integer periods. This estimator is investigated both theoretically and by simulation. We also propose a nonparametric test of the null hypothesis that the data have constant mean against the alternative that the sequence of means is periodic. Finally, our methodology is demonstrated on three well-known time series: the sunspots and lynx trapping data, and the El Niño series of sea surface temperatures. © 2012 American Statistical Association and the American Society for Quality.
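
    A minimal sketch of the "leave-out-one-cycle" cross-validation score for a candidate period: the series is folded into cycles, each cycle is predicted by the mean of the remaining cycles, and the candidate minimising the prediction error is selected. The toy sinusoidal series is invented for illustration.

        import numpy as np

        def cv_period_score(y, period):
            """Leave-out-one-cycle CV score for a candidate integer period."""
            n_cycles = len(y) // period
            if n_cycles < 2:
                return np.inf
            folded = np.asarray(y[: n_cycles * period]).reshape(n_cycles, period)
            err = 0.0
            for i in range(n_cycles):
                estimate = np.delete(folded, i, axis=0).mean(axis=0)  # held-out cycle
                err += np.sum((folded[i] - estimate) ** 2)
            return err / (n_cycles * period)

        rng = np.random.default_rng(2)
        t = np.arange(240)
        y = np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=t.size)
        scores = {p: cv_period_score(y, p) for p in range(2, 40)}
        print(min(scores, key=scores.get))     # typically recovers the true period 12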

  8. Nonparametric Econometrics: The np Package

    Directory of Open Access Journals (Sweden)

    Tristen Hayfield

    2008-07-01

    Full Text Available We describe the R np package via a series of applications that may be of interest to applied econometricians. The np package implements a variety of nonparametric and semiparametric kernel-based estimators that are popular among econometricians. There are also procedures for nonparametric tests of significance and consistent model specification tests for parametric mean regression models and parametric quantile regression models, among others. The np package focuses on kernel methods appropriate for the mix of continuous, discrete, and categorical data often found in applied settings. Data-driven methods of bandwidth selection are emphasized throughout, though we caution the user that data-driven bandwidth selection methods can be computationally demanding.

  9. Astronomical Methods for Nonparametric Regression

    Science.gov (United States)

    Steinhardt, Charles L.; Jermyn, Adam

    2017-01-01

    I will discuss commonly used techniques for nonparametric regression in astronomy. We find that several of them, particularly running averages and running medians, are generically biased, asymmetric between dependent and independent variables, and perform poorly in recovering the underlying function, even when errors are present only in one variable. We then examine less-commonly used techniques such as Multivariate Adaptive Regressive Splines and Boosted Trees and find them superior in bias, asymmetry, and variance both theoretically and in practice under a wide range of numerical benchmarks. In this context the chief advantage of the common techniques is runtime, which even for large datasets is now measured in microseconds compared with milliseconds for the more statistically robust techniques. This points to a tradeoff between bias, variance, and computational resources which in recent years has shifted heavily in favor of the more advanced methods, primarily driven by Moore's Law. Along these lines, we also propose a new algorithm which has better overall statistical properties than all techniques examined thus far, at the cost of significantly worse runtime, in addition to providing guidance on choosing the nonparametric regression technique most suitable to any specific problem. We then examine the more general problem of errors in both variables and provide a new algorithm which performs well in most cases and lacks the clear asymmetry of existing non-parametric methods, which fail to account for errors in both variables.

  10. Stereo matching using belief propagation and local edge construction-based cost aggregation

    Institute of Scientific and Technical Information of China (English)

    何袱; 达飞鹏

    2011-01-01

    In this paper, the importance of cost aggregation (also called similarity measure) for belief propagation and the interaction between the two are discussed. A global stereo matching algorithm is proposed that combines belief propagation with local edge construction-based cost aggregation. First, a virtual closed edge is formed around each pixel via a second-derivative edge operator in order to construct an adaptive window for the centered pixel. Then, the local cost aggregation is calculated on the support pixels within the adaptive window. Finally, a belief propagation optimization algorithm with an accelerated message-update mechanism is used to obtain the disparity. Experiments on the Middlebury benchmark indicate that the local edge construction-based cost aggregation works well with belief propagation optimization and that the proposed stereo matching algorithm produces encouraging results.

  11. Nonparametric regression with filtered data

    CERN Document Server

    Linton, Oliver; Nielsen, Jens Perch; Van Keilegom, Ingrid; 10.3150/10-BEJ260

    2011-01-01

    We present a general principle for estimating a regression function nonparametrically, allowing for a wide variety of data filtering, for example, repeated left truncation and right censoring. Both the mean and the median regression cases are considered. The method works by first estimating the conditional hazard function or conditional survivor function and then integrating. We also investigate improved methods that take account of model structure such as independent errors and show that such methods can improve performance when the model structure is true. We establish the pointwise asymptotic normality of our estimators.

  12. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo

    2013-06-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric and based on the asymptotic distribution of the empirical copula process. We perform simulation experiments to evaluate our test and conclude that our method is reliable and powerful for assessing common assumptions on the structure of copulas, particularly when the sample size is moderately large. We illustrate our testing approach on two datasets. © 2013 American Statistical Association.
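
    One of the assumptions mentioned above, symmetry (exchangeability) of the copula, can be probed with a statistic built from the empirical copula, as sketched below. Only the statistic itself is illustrated; calibrating its null distribution, which the paper does via the asymptotic distribution of the empirical copula process, is not shown.

        import numpy as np

        def empirical_copula(x, y, grid):
            """Evaluate the empirical copula of (x, y) on a square grid."""
            n = len(x)
            u = np.argsort(np.argsort(x)) / n      # pseudo-observations
            v = np.argsort(np.argsort(y)) / n
            return np.array([[np.mean((u <= a) & (v <= b)) for b in grid]
                             for a in grid])

        def symmetry_statistic(x, y, grid_size=20):
            """Sup-norm distance between C_n(u, v) and C_n(v, u)."""
            grid = np.linspace(0.05, 0.95, grid_size)
            c = empirical_copula(x, y, grid)
            return np.max(np.abs(c - c.T))

        rng = np.random.default_rng(3)
        x = rng.normal(size=500)
        y = 0.6 * x + 0.8 * rng.normal(size=500)   # exchangeable Gaussian dependence
        print(symmetry_statistic(x, y))            # small value, as expected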

  13. Multiatlas segmentation as nonparametric regression.

    Science.gov (United States)

    Awate, Suyash P; Whitaker, Ross T

    2014-09-01

    This paper proposes a novel theoretical framework to model and analyze the statistical characteristics of a wide range of segmentation methods that incorporate a database of label maps or atlases; such methods are termed as label fusion or multiatlas segmentation. We model these multiatlas segmentation problems as nonparametric regression problems in the high-dimensional space of image patches. We analyze the nonparametric estimator's convergence behavior that characterizes expected segmentation error as a function of the size of the multiatlas database. We show that this error has an analytic form involving several parameters that are fundamental to the specific segmentation problem (determined by the chosen anatomical structure, imaging modality, registration algorithm, and label-fusion algorithm). We describe how to estimate these parameters and show that several human anatomical structures exhibit the trends modeled analytically. We use these parameter estimates to optimize the regression estimator. We show that the expected error for large database sizes is well predicted by models learned on small databases. Thus, a few expert segmentations can help predict the database sizes required to keep the expected error below a specified tolerance level. Such cost-benefit analysis is crucial for deploying clinical multiatlas segmentation systems.

  14. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  15. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency...

  16. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  17. Nonparametric Regression with Common Shocks

    Directory of Open Access Journals (Sweden)

    Eduardo A. Souza-Rodrigues

    2016-09-01

    Full Text Available This paper considers a nonparametric regression model for cross-sectional data in the presence of common shocks. Common shocks are allowed to be very general in nature; they do not need to be finite dimensional with a known (small) number of factors. I investigate the properties of the Nadaraya-Watson kernel estimator and determine how general the common shocks can be while still obtaining meaningful kernel estimates. Restrictions on the common shocks are necessary because kernel estimators typically manipulate conditional densities, and conditional densities do not necessarily exist in the present case. By appealing to disintegration theory, I provide sufficient conditions for the existence of such conditional densities and show that the estimator converges in probability to the Kolmogorov conditional expectation given the sigma-field generated by the common shocks. I also establish the rate of convergence and the asymptotic distribution of the kernel estimator.
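
    For reference, the Nadaraya-Watson estimator discussed above has the following minimal form with a Gaussian kernel; the data and bandwidth below are invented for illustration, and none of the paper's conditions on common shocks are modelled.

        import numpy as np

        def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
            """Nadaraya-Watson kernel regression with a Gaussian kernel."""
            x_train = np.asarray(x_train, dtype=float)
            y_train = np.asarray(y_train, dtype=float)
            out = np.empty(len(x_eval))
            for i, x0 in enumerate(np.asarray(x_eval, dtype=float)):
                w = np.exp(-0.5 * ((x_train - x0) / bandwidth) ** 2)
                out[i] = np.sum(w * y_train) / np.sum(w)
            return out

        rng = np.random.default_rng(4)
        x = rng.uniform(0, 10, 300)
        y = np.sin(x) + 0.3 * rng.normal(size=x.size)      # toy cross-sectional data
        grid = np.linspace(0.5, 9.5, 7)
        print(np.round(nadaraya_watson(x, y, grid, bandwidth=0.5), 2))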

  18. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....

  19. An asymptotically optimal nonparametric adaptive controller

    Institute of Scientific and Technical Information of China (English)

    郭雷; 谢亮亮

    2000-01-01

    For discrete-time nonlinear stochastic systems with unknown nonparametric structure, a kernel estimation-based nonparametric adaptive controller is constructed based on truncated certainty equivalence principle. Global stability and asymptotic optimality of the closed-loop systems are established without resorting to any external excitations.

  20. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    The focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where, e.g., one or more non-parametric terms are added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients... It is shown that adaptive estimation in conditional parametric models can be performed by combining the well-known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how... For this purpose non-parametric methods together with additive models are suggested. Also, a new approach specifically designed to detect non-linearities is introduced. Confidence intervals are constructed by use of bootstrapping. As a link between non-parametric and parametric methods, a paper dealing with neural...

  1. Bayesian nonparametric duration model with censorship

    Directory of Open Access Journals (Sweden)

    Joseph Hakizamungu

    2007-10-01

    Full Text Available This paper is concerned with nonparametric i.i.d. duration models with censored observations, and we establish by a simple and unified approach the general structure of a Bayesian nonparametric estimator for a survival function S. For Dirichlet prior distributions, we describe completely the structure of the posterior distribution of the survival function. These results are essentially supported by prior and posterior independence properties.

  2. Bootstrap Estimation for Nonparametric Efficiency Estimates

    OpenAIRE

    1995-01-01

    This paper develops a consistent bootstrap estimation procedure to obtain confidence intervals for nonparametric measures of productive efficiency. Although the methodology is illustrated in terms of technical efficiency measured by output distance functions, the technique can be easily extended to other consistent nonparametric frontier models. Variation in estimated efficiency scores is assumed to result from variation in empirical approximations to the true boundary of the production set. ...

  3. Suppressed Belief

    Directory of Open Access Journals (Sweden)

    Komarine Romdenh-Romluc

    2009-12-01

    Full Text Available Moran’s revised conception of conscious belief requires us to reconceptualise suppressed belief. The work of Merleau-Ponty offers a way to do this. His account of motor-skills allows us to understand suppressed beliefs as pre-reflective ways of dealing with the world.

  4. Unsupervised SAR image segmentation using TMF and belief propagation

    Institute of Scientific and Technical Information of China (English)

    甘露; 吴艳; 王凡

    2015-01-01

    To solve the problem that the traditional statistical inference approach for the triplet Markov fields (TMF) model cannot balance segmentation accuracy and computational efficiency, an efficient statistical inference approach for the TMF model using belief propagation is proposed and then applied to unsupervised synthetic aperture radar (SAR) image segmentation. The algorithm combines the TMF model with the statistical properties of the SAR image and translates the segmentation problem into maximization of the posterior marginal (MPM) estimation. For the two label fields in the TMF model, the belief propagation algorithm is generalized to the bivariate case to estimate the joint posterior marginal probability of the two label fields through message passing, and the two label fields are then estimated simultaneously according to the MPM criterion. Experiments on both simulated and real SAR images demonstrate that the proposed algorithm can effectively suppress the influence of speckle and obtain accurate segmentation results at a reasonable computational cost.

  5. Why preferring parametric forecasting to nonparametric methods?

    Science.gov (United States)

    Jabot, Franck

    2015-05-07

    A recent series of papers by Charles T. Perretti and collaborators has shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed by simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural breaks in correlations. Only when correlations are constant does the parametric DCC model deliver the best outcome. The methodologies are illustrated by evaluating two interesting portfolios. The first portfolio consists of the equity sector SPDRs and the S&P 500, while the second one contains major currencies. Results show the nonparametric model generally dominates the others when evaluating in-sample. However, the semiparametric model is best for out-of-sample analysis.

  7. Recent Advances and Trends in Nonparametric Statistics

    CERN Document Server

    Akritas, MG

    2003-01-01

    The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection o

  8. Correlated Non-Parametric Latent Feature Models

    CERN Document Server

    Doshi-Velez, Finale

    2012-01-01

    We are often interested in explaining data through a set of hidden factors or features. When the number of hidden features is unknown, the Indian Buffet Process (IBP) is a nonparametric latent feature model that does not bound the number of active features in a dataset. However, the IBP assumes that all latent features are uncorrelated, making it inadequate for many real-world problems. We introduce a framework for correlated nonparametric feature models, generalising the IBP. We use this framework to generate several specific models and demonstrate applications on real-world datasets.

  9. A Censored Nonparametric Software Reliability Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyses the effect of censoring on the estimation of the failure rate and presents a framework for a censored nonparametric software reliability model. The model is based on nonparametric testing of a monotonically decreasing failure rate and weighted kernel failure rate estimation under the constraint of a monotonically decreasing failure rate. Not only does the model rely on few assumptions and weak constraints, but it also allows the number of residual defects in the software system to be estimated. Numerical experiments and real data analysis show that the model performs well with censored data.

  10. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural breaks in correlations. Only when correlations are constant does the parametric DCC model deliver the best outcome. The methodologies are illustrated by evaluating two interesting portfolios. The first portfolio consists of the equity sector SPDRs and the S&P 500, while the second one contains major currencies.

  11. Thirty years of nonparametric item response theory

    NARCIS (Netherlands)

    Molenaar, W.

    2001-01-01

    Relationships between a mathematical measurement model and its real-world applications are discussed. A distinction is made between large data matrices commonly found in educational measurement and smaller matrices found in attitude and personality measurement. Nonparametric methods are evaluated fo

  12. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  13. How Are Teachers Teaching? A Nonparametric Approach

    Science.gov (United States)

    De Witte, Kristof; Van Klaveren, Chris

    2014-01-01

    This paper examines which configuration of teaching activities maximizes student performance. For this purpose a nonparametric efficiency model is formulated that accounts for (1) self-selection of students and teachers in better schools and (2) complementary teaching activities. The analysis distinguishes both individual teaching (i.e., a…

  14. Nonparametric confidence intervals for monotone functions

    NARCIS (Netherlands)

    Groeneboom, P.; Jongbloed, G.

    2015-01-01

    We study nonparametric isotonic confidence intervals for monotone functions. In [Ann. Statist. 29 (2001) 1699–1731], pointwise confidence intervals, based on likelihood ratio tests using the restricted and unrestricted MLE in the current status model, are introduced. We extend the method to the trea

  15. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  16. Nonparametric confidence intervals for monotone functions

    NARCIS (Netherlands)

    Groeneboom, P.; Jongbloed, G.

    2015-01-01

    We study nonparametric isotonic confidence intervals for monotone functions. In [Ann. Statist. 29 (2001) 1699–1731], pointwise confidence intervals, based on likelihood ratio tests using the restricted and unrestricted MLE in the current status model, are introduced. We extend the method to the

  17. A Nonparametric Analogy of Analysis of Covariance

    Science.gov (United States)

    Burnett, Thomas D.; Barr, Donald R.

    1977-01-01

    A nonparametric test of the hypothesis of no treatment effect is suggested for a situation where measures of the severity of the condition treated can be obtained and ranked both pre- and post-treatment. The test allows the pre-treatment rank to be used as a concomitant variable. (Author/JKS)

  18. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...

  19. How Are Teachers Teaching? A Nonparametric Approach

    Science.gov (United States)

    De Witte, Kristof; Van Klaveren, Chris

    2014-01-01

    This paper examines which configuration of teaching activities maximizes student performance. For this purpose a nonparametric efficiency model is formulated that accounts for (1) self-selection of students and teachers in better schools and (2) complementary teaching activities. The analysis distinguishes both individual teaching (i.e., a…

  20. Conscious Belief

    Directory of Open Access Journals (Sweden)

    David Pitt

    2016-04-01

    Full Text Available Tim Crane maintains that beliefs cannot be conscious because they persist in the absence of consciousness. Conscious judgments can share their contents with beliefs, and their occurrence can be evidence for what one believes; but they cannot be beliefs, because they don’t persist. I challenge Crane’s premise that belief attributions to the temporarily unconscious are literally true. To say of an unconscious agent that she believes that p is like saying that she sings well. To say she sings well is to say that when she sings, her singing is good. To say that she believes that p is (roughly) to say that when she consciously considers the content that p she consciously affirms (believes) it. I also argue that the phenomenal view of intentional content Crane appears to endorse prima facie commits him to the view, at least controversial, perhaps incoherent, that there is unconscious phenomenology (the intentional contents of unconscious beliefs).

  1. Nonparametric tests for pathwise properties of semimartingales

    CERN Document Server

    Cont, Rama; 10.3150/10-BEJ293

    2011-01-01

    We propose two nonparametric tests for investigating the pathwise properties of a signal modeled as the sum of a Lévy process and a Brownian semimartingale. Using a nonparametric threshold estimator for the continuous component of the quadratic variation, we design a test for the presence of a continuous martingale component in the process and a test for establishing whether the jumps have finite or infinite variation, based on observations on a discrete-time grid. We evaluate the performance of our tests using simulations of various stochastic models and use the tests to investigate the fine structure of the DM/USD exchange rate fluctuations and SPX futures prices. In both cases, our tests reveal the presence of a non-zero Brownian component and a finite variation jump component.

  2. Nonparametric Transient Classification using Adaptive Wavelets

    CERN Document Server

    Varughese, Melvin M; Stephanou, Michael; Bassett, Bruce A

    2015-01-01

    Classifying transients based on multi band light curves is a challenging but crucial problem in the era of GAIA and LSST since the sheer volume of transients will make spectroscopic classification unfeasible. Here we present a nonparametric classifier that uses the transient's light curve measurements to predict its class given training data. It implements two novel components: the first is the use of the BAGIDIS wavelet methodology - a characterization of functional data using hierarchical wavelet coefficients. The second novelty is the introduction of a ranked probability classifier on the wavelet coefficients that handles both the heteroscedasticity of the data in addition to the potential non-representativity of the training set. The ranked classifier is simple and quick to implement while a major advantage of the BAGIDIS wavelets is that they are translation invariant, hence they do not need the light curves to be aligned to extract features. Further, BAGIDIS is nonparametric so it can be used for blind ...

  3. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.

  4. Nonparametric Bayes analysis of social science data

    Science.gov (United States)

    Kunihama, Tsuyoshi

    Social science data often contain complex characteristics that standard statistical methods fail to capture. Social surveys assign many questions to respondents, which often consist of mixed-scale variables. Each of the variables can follow a complex distribution outside parametric families and associations among variables may have more complicated structures than standard linear dependence. Therefore, it is not straightforward to develop a statistical model which can approximate structures well in the social science data. In addition, many social surveys have collected data over time and therefore we need to incorporate dynamic dependence into the models. Also, it is standard to observe massive number of missing values in the social science data. To address these challenging problems, this thesis develops flexible nonparametric Bayesian methods for the analysis of social science data. Chapter 1 briefly explains backgrounds and motivations of the projects in the following chapters. Chapter 2 develops a nonparametric Bayesian modeling of temporal dependence in large sparse contingency tables, relying on a probabilistic factorization of the joint pmf. Chapter 3 proposes nonparametric Bayes inference on conditional independence with conditional mutual information used as a measure of the strength of conditional dependence. Chapter 4 proposes a novel Bayesian density estimation method in social surveys with complex designs where there is a gap between sample and population. We correct for the bias by adjusting mixture weights in Bayesian mixture models. Chapter 5 develops a nonparametric model for mixed-scale longitudinal surveys, in which various types of variables can be induced through latent continuous variables and dynamic latent factors lead to flexibly time-varying associations among variables.

  5. Bayesian nonparametric estimation for Quantum Homodyne Tomography

    OpenAIRE

    Naulet, Zacharie; Barat, Eric

    2016-01-01

    We estimate the quantum state of a light beam from results of quantum homodyne tomography noisy measurements performed on identically prepared quantum systems. We propose two Bayesian nonparametric approaches. The first approach is based on mixture models and is illustrated through simulation examples. The second approach is based on random basis expansions. We study the theoretical performance of the second approach by quantifying the rate of contraction of the posterior distribution around ...

  6. NONPARAMETRIC ESTIMATION OF CHARACTERISTICS OF PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-10-01

    Full Text Available The article is devoted to nonparametric point and interval estimation of the characteristics of a probability distribution (the expectation, median, variance, standard deviation, and coefficient of variation) from sample results. Sample values are regarded as realizations of independent and identically distributed random variables with an arbitrary distribution function possessing the required number of moments. Nonparametric analysis procedures are compared with parametric procedures based on the assumption that the sample values have a normal distribution. Point estimators are constructed in the obvious way, using sample analogues of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and of functions of them. Nonparametric asymptotic confidence intervals are obtained by a standard technology for deriving asymptotic relations in applied statistics. In the first step, this technology uses the multidimensional central limit theorem applied to sums of vectors whose coordinates are powers of the initial random variables. The second step transforms the limiting multivariate normal vector to obtain the vector of interest to the researcher; here linearization is used and infinitesimal quantities are discarded. The third step is a rigorous justification of the results at the standard level of mathematical and statistical reasoning, which usually requires necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are the operating times to the limit state of 50 cutting tools. Using methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions in situations where the normality hypothesis fails. The practical recommendation is to use nonparametric confidence limits for the analysis of real data.
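
    A minimal sketch of moment-based asymptotic confidence intervals of the kind described above, for the expectation (via the central limit theorem) and for the variance (via the fourth central moment), without assuming normality of the data. The numerical values below are invented for illustration and are not the paper's cutting-tool data.

        import numpy as np

        Z95 = 1.96   # 95% standard normal quantile

        def mean_ci(sample):
            """Asymptotic nonparametric 95% CI for the expectation (CLT only)."""
            x = np.asarray(sample, dtype=float)
            half = Z95 * x.std(ddof=1) / np.sqrt(x.size)
            return x.mean() - half, x.mean() + half

        def variance_ci(sample):
            """Asymptotic 95% CI for the variance via the fourth central moment."""
            x = np.asarray(sample, dtype=float)
            s2 = x.var(ddof=1)
            m4 = np.mean((x - x.mean()) ** 4)
            se = np.sqrt(max(m4 - s2 ** 2, 0.0) / x.size)   # std. error of s^2
            return s2 - Z95 * se, s2 + Z95 * se

        # Hypothetical operating times (hours) to the limit state
        times = np.array([105., 82., 130., 97., 61., 144., 73., 118., 90., 66.])
        print(mean_ci(times))
        print(variance_ci(times))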

  7. portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    mahsa ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is deciding on an appropriate stock exchange for investing and selecting an optimal portfolio. This process is carried out by assessing risk and expected return. In the portfolio selection problem, if the assets' expected returns are normally distributed, the variance or standard deviation is used as the risk measure. However, expected returns on assets are not necessarily normal and sometimes differ dramatically from a normal distribution. This paper introduces conditional value at risk (CVaR) as a measure of risk in a nonparametric framework and, for a given expected return, derives the optimal portfolio; the method is compared with the linear programming method. The data used in this study consist of monthly returns of 15 companies selected from the top 50 companies on the Tehran Stock Exchange during the winter of 1392, covering the period from April 1388 to June 1393. The results show the superiority of the nonparametric method over the linear programming method, and the nonparametric method is also much faster.
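
    A minimal sketch of the nonparametric (historical) CVaR used as the risk measure above: the average loss in the worst tail of the empirical return distribution of a candidate portfolio. The returns and weights below are simulated for illustration; the optimisation over weights and the comparison with linear programming are not shown.

        import numpy as np

        def historical_cvar(returns, alpha=0.95):
            """Historical CVaR at level alpha: mean loss in the worst (1-alpha) tail."""
            r = np.sort(np.asarray(returns, dtype=float))     # ascending: worst first
            k = max(1, int(np.floor((1 - alpha) * r.size)))
            return -r[:k].mean()

        rng = np.random.default_rng(5)
        monthly_returns = rng.normal(0.01, 0.05, size=(60, 3))   # 60 months, 3 assets
        weights = np.array([0.5, 0.3, 0.2])                      # candidate portfolio
        portfolio = monthly_returns @ weights
        print(f"95% CVaR: {historical_cvar(portfolio):.4f}")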

  8. Introduction to nonparametric statistics for the biological sciences using R

    CERN Document Server

    MacFarland, Thomas W

    2016-01-01

    This book contains a rich set of tools for nonparametric analyses, and the purpose of this supplemental text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences: to introduce when nonparametric approaches to data analysis are appropriate; to introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test; and to introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set. The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses a...

  9. DNA motif elucidation using belief propagation

    OpenAIRE

    Wong, Ka-Chun; Chan, Tak-Ming; Peng, Chengbin; Li, Yue; Zhang, Zhaolei

    2013-01-01

    Protein-binding microarray (PBM) is a high-throughout platform that can measure the DNA-binding preference of a protein in a comprehensive and unbiased manner. A typical PBM experiment can measure binding signal intensities of a protein to all the possible DNA k-mers (k = 8 ∼10); such comprehensive binding affinity data usually need to be reduced and represented as motif models before they can be further analyzed and applied. Since proteins can often bind to DNA in multiple modes, one of the ...

  10. Distributed Sensor Selection via Gaussian Belief Propagation

    CERN Document Server

    Bickson, Danny

    2009-01-01

    The sensor selection problem is a boolean convex optimization problem; given $m$ sensor measurements we aim at finding $k

  11. Distributed Kalman Filter via Gaussian Belief Propagation

    CERN Document Server

    Bickson, Danny; Dolev, Danny

    2008-01-01

    A recent result shows how to compute, distributively and efficiently, the linear MMSE for the multiuser detection problem using the Gaussian BP algorithm. In the current work, we extend this construction and show that operating this algorithm twice on matching inputs has several interesting interpretations. First, we show equivalence to computing one iteration of the Kalman filter. Second, we show that the Kalman filter is a special case of the Gaussian information bottleneck algorithm when the weight parameter $\beta = 1$. Third, we discuss the relation to the affine-scaling interior-point method and show it is a special case of the Kalman filter. Besides the theoretical interest of linking estimation, compression/clustering and optimization, we obtain a single distributed implementation of these algorithms, which is a highly practical and important task in sensor and mobile ad-hoc networks. Applications to numerous problem domains include collaborative signal processing and distributed allocation of ...
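
    As background, Gaussian belief propagation solves a linear system A x = b by passing scalar precision/mean messages along the edges of the graph defined by the nonzeros of A. The sketch below is a generic GaBP solver for a small symmetric, diagonally dominant system; it is not the multiuser-detection or Kalman-filter construction of the paper.

        import numpy as np

        def gaussian_bp_solve(A, b, n_iter=50):
            """Solve A x = b with Gaussian belief propagation (GaBP).

            Convergence to the exact solution is guaranteed for symmetric,
            diagonally dominant A.  P[i, j] and mu[i, j] hold the precision and
            mean of the message sent from node i to node j.
            """
            n = len(b)
            P = np.zeros((n, n))
            mu = np.zeros((n, n))
            for _ in range(n_iter):
                P_new, mu_new = np.zeros_like(P), np.zeros_like(mu)
                for i in range(n):
                    for j in range(n):
                        if i == j or A[i, j] == 0.0:
                            continue
                        # cavity: all information at i except the message from j
                        p_cav = A[i, i] + P[:, i].sum() - P[j, i]
                        h_cav = b[i] + (P[:, i] * mu[:, i]).sum() - P[j, i] * mu[j, i]
                        P_new[i, j] = -A[i, j] ** 2 / p_cav
                        mu_new[i, j] = h_cav / A[i, j]
                P, mu = P_new, mu_new
            # combine all incoming messages into the marginal means
            x = np.empty(n)
            for i in range(n):
                p_i = A[i, i] + P[:, i].sum()
                x[i] = (b[i] + (P[:, i] * mu[:, i]).sum()) / p_i
            return x

        A = np.array([[4.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 5.0]])
        b = np.array([1.0, 2.0, 3.0])
        print(np.round(gaussian_bp_solve(A, b), 4), np.round(np.linalg.solve(A, b), 4))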

  12. A nonparametric and diversified portfolio model

    Science.gov (United States)

    Shirazi, Yasaman Izadparast; Sabiruzzaman, Md.; Hamzah, Nor Aishah

    2014-07-01

    Traditional portfolio models, like mean-variance (MV), suffer from estimation error and lack of diversity. Alternatives, like the mean-entropy (ME) or mean-variance-entropy (MVE) portfolio models, focus independently on the issue of either a proper risk measure or diversity. In this paper, we propose an asset allocation model that compromises between the risk of historical data and future uncertainty. In the new model, entropy is presented as a nonparametric risk measure as well as an index of diversity. Our empirical evaluation with a variety of performance measures shows that this model has better out-of-sample performance and lower portfolio turnover than its competitors.

  13. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are pointed out, and methods to prevent bias are presented. The techniques are evaluated by comparing their speed and accuracy on the simple case of estimating auto-correlation functions for the response of a single degree-of-freedom system loaded with white noise.
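
    For reference, a minimal Python sketch of the FFT-based estimate, one of the three techniques reviewed above; the zero-padding avoids the circular-correlation bias such estimates are prone to. The test signal, lag range, and normalisation are illustrative choices, not the authors' implementation.

```python
# Auto-correlation estimation via the FFT, one of the three techniques reviewed
# above. Zero-padding removes the circular wrap-around of the raw FFT estimate;
# the test signal and lag range are illustrative choices.
import numpy as np

def autocorrelation_fft(x, max_lag):
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = x.size
    nfft = 2 ** int(np.ceil(np.log2(2 * n)))          # pad to at least 2n
    spec = np.fft.rfft(x, nfft)
    acov = np.fft.irfft(spec * np.conj(spec), nfft)[: max_lag + 1]
    acov /= n - np.arange(max_lag + 1)                # unbiased normalisation
    return acov / acov[0]                             # normalised auto-correlation

# Example: SDOF-like response = damped oscillation plus white noise
t = np.arange(0, 50, 0.01)
x = np.exp(-0.05 * t) * np.sin(2 * np.pi * t) + 0.1 * np.random.randn(t.size)
rho = autocorrelation_fft(x, max_lag=500)
```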

  14. Lottery spending: a non-parametric analysis.

    Science.gov (United States)

    Garibaldi, Skip; Frisoli, Kayla; Ke, Li; Lim, Melody

    2015-01-01

    We analyze the spending of individuals in the United States on lottery tickets in an average month, as reported in surveys. We view these surveys as sampling from an unknown distribution, and we use non-parametric methods to compare properties of this distribution for various demographic groups, as well as claims that some properties of this distribution are constant across surveys. We find that the observed higher spending by Hispanic lottery players can be attributed to differences in education levels, and we dispute previous claims that the top 10% of lottery players consistently account for 50% of lottery sales.

  15. Lottery spending: a non-parametric analysis.

    Directory of Open Access Journals (Sweden)

    Skip Garibaldi

    Full Text Available We analyze the spending of individuals in the United States on lottery tickets in an average month, as reported in surveys. We view these surveys as sampling from an unknown distribution, and we use non-parametric methods to compare properties of this distribution for various demographic groups, as well as claims that some properties of this distribution are constant across surveys. We find that the observed higher spending by Hispanic lottery players can be attributed to differences in education levels, and we dispute previous claims that the top 10% of lottery players consistently account for 50% of lottery sales.

  16. Nonparametric inferences for kurtosis and conditional kurtosis

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-heng; HE You-hua

    2009-01-01

    Under the assumption of a strictly stationary process, this paper proposes a nonparametric model to test the kurtosis and conditional kurtosis of risk time series. We apply this method to the daily returns of the S&P500 index and the Shanghai Composite Index, and simulate GARCH data to verify the efficiency of the presented model. Our results indicate that the risk series distribution is heavily tailed, but the historical information can make its future distribution light-tailed. However, the tails of the far-future distribution are little affected by the historical data.

  17. Parametric versus non-parametric simulation

    OpenAIRE

    Dupeux, Bérénice; Buysse, Jeroen

    2014-01-01

    Most ex-ante impact assessment policy models have been based on a parametric approach. We develop a novel non-parametric approach, called Inverse DEA. We use non-parametric efficiency analysis to determine the farm's technology and behaviour. Then, we compare the parametric approach and the Inverse DEA models to a known data generating process. We use a bio-economic model as the data generating process, reflecting a real-world situation where non-linear relationships often exist. Results s...

  18. Preliminary results on nonparametric facial occlusion detection

    Directory of Open Access Journals (Sweden)

    Daniel LÓPEZ SÁNCHEZ

    2016-10-01

    Full Text Available The problem of face recognition has been extensively studied in the available literature; however, some aspects of this field require further research. The design and implementation of face recognition systems that can efficiently handle unconstrained conditions (e.g. pose variations, illumination, partial occlusion...) is still an area under active research. This work focuses on the design of a new nonparametric occlusion detection technique. In addition, we present some preliminary results that indicate that the proposed technique might be useful to face recognition systems, allowing them to dynamically discard occluded face parts.

  19. Bayesian Nonparametric Clustering for Positive Definite Matrices.

    Science.gov (United States)

    Cherian, Anoop; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2016-05-01

    Symmetric Positive Definite (SPD) matrices emerge as data descriptors in several applications of computer vision such as object tracking, texture recognition, and diffusion tensor imaging. Clustering these data matrices forms an integral part of these applications, for which soft-clustering algorithms (K-Means, expectation maximization, etc.) are generally used. As is well known, these algorithms need the number of clusters to be specified, which is difficult when the dataset scales. To address this issue, we resort to the classical nonparametric Bayesian framework by modeling the data as a mixture model using the Dirichlet process (DP) prior. Since these matrices do not conform to Euclidean geometry, but rather belong to a curved Riemannian manifold, existing DP models cannot be directly applied. Thus, in this paper, we propose a novel DP mixture model framework for SPD matrices. Using the log-determinant divergence as the underlying dissimilarity measure to compare these matrices, and further using the connection between this measure and the Wishart distribution, we derive a novel DPM model based on the Wishart-Inverse-Wishart conjugate pair. We apply this model to several applications in computer vision. Our experiments demonstrate that our model is scalable to the dataset size and at the same time achieves superior accuracy compared to several state-of-the-art parametric and nonparametric clustering algorithms.

  20. Belief change

    CSIR Research Space (South Africa)

    Booth, R

    2012-07-01

    Full Text Available In this paper the authors present a brief overview of belief change, a research area concerned with the question of how a rational agent ought to change its mind in the face of new, possibly conflicting, information. The authors limit themselves...

  1. Nonparametric dark energy reconstruction from supernova data.

    Science.gov (United States)

    Holsclaw, Tracy; Alam, Ujjaini; Sansó, Bruno; Lee, Herbert; Heitmann, Katrin; Habib, Salman; Higdon, David

    2010-12-10

    Understanding the origin of the accelerated expansion of the Universe poses one of the greatest challenges in physics today. Lacking a compelling fundamental theory to test, observational efforts are targeted at a better characterization of the underlying cause. If a new form of mass-energy, dark energy, is driving the acceleration, the redshift evolution of the equation of state parameter w(z) will hold essential clues as to its origin. To best exploit data from observations it is necessary to develop a robust and accurate reconstruction approach, with controlled errors, for w(z). We introduce a new, nonparametric method for solving the associated statistical inverse problem based on Gaussian process modeling and Markov chain Monte Carlo sampling. Applying this method to recent supernova measurements, we reconstruct the continuous history of w out to redshift z=1.5.

  2. Local Component Analysis for Nonparametric Bayes Classifier

    CERN Document Server

    Khademi, Mahmoud; safayani, Meharn

    2010-01-01

    The decision boundaries of the Bayes classifier are optimal because they lead to the maximum probability of correct decision. This means that if we knew the prior probabilities and the class-conditional densities, we could design a classifier which gives the lowest probability of error. However, in classification based on nonparametric density estimation methods such as Parzen windows, the decision regions depend on the choice of parameters such as the window width. Moreover, these methods suffer from the curse of dimensionality of the feature space and the small sample size problem, which severely restrict their practical applications. In this paper, we address these problems by introducing a novel dimension reduction and classification method based on local component analysis. In this method, by adopting an iterative cross-validation algorithm, we simultaneously estimate the optimal transformation matrices (for dimension reduction) and classifier parameters based on local information. The proposed method can classify the data with co...

  3. Nonparametric k-nearest-neighbor entropy estimator.

    Science.gov (United States)

    Lombardi, Damiano; Pant, Sanjay

    2016-01-01

    A nonparametric k-nearest-neighbor-based entropy estimator is proposed. It improves on the classical Kozachenko-Leonenko estimator by considering nonuniform probability densities in the region of k-nearest neighbors around each sample point. It aims to improve the classical estimators in three situations: first, when the dimensionality of the random variable is large; second, when near-functional relationships leading to high correlation between components of the random variable are present; and third, when the marginal variances of random variable components vary significantly with respect to each other. Heuristics on the error of the proposed and classical estimators are presented. Finally, the proposed estimator is tested for a variety of distributions in successively increasing dimensions and in the presence of a near-functional relationship. Its performance is compared with a classical estimator, and a significant improvement is demonstrated.
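
    For context, a minimal sketch of the classical Kozachenko-Leonenko estimator that the proposed method improves on, in one commonly cited form; it does not include the nonuniform-density correction of the paper, and the choice of k and the test distribution are assumptions.

```python
# Classical Kozachenko-Leonenko k-NN entropy estimator (the baseline improved on
# above), in one commonly cited form:
#   H_hat = psi(N) - psi(k) + log(V_d) + (d/N) * sum_i log(eps_i),
# with eps_i the Euclidean distance from x_i to its k-th nearest neighbour and
# V_d the volume of the d-dimensional unit ball. The nonuniform-density
# correction proposed in the paper is not reproduced here.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(samples, k=3):
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # k + 1 neighbours because each point's nearest neighbour is itself
    dist, _ = cKDTree(x).query(x, k=k + 1)
    eps = dist[:, -1]
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

# Sanity check against the exact entropy of a standard normal, 0.5*log(2*pi*e)
rng = np.random.default_rng(1)
print(kl_entropy(rng.standard_normal(5000), k=3), 0.5 * np.log(2 * np.pi * np.e))
```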

  4. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.

    2012-12-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal assumptions regarding the form of the distribution functions of X and Y. We discuss an approach to the estimation problem that is based on asymptotic likelihood considerations. Our results enable us to provide a methodology that can be implemented easily and which yields estimators that are often near optimal when compared to fully parametric methods. We evaluate the performance of the estimators in a series of Monte Carlo simulations. © 2012 Elsevier B.V. All rights reserved.

  5. Nonparametric Maximum Entropy Estimation on Information Diagrams

    CERN Document Server

    Martin, Elliot A; Meinke, Alexander; Děchtěrenko, Filip; Davidsen, Jörn

    2016-01-01

    Maximum entropy estimation is of broad interest for inferring properties of systems across many different disciplines. In this work, we significantly extend a technique we previously introduced for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies. Specifically, we show how to apply the concept to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish a number of significant advantages of our approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases. In addition, we propose a nonparametric formulation of connected informations and give an illustrative example showing how this agrees with the existing parametric formulation in cases of interest. We furthe...

  6. Nonparametric estimation of employee stock options

    Institute of Scientific and Technical Information of China (English)

    FU Qiang; LIU Li-an; LIU Qian

    2006-01-01

    We propose a new model to price employee stock options (ESOs). The model is based on nonparametric statistical methods with market data. It incorporates a kernel estimator and employs a three-step method to modify the Black-Scholes formula, overcoming the limits of that formula in handling option prices with varying volatility. It accounts for the effects of ESOs' specific characteristics, such as non-tradability, the longer time to expiration, the early exercise feature, the restriction on short selling, and the employee's risk aversion, on the risk-neutral pricing condition, and it can be applied to ESO valuation whether the explanatory variable is deterministic or random.

  7. On Parametric (and Non-Parametric Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles–and–Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  8. Nonparametric inference of network structure and dynamics

    Science.gov (United States)

    Peixoto, Tiago P.

    The network structure of complex systems determines their function and serves as evidence for the evolutionary mechanisms that lie behind them. Despite considerable effort in recent years, it remains an open challenge to formulate general descriptions of the large-scale structure of network systems, and how to reliably extract such information from data. Although many approaches have been proposed, few methods attempt to gauge the statistical significance of the uncovered structures, and hence the majority cannot reliably separate actual structure from stochastic fluctuations. Due to the sheer size and high-dimensionality of many networks, this represents a major limitation that prevents meaningful interpretations of the results obtained with such nonstatistical methods. In this talk, I will show how these issues can be tackled in a principled and efficient fashion by formulating appropriate generative models of network structure that can have their parameters inferred from data. By employing a Bayesian description of such models, the inference can be performed in a nonparametric fashion that does not require any a priori knowledge or ad hoc assumptions about the data. I will show how this approach can be used to perform model comparison, and how hierarchical models yield the most appropriate trade-off between model complexity and quality of fit based on the statistical evidence present in the data. I will also show how this general approach can be elegantly extended to networks with edge attributes, that are embedded in latent spaces, and that change in time. The latter is obtained via a fully dynamic generative network model, based on arbitrary-order Markov chains, that can also be inferred in a nonparametric fashion. Throughout the talk I will illustrate the application of the methods with many empirical networks such as the internet at the autonomous systems level, the global airport network, the network of actors and films, social networks, citations among

  9. A nonparametric dynamic additive regression model for longitudinal data

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  10. Nonparametric Bayesian inference for multidimensional compound Poisson processes

    NARCIS (Netherlands)

    S. Gugushvili; F. van der Meulen; P. Spreij

    2015-01-01

    Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context, whic

  11. Asymptotic theory of nonparametric regression estimates with censored data

    Institute of Scientific and Technical Information of China (English)

    施沛德; 王海燕; 张利华

    2000-01-01

    For regression analysis, some useful information may have been lost when the responses are right censored. To estimate nonparametric functions, several estimates based on censored data have been proposed and their consistency and convergence rates have been studied in literature, but the optimal rates of global convergence have not been obtained yet. Because of the possible information loss, one may think that it is impossible for an estimate based on censored data to achieve the optimal rates of global convergence for nonparametric regression, which were established by Stone based on complete data. This paper constructs a regression spline estimate of a general nonparametric regression function based on right-censored response data, and proves, under some regularity conditions, that this estimate achieves the optimal rates of global convergence for nonparametric regression. Since the parameters for the nonparametric regression estimate have to be chosen based on a data driven criterion, we also obtai

  12. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain) between June 11–16 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  13. Death beliefs, superstitious beliefs and health anxiety.

    Science.gov (United States)

    James, Abigail; Wells, Adrian

    2002-03-01

    The present study explored the association between beliefs about death, superstitious beliefs, and health anxiety. It was hypothesized that negative beliefs about death and superstitious beliefs would be positively correlated with health anxiety. Conversely, positive beliefs about death were hypothesized to be negatively correlated with health anxiety. A cross-sectional, correlational and multiple regression design was adopted. A sample of 106 Roman Catholics and 197 Atheists completed a questionnaire measuring aspects of health anxiety, spiritual beliefs, and control variables consisting of demographics. Negative beliefs about death and superstitious beliefs were related to health anxiety within both the Roman Catholic and Atheist samples. The expected negative relationship between positive beliefs about death and health anxiety was not supported in either sample. Multiple regression analyses indicated that death beliefs and superstitious beliefs, in combination with background variables, significantly predicted health anxiety in the Roman Catholic sample. For Atheists, although death and superstitious beliefs were identified as significant predictors, when considered with other variables, the additional variance accounted for was not significant. Negative beliefs about death and superstitious beliefs appear to be positively associated with health anxiety. These types of beliefs may have the potential to offer a useful addition to cognitive-behavioural models of health anxiety.

  14. LinguisticBelief: a java application for linguistic evaluation using belief, fuzzy sets, and approximate reasoning.

    Energy Technology Data Exchange (ETDEWEB)

    Darby, John L.

    2007-03-01

    LinguisticBelief is a Java computer code that evaluates combinations of linguistic variables using an approximate reasoning rule base. Each variable is composed of fuzzy sets, and a rule base describes the reasoning on combinations of the variables' fuzzy sets. Uncertainty is considered and propagated through the rule base using the belief/plausibility measure. The mathematics of fuzzy sets, approximate reasoning, and belief/plausibility are complex. Without an automated tool, this complexity precludes their application to all but the simplest of problems. LinguisticBelief automates the use of these techniques, allowing complex problems to be evaluated easily. LinguisticBelief can be used free of charge on any Windows XP machine. This report documents the use and structure of the LinguisticBelief code, and the deployment package for installation on client machines.

  15. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Full Text Available Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability and intradaily variability, to describe rest activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) results for each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using an IVm variable, while the variable IV60 was not identified. Rhythmic synchronization of activity and rest was significantly higher in young than adults with Parkinson's when using the ISM variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization.
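
    The standard fixed-interval definitions of interdaily stability (IS) and intradaily variability (IV), on which the interval-averaged variants (ISm, IVm) discussed above are built, can be computed as in the sketch below; the hourly binning and input format are assumptions, and the modified calculation proposed in the record is not reproduced.

```python
# Standard fixed-interval definitions of interdaily stability (IS) and
# intradaily variability (IV); the interval-averaged variants (ISm, IVm)
# discussed above average such quantities over several bin lengths. Assumes an
# activity series sampled at a constant rate over an integer number of days.
import numpy as np

def is_iv(activity, samples_per_day=24):
    """activity: 1-D array whose length is a multiple of samples_per_day."""
    x = np.asarray(activity, dtype=float)
    n = x.size
    days = x.reshape(-1, samples_per_day)      # one row per day
    mean_profile = days.mean(axis=0)           # average daily profile
    ss_total = np.sum((x - x.mean()) ** 2)
    # IS: variance of the average daily profile relative to total variance
    IS = n * np.sum((mean_profile - x.mean()) ** 2) / (samples_per_day * ss_total)
    # IV: mean squared successive difference relative to total variance
    IV = n * np.sum(np.diff(x) ** 2) / ((n - 1) * ss_total)
    return IS, IV
```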

  16. Bayesian nonparametric adaptive control using Gaussian processes.

    Science.gov (United States)

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

    Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such an adaptive element is radial basis function networks (RBFNs), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become noneffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. The GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.

  17. Nonparametric methods in actigraphy: An update

    Science.gov (United States)

    Gonçalves, Bruno S.B.; Cavalcanti, Paula R.A.; Tavares, Gracilene R.; Campos, Tania F.; Araujo, John F.

    2014-01-01

    Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability and intradaily variability, to describe rest activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) results for each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using an IVm variable, while the variable IV60 was not identified. Rhythmic synchronization of activity and rest was significantly higher in young than adults with Parkinson's when using the ISM variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization. PMID:26483921

  18. Nonparametric Detection of Geometric Structures Over Networks

    Science.gov (United States)

    Zou, Shaofeng; Liang, Yingbin; Poor, H. Vincent

    2017-10-01

    Nonparametric detection of existence of an anomalous structure over a network is investigated. Nodes corresponding to the anomalous structure (if one exists) receive samples generated by a distribution q, which is different from a distribution p generating samples for other nodes. If an anomalous structure does not exist, all nodes receive samples generated by p. It is assumed that the distributions p and q are arbitrary and unknown. The goal is to design statistically consistent tests with probability of errors converging to zero as the network size becomes asymptotically large. Kernel-based tests are proposed based on maximum mean discrepancy that measures the distance between mean embeddings of distributions into a reproducing kernel Hilbert space. Detection of an anomalous interval over a line network is first studied. Sufficient conditions on minimum and maximum sizes of candidate anomalous intervals are characterized in order to guarantee the proposed test to be consistent. It is also shown that certain necessary conditions must hold to guarantee any test to be universally consistent. Comparison of sufficient and necessary conditions yields that the proposed test is order-level optimal and nearly optimal respectively in terms of minimum and maximum sizes of candidate anomalous intervals. Generalization of the results to other networks is further developed. Numerical results are provided to demonstrate the performance of the proposed tests.
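
    A minimal sketch of the unbiased estimator of the squared maximum mean discrepancy with a Gaussian kernel, the distance on which the proposed kernel-based tests rely; the bandwidth, sample sizes, and the distributions p and q are illustrative, and the interval-scanning test itself is not shown.

```python
# Unbiased estimate of the squared maximum mean discrepancy (MMD) with a
# Gaussian kernel, the distance underlying the kernel-based tests above. The
# bandwidth, sample sizes, and the distributions p and q are illustrative; the
# interval-scanning procedure over a network is not shown.
import numpy as np

def mmd2_unbiased(x, y, bandwidth=1.0):
    def gram(a, b):
        d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
        return np.exp(-d2 / (2 * bandwidth**2))
    m, n = len(x), len(y)
    kxx, kyy, kxy = gram(x, x), gram(y, y), gram(x, y)
    term_x = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))   # drop diagonal terms
    term_y = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * kxy.mean()

rng = np.random.default_rng(2)
p_samples = rng.normal(0.0, 1.0, size=(200, 1))
q_samples = rng.normal(0.5, 1.0, size=(200, 1))   # "anomalous" distribution q
print(mmd2_unbiased(p_samples, q_samples))
```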

  19. Nonparametric Bayesian drift estimation for multidimensional stochastic differential equations

    NARCIS (Netherlands)

    Gugushvili, S.; Spreij, P.

    2014-01-01

    We consider nonparametric Bayesian estimation of the drift coefficient of a multidimensional stochastic differential equation from discrete-time observations on the solution of this equation. Under suitable regularity conditions, we establish posterior consistency in this context.

  20. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction but data rarely satisfies testable conditions. To overcome this we provide a way to estimate homothetic efficiency of

  1. A non-parametric approach to investigating fish population dynamics

    National Research Council Canada - National Science Library

    Cook, R.M; Fryer, R.J

    2001-01-01

    .... Using a non-parametric model for the stock-recruitment relationship it is possible to avoid defining specific functions relating recruitment to stock size while also providing a natural framework to model process error...

  2. Non-parametric approach to the study of phenotypic stability.

    Science.gov (United States)

    Ferreira, D F; Fernandes, S B; Bruzi, A T; Ramalho, M A P

    2016-02-19

    The aim of this study was to undertake the theoretical derivations of non-parametric methods, which use linear regressions based on rank order, for stability analyses. These methods are extensions of different parametric methods used for stability analyses, and the results were compared with a standard non-parametric method. Intensive computational methods (e.g., bootstrap and permutation) were applied, and data from the plant-breeding program of the Biology Department of UFLA (Minas Gerais, Brazil) were used to illustrate and compare the tests. The non-parametric stability methods were effective for the evaluation of phenotypic stability. In the presence of variance heterogeneity, the non-parametric methods exhibited greater power of discrimination when determining the phenotypic stability of genotypes.

  3. Nonparametric Bayesian Modeling for Automated Database Schema Matching

    Energy Technology Data Exchange (ETDEWEB)

    Ferragut, Erik M [ORNL; Laska, Jason A [ORNL

    2015-01-01

    The problem of merging databases arises in many government and commercial applications. Schema matching, a common first step, identifies equivalent fields between databases. We introduce a schema matching framework that builds nonparametric Bayesian models for each field and compares them by computing the probability that a single model could have generated both fields. Our experiments show that our method is more accurate and faster than the existing instance-based matching algorithms in part because of the use of nonparametric Bayesian models.

  4. PV power forecast using a nonparametric PV model

    OpenAIRE

    Almeida, Marcelo Pinho; Perpiñan Lamigueiro, Oscar; Narvarte Fernández, Luis

    2015-01-01

    Forecasting the AC power output of a PV plant accurately is important both for plant owners and electric system operators. Two main categories of PV modeling are available: the parametric and the nonparametric. In this paper, a methodology using a nonparametric PV model is proposed, using as inputs several forecasts of meteorological variables from a Numerical Weather Forecast model, and actual AC power measurements of PV plants. The methodology was built upon the R environment and uses Quant...

  5. Belief Elicitation in Experiments

    DEFF Research Database (Denmark)

    Blanco, Mariana; Engelmann, Dirk; Koch, Alexander

    Belief elicitation in economics experiments usually relies on paying subjects according to the accuracy of stated beliefs in addition to payments for other decisions. Such incentives, however, allow risk-averse subjects to hedge with their stated beliefs against adverse outcomes of other decisions......-belief elicitation treatment using a financial investment frame, where hedging arguably would be most natural....

  6. RELIGION AND FOLK BELIEFS

    OpenAIRE

    Mustafa SEVER

    2011-01-01

    Folk beliefs are beliefs transmitted from generation to generation since pre-literate times; although they are not included in any divine religion, they have retained their vitality alongside it. Folk beliefs, which comprise traditional as well as local beliefs, practices, and personal features, divide into two main types: magical and experiential folk beliefs. In this study we focus on the qualities, differences, and functions in everyday life of divine religion and folk beliefs and f...

  7. Evolution of Religious Beliefs

    CERN Document Server

    CERN. Geneva

    2009-01-01

    Humans may be distinguished from all other animals in having beliefs about the causal interaction of physical objects. Causal beliefs are a developmental primitive in human children; animals, by contrast, have very few causal beliefs. The origin of human causal beliefs lies in the evolutionary advantage they gave in relation to complex tool making and use. Causal beliefs gave rise to religion and mystical thinking, as our ancestors wanted to know the causes of events that affected their lives.

  8. Asymptotic theory of nonparametric regression estimates with censored data

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    For regression analysis, some useful information may have been lost when the responses are right censored. To estimate nonparametric functions, several estimates based on censored data have been proposed and their consistency and convergence rates have been studied in literature, but the optimal rates of global convergence have not been obtained yet. Because of the possible information loss, one may think that it is impossible for an estimate based on censored data to achieve the optimal rates of global convergence for nonparametric regression, which were established by Stone based on complete data. This paper constructs a regression spline estimate of a general nonparametric regression function based on right-censored response data, and proves, under some regularity conditions, that this estimate achieves the optimal rates of global convergence for nonparametric regression. Since the parameters for the nonparametric regression estimate have to be chosen based on a data driven criterion, we also obtain the asymptotic optimality of AIC, AICC, GCV, Cp and FPE criteria in the process of selecting the parameters.

  9. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    Science.gov (United States)

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library.
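
    For orientation, the basic Good-Turing estimate of the discovery probability (the chance that the next observation is a previously unseen species) is simply the fraction of singletons in the sample; the toy data below are assumptions, and the smoothed and Bayesian nonparametric versions discussed in the record are not reproduced.

```python
# Basic Good-Turing estimate of the discovery probability: the chance that the
# next observation is a previously unseen species is approximately m1 / n, where
# m1 is the number of species seen exactly once. Toy data are hypothetical; the
# smoothed and Bayesian nonparametric versions discussed above are not shown.
from collections import Counter

def good_turing_new_species(sample):
    counts = Counter(sample)
    m1 = sum(1 for c in counts.values() if c == 1)   # singletons
    return m1 / len(sample)

print(good_turing_new_species(list("abracadabra")))  # 'c' and 'd' are singletons -> 2/11
```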

  10. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. A nonparametric specification test rejects both the Cobb-Douglas and the Translog functional form, while a recently developed nonparametric kernel regression method with a fully nonparametric panel data specification delivers plausible results. On average, the nonparametric regression results are similar to results that are obtained from

  11. Adaptive Particle Filter for Nonparametric Estimation with Measurement Uncertainty in Wireless Sensor Networks.

    Science.gov (United States)

    Li, Xiaofan; Zhao, Yubin; Zhang, Sha; Fan, Xiaopeng

    2016-05-30

    Particle filters (PFs) are widely used for nonlinear signal processing in wireless sensor networks (WSNs). However, measurement uncertainty makes the WSN observations unreliable with respect to the actual state and also degrades the estimation accuracy of the PFs. Beyond algorithm design, few works focus on improving the likelihood calculation method, since it is usually pre-assumed by a given distribution model. In this paper, we propose a novel PF method, which is based on a new likelihood fusion method for WSNs and can further improve the estimation performance. We first use a dynamic Gaussian model to describe the nonparametric features of the measurement uncertainty. Then, we propose a likelihood adaptation method that employs the prior information and a belief factor to reduce the measurement noise. The optimal belief factor is attained by deriving the minimum Kullback-Leibler divergence. The likelihood adaptation method can be integrated into any PF, and we use our method to develop three versions of adaptive PFs for a target tracking system using a wireless sensor network. The simulation and experimental results demonstrate that our likelihood adaptation method greatly improves the estimation performance of PFs in a high-noise environment. In addition, the adaptive PFs are highly adaptable to the environment without imposing extra computational complexity.
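
    A minimal bootstrap particle filter for a one-dimensional tracking problem is sketched below to show the predict/weight/resample cycle into which the proposed likelihood adaptation would plug; the motion and measurement models, noise levels, and particle count are assumptions, and the belief-factor adaptation itself is not implemented.

```python
# Minimal bootstrap particle filter for a 1-D random-walk state observed in
# Gaussian noise, showing the predict/weight/resample cycle into which a
# likelihood adaptation step would plug. The motion/measurement models, noise
# levels, and particle count are assumptions; the belief-factor adaptation of
# the paper is not implemented.
import numpy as np

def bootstrap_pf(observations, n_particles=500, q_std=0.5, r_std=1.0, seed=0):
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)          # initial prior
    estimates = []
    for z in observations:
        # predict: propagate particles through the random-walk motion model
        particles = particles + rng.normal(0.0, q_std, n_particles)
        # weight: Gaussian measurement likelihood
        w = np.exp(-0.5 * ((z - particles) / r_std) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * particles))             # posterior mean
        # resample (multinomial) to avoid weight degeneracy
        particles = particles[rng.choice(n_particles, size=n_particles, p=w)]
    return np.array(estimates)

rng = np.random.default_rng(3)
truth = np.cumsum(rng.normal(0.0, 0.5, 100))                # hypothetical track
est = bootstrap_pf(truth + rng.normal(0.0, 1.0, 100))       # noisy observations
```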

  12. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Directory of Open Access Journals (Sweden)

    Saerom Park

    Full Text Available Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models (neural networks, Bayesian neural network, Gaussian process, and support vector regression) to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives in reducing transaction costs by considerably improving prediction performance.

  13. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Science.gov (United States)

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

    Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models (neural networks, Bayesian neural network, Gaussian process, and support vector regression) to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives in reducing transaction costs by considerably improving prediction performance.

  14. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test

  15. Nonparametric estimation of a convex bathtub-shaped hazard function.

    Science.gov (United States)

    Jankowski, Hanna K; Wellner, Jon A

    2009-11-01

    In this paper, we study the nonparametric maximum likelihood estimator (MLE) of a convex hazard function. We show that the MLE is consistent and converges at a local rate of $n^{2/5}$ at points $x_0$ where the true hazard function is positive and strictly convex. Moreover, we establish the pointwise asymptotic distribution theory of our estimator under these same assumptions. One notable feature of the nonparametric MLE studied here is that no arbitrary choice of tuning parameter (or complicated data-adaptive selection of the tuning parameter) is required.

  16. Determining the Mass of Kepler-78b with Nonparametric Gaussian Process Estimation

    Science.gov (United States)

    Grunblatt, Samuel Kai; Howard, Andrew; Haywood, Raphaëlle

    2016-01-01

    Kepler-78b is a transiting planet that is 1.2 times the radius of Earth and orbits a young, active K dwarf every 8 hr. The mass of Kepler-78b has been independently reported by two teams based on radial velocity (RV) measurements using the HIRES and HARPS-N spectrographs. Due to the active nature of the host star, a stellar activity model is required to distinguish and isolate the planetary signal in RV data. Whereas previous studies tested parametric stellar activity models, we modeled this system using nonparametric Gaussian process (GP) regression. We produced a GP regression of relevant Kepler photometry. We then use the posterior parameter distribution for our photometric fit as a prior for our simultaneous GP + Keplerian orbit models of the RV data sets. We tested three simple kernel functions for our GP regressions. Based on a Bayesian likelihood analysis, we selected a quasi-periodic kernel model with GP hyperparameters coupled between the two RV data sets, giving a Doppler amplitude of 1.86 ± 0.25 m s$^{-1}$ and supporting our belief that the correlated noise we are modeling is astrophysical. The corresponding mass of $1.87^{+0.27}_{-0.26}$ M$_E$ is consistent with that measured in previous studies, and more robust due to our nonparametric signal estimation. Based on our mass and the radius measurement from transit photometry, Kepler-78b has a bulk density of $6.0^{+1.9}_{-1.4}$ g cm$^{-3}$. We estimate that Kepler-78b is 32% ± 26% iron using a two-component rock-iron model. This is consistent with an Earth-like composition, with uncertainty spanning Moon-like to Mercury-like compositions.
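
    A quasi-periodic GP regression of the kind described can be sketched with scikit-learn's kernels (a periodic component damped by a squared-exponential envelope plus a jitter term); the kernel hyperparameters and the synthetic activity signal below are assumptions, not the published Kepler-78 fit.

```python
# Quasi-periodic Gaussian-process regression of the kind described above, via
# scikit-learn: a periodic kernel (rotation period) damped by a squared-
# exponential envelope, plus a white-noise jitter term. All hyperparameters and
# the synthetic signal are assumptions, not the published Kepler-78 fit.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (ConstantKernel, ExpSineSquared,
                                              RBF, WhiteKernel)

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 30, 80))[:, None]   # observation times (days)
y = 2.0 * np.sin(2 * np.pi * t.ravel() / 12.5) + rng.normal(0, 0.3, 80)

kernel = (ConstantKernel(1.0) * ExpSineSquared(length_scale=1.0, periodicity=12.5)
          * RBF(length_scale=20.0) + WhiteKernel(noise_level=0.1))
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)

t_pred = np.linspace(0, 30, 300)[:, None]
mean, std = gp.predict(t_pred, return_std=True)   # smoothed activity signal
```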

  17. Peirce on Educational Beliefs

    Science.gov (United States)

    Strand, Torill

    2005-01-01

    This article contends that Charles Sanders Peirce (1839-1914) may enhance our understanding of educational beliefs and that Peirce's logic may be a tool to distinguish between a dogmatic and a pragmatic justification of such beliefs. The first part of the article elaborates on Peirce's comprehension of beliefs as mediated, socially situated and…

  18. Nonparametric Cointegration Analysis of Fractional Systems With Unknown Integration Orders

    DEFF Research Database (Denmark)

    Nielsen, Morten Ørregaard

    2009-01-01

    In this paper a nonparametric variance ratio testing approach is proposed for determining the number of cointegrating relations in fractionally integrated systems. The test statistic is easily calculated without prior knowledge of the integration order of the data, the strength of the cointegrating...

  19. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move but that this dependence vanishes after 2-3 years.

  20. A non-parametric model for the cosmic velocity field

    NARCIS (Netherlands)

    Branchini, E; Teodoro, L; Frenk, CS; Schmoldt, [No Value; Efstathiou, G; White, SDM; Saunders, W; Sutherland, W; Rowan-Robinson, M; Keeble, O; Tadros, H; Maddox, S; Oliver, S

    1999-01-01

    We present a self-consistent non-parametric model of the local cosmic velocity field derived from the distribution of IRAS galaxies in the PSCz redshift survey. The survey has been analysed using two independent methods, both based on the assumptions of gravitational instability and linear biasing.

  1. Influence of test and person characteristics on nonparametric appropriateness measurement

    NARCIS (Netherlands)

    Meijer, Rob R.; Molenaar, Ivo W.; Sijtsma, Klaas

    1994-01-01

    Appropriateness measurement in nonparametric item response theory modeling is affected by the reliability of the items, the test length, the type of aberrant response behavior, and the percentage of aberrant persons in the group. The percentage of simulees defined a priori as aberrant responders tha

  2. Influence of Test and Person Characteristics on Nonparametric Appropriateness Measurement

    NARCIS (Netherlands)

    Meijer, Rob R; Molenaar, Ivo W; Sijtsma, Klaas

    1994-01-01

    Appropriateness measurement in nonparametric item response theory modeling is affected by the reliability of the items, the test length, the type of aberrant response behavior, and the percentage of aberrant persons in the group. The percentage of simulees defined a priori as aberrant responders tha

  3. Estimation of Spatial Dynamic Nonparametric Durbin Models with Fixed Effects

    Science.gov (United States)

    Qian, Minghui; Hu, Ridong; Chen, Jianwei

    2016-01-01

    Spatial panel data models have been widely studied and applied in both scientific and social science disciplines, especially in the analysis of spatial influence. In this paper, we consider the spatial dynamic nonparametric Durbin model (SDNDM) with fixed effects, which takes the nonlinear factors into account based on the spatial dynamic panel…

  4. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our results can be viewed as a nonstationary extension of some well-known uniform consistency results for stationary time series.

  5. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    With reference to a specific data set, we consider how to perform a flexible non-parametric Bayesian analysis of an inhomogeneous point pattern modelled by a Markov point process, with a location dependent first order term and pairwise interaction only. A priori we assume that the first order term...

  6. Investigating the cultural patterns of corruption: A nonparametric analysis

    OpenAIRE

    Halkos, George; Tzeremes, Nickolaos

    2011-01-01

    By using a sample of 77 countries our analysis applies several nonparametric techniques in order to reveal the link between national culture and corruption. Based on Hofstede’s cultural dimensions and the corruption perception index, the results reveal that countries with higher levels of corruption tend to have higher power distance and collectivism values in their society.

  7. Coverage Accuracy of Confidence Intervals in Nonparametric Regression

    Institute of Scientific and Technical Information of China (English)

    Song-xi Chen; Yong-song Qin

    2003-01-01

    Point-wise confidence intervals for a nonparametric regression function with random design points are considered. The confidence intervals are those based on the traditional normal approximation and the empirical likelihood. Their coverage accuracy is assessed by developing the Edgeworth expansions for the coverage probabilities. It is shown that the empirical likelihood confidence intervals are Bartlett correctable.

  8. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction but data rarely satisfies testable conditions. To overcome this we provide a way to estimate homothetic efficiency of consump

  9. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move...

  10. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference making as few assumptions about restrictive parametric models as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to the following two aspects. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models by one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  11. Wave Propagation

    CERN Document Server

    Ferrarese, Giorgio

    2011-01-01

    Lectures: A. Jeffrey: Lectures on nonlinear wave propagation.- Y. Choquet-Bruhat: Ondes asymptotiques.- G. Boillat: Urti.- Seminars: D. Graffi: Sulla teoria dell'ottica non-lineare.- G. Grioli: Sulla propagazione del calore nei mezzi continui.- T. Manacorda: Onde nei solidi con vincoli interni.- T. Ruggeri: "Entropy principle" and main field for a non linear covariant system.- B. Straughan: Singular surfaces in dipolar materials and possible consequences for continuum mechanics

  12. Strategic Belief Management

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul

    While (managerial) beliefs are central to many aspects of strategic organization, interactive beliefs are almost entirely neglected, save for some game theory treatments. In an increasingly connected and networked economy, firms confront coordination problems that arise because of network effects. The capability to manage beliefs will increasingly be a strategic one, a key source of wealth creation, and a key research area for strategic organization scholars.

  13. Nonparametric Estimation of Mean and Variance and Pricing of Securities Nonparametric Estimation of Mean and Variance and Pricing of Sec

    Directory of Open Access Journals (Sweden)

    Akhtar R. Siddique

    2000-03-01

    Full Text Available This paper develops a filtering-based framework of non-parametric estimation of parameters of a diffusion process from the conditional moments of discrete observations of the process. This method is implemented for interest rate data in the Eurodollar and long term bond markets. The resulting estimates are then used to form non-parametric univariate and bivariate interest rate models and compute prices for the short term Eurodollar interest rate futures options and long term discount bonds. The bivariate model produces prices substantially closer to the market prices.
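
    In the same spirit (though not the filtering-based construction of the paper), drift and diffusion functions can be estimated nonparametrically from conditional moments of discrete increments with a kernel smoother, as sketched below; the Ornstein-Uhlenbeck sample path, bandwidth, and grid are assumptions.

```python
# Kernel (Nadaraya-Watson) estimates of drift and diffusion from conditional
# moments of discrete increments: mu(x) ~ E[dX | X=x]/dt and
# sigma^2(x) ~ E[dX^2 | X=x]/dt. Generic moment-based illustration, not the
# filtering-based construction of the paper; path, bandwidth, and grid are
# assumptions.
import numpy as np

def drift_diffusion_nw(x, dt, grid, bandwidth):
    dx, xl = np.diff(x), x[:-1]
    # Gaussian kernel weights between evaluation points and lagged observations
    w = np.exp(-0.5 * ((grid[:, None] - xl[None, :]) / bandwidth) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return (w @ dx) / dt, (w @ dx**2) / dt        # drift, squared diffusion

# Hypothetical short-rate path simulated from an Ornstein-Uhlenbeck model
rng = np.random.default_rng(5)
dt, n = 1 / 252, 5000
r = np.empty(n)
r[0] = 0.05
for i in range(1, n):
    r[i] = r[i - 1] + 2.0 * (0.05 - r[i - 1]) * dt + 0.02 * np.sqrt(dt) * rng.standard_normal()
grid = np.linspace(r.min(), r.max(), 50)
mu_hat, sigma2_hat = drift_diffusion_nw(r, dt, grid, bandwidth=0.005)
```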

  14. Comparison of Rank Analysis of Covariance and Nonparametric Randomized Blocks Analysis.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    The relative power of three possible experimental designs under the condition that data is to be analyzed by nonparametric techniques; the comparison of the power of each nonparametric technique to its parametric analogue; and the comparison of relative powers using nonparametric and parametric techniques are discussed. The three nonparametric…

  15. Nonparametric inference procedures for multistate life table analysis.

    Science.gov (United States)

    Dow, M M

    1985-01-01

    Recent generalizations of the classical single state life table procedures to the multistate case provide the means to analyze simultaneously the mobility and mortality experience of 1 or more cohorts. This paper examines fairly general nonparametric combinatorial matrix procedures, known as quadratic assignment, as an analysis technique for various transitional patterns commonly generated by cohorts over the life cycle course. To some degree, the output from a multistate life table analysis suggests inference procedures. In his discussion of multistate life table construction features, the author focuses on the matrix formulation of the problem. He then presents several examples of the proposed nonparametric procedures. Data for the mobility and life expectancies at birth matrices come from the 458 member Cayo Santiago rhesus monkey colony. The author's matrix combinatorial approach to hypotheses testing may prove to be a useful inferential strategy in several multidimensional demographic areas.

  16. Non-parametric estimation of Fisher information from real data

    CERN Document Server

    Shemesh, Omri Har; Miñano, Borja; Hoekstra, Alfons G; Sloot, Peter M A

    2015-01-01

    The Fisher Information matrix is a widely used measure for applications ranging from statistical inference, information geometry, experiment design, to the study of criticality in biological systems. Yet there is no commonly accepted non-parametric algorithm to estimate it from real data. In this rapid communication we show how to accurately estimate the Fisher information in a nonparametric way. We also develop a numerical procedure to minimize the errors by choosing the interval of the finite difference scheme necessary to compute the derivatives in the definition of the Fisher information. Our method uses the recently published "Density Estimation using Field Theory" algorithm to compute the probability density functions for continuous densities. We use the Fisher information of the normal distribution to validate our method and as an example we compute the temperature component of the Fisher Information Matrix in the two dimensional Ising model and show that it obeys the expected relation to the heat capa...

  17. International Conference on Robust Rank-Based and Nonparametric Methods

    CERN Document Server

    McKean, Joseph

    2016-01-01

    The contributors to this volume include many of the distinguished researchers in this area. Many of these scholars have collaborated with Joseph McKean to develop underlying theory for these methods, obtain small sample corrections, and develop efficient algorithms for their computation. The papers cover the scope of the area, including robust nonparametric rank-based procedures through Bayesian and big data rank-based analyses. Areas of application include biostatistics and spatial areas. Over the last 30 years, robust rank-based and nonparametric methods have developed considerably. These procedures generalize traditional Wilcoxon-type methods for one- and two-sample location problems. Research into these procedures has culminated in complete analyses for many of the models used in practice including linear, generalized linear, mixed, and nonlinear models. Settings are both multivariate and univariate. With the development of R packages in these areas, computation of these procedures is easily shared with r...

  18. Nonparametric instrumental regression with non-convex constraints

    Science.gov (United States)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.

  19. Combined parametric-nonparametric identification of block-oriented systems

    CERN Document Server

    Mzyk, Grzegorz

    2014-01-01

    This book considers a problem of block-oriented nonlinear dynamic system identification in the presence of random disturbances. This class of systems includes various interconnections of linear dynamic blocks and static nonlinear elements, e.g., Hammerstein system, Wiener system, Wiener-Hammerstein ("sandwich") system and additive NARMAX systems with feedback. Interconnecting signals are not accessible for measurement. The combined parametric-nonparametric algorithms, proposed in the book, can be selected depending on the prior knowledge of the system and signals. Most of them are based on the decomposition of the complex system identification task into simpler local sub-problems by using non-parametric (kernel or orthogonal) regression estimation. In the parametric stage, the generalized least squares or the instrumental variables technique is commonly applied to cope with correlated excitations. Limit properties of the algorithms have been shown analytically and illustrated in simple experiments.

  20. Estimation of Stochastic Volatility Models by Nonparametric Filtering

    DEFF Research Database (Denmark)

    Kanaya, Shin; Kristensen, Dennis

    2016-01-01

    A two-step estimation method of stochastic volatility models is proposed: In the first step, we nonparametrically estimate the (unobserved) instantaneous volatility process. In the second step, standard estimation methods for fully observed diffusion processes are employed, but with the filtered/estimated volatility process replacing the latent process. Our estimation strategy is applicable to both parametric and nonparametric stochastic volatility models, and can handle both jumps and market microstructure noise. The resulting estimators of the stochastic volatility model will carry additional biases and variances due to the first-step estimation, but under regularity conditions we show that these vanish asymptotically and our estimators inherit the asymptotic properties of the infeasible estimators based on observations of the volatility process. A simulation study examines the finite-sample properties...

  1. Nonparametric Regression Estimation for Multivariate Null Recurrent Processes

    Directory of Open Access Journals (Sweden)

    Biqing Cai

    2015-04-01

    Full Text Available This paper discusses nonparametric kernel regression with the regressor being a $d$-dimensional $\beta$-null recurrent process in the presence of conditional heteroscedasticity. We show that the mean function estimator is consistent with convergence rate $\sqrt{n(T)h^{d}}$, where $n(T)$ is the number of regenerations for a $\beta$-null recurrent process, and that the limiting distribution (with proper normalization) is normal. Furthermore, we show that the two-step estimator for the volatility function is consistent. The finite sample performance of the estimate is quite reasonable when the leave-one-out cross validation method is used for bandwidth selection. We apply the proposed method to study the relationship of the Federal funds rate with 3-month and 5-year T-bill rates and discover the existence of nonlinearity in the relationship. Furthermore, the in-sample and out-of-sample performance of the nonparametric model is far better than that of the linear model.
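
    The kernel estimator referenced above can be sketched in a few lines. The following Python snippet is an illustration, not the paper's implementation: it computes a Nadaraya-Watson estimate of the mean function and selects the bandwidth by leave-one-out cross validation; the Gaussian kernel, the bandwidth grid and the simulated data are assumptions made only for the example.

    import numpy as np

    def nw_fit(x_train, y_train, x_eval, h):
        """Nadaraya-Watson estimate of E[Y | X = x] with a Gaussian kernel."""
        d = (x_eval[:, None] - x_train[None, :]) / h
        w = np.exp(-0.5 * d ** 2)
        return (w @ y_train) / w.sum(axis=1)

    def loo_cv_bandwidth(x, y, grid):
        """Pick the bandwidth minimising the leave-one-out squared prediction error."""
        errors = []
        for h in grid:
            preds = np.empty_like(y, dtype=float)
            for i in range(len(x)):
                mask = np.arange(len(x)) != i
                preds[i] = nw_fit(x[mask], y[mask], x[i:i + 1], h)[0]
            errors.append(np.mean((y - preds) ** 2))
        return grid[int(np.argmin(errors))]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = rng.uniform(-2, 2, 200)
        y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(200)
        h = loo_cv_bandwidth(x, y, np.linspace(0.05, 1.0, 20))
        print("selected bandwidth:", h)
        print("estimated mean function at x = 0:", nw_fit(x, y, np.array([0.0]), h)[0])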

  2. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function, most often the Cobb-Douglas or the Translog form. An unsuitable functional form can lead to biased results, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used to estimate the production function without specifying a functional form. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function is consistent with the "true" relationship between the inputs and the output in our data set. We solve this problem by using non-parametric regression. This approach delivers reasonable results, which are on average not too different from the results of the parametric estimation.

  3. Right-Censored Nonparametric Regression: A Comparative Simulation Study

    Directory of Open Access Journals (Sweden)

    Dursun Aydın

    2016-11-01

    Full Text Available This paper studies the operation of selection criteria for right-censored nonparametric regression using smoothing splines. In order to transform the response variable into a variable that incorporates the right-censorship, we used the Kaplan-Meier weights proposed by [1] and [2]. The major problem in the smoothing spline method is to determine a smoothing parameter that yields nonparametric estimates of the regression function. In this study, the mentioned parameter is chosen based on censored data by means of criteria such as the improved Akaike information criterion (AICc), the Bayesian (or Schwarz) information criterion (BIC) and generalized cross-validation (GCV). For this purpose, a Monte Carlo simulation study is carried out to illustrate which selection criterion gives the best estimation for censored data.

  4. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb-Douglas and the Translog are the most popular. An unsuitable functional form results not only in biased parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used to estimate the production function without specifying a functional form. We illustrate this by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function is consistent with the "true" relationship between the inputs and the output.

  5. Poverty and life cycle effects: A nonparametric analysis for Germany

    OpenAIRE

    Stich, Andreas

    1996-01-01

    Most empirical studies on poverty consider the extent of poverty either for the entire society or for separate groups like elderly people. However, these papers do not show what the situation looks like for persons of a certain age. In this paper poverty measures depending on age are derived using the joint density of income and age. The density is nonparametrically estimated by weighted Gaussian kernel density estimation. Applying the conditional density of income to several poverty measures ...

  6. Nonparametric estimation of Fisher information from real data

    Science.gov (United States)

    Har-Shemesh, Omri; Quax, Rick; Miñano, Borja; Hoekstra, Alfons G.; Sloot, Peter M. A.

    2016-02-01

    The Fisher information matrix (FIM) is a widely used measure for applications including statistical inference, information geometry, experiment design, and the study of criticality in biological systems. The FIM is defined for a parametric family of probability distributions and its estimation from data follows one of two paths: either the distribution is assumed to be known and the parameters are estimated from the data or the parameters are known and the distribution is estimated from the data. We consider the latter case which is applicable, for example, to experiments where the parameters are controlled by the experimenter and a complicated relation exists between the input parameters and the resulting distribution of the data. Since we assume that the distribution is unknown, we use a nonparametric density estimation on the data and then compute the FIM directly from that estimate using a finite-difference approximation to estimate the derivatives in its definition. The accuracy of the estimate depends on both the method of nonparametric estimation and the difference Δθ between the densities used in the finite-difference formula. We develop an approach for choosing the optimal parameter difference Δθ based on large deviations theory and compare two nonparametric density estimation methods, the Gaussian kernel density estimator and a novel density estimation using field theory method. We also compare these two methods to a recently published approach that circumvents the need for density estimation by estimating a nonparametric f-divergence and using it to approximate the FIM. We use the Fisher information of the normal distribution to validate our method and as a more involved example we compute the temperature component of the FIM in the two-dimensional Ising model and show that it obeys the expected relation to the heat capacity and therefore peaks at the phase transition at the correct critical temperature.
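
    A minimal Python sketch of the finite-difference idea described above follows. It is an illustration under assumed settings (Gaussian kernel density estimation via SciPy, a normal location family for validation, and an arbitrary step size), not the authors' code.

    import numpy as np
    from scipy.stats import gaussian_kde

    def fisher_information_fd(sampler, theta, dtheta, n_samples=20000, seed=0):
        """Finite-difference, KDE-based Fisher information estimate at `theta`.

        sampler(theta, n, rng) must return n i.i.d. draws from p(x | theta).
        """
        rng = np.random.default_rng(seed)
        x_minus = sampler(theta - dtheta, n_samples, rng)
        x_center = sampler(theta, n_samples, rng)
        x_plus = sampler(theta + dtheta, n_samples, rng)

        kde_minus = gaussian_kde(x_minus)
        kde_plus = gaussian_kde(x_plus)

        # Score d log p / d theta approximated by a central difference of the two
        # estimated log-densities, evaluated at samples drawn from p(. | theta).
        score = (np.log(kde_plus(x_center)) - np.log(kde_minus(x_center))) / (2.0 * dtheta)
        return np.mean(score ** 2)

    if __name__ == "__main__":
        # Validation on the normal location family, where I(mu) = 1 / sigma^2.
        sigma = 2.0
        normal_sampler = lambda mu, n, rng: rng.normal(mu, sigma, size=n)
        # Expected value is about 1 / sigma^2 = 0.25, up to estimation error.
        print(fisher_information_fd(normal_sampler, theta=0.0, dtheta=0.3))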

  7. ANALYSIS OF TIED DATA: AN ALTERNATIVE NON-PARAMETRIC APPROACH

    Directory of Open Access Journals (Sweden)

    I. C. A. OYEKA

    2012-02-01

    Full Text Available This paper presents a non-parametric statistical method of analyzing two-sample data that makes provision for the possibility of ties in the data. A test statistic is developed and shown to be free of the effect of any possible ties in the data. An illustrative example is provided and the method is shown to compare favourably with its competitor, the Mann-Whitney test, and to be more powerful than the latter when there are ties.

  8. Nonparametric test for detecting change in distribution with panel data

    CERN Document Server

    Pommeret, Denys; Ghattas, Badih

    2011-01-01

    This paper considers the problem of comparing two processes with panel data. A nonparametric test is proposed for detecting a monotone change in the link between the two process distributions. The test statistic is of CUSUM type, based on the empirical distribution functions. The asymptotic distribution of the proposed statistic is derived and its finite sample property is examined by bootstrap procedures through Monte Carlo simulations.

  9. A Bayesian nonparametric method for prediction in EST analysis

    Directory of Open Access Journals (Sweden)

    Prünster Igor

    2007-09-01

    Full Text Available Abstract Background Expressed sequence tag (EST) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed with sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: (a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; (b) the number of new unique genes to be observed in a future sample; (c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample.
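
    As an illustration of the kind of prediction described above, the following Python snippet evaluates the posterior expected number of new genes under a two-parameter Poisson-Dirichlet (Pitman-Yor) prior, a common choice in this Bayesian nonparametric setting. The closed-form expression, the parameter values and the sample sizes are assumptions for the example, not necessarily the estimator of this particular paper.

    from math import exp

    from scipy.special import gammaln

    def expected_new_genes(n, j, m, sigma, theta):
        """Posterior expected number of new distinct genes in a further sample of
        size m, given n sequenced reads containing j distinct genes, under a
        Poisson-Dirichlet (sigma, theta) prior."""
        def log_rising(a, k):
            # Log of the rising factorial (a)_k = Gamma(a + k) / Gamma(a).
            return gammaln(a + k) - gammaln(a)
        log_ratio = log_rising(theta + n + sigma, m) - log_rising(theta + n, m)
        return (j + theta / sigma) * (exp(log_ratio) - 1.0)

    if __name__ == "__main__":
        # Hypothetical library: 1000 reads showing 300 distinct genes; how many new
        # genes would a further 500 reads be expected to reveal?
        print(expected_new_genes(n=1000, j=300, m=500, sigma=0.6, theta=100.0))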

  10. Fusion of Hard and Soft Information in Nonparametric Density Estimation

    Science.gov (United States)

    2015-06-10

    This report considers univariate density estimation in situations when the sample (hard information) is supplemented by "soft" information about the random phenomenon, and develops estimation exploiting, in concert, hard and soft information. Although our development, theoretical and numerical, makes no distinction based on sample...

  11. Nonparametric estimation for hazard rate monotonously decreasing system

    Institute of Scientific and Technical Information of China (English)

    Han Fengyan; Li Weisong

    2005-01-01

    Estimation of density and hazard rate is very important to the reliability analysis of a system. In order to estimate the density and hazard rate of a hazard rate monotonously decreasing system, a new nonparametric estimator is put forward. The estimator is based on the kernel function method and optimum algorithm. Numerical experiment shows that the method is accurate enough and can be used in many cases.

  12. Non-parametric versus parametric methods in environmental sciences

    Directory of Open Access Journals (Sweden)

    Muhammad Riaz

    2016-01-01

    Full Text Available This report highlights the importance of considering the background assumptions required for the analysis of real datasets in different disciplines. We provide a comparative discussion of parametric methods (which depend on distributional assumptions, such as normality) relative to non-parametric methods (which are free from many distributional assumptions). We have chosen a real dataset from environmental sciences (one of the application areas). The findings may be extended to other disciplines in the same spirit.

  13. Optimal Belief Approximation

    CERN Document Server

    Leike, Reimar H

    2016-01-01

    In Bayesian statistics probability distributions express beliefs. However, for many problems the beliefs cannot be computed analytically and approximations of beliefs are needed. We seek a ranking function that quantifies how "embarrassing" it is to communicate a given approximation. We show that there is only one ranking under the requirements that (1) the best ranked approximation is the non-approximated belief and (2) that the ranking judges approximations only by their predictions for actual outcomes. We find that this ranking is equivalent to the Kullback-Leibler divergence that is frequently used in the literature. However, there seems to be confusion about the correct order in which its functional arguments, the approximated and non-approximated beliefs, should be used. We hope that our elementary derivation settles the apparent confusion. We show for example that when approximating beliefs with Gaussian distributions the optimal approximation is given by moment matching. This is in contrast to many su...

  14. Belief Elicitation in Experiments

    DEFF Research Database (Denmark)

    Blanco, Mariana; Engelmann, Dirk; Koch, Alexander

    Belief elicitation in economics experiments usually relies on paying subjects according to the accuracy of stated beliefs in addition to payments for other decisions. Such incentives, however, allow risk-averse subjects to hedge with their stated beliefs against adverse outcomes of other decisions in the experiment. This raises two questions: (i) can we trust the existing belief elicitation results, (ii) can we avoid potential hedging confounds? Our results instill confidence regarding both issues. We propose an experimental design that eliminates hedging opportunities, and use this to test for the empirical...

  15. Preschool Teachers' Attitudes and Beliefs Toward Science

    Science.gov (United States)

    Lloyd, Sharon Henry

    In the United States, a current initiative, Advancing Active STEM Education for Our Youngest Learners, aims to advance science, technology, engineering, and math (STEM) education in early childhood. The purpose of this study was to understand preschool teachers' proficiency with science and address the problem of whether or not science learning opportunities are provided to young children based on teachers' attitudes and beliefs. A theoretical framework for establishing teachers' attitudes toward science developed by van Aalderen-Smeets, van der Molen, and Asma, along with Bandura's theory of self-efficacy were the foundations for this research. Research questions explored preschool teachers' attitudes and beliefs toward science in general and how they differed based on education level and years of preschool teaching experience. Descriptive comparative data were collected from 48 preschool teacher participants using an online format with a self-reported measure and were analyzed using nonparametric tests to describe differences between groups based on identified factors of teacher comfort, child benefit, and challenges. Results indicated that the participants believed that early childhood science is developmentally appropriate and that young children benefit from science instruction through improved school-readiness skills. Preschool teachers with a state credential or an associate's degree and more teaching experience had more teacher comfort toward science based on attitudes and beliefs surveyed. The data indicated participating preschool teachers experienced few challenges in teaching science. The study may support positive social change through increased awareness of strengths and weaknesses of preschool teachers for the development of effective science professional development. Science is a crucial component of school-readiness skills, laying a foundation for success in later grades.
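
    Purely as an illustration of the nonparametric group comparisons mentioned above (not the study's actual analysis or data), the following Python snippet compares survey scores across teacher groups with SciPy's Mann-Whitney U and Kruskal-Wallis tests; the group labels and values are fabricated placeholders.

    import numpy as np
    from scipy.stats import mannwhitneyu, kruskal

    rng = np.random.default_rng(42)
    # Placeholder Likert-style comfort-with-science scores (1-5) for three groups.
    credential = rng.integers(1, 6, size=16)
    associates = rng.integers(1, 6, size=17)
    bachelors = rng.integers(1, 6, size=15)

    # Two-group comparison (e.g., credential holders vs. bachelor's degree holders).
    u_stat, u_p = mannwhitneyu(credential, bachelors, alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat:.1f}, p = {u_p:.3f}")

    # Omnibus comparison across all three education-level groups.
    h_stat, h_p = kruskal(credential, associates, bachelors)
    print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {h_p:.3f}")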

  16. a Multivariate Downscaling Model for Nonparametric Simulation of Daily Flows

    Science.gov (United States)

    Molina, J. M.; Ramirez, J. A.; Raff, D. A.

    2011-12-01

    A multivariate, stochastic nonparametric framework for stepwise disaggregation of seasonal runoff volumes to daily streamflow is presented. The downscaling process is conditional on volumes of spring runoff and large-scale ocean-atmosphere teleconnections and includes a two-level cascade scheme: seasonal-to-monthly disaggregation first followed by monthly-to-daily disaggregation. The non-parametric and assumption-free character of the framework allows consideration of the random nature and nonlinearities of daily flows, which parametric models are unable to account for adequately. This paper examines statistical links between decadal/interannual climatic variations in the Pacific Ocean and hydrologic variability in US northwest region, and includes a periodicity analysis of climate patterns to detect coherences of their cyclic behavior in the frequency domain. We explore the use of such relationships and selected signals (e.g., north Pacific gyre oscillation, southern oscillation, and Pacific decadal oscillation indices, NPGO, SOI and PDO, respectively) in the proposed data-driven framework by means of a combinatorial approach with the aim of simulating improved streamflow sequences when compared with disaggregated series generated from flows alone. A nearest neighbor time series bootstrapping approach is integrated with principal component analysis to resample from the empirical multivariate distribution. A volume-dependent scaling transformation is implemented to guarantee the summability condition. In addition, we present a new and simple algorithm, based on nonparametric resampling, that overcomes the common limitation of lack of preservation of historical correlation between daily flows across months. The downscaling framework presented here is parsimonious in parameters and model assumptions, does not generate negative values, and produces synthetic series that are statistically indistinguishable from the observations. We present evidence showing that both
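
    A highly simplified Python sketch of the nearest-neighbor resampling step described above is given below. The neighborhood size, the rank-based resampling weights and the proportional rescaling that enforces the summability condition are assumptions chosen for illustration, not the authors' exact scheme.

    import numpy as np

    def knn_disaggregate(target_volume, hist_volumes, hist_daily, k=5, rng=None):
        """Resample a daily pattern from the k historical months most similar in
        volume, then rescale it so the daily values sum to the target volume.

        hist_volumes: shape (n_months,); hist_daily: shape (n_months, n_days).
        """
        rng = rng or np.random.default_rng()
        # Rank historical months by similarity of their volumes to the target.
        order = np.argsort(np.abs(hist_volumes - target_volume))[:k]
        # Sample among the k nearest neighbors with weights decreasing in rank.
        weights = 1.0 / np.arange(1, k + 1)
        pick = rng.choice(order, p=weights / weights.sum())
        pattern = hist_daily[pick]
        # Volume-dependent scaling guarantees the daily flows sum to the target.
        return pattern * (target_volume / pattern.sum())

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        hist_daily = rng.gamma(2.0, 3.0, size=(30, 30))   # 30 historical months
        hist_volumes = hist_daily.sum(axis=1)
        daily = knn_disaggregate(120.0, hist_volumes, hist_daily, rng=rng)
        print(daily.sum())                                # 120.0 by construction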

  17. Panel data nonparametric estimation of production risk and risk preferences

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We apply nonparametric panel data kernel regression to investigate production risk, output price uncertainty, and risk attitudes of Polish dairy farms based on a firm-level unbalanced panel data set that covers the period 2004–2010. We compare different model specifications and different approaches for obtaining firm-specific measures of risk attitudes. We found that Polish dairy farmers are risk averse regarding production risk and price uncertainty. According to our results, Polish dairy farmers perceive the production risk as being more significant than the risk related to output price...

  18. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  19. Nonparametric statistics a step-by-step approach

    CERN Document Server

    Corder, Gregory W

    2014-01-01

    "…a very useful resource for courses in nonparametric statistics in which the emphasis is on applications rather than on theory.  It also deserves a place in libraries of all institutions where introductory statistics courses are taught."" -CHOICE This Second Edition presents a practical and understandable approach that enhances and expands the statistical toolset for readers. This book includes: New coverage of the sign test and the Kolmogorov-Smirnov two-sample test in an effort to offer a logical and natural progression to statistical powerSPSS® (Version 21) software and updated screen ca

  20. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  1. Nonparametric statistical structuring of knowledge systems using binary feature matches

    DEFF Research Database (Denmark)

    Mørup, Morten; Glückstad, Fumiko Kano; Herlau, Tue

    2014-01-01

    Structuring knowledge systems with binary features is often based on imposing a similarity measure and clustering objects according to this similarity. Unfortunately, such analyses can be heavily influenced by the choice of similarity measure. Furthermore, it is unclear at which level clusters have statistical support and how this approach generalizes to the structuring and alignment of knowledge systems. We propose a non-parametric Bayesian generative model for structuring binary feature data that does not depend on a specific choice of similarity measure. We jointly model all combinations of binary...

  2. Testing for a constant coefficient of variation in nonparametric regression

    OpenAIRE

    Dette, Holger; Marchlewski, Mareen; Wagener, Jens

    2010-01-01

    In the common nonparametric regression model $Y_i = m(X_i) + \sigma(X_i)\varepsilon_i$ we consider the problem of testing the hypothesis that the coefficient of variation, i.e. the quotient of the scale and location functions, is constant. The test is based on a comparison of the observations $Y_i/\hat{\sigma}(X_i)$ with their mean by a smoothed empirical process, where $\hat{\sigma}$ denotes the local linear estimate of the scale function. We show weak convergence of a centered version of this process to a Gaussian process under the null ...

  3. Generative Temporal Modelling of Neuroimaging - Decomposition and Nonparametric Testing

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff

    The goal of this thesis is to explore two improvements for functional magnetic resonance imaging (fMRI) analysis; namely our proposed decomposition method and an extension to the non-parametric testing framework. Analysis of fMRI allows researchers to investigate the functional processes of the brain, and provides insight into neuronal coupling during mental processes or tasks. The decomposition method is a Gaussian process-based independent components analysis (GPICA), which incorporates a temporal dependency in the sources. A hierarchical model specification is used, featuring both...

  4. Using Mathematica to build Non-parametric Statistical Tables

    Directory of Open Access Journals (Sweden)

    Gloria Perez Sainz de Rozas

    2003-01-01

    Full Text Available In this paper, I present computational procedures to obtain statistical tables: the tables of the asymptotic distribution and the exact distribution of the Kolmogorov-Smirnov statistic Dn for one population, the table of the distribution of the runs statistic R, the table of the distribution of the Wilcoxon signed-rank statistic W+ and the table of the distribution of the Mann-Whitney statistic Ux, using Mathematica, Version 3.9 under Windows 98. I think that this is an interesting question because many statistical packages give only the asymptotic significance level in statistical tests, and with these procedures one can easily calculate the exact significance levels and the left-tail and right-tail probabilities of non-parametric distributions. I have used Mathematica to make these calculations because one can use its symbolic language to solve recursion relations. It is very easy to generate the format of the tables, and it is possible to obtain any table of the mentioned non-parametric distributions with any precision, not only with the standard parameters most used in Statistics, and without transcription mistakes. Furthermore, using similar procedures, we can generate tables for the following distribution functions: Binomial, Poisson, Hypergeometric, Normal, Chi-Square, T-Student, F-Snedecor, Geometric, Gamma and Beta.
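
    The recursion-based construction described above can be reproduced in any language with exact integer arithmetic. The following Python sketch (a hypothetical analogue, not the paper's Mathematica code) builds the exact null distribution of the Wilcoxon signed-rank statistic W+ from the standard recursion c_n(w) = c_{n-1}(w) + c_{n-1}(w - n).

    def wilcoxon_signed_rank_pmf(n):
        """Exact null probability mass function of W+ on the support 0 .. n(n+1)/2."""
        max_w = n * (n + 1) // 2
        counts = [0] * (max_w + 1)       # counts[w] = number of sign patterns with W+ = w
        counts[0] = 1
        for rank in range(1, n + 1):
            new = counts[:]
            for w in range(rank, max_w + 1):
                new[w] += counts[w - rank]
            counts = new
        total = 2 ** n                   # number of equally likely sign patterns
        return [c / total for c in counts]

    if __name__ == "__main__":
        pmf = wilcoxon_signed_rank_pmf(10)
        # Exact left-tail probability P(W+ <= 8) for n = 10 (roughly 0.024).
        print(sum(pmf[:9]))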

  5. 1st Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Lahiri, S; Politis, Dimitris

    2014-01-01

    This volume is composed of peer-reviewed papers that have developed from the First Conference of the International Society for NonParametric Statistics (ISNPS). This inaugural conference took place in Chalkidiki, Greece, June 15-19, 2012. It was organized with the co-sponsorship of the IMS, the ISI, and other organizations. M.G. Akritas, S.N. Lahiri, and D.N. Politis are the first executive committee members of ISNPS, and the editors of this volume. ISNPS has a distinguished Advisory Committee that includes Professors R.Beran, P.Bickel, R. Carroll, D. Cook, P. Hall, R. Johnson, B. Lindsay, E. Parzen, P. Robinson, M. Rosenblatt, G. Roussas, T. SubbaRao, and G. Wahba. The Charting Committee of ISNPS consists of more than 50 prominent researchers from all over the world.   The chapters in this volume bring forth recent advances and trends in several areas of nonparametric statistics. In this way, the volume facilitates the exchange of research ideas, promotes collaboration among researchers from all over the wo...

  6. Non-parametric Morphologies of Mergers in the Illustris Simulation

    CERN Document Server

    Bignone, Lucas A; Sillero, Emanuel; Pedrosa, Susana E; Pellizza, Leonardo J; Lambas, Diego G

    2016-01-01

    We study non-parametric morphologies of merger events in a cosmological context, using the Illustris project. We produce mock g-band images comparable to observational surveys from the publicly available idealized mock images of the Illustris simulation at $z=0$. We then measure non-parametric indicators: asymmetry, Gini, $M_{20}$, clumpiness and concentration for a set of galaxies with $M_* >10^{10}$ M$_\odot$. We correlate these automatic statistics with the recent merger history of galaxies and with the presence of close companions. Our main contribution is to assess, in a cosmological framework, the empirically derived non-parametric demarcation line and average time-scales used to determine the merger rate observationally. We found that 98 per cent of galaxies above the demarcation line have a close companion or have experienced a recent merger event. On average, merger signatures obtained from the $G-M_{20}$ criteria anticorrelate clearly with the time elapsed since the last merger event. We also find that the a...
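
    As a small illustration of one of the indicators listed above, the following Python snippet computes the Gini coefficient of the absolute pixel fluxes of an image, following the standard definition used in galaxy morphology studies; the random image is only a placeholder.

    import numpy as np

    def gini(pixel_fluxes):
        """Gini coefficient of pixel flux values (0 = uniform, 1 = highly concentrated)."""
        x = np.sort(np.abs(np.ravel(pixel_fluxes)))
        n = x.size
        i = np.arange(1, n + 1)
        return np.sum((2 * i - n - 1) * x) / (np.mean(x) * n * (n - 1))

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        image = rng.exponential(scale=1.0, size=(64, 64))   # placeholder "galaxy" image
        print(gini(image))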

  7. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Full Text Available Abstract Genomic selection refers to the use of genomewide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to successfully cope with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the next-to-last generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A determination of predictor-specific degrees of smoothing increased the accuracy.

  8. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    Science.gov (United States)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.

  9. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    Science.gov (United States)

    Lee, Kyungbook; Song, Seok Goo

    2016-10-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  10. A non-parametric framework for estimating threshold limit values

    Directory of Open Access Journals (Sweden)

    Ulm Kurt

    2005-11-01

    Full Text Available Abstract Background To estimate a threshold limit value for a compound known to have harmful health effects, an 'elbow' threshold model is usually applied. We are interested in flexible non-parametric alternatives. Methods We describe how a step function model fitted by isotonic regression can be used to estimate threshold limit values. This method returns a set of candidate locations, and we discuss two algorithms to select the threshold among them: the reduced isotonic regression and an algorithm considering the closed family of hypotheses. We assess the performance of these two alternative approaches under different scenarios in a simulation study. We illustrate the framework by analysing the data from a study conducted by the German Research Foundation aiming to set a threshold limit value for the exposure to total dust at the workplace, as a causal agent for developing chronic bronchitis. Results In the paper we demonstrate the use and the properties of the proposed methodology along with the results from an application. The method appears to detect the threshold with satisfactory success. However, its performance can be compromised by the low power to reject the constant risk assumption when the true dose-response relationship is weak. Conclusion The estimation of thresholds based on the isotonic framework is conceptually simple and sufficiently powerful. Given that in the threshold value estimation context there is no gold standard method, the proposed model provides a useful non-parametric alternative to the standard approaches and can corroborate or challenge their findings.
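
    The step-function idea can be sketched with off-the-shelf isotonic regression, as below. This Python snippet is an illustration under simulated data and is not the authors' reduced isotonic regression or closed-testing procedure; it simply fits a monotone dose-response curve to binary outcomes and reads off the jump locations as candidate thresholds.

    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    rng = np.random.default_rng(3)
    dose = np.sort(rng.uniform(0, 10, 400))              # e.g. dust exposure levels
    true_risk = np.where(dose < 4.0, 0.10, 0.35)         # hidden threshold at dose 4
    outcome = rng.binomial(1, true_risk)                 # 1 = chronic bronchitis

    # Monotone (non-decreasing) fit of risk as a step function of dose.
    iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
    risk_hat = iso.fit_transform(dose, outcome)

    # Candidate thresholds are the doses where the fitted step function jumps.
    jumps = dose[1:][np.diff(risk_hat) > 0]
    print("candidate threshold locations:", np.round(jumps, 2))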

  11. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb-Douglas and the Translog are the most popular. An unsuitable functional form results not only in biased parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used to estimate production functions without the specification of a functional form. Therefore, they avoid possible misspecification errors due to the use of an unsuitable functional form. In this paper, we use parametric and non-parametric methods to identify the optimal size of Polish crop farms...

  12. Bayesian nonparametric centered random effects models with variable selection.

    Science.gov (United States)

    Yang, Mingan

    2013-03-01

    In a linear mixed effects model, it is common practice to assume that the random effects follow a parametric distribution such as a normal distribution with mean zero. However, in the case of variable selection, substantial violation of the normality assumption can potentially impact the subset selection and result in poor interpretation and even incorrect results. In nonparametric random effects models, the random effects generally have a nonzero mean, which causes an identifiability problem for the fixed effects that are paired with the random effects. In this article, we focus on a Bayesian method for variable selection. We characterize the subject-specific random effects nonparametrically with a Dirichlet process and resolve the bias simultaneously. In particular, we propose flexible modeling of the conditional distribution of the random effects with changes across the predictor space. The approach is implemented using a stochastic search Gibbs sampler to identify subsets of fixed effects and random effects to be included in the model. Simulations are provided to evaluate and compare the performance of our approach to the existing ones. We then apply the new approach to a real data example, cross-country and interlaboratory rodent uterotrophic bioassay.

  13. Wavelet Estimators in Nonparametric Regression: A Comparative Simulation Study

    Directory of Open Access Journals (Sweden)

    Anestis Antoniadis

    2001-06-01

    Full Text Available Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature of wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study on a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints of how these estimators should be used to analyse real data sets, a detailed practical step-by-step illustration of a wavelet denoising analysis on electrical consumption is provided. Matlab codes are provided so that all figures and tables in this paper can be reproduced.
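
    A compact Python sketch of one classical estimator from this family is given below: soft thresholding of the detail coefficients with the universal (VisuShrink) threshold, implemented with PyWavelets. The wavelet, the decomposition level and the test signal are illustrative assumptions, and the snippet is not taken from the paper's Matlab code.

    import numpy as np
    import pywt

    def visushrink_denoise(y, wavelet="db4", level=4):
        """Wavelet shrinkage with the universal threshold and soft thresholding."""
        coeffs = pywt.wavedec(y, wavelet, level=level)
        # Noise level estimated from the finest-scale detail coefficients.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thresh = sigma * np.sqrt(2.0 * np.log(len(y)))
        denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(denoised, wavelet)[: len(y)]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 1024)
        signal = np.sin(6 * np.pi * t) + (t > 0.5)        # smooth part plus a jump
        noisy = signal + 0.3 * rng.standard_normal(t.size)
        # Mean squared error of the denoised signal (should beat the noise variance).
        print(np.mean((visushrink_denoise(noisy) - signal) ** 2))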

  14. Computing Economies of Scope Using Robust Partial Frontier Nonparametric Methods

    Directory of Open Access Journals (Sweden)

    Pedro Carvalho

    2016-03-01

    Full Text Available This paper proposes a methodology to examine economies of scope using the recent order-α nonparametric method. It allows us to investigate economies of scope by comparing the efficient order-α frontiers of firms that produce two or more goods with the efficient order-α frontiers of firms that produce only one good. To accomplish this, and because the order-α frontiers are irregular, we suggest linearizing them with the DEA estimator. The proposed methodology uses partial frontier nonparametric methods that are more robust than the traditional full frontier methods. Using a sample of 67 Portuguese water utilities for the period 2002–2008 and also a simulated sample, we demonstrate the usefulness of the adopted approach and show that if only the full frontier methods were used, they would lead to different results. We found evidence of economies of scope in the simultaneous provision of water supply and wastewater services by water utilities in Portugal.

  15. Bayesian nonparametric dictionary learning for compressed sensing MRI.

    Science.gov (United States)

    Huang, Yue; Paisley, John; Lin, Qin; Ding, Xinghao; Fu, Xueyang; Zhang, Xiao-Ping

    2014-12-01

    We develop a Bayesian nonparametric model for reconstructing magnetic resonance images (MRIs) from highly undersampled k-space data. We perform dictionary learning as part of the image reconstruction process. To this end, we use the beta process as a nonparametric dictionary learning prior for representing an image patch as a sparse combination of dictionary elements. The size of the dictionary and patch-specific sparsity pattern are inferred from the data, in addition to other dictionary learning variables. Dictionary learning is performed directly on the compressed image, and so is tailored to the MRI being considered. In addition, we investigate a total variation penalty term in combination with the dictionary learning model, and show how the denoising property of dictionary learning removes dependence on regularization parameters in the noisy setting. We derive a stochastic optimization algorithm based on Markov chain Monte Carlo for the Bayesian model, and use the alternating direction method of multipliers for efficiently performing total variation minimization. We present empirical results on several MRIs, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.

  16. On belief propagation guided decimation for random k-SAT

    CERN Document Server

    Coja-Oghlan, Amin

    2010-01-01

    Let F be a uniformly distributed random k-SAT formula with n variables and m clauses. Non-constructive arguments show that F is satisfiable for clause/variable ratios m/n up to r(k) ~ 2^k ln 2 with high probability, yet belief propagation guided decimation is shown to fail at finding a satisfying assignment already for ratios m/n ≥ c·r(k)/k, for a constant c > 0 (independent of k).

  17. Nonparametric Estimation of Cumulative Incidence Functions for Competing Risks Data with Missing Cause of Failure

    DEFF Research Database (Denmark)

    Effraimidis, Georgios; Dahl, Christian Møller

    In this paper, we develop a fully nonparametric approach for the estimation of the cumulative incidence function with Missing At Random right-censored competing risks data. We obtain results on the pointwise asymptotic normality as well as the uniform convergence rate of the proposed nonparametric estimator. A simulation study that serves two purposes is provided. First, it illustrates in detail how to implement our proposed nonparametric estimator. Secondly, it facilitates a comparison of the nonparametric estimator to a parametric counterpart based on the estimator of Lu and Liang (2008...

  18. Reducing conspiracy theory beliefs

    Directory of Open Access Journals (Sweden)

    Stojanov Ana

    2015-01-01

    Full Text Available This study aimed to look at possible ways to reduce beliefs in conspiracy theories and increase the intention to have a fictitious child vaccinated. One hundred and sixty participants answered an online questionnaire. Three groups were used. The control group did not read any text prior to answering whereas the two experimental groups read either only debunking information or information about the motives of the conspiracists and the fallacy in their reasoning in addition to the debunking paragraph. The second experimental manipulation was effective in reducing medical conspiracy theories beliefs, but not belief in conspiracy theories in general. Neither intervention was effective in increasing the likelihood to have a fictitious child vaccinated. Those not intending to vaccinate a fictitious child endorsed conspiracy theories to a greater degree. A positive correlation between beliefs in conspiracy theories and the experiential/intuitive information processing system was found.

  19. Teaching Beliefs and Evaluation

    Science.gov (United States)

    Asch, Rosalie L.

    1976-01-01

    Attempts to bring into clearer perspective how art teaching beliefs relate to attitudes and methodology concerning evaluation. Also shows how some common evaluation pitfalls can be avoided without compromising art learning and quality. (Author/RK)

  20. Singular Value Decomposition, Hessian Errors, and Linear Algebra of Non-parametric Extraction of Partons from DIS

    CERN Document Server

    Goshtasbpour, Mehrdad

    2014-01-01

    By singular value decomposition (SVD) of a numerically singular Hessian matrix and of a numerically singular system of linear equations for the experimental data (accumulated in the respective $\chi^2$ function) and constraints, least-squares solutions and their propagated errors for the non-parametric extraction of partons from $F_2$ are obtained. SVD and its physical application are described phenomenologically in the two cases. Among the subjects covered are: identification and properties of the boundary between the two subsets of ordered eigenvalues corresponding to the range and null space, and the eigenvalue structure of the null space of the singular matrix, including a second boundary separating the smallest eigenvalues of essentially no information, in a particular case. The eigenvector-eigenvalue structure of "redundancy and smallness" of the errors of two pdf sets, in our simplified Hessian model, is described by a secondary manifestation of a deeper null space, in the context of SVD.
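
    A generic Python sketch of the SVD-based least-squares machinery described above follows. It is an illustration on fabricated data, not the author's analysis: singular directions below a relative cutoff are treated as numerical null space and discarded, and the data covariance is propagated through the retained directions only.

    import numpy as np

    def truncated_svd_solve(A, b, cov_b, rel_cutoff=1e-10):
        """Least-squares solution and covariance of A x = b, keeping only singular
        values above rel_cutoff * s_max (the rest span the numerical null space)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        keep = s > rel_cutoff * s.max()
        Uk, sk, Vk = U[:, keep], s[keep], Vt[keep].T
        # Pseudo-inverse restricted to the retained range-space directions.
        A_pinv = Vk @ np.diag(1.0 / sk) @ Uk.T
        x = A_pinv @ b
        cov_x = A_pinv @ cov_b @ A_pinv.T      # linear error propagation
        return x, cov_x, int(keep.sum())

    if __name__ == "__main__":
        rng = np.random.default_rng(5)
        A = rng.normal(size=(50, 20))
        A[:, -1] = A[:, 0]                     # make the system numerically singular
        x_true = rng.normal(size=20)
        b = A @ x_true + 0.01 * rng.standard_normal(50)
        x, cov_x, rank = truncated_svd_solve(A, b, np.eye(50) * 0.01 ** 2)
        print("effective rank:", rank)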

  1. Mental health, belief deficit compensation, and paranormal beliefs.

    Science.gov (United States)

    Schumaker, J F

    1987-09-01

    The present study examined the relationship between religious and nonreligious paranormal beliefs and mental health, as well as the possibility that nonreligious subjects compensate for a lack of identification with traditional religion by increased nonreligious paranormal beliefs. Subjects were 80 undergraduates categorized as religious or nonreligious on the basis of scores on the Traditional Religion subscale of the Paranormal Belief Scale. Religious subjects had significantly higher total paranormal belief scores than nonreligious subjects. Those adopting religious paranormal beliefs were actually somewhat more likely to adopt other nonreligious paranormal beliefs. The failure of nonreligious subjects to compensate fully for this traditional religious belief deficit was reflected in their mental health ratings on the Langer's Mental Health Scale (Langer, 1962). Paranormal beliefs were found to be negatively correlated with reported symptoms of psychopathology, supporting the formulation that paranormal beliefs may serve to ensure psychic integrity by acting as "self-serving cognitive biases."

  2. Robust Depth-Weighted Wavelet for Nonparametric Regression Models

    Institute of Scientific and Technical Information of China (English)

    Lu LIN

    2005-01-01

    In nonparametric regression models, the original regression estimators, including the kernel estimator, Fourier series estimator and wavelet estimator, are always constructed as a weighted sum of the data, where the weights depend only on the distance between the design points and the estimation points. As a result these estimators are not robust to perturbations in the data. In order to avoid this problem, a new nonparametric regression model, called the depth-weighted regression model, is introduced and then the depth-weighted wavelet estimation is defined. The new estimation is robust to perturbations in the data and attains a very high breakdown value close to 1/2. On the other hand, some asymptotic behaviours such as asymptotic normality are obtained. Some simulations illustrate that the proposed wavelet estimator is more robust than the original wavelet estimator and, as a price to pay for the robustness, the new method is slightly less efficient than the original method.

  3. Nonparametric Bayesian inference of the microcanonical stochastic block model

    CERN Document Server

    Peixoto, Tiago P

    2016-01-01

    A principled approach to characterize the hidden modular structure of networks is to formulate generative models, and then infer their parameters from data. When the desired structure is composed of modules or "communities", a suitable choice for this task is the stochastic block model (SBM), where nodes are divided into groups, and the placement of edges is conditioned on the group memberships. Here, we present a nonparametric Bayesian method to infer the modular structure of empirical networks, including the number of modules and their hierarchical organization. We focus on a microcanonical variant of the SBM, where the structure is imposed via hard constraints. We show how this simple model variation allows simultaneously for two important improvements over more traditional inference approaches: 1. Deeper Bayesian hierarchies, with noninformative priors replaced by sequences of priors and hyperpriors, that not only remove limitations that seriously degrade the inference on large networks, but also reveal s...

  4. A Non-Parametric Spatial Independence Test Using Symbolic Entropy

    Directory of Open Access Journals (Sweden)

    López Hernández, Fernando

    2008-01-01

    Full Text Available In the present paper, we construct a new, simple, consistent and powerful test for spatial independence, called the SG test, by using symbolic dynamics and symbolic entropy as a measure of spatial dependence. We also give a standard asymptotic distribution of an affine transformation of the symbolic entropy under the null hypothesis of independence in the spatial process. The test statistic and its standard limit distribution, with the proposed symbolization, are invariant to any monotonic transformation of the data. The test applies to discrete or continuous distributions. Given that the test is based on entropy measures, it avoids smoothed nonparametric estimation. We include a Monte Carlo study of our test, together with the well-known Moran's I, the SBDS (de Graaff et al., 2001) and the (Brett and Pinkse, 1997) non-parametric tests, in order to illustrate our approach.

  5. Analyzing single-molecule time series via nonparametric Bayesian inference.

    Science.gov (United States)

    Hines, Keegan E; Bankston, John R; Aldrich, Richard W

    2015-02-03

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  6. Analyzing multiple spike trains with nonparametric Granger causality.

    Science.gov (United States)

    Nedungadi, Aatira G; Rangarajan, Govindan; Jain, Neeraj; Ding, Mingzhou

    2009-08-01

    Simultaneous recordings of spike trains from multiple single neurons are becoming commonplace. Understanding the interaction patterns among these spike trains remains a key research area. A question of interest is the evaluation of information flow between neurons through the analysis of whether one spike train exerts causal influence on another. For continuous-valued time series data, Granger causality has proven an effective method for this purpose. However, the basis for Granger causality estimation is autoregressive data modeling, which is not directly applicable to spike trains. Various filtering options distort the properties of spike trains as point processes. Here we propose a new nonparametric approach to estimate Granger causality directly from the Fourier transforms of spike train data. We validate the method on synthetic spike trains generated by model networks of neurons with known connectivity patterns and then apply it to neurons simultaneously recorded from the thalamus and the primary somatosensory cortex of a squirrel monkey undergoing tactile stimulation.

  7. Prior processes and their applications nonparametric Bayesian estimation

    CERN Document Server

    Phadia, Eswar G

    2016-01-01

    This book presents a systematic and comprehensive treatment of various prior processes that have been developed over the past four decades for dealing with Bayesian approach to solving selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in this area. After an overview of different prior processes, it examines the now pre-eminent Dirichlet process and its variants including hierarchical processes, then addresses new processes such as dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral to right type processes, including gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove to be very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and P...

  8. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb-Douglas or the Translog production function is used. However, the specification of a functional form for the production function involves the risk of specifying a functional form that is not similar to the “true” relationship between the inputs and the output. This misspecification might result in biased estimation results—including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used...
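    A hedged sketch of the nonparametric route described above, assuming the statsmodels package is available; the Cobb-Douglas data-generating process and the variable names are illustrative assumptions of ours, not the paper's data.

    ```python
    # Hedged sketch: estimate a production function nonparametrically with a local-linear
    # kernel regression (statsmodels) and inspect the implied input elasticities.
    import numpy as np
    from statsmodels.nonparametric.kernel_regression import KernelReg

    rng = np.random.default_rng(0)
    n = 300
    labour, capital = rng.uniform(1, 10, n), rng.uniform(1, 10, n)
    output = 2.0 * labour**0.6 * capital**0.3 * np.exp(0.05 * rng.standard_normal(n))

    kr = KernelReg(endog=np.log(output),
                   exog=np.column_stack([np.log(labour), np.log(capital)]),
                   var_type='cc', reg_type='ll')      # local-linear, two continuous regressors
    fitted, marginal_effects = kr.fit()
    print("mean output elasticities (labour, capital):", marginal_effects.mean(axis=0))
    ```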

  9. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  10. Curve registration by nonparametric goodness-of-fit testing

    CERN Document Server

    Dalalyan, Arnak

    2011-01-01

    The problem of curve registration appears in many different areas of applications ranging from neuroscience to road traffic modeling. In the present work, we propose a nonparametric testing framework in which we develop a generalized likelihood ratio test to perform curve registration. We first prove that, under the null hypothesis, the resulting test statistic is asymptotically distributed as a chi-squared random variable. This result, often referred to as Wilks' phenomenon, provides a natural threshold for the test of a prescribed asymptotic significance level and a natural measure of lack-of-fit in terms of the p-value of the chi squared test. We also prove that the proposed test is consistent, i.e., its power is asymptotically equal to 1. Some numerical experiments on synthetic datasets are reported as well.

  11. Nonparametric forecasting of low-dimensional dynamical systems.

    Science.gov (United States)

    Berry, Tyrus; Giannakis, Dimitrios; Harlim, John

    2015-03-01

    This paper presents a nonparametric modeling approach for forecasting stochastic dynamical systems on low-dimensional manifolds. The key idea is to represent the discrete shift maps on a smooth basis which can be obtained by the diffusion maps algorithm. In the limit of large data, this approach converges to a Galerkin projection of the semigroup solution to the underlying dynamics on a basis adapted to the invariant measure. This approach allows one to quantify uncertainties (in fact, evolve the probability distribution) for nontrivial dynamical systems with equation-free modeling. We verify our approach on various examples, ranging from an inhomogeneous anisotropic stochastic differential equation on a torus to the chaotic three-dimensional Lorenz model and the Niño-3.4 data set, which is used as a proxy of the El Niño Southern Oscillation.
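    A bare-bones, hedged sketch of the diffusion-maps basis mentioned above (Gaussian kernel, Markov normalization, eigenvectors); the forecasting step that projects the shift map onto this basis is not reproduced, and the circle data are our own toy example.

    ```python
    # Hedged sketch: build a data-driven smooth basis with a minimal diffusion-maps
    # construction. The paper's forecasting scheme then works in such a basis.
    import numpy as np

    def diffusion_basis(X, eps, n_basis=5):
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-d2 / eps)
        P = K / K.sum(axis=1, keepdims=True)        # row-stochastic Markov matrix
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)
        return vals.real[order[:n_basis]], vecs.real[:, order[:n_basis]]

    theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
    X = np.column_stack([np.cos(theta), np.sin(theta)])  # points on a circle (a simple manifold)
    vals, basis = diffusion_basis(X, eps=0.1)
    print(vals)                                          # leading eigenvalue ~1, then decaying pairs
    ```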

  12. Nonparametric Model of Smooth Muscle Force Production During Electrical Stimulation.

    Science.gov (United States)

    Cole, Marc; Eikenberry, Steffen; Kato, Takahide; Sandler, Roman A; Yamashiro, Stanley M; Marmarelis, Vasilis Z

    2017-03-01

    A nonparametric model of smooth muscle tension response to electrical stimulation was estimated using the Laguerre expansion technique of nonlinear system kernel estimation. The experimental data consisted of force responses of smooth muscle to energy-matched alternating single pulse and burst current stimuli. The burst stimuli led to at least a 10-fold increase in peak force in smooth muscle from Mytilus edulis, despite the constant energy constraint. A linear model did not fit the data. However, a second-order model fit the data accurately, so the higher-order models were not required to fit the data. Results showed that smooth muscle force response is not linearly related to the stimulation power.

  13. Nonparametric estimation of stochastic differential equations with sparse Gaussian processes

    Science.gov (United States)

    García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.

    2017-08-01

    The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working directly in a function-space view and thus the inference takes place directly in this space. To cope with the computational complexity that the use of Gaussian processes entails, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudosamples. The proposed method has been validated using both simulated data and real data from economics and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.

  14. Revealing components of the galaxy population through nonparametric techniques

    CERN Document Server

    Bamford, Steven P; Nichol, Robert C; Miller, Christopher J; Wasserman, Larry; Genovese, Christopher R; Freeman, Peter E

    2008-01-01

    The distributions of galaxy properties vary with environment, and are often multimodal, suggesting that the galaxy population may be a combination of multiple components. The behaviour of these components versus environment holds details about the processes of galaxy development. To release this information we apply a novel, nonparametric statistical technique, identifying four components present in the distribution of galaxy H$\\alpha$ emission-line equivalent-widths. We interpret these components as passive, star-forming, and two varieties of active galactic nuclei. Independent of this interpretation, the properties of each component are remarkably constant as a function of environment. Only their relative proportions display substantial variation. The galaxy population thus appears to comprise distinct components which are individually independent of environment, with galaxies rapidly transitioning between components as they move into denser environments.

  15. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for assessment of agricultural efficiency and employ them to Lithuanian family farms. In particular, we focus on three particular objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend...... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques...... of stochasticity associated with Lithuanian family farm performance. The former technique showed that the farms differed in terms of the mean values and variance of the efficiency scores over time with some clear patterns prevailing throughout the whole research period. The fuzzy Free Disposal Hull showed...

  16. Binary Classifier Calibration Using a Bayesian Non-Parametric Approach.

    Science.gov (United States)

    Naeini, Mahdi Pakdaman; Cooper, Gregory F; Hauskrecht, Milos

    Learning probabilistic predictive models that are well calibrated is critical for many prediction and decision-making tasks in data mining. This paper presents two new non-parametric methods for calibrating outputs of binary classification models: a method based on Bayes optimal selection and a method based on Bayesian model averaging. The advantage of these methods is that they are independent of the algorithm used to learn a predictive model, and they can be applied in a post-processing step, after the model is learned. This makes them applicable to a wide variety of machine learning models and methods. These calibration methods, as well as other methods, are tested on a variety of datasets in terms of both discrimination and calibration performance. The results show the methods either outperform or are comparable in performance to the state-of-the-art calibration methods.
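    For concreteness, a hedged sketch of a much simpler post-processing calibrator in the same spirit (histogram binning with a Beta(1,1) posterior per bin); it is not one of the two methods proposed in the paper, and the data and names are our own.

    ```python
    # Hedged baseline sketch: post-hoc calibration by histogram binning with a Beta(1,1)
    # prior per bin (posterior-mean estimates), applied after the classifier is trained.
    import numpy as np

    def fit_binned_calibrator(scores, labels, n_bins=10):
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        idx = np.clip(np.digitize(scores, edges) - 1, 0, n_bins - 1)
        pos = np.bincount(idx, weights=labels, minlength=n_bins)
        tot = np.bincount(idx, minlength=n_bins)
        return edges, (pos + 1.0) / (tot + 2.0)          # Beta(1,1) posterior mean per bin

    def calibrate(scores, edges, bin_probs):
        idx = np.clip(np.digitize(scores, edges) - 1, 0, len(bin_probs) - 1)
        return bin_probs[idx]

    rng = np.random.default_rng(0)
    raw = rng.uniform(0, 1, 5000)
    labels = (rng.uniform(0, 1, 5000) < raw**2).astype(float)   # raw scores are over-confident
    edges, probs = fit_binned_calibrator(raw, labels)
    print(calibrate(np.array([0.2, 0.5, 0.9]), edges, probs))   # roughly 0.06, 0.30, 0.90
    ```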

  17. Parametric or nonparametric? A parametricness index for model selection

    CERN Document Server

    Liu, Wei; 10.1214/11-AOS899

    2012-01-01

    In the model selection literature, two classes of criteria perform well asymptotically in different situations: the Bayesian information criterion (BIC) (as a representative) is consistent in selection when the true model is finite dimensional (parametric scenario), while Akaike's information criterion (AIC) performs well in terms of asymptotic efficiency when the true model is infinite dimensional (nonparametric scenario). But there is little work that addresses whether it is possible, and how, to detect which of the two situations a specific model selection problem is in. In this work, we differentiate the two scenarios theoretically under some conditions. We develop a measure, the parametricness index (PI), to assess whether a model selected by a potentially consistent procedure can be practically treated as the true model, which also hints at whether AIC or BIC is better suited to the data for the goal of estimating the regression function. A consequence is that by switching between AIC and BIC based on the PI, the resulting regression estimator is si...
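    A hedged toy example of the two criteria that the parametricness index arbitrates between, computed for nested polynomial models on simulated data; the PI statistic itself is not reproduced here.

    ```python
    # Hedged sketch: AIC and BIC for nested polynomial regressions fitted by least squares.
    # Both criteria should favour the true quadratic model in this parametric toy example.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    x = rng.uniform(-1, 1, n)
    y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.3 * rng.standard_normal(n)   # true model: quadratic

    for degree in range(1, 6):
        X = np.vander(x, degree + 1)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        k = degree + 1                                # number of regression coefficients
        aic = n * np.log(rss / n) + 2 * k
        bic = n * np.log(rss / n) + k * np.log(n)
        print(degree, round(aic, 1), round(bic, 1))
    ```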

  18. Nonparametric reconstruction of the Om diagnostic to test LCDM

    CERN Document Server

    Escamilla-Rivera, Celia

    2015-01-01

    Cosmic acceleration is usually related to the unknown dark energy, whose equation of state, w(z), is constrained and numerically confronted with independent astrophysical data. In order to make a diagnostic of w(z), a null test of dark energy can be performed using a diagnostic function of redshift, Om. In this work we present a nonparametric reconstruction of this diagnostic using the so-called Loess-Simex factory to test the concordance model, with the advantage that this approach offers an alternative way to relax the use of priors and find a possible w that reliably describes the data with no previous knowledge of a cosmological model. Our results demonstrate that the method applied to the dynamical Om diagnostic finds a preference for a dark energy model with equation of state w = -2/3, which corresponds to a static domain wall network.
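    A hedged sketch of the Om diagnostic itself, Om(z) = (E^2(z) - 1)/((1+z)^3 - 1) with E = H/H0, evaluated on synthetic LCDM data; the paper's Loess-Simex reconstruction from real measurements is not reproduced.

    ```python
    # Hedged sketch: compute the Om(z) diagnostic from (synthetic) H(z) data. For flat LCDM
    # it is constant and equal to Omega_m, so departures from a flat line signal non-LCDM behaviour.
    import numpy as np

    H0, Om0 = 70.0, 0.3
    z = np.linspace(0.1, 2.0, 20)
    H = H0 * np.sqrt(Om0 * (1 + z) ** 3 + (1 - Om0))      # flat LCDM expansion history
    E2 = (H / H0) ** 2
    Om_diag = (E2 - 1.0) / ((1 + z) ** 3 - 1.0)
    print(np.round(Om_diag, 3))                            # constant 0.3, as expected for LCDM
    ```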

  19. Evaluation of Nonparametric Probabilistic Forecasts of Wind Power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg

    Predictions of wind power production for horizons up to 48-72 hour ahead comprise a highly valuable input to the methods for the daily management or trading of wind generation. Today, users of wind power predictions are not only provided with point predictions, which are estimates of the most likely outcome for each look-ahead time, but also with uncertainty estimates given by probabilistic forecasts. In order to avoid assumptions on the shape of predictive distributions, these probabilistic predictions are produced from nonparametric methods, and then take the form of a single or a set of quantile forecasts. The required and desirable properties of such probabilistic forecasts are defined and a framework for their evaluation is proposed. This framework is applied for evaluating the quality of two statistical methods producing full predictive distributions from point predictions of wind...
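    One standard ingredient of such an evaluation framework is the quantile (pinball) score; the hedged sketch below computes it for a toy set of quantile forecasts and is not claimed to cover the full set of properties proposed in the paper.

    ```python
    # Hedged sketch: the quantile (pinball) score for probabilistic forecasts issued as
    # quantiles; lower is better. Synthetic data stand in for wind power observations.
    import numpy as np

    def pinball_loss(y_obs, q_forecast, tau):
        diff = y_obs - q_forecast
        return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

    rng = np.random.default_rng(0)
    y = rng.gamma(shape=2.0, scale=5.0, size=1000)          # stand-in for observed production
    taus = np.array([0.1, 0.5, 0.9])
    clim = np.quantile(y, taus)                              # climatological quantile forecasts
    for t, q in zip(taus, clim):
        print(f"tau={t:.1f}  forecast={q:6.2f}  pinball={pinball_loss(y, q, t):.3f}")
    ```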

  20. Equity and efficiency in private and public education: a nonparametric comparison

    NARCIS (Netherlands)

    L. Cherchye; K. de Witte; E. Ooghe; I. Nicaise

    2007-01-01

    We present a nonparametric approach for the equity and efficiency evaluation of (private and public) primary schools in Flanders. First, we use a nonparametric (Data Envelopment Analysis) model that is specially tailored to assess educational efficiency at the pupil level. The model accounts for the

  1. Parametric and non-parametric modeling of short-term synaptic plasticity. Part II: Experimental study.

    Science.gov (United States)

    Song, Dong; Wang, Zhuo; Marmarelis, Vasilis Z; Berger, Theodore W

    2009-02-01

    This paper presents a synergistic parametric and non-parametric modeling study of short-term plasticity (STP) in the Schaffer collateral to hippocampal CA1 pyramidal neuron (SC) synapse. Parametric models in the form of sets of differential and algebraic equations have been proposed on the basis of the current understanding of biological mechanisms active within the system. Non-parametric Poisson-Volterra models are obtained herein from broadband experimental input-output data. The non-parametric model is shown to provide better prediction of the experimental output than a parametric model with a single set of facilitation/depression (FD) processes. The parametric model is then validated in terms of its input-output transformational properties using the non-parametric model, since the latter constitutes a canonical and more complete representation of the synaptic nonlinear dynamics. Furthermore, discrepancies between the experimentally-derived non-parametric model and the equivalent non-parametric model of the parametric model suggest the presence of multiple FD processes in the SC synapses. Inclusion of an additional set of FD processes in the parametric model makes it replicate better the characteristics of the experimentally-derived non-parametric model. This improved parametric model in turn provides the requisite biological interpretability that the non-parametric model lacks.

  2. Out-of-Sample Extensions for Non-Parametric Kernel Methods.

    Science.gov (United States)

    Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang

    2017-02-01

    Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.

  3. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans

  4. Equity and efficiency in private and public education: a nonparametric comparison

    NARCIS (Netherlands)

    Cherchye, L.; de Witte, K.; Ooghe, E.; Nicaise, I.

    2007-01-01

    We present a nonparametric approach for the equity and efficiency evaluation of (private and public) primary schools in Flanders. First, we use a nonparametric (Data Envelopment Analysis) model that is specially tailored to assess educational efficiency at the pupil level. The model accounts for the

  5. Semi-parametric regression: Efficiency gains from modeling the nonparametric part

    CERN Document Server

    Yu, Kyusang; Park, Byeong U; 10.3150/10-BEJ296

    2011-01-01

    It is widely admitted that structured nonparametric modeling that circumvents the curse of dimensionality is important in nonparametric estimation. In this paper we show that the same holds for semi-parametric estimation. We argue that estimation of the parametric component of a semi-parametric model can be improved essentially when more structure is put into the nonparametric part of the model. We illustrate this for the partially linear model, and investigate efficiency gains when the nonparametric part of the model has an additive structure. We present the semi-parametric Fisher information bound for estimating the parametric part of the partially linear additive model and provide semi-parametric efficient estimators for which we use a smooth backfitting technique to deal with the additive nonparametric part. We also present the finite sample performances of the proposed estimators and analyze Boston housing data as an illustration.

  6. On the Importance of Elimination Heuristics in Lazy Propagation

    DEFF Research Database (Denmark)

    Madsen, Anders Læsø; Butz, Cory J.

    2012-01-01

    Belief update in a Bayesian network using Lazy Propagation (LP) proceeds by message passing over a junction tree (JT). In the process of computing a message, a set of variables is eliminated. As the JT provides only a partial order on the elimination of variables, it is necessary to identify elim...

  7. Glaucoma Monitoring in a Clinical Setting: Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  8. Belief functions on lattices

    CERN Document Server

    Grabisch, Michel

    2008-01-01

    We extend the notion of belief function to the case where the underlying structure is no longer the Boolean lattice of subsets of some universal set, but any lattice, which we endow with a minimal set of properties according to our needs. We show that all classical constructions and definitions (e.g., mass allocation, commonality function, plausibility functions, necessity measures with nested focal elements, possibility distributions, Dempster's rule of combination, decomposition w.r.t. simple support functions, etc.) remain valid in this general setting. Moreover, our proof of the decomposition of belief functions into simple support functions is much simpler and more general than the original one by Shafer.
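    To fix notation, a hedged sketch of belief and plausibility computed from a mass assignment in the classical power-set (Boolean lattice) case; the paper's generalization to arbitrary lattices is not reproduced, and the frame and masses are our own toy example.

    ```python
    # Hedged sketch: belief and plausibility from a mass assignment on the power set of a
    # small frame -- the classical case that the paper generalizes to arbitrary lattices.
    def belief(mass, A):
        return sum(m for B, m in mass.items() if B <= A)          # sum over focal sets inside A

    def plausibility(mass, A):
        return sum(m for B, m in mass.items() if B & A)           # sum over focal sets meeting A

    frame = frozenset({"a", "b", "c"})
    mass = {frozenset({"a"}): 0.4,
            frozenset({"a", "b"}): 0.3,
            frame: 0.3}                                            # masses sum to 1
    A = frozenset({"a", "b"})
    print(belief(mass, A), plausibility(mass, A))                  # 0.7 and 1.0
    ```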

  9. Viscothermal wave propagation

    NARCIS (Netherlands)

    Nijhof, Marten Jozef Johannes

    2010-01-01

    In this work, the accuracy, efficiency and range of applicability of various (approximate) models for viscothermal wave propagation are investigated. Models for viscothermal wave propagation describe the wave behavior of fluids including viscous and thermal effects. Cases where viscothermal effects a

  10. Probabilistic dynamic belief revision

    NARCIS (Netherlands)

    Baltag, A.; Smets, S.

    2008-01-01

    We investigate the discrete (finite) case of the Popper-Renyi theory of conditional probability, introducing discrete conditional probabilistic models for knowledge and conditional belief, and comparing them with the more standard plausibility models. We also consider a related notion, that of safe

  11. Information and Heterogeneous Beliefs

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Qin, Zhenjiang

    2014-01-01

    In an incomplete market with heterogeneous prior beliefs, we show public information can have a substantial impact on the ex ante cost of capital, trading volume, and investor welfare. The Pareto efficient public information system is the system enjoying the maximum ex ante cost of capital and the...

  12. Varieties of Belief and Probability

    NARCIS (Netherlands)

    D.J.N. van Eijck (Jan); S. Ghosh; J. Szymanik

    2015-01-01

    For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for

  13. Nonparametric predictive inference for combining diagnostic tests with parametric copula

    Science.gov (United States)

    Muhammad, Noryanti; Coolen, F. P. A.; Coolen-Maturi, T.

    2017-09-01

    Measuring the accuracy of diagnostic tests is crucial in many application areas, including medicine and health care. The Receiver Operating Characteristic (ROC) curve is a popular statistical tool for describing the performance of diagnostic tests, and the area under the ROC curve (AUC) is often used as a measure of the overall performance of the diagnostic test. In this paper, we are interested in developing strategies for combining test results in order to increase diagnostic accuracy. We introduce nonparametric predictive inference (NPI) for combining two diagnostic test results while accounting for their dependence structure using a parametric copula. NPI is a frequentist statistical framework for inference on a future observation based on past data observations; it uses lower and upper probabilities to quantify uncertainty and is based on only a few modelling assumptions. A copula is a joint distribution function whose marginals are all uniformly distributed, and it can be used to model the dependence separately from the marginal distributions. In this research, we estimate the copula density using a parametric method, namely maximum likelihood estimation (MLE). We investigate the performance of the proposed method via data sets from the literature and discuss the results to show how our method performs for different families of copulas. Finally, we briefly outline related challenges and opportunities for future research.

  14. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  15. Bayesian nonparametric clustering in phylogenetics: modeling antigenic evolution in influenza.

    Science.gov (United States)

    Cybis, Gabriela B; Sinsheimer, Janet S; Bedford, Trevor; Rambaut, Andrew; Lemey, Philippe; Suchard, Marc A

    2017-01-18

    Influenza is responsible for up to 500,000 deaths every year, and antigenic variability represents much of its epidemiological burden. To visualize antigenic differences across many viral strains, antigenic cartography methods use multidimensional scaling on binding assay data to map influenza antigenicity onto a low-dimensional space. Analysis of such assay data ideally leads to natural clustering of influenza strains of similar antigenicity that correlate with sequence evolution. To understand the dynamics of these antigenic groups, we present a framework that jointly models genetic and antigenic evolution by combining multidimensional scaling of binding assay data, Bayesian phylogenetic machinery and nonparametric clustering methods. We propose a phylogenetic Chinese restaurant process that extends the current process to incorporate the phylogenetic dependency structure between strains in the modeling of antigenic clusters. With this method, we are able to use the genetic information to better understand the evolution of antigenicity throughout epidemics, as shown in applications of this model to H1N1 influenza. Copyright © 2017 John Wiley & Sons, Ltd.

  16. The Utility of Nonparametric Transformations for Imputation of Survey Data

    Directory of Open Access Journals (Sweden)

    Robbins Michael W.

    2014-12-01

    Full Text Available Missing values present a prevalent problem in the analysis of establishment survey data. Multivariate imputation algorithms (which are used to fill in missing observations) tend to have the common limitation that imputations for continuous variables are sampled from Gaussian distributions. This limitation is addressed here through the use of robust marginal transformations. Specifically, kernel-density and empirical distribution-type transformations are discussed and are shown to have favorable properties when used for imputation of complex survey data. Although such techniques have wide applicability (i.e., they may be easily applied in conjunction with a wide array of imputation techniques), the proposed methodology is applied here with an algorithm for imputation in the USDA’s Agricultural Resource Management Survey. Data analysis and simulation results are used to illustrate the specific advantages of the robust methods when compared to the fully parametric techniques and to other relevant techniques such as predictive mean matching. To summarize, transformations based upon parametric densities are shown to distort several data characteristics in circumstances where the parametric model is ill fit; however, no circumstances are found in which the transformations based upon parametric models outperform the nonparametric transformations. As a result, the transformation based upon the empirical distribution (which is the most computationally efficient) is recommended over the other transformation procedures in practice.
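    A hedged sketch of the empirical-distribution (normal-scores) marginal transformation and its inverse, without the surrounding multivariate imputation algorithm; the data and function names are illustrative assumptions.

    ```python
    # Hedged sketch: transform a skewed margin to approximate normality via empirical-CDF
    # normal scores, and map values back through the empirical quantiles afterwards.
    import numpy as np
    from scipy.stats import norm, rankdata

    def to_normal_scores(x):
        u = rankdata(x) / (len(x) + 1.0)        # empirical CDF values strictly inside (0, 1)
        return norm.ppf(u)

    def from_normal_scores(z, x_reference):
        u = norm.cdf(z)
        return np.quantile(x_reference, u)      # back-transform through empirical quantiles

    x = np.random.default_rng(0).lognormal(size=1000)     # heavily skewed margin
    z = to_normal_scores(x)
    print(round(z.mean(), 3), round(z.std(), 3))          # approximately 0 and 1
    print(round(np.corrcoef(x, from_normal_scores(z, x))[0, 1], 4))   # ~1: nearly invertible
    ```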

  17. Nonparametric identification of structural modifications in Laplace domain

    Science.gov (United States)

    Suwała, G.; Jankowski, Ł.

    2017-02-01

    This paper proposes and experimentally verifies a Laplace-domain method for identification of structural modifications, which (1) unlike time-domain formulations, allows the identification to be focused on these parts of the frequency spectrum that have a high signal-to-noise ratio, and (2) unlike frequency-domain formulations, decreases the influence of numerical artifacts related to the particular choice of the FFT exponential window decay. In comparison to the time-domain approach proposed earlier, advantages of the proposed method are smaller computational cost and higher accuracy, which leads to reliable performance in more difficult identification cases. Analytical formulas for the first- and second-order sensitivity analysis are derived. The approach is based on a reduced nonparametric model, which has the form of a set of selected structural impulse responses. Such a model can be collected purely experimentally, which obviates the need for design and laborious updating of a parametric model, such as a finite element model. The approach is verified experimentally using a 26-node lab 3D truss structure and 30 identification cases of a single mass modification or two concurrent mass modifications.

  18. A New Non-Parametric Approach to Galaxy Morphological Classification

    CERN Document Server

    Lotz, J M; Madau, P; Lotz, Jennifer M.; Primack, Joel; Madau, Piero

    2003-01-01

    We present two new non-parametric methods for quantifying galaxy morphology: the relative distribution of the galaxy pixel flux values (the Gini coefficient or G) and the second-order moment of the brightest 20% of the galaxy's flux (M20). We test the robustness of G and M20 to decreasing signal-to-noise and spatial resolution, and find that both measures are reliable to within 10% at average signal-to-noise per pixel greater than 3 and resolutions better than 1000 pc and 500 pc, respectively. We have measured G and M20, as well as concentration (C), asymmetry (A), and clumpiness (S) in the rest-frame near-ultraviolet/optical wavelengths for 150 bright local "normal" Hubble-type (E-Sd) galaxies and 104 0.05 < z < 0.25 ultra-luminous infrared galaxies (ULIRGs). We find that most local galaxies follow a tight sequence in G-M20-C, where early-types have high G and C and low M20 and late-type spirals have lower G and C and higher M20. The majority of ULIRGs lie above the normal galaxy G-M20 sequence...
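    A hedged sketch of the Gini coefficient of pixel fluxes as it is commonly defined in this kind of non-parametric morphology work; M20 additionally requires pixel coordinates and is omitted, and the toy images are our own.

    ```python
    # Hedged sketch: Gini coefficient of pixel fluxes -- near 0 when flux is spread evenly,
    # near 1 when concentrated in a few bright pixels.
    import numpy as np

    def gini(flux):
        f = np.sort(np.abs(np.ravel(flux)))
        n = f.size
        i = np.arange(1, n + 1)
        return np.sum((2 * i - n - 1) * f) / (f.mean() * n * (n - 1))

    flat = np.ones((32, 32))                        # perfectly uniform image
    peaked = np.zeros((32, 32)); peaked[16, 16] = 1.0   # all flux in one pixel
    print(gini(flat), gini(peaked))                 # ~0 for flat, ~1 for the single bright pixel
    ```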

  19. Nonparametric Bayes modeling for case control studies with many predictors.

    Science.gov (United States)

    Zhou, Jing; Herring, Amy H; Bhattacharya, Anirban; Olshan, Andrew F; Dunson, David B

    2016-03-01

    It is common in biomedical research to run case-control studies involving high-dimensional predictors, with the main goal being detection of the sparse subset of predictors having a significant association with disease. Usual analyses rely on independent screening, considering each predictor one at a time, or in some cases on logistic regression assuming no interactions. We propose a fundamentally different approach based on a nonparametric Bayesian low rank tensor factorization model for the retrospective likelihood. Our model allows a very flexible structure in characterizing the distribution of multivariate variables as unknown and without any linear assumptions as in logistic regression. Predictors are excluded only if they have no impact on disease risk, either directly or through interactions with other predictors. Hence, we obtain an omnibus approach for screening for important predictors. Computation relies on an efficient Gibbs sampler. The methods are shown to have high power and low false discovery rates in simulation studies, and we consider an application to an epidemiology study of birth defects.

  20. Biological parametric mapping with robust and non-parametric statistics.

    Science.gov (United States)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M; Landman, Bennett A

    2011-07-15

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, regions of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric mapping approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and non-parametric regression in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provide a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Adaptive Neural Network Nonparametric Identifier With Normalized Learning Laws.

    Science.gov (United States)

    Chairez, Isaac

    2016-04-05

    This paper addresses the design of a normalized convergent learning law for neural networks (NNs) with continuous dynamics. The NN is used here to obtain a nonparametric model for uncertain systems described by a set of ordinary differential equations. The source of uncertainties is the presence of some external perturbations and poor knowledge of the nonlinear function describing the system dynamics. A new adaptive algorithm based on normalized algorithms was used to adjust the weights of the NN. The adaptive algorithm was derived by means of a nonstandard logarithmic Lyapunov function (LLF). Two identifiers were designed using two variations of LLFs, leading to a normalized learning law for the first identifier and a variable-gain normalized learning law for the second. In the case of the second identifier, the inclusion of normalized learning laws reduces the size of the convergence region obtained as the solution of the practical stability analysis. On the other hand, the speed of convergence of the learning laws depends inversely on the norm of the errors. This avoids peaking transient behavior in the time evolution of the weights, which accelerates the convergence of the identification error. A numerical example demonstrates the improvements achieved by the algorithm introduced in this paper compared with classical schemes with non-normalized continuous learning methods. A comparison of the identification performance achieved by the non-normalized identifier and the ones developed in this paper shows the benefits of the proposed learning law.

  2. Nonparametric estimation of quantum states, processes and measurements

    Science.gov (United States)

    Lougovski, Pavel; Bennink, Ryan

    Quantum state, process, and measurement estimation methods traditionally use parametric models, in which the number and role of relevant parameters is assumed to be known. When such an assumption cannot be justified, a common approach in many disciplines is to fit the experimental data to multiple models with different sets of parameters and utilize an information criterion to select the best fitting model. However, it is not always possible to assume a model with a finite (countable) number of parameters. This typically happens when there are unobserved variables that stem from hidden correlations that can only be unveiled after collecting experimental data. How does one perform quantum characterization in this situation? We present a novel nonparametric method of experimental quantum system characterization based on the Dirichlet Process (DP) that addresses this problem. Using DP as a prior in conjunction with Bayesian estimation methods allows us to increase model complexity (number of parameters) adaptively as the number of experimental observations grows. We illustrate our approach for the one-qubit case and show how a probability density function for an unknown quantum process can be estimated.

  3. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    Summary. A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which together with results from simulated data highlight the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  4. Non-parametric and least squares Langley plot methods

    Directory of Open Access Journals (Sweden)

    P. W. Kiedron

    2015-04-01

    Full Text Available Langley plots are used to calibrate sun radiometers primarily for the measurement of the aerosol component of the atmosphere that attenuates (scatters and absorbs) incoming direct solar radiation. In principle, the calibration of a sun radiometer is a straightforward application of the Bouguer–Lambert–Beer law V = V0 e^(−τ·m), where a plot of ln(V) (voltage) vs. m (air mass) yields a straight line with intercept ln(V0). This ln(V0) subsequently can be used to solve for τ for any measurement of V and calculation of m. This calibration works well on some high mountain sites, but the application of the Langley plot calibration technique is more complicated at other, more interesting, locales. This paper is concerned with ferreting out calibrations at difficult sites and examining and comparing a number of conventional and non-conventional methods for obtaining successful Langley plots. The eleven techniques discussed indicate that both least squares and various non-parametric techniques produce satisfactory calibrations with no significant differences among them when the time series of ln(V0)'s are smoothed and interpolated with median and mean moving window filters.
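    A hedged sketch of the basic least-squares Langley fit described above: regress ln(V) on air mass m and read off the intercept ln(V0) and optical depth τ; the synthetic data stand in for the smoothed, filtered measurements discussed in the paper.

    ```python
    # Hedged sketch: least-squares Langley fit on synthetic clear-sky data.
    import numpy as np

    rng = np.random.default_rng(0)
    true_lnV0, true_tau = 2.0, 0.15
    m = np.linspace(1.0, 5.0, 60)                       # air mass over a clear morning
    lnV = true_lnV0 - true_tau * m + 0.01 * rng.standard_normal(m.size)

    slope, intercept = np.polyfit(m, lnV, 1)            # straight line in (m, ln V)
    print("ln(V0) =", round(intercept, 3), " tau =", round(-slope, 3))
    ```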

  5. Pivotal Estimation of Nonparametric Functions via Square-root Lasso

    CERN Document Server

    Belloni, Alexandre; Wang, Lie

    2011-01-01

    In a nonparametric linear regression model we study a variant of LASSO, called square-root LASSO, which does not require knowledge of the scaling parameter $\sigma$ of the noise or bounds for it. This work derives new finite sample upper bounds for the prediction norm rate of convergence, the $\ell_1$-rate of convergence, the $\ell_\infty$-rate of convergence, and the sparsity of the square-root LASSO estimator. A lower bound for the prediction norm rate of convergence is also established. In many non-Gaussian noise cases, we rely on moderate deviation theory for self-normalized sums and on new data-dependent empirical process inequalities to achieve Gaussian-like results provided log p = o(n^{1/3}), improving upon results derived in the parametric case that required log p = O(log n). In addition, we derive finite sample bounds on the performance of ordinary least squares (OLS) applied to the model selected by square-root LASSO, accounting for possible misspecification of the selected model. In particular, we provide mild con...

  6. Perfect Derived Propagators

    CERN Document Server

    Schulte, Christian

    2008-01-01

    When implementing a propagator for a constraint, one must decide about variants: When implementing min, should one also implement max? Should one implement linear equations both with and without coefficients? Constraint variants are ubiquitous: implementing them requires considerable (if not prohibitive) effort and decreases maintainability, but will deliver better performance. This paper shows how to use variable views, previously introduced for an implementation architecture, to derive perfect propagator variants. A model for views and derived propagators is introduced. Derived propagators are proved to be indeed perfect in that they inherit essential properties such as correctness and domain and bounds consistency. Techniques for systematically deriving propagators such as transformation, generalization, specialization, and channeling are developed for several variable domains. We evaluate the massive impact of derived propagators. Without derived propagators, Gecode would require 140000 rather than 40000 ...

  7. Cognitive neuropsychiatry and delusional belief.

    Science.gov (United States)

    Coltheart, Max

    2007-08-01

    Cognitive neuropsychiatry is a new field of cognitive psychology which seeks to learn more about the normal operation of high-level aspects of cognition such as belief formation, reasoning, decision making, theory of mind, and pragmatics by studying people in whom such processes are abnormal. So far, the high-level cognitive process most widely studied in cognitive neuropsychiatry has been belief formation, investigated by examining people with delusional beliefs. This paper describes some of the forms of delusional belief that have been examined from this perspective and offers a general two-deficit cognitive-neuropsychiatric account of delusional belief.

  8. Information and Heterogeneous Beliefs

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Qin, Zhenjiang

    2014-01-01

    In an incomplete market with heterogeneous prior beliefs, we show public information can have a substantial impact on the ex ante cost of capital, trading volume, and investor welfare. The Pareto efficient public information system is the system enjoying the maximum ex ante cost of capital and the maximum expected abnormal trading volume. Imperfect public information increases the gains-to-trade based on heterogeneously updated posterior beliefs. In an exchange economy, this leads to higher growth in the investors' certainty equivalents and, thus, a higher equilibrium interest rate, whereas the ex ante risk premium is unaffected by the informativeness of the public information system. Similar results are obtained in a production economy, but the impact on the ex ante cost of capital is dampened compared to the exchange economy due to welfare improving reductions in real investments to smooth...

  9. Patient Belief in Miracles

    DEFF Research Database (Denmark)

    Hvidt, Niels Christian

    2011-01-01

    Faith and hope in divine healing figure in most religious traditions. This chapter looks at faith in healing miracles and explores how following that faith may involve both risks and advantages. On the one hand, it may imply a risk by camouflaging a deferring attitude, as when patients decline medical treatment on the basis of their belief in Divine intervention. On the other hand, faith in miracles forms an important part of a well-integrated religiosity by inspiring hope and so helping patients to find meaning and initiative in situations in which they might otherwise be tempted to give up. Against the backdrop of such considerations, the chapter provides recommendations for health professionals and relatives as to how to handle patient belief in miracles in practice.

  10. Assessing superstitious belief.

    Science.gov (United States)

    Dagnall, Neil; Parker, Andrew; Munley, Gary

    2009-04-01

    The current study evaluated the psychometric properties of Wiseman and Watt's (2004) negative and positive superstitious belief items. The original items were compared with a modified, reworded set of items which emphasized each item's relation with either good or bad luck, and standard psychometric analyses were done. Modifying the items did not improve their psychometric properties; there was a negligible effect on Cronbach alpha, and Positive Item 3 continued to perform poorly. Confirmatory factor analysis, using the maximum likelihood method, suggested that a two-factor solution was preferable to a one-factor solution for both the original and modified items and that the problematic item should be discounted. It was concluded that the items require development and refinement before firm conclusions can be made about the factorial structure of superstitious belief. These results should also be tested further using Rasch methods.

  11. Patient Belief in Miracles

    DEFF Research Database (Denmark)

    Hvidt, Niels Christian

    2011-01-01

    Faith and hope in divine healing figure in most religious traditions. This chapter looks at faith in healing miracles and explores how following that faith may involve both risks and advantages. On the one hand, it may imply a risk by camouflaging a deferring attitude, as when patients decline medical treatment on the basis of their belief in Divine intervention. On the other hand, faith in miracles forms an important part of a well-integrated religiosity by inspiring hope and so helping patients to find meaning and initiative in situations in which they might otherwise be tempted to give up. Against the backdrop of such considerations, the chapter provides recommendations for health professionals and relatives as to how to handle patient belief in miracles in practice.

  12. Strong Convergence of Partitioning Estimation for Nonparametric Regression Function under Dependence Samples

    Institute of Scientific and Technical Information of China (English)

    LING Neng-xiang; DU Xue-qiao

    2005-01-01

    In this paper, we study the strong consistency of the partitioning estimator of a regression function under samples that are φ-mixing sequences with identical distribution. Key words: nonparametric regression function; partitioning estimation; strong convergence; φ-mixing sequences.

  13. Kernel bandwidth estimation for non-parametric density estimation: a comparative study

    CSIR Research Space (South Africa)

    Van der Walt, CM

    2013-12-01

    Full Text Available We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...
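    A hedged sketch of two conventional bandwidth selectors of the kind such comparisons typically start from (Silverman's rule of thumb in its robust variant, and the simple normal-reference rule); the study's own estimators and pattern-recognition datasets are not reproduced.

    ```python
    # Hedged sketch: conventional bandwidth selectors for a Gaussian-kernel density estimate.
    import numpy as np

    def silverman_bandwidth(x):
        n = x.size
        iqr = np.percentile(x, 75) - np.percentile(x, 25)
        sigma = min(np.std(x, ddof=1), iqr / 1.349)       # robust spread estimate
        return 0.9 * sigma * n ** (-0.2)

    def normal_reference_bandwidth(x):
        return 1.06 * np.std(x, ddof=1) * x.size ** (-0.2)

    x = np.random.default_rng(0).standard_normal(500)
    print(round(silverman_bandwidth(x), 4), round(normal_reference_bandwidth(x), 4))
    ```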

  14. Bayesian nonparametric estimation and consistency of mixed multinomial logit choice models

    CERN Document Server

    De Blasi, Pierpaolo; Lau, John W; 10.3150/09-BEJ233

    2011-01-01

    This paper develops nonparametric estimation for discrete choice models based on the mixed multinomial logit (MMNL) model. It has been shown that MMNL models encompass all discrete choice models derived under the assumption of random utility maximization, subject to the identification of an unknown distribution $G$. Noting the mixture model description of the MMNL, we employ a Bayesian nonparametric approach, using nonparametric priors on the unknown mixing distribution $G$, to estimate choice probabilities. We provide an important theoretical support for the use of the proposed methodology by investigating consistency of the posterior distribution for a general nonparametric prior on the mixing distribution. Consistency is defined according to an $L_1$-type distance on the space of choice probabilities and is achieved by extending to a regression model framework a recent approach to strong consistency based on the summability of square roots of prior probabilities. Moving to estimation, slightly different te...

  15. Nonparametric Monitoring for Geotechnical Structures Subject to Long-Term Environmental Change

    Directory of Open Access Journals (Sweden)

    Hae-Bum Yun

    2011-01-01

    Full Text Available A nonparametric, data-driven methodology of monitoring for geotechnical structures subject to long-term environmental change is discussed. Avoiding physical assumptions or excessive simplification of the monitored structures, the nonparametric monitoring methodology presented in this paper provides reliable performance-related information particularly when the collection of sensor data is limited. For the validation of the nonparametric methodology, a field case study was performed using a full-scale retaining wall, which had been monitored for three years using three tilt gauges. Using the very limited sensor data, it is demonstrated that important performance-related information, such as drainage performance and sensor damage, could be disentangled from significant daily, seasonal and multiyear environmental variations. Extensive literature review on recent developments of parametric and nonparametric data processing techniques for geotechnical applications is also presented.

  16. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models

    CERN Document Server

    Fan, Jianqing; Song, Rui

    2011-01-01

    A variable screening procedure via correlation learning was proposed by Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening. Several closely related variable screening procedures are proposed. Under the nonparametric additive models, it is shown that under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, an iterative nonparametric independence screening (INIS) is also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data a...

  17. Nonparametric TOA estimators for low-resolution IR-UWB digital receiver

    Institute of Scientific and Technical Information of China (English)

    Yanlong Zhang; Weidong Chen

    2015-01-01

    Nonparametric time-of-arrival (TOA) estimators for impulse radio ultra-wideband (IR-UWB) signals are proposed. Nonparametric detection is obviously useful in situations where detailed information about the statistics of the noise is unavailable or not accurate. Such TOA estimators are obtained based on conditional statistical tests with only a symmetry distribution assumption on the noise probability density function. The nonparametric estimators are attractive choices for low-resolution IR-UWB digital receivers which can be implemented by fast comparators or high sampling rate low resolution analog-to-digital converters (ADCs), in place of high sampling rate high resolution ADCs which may not be available in practice. Simulation results demonstrate that nonparametric TOA estimators provide more effective and robust performance than typical energy detection (ED) based estimators.

  18. Nonparametric statistical tests for the continuous data: the basic concept and the practical use.

    Science.gov (United States)

    Nahm, Francis Sahngun

    2016-02-01

    Conventional statistical tests are usually called parametric tests. Parametric tests are used more frequently than nonparametric tests in many medical articles, because most medical researchers are familiar with them and the statistical software packages strongly support parametric tests. Parametric tests require an important assumption: the assumption of normality, which means that the distribution of sample means is normally distributed. However, a parametric test can be misleading when this assumption is not satisfied. In this circumstance, nonparametric tests are the alternative methods available, because they do not require the normality assumption. Nonparametric tests are statistical methods based on signs and ranks. In this article, we discuss the basic concepts and practical use of nonparametric tests as a guide to their proper use.
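    For readers who want to try the rank-based alternatives mentioned above, a small example (using SciPy; the data are simulated and purely illustrative) runs a Mann-Whitney U test for two independent skewed samples and a Wilcoxon signed-rank test for paired data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      group_a = rng.exponential(scale=1.0, size=30)   # skewed data: normality is doubtful
      group_b = rng.exponential(scale=1.5, size=30)

      # Rank-based alternative to the two-sample t-test
      u_stat, p_mw = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
      print("Mann-Whitney U:", u_stat, "p =", p_mw)

      # Paired data: Wilcoxon signed-rank test instead of the paired t-test
      before = rng.exponential(scale=1.0, size=25)
      after = before + rng.normal(loc=0.3, scale=0.5, size=25)
      w_stat, p_w = stats.wilcoxon(before, after)
      print("Wilcoxon signed-rank:", w_stat, "p =", p_w)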

  19. Examples of the Application of Nonparametric Information Geometry to Statistical Physics

    Directory of Open Access Journals (Sweden)

    Giovanni Pistone

    2013-09-01

    Full Text Available We review a nonparametric version of Amari’s information geometry in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is then used to discuss typical problems in machine learning and statistical physics, such as black-box optimization, Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.

  20. Economic decision making and the application of nonparametric prediction models

    Science.gov (United States)

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2008-01-01

    Sustained increases in energy prices have focused attention on gas resources in low-permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are often large. Planning and development decisions for extraction of such resources must be areawide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm, the decision to enter such plays depends on reconnaissance-level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional-scale cost functions. The context of the worked example is the Devonian Antrim-shale gas play in the Michigan basin. One finding relates to selection of the resource prediction model to be used with economic models. Models chosen because they can best predict aggregate volume over larger areas (many hundreds of sites) smooth out granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined arbitrarily by extraneous factors. The analysis shows a 15-20% gain in gas volume when these simple models are applied to order drilling prospects strategically rather than to choose drilling locations randomly. Copyright © 2008 Society of Petroleum Engineers.
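    The paper's local regression models are not reproduced here, but a minimal Gaussian-kernel (Nadaraya-Watson) smoother conveys the idea of predicting recoverable volumes at untested sites from nearby drilled sites; the coordinates, bandwidth and volumes below are invented for illustration only:

      import numpy as np

      def nadaraya_watson(train_xy, train_volume, query_xy, bandwidth=2.0):
          """Gaussian-kernel local regression: predict recoverable volume at
          untested sites from volumes observed at nearby drilled sites."""
          d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
          w = np.exp(-0.5 * d2 / bandwidth ** 2)
          return (w @ train_volume) / w.sum(axis=1)

      rng = np.random.default_rng(4)
      drilled = rng.uniform(0, 50, size=(300, 2))                     # site coordinates (km)
      volume = 10 + 0.2 * drilled[:, 0] + rng.gamma(2.0, 1.0, 300)    # observed volumes
      untested = rng.uniform(0, 50, size=(5, 2))
      print(nadaraya_watson(drilled, volume, untested))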

  1. A robust nonparametric method for quantifying undetected extinctions.

    Science.gov (United States)

    Chisholm, Ryan A; Giam, Xingli; Sadanandan, Keren R; Fung, Tak; Rheindt, Frank E

    2016-06-01

    How many species have gone extinct in modern times before being described by science? To answer this question, and thereby get a full assessment of humanity's impact on biodiversity, statistical methods that quantify undetected extinctions are required. Such methods have been developed recently, but they are limited by their reliance on parametric assumptions; specifically, they assume the pools of extant and undetected species decay exponentially, whereas real detection rates vary temporally with survey effort and real extinction rates vary with the waxing and waning of threatening processes. We devised a new, nonparametric method for estimating undetected extinctions. As inputs, the method requires only the first and last date at which each species in an ensemble was recorded. As outputs, the method provides estimates of the proportion of species that have gone extinct, detected, or undetected and, in the special case where the number of undetected extant species in the present day is assumed close to zero, of the absolute number of undetected extinct species. The main assumption of the method is that the per-species extinction rate is independent of whether a species has been detected or not. We applied the method to the resident native bird fauna of Singapore. Of 195 recorded species, 58 (29.7%) have gone extinct in the last 200 years. Our method projected that an additional 9.6 species (95% CI 3.4, 19.8) have gone extinct without first being recorded, implying a true extinction rate of 33.0% (95% CI 31.0%, 36.2%). We provide R code for implementing our method. Because our method does not depend on strong assumptions, we expect it to be broadly useful for quantifying undetected extinctions. © 2016 Society for Conservation Biology.

  2. Nonparametric Bayesian inference of the microcanonical stochastic block model

    Science.gov (United States)

    Peixoto, Tiago P.

    2017-01-01

    A principled approach to characterize the hidden modular structure of networks is to formulate generative models and then infer their parameters from data. When the desired structure is composed of modules or "communities," a suitable choice for this task is the stochastic block model (SBM), where nodes are divided into groups, and the placement of edges is conditioned on the group memberships. Here, we present a nonparametric Bayesian method to infer the modular structure of empirical networks, including the number of modules and their hierarchical organization. We focus on a microcanonical variant of the SBM, where the structure is imposed via hard constraints, i.e., the generated networks are not allowed to violate the patterns imposed by the model. We show how this simple model variation allows simultaneously for two important improvements over more traditional inference approaches: (1) deeper Bayesian hierarchies, with noninformative priors replaced by sequences of priors and hyperpriors, which not only remove limitations that seriously degrade the inference on large networks but also reveal structures at multiple scales; (2) a very efficient inference algorithm that scales well not only for networks with a large number of nodes and edges but also with an unlimited number of modules. We show also how this approach can be used to sample modular hierarchies from the posterior distribution, as well as to perform model selection. We discuss and analyze the differences between sampling from the posterior and simply finding the single parameter estimate that maximizes it. Furthermore, we expose a direct equivalence between our microcanonical approach and alternative derivations based on the canonical SBM.
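    In practice, this kind of nested microcanonical SBM inference is available in the author's graph-tool library. The short sketch below assumes graph-tool is installed and uses call names from its documented interface as recalled here; they may differ between versions:

      # a minimal sketch, assuming the graph-tool Python library is available
      import graph_tool.all as gt

      g = gt.collection.data["football"]            # example network shipped with graph-tool
      state = gt.minimize_nested_blockmodel_dl(g)   # fit the nested (hierarchical) SBM by minimizing description length
      state.print_summary()                         # group counts at each level of the hierarchy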

  3. Nonparametric Coupled Bayesian Dictionary and Classifier Learning for Hyperspectral Classification.

    Science.gov (United States)

    Akhtar, Naveed; Mian, Ajmal

    2017-10-03

    We present a principled approach to learn a discriminative dictionary along with a linear classifier for hyperspectral classification. Our approach places Gaussian Process priors over the dictionary to account for the relative smoothness of the natural spectra, whereas the classifier parameters are sampled from multivariate Gaussians. We employ two Beta-Bernoulli processes to jointly infer the dictionary and the classifier. These processes are coupled under the same sets of Bernoulli distributions. In our approach, these distributions signify the frequency of the dictionary atom usage in representing class-specific training spectra, which also makes the dictionary discriminative. Due to the coupling between the dictionary and the classifier, the popularity of the atoms for representing different classes gets encoded into the classifier. This helps in predicting the class labels of test spectra that are first represented over the dictionary by solving a simultaneous sparse optimization problem. The labels of the spectra are predicted by feeding the resulting representations to the classifier. Our approach exploits the nonparametric Bayesian framework to automatically infer the dictionary size--the key parameter in discriminative dictionary learning. Moreover, it also has the desirable property of adaptively learning the association between the dictionary atoms and the class labels by itself. We use Gibbs sampling to infer the posterior probability distributions over the dictionary and the classifier under the proposed model, for which we derive analytical expressions. To establish the effectiveness of our approach, we test it on benchmark hyperspectral images. The classification performance is compared with the state-of-the-art dictionary learning-based classification methods.

  4. Non-parametric combination and related permutation tests for neuroimaging.

    Science.gov (United States)

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction.
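    A compact sketch of the combination idea (a generic two-modality, two-group toy example, not the NPC implementation evaluated in the paper) uses synchronized permutations, i.e., the same relabelling applied to every modality, and Fisher's combining function:

      import numpy as np

      def npc_fisher(data_mod1, data_mod2, labels, n_perm=5000, seed=0):
          """Non-parametric combination (Fisher combining) with synchronized
          permutations: one relabelling is applied to both modalities."""
          rng = np.random.default_rng(seed)

          def partial_stat(x, lab):
              return abs(x[lab == 1].mean() - x[lab == 0].mean())

          stats1 = np.empty(n_perm + 1)
          stats2 = np.empty(n_perm + 1)
          stats1[0] = partial_stat(data_mod1, labels)      # observed statistics first
          stats2[0] = partial_stat(data_mod2, labels)
          for i in range(1, n_perm + 1):
              perm = rng.permutation(labels)               # one permutation, reused for both modalities
              stats1[i] = partial_stat(data_mod1, perm)
              stats2[i] = partial_stat(data_mod2, perm)

          # permutation p-values of every (observed or permuted) partial statistic
          p1 = (stats1[None, :] >= stats1[:, None]).mean(axis=1)
          p2 = (stats2[None, :] >= stats2[:, None]).mean(axis=1)
          fisher = -2 * (np.log(p1) + np.log(p2))          # combined statistic per permutation
          return (fisher >= fisher[0]).mean()              # joint (NPC) p-value

      rng = np.random.default_rng(5)
      labels = np.repeat([0, 1], 20)
      mod1 = rng.normal(size=40) + 0.6 * labels            # weak effect in modality 1
      mod2 = rng.normal(size=40) + 0.4 * labels            # weak effect in modality 2
      print(npc_fisher(mod1, mod2, labels))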

  5. Ontological confusions but not mentalizing abilities predict religious belief, paranormal belief, and belief in supernatural purpose.

    Science.gov (United States)

    Lindeman, Marjaana; Svedholm-Häkkinen, Annika M; Lipsanen, Jari

    2015-01-01

    The current research tested the hypothesis that the abilities for understanding other people's minds give rise to the cognitive biases that underlie supernatural beliefs. We used structural equation modeling (N=2789) to determine the roles of various mentalizing tendencies, namely self-reported affective and cognitive empathy (i.e., mind reading), actual cognitive and affective empathic abilities, hyper-empathizing, and two cognitive biases (core ontological confusions and promiscuous teleology) in giving rise to supernatural beliefs. Support for a path from mentalizing abilities through cognitive biases to supernatural beliefs was weak. The relationships of mentalizing abilities with supernatural beliefs were also weak, and these relationships were not substantially mediated by cognitive biases. Core ontological confusions emerged as the best predictor, while promiscuous teleology predicted only a small proportion of variance. The results were similar for religious beliefs, paranormal beliefs, and for belief in supernatural purpose.

  6. Bisimulation and expressivity for conditional belief, degrees of belief, and safe belief

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; van Ditmarsch, Hans;

    2016-01-01

    Plausibility models are Kripke models that agents use to reason about knowledge and belief, both of themselves and of each other. Such models are used to interpret the notions of conditional belief, degrees of belief, and safe belief. The logic of conditional belief contains that modality and also...... the knowledge modality, and similarly for the logic of degrees of belief and the logic of safe belief. With respect to these logics, plausibility models may contain too much information. A proper notion of bisimulation is required that characterises them. We define that notion of bisimulation and prove...... the required characterisations: on the class of image-finite and preimage-finite models (with respect to the plausibility relation), two pointed Kripke models are modally equivalent in either of the three logics, if and only if they are bisimilar. As a result, the information content of such a model can...

  7. Yield Stability of Maize Hybrids Evaluated in Maize Regional Trials in Southwestern China Using Nonparametric Methods

    Institute of Scientific and Technical Information of China (English)

    LIU Yong-jian; DUAN Chuan; TIAN Meng-liang; HU Er-liang; HUANG Yu-bi

    2010-01-01

    Analysis of multi-environment trials (METs) of crops for the evaluation and recommendation of varieties is an important issue in plant breeding research. Evaluating on the both stability of performance and high yield is essential in MET analyses. The objective of the present investigation was to compare 11 nonparametric stability statistics and apply nonparametric tests for genotype-by-environment interaction (GEI) to 14 maize (Zea mays L.) genotypes grown at 25 locations in southwestern China during 2005. Results of nonparametric tests of GEI and a combined ANOVA across locations showed that both crossover and noncrossover GEI, and genotypes varied highly significantly for yield. The results of principal component analysis, correlation analysis of nonparametric statistics, and yield indicated the nonparametric statistics grouped as four distinct classes that corresponded to different agronomic and biological concepts of stability.Furthermore, high values of TOP and low values of rank-sum were associated with high mean yield, but the other nonparametric statistics were not positively correlated with mean yield. Therefore, only rank-sum and TOP methods would be useful for simultaneously selection for high yield and stability. These two statistics recommended JY686 and HX 168 as desirable and ND 108, CM 12, CN36, and NK6661 as undesirable genotypes.

  8. A novel nonparametric confidence interval for differences of proportions for correlated binary data.

    Science.gov (United States)

    Duan, Chongyang; Cao, Yingshu; Zhou, Lizhi; Tan, Ming T; Chen, Pingyan

    2016-11-16

    Various confidence interval estimators have been developed for differences in proportions resulted from correlated binary data. However, the width of the mostly recommended Tango's score confidence interval tends to be wide, and the computing burden of exact methods recommended for small-sample data is intensive. The recently proposed rank-based nonparametric method by treating proportion as special areas under receiver operating characteristic provided a new way to construct the confidence interval for proportion difference on paired data, while the complex computation limits its application in practice. In this article, we develop a new nonparametric method utilizing the U-statistics approach for comparing two or more correlated areas under receiver operating characteristics. The new confidence interval has a simple analytic form with a new estimate of the degrees of freedom of n - 1. It demonstrates good coverage properties and has shorter confidence interval widths than that of Tango. This new confidence interval with the new estimate of degrees of freedom also leads to coverage probabilities that are an improvement on the rank-based nonparametric confidence interval. Comparing with the approximate exact unconditional method, the nonparametric confidence interval demonstrates good coverage properties even in small samples, and yet they are very easy to implement computationally. This nonparametric procedure is evaluated using simulation studies and illustrated with three real examples. The simplified nonparametric confidence interval is an appealing choice in practice for its ease of use and good performance. © The Author(s) 2016.

  9. The Wheeler Propagator

    OpenAIRE

    Bollini, C. G.; Rocca, M. C.

    1998-01-01

    We study the half advanced and half retarded Wheeler Green function and its relation to Feynman propagators. First for massless equation. Then, for Klein-Gordon equations with arbitrary mass parameters; real, imaginary or complex. In all cases the Wheeler propagator lacks an on-shell free propagation. The Wheeler function has support inside the light-cone (whatever the mass). The associated vacuum is symmetric with respect to annihilation and creation operators. We show with some examples tha...

  10. Slav beliefs on changelings

    Directory of Open Access Journals (Sweden)

    Radenković Ljubinko R.

    2002-01-01

    Full Text Available Beliefs and legends that certain mythological creatures - fairies, witches, the devil (vile, veštice, đavo), boginka, mamuna, baenik, domovoj, leshi, etc. - can take away the child from the mother and exchange it for their own in the image of the abducted child are widespread among the West and East Slavs, while among the South Slavs they are found only in the northern parts, in Pannonia. Such a demonic child is most often called: podmeče (with the Serbs), podvršće (with the Croats), podmenek (with the Slovenians), odmienjec (with the Poles), odminok (with the Ukrainians), obmen (with the Russians), etc. According to the folk beliefs, a changeling differs from other children by its sluggish growth, voraciousness, and persistent desire to harm or spite other members of the household. Slav legends mention the ways of stealing the human and planting the demonic child (a), recognizing the demonic child (b), and disposing of it and restoring the rightful child (c). In order to prevent the demon from exchanging her child, the mother must observe certain rules of conduct during pregnancy and in the 40 days following the childbirth. Certain measures of magical protection are also undertaken, such as: placing sharp iron objects near the nursing woman, then brooms, leaving a candle to burn all night, burning frankincense in her presence, sprinkling her with holy water, etc. The legends on changelings were most probably adopted by the Slavs from the neighboring western peoples (Germans), and included in the already present beliefs that the birth of a child is a gift from the other world, and that the mother must take great care of the gift and be grateful for it. Otherwise, the one bestowing the gift may take it away as well.

  11. Belief in a Just World

    Directory of Open Access Journals (Sweden)

    Sibel Kilinc

    2011-01-01

    Full Text Available The belief in a just world hypothesis is defined as the belief that the world is a just place where people generally get what they deserve. It states that individuals have a need to believe that they live in a just world; they believe in a world where people get what they deserve and where people deserve what they get. Individuals believe that those who work hard or who perform good acts obtain rewards for their actions, while the sinners and the laggards receive punishments instead. Similarly, individuals want to believe that positive outcomes, whether money, success, or happiness, are obtained only by good people and, conversely, that negative outcomes only happen to bad persons. Justice beliefs have been hypothesized to be adaptive for dealing with day-to-day stress. Just world beliefs protect individuals from the daily negative psychological consequences of living in what is realistically an unjust world. In addition, just world beliefs are thought to enhance feelings of security to the extent that if the individual satisfies the conditions for being "good," he or she is protected from injustice. The belief in a just world, like other positive illusions, should contribute to the maintenance of one's mental health. Belief in a just world is discussed in two ways: personal belief in a just world answers the question “how just is the world to me?”, whereas general belief in a just world answers the question “how just is the world?”

  12. Homo economicus belief inhibits trust.

    Directory of Open Access Journals (Sweden)

    Ziqiang Xin

    Full Text Available As a foundational concept in economics, the homo economicus assumption regards humans as rational and self-interested actors. In contrast, trust requires individuals to believe partners' benevolence and unselfishness. Thus, the homo economicus belief may inhibit trust. The present three experiments demonstrated that the direct exposure to homo economicus belief can weaken trust. And economic situations like profit calculation can also activate individuals' homo economicus belief and inhibit their trust. It seems that people's increasing homo economicus belief may serve as one cause of the worldwide decline of trust.

  13. Homo economicus belief inhibits trust.

    Science.gov (United States)

    Xin, Ziqiang; Liu, Guofang

    2013-01-01

    As a foundational concept in economics, the homo economicus assumption regards humans as rational and self-interested actors. In contrast, trust requires individuals to believe partners' benevolence and unselfishness. Thus, the homo economicus belief may inhibit trust. The present three experiments demonstrated that the direct exposure to homo economicus belief can weaken trust. And economic situations like profit calculation can also activate individuals' homo economicus belief and inhibit their trust. It seems that people's increasing homo economicus belief may serve as one cause of the worldwide decline of trust.

  14. Shallow-Water Propagation

    Science.gov (United States)

    2016-06-07

    Shallow-Water Propagation. William L. Siegmann, Rensselaer Polytechnic Institute, Troy, New York. LONG-TERM GOALS: Develop methods for propagation and coherence calculations in complex shallow-water environments, determine...intensity and coherence. APPROACH: (A) Develop high-accuracy PE techniques for applications to shallow-water sediments, accounting for

  15. A formal approach to aggregated belief formation

    NARCIS (Netherlands)

    Heuvelink, A.; Klein, M.C.A.; Treur, J.

    2008-01-01

    This paper introduces a formal method to aggregate over basic beliefs, in order to deduce aggregated or complex beliefs as often used in applications. Complex beliefs can represent several things, such as a belief about a period in which other beliefs held or the minimal or maximal certainty with wh

  16. Recursive belief manipulation and second-order false-beliefs

    DEFF Research Database (Denmark)

    Braüner, Torben; Blackburn, Patrick Rowan; Polyanskaya, Irina

    2016-01-01

    The literature on first-order false-belief is extensive, but less is known about the second-order case. The ability to handle second-order false-beliefs correctly seems to mark a cognitively significant step, but what is its status? Is it an example of *complexity only* development, or does it in...

  17. Non-Parametric Bayesian State Space Estimator for Negative Information

    Directory of Open Access Journals (Sweden)

    Guillaume de Chambrier

    2017-09-01

    Full Text Available Simultaneous Localization and Mapping (SLAM) is concerned with the development of filters to accurately and efficiently infer the state parameters (position, orientation, etc.) of an agent and aspects of its environment, commonly referred to as the map. A mapping system is necessary for the agent to achieve situatedness, which is a precondition for planning and reasoning. In this work, we consider an agent who is given the task of finding a set of objects. The agent has limited perception and can only sense the presence of objects if a direct contact is made; as a result, most of the sensing is negative information. In the absence of recurrent sightings or direct measurements of objects, there are no correlations from the measurement errors that can be exploited. This renders SLAM estimators for which this fact is their backbone, such as EKF-SLAM, ineffective. In addition, for our setting, no assumptions are made with respect to the marginals (beliefs) of both the agent and the objects (map). Given the loose assumptions we stipulate regarding the marginals and measurements, we adopt a histogram parametrization. We introduce a Bayesian State Space Estimator (BSSE), which we name the Measurement Likelihood Memory Filter (MLMF), in which the values of the joint distribution are not parametrized; instead, we directly apply changes from the measurement integration step to the marginals. This is achieved by keeping track of the history of the likelihood functions’ parameters. We demonstrate that the MLMF gives the same filtered marginals as a histogram filter and show two implementations: MLMF and scalable-MLMF, which both have a linear space complexity. The original MLMF retains an exponential time complexity (although an order of magnitude smaller than the histogram filter), while the scalable-MLMF introduces an independence assumption so as to have a linear time complexity. We further quantitatively demonstrate the scalability of our algorithm with 25 beliefs having up to
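    The core of a histogram-parametrized update with negative information can be shown in a few lines; this is a generic histogram Bayes filter over a 1-D grid with invented numbers, not the MLMF bookkeeping of likelihood histories described above:

      import numpy as np

      def measurement_update(belief, probed_cell, p_detect=0.95):
          """Histogram-filter update after a *negative* measurement: the agent
          probed `probed_cell` and did not feel the object there."""
          likelihood = np.ones_like(belief)
          likelihood[probed_cell] = 1.0 - p_detect     # miss probability at the probed cell
          posterior = belief * likelihood
          return posterior / posterior.sum()           # renormalise the marginal

      belief = np.full(20, 1 / 20)                     # uniform prior over 20 cells
      for probed in [3, 7, 7, 12]:                     # a sequence of contacts that found nothing
          belief = measurement_update(belief, probed)
      print(belief.round(3))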

  18. A Hybrid Index for Characterizing Drought Based on a Nonparametric Kernel Estimator

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Shengzhi; Huang, Qiang; Leng, Guoyong; Chang, Jianxia

    2016-06-01

    This study develops a nonparametric multivariate drought index, namely, the Nonparametric Multivariate Standardized Drought Index (NMSDI), by considering the variations of both precipitation and streamflow. Building upon previous efforts in constructing nonparametric multivariate drought indices, we use the nonparametric kernel estimator to derive the joint distribution of precipitation and streamflow, thus providing additional insights in drought index development. The proposed NMSDI is applied in the Wei River Basin (WRB), based on which the drought evolution characteristics are investigated. Results indicate: (1) generally, NMSDI captures the drought onset similar to the Standardized Precipitation Index (SPI) and drought termination and persistence similar to the Standardized Streamflow Index (SSFI); the drought events identified by NMSDI match well with historical drought records in the WRB, and the performances are also consistent with those of an existing Multivariate Standardized Drought Index (MSDI) at various timescales, confirming the validity of the newly constructed NMSDI in drought detection; (2) an increasing risk of drought has been detected for the past decades and will persist to a certain extent in the future in most areas of the WRB; (3) the identified change points of annual NMSDI are mainly concentrated in the early 1970s and middle 1990s, coincident with extensive water use and soil reservation practices. This study highlights the nonparametric multivariate drought index, which can be used for drought detection and prediction efficiently and comprehensively.
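    A minimal sketch of the kernel idea (not the published NMSDI code; the bandwidth rule and data are illustrative) estimates the joint nonexceedance probability of precipitation and streamflow with a Gaussian product-kernel CDF and maps it to a standardized scale with the inverse normal:

      import numpy as np
      from scipy.stats import norm

      def nmsdi_like_index(precip, flow, bandwidth=None):
          """Kernel-CDF version of a joint drought index."""
          x = np.asarray(precip, float)
          y = np.asarray(flow, float)
          n = len(x)
          if bandwidth is None:                         # simple rule-of-thumb bandwidths
              hx = 1.06 * x.std() * n ** (-1 / 5)
              hy = 1.06 * y.std() * n ** (-1 / 5)
          else:
              hx = hy = bandwidth
          # F_hat(x_i, y_i) = (1/n) sum_j Phi((x_i - x_j)/hx) * Phi((y_i - y_j)/hy)
          joint_cdf = (norm.cdf((x[:, None] - x[None, :]) / hx) *
                       norm.cdf((y[:, None] - y[None, :]) / hy)).mean(axis=1)
          return norm.ppf(joint_cdf)                    # standardized (SPI-like) scale

      rng = np.random.default_rng(6)
      precip = rng.gamma(4.0, 20.0, size=360)           # simulated monthly precipitation
      flow = 0.6 * precip + rng.gamma(2.0, 10.0, size=360)
      print(nmsdi_like_index(precip, flow)[:12].round(2))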

  19. Free will and paranormal beliefs

    Science.gov (United States)

    Mogi, Ken

    2014-01-01

    Free will is one of the fundamental aspects of human cognition. In the context of cognitive neuroscience, various experiments on time perception, sensorimotor coordination, and agency suggest the possibility that it is a robust illusion (a feeling independent of actual causal relationship with actions) constructed by neural mechanisms. Humans are known to suffer from various cognitive biases and failures, and the sense of free will might be one of them. Here I report a positive correlation between the belief in free will and paranormal beliefs (UFO, reincarnation, astrology, and psi). Web questionnaires involving 2076 subjects (978 males, 1087 females, and 11 other genders) were conducted, which revealed significant positive correlations between belief in free will (theory and practice) and paranormal beliefs. There was no significant correlation between belief in free will and knowledge in paranormal phenomena. Paranormal belief scores for females were significantly higher than those for males, with corresponding significant (albeit weaker) difference in belief in free will. These results are consistent with the view that free will is an illusion which shares common cognitive elements with paranormal beliefs. PMID:24765084

  20. Politics of climate change belief

    Science.gov (United States)

    2017-01-01

    Donald Trump's actions during the election and his first weeks as US president-elect send a strong message about his belief in climate change, or lack thereof. However, these actions may reflect polarization of climate change beliefs, not climate mitigation behaviour.

  1. Free will and paranormal beliefs

    Directory of Open Access Journals (Sweden)

    Ken eMogi

    2014-04-01

    Full Text Available Free will is one of the fundamental aspects of human cognition. In the context of cognitive neuroscience, various experiments on time perception, sensorimotor coordination, and agency suggest the possibility that it is a robust illusion (a feeling independent of actual causal relationship with actions) constructed by neural mechanisms. Humans are known to suffer from various cognitive biases and failures, and the sense of free will might be one of them. Here I report a positive correlation between the belief in free will and paranormal beliefs (UFO, reincarnation, astrology, and psi). Web questionnaires involving 2076 subjects (978 males, 1087 females, and 11 other genders) were conducted, which revealed significant positive correlations between belief in free will (theory and practice) and paranormal beliefs. There was no significant correlation between belief in free will and knowledge in paranormal phenomena. Paranormal belief scores for females were significantly higher than those for males, with corresponding significant (albeit weaker) difference in belief in free will. These results are consistent with the view that free will is an illusion which shares common cognitive elements with paranormal beliefs.

  2. Epistemological Beliefs and Academic Achievement

    Science.gov (United States)

    Arslantas, Halis Adnan

    2016-01-01

    This study aimed to identify the relationship between teacher candidates' epistemological beliefs and academic achievement. The participants of the study were 353 teacher candidates studying their fourth year at the Education Faculty. The Epistemological Belief Scale was used which adapted to Turkish through reliability and validity work by…

  3. Bulimic Beliefs: Food for Thought.

    Science.gov (United States)

    Bauer, Barbara G.; Anderson, Wayne P.

    1989-01-01

    Contends that individuals suffering from bulimia nervosa share characteristic pattern of thinking which must be understood if effective treatment is to take place. Presents these beliefs, gathered by clinical experience and literature review, in format describing each belief, discussing common causes for its development, and suggesting therapeutic…

  4. Free will and paranormal beliefs.

    Science.gov (United States)

    Mogi, Ken

    2014-01-01

    Free will is one of the fundamental aspects of human cognition. In the context of cognitive neuroscience, various experiments on time perception, sensorimotor coordination, and agency suggest the possibility that it is a robust illusion (a feeling independent of actual causal relationship with actions) constructed by neural mechanisms. Humans are known to suffer from various cognitive biases and failures, and the sense of free will might be one of them. Here I report a positive correlation between the belief in free will and paranormal beliefs (UFO, reincarnation, astrology, and psi). Web questionnaires involving 2076 subjects (978 males, 1087 females, and 11 other genders) were conducted, which revealed significant positive correlations between belief in free will (theory and practice) and paranormal beliefs. There was no significant correlation between belief in free will and knowledge in paranormal phenomena. Paranormal belief scores for females were significantly higher than those for males, with corresponding significant (albeit weaker) difference in belief in free will. These results are consistent with the view that free will is an illusion which shares common cognitive elements with paranormal beliefs.

  5. Belief update as social choice

    NARCIS (Netherlands)

    van Benthem, J.; Girard, P.; Roy, O.; Marion, M.

    2011-01-01

    Dynamic epistemic-doxastic logics describe the new knowledge or new beliefs of agents after some informational event has happened. Technically, this requires an update rule that turns a doxastic-epistemic model M (recording the current information state of the agents) and a dynamic ‘event

  6. Religious beliefs and entrepreneurship among Dutch protestants

    NARCIS (Netherlands)

    C.A. Rietveld (Niels); E. van Burg (Elco)

    2013-01-01

    textabstractReligious beliefs affect the economic behavior of individuals. The aim of this study is to investigate the relation between religious beliefs and entrepreneurship. Empirical evidence that entrepreneurship rates differ among religions suggests that religious beliefs influence the pursuit

  7. Religious beliefs and entrepreneurship among Dutch protestants

    NARCIS (Netherlands)

    C.A. Rietveld (Niels); E. van Burg (Elco)

    2013-01-01

    textabstractReligious beliefs affect the economic behavior of individuals. The aim of this study is to investigate the relation between religious beliefs and entrepreneurship. Empirical evidence that entrepreneurship rates differ among religions suggests that religious beliefs influence the pursuit

  8. Gear Crack Propagation Investigation

    Science.gov (United States)

    1995-01-01

    Reduced weight is a major design goal in aircraft power transmissions. Some gear designs incorporate thin rims to help meet this goal. Thin rims, however, may lead to bending fatigue cracks. These cracks may propagate through a gear tooth or into the gear rim. A crack that propagates through a tooth would probably not be catastrophic, and ample warning of a failure could be possible. On the other hand, a crack that propagates through the rim would be catastrophic. Such cracks could lead to disengagement of a rotor or propeller from an engine, loss of an aircraft, and fatalities. To help create and validate tools for the gear designer, the NASA Lewis Research Center performed in-house analytical and experimental studies to investigate the effect of rim thickness on gear-tooth crack propagation. Our goal was to determine whether cracks grew through gear teeth (benign failure mode) or through gear rims (catastrophic failure mode) for various rim thicknesses. In addition, we investigated the effect of rim thickness on crack propagation life. A finite-element-based computer program simulated gear-tooth crack propagation. The analysis used principles of linear elastic fracture mechanics, and quarter-point, triangular elements were used at the crack tip to represent the stress singularity. The program had an automated crack propagation option in which cracks were grown numerically via an automated remeshing scheme. Crack-tip stress-intensity factors were estimated to determine crack-propagation direction. Also, various fatigue crack growth models were used to estimate crack-propagation life. Experiments were performed in Lewis' Spur Gear Fatigue Rig to validate predicted crack propagation results. Gears with various backup ratios were tested to validate crack-path predictions. Also, test gears were installed with special crack-propagation gages in the tooth fillet region to measure bending-fatigue crack growth. From both predictions and tests, gears with backup ratios
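    The fatigue crack growth models mentioned above are typically of the Paris-law type; the sketch below integrates da/dN = C (ΔK)^m for a through-crack under constant stress range, with material constants and geometry factor chosen only for illustration (they are not the gear-steel values used in the study):

      import numpy as np

      def paris_life(a0, a_final, delta_sigma, C=3.0e-12, m=3.0, geometry=1.12, steps=20000):
          """Estimate load cycles for a crack to grow from a0 to a_final (metres)
          by integrating the Paris law; dK = geometry * delta_sigma * sqrt(pi * a)."""
          a = np.linspace(a0, a_final, steps)
          dK = geometry * delta_sigma * np.sqrt(np.pi * a)     # MPa*sqrt(m)
          dN_da = 1.0 / (C * dK ** m)                          # cycles per metre of growth
          return np.trapz(dN_da, a)                            # total cycles (numerical integration)

      print(f"{paris_life(a0=0.2e-3, a_final=2.0e-3, delta_sigma=300.0):.3e} cycles")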

  9. The Wheeler Propagator

    CERN Document Server

    Bollini, C G

    1998-01-01

    We study the half advanced and half retarded Wheeler Green function and its relation to Feynman propagators. First for massless equation. Then, for Klein-Gordon equations with arbitrary mass parameters; real, imaginary or complex. In all cases the Wheeler propagator lacks an on-shell free propagation. The Wheeler function has support inside the light-cone (whatever the mass). The associated vacuum is symmetric with respect to annihilation and creation operators. We show with some examples that perturbative unitarity holds, whatever the mass (real or complex). Some possible applications are discussed.

  10. Multibiometrics Belief Fusion

    CERN Document Server

    Kisku, Dakshina Ranjan; Gupta, Phalguni

    2010-01-01

    This paper proposes a multimodal biometric system through Gaussian Mixture Model (GMM) for face and ear biometrics with belief fusion of the estimated scores characterized by Gabor responses, and the proposed fusion is accomplished by Dempster-Shafer (DS) decision theory. Face and ear images are convolved with Gabor wavelet filters to extract spatially enhanced Gabor facial features and Gabor ear features. Further, GMM is applied to the high-dimensional Gabor face and Gabor ear responses separately for quantitative measurements. The Expectation Maximization (EM) algorithm is used to estimate density parameters in GMM. This produces two sets of feature vectors which are then fused using Dempster-Shafer theory. Experiments are conducted on a multimodal database containing face and ear images of 400 individuals. It is found that the use of Gabor wavelet filters along with GMM and DS theory can provide a robust and efficient multimodal fusion strategy.
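    The decision-level fusion step relies on Dempster's rule of combination; a self-contained sketch with made-up mass assignments for a face matcher and an ear matcher over the frame {genuine, impostor} shows how the rule renormalizes after discarding conflicting mass:

      from itertools import product

      def dempster_combine(m1, m2):
          """Dempster's rule of combination for two mass functions whose focal
          elements are frozensets over the same frame of discernment."""
          combined, conflict = {}, 0.0
          for (A, wa), (B, wb) in product(m1.items(), m2.items()):
              inter = A & B
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + wa * wb
              else:
                  conflict += wa * wb
          return {A: w / (1.0 - conflict) for A, w in combined.items()}

      G, I = frozenset({"genuine"}), frozenset({"impostor"})
      theta = G | I                                       # the whole frame (ignorance)
      face_mass = {G: 0.7, I: 0.2, theta: 0.1}            # illustrative masses from the face matcher
      ear_mass = {G: 0.6, I: 0.1, theta: 0.3}             # illustrative masses from the ear matcher
      print(dempster_combine(face_mass, ear_mass))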

  11. LIGO: The strong belief

    CERN Multimedia

    Antonella Del Rosso

    2016-01-01

    Twenty years of designing, building and testing a number of innovative technologies, with the strong belief that the endeavour would lead to a historic breakthrough. The Bulletin publishes an abstract of the Courier’s interview with Barry Barish, one of the founding fathers of LIGO.   The plots show the signals of gravitational waves detected by the twin LIGO observatories at Livingston, Louisiana, and Hanford, Washington. (Image: Caltech/MIT/LIGO Lab) On 11 February, the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo collaborations published a historic paper in which they showed a gravitational signal emitted by the merger of two black holes. These results come after 20 years of hard work by a large collaboration of scientists operating the two LIGO observatories in the US. Barry Barish, Linde Professor of Physics, Emeritus at the California Institute of Technology and former Director of the Global Design Effort for the Internat...

  12. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbines. The new information, coming from external and condition monitoring, can be used for direct updating of the stochastic variables through a non-parametric Bayesian...... updating approach and be integrated in the reliability analysis by a third-order polynomial chaos expansion approximation. Although classical Bayesian updating approaches are often used because of their parametric formulation, non-parametric approaches are better alternatives for multi-parametric updating...... with a non-conjugating formulation. The results in this paper show the influence on the time dependent updated reliability when non-parametric and classical Bayesian approaches are used. Further, the influence on the reliability of the number of updated parameters is illustrated....

  13. Local kernel nonparametric discriminant analysis for adaptive extraction of complex structures

    Science.gov (United States)

    Li, Quanbao; Wei, Fajie; Zhou, Shenghan

    2017-05-01

    Linear discriminant analysis (LDA) is one of the most popular methods for linear feature extraction. It usually performs well when the global data structure is consistent with the local data structure. Other frequently used feature extraction approaches usually require linearity, independence, or large-sample conditions. However, in real world applications, these assumptions are not always satisfied or cannot be tested. In this paper, we introduce an adaptive method, local kernel nonparametric discriminant analysis (LKNDA), which integrates conventional discriminant analysis with nonparametric statistics. LKNDA is adept at identifying both complex nonlinear structures and the ad hoc rule. Six simulation cases demonstrate that LKNDA has the advantages of both parametric and nonparametric algorithms and higher classification accuracy. A quartic unilateral kernel function may provide better robustness of prediction than other functions. LKNDA gives an alternative solution for discriminant cases with complex nonlinear feature extraction or unknown feature extraction. Finally, the application of LKNDA to the complex feature extraction of financial market activities is proposed.

  14. Non-parametric seismic hazard analysis in the presence of incomplete data

    Science.gov (United States)

    Yazdani, Azad; Mirzaei, Sajjad; Dadkhah, Koroush

    2017-01-01

    The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distribution, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study aims to introduce an imputation procedure for completing earthquake catalog data that will allow the catalog to be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of the introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate the earthquake magnitude distribution in Tehran, the capital city of Iran.

  15. Cognitive biases explain religious belief, paranormal belief, and belief in life's purpose.

    Science.gov (United States)

    Willard, Aiyana K; Norenzayan, Ara

    2013-11-01

    Cognitive theories of religion have postulated several cognitive biases that predispose human minds towards religious belief. However, to date, these hypotheses have not been tested simultaneously and in relation to each other, using an individual difference approach. We used a path model to assess the extent to which several interacting cognitive tendencies, namely mentalizing, mind body dualism, teleological thinking, and anthropomorphism, as well as cultural exposure to religion, predict belief in God, paranormal beliefs and belief in life's purpose. Our model, based on two independent samples (N=492 and N=920) found that the previously known relationship between mentalizing and belief is mediated by individual differences in dualism, and to a lesser extent by teleological thinking. Anthropomorphism was unrelated to religious belief, but was related to paranormal belief. Cultural exposure to religion (mostly Christianity) was negatively related to anthropomorphism, and was unrelated to any of the other cognitive tendencies. These patterns were robust for both men and women, and across at least two ethnic identifications. The data were most consistent with a path model suggesting that mentalizing comes first, which leads to dualism and teleology, which in turn lead to religious, paranormal, and life's-purpose beliefs. Alternative theoretical models were tested but did not find empirical support.

  16. Modern nonparametric, robust and multivariate methods festschrift in honour of Hannu Oja

    CERN Document Server

    Taskinen, Sara

    2015-01-01

    Written by leading experts in the field, this edited volume brings together the latest findings in the area of nonparametric, robust and multivariate statistical methods. The individual contributions cover a wide variety of topics ranging from univariate nonparametric methods to robust methods for complex data structures. Some examples from statistical signal processing are also given. The volume is dedicated to Hannu Oja on the occasion of his 65th birthday and is intended for researchers as well as PhD students with a good knowledge of statistics.

  17. Multivariate nonparametric regression and visualization with R and applications to finance

    CERN Document Server

    Klemelä, Jussi

    2014-01-01

    A modern approach to statistical learning and its applications through visualization methods With a unique and innovative presentation, Multivariate Nonparametric Regression and Visualization provides readers with the core statistical concepts to obtain complete and accurate predictions when given a set of data. Focusing on nonparametric methods to adapt to the multiple types of data generating mechanisms, the book begins with an overview of classification and regression. The book then introduces and examines various tested and proven visualization techniques for learning samples and functio

  18. NONPARAMETRIC FIXED EFFECT PANEL DATA MODELS: RELATIONSHIP BETWEEN AIR POLLUTION AND INCOME FOR TURKEY

    Directory of Open Access Journals (Sweden)

    Rabia Ece OMAY

    2013-06-01

    Full Text Available In this study, the relationship between gross domestic product (GDP) per capita and sulfur dioxide (SO2) and particulate matter (PM10) per capita is modeled for Turkey. Nonparametric fixed effect panel data analysis is used for the modeling. The panel data cover 12 territories, at the first level of the Nomenclature of Territorial Units for Statistics (NUTS), for the period 1990-2001. In modeling the relationship between GDP and SO2 and PM10 for Turkey, the non-parametric models have given good results.

  19. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  20. Parametrically guided estimation in nonparametric varying coefficient models with quasi-likelihood.

    Science.gov (United States)

    Davenport, Clemontina A; Maity, Arnab; Wu, Yichao

    2015-04-01

    Varying coefficient models allow us to generalize standard linear regression models to incorporate complex covariate effects by modeling the regression coefficients as functions of another covariate. For nonparametric varying coefficients, we can borrow the idea of parametrically guided estimation to improve asymptotic bias. In this paper, we develop a guided estimation procedure for the nonparametric varying coefficient models. Asymptotic properties are established for the guided estimators and a method of bandwidth selection via bias-variance tradeoff is proposed. We compare the performance of the guided estimator with that of the unguided estimator via both simulation and real data examples.

  1. Teachers’ Beliefs and Their Belief Change in an Intercultural Context

    DEFF Research Database (Denmark)

    Wang, Li

    culture in which the teacher has been educated affects how they interpret students, teaching and their teacher role. The study of CFL teachers teaching in intercultural contexts makes a connection between context, cultural issues, individual experience, teacher belief and teaching concerns, and it helps...... of teaching in a new context and in their early years of the teaching careers of CFL teachers in the Danish context. It has been shown that the multifaceted beliefs that CFL teachers hold are based on their personal experience, shaped by context, and mediated by their classroom practices. The educational...... in an intercultural context in relation to the factors shaping teacher belief and how belief is changed to better suit the context. It also gives inspiration for the design of effective teacher preparation, development and training programs for CFL teachers in global contexts by addressing their prior experience...

  2. Beliefs and conceptions. Complementary perspectives

    Directory of Open Access Journals (Sweden)

    Fuensanta Hernández Pina

    2011-04-01

    Full Text Available The beliefs and conceptions of teaching and learning held by teachers are a research topic whose study could mean a step forward in our understanding of important factors for improving the quality of education. For over two decades, a number of researchers have achieved results through studies which offer a corpus of solid knowledge about beliefs and conceptions of teaching and learning, which has resulted in new and interesting interpretations of that relationship. In this paper, we present the ideas about beliefs and conceptions of teaching and learning held by a group of researchers.

  3. Estimation of Esfarayen Farmers Risk Aversion Coefficient and Its Influencing Factors (Nonparametric Approach

    Directory of Open Access Journals (Sweden)

    Z. Nematollahi

    2016-03-01

    absolute risk aversion of 0.00003, which is lower than for the subsample consisting of farmers in the 'non-wealthy' group. This assumption that the absolute risk aversion is a decreasing function of wealth is in accordance with Arrow's (1970) expectation. The method used was to calculate the proportional risk premium (PRP), representing the proportion of the expected payoff of a risky prospect that the farmers would be willing to pay to trade away all the risk for a certain thing, as proposed by Hardaker (2000). Our findings showed that the more risk averse the farmer was, the higher the PRP would be. The farmers' risk premium was 303113 IRR. It should be mentioned that the 'non-wealthy' group had a larger PRP than the 'wealthy' group. Following Freund (1956), if the net revenue for each activity is normally distributed and assuming a negative exponential utility function, we can utilize the absolute risk aversion coefficient to obtain the relative risk aversion coefficient (Rr). Based on this study, Rr varies from 0.31 to 8.49, and the relative coefficient of risk aversion in our sample was 4.79. Our results showed that the majority of farmers in the study area are highly risk averse (Anderson and Dillon, 1992). The relationships between the relative risk aversion coefficients of farmers and their socio-economic characteristics were also evaluated in this study. Results showed that age had a positive impact, while level of wealth and diversity had negative impacts on farmers' risk aversion coefficients. Conclusion: Due to the existence of risk and uncertainty in agriculture, the present study was designed to determine the risk aversion coefficient for Esfarayen farmers. A new non-parametric method and the QP method were used to calculate the coefficient of risk aversion. The model used in this analysis found the optimal farm plan given a planning horizon of 1 year. Thus, the historical mean GM vector and variance-covariance matrix were assumed to represent farmers' beliefs. Our results showed
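    The Freund-type computation referred to above can be sketched directly: under negative exponential (CARA) utility and normally distributed gross margins, the certainty equivalent is mu - (lambda/2) * sigma^2, so the risk premium, proportional risk premium and relative risk aversion follow immediately. All numbers below are invented to be of roughly the same order as those reported; they are not the study's data:

      import numpy as np

      def risk_measures(gross_margins, abs_risk_aversion, wealth):
          """CARA / normal-revenue risk measures (Freund-type result):
          certainty equivalent = mu - (lambda/2) * sigma^2, so the risk premium is
          (lambda/2) * sigma^2 and the relative risk aversion is Rr = lambda * wealth."""
          mu = np.mean(gross_margins)
          var = np.var(gross_margins, ddof=1)
          premium = 0.5 * abs_risk_aversion * var
          return {"expected GM": mu,
                  "risk premium": premium,
                  "proportional risk premium": premium / mu,
                  "relative risk aversion": abs_risk_aversion * wealth}

      rng = np.random.default_rng(7)
      gm = rng.normal(loc=2.0e6, scale=1.4e5, size=200)    # simulated gross margins (IRR)
      print(risk_measures(gm, abs_risk_aversion=3e-5, wealth=1.6e5))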

  4. A nonparametric approach to the estimation of diffusion processes, with an application to a short-term interest rate model

    NARCIS (Netherlands)

    Jiang, GJ; Knight, JL

    1997-01-01

    In this paper, we propose a nonparametric identification and estimation procedure for an Ito diffusion process based on discrete sampling observations. The nonparametric kernel estimator for the diffusion function developed in this paper deals with general Ito diffusion processes and avoids any

  5. A nonparametric approach to the estimation of diffusion processes, with an application to a short-term interest rate model

    NARCIS (Netherlands)

    Jiang, GJ; Knight, JL

    1997-01-01

    In this paper, we propose a nonparametric identification and estimation procedure for an Ito diffusion process based on discrete sampling observations. The nonparametric kernel estimator for the diffusion function developed in this paper deals with general Ito diffusion processes and avoids any func

  6. Compression Stress Effect on Dislocations Movement and Crack propagation in Cubic Crystal

    OpenAIRE

    Suprijadi; Ely, Aprilia; Meiqorry, Yusfi

    2011-01-01

    Material fracture is a serious problem in daily life, and it is connected with the mechanical properties themselves. The mechanical properties are believed to depend on dislocation movement and crack propagation in the crystal. Information about this is very important for characterizing the material. In the FCC crystal structure the competition between crack propagation and dislocation wake is very interesting: in a ductile material like copper (Cu) dislocation can be seen at room temperature, but in a brittle ...

  7. Components of the Belief Gap

    Directory of Open Access Journals (Sweden)

    Cecilie Gaziano

    2014-01-01

    Full Text Available Knowledge gap research focuses on education as an indicator of socioeconomic status (SES). Belief gap research centers on ideology as potentially more powerful than education in comparing sociopolitical groups with scientifically established knowledge and groups with opposing beliefs accepted on faith. This study examined the relationship between education and ideology to understand belief gaps better. The study used 2008 American National Election Study (ANES) data to compare conservatives, moderates, and liberals by education on religiosity, child rearing values, opinionation, need for cognition, orientation toward politics, and mass media access and use. Although liberals tended to be more educated than conservatives overall, better-educated conservatives had the highest household incomes and were a much larger group. No known knowledge gap studies have reported results on one group characterized by high education and an opposing group distinguished by a different indicator of SES. Reformulations of the belief gap hypothesis are offered.

  8. Traditional beliefs and electromagnetic fields

    Directory of Open Access Journals (Sweden)

    Colin A. Ross

    2011-09-01

    Full Text Available The author proposes that a wide range of traditional beliefs and practices may provide clues to real electromagnetic field interactions in the biosphere. For instance, evil eye beliefs may be a cultural elaboration of the sense of being stared at, which in turn may have a basis in real electromagnetic emissions through the eye. Data to support this hypothesis are presented. Other traditional beliefs such as remote sensing of game and the importance of connection to the Earth Mother may also contain a kernel of truth. A series of testable scientific hypotheses concerning traditional beliefs and electromagnetic fields is presented. At this stage, the theory does not have sufficient evidence to be accepted as proven; its purpose is to stimulate thought and research

  9. Paranormal belief and attributional style.

    Science.gov (United States)

    Dudley, R T; Whisnand, E A

    2000-06-01

    52 college students completed Tobacyk's 1988 Revised Paranormal Belief Scale and Peterson, Semmel, von Baeyer, Abramson, Metalsky, and Seligman's 1982 Attributional Style Questionnaire. Analysis showed significantly higher depressive attributional styles among high scorers on paranormal phenomena than low scorers.

  10. Nonparametric estimation of population density for line transect sampling using FOURIER series

    Science.gov (United States)

    Crain, B.R.; Burnham, K.P.; Anderson, D.R.; Lake, J.L.

    1979-01-01

    A nonparametric, robust density estimation method is explored for the analysis of right-angle distances from a transect line to the objects sighted. The method is based on the FOURIER series expansion of a probability density function over an interval. With only mild assumptions, a general population density estimator of wide applicability is obtained.
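
    As a rough illustration of the kind of estimator described above, the sketch below implements a cosine-series density estimate of perpendicular sighting distances; the truncation width, number of terms, and simulated half-normal data are illustrative assumptions, not values from the original study.

```python
import numpy as np

def fourier_pdf(distances, W, m=4):
    """Fourier (cosine) series estimate of the perpendicular-distance pdf on [0, W].

    distances : right-angle distances from the transect line to sighted objects
    W         : truncation width (largest distance retained)
    m         : number of cosine terms kept in the expansion
    """
    x = np.asarray(distances, dtype=float)
    n = x.size
    k = np.arange(1, m + 1)
    # a_k = (2 / (n W)) * sum_i cos(k * pi * x_i / W)
    a = (2.0 / (n * W)) * np.cos(np.outer(k, x) * np.pi / W).sum(axis=1)

    def f(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        return 1.0 / W + (a[:, None] * np.cos(np.outer(k, t) * np.pi / W)).sum(axis=0)

    return f

# Illustrative use with simulated half-normal sighting distances (not real survey data)
rng = np.random.default_rng(0)
d = np.abs(rng.normal(0.0, 20.0, size=200))
d = d[d <= 60.0]                      # truncate at W = 60
f_hat = fourier_pdf(d, W=60.0, m=4)
print(f_hat(0.0))                     # f(0) feeds the usual line-transect density formula
```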

  11. A non-parametric peak finder algorithm and its application in searches for new physics

    CERN Document Server

    Chekanov, S

    2011-01-01

    We have developed an algorithm for non-parametric fitting and extraction of statistically significant peaks in the presence of statistical and systematic uncertainties. Applications of this algorithm for analysis of high-energy collision data are discussed. In particular, we illustrate how to use this algorithm in general searches for new physics in invariant-mass spectra using pp Monte Carlo simulations.

  12. Nonparametric estimation of the stationary M/G/1 workload distribution function

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted

    2005-01-01

    In this paper it is demonstrated how a nonparametric estimator of the stationary workload distribution function of the M/G/1-queue can be obtained by systematic sampling the workload process. Weak convergence results and bootstrap methods for empirical distribution functions for stationary associ...

  13. Testing a parametric function against a nonparametric alternative in IV and GMM settings

    DEFF Research Database (Denmark)

    Gørgens, Tue; Wurtz, Allan

    This paper develops a specification test for functional form for models identified by moment restrictions, including IV and GMM settings. The general framework is one where the moment restrictions are specified as functions of data, a finite-dimensional parameter vector, and a nonparametric real...

  14. Non-parametric Bayesian graph models reveal community structure in resting state fMRI

    DEFF Research Database (Denmark)

    Andersen, Kasper Winther; Madsen, Kristoffer H.; Siebner, Hartwig Roman

    2014-01-01

    Modeling of resting state functional magnetic resonance imaging (rs-fMRI) data using network models is of increasing interest. It is often desirable to group nodes into clusters to interpret the communication patterns between nodes. In this study we consider three different nonparametric Bayesian...

  15. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non--parametric estimation of these processes...

  16. Non-parametric system identification from non-linear stochastic response

    DEFF Research Database (Denmark)

    Rüdinger, Finn; Krenk, Steen

    2001-01-01

    An estimation method is proposed for identification of non-linear stiffness and damping of single-degree-of-freedom systems under stationary white noise excitation. Non-parametric estimates of the stiffness and damping along with an estimate of the white noise intensity are obtained by suitable p...

  17. The Probability of Exceedance as a Nonparametric Person-Fit Statistic for Tests of Moderate Length

    NARCIS (Netherlands)

    Tendeiro, Jorge N.; Meijer, Rob R.

    2013-01-01

    To classify an item score pattern as not fitting a nonparametric item response theory (NIRT) model, the probability of exceedance (PE) of an observed response vector x can be determined as the sum of the probabilities of all response vectors that are, at most, as likely as x, conditional on the test

  18. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbine. The new information, coming from external and condition monitoring can be used to direct updating of the stochastic variables through a non-parametric Bayesian u...

  19. An Investigation into the Dimensionality of TOEFL Using Conditional Covariance-Based Nonparametric Approach

    Science.gov (United States)

    Jang, Eunice Eunhee; Roussos, Louis

    2007-01-01

    This article reports two studies to illustrate methodologies for conducting a conditional covariance-based nonparametric dimensionality assessment using data from two forms of the Test of English as a Foreign Language (TOEFL). Study 1 illustrates how to assess overall dimensionality of the TOEFL including all three subtests. Study 2 is aimed at…

  20. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei

    2011-07-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work was originally motivated by a unique testing problem in genetic epidemiology (Chatterjee, et al., 2006) that involved a typical generalized linear model but with an additional term reminiscent of the Tukey one-degree-of-freedom formulation, and their interest was in testing for main effects of the genetic variables, while gaining statistical power by allowing for a possible interaction between genes and the environment. Later work (Maity, et al., 2009) involved the possibility of modeling the environmental variable nonparametrically, but they focused on whether there was a parametric main effect for the genetic variables. In this paper, we consider the complementary problem, where the interest is in testing for the main effect of the nonparametrically modeled environmental variable. We derive a generalized likelihood ratio test for this hypothesis, show how to implement it, and provide evidence that our method can improve statistical power when compared to standard partially linear models with main effects only. We use the method for the primary purpose of analyzing data from a case-control study of colorectal adenoma.

  1. Measuring the Influence of Networks on Transaction Costs Using a Nonparametric Regression Technique

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Henning, Christian H.C.A.

    We empirically analyse the effect of networks on productivity using a cross-validated local linear non-parametric regression technique and a data set of 384 farms in Poland. Our empirical study generally supports our hypothesis that networks affect productivity. Large and dense trading networks...

  2. Comparison of reliability techniques of parametric and non-parametric method

    Directory of Open Access Journals (Sweden)

    C. Kalaiselvan

    2016-06-01

    Full Text Available Reliability of a product or system is the probability that the product performs its intended function adequately for the stated period of time under stated operating conditions; it is a function of time. The widely used nano ceramic capacitors C0G and X7R are used in this reliability study to generate time-to-failure (TTF) data. The time-to-failure data are obtained by Accelerated Life Testing (ALT) and Highly Accelerated Life Testing (HALT); the tests are conducted at high stress levels to generate more failures within a short interval of time. The accelerated-condition results are converted to actual operating conditions by both a parametric and a non-parametric reliability method. In this paper, a comparative study of the parametric and non-parametric methods is carried out on the failure data: the Weibull distribution is used for the parametric method, while the Kaplan–Meier and Simple Actuarial methods are used for the non-parametric method. The mean time to failure (MTTF) identified under accelerated conditions is the same for the parametric and non-parametric methods, with some relative deviation between them.
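
    For readers unfamiliar with the non-parametric side of such a comparison, the following minimal sketch computes the Kaplan–Meier (product-limit) reliability estimate from time-to-failure data; the TTF values and censoring flags are invented for illustration only.

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier (product-limit) estimate of the reliability/survival function.

    times    : time to failure or to censoring for each unit
    observed : 1 if the failure was observed, 0 if the unit was censored
    """
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=int)
    order = np.argsort(times)
    times, observed = times[order], observed[order]

    event_times, survival = [], []
    s = 1.0
    n_at_risk = len(times)
    for t in np.unique(times):
        at_t = times == t
        d = observed[at_t].sum()          # failures at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk      # product-limit step
            event_times.append(t)
            survival.append(s)
        n_at_risk -= at_t.sum()           # failures and censorings leave the risk set
    return np.array(event_times), np.array(survival)

# Invented time-to-failure data (hours) with two right-censored units
ttf = [120, 180, 180, 240, 300, 300, 360, 420]
obs = [1,   1,   1,   1,   0,   1,   1,   0]
t, s = kaplan_meier(ttf, obs)
print(dict(zip(t, np.round(s, 3))))
```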

  3. Non-parametric Tuning of PID Controllers A Modified Relay-Feedback-Test Approach

    CERN Document Server

    Boiko, Igor

    2013-01-01

    The relay feedback test (RFT) has become a popular and efficient tool used in process identification and automatic controller tuning. Non-parametric Tuning of PID Controllers couples new modifications of the classical RFT with application-specific optimal tuning rules to form a non-parametric method of test-and-tuning. Test and tuning are coordinated through a set of common parameters so that a PID controller can obtain the desired gain or phase margins in a system exactly, even with unknown process dynamics. The concept of process-specific optimal tuning rules in the nonparametric setup, with corresponding tuning rules for flow, level, pressure, and temperature control loops, is presented in the text. Common problems of tuning accuracy based on parametric and non-parametric approaches are addressed. In addition, the text treats the parametric approach to tuning based on the modified RFT approach and the exact model of oscillations in the system under test using the locus of a perturbed relay system (LPRS) meth...

  4. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean--reverting process) has been used to model non-speculative price processes. We discuss non--parametric estimation of these processes...

  5. A non-parametric method for correction of global radiation observations

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt;

    2013-01-01

    in the observations are corrected. These are errors such as: tilt in the leveling of the sensor, shadowing from surrounding objects, clipping and saturation in the signal processing, and errors from dirt and wear. The method is based on a statistical non-parametric clear-sky model which is applied to both...

  6. Nonparametric estimation in an "illness-death" model when all transition times are interval censored

    DEFF Research Database (Denmark)

    Frydman, Halina; Gerds, Thomas; Grøn, Randi

    2013-01-01

    We develop nonparametric maximum likelihood estimation for the parameters of an irreversible Markov chain on states {0,1,2} from the observations with interval censored times of 0 → 1, 0 → 2 and 1 → 2 transitions. The distinguishing aspect of the data is that, in addition to all transition times ...

  7. A Comparison of Shewhart Control Charts based on Normality, Nonparametrics, and Extreme-Value Theory

    NARCIS (Netherlands)

    Ion, R.A.; Does, R.J.M.M.; Klaassen, C.A.J.

    2000-01-01

    Several control charts for individual observations are compared. The traditional ones are the well-known Shewhart control charts with estimators for the spread based on the sample standard deviation and the average of the moving ranges. The alternatives are nonparametric control charts, based on emp

  8. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the

  9. An Assessment of the Nonparametric Approach for Evaluating the Fit of Item Response Models

    Science.gov (United States)

    Liang, Tie; Wells, Craig S.; Hambleton, Ronald K.

    2014-01-01

    As item response theory has been more widely applied, investigating the fit of a parametric model becomes an important part of the measurement process. There is a lack of promising solutions to the detection of model misfit in IRT. Douglas and Cohen introduced a general nonparametric approach, RISE (Root Integrated Squared Error), for detecting…

  10. Performances and Spending Efficiency in Higher Education: A European Comparison through Non-Parametric Approaches

    Science.gov (United States)

    Agasisti, Tommaso

    2011-01-01

    The objective of this paper is an efficiency analysis concerning higher education systems in European countries. Data have been extracted from OECD data-sets (Education at a Glance, several years), using a non-parametric technique--data envelopment analysis--to calculate efficiency scores. This paper represents the first attempt to conduct such an…

  11. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models.

    Science.gov (United States)

    Fan, Jianqing; Feng, Yang; Song, Rui

    2011-06-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening. Several closely related variable screening procedures are proposed. Under general nonparametric models, it is shown that under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, a data-driven thresholding and an iterative nonparametric independence screening (INIS) are also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data analysis demonstrate that the proposed procedure works well with moderate sample size and large dimension and performs better than competing methods.

  12. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Varying Coefficient Models.

    Science.gov (United States)

    Fan, Jianqing; Ma, Yunbei; Dai, Wei

    2014-01-01

    The varying-coefficient model is an important class of nonparametric statistical model that allows us to examine how the effects of covariates vary with exposure variables. When the number of covariates is large, the issue of variable selection arises. In this paper, we propose and investigate marginal nonparametric screening methods to screen variables in sparse ultra-high dimensional varying-coefficient models. The proposed nonparametric independence screening (NIS) selects variables by ranking a measure of the nonparametric marginal contributions of each covariate given the exposure variable. The sure independent screening property is established under some mild technical conditions when the dimensionality is of nonpolynomial order, and the dimensionality reduction of NIS is quantified. To enhance the practical utility and finite sample performance, two data-driven iterative NIS methods are proposed for selecting thresholding parameters and variables: conditional permutation and greedy methods, resulting in Conditional-INIS and Greedy-INIS. The effectiveness and flexibility of the proposed methods are further illustrated by simulation studies and real data applications.

  13. Low default credit scoring using two-class non-parametric kernel density estimation

    CSIR Research Space (South Africa)

    Rademeyer, E

    2016-12-01

    Full Text Available This paper investigates the performance of two-class classification credit scoring data sets with low default ratios. The standard two-class parametric Gaussian and non-parametric Parzen classifiers are extended, using Bayes’ rule, to include either...
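
    A minimal sketch of the non-parametric ingredient, a two-class Parzen (kernel density) classifier combined through Bayes' rule, is given below; the bandwidth, prior default rate, and synthetic data are assumptions for illustration, and the paper's extensions are not reproduced.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

def parzen_two_class(X_good, X_bad, bandwidth=0.5, prior_bad=0.05):
    """Two-class Parzen (Gaussian kernel density) classifier combined via Bayes' rule."""
    kde_good = KernelDensity(bandwidth=bandwidth).fit(X_good)
    kde_bad = KernelDensity(bandwidth=bandwidth).fit(X_bad)

    def posterior_bad(X_new):
        p_good = np.exp(kde_good.score_samples(X_new))   # p(x | good)
        p_bad = np.exp(kde_bad.score_samples(X_new))     # p(x | bad)
        num = prior_bad * p_bad
        return num / (num + (1.0 - prior_bad) * p_good)  # Bayes' rule

    return posterior_bad

# Toy data with a deliberately low default ratio (5%); two numeric risk features
rng = np.random.default_rng(1)
good = rng.normal([0.0, 0.0], 1.0, size=(950, 2))
bad = rng.normal([2.0, 2.0], 1.0, size=(50, 2))
score = parzen_two_class(good, bad, bandwidth=0.7, prior_bad=0.05)
print(score(np.array([[2.0, 2.0], [0.0, 0.0]])))
```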

  14. Measuring the influence of networks on transaction costs using a non-parametric regression technique

    DEFF Research Database (Denmark)

    Henningsen, Géraldine; Henningsen, Arne; Henning, Christian H.C.A.

    We empirically analyse the effect of networks on productivity using a cross-validated local linear non-parametric regression technique and a data set of 384 farms in Poland. Our empirical study generally supports our hypothesis that networks affect productivity. Large and dense trading networks...

  15. Do Former College Athletes Earn More at Work? A Nonparametric Assessment

    Science.gov (United States)

    Henderson, Daniel J.; Olbrecht, Alexandre; Polachek, Solomon W.

    2006-01-01

    This paper investigates how students' collegiate athletic participation affects their subsequent labor market success. By using newly developed techniques in nonparametric regression, it shows that on average former college athletes earn a wage premium. However, the premium is not uniform, but skewed so that more than half the athletes actually…

  16. Nonparametric Tests of Collectively Rational Consumption Behavior : An Integer Programming Procedure

    NARCIS (Netherlands)

    Cherchye, L.J.H.; de Rock, B.; Sabbe, J.; Vermeulen, F.M.P.

    2008-01-01

    We present an IP-based nonparametric (revealed preference) testing procedure for rational consumption behavior in terms of general collective models, which include consumption externalities and public consumption. An empirical application to data drawn from the Russia Longitudinal Monitoring

  17. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean--reverting process) has been used to model non-speculative price processes. We discuss non--parametric estimation of these processes...

  18. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non--parametric estimation of these processes...

  19. Testing for Constant Nonparametric Effects in General Semiparametric Regression Models with Interactions.

    Science.gov (United States)

    Wei, Jiawei; Carroll, Raymond J; Maity, Arnab

    2011-07-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work was originally motivated by a unique testing problem in genetic epidemiology (Chatterjee, et al., 2006) that involved a typical generalized linear model but with an additional term reminiscent of the Tukey one-degree-of-freedom formulation, and their interest was in testing for main effects of the genetic variables, while gaining statistical power by allowing for a possible interaction between genes and the environment. Later work (Maity, et al., 2009) involved the possibility of modeling the environmental variable nonparametrically, but they focused on whether there was a parametric main effect for the genetic variables. In this paper, we consider the complementary problem, where the interest is in testing for the main effect of the nonparametrically modeled environmental variable. We derive a generalized likelihood ratio test for this hypothesis, show how to implement it, and provide evidence that our method can improve statistical power when compared to standard partially linear models with main effects only. We use the method for the primary purpose of analyzing data from a case-control study of colorectal adenoma.

  20. Irrational beliefs and marital conflict.

    Science.gov (United States)

    Möller, A T; de Beer, Z C

    1998-02-01

    To test the hypothesis that the major irrational evaluative beliefs postulated by Rational Emotive Behavior Therapy are related to marital conflict, 15 married couples participated in a thought-listing procedure. During this procedure, three idiosyncratic scenes portraying marital conflict and three control scenes free of conflict were identified for and presented to each member of the dyad. Analysis indicated that the conflict-portraying scenes were associated with significantly more irrational evaluative beliefs and significantly fewer rational cognitions than the control scenes.

  1. Social vigilantism: measuring individual differences in belief superiority and resistance to persuasion.

    Science.gov (United States)

    Saucier, Donald A; Webster, Russell J

    2010-01-01

    Social vigilantism (SV) is an enduring individual difference that assesses the tendency of individuals to impress and propagate their "superior" beliefs onto others to correct others' more "ignorant" opinions. After establishing a reliable measure of SV, three studies showed that SV was associated with greater expressions of belief superiority (whether reacting to others holding dissimilar or similar beliefs) and greater resistance to persuasion (via increased rates of counterarguing and greater attitude stability after persuasion appeals) even after controlling for relevant individual differences (narcissism, dogmatism, psychological reactance, and need for cognition), as well as attitude importance and extremity. Thus, SV predicts expressions of belief superiority and resistance to persuasion above and beyond characteristics of the attitude and individual difference variables previously studied in the attitude literature. SV is a meaningful construct in increasing the understanding of persuasion, attitude resistance, and attitude dissemination that can be applied in a variety of psychological domains.

  2. Hierarchical Affinity Propagation

    CERN Document Server

    Givoni, Inmar; Frey, Brendan J

    2012-01-01

    Affinity propagation is an exemplar-based clustering algorithm that finds a set of data-points that best exemplify the data, and associates each datapoint with one exemplar. We extend affinity propagation in a principled way to solve the hierarchical clustering problem, which arises in a variety of domains including biology, sensor networks and decision making in operational research. We derive an inference algorithm that operates by propagating information up and down the hierarchy, and is efficient despite the high-order potentials required for the graphical model formulation. We demonstrate that our method outperforms greedy techniques that cluster one layer at a time. We show that on an artificial dataset designed to mimic the HIV-strain mutation dynamics, our method outperforms related methods. For real HIV sequences, where the ground truth is not available, we show our method achieves better results, in terms of the underlying objective function, and show the results correspond meaningfully to geographi...
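
    The sketch below runs the flat (non-hierarchical) affinity propagation algorithm that the paper builds on, using scikit-learn; the synthetic data and preference value are illustrative assumptions, and the hierarchical extension itself is not implemented here.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

# Three synthetic blobs; the preference value controls how many exemplars emerge
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(c, 0.3, size=(30, 2)) for c in ([0, 0], [3, 3], [0, 4])])

ap = AffinityPropagation(preference=-10, random_state=0).fit(X)
print("exemplar indices:", ap.cluster_centers_indices_)
print("number of clusters:", len(np.unique(ap.labels_)))
```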

  3. Propagation of waves

    CERN Document Server

    David, P

    2013-01-01

    Propagation of Waves focuses on the wave propagation around the earth, which is influenced by its curvature, surface irregularities, and by passage through atmospheric layers that may be refracting, absorbing, or ionized. This book begins by outlining the behavior of waves in the various media and at their interfaces, which simplifies the basic phenomena, such as absorption, refraction, reflection, and interference. Applications to the case of the terrestrial sphere are also discussed as a natural generalization. Following the deliberation on the diffraction of the "ground" wave around the earth ...

  4. The Propagation and Evolution of Design Constraints

    DEFF Research Database (Denmark)

    Nair, V. V.; Howard, Thomas J.; Culley, S. J.

    2011-01-01

    The research in this paper reports a case study of a design project which had two scopes, for a variant and an adaptive design type. This is quite common practice, where designers commonly propose short- and long-term solutions to design problems. In this particular case, preference was not voiced ... for an adaptive solution until the 3rd stage-gate of the project. The researcher acted as project manager of the design project using a participatory action research methodology. During the design work the researcher recorded and classified the various design constraints introduced and propagated throughout ... the design project. Where it was hypothesised that the variant design type would contain all constraints of the adaptive design plus additional ones, the study showed this not to be the case. The study also revealed that, contrary to popular belief, changes at higher systems levels may exhibit less creativity...

  5. Religious beliefs and entrepreneurship among Dutch protestants

    OpenAIRE

    Rietveld, Niels; van Burg, Elco

    2013-01-01

    Religious beliefs affect the economic behavior of individuals. The aim of this study is to investigate the relation between religious beliefs and entrepreneurship. Empirical evidence that entrepreneurship rates differ among religions suggests that religious beliefs influence the pursuit of entrepreneurship. However, how and which specific religious beliefs play a role in this relationship remains unknown. Therefore, we study the relation between two key religious beliefs and entre...

  6. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

    Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors...

  7. Variable sigma Gaussian processes: An expectation propagation perspective

    CERN Document Server

    Yuan,; Abdel-Gawad, Ahmed H; Minka, Thomas P

    2009-01-01

    Gaussian processes (GPs) provide a probabilistic nonparametric representation of functions in regression, classification, and other problems. Unfortunately, exact learning with GPs is intractable for large datasets. A variety of approximate GP methods have been proposed that essentially map the large dataset into a small set of basis points. The most advanced of these, the variable-sigma GP (VSGP) (Walder et al., 2008), allows each basis point to have its own length scale. However, VSGP was only derived for regression. We describe how VSGP can be applied to classification and other problems, by deriving it as an expectation propagation algorithm. In this view, sparse GP approximations correspond to a KL-projection of the true posterior onto a compact exponential family of GPs. VSGP constitutes one such family, and we show how to enlarge this family to get additional accuracy. In particular, we show that endowing each basis point with its own full covariance matrix provides a significant increase in approximat...

  8. DROMO propagator revisited

    Science.gov (United States)

    Urrutxua, Hodei; Sanjurjo-Rivo, Manuel; Peláez, Jesús

    2016-01-01

    In the year 2000 an in-house orbital propagator called DROMO (Peláez et al. in Celest Mech Dyn Astron 97:131-150, 2007. doi: 10.1007/s10569-006-9056-3) was developed by the Space Dynamics Group of the Technical University of Madrid, based in a set of redundant variables including Euler-Rodrigues parameters. An original deduction of the DROMO propagator is carried out, underlining its close relation with the ideal frame concept introduced by Hansen (Abh der Math-Phys Cl der Kon Sachs Ges der Wissensch 5:41-218, 1857). Based on the very same concept, Deprit (J Res Natl Bur Stand Sect B Math Sci 79B(1-2):1-15, 1975) proposed a formulation for orbit propagation. In this paper, similarities and differences with the theory carried out by Deprit are analyzed. Simultaneously, some improvements are introduced in the formulation, that lead to a more synthetic and better performing propagator. Also, the long-term effect of the oblateness of the primary is studied in terms of DROMO variables, and new numerical results are presented to evaluate the performance of the method.

  9. Shallow Water Propagation

    Science.gov (United States)

    2014-09-30

    These modes decay much more slowly than leaky modes as they propagate. The initial focus is on modal phase and group velocity formulas, obtained from ... acoustic quantities such as transmission loss and scintillation index. Both physical understanding and reasonable estimates of

  10. Scaling Irrational Beliefs in the General Attitude and Belief Scale

    Directory of Open Access Journals (Sweden)

    Lindsay R. Owings

    2013-04-01

    Full Text Available Accurate measurement of key constructs is essential to the continued development of Rational-Emotive Behavior Therapy (REBT). The General Attitude and Belief Scale (GABS), a contemporary inventory of rational and irrational beliefs based on current REBT theory, is one of the most valid and widely used instruments available, and recent research has continued to improve its psychometric standing. In this study of 544 students, item response theory (IRT) methods were used (a) to identify the most informative item in each irrational subscale of the GABS, (b) to determine the level of irrationality represented by each of those items, and (c) to suggest a condensed form of the GABS for further study with clinical populations. Administering only the most psychometrically informative items to clients could result in economies of time and effort. Further research based on the scaling of items could clarify the specific patterns of irrational beliefs associated with particular clinical syndromes.

  11. Abductive inference and delusional belief.

    Science.gov (United States)

    Coltheart, Max; Menzies, Peter; Sutton, John

    2010-01-01

    Delusional beliefs have sometimes been considered as rational inferences from abnormal experiences. We explore this idea in more detail, making the following points. First, the abnormalities of cognition that initially prompt the entertaining of a delusional belief are not always conscious and since we prefer to restrict the term "experience" to consciousness we refer to "abnormal data" rather than "abnormal experience". Second, we argue that in relation to many delusions (we consider seven) one can clearly identify what the abnormal cognitive data are which prompted the delusion and what the neuropsychological impairment is which is responsible for the occurrence of these data; but one can equally clearly point to cases where this impairment is present but delusion is not. So the impairment is not sufficient for delusion to occur: a second cognitive impairment, one that affects the ability to evaluate beliefs, must also be present. Third (and this is the main thrust of our paper), we consider in detail what the nature of the inference is that leads from the abnormal data to the belief. This is not deductive inference and it is not inference by enumerative induction; it is abductive inference. We offer a Bayesian account of abductive inference and apply it to the explanation of delusional belief.

  12. The ecology of religious beliefs

    Science.gov (United States)

    Botero, Carlos A.; Gardner, Beth; Kirby, Kathryn R.; Bulbulia, Joseph; Gavin, Michael C.; Gray, Russell D.

    2014-01-01

    Although ecological forces are known to shape the expression of sociality across a broad range of biological taxa, their role in shaping human behavior is currently disputed. Both comparative and experimental evidence indicate that beliefs in moralizing high gods promote cooperation among humans, a behavioral attribute known to correlate with environmental harshness in nonhuman animals. Here we combine fine-grained bioclimatic data with the latest statistical tools from ecology and the social sciences to evaluate the potential effects of environmental forces, language history, and culture on the global distribution of belief in moralizing high gods (n = 583 societies). After simultaneously accounting for potential nonindependence among societies because of shared ancestry and cultural diffusion, we find that these beliefs are more prevalent among societies that inhabit poorer environments and are more prone to ecological duress. In addition, we find that these beliefs are more likely in politically complex societies that recognize rights to movable property. Overall, our multimodel inference approach predicts the global distribution of beliefs in moralizing high gods with an accuracy of 91%, and estimates the relative importance of different potential mechanisms by which this spatial pattern may have arisen. The emerging picture is neither one of pure cultural transmission nor of simple ecological determinism, but rather a complex mixture of social, cultural, and environmental influences. Our methods and findings provide a blueprint for how the increasing wealth of ecological, linguistic, and historical data can be leveraged to understand the forces that have shaped the behavior of our own species. PMID:25385605

  13. Propagating Instabilities in Solids

    Science.gov (United States)

    Kyriakides, Stelios

    1998-03-01

    Instability is one of the factors which limit the extent to which solids can be loaded or deformed and plays a pivotal role in the design of many structures. Such instabilities often result in localized deformation which precipitates catastrophic failure. Some materials have the capacity to recover their stiffness following a certain amount of localized deformation. This local recovery in stiffness arrests further local deformation and spreading of the instability to neighboring material becomes preferred. Under displacement controlled loading the propagation of the transition fronts can be achieved in a steady-state manner at a constant stress level known as the propagation stress. The stresses in the transition fronts joining the highly deformed zone to the intact material overcome the instability nucleation stresses and, as a result, the propagation stress is usually much lower than the stress required to nucleate the instability. The classical example of this class of material instabilities is Lüders bands which tend to affect mild steels and other metals. Recent work has demonstrated that propagating instabilities occur in several other materials. Experimental and analytical results from four examples will be used to illustrate this point: First the evolution of Lüders bands in mild steel strips will be revisited. The second example involves the evolution of stress induced phase transformations (austenite to martensite phases and the reverse) in a shape memory alloy under displacement controlled stretching. The third example is the crushing behavior of cellular materials such as honeycombs and foams made from metals and polymers. The fourth example involves the axial broadening/propagation of kink bands in aligned fiber/matrix composites under compression. The microstructure and, as a result, the micromechanisms governing the onset, localization, local arrest and propagation of instabilities in each of the four materials are vastly different. Despite this

  14. Belief Revision and Argumentation Theory

    Science.gov (United States)

    Falappa, Marcelo Alejandro; Kern-Isberner, Gabriele; Simari, Guillermo Ricardo

    Belief revision is the process of changing beliefs to adapt the epistemic state of an agent to a new piece of information. The logical formalization of belief revision is a topic of research in philosophy, logic, and in computer science, in areas such as databases or artificial intelligence. On the other hand, argumentation is concerned primarily with the evaluation of claims based on premises in order to reach conclusions. Both provide basic and substantial techniques for the art of reasoning, as it is performed by human beings in everyday life situations, which goes far beyond logical deduction. Reasoning, in this sense, makes it possible to deal successfully with problems in uncertain, dynamic environments and has been promoting the development of human societies.

  15. Vegetative propagation of jojoba

    Energy Technology Data Exchange (ETDEWEB)

    Low, C.B.; Hackett, W.P.

    1981-03-01

    Development of jojoba as an economically viable crop requires improved methods of propagation and culture. Rooting experiments were performed on cutting material collected from wild jojoba plants. A striking seasonal fluctuation in rooting potential was found. Jojoba plants can be successfully propagated from stem cuttings made during spring, summer, and, to some extent, fall. Variability among jojoba plants may also play a role in rooting potential, although it is not as important as season. In general, the use of auxin (4,000 ppm indolebutyric acid) on jojoba cuttings during periods of high rooting potential promotes adventitious root formation, but during periods of low rooting potential it has no effect or is even slightly inhibitory. In the greenhouse, cutting-grown plants apparently reproductively matured sooner than those grown from seed. If this observation holds true for plants transplanted into the field, earlier fruit production by cutting-grown plants would mean earlier return of initial planting and maintenance costs.

  16. Propagation of Tau aggregates.

    Science.gov (United States)

    Goedert, Michel; Spillantini, Maria Grazia

    2017-05-30

    Since 2009, evidence has accumulated to suggest that Tau aggregates form first in a small number of brain cells, from where they propagate to other regions, resulting in neurodegeneration and disease. Propagation of Tau aggregates is often called prion-like, which refers to the capacity of an assembled protein to induce the same abnormal conformation in a protein of the same kind, initiating a self-amplifying cascade. In addition, prion-like encompasses the release of protein aggregates from brain cells and their uptake by neighbouring cells. In mice, the intracerebral injection of Tau inclusions induced the ordered assembly of monomeric Tau, followed by its spreading to distant brain regions. Short fibrils constituted the major species of seed-competent Tau. The existence of several human Tauopathies with distinct fibril morphologies has led to the suggestion that different molecular conformers (or strains) of aggregated Tau exist.

  17. Stochastic wave propagation

    CERN Document Server

    Sobczyk, K

    1985-01-01

    This is a concise, unified exposition of the existing methods of analysis of linear stochastic waves with particular reference to the most recent results. Both scalar and vector waves are considered. Principal attention is concentrated on wave propagation in stochastic media and wave scattering at stochastic surfaces. However, discussion extends also to various mathematical aspects of stochastic wave equations and problems of modelling stochastic media.

  18. Bidirectional beam propagation method

    Science.gov (United States)

    Kaczmarski, P.; Lagasse, P. E.

    1988-05-01

    A bidirectional extension of the beam propagation method (BPM) to optical waveguides with a longitudinal discontinuity is presented. The algorithm is verified by computing the reflection of the TE(0) mode from a semiconductor laser facet. The bidirectional BPM is applicable to other configurations such as totally reflecting waveguide mirrors, an abrupt transition in a waveguide, or a waveguide with many discontinuities generating multiple reflections. The method can also be adapted to TM polarization.

  19. Gauge engineering and propagators

    Directory of Open Access Journals (Sweden)

    Maas Axel

    2017-01-01

    The dependence of the propagators on the choice of these complete gauge-fixings will then be investigated using lattice gauge theory for Yang-Mills theory. It is found that the implications for the infrared, and to some extent mid-momentum behavior, can be substantial. In going beyond the Yang-Mills case it turns out that the influence of matter can generally not be neglected. This will be briefly discussed for various types of matter.

  20. Spline Nonparametric Regression Analysis of Stress-Strain Curve of Confined Concrete

    Directory of Open Access Journals (Sweden)

    Tavio Tavio

    2008-01-01

    Full Text Available Due to enormous uncertainties in confinement models associated with the maximum compressive strength and ductility of concrete confined by rectilinear ties, the implementation of spline nonparametric regression analysis is proposed herein as an alternative approach. The statistical evaluation is carried out based on 128 large-scale column specimens of either normal- or high-strength concrete tested under uniaxial compression. The main advantage of this kind of analysis is that it can be applied when the trend of the relation between predictor and response variables is not obvious. The error in the analysis can, therefore, be minimized so that it does not depend on the assumption of a particular shape of the curve. This provides higher flexibility in the application. The results of the statistical analysis indicate that the stress-strain curves of confined concrete obtained from the spline nonparametric regression analysis prove to be in good agreement with the experimental curves available in the literature.
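
    As an indicative sketch of spline nonparametric regression applied to a stress-strain curve (the data below are hypothetical, not the 128 column specimens used in the paper), a smoothing spline can be fitted and queried for the peak stress:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical confined-concrete data: axial strain vs. compressive stress (MPa)
strain = np.array([0.000, 0.001, 0.002, 0.003, 0.004, 0.006, 0.008, 0.010, 0.012])
stress = np.array([0.0,  12.0,  22.0,  30.0,  34.0,  33.0,  30.0,  26.0,  22.0])

# Smoothing spline: the penalty s controls smoothness, so no parametric
# shape for the stress-strain curve has to be assumed in advance.
spline = UnivariateSpline(strain, stress, k=3, s=5.0)

grid = np.linspace(strain.min(), strain.max(), 200)
fitted = spline(grid)
print("estimated peak stress:", round(float(fitted.max()), 1), "MPa")
print("strain at peak stress:", round(float(grid[fitted.argmax()]), 4))
```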

  1. Non-parametric Bayesian human motion recognition using a single MEMS tri-axial accelerometer.

    Science.gov (United States)

    Ahmed, M Ejaz; Song, Ju Bin

    2012-09-27

    In this paper, we propose a non-parametric clustering method to recognize the number of human motions using features which are obtained from a single microelectromechanical system (MEMS) accelerometer. Since the number of human motions under consideration is not known a priori and because of the unsupervised nature of the proposed technique, there is no need to collect training data for the human motions. The infinite Gaussian mixture model (IGMM) and collapsed Gibbs sampler are adopted to cluster the human motions using extracted features. From the experimental results, we show that the unanticipated human motions are detected and recognized with significant accuracy, as compared with the parametric Fuzzy C-Mean (FCM) technique, the unsupervised K-means algorithm, and the non-parametric mean-shift method.
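
    A loose stand-in for this approach, assuming synthetic accelerometer features rather than real MEMS data, is a Dirichlet-process Gaussian mixture fitted by variational inference (scikit-learn's BayesianGaussianMixture); note that the paper itself uses an IGMM with a collapsed Gibbs sampler rather than this variational approximation.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic 3-axis accelerometer "features" for three unlabeled motions
rng = np.random.default_rng(3)
walking = rng.normal([1.0, 0.2, 9.8], 0.20, size=(200, 3))
running = rng.normal([3.0, 1.0, 9.8], 0.40, size=(200, 3))
sitting = rng.normal([0.1, 0.0, 9.8], 0.05, size=(200, 3))
features = np.vstack([walking, running, sitting])

# Dirichlet-process mixture: n_components is only an upper bound, the data
# decide how many motion clusters are actually used.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(features)

labels = dpgmm.predict(features)
print("motion clusters actually used:", np.unique(labels).size)
```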

  2. Non-Parametric Bayesian Human Motion Recognition Using a Single MEMS Tri-Axial Accelerometer

    Directory of Open Access Journals (Sweden)

    M. Ejaz Ahmed

    2012-09-01

    Full Text Available In this paper, we propose a non-parametric clustering method to recognize the number of human motions using features which are obtained from a single microelectromechanical system (MEMS) accelerometer. Since the number of human motions under consideration is not known a priori and because of the unsupervised nature of the proposed technique, there is no need to collect training data for the human motions. The infinite Gaussian mixture model (IGMM) and collapsed Gibbs sampler are adopted to cluster the human motions using extracted features. From the experimental results, we show that the unanticipated human motions are detected and recognized with significant accuracy, as compared with the parametric Fuzzy C-Mean (FCM) technique, the unsupervised K-means algorithm, and the non-parametric mean-shift method.

  3. Applications of non-parametric statistics and analysis of variance on sample variances

    Science.gov (United States)

    Myers, R. H.

    1981-01-01

    Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made to survey what can be used, to offer recommendations as to when each method would be applicable, and to compare the methods, when possible, with the usual normal-theory procedures available for the Gaussian analogue. It is important to point out the hypotheses being tested, the assumptions being made, and the limitations of the nonparametric procedures. The appropriateness of performing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects, and on the surface it would appear to be a reasonably sound procedure. However, the difficulties involved center on the normality problem and the basic homogeneous-variance assumption that is made in usual analysis-of-variance problems. These difficulties are discussed and guidelines are given for using the methods.

  4. Testing the Non-Parametric Conditional CAPM in the Brazilian Stock Market

    Directory of Open Access Journals (Sweden)

    Daniel Reed Bergmann

    2014-04-01

    Full Text Available This paper seeks to analyze whether the variations of returns and systematic risks of Brazilian portfolios can be explained by the nonparametric conditional Capital Asset Pricing Model (CAPM) of Wang (2002). There are four informational variables available to investors: (i) the Brazilian industrial production level; (ii) the broad money supply M4; (iii) inflation, represented by the Índice de Preços ao Consumidor Amplo (IPCA); and (iv) the real-dollar exchange rate, obtained from the PTAX dollar quotation. The study comprises the shares listed on the BOVESPA from January 2002 to December 2009. The test methodology developed by Wang (2002), and applied to the Mexican context by Castillo-Spíndola (2006), was used. The results indicate that the nonparametric conditional model is relevant in explaining the portfolios' returns of the sample considered for two of the four tested variables, M4 and the PTAX dollar, at the 5% level of significance.

  5. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression...... to avoid this problem. The main objective is to investigate the applicability of the nonparametric kernel regression method in applied production analysis. The focus of the empirical analyses included in this thesis is the agricultural sector in Poland. Data on Polish farms are used to investigate...... practically and politically relevant problems and to illustrate how nonparametric regression methods can be used in applied microeconomic production analysis both in panel data and cross-section data settings. The thesis consists of four papers. The first paper addresses problems of parametric...

  6. On Wasserstein Two-Sample Testing and Related Families of Nonparametric Tests

    Directory of Open Access Journals (Sweden)

    Aaditya Ramdas

    2017-01-01

    Full Text Available Nonparametric two-sample or homogeneity testing is a decision theoretic problem that involves identifying differences between two random variables without making parametric assumptions about their underlying distributions. The literature is old and rich, with a wide variety of statistics having been designed and analyzed, both for the unidimensional and the multivariate setting. In this short survey, we focus on test statistics that involve the Wasserstein distance. Using an entropic smoothing of the Wasserstein distance, we connect these to very different tests including multivariate methods involving energy statistics and kernel based maximum mean discrepancy and univariate methods like the Kolmogorov–Smirnov test, probability or quantile (PP/QQ) plots and receiver operating characteristic or ordinal dominance (ROC/ODC) curves. Some observations are implicit in the literature, while others seem to have not been noticed thus far. Given nonparametric two-sample testing's classical and continued importance, we aim to provide useful connections for theorists and practitioners familiar with one subset of methods but not others.
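
    A minimal sketch of one such test, a univariate two-sample permutation test that uses the 1-D Wasserstein distance as its statistic (the sample sizes, permutation count, and data are illustrative assumptions), is shown below.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_permutation_test(x, y, n_perm=2000, seed=0):
    """Two-sample homogeneity test: 1-D Wasserstein distance + permutation null."""
    rng = np.random.default_rng(seed)
    observed = wasserstein_distance(x, y)
    pooled = np.concatenate([x, y])
    n, count = len(x), 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if wasserstein_distance(pooled[:n], pooled[n:]) >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)

rng = np.random.default_rng(4)
a = rng.normal(0.0, 1.0, 300)
b = rng.normal(0.3, 1.0, 300)   # shifted sample
stat, p = wasserstein_permutation_test(a, b)
print(f"W1 = {stat:.3f}, permutation p-value = {p:.4f}")
```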

  7. Stahel-Donoho kernel estimation for fixed design nonparametric regression models

    Institute of Scientific and Technical Information of China (English)

    LIN; Lu

    2006-01-01

    This paper reports a robust kernel estimation for fixed design nonparametric regression models. A Stahel-Donoho kernel estimation is introduced, in which the weight functions depend on both the depths of the data and the distances between the design points and the estimation points. Based on a local approximation, a computational technique is given to approximate the incomputable depths of the errors. As a result the new estimator is computationally efficient. The proposed estimator attains a high breakdown point and has desirable asymptotic behaviors such as asymptotic normality and convergence in the mean squared error. Unlike the depth-weighted estimator for parametric regression models, this depth-weighted nonparametric estimator has a simple variance structure, so we can compare its efficiency with the original one. Some simulations show that the new method can smooth the regression estimation and achieve some desirable balances between robustness and efficiency.

  8. Bayesian Bandwidth Selection for a Nonparametric Regression Model with Mixed Types of Regressors

    Directory of Open Access Journals (Sweden)

    Xibin Zhang

    2016-04-01

    Full Text Available This paper develops a sampling algorithm for bandwidth estimation in a nonparametric regression model with continuous and discrete regressors under an unknown error density. The error density is approximated by the kernel density estimator of the unobserved errors, while the regression function is estimated using the Nadaraya-Watson estimator admitting continuous and discrete regressors. We derive an approximate likelihood and posterior for bandwidth parameters, followed by a sampling algorithm. Simulation results show that the proposed approach typically leads to better accuracy of the resulting estimates than cross-validation, particularly for smaller sample sizes. This bandwidth estimation approach is applied to a nonparametric regression model of the Australian All Ordinaries returns and to kernel density estimation of gross domestic product (GDP) growth rates among Organisation for Economic Co-operation and Development (OECD) and non-OECD countries.
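
    The regression building block mentioned above, the Nadaraya-Watson estimator, can be sketched in a few lines for the continuous-regressor case; the Gaussian kernel, fixed bandwidth, and synthetic data are assumptions for illustration (the paper's Bayesian bandwidth sampling and discrete-regressor handling are not reproduced).

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel:
    m(x) = sum_i K((x - x_i)/h) * y_i / sum_i K((x - x_i)/h)."""
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u ** 2)
    return (weights * y_train[None, :]).sum(axis=1) / weights.sum(axis=1)

# Synthetic data standing in for a returns series and a single continuous regressor
rng = np.random.default_rng(5)
x = rng.uniform(-2.0, 2.0, 400)
y = np.sin(x) + rng.normal(0.0, 0.3, 400)
grid = np.linspace(-2.0, 2.0, 9)
print(np.round(nadaraya_watson(x, y, grid, bandwidth=0.3), 3))
```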

  9. Functional-Coefficient Spatial Durbin Models with Nonparametric Spatial Weights: An Application to Economic Growth

    Directory of Open Access Journals (Sweden)

    Mustafa Koroglu

    2016-02-01

    Full Text Available This paper considers a functional-coefficient spatial Durbin model with nonparametric spatial weights. Applying the series approximation method, we estimate the unknown functional coefficients and spatial weighting functions via a nonparametric two-stage least squares (or 2SLS) estimation method. To further improve estimation accuracy, we also construct a second-step estimator of the unknown functional coefficients by a local linear regression approach. Some Monte Carlo simulation results are reported to assess the finite sample performance of our proposed estimators. We then apply the proposed model to re-examine national economic growth by augmenting the conventional Solow economic growth convergence model with unknown spatial interactive structures of the national economy, as well as country-specific Solow parameters, where the spatial weighting functions and Solow parameters are allowed to be a function of geographical distance and the countries' openness to trade, respectively.

  10. A Cooperative Bayesian Nonparametric Framework for Primary User Activity Monitoring in Cognitive Radio Network

    CERN Document Server

    Saad, Walid; Poor, H Vincent; Başar, Tamer; Song, Ju Bin

    2012-01-01

    This paper introduces a novel approach that enables a number of cognitive radio devices that are observing the availability pattern of a number of primary users (PUs), to cooperate and use Bayesian nonparametric techniques to estimate the distributions of the PUs' activity pattern, assumed to be completely unknown. In the proposed model, each cognitive node may have its own individual view on each PU's distribution, and, hence, seeks to find partners having a correlated perception. To address this problem, a coalitional game is formulated between the cognitive devices and an algorithm for cooperative coalition formation is proposed. It is shown that the proposed coalition formation algorithm allows the cognitive nodes that are experiencing a similar behavior from some PUs to self-organize into disjoint, independent coalitions. Inside each coalition, the cooperative cognitive nodes use a combination of Bayesian nonparametric models such as the Dirichlet process and statistical goodness of fit techniques ...

  11. Nonparametric discriminant model (非参数判别模型)

    Institute of Scientific and Technical Information of China (English)

    谢斌锋; 梁飞豹

    2011-01-01

    A new class of discriminant analysis methods is proposed. The main idea is to extend the nonparametric regression model to discriminant analysis, forming the corresponding nonparametric discriminant model. A comparison with traditional discriminant methods on an example shows that the nonparametric discriminant method has wider applicability and a higher correct rate of back-substitution.

  12. Non-Parametric Tests of Structure for High Angular Resolution Diffusion Imaging in Q-Space

    CERN Document Server

    Olhede, Sofia C

    2010-01-01

    High angular resolution diffusion imaging data is the observed characteristic function for the local diffusion of water molecules in tissue. This data is used to infer structural information in brain imaging. Non-parametric scalar measures are proposed to summarize such data, and to locally characterize spatial features of the diffusion probability density function (PDF), relying on the geometry of the characteristic function. Summary statistics are defined so that their distributions are, to first order, both independent of nuisance parameters and also analytically tractable. The dominant direction of the diffusion at a spatial location (voxel) is determined, and a new set of axes are introduced in Fourier space. Variation quantified in these axes determines the local spatial properties of the diffusion density. Non-parametric hypothesis tests for determining whether the diffusion is unimodal, isotropic or multi-modal are proposed. More subtle characteristics of white-matter microstructure, such as the degre...

  13. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression...... function. However, the a priori specification of a functional form involves the risk of choosing one that is not similar to the “true” but unknown relationship between the regressors and the dependent variable. This problem, known as parametric misspecification, can result in biased parameter estimates...... and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences...

  14. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J.

    2017-06-01

    We propose a Bayesian nonparametric mixture model, based on Markov Chain Monte Carlo methods, for the reconstruction and prediction of discretized stochastic dynamical systems from observed time series data. Our results can be used by researchers in physical modeling interested in a fast and accurate estimation of low dimensional stochastic models when the size of the observed time series is small and the noise process (perhaps) is non-Gaussian. The inference procedure is demonstrated specifically in the case of polynomial maps of arbitrary degree, when a Geometric Stick Breaking mixture process prior over the space of densities is applied to the additive errors. Compared with Bayesian nonparametric techniques based on Dirichlet process mixtures, our method is parsimonious, flexible and general. Simulations based on synthetic time series are presented.
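
    The paper's Geometric Stick Breaking mixture sampler is too involved for a short excerpt, but the reconstruction task it addresses can be illustrated with a simple stand-in: simulate a noisy quadratic (logistic-type) map and recover its coefficients by ordinary least squares. This is a parametric stand-in for the setting only, not the Bayesian nonparametric method itself.

```python
# Illustration of the reconstruction task only: simulate a noisy quadratic
# map x_{t+1} = a*x_t - b*x_t**2 + e_t and recover (a, b) by least squares.
# This is a parametric stand-in, NOT the paper's GSB-mixture MCMC sampler.
import numpy as np

rng = np.random.default_rng(2)
a_true, b_true, n = 3.9, 3.9, 300
x = np.empty(n)
x[0] = 0.4
for t in range(n - 1):
    step = a_true * x[t] - b_true * x[t] ** 2 + rng.normal(scale=0.005)
    x[t + 1] = np.clip(step, 0.0, 1.0)   # keep the noisy orbit in the map's range

# Regress x_{t+1} on (x_t, x_t^2) to estimate the map's coefficients.
X = np.column_stack([x[:-1], x[:-1] ** 2])
coef, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
print("estimated coefficients:", coef)   # expect roughly (3.9, -3.9)
```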

  15. Floating Car Data Based Nonparametric Regression Model for Short-Term Travel Speed Prediction

    Institute of Scientific and Technical Information of China (English)

    WENG Jian-cheng; HU Zhong-wei; YU Quan; REN Fu-tian

    2007-01-01

    A K-nearest neighbor (K-NN) based nonparametric regression model was proposed to predict travel speed for Beijing expressways. By using the historical traffic data collected from the detectors in Beijing expressways, a specifically designed database was developed via processes including data filtering, wavelet analysis and clustering. The relativity-based weighted Euclidean distance was used as the distance metric to identify the K groups of nearest data series. Then, a K-NN nonparametric regression model was built to predict the average travel speeds up to 6 min into the future. Several randomly selected travel speed data series, collected from the floating car data (FCD) system, were used to validate the model. The results indicate that using the FCD, the model can predict average travel speeds with an accuracy of above 90%, and hence is feasible and effective.
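
    A minimal sketch of the K-NN regression step described here: given the current speed pattern, find the K most similar historical patterns under a weighted Euclidean distance and average their next-interval speeds as the forecast. The weights, window length, K and data are illustrative placeholders rather than the paper's calibrated choices.

```python
# Minimal K-nearest-neighbour nonparametric regression sketch for
# short-term travel-speed prediction. Historical records pair a recent
# speed pattern with the speed observed over the next interval.
# Weights, K and the synthetic data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
hist_patterns = rng.uniform(20, 90, size=(500, 6))          # last 6 intervals (km/h)
hist_next = hist_patterns[:, -1] + rng.normal(0, 3, 500)    # speed in the next interval

weights = np.array([0.05, 0.1, 0.15, 0.2, 0.25, 0.25])      # recent intervals weigh more

def predict_speed(current_pattern, k=10):
    """Average the next-interval speeds of the K historical patterns
    closest to the current one under a weighted Euclidean distance."""
    d = np.sqrt(((hist_patterns - current_pattern) ** 2 * weights).sum(axis=1))
    nearest = np.argsort(d)[:k]
    return hist_next[nearest].mean()

print(predict_speed(np.array([55, 54, 52, 50, 47, 45])))
```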

  16. Aggregation of Information and Beliefs

    DEFF Research Database (Denmark)

    Ottaviani, Marco; Sørensen, Peter Norman

    In a binary prediction market in which risk-neutral traders have heterogeneous prior beliefs and are allowed to invest a limited amount of money, the static rational expectations equilibrium price is demonstrated to underreact to information. This effect is consistent with a favorite-longshot bias......, and is more pronounced when prior beliefs are more heterogeneous. Relaxing the assumptions of risk neutrality and bounded budget, underreaction to information also holds in a more general asset market with heterogeneous priors, provided traders have decreasing absolute risk aversion. In a dynamic asset market...

  17. Patients' treatment beliefs in low back pain: development and validation of a questionnaire in primary care.

    Science.gov (United States)

    Dima, Alexandra; Lewith, George T; Little, Paul; Moss-Morris, Rona; Foster, Nadine E; Hankins, Matthew; Surtees, George; Bishop, Felicity L

    2015-08-01

    Choosing the most appropriate treatment for individual patients with low back pain (LBP) can be challenging, and clinical guidelines recommend taking into account patients' preferences. However, no tools exist to assess or compare patients' views about LBP treatments. We report the development and validation of the Low Back Pain Treatment Beliefs Questionnaire (LBP-TBQ) for use across different treatments in clinical practice and research. Using qualitative data, we developed a pool of items assessing perceived credibility, effectiveness, concerns about, and individual "fit" of specific treatments. These items were included in a survey completed by 429 primary care patients with LBP, of whom 115 completed it again 1 to 2 weeks later. We performed psychometric analyses using nonparametric item response theory and classical test theory. The 4 subscales of the resulting 16-item LBP-TBQ showed good homogeneity (H = 0.46-0.76), internal consistency (α = 0.73-0.94), and stability (r = 0.63-0.83), confirmed most convergent and discriminant validity hypotheses, and had acceptable structural validity for 4 guideline-recommended treatments: pain medication, exercise, manual therapy, and acupuncture. Participants with stronger positive treatment beliefs were more likely to rank that treatment as their first choice, indicating good criterion validity (t values = 3.11-9.80, all P pain medication effectiveness beliefs, t(339) = 1.35; P = 0.18). A short 4-item version also displayed good homogeneity (H = 0.43-0.66), internal consistency (α = 0.70-0.86), and stability (r = 0.82-0.85) and was significantly related to treatment choice (t values = 4.33-9.25, all P treatment beliefs in primary care patients with LBP and to investigate the effects of treatment beliefs on treatment uptake and adherence.

  18. Variable selection in identification of a high dimensional nonlinear non-parametric system

    Institute of Scientific and Technical Information of China (English)

    Er-Wei BAI; Wenxiao ZHAO; Weixing ZHENG

    2015-01-01

    The problem of variable selection in system identification of a high dimensional nonlinear non-parametric system is described. The inherent difficulty, the curse of dimensionality, is introduced. Then its connections to various topics and research areas are briefly discussed, including order determination, pattern recognition, data mining, machine learning, statistical regression and manifold embedding. Finally, some results of variable selection in system identification in the recent literature are presented.

  19. Estimating Financial Risk Measures for Futures Positions:A Non-Parametric Approach

    OpenAIRE

    Cotter, John; dowd, kevin

    2011-01-01

    This paper presents non-parametric estimates of spectral risk measures applied to long and short positions in 5 prominent equity futures contracts. It also compares these to estimates of two popular alternative measures, the Value-at-Risk (VaR) and Expected Shortfall (ES). The spectral risk measures are conditioned on the coefficient of absolute risk aversion, and the latter two are conditioned on the confidence level. Our findings indicate that all risk measures increase dramatically and the...
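
    For the two benchmark measures mentioned (VaR and Expected Shortfall), a non-parametric historical-simulation estimate is straightforward; the sketch below uses synthetic fat-tailed returns. The spectral risk measures studied in the paper additionally weight the loss quantiles by a risk-aversion function and are not reproduced here.

```python
# Non-parametric (historical-simulation) estimates of Value-at-Risk and
# Expected Shortfall for a long futures position. Synthetic returns; the
# paper's spectral risk measures (risk-aversion-weighted quantiles) are
# not reproduced here.
import numpy as np

rng = np.random.default_rng(4)
returns = rng.standard_t(df=4, size=2500) * 0.01   # daily returns, fat-tailed

def var_es(r, alpha=0.99):
    """Historical VaR and ES at confidence level alpha (losses positive)."""
    losses = -r
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

var99, es99 = var_es(returns)
print(f"99% VaR = {var99:.4f}, 99% ES = {es99:.4f}")
```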

  20. Measuring the influence of information networks on transaction costs using a non-parametric regression technique

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Henning, Christian H. C. A.

    All business transactions as well as achieving innovations take up resources, subsumed under the concept of transaction costs (TAC). One of the major factors in TAC theory is information. Information networks can catalyse the interpersonal information exchange and hence, increase the access to no...... are unveiled by reduced productivity. A cross-validated local linear non-parametric regression shows that good information networks increase the productivity of farms. A bootstrapping procedure confirms that this result is statistically significant....

  1. Asymmetry Effects in Chinese Stock Markets Volatility: A Generalized Additive Nonparametric Approach

    OpenAIRE

    Hou, Ai Jun

    2007-01-01

    The unique characteristics of the Chinese stock markets make it difficult to assume a particular distribution for innovations in returns and the specification form of the volatility process when modeling return volatility with the parametric GARCH family models. This paper therefore applies a generalized additive nonparametric smoothing technique to examine the volatility of the Chinese stock markets. The empirical results indicate that an asymmetric effect of negative news exists in the Chin...

  2. Using a nonparametric PV model to forecast AC power output of PV plants

    OpenAIRE

    Almeida, Marcelo Pinho; Perpiñan Lamigueiro, Oscar; Narvarte Fernández, Luis

    2015-01-01

    In this paper, a methodology using a nonparametric model is used to forecast AC power output of PV plants using as inputs several forecasts of meteorological variables from a Numerical Weather Prediction (NWP) model and actual AC power measurements of PV plants. The methodology was built upon the R environment and uses Quantile Regression Forests as machine learning tool to forecast the AC power with a confidence interval. Real data from five PV plants was used to validate the methodology, an...
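
    Quantile Regression Forests are an R tool in the original methodology; as a rough Python illustration of probabilistic forecasting with a forest, the sketch below takes quantiles over the per-tree predictions of a random forest. This is cruder than a true QRF, which weights training observations in the leaves, and the features and data are synthetic placeholders.

```python
# Rough stand-in for Quantile Regression Forests: take quantiles over the
# per-tree predictions of a random forest to get an approximate forecast
# interval for PV AC power. Features and data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 2000
irradiance = rng.uniform(0, 1000, n)          # NWP-forecast irradiance (W/m^2)
temperature = rng.uniform(-5, 40, n)          # NWP-forecast air temperature (C)
ac_power = 0.8 * irradiance * (1 - 0.004 * (temperature - 25)) + rng.normal(0, 40, n)

X = np.column_stack([irradiance, temperature])
forest = RandomForestRegressor(n_estimators=300, min_samples_leaf=5, random_state=0)
forest.fit(X, ac_power)

x_new = np.array([[650.0, 30.0]])
tree_preds = np.array([t.predict(x_new)[0] for t in forest.estimators_])
print("median:", np.quantile(tree_preds, 0.5),
      "90% interval:", np.quantile(tree_preds, [0.05, 0.95]))
```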

  3. An exact predictive recursion for Bayesian nonparametric analysis of incomplete data

    OpenAIRE

    Garibaldi, Ubaldo; Viarengo, Paolo

    2010-01-01

    This paper presents a new derivation of nonparametric distribution estimation with right-censored data. It is based on an extension of the predictive inferences to compound evidence. The estimate is recursive and exact, and no stochastic approximation is needed: it simply requires that the censored data are processed in decreasing order. Only in this case the recursion provides exact posterior predictive distributions for subsequent samples under a Dirichlet process prior. The resulting estim...

  4. t-tests, non-parametric tests, and large studies—a paradox of statistical practice?

    Directory of Open Access Journals (Sweden)

    Fagerland Morten W

    2012-06-01

    Full Text Available Background: During the last 30 years, the median sample size of research studies published in high-impact medical journals has increased manyfold, while the use of non-parametric tests has increased at the expense of t-tests. This paper explores this paradoxical practice and illustrates its consequences. Methods: A simulation study is used to compare the rejection rates of the Wilcoxon-Mann-Whitney (WMW) test and the two-sample t-test for increasing sample size. Samples are drawn from skewed distributions with equal means and medians but with a small difference in spread. A hypothetical case study is used for illustration and motivation. Results: The WMW test produces, on average, smaller p-values than the t-test. This discrepancy increases with increasing sample size, skewness, and difference in spread. For heavily skewed data, the proportion of p Conclusions: Non-parametric tests are most useful for small studies. Using non-parametric tests in large studies may provide answers to the wrong question, thus confusing readers. For studies with a large sample size, t-tests and their corresponding confidence intervals can and should be used even for heavily skewed data.
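
    A minimal simulation in the spirit of this abstract (not its exact design): two skewed groups with equal means but different shapes. The t-test holds roughly its nominal level for the mean difference, while the WMW rejection rate grows with sample size because the test reacts to distributional differences other than the mean.

```python
# Minimal simulation: equal-mean but differently shaped (skewed) groups.
# The t-test stays near its nominal level for the mean, while the WMW test
# rejects increasingly often as n grows. Illustration only, not the
# paper's exact simulation design.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def rejection_rates(n, reps=1000, alpha=0.05):
    t_rej = w_rej = 0
    for _ in range(reps):
        a = rng.exponential(scale=1.0, size=n)           # mean 1, heavily skewed
        b = rng.gamma(shape=2.0, scale=0.5, size=n)      # mean 1, less skewed
        t_rej += stats.ttest_ind(a, b, equal_var=False).pvalue < alpha
        w_rej += stats.mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha
    return t_rej / reps, w_rej / reps

for n in (25, 100, 400):
    print(n, rejection_rates(n))
```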

  5. Nonparametric Kernel Smoothing Methods. The sm library in Xlisp-Stat

    Directory of Open Access Journals (Sweden)

    Luca Scrucca

    2001-06-01

    Full Text Available In this paper we describe the Xlisp-Stat version of the sm library, software for applying nonparametric kernel smoothing methods. The original version of the sm library was written by Bowman and Azzalini in S-Plus, and it is documented in their book Applied Smoothing Techniques for Data Analysis (1997). This is also the main reference for a complete description of the statistical methods implemented. The sm library provides kernel smoothing methods for obtaining nonparametric estimates of density functions and regression curves for different data structures. Smoothing techniques may be employed as a descriptive graphical tool for exploratory data analysis. Furthermore, they can also serve for inferential purposes as, for instance, when a nonparametric estimate is used for checking a proposed parametric model. The Xlisp-Stat version includes some extensions to the original sm library, mainly in the area of local likelihood estimation for generalized linear models. The Xlisp-Stat version of the sm library has been written following an object-oriented approach. This should allow experienced Xlisp-Stat users to implement easily their own methods and new research ideas into the built-in prototypes.
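
    The sm library itself targets S-Plus and Xlisp-Stat; as a language-neutral illustration of the kind of estimate it produces, here is a minimal Nadaraya-Watson kernel regression smoother with a Gaussian kernel. The bandwidth and data are arbitrary choices.

```python
# Minimal Gaussian-kernel (Nadaraya-Watson) regression smoother, as an
# illustration of the kernel smoothing estimates the sm library provides.
# Bandwidth h and the synthetic data are arbitrary choices.
import numpy as np

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.3, x.size)

def nw_smooth(x0, x, y, h=0.3):
    """Nadaraya-Watson estimate m(x0) = sum(K_h(x0 - x_i) * y_i) / sum(K_h)."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

grid = np.linspace(0, 2 * np.pi, 9)
print([round(nw_smooth(g, x, y), 3) for g in grid])
```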

  6. Nonparametric feature extraction for classification of hyperspectral images with limited training samples

    Science.gov (United States)

    Kianisarkaleh, Azadeh; Ghassemian, Hassan

    2016-09-01

    Feature extraction plays a crucial role in improving hyperspectral image classification. Nonparametric feature extraction methods show better performance than parametric ones when the class distributions are not normal-like. Moreover, they can extract more features than parametric methods do. In this paper, a new nonparametric linear feature extraction method is introduced for the classification of hyperspectral images. The proposed method has no free parameter and its novelty can be discussed in two parts. First, neighbor samples are specified by using the Parzen window idea for determining the local mean. Second, two new weighting functions are used: samples close to class boundaries have more weight in the formation of the between-class scatter matrix, and samples close to the class mean have more weight in the formation of the within-class scatter matrix. Experimental results on three real hyperspectral data sets, Indian Pines, Salinas and Pavia University, demonstrate that the proposed method performs better than some other nonparametric and parametric feature extraction methods.

  7. A Comparison of Parametric and Non-Parametric Methods Applied to a Likert Scale.

    Science.gov (United States)

    Mircioiu, Constantin; Atkinson, Jeffrey

    2017-05-10

    A trenchant and passionate dispute over the use of parametric versus non-parametric methods for the analysis of Likert scale ordinal data has raged for the past eight decades. The answer is not a simple "yes" or "no" but is related to hypotheses, objectives, risks, and paradigms. In this paper, we took a pragmatic approach. We applied both types of methods to the analysis of actual Likert data on responses from different professional subgroups of European pharmacists regarding competencies for practice. Results obtained show that with "large" (>15) numbers of responses and similar (but clearly not normal) distributions from different subgroups, parametric and non-parametric analyses give in almost all cases the same significant or non-significant results for inter-subgroup comparisons. Parametric methods were more discriminant in the cases of non-similar conclusions. Considering that the largest differences in opinions occurred in the upper part of the 4-point Likert scale (ranks 3 "very important" and 4 "essential"), a "score analysis" based on this part of the data was undertaken. This transformation of the ordinal Likert data into binary scores produced a graphical representation that was visually easier to understand as differences were accentuated. In conclusion, in this case of Likert ordinal data with high response rates, restraining the analysis to non-parametric methods leads to a loss of information. The addition of parametric methods, graphical analysis, analysis of subsets, and transformation of data leads to more in-depth analyses.

  8. Non-parametric foreground subtraction for 21cm epoch of reionization experiments

    CERN Document Server

    Harker, Geraint; Bernardi, Gianni; Brentjens, Michiel A; De Bruyn, A G; Ciardi, Benedetta; Jelic, Vibor; Koopmans, Leon V E; Labropoulos, Panagiotis; Mellema, Garrelt; Offringa, Andre; Pandey, V N; Schaye, Joop; Thomas, Rajat M; Yatawatta, Sarod

    2009-01-01

    An obstacle to the detection of redshifted 21cm emission from the epoch of reionization (EoR) is the presence of foregrounds which exceed the cosmological signal in intensity by orders of magnitude. We argue that in principle it would be better to fit the foregrounds non-parametrically - allowing the data to determine their shape - rather than selecting some functional form in advance and then fitting its parameters. Non-parametric fits often suffer from other problems, however. We discuss these before suggesting a non-parametric method, Wp smoothing, which seems to avoid some of them. After outlining the principles of Wp smoothing we describe an algorithm used to implement it. We then apply Wp smoothing to a synthetic data cube for the LOFAR EoR experiment. The performance of Wp smoothing, measured by the extent to which it is able to recover the variance of the cosmological signal and to which it avoids leakage of power from the foregrounds, is compared to that of a parametric fit, and to another non-parame...

  9. The properties and mechanism of long-term memory in nonparametric volatility

    Science.gov (United States)

    Li, Handong; Cao, Shi-Nan; Wang, Yan

    2010-08-01

    Recent empirical literature documents the presence of long-term memory in return volatility, but the mechanism behind this long-term memory is still unclear. In this paper, we investigate the origin and properties of long-term memory with nonparametric volatility, using high-frequency time series data of the Chinese Shanghai Composite Stock Price Index. We perform Detrended Fluctuation Analysis (DFA) on three different nonparametric volatility estimators with different sampling frequencies. For the same volatility series, the Hurst exponents decrease as the sampling time interval increases, but they remain larger than 1/2, which means that changing the interval does not remove the long memory. RRV exhibits relatively stable long-term memory and is less influenced by the sampling frequency. RV and RBV show some trends that depend on the time interval, indicating that the jump component has no significant impact on the long-term memory property. This suggests that the presence of long-term memory in nonparametric volatility can be attributed to the integrated variance component. Considering the impact of microstructure noise, RBV and RRV still present long-term memory under various time intervals. We can infer that the presence of long-term memory in realized volatility is not affected by market microstructure noise. Our findings imply that the long-term memory phenomenon is an inherent characteristic of the data generating process, not a result of microstructure noise or volatility clustering.
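
    A compact sketch of the Detrended Fluctuation Analysis step used in this study: integrate the demeaned series, detrend it within boxes of size s, and read the scaling exponent from the slope of log F(s) against log s. The box sizes are arbitrary, and white noise is used so the exponent should come out near 0.5 (long-memory volatility series give values above 0.5).

```python
# Compact Detrended Fluctuation Analysis (DFA) sketch. For white noise the
# fitted exponent should be close to 0.5; long-memory volatility series
# give exponents above 0.5. Box sizes and data are illustrative.
import numpy as np

def dfa_exponent(series, box_sizes):
    profile = np.cumsum(series - np.mean(series))        # integrated, demeaned series
    flucts = []
    for s in box_sizes:
        n_boxes = len(profile) // s
        rms = []
        for b in range(n_boxes):
            seg = profile[b * s:(b + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrending
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # Scaling exponent = slope of log F(s) versus log s.
    return np.polyfit(np.log(box_sizes), np.log(flucts), 1)[0]

rng = np.random.default_rng(8)
noise = rng.normal(size=5000)
print(round(dfa_exponent(noise, [16, 32, 64, 128, 256]), 3))   # ~0.5
```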

  10. Bayesian Nonparametric Estimation for Dynamic Treatment Regimes with Sequential Transition Times.

    Science.gov (United States)

    Xu, Yanxun; Müller, Peter; Wahed, Abdus S; Thall, Peter F

    2016-01-01

    We analyze a dataset arising from a clinical trial involving multi-stage chemotherapy regimes for acute leukemia. The trial design was a 2 × 2 factorial for frontline therapies only. Motivated by the idea that subsequent salvage treatments affect survival time, we model therapy as a dynamic treatment regime (DTR), that is, an alternating sequence of adaptive treatments or other actions and transition times between disease states. These sequences may vary substantially between patients, depending on how the regime plays out. To evaluate the regimes, mean overall survival time is expressed as a weighted average of the means of all possible sums of successive transitions times. We assume a Bayesian nonparametric survival regression model for each transition time, with a dependent Dirichlet process prior and Gaussian process base measure (DDP-GP). Posterior simulation is implemented by Markov chain Monte Carlo (MCMC) sampling. We provide general guidelines for constructing a prior using empirical Bayes methods. The proposed approach is compared with inverse probability of treatment weighting, including a doubly robust augmented version of this approach, for both single-stage and multi-stage regimes with treatment assignment depending on baseline covariates. The simulations show that the proposed nonparametric Bayesian approach can substantially improve inference compared to existing methods. An R program for implementing the DDP-GP-based Bayesian nonparametric analysis is freely available at https://www.ma.utexas.edu/users/yxu/.

  11. On the Choice of Difference Sequence in a Unified Framework for Variance Estimation in Nonparametric Regression

    KAUST Repository

    Dai, Wenlin

    2017-09-01

    Difference-based methods do not require estimating the mean function in nonparametric regression and are therefore popular in practice. In this paper, we propose a unified framework for variance estimation that combines the linear regression method with the higher-order difference estimators systematically. The unified framework has greatly enriched the existing literature on variance estimation that includes most existing estimators as special cases. More importantly, the unified framework has also provided a smart way to solve the challenging difference sequence selection problem that remains a long-standing controversial issue in nonparametric regression for several decades. Using both theory and simulations, we recommend to use the ordinary difference sequence in the unified framework, no matter if the sample size is small or if the signal-to-noise ratio is large. Finally, to cater for the demands of the application, we have developed a unified R package, named VarED, that integrates the existing difference-based estimators and the unified estimators in nonparametric regression and have made it freely available in the R statistical program http://cran.r-project.org/web/packages/.
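
    The simplest member of the family discussed here, the ordinary first-order difference estimator of the error variance, can be written in a few lines because successive differences cancel the smooth mean function. This is only the baseline estimator, not the paper's unified estimator (implemented in their VarED package).

```python
# Ordinary first-order difference estimator of the error variance in
# nonparametric regression: successive differences cancel the smooth mean
# function, so sum of squared differences / (2(n-1)) estimates sigma^2.
# Baseline estimator only; the paper's unified framework combines
# higher-order difference sequences.
import numpy as np

rng = np.random.default_rng(9)
x = np.linspace(0, 1, 500)
sigma_true = 0.2
y = np.sin(4 * np.pi * x) + rng.normal(0, sigma_true, x.size)

sigma2_hat = np.sum(np.diff(y) ** 2) / (2 * (len(y) - 1))
print(round(sigma2_hat, 4), "vs true", sigma_true ** 2)
```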

  12. The Relationships among Chinese Practicing Teachers' Epistemic Beliefs, Pedagogical Beliefs and Their Beliefs about the Use of ICT

    Science.gov (United States)

    Deng, Feng; Chai, Ching Sing; Tsai, Chin-Chung; Lee, Min-Hsien

    2014-01-01

    This study aimed to investigate the relationships among practicing teachers' epistemic beliefs, pedagogical beliefs and their beliefs about the use of ICT through survey methodology. Participants were 396 high school practicing teachers from mainland China. The path analysis results analyzed via structural equation modelling technique indicated…

  13. Changing Conspiracy Beliefs through Rationality and Ridiculing

    Directory of Open Access Journals (Sweden)

    Gábor Orosz

    2016-10-01

    Full Text Available Conspiracy theory (CT) beliefs can be harmful. How is it possible to reduce them effectively? Three reduction strategies were tested in an online experiment using general and well-known CT beliefs on a comprehensive randomly assigned Hungarian sample (N = 813): exposing rational counter CT arguments, ridiculing those who hold CT beliefs, and empathizing with the targets of CT beliefs. Several relevant individual differences were measured. Rational and ridiculing arguments were effective in reducing CT, whereas empathizing with the targets of CTs had no effect. Individual differences played no role in CT reduction, but the perceived intelligence and competence of the individual who conveyed the CT belief-reduction information contributed to the success of the CT belief reduction. Rational arguments targeting the link between the object of belief and its characteristics appear to be an effective tool in fighting conspiracy theory beliefs.

  14. Astrology Beliefs among Undergraduate Students

    Science.gov (United States)

    Sugarman, Hannah; Impey, Chris; Buxner, Sanlyn; Antonellis, Jessie

    2011-01-01

    A survey of the science knowledge and attitudes toward science of nearly 10000 undergraduates at a large public university over a 20-year period included several questions addressing student beliefs in astrology and other forms of pseudoscience. The results from our data reveal that a large majority of students (78%) considered astrology "very" or…

  15. Witchcraft Beliefs and Witch Hunts

    NARCIS (Netherlands)

    Koning, N.B.J.

    2013-01-01

    This paper proposes an interdisciplinary explanation of the cross-cultural similarities and evolutionary patterns of witchcraft beliefs. It argues that human social dilemmas have led to the evolution of a fear system that is sensitive to signs of deceit and envy. This was adapted in the evolutionary

  16. Resilience: It Begins with Beliefs

    Science.gov (United States)

    Truebridge, Sara

    2016-01-01

    Educators' beliefs are powerful, affecting not only their pedagogical practices, but also student efficacy and success. The academic achievement of any particular student may rely greatly on whether the teacher believes that student has the ability to succeed. This article affirms the imperative for administrators and educators to spend time…

  17. Witchcraft Beliefs and Witch Hunts

    NARCIS (Netherlands)

    Koning, N.B.J.

    2013-01-01

    This paper proposes an interdisciplinary explanation of the cross-cultural similarities and evolutionary patterns of witchcraft beliefs. It argues that human social dilemmas have led to the evolution of a fear system that is sensitive to signs of deceit and envy. This was adapted in the evolutionary

  18. Intuition, affect, and peculiar beliefs

    NARCIS (Netherlands)

    Boden, M.T.; Berenbaum, H.; Topper, M.

    2012-01-01

    Research with college students has found that intuitive thinking (e.g., using hunches to ascribe meaning to experiences) and positive affect interactively predict ideas of reference and odd/magical beliefs. We investigated whether these results would generalize to a diverse community sample of adult

  19. Astrology Beliefs among Undergraduate Students

    Science.gov (United States)

    Sugarman, Hannah; Impey, Chris; Buxner, Sanlyn; Antonellis, Jessie

    2011-01-01

    A survey of the science knowledge and attitudes toward science of nearly 10000 undergraduates at a large public university over a 20-year period included several questions addressing student beliefs in astrology and other forms of pseudoscience. The results from our data reveal that a large majority of students (78%) considered astrology "very" or…

  20. Order effects in research on paranormal belief.

    Science.gov (United States)

    Dudley, R Thomas

    2002-04-01

    Measures of paranormal belief and emotional intelligence were given to a group of 72 college students using Tobacyk's Revised Paranormal Belief Scale and Schutte, Malouff, Hall, Haggerty, Cooper, Golden, and Dornheim's Emotional Intelligence Scale. Order effects indicated that participants who took the Paranormal Belief Scale first had lower emotional intelligence scores than those who took the Emotional Intelligence Scale first. The study demonstrates the importance of taking order effects into account when conducting research on paranormal belief.

  1. Development of the Beliefs About Yoga Scale

    OpenAIRE

    Sohl, Stephanie J.; Schnur, Julie B.; Daly, Leslie; Suslov, Kathryn; Montgomery, Guy H.

    2011-01-01

    Beliefs about yoga may influence participation in yoga and outcomes of yoga interventions. There is currently no scale appropriate for assessing these beliefs in the general U.S. population. This study took the first steps in developing and validating a Beliefs About Yoga Scale (BAYS) to assess beliefs about yoga that may influence people’s engagement in yoga interventions. Items were generated based on previously published research about perceptions of yoga and reviewed by experts within the...

  2. Everyday beliefs about food and health

    OpenAIRE

    Saher, Marieke

    2006-01-01

    The series of studies addresses several everyday beliefs about food and health from the perspective of everyday thinking and paranormal beliefs. They are "you are what you eat" beliefs, attitudes towards genetically modified and organic foods, and belief in alternative medicine. The survey studies included from 239 to 3261 Finnish participants. It was found that food consumption can have far-stretching consequences for the impressions of the eater in a "you are what you eat" manner. The ...

  3. Illness causal beliefs in Turkish immigrants

    OpenAIRE

    Klimidis Steven; Minas Harry; Tuncer Can

    2007-01-01

    Abstract Background People hold a wide variety of beliefs concerning the causes of illness. Such beliefs vary across cultures and, among immigrants, may be influenced by many factors, including level of acculturation, gender, level of education, and experience of illness and treatment. This study examines illness causal beliefs in Turkish-immigrants in Australia. Methods Causal beliefs about somatic and mental illness were examined in a sample of 444 members of the Turkish population of Melbo...

  4. Wave propagation in elastic solids

    CERN Document Server

    Achenbach, Jan

    1984-01-01

    The propagation of mechanical disturbances in solids is of interest in many branches of the physical sciences and engineering. This book aims to present an account of the theory of wave propagation in elastic solids. The material is arranged to present an exposition of the basic concepts of mechanical wave propagation within a one-dimensional setting and a discussion of formal aspects of elastodynamic theory in three dimensions, followed by chapters expounding on typical wave propagation phenomena, such as radiation, reflection, refraction, propagation in waveguides, and diffraction. The treat

  5. Stochastic model in microwave propagation

    Energy Technology Data Exchange (ETDEWEB)

    Ranfagni, A. [“Nello Carrara” Institute of Applied Physics, CNR Florence Research Area, Via Madonna del Piano 10, 50019 Sesto Fiorentino (Italy); Mugnai, D., E-mail: d.mugnai@ifac.cnr.it [“Nello Carrara” Institute of Applied Physics, CNR Florence Research Area, Via Madonna del Piano 10, 50019 Sesto Fiorentino (Italy)

    2011-11-28

    Further experimental results of delay time in microwave propagation are reported in the presence of a lossy medium (wood). The measurements show that the presence of a lossy medium makes the propagation slightly superluminal. The results are interpreted on the basis of a stochastic (or path integral) model, showing how this model is able to describe each kind of physical system in which multi-path trajectories are present. -- Highlights: ► We present new experimental results on electromagnetic “anomalous” propagation. ► We apply a path integral theoretical model to wave propagation. ► Stochastic processes and multi-path trajectories in propagation are considered.

  6. The moderating role of rational beliefs in the relationship between irrational beliefs and posttraumatic stress symptomology.

    Science.gov (United States)

    Hyland, Philip; Shevlin, Mark; Adamson, Gary; Boduszek, Daniel

    2014-05-01

    Rational Emotive Behaviour Therapy (REBT) assumes that rational beliefs act as cognitive protective factors against the development of psychopathology; however little empirical evidence exists regarding the nature of the possible protective effects that they offer. The current study investigates whether rational beliefs moderate the impact of irrational beliefs on posttraumatic stress symptomology (PTS). Three hundred and thirteen active law enforcement, military, and related emergency service personnel took part in the current study. Sequential moderated multiple regression analysis was employed to investigate: (i) the direct impact of irrational beliefs on PTS; (ii) the direct impact of rational beliefs on PTS; (iii) the moderating effects of rational beliefs in the relationship between irrational beliefs and PTS. The irrational beliefs predicted by REBT theory emerged as critical predictors of PTS symptomology, in particular Depreciation beliefs. Rational beliefs (Preferences, and Acceptance beliefs) had a direct, negative impact on levels of PTS, and Acceptance beliefs moderated the impact of Catastrophizing beliefs on PTS. Irrational beliefs are important cognitive vulnerability factors in symptoms of PTS, while rational beliefs (Acceptance) appear to have a protective role in the emergence of PTS symptoms, both directly and by moderating the impact of Catastrophizing beliefs.

  7. Temporal scaling in information propagation.

    Science.gov (United States)

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-18

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.
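
    The reported scaling law can be illustrated by fitting a power-law decay of the propagation probability against the time latency with a log-log linear regression; the data below are synthetic and the exponent is an assumed value.

```python
# Illustration of the reported temporal scaling: propagation probability
# decaying as a power law of the latency since the last interaction.
# Synthetic data; the exponent is recovered with a log-log linear fit.
import numpy as np

rng = np.random.default_rng(10)
latency = np.logspace(0, 3, 30)                       # hours since last interaction
true_exponent = -0.8                                  # assumed value, not from the paper
prob = 0.3 * latency ** true_exponent * np.exp(rng.normal(0, 0.05, latency.size))

slope, intercept = np.polyfit(np.log(latency), np.log(prob), 1)
print("estimated decay exponent:", round(slope, 3))   # ~ -0.8
```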

  8. Temporal scaling in information propagation

    Science.gov (United States)

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-01

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.

  9. Changing Preservice Teachers' Beliefs about Motivating Students

    Science.gov (United States)

    Peterson, Sarah; Schreiber, Jim; Moss, Connie

    2011-01-01

    We examined the effects of an educational psychology course on students' beliefs about motivating students. After providing opportunities to engage in systematic intentional inquiry of their beliefs about teaching and learning, we expected that students' beliefs would become more soundly based in theory and research. Following several classes on…

  10. Religious beliefs and entrepreneurship among Dutch protestants

    NARCIS (Netherlands)

    Rietveld, C.A.; Burg, van J.C.

    2014-01-01

    Religious beliefs are known to correlate with a wide range of socio-economic behaviors. The aim of this study is to investigate the relation between religious beliefs and entrepreneurship. Empirical evidence that entrepreneurship rates differ among religions suggests that religious beliefs influence

  11. Understanding Children's Epistemic Beliefs in Elementary Education

    Science.gov (United States)

    Lunn Brownlee, Jo; Curtis, Elizabeth; Spooner-Lane, Rebecca; Feucht, Florian

    2017-01-01

    Research shows that the beliefs individuals hold about knowledge and knowing (epistemic beliefs) influence learning approaches and outcomes. However, little is known about the nature of children's epistemic beliefs and how best to measure these. In this pilot study, 11 Australian children (in Grade 4 or Grade 6) were asked to "draw, write and…

  12. An Investigation on English Teachers’ Teaching Beliefs

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The teaching repertoire of any individual teacher is an amalgam of beliefs, knowledge and assumptions. Teaching beliefs are considered particularly powerful by some researchers. This article aims to investigate English teachers’ teaching beliefs, which will contribute to the cultivation of qualified English teachers.

  13. Loneliness and Irrational Beliefs among College Students.

    Science.gov (United States)

    Hoglund, Collette L.; Collison, Brooke B.

    1989-01-01

    Investigated relationship between loneliness and irrational beliefs among 236 college students who completed the University of California at Los Angeles (UCLA) Loneliness Scale and the Irrational Beliefs Test (IBT). Results revealed three specific irrational beliefs (Dependency, Anxious Overconcern, and Frustration Reactivity) to be predictive of…

  14. Well-Founded Belief and Perceptual Justification

    DEFF Research Database (Denmark)

    Broncano-Berrocal, Fernando

    2016-01-01

    According to Alan Millar, justified beliefs are well-founded beliefs. Millar cashes out the notion of well-foundedness in terms of having an adequate reason to believe something and believing it for that reason. To make his account of justified belief compatible with perceptual justification he a...

  15. Losing Belief, While Keeping Up the Attitudes

    DEFF Research Database (Denmark)

    Klausen, Søren Harnow

    2013-01-01

    While arguing that many cognitive states do indeed have a characteristic phenomenology, I find reasons for exempting beliefs from the program of cognitive phenomenology. Examining the complex relationship between beliefs and various kinds of conscious experience shows that belief is a messy conce...

  16. Religious beliefs and entrepreneurship among Dutch protestants

    NARCIS (Netherlands)

    Rietveld, C.A.; Burg, van J.C.

    2014-01-01

    Religious beliefs are known to correlate with a wide range of socio-economic behaviors. The aim of this study is to investigate the relation between religious beliefs and entrepreneurship. Empirical evidence that entrepreneurship rates differ among religions suggests that religious beliefs influence

  17. Validity of Parametrized Quark Propagator

    Institute of Scientific and Technical Information of China (English)

    ZHU Ji-Zhen; ZHOU Li-Juan; MA Wei-Xing

    2005-01-01

    Based on an extensive study of the Dyson-Schwinger equations for a fully dressed quark propagator in the “rainbow” approximation, a parametrized fully dressed quark propagator is proposed in this paper. The parametrized propagator describes a confining quark propagator in a hadron since it is analytic everywhere in the complex p2-plane and has no Lehmann representation. The validity of the new propagator is discussed by comparing its predictions for the self-energy functions Af(p2), Bf(p2) and the effective mass Mf(p2) of a quark with flavor f to the corresponding theoretical results produced by the Dyson-Schwinger equations. Our comparison shows that the parametrized quark propagator is a good approximation to the fully dressed quark propagator given by the solutions of the Dyson-Schwinger equations in the rainbow approximation and is convenient to use in theoretical calculations.

  18. Validity of Parametrized Quark Propagator

    Institute of Scientific and Technical Information of China (English)

    ZHU Ji-Zhen; ZHOU Li-Juan; MA Wei-Xing

    2005-01-01

    Based on an extensive study of the Dyson-Schwinger equations for a fully dressed quark propagator in the "rainbow" approximation, a parametrized fully dressed quark propagator is proposed in this paper. The parametrized propagator describes a confining quark propagator in a hadron since it is analytic everywhere in the complex p2-plane and has no Lehmann representation. The validity of the new propagator is discussed by comparing its predictions for the self-energy functions Af(p2), Bf(p2) and the effective mass Mf(p2) of a quark with flavor f to the corresponding theoretical results produced by the Dyson-Schwinger equations. Our comparison shows that the parametrized quark propagator is a good approximation to the fully dressed quark propagator given by the solutions of the Dyson-Schwinger equations in the rainbow approximation and is convenient to use in theoretical calculations.

  19. PIV uncertainty propagation

    Science.gov (United States)

    Sciacchitano, Andrea; Wieneke, Bernhard

    2016-08-01

    This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It is shown that the uncertainty of vorticity and velocity divergence requires the knowledge of the spatial correlation between the error of the x and y particle image displacement, which depends upon the measurement spatial resolution. The uncertainty of statistical quantities is often dominated by the random uncertainty due to the finite sample size and decreases with the square root of the effective number of independent samples. Monte Carlo simulations are conducted to assess the accuracy of the uncertainty propagation formulae. Furthermore, three experimental assessments are carried out. In the first experiment, a turntable is used to simulate a rigid rotation flow field. The estimated uncertainty of the vorticity is compared with the actual vorticity error root-mean-square, with differences between the two quantities within 5-10% for different interrogation window sizes and overlap factors. A turbulent jet flow is investigated in the second experimental assessment. The reference velocity, which is used to compute the reference value of the instantaneous flow properties of interest, is obtained with an auxiliary PIV system, which features a higher dynamic range than the measurement system. Finally, the uncertainty quantification of statistical quantities is assessed via PIV measurements in a cavity flow. The comparison between estimated uncertainty and actual error demonstrates the accuracy of the proposed uncertainty propagation methodology.
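
    As a worked example of the kind of propagation discussed, the snippet below pushes independent velocity uncertainties through a central-difference vorticity estimate. The paper stresses that real PIV displacement errors are spatially correlated, so this zero-correlation formula is only a simplified limit; the grid spacing and uncertainty values are made up.

```python
# Simplified propagation of velocity uncertainty to a central-difference
# vorticity estimate, assuming *uncorrelated* errors. The paper stresses
# that real PIV errors are spatially correlated, so this is only the
# zero-correlation limit. Grid spacing and uncertainties are made up.
import numpy as np

dx = dy = 1.0e-3          # grid spacing (m)
sigma_u = sigma_v = 0.05  # velocity uncertainty per component (m/s)

# omega_z ~ (v[i+1,j] - v[i-1,j]) / (2 dx) - (u[i,j+1] - u[i,j-1]) / (2 dy)
# With independent errors, the variances of the two differences add:
sigma_omega = np.sqrt(2 * sigma_v**2 / (2 * dx) ** 2 +
                      2 * sigma_u**2 / (2 * dy) ** 2)
print(f"vorticity uncertainty ~ {sigma_omega:.1f} 1/s")
```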

  20. Ionospheric Radio Propagation

    Science.gov (United States)

    1948-06-25

    Chapter 2 presents in simple form the mathematical theory underlying the propagation of radio waves, together with principles which have been found to work in practice. The remainder of the scanned abstract is fragmentary OCR text referring to reading the value of the muf (maximum usable frequency) from world contour charts, measuring virtual heights, and assorted index entries.

  1. Sequential Back—Propagation

    Institute of Scientific and Technical Information of China (English)

    王晖; 刘大有; et al.

    1994-01-01

    In this paper we consider the problem of sequential processing and present a sequential model based on the back-propagation algorithm. This model is intended to deal with intrinsically sequential problems, such as word recognition, speech recognition, and natural language understanding. The model can be used to train a network to learn a sequence of input patterns, in either a fixed or a random order. Besides, the model is open- and partial-associative, characterized as “recognizing while accumulating”, which, as we argue, is oriented towards the process of mental cognition.

  2. Introducing the modified paranormal belief scale: distinguishing between classic paranormal beliefs, religious paranormal beliefs and conventional religiosity among undergraduates in Northern Ireland and Wales

    OpenAIRE

    Williams, Emyr; Francis, Leslie J.; Lewis, Christopher Alan

    2009-01-01

    Previous empirical studies concerned with the association between paranormal beliefs and conventional religiosity have produced conflicting evidence. Drawing on Rice's (2003) distinction between classic paranormal beliefs and religious paranormal beliefs, the present study proposed a modified form of the Tobacyk Revised Paranormal Belief Scale to produce separate scores for these two forms of paranormal belief, styled 'religious paranormal beliefs' and 'classic paranormal beliefs'. Data provi...

  3. Light Propagation Volumes

    OpenAIRE

    Mikulica, Tomáš

    2016-01-01

    The aim of this master's thesis is to describe various methods for computing the global illumination of a scene, including the Light Propagation Volumes technique. For this method, all three computation steps are described in detail: injection, propagation and rendering. Several custom extensions improving the visual quality of the method are also proposed. The design and implementation parts focus on the description of the scene, the rendering system, shadow generation, the implementation of the Light Propagation Volumes method and the proposed extensions. The thesis concludes with measurements, comparing...

  4. Instrumenting Beliefs in Threshold Public Goods.

    Science.gov (United States)

    de Oliveira, Angela C M; Spraggon, John M; Denny, Matthew J

    2016-01-01

    Understanding the causal impact of beliefs on contributions in Threshold Public Goods (TPGs) is particularly important since the social optimum can be supported as a Nash Equilibrium and best-response contributions are a function of beliefs. Unfortunately, investigations of the impact of beliefs on behavior are plagued with endogeneity concerns. We create a set of instruments by cleanly and exogenously manipulating beliefs without deception. Tests indicate that the instruments are valid and relevant. Perhaps surprisingly, we fail to find evidence that beliefs are endogenous in either the one-shot or repeated-decision settings. TPG allocations are determined by a base contribution and beliefs in a one shot-setting. In the repeated-decision environment, once we instrument for first-round allocations, we find that second-round allocations are driven equally by beliefs and history. Moreover, we find that failing to instrument prior decisions overstates their importance.

  5. Beliefs about language development: construct validity evidence.

    Science.gov (United States)

    Donahue, Mavis L; Fu, Qiong; Smith, Everett V

    2012-01-01

    Understanding language development is incomplete without recognizing children's sociocultural environments, including adult beliefs about language development. Yet there is a need for data supporting valid inferences to assess these beliefs. The current study investigated the psychometric properties of data from a survey (MODeL) designed to explore beliefs in the popular culture, and their alignment with more formal theories. Support for the content, substantive, structural, generalizability, and external aspects of construct validity of the data were investigated. Subscales representing Behaviorist, Cognitive, Nativist, and Sociolinguistic models were identified as dimensions of beliefs. More than half of the items showed a high degree of consensus, suggesting culturally-transmitted beliefs. Behaviorist ideas were most popular. Bilingualism and ethnicity were related to Cognitive and Sociolinguistic beliefs. Identifying these beliefs may clarify the nature of child-directed speech, and enable the design of language intervention programs that are congruent with family and cultural expectations.

  6. Parental and child health beliefs and behavior.

    Science.gov (United States)

    Dielman, T E; Leech, S; Becker, M H; Rosenstock, I M; Horvath, W J; Radius, S M

    1982-01-01

    Personal interviews concerning health beliefs and behaviors were conducted with a parent and child in each of 250 households. Index scores were constructed for parental and child health beliefs, and these scores were entered, along with demographic variables, in a series of multiple regression analyses predicting child health beliefs and behaviors. The age of the child was the variable most highly associated with three of four child health behaviors and four of six child health beliefs. The children's snacking between meals and cigarette smoking were related to several parental behaviors and, to a lesser extent, parental health beliefs. The children's health beliefs were less predictable than were their health behaviors, and the observed significant relationships were with parental health beliefs and demographics. The implications for the design of health education programs are discussed.

  7. Narcissism and belief in the paranormal.

    Science.gov (United States)

    Roe, Chris A; Morgan, Claire L

    2002-04-01

    The present study was designed to assess whether the relationship between narcissistic personality and paranormal belief identified by Tobacyk and Mitchell earlier could be replicated with a general population and to see whether the effect could be found with a narrower definition of paranormal beliefs that focuses only on belief in psychic phenomena. 75 participants completed the Narcissistic Personality Inventory and two measures of paranormal belief, the Paranormal Belief Scale and the Australian Sheep-Goat Scale. There was no correlation between narcissism and Paranormal Belief Scale scores, but narcissism and Australian Sheep-Goat Scale scores were significantly positively correlated. Of the three subscales to the Australian Sheep-Goat measure, scores for narcissism correlated with belief in ESP and PK but not in Life after death. These relationships were interpreted in terms of need for control.

  8. When Brain Death Belies Belief.

    Science.gov (United States)

    Yanke, Greg; Rady, Mohamed Y; Verheijde, Joseph L

    2016-12-01

    The case of Jahi McMath has reignited a discussion concerning how society should define death. Despite pronouncing McMath brain dead based on the American Academy of Neurology criteria, the court ordered continued mechanical ventilation to accommodate the family's religious beliefs. Recent case law suggests that the potential for a successful challenge to the neurologic criteria of death provisions of the Uniform Determination of Death Act are greater than ever in the majority of states that have passed religious freedom legislation. As well, because standard ethical claims regarding brain death are either patently untrue or subject to legitimate dispute, those whose beliefs do not comport with the brain death standard should be able to reject it.

  9. Belief bias and relational reasoning.

    Science.gov (United States)

    Roberts, Maxwell J; Sykes, Elizabeth D A

    2003-01-01

    When people evaluate categorical syllogisms, they tend to reject unbelievable conclusions and accept believable ones irrespective of their validity. Typically, this effect is particularly marked for invalid conclusions that are possible, but do not necessarily follow, given the premises. However, smaller believability effects can also be detected for other types of conclusion. Three experiments are reported here, in which an attempt was made to determine whether belief bias effects can manifest themselves on the relational inference task. Subjects evaluated the validity of conclusions such as William the Conqueror was king after the Pyramids were built (temporal task) or Manchester is north of Bournemouth (spatial task) with respect to their premises. All of the major findings for equivalent categorical syllogism tasks were replicated. However, the overall size of the main effect of believability appears to be related to task presentation, a phenomenon not previously identified for categorical syllogisms and which current theories of belief bias have difficulty explaining.

  10. Outcome Knowledge and False Belief.

    Science.gov (United States)

    Ghrear, Siba E; Birch, Susan A J; Bernstein, Daniel M

    2016-01-01

    Virtually every social interaction involves reasoning about the perspectives of others, or 'theory of mind (ToM).' Previous research suggests that it is difficult to ignore our current knowledge when reasoning about a more naïve perspective (i.e., the curse of knowledge). In this Mini Review, we discuss the implications of the curse of knowledge for certain aspects of ToM. Particularly, we examine how the curse of knowledge influences key measurements of false belief reasoning. In closing, we touch on the need to develop new measurement tools to discern the mechanisms involved in the curse of knowledge and false belief reasoning, and how they develop across the lifespan.

  11. Outcome knowledge and false belief

    Directory of Open Access Journals (Sweden)

    Siba eGhrear

    2016-02-01

    Full Text Available Virtually every social interaction involves reasoning about the perspectives of others, or ‘theory of mind’. Previous research suggests that it is difficult to ignore our current knowledge when reasoning about a more naïve perspective (i.e., the curse of knowledge). In this Mini Review, we discuss the implications of the curse of knowledge for certain aspects of theory of mind. Particularly, we examine how the curse of knowledge influences key measurements of false belief reasoning. In closing, we touch on the need to develop new measurement tools to discern the mechanisms involved in the curse of knowledge and false belief reasoning, and how they develop across the lifespan.

  12. Light Front Boson Model Propagation

    Institute of Scientific and Technical Information of China (English)

    Jorge Henrique Sales; Alfredo Takashi Suzuki

    2011-01-01

    The scope and aim of this work is to describe the two-body interaction mediated by a particle (either the scalar or the gauge boson) within the light-front formulation. To do this, first of all we point out the importance of propagators and Green functions in Quantum Mechanics. Then we project the covariant quantum propagator onto the light front time to get the propagator for scalar particles in these coordinates. This operator propagates the wave function from x+ = 0 to x+ > 0. It corresponds to the definition of the time ordering operation in the light front time x+. We calculate the light-front Green's function for 2 interacting bosons propagating forward in x+. We also show how to write down the light front Green's function from the Feynman propagator and finally make a generalization to N bosons.

  13. Breast Health Belief System Study

    Science.gov (United States)

    2001-08-01

    advantages: cultural and geographic familiarity, interinstitutional collaboration, and cost effectiveness in terms of controlling per diem and travel...providers of the significance of beliefs on health practices. Collaborators in this study included Georgia State, Albany State, Fort Valley State, and...Georgia counties. As the map on the following pages shows, many of Georgia’s southern counties are in economic decline. Collaborators are Georgia

  14. Outcome knowledge and false belief

    OpenAIRE

    Siba eGhrear; Susan eBirch; Daniel eBernstein

    2016-01-01

    Virtually every social interaction involves reasoning about the perspectives of others, or ‘theory of mind’. Previous research suggests that it is difficult to ignore our current knowledge when reasoning about a more naïve perspective (i.e., the curse of knowledge). In this Mini Review, we discuss the implications of the curse of knowledge for certain aspects of theory of mind. Particularly, we examine how the curse of knowledge influences key measurements of false belief reasoning. In closi...

  15. Gauge engineering and propagators

    CERN Document Server

    Maas, Axel

    2016-01-01

    Beyond perturbation theory gauge-fixing becomes more involved due to the Gribov-Singer ambiguity: The appearance of additional gauge copies requires to define a procedure how to handle them. For the case of Landau gauge the structure and properties of these additional gauge copies will be investigated. Based on these properties gauge conditions are constructed to account for these gauge copies. The dependence of the propagators on the choice of these complete gauge-fixings will then be investigated using lattice gauge theory for Yang-Mills theory. It is found that the implications for the infrared, and to some extent mid-momentum behavior, can be substantial. In going beyond the Yang-Mills case it turns out that the influence of matter can generally not be neglected. This will be briefly discussed for various types of matter.

  16. Gauge engineering and propagators

    Science.gov (United States)

    Maas, Axel

    2017-03-01

    Beyond perturbation theory gauge-fixing becomes more involved due to the Gribov-Singer ambiguity: The appearance of additional gauge copies requires defining a procedure for handling them. For the case of Landau gauge the structure and properties of these additional gauge copies will be investigated. Based on these properties gauge conditions are constructed to account for these gauge copies. The dependence of the propagators on the choice of these complete gauge-fixings will then be investigated using lattice gauge theory for Yang-Mills theory. It is found that the implications for the infrared, and to some extent mid-momentum behavior, can be substantial. In going beyond the Yang-Mills case it turns out that the influence of matter can generally not be neglected. This will be briefly discussed for various types of matter.

  17. Propagators and path integrals

    Energy Technology Data Exchange (ETDEWEB)

    Holten, J.W. van

    1995-08-22

    Path-integral expressions for one-particle propagators in scalar and fermionic field theories are derived, for arbitrary mass. This establishes a direct connection between field theory and specific classical point-particle models. The role of world-line reparametrization invariance of the classical action and the implementation of the corresponding BRST-symmetry in the quantum theory are discussed. The presence of classical world-line supersymmetry is shown to lead to an unwanted doubling of states for massive spin-1/2 particles. The origin of this phenomenon is traced to a 'hidden' topological fermionic excitation. A different formulation of the pseudo-classical mechanics using a bosonic representation of $\gamma_5$ is shown to remove these extra states at the expense of losing manifest supersymmetry. (orig.).

  18. Study of the Wheeler Propagator

    OpenAIRE

    Bollini, C. G.; Rocca, M. C.

    2010-01-01

    We study the half-advanced and half-retarded Wheeler Green function and its relation to Feynman propagators, first for the massless equation, then for Klein-Gordon equations with arbitrary mass parameters: real, imaginary or complex. In all cases the Wheeler propagator lacks an on-shell free propagation. The Wheeler function has support inside the light-cone (whatever the mass). The associated vacuum is symmetric with respect to annihilation and creation operators.

  19. Non-parametric change-point method for differential gene expression detection.

    Directory of Open Access Journals (Sweden)

    Yao Wang

    Full Text Available BACKGROUND: We proposed a non-parametric method, named Non-Parametric Change Point Statistic (NPCPS for short), by using a single equation for detecting differential gene expression (DGE) in microarray data. NPCPS is based on change point theory to provide effective DGE detecting ability. METHODOLOGY: NPCPS uses the data distribution of the normal samples as input, and detects DGE in the cancer samples by locating the change point of the gene expression profile. An estimate of the change point position generated by NPCPS enables the identification of the samples containing DGE. Monte Carlo simulation and an ROC study were applied to examine the detection accuracy of NPCPS, and an experiment on real microarray data of breast cancer was carried out to compare NPCPS with other methods. CONCLUSIONS: The simulation study indicated that NPCPS was more effective for detecting DGE in the cancer subset compared with five parametric methods and one non-parametric method. When there were more than 8 cancer samples containing DGE, the type I error of NPCPS was below 0.01. Experiment results showed both good accuracy and reliability of NPCPS. Out of the 30 top genes ranked by NPCPS, 16 genes were reported as relevant to cancer. Correlations between the detection result of NPCPS and the compared methods were less than 0.05, while between the other methods the values were from 0.20 to 0.84. This indicates that NPCPS works on different features and thus provides DGE identification from a distinct perspective compared with the other mean- or median-based methods.
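
    As a rough illustration of the rank-based flavour of such change-point detection (this is a generic Mann-Whitney-type scan, not the authors' NPCPS equation; the function name and simulated profile are made up), one could locate a single change point as follows:

      import numpy as np

      def rank_change_point(profile):
          """Return the split index maximizing a standardized mean-rank contrast."""
          x = np.asarray(profile, dtype=float)
          n = x.size
          ranks = np.argsort(np.argsort(x)) + 1          # ranks 1..n (ties ignored)
          best_k, best_stat = None, -np.inf
          for k in range(2, n - 1):                      # candidate change points
              mean_rank = ranks[:k].mean()
              expected = (n + 1) / 2.0                   # mean rank under "no change"
              var = (n + 1) * (n - k) / (12.0 * k)       # variance of a mean of k ranks
              stat = abs(mean_rank - expected) / np.sqrt(var)
              if stat > best_stat:
                  best_k, best_stat = k, stat
          return best_k, best_stat

      rng = np.random.default_rng(0)
      profile = np.concatenate([rng.normal(0, 1, 40), rng.normal(2, 1, 20)])  # shift after 40
      print(rank_change_point(profile))                  # change point near index 40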

  20. Beliefs of subfertile Saudi women.

    Science.gov (United States)

    Al-Jaroudi, Dania H

    2010-04-01

    To assess the beliefs of infertile Saudi women and acquire information on alternative medicine usage by infertility patients. Between February 2008 and January 2009 a cross-sectional survey was performed in a tertiary hospital. Fifty-one patients attending the Reproductive Endocrinology and Infertility Medicine Department, Women's Specialized Hospital, King Fahad Medical City, Riyadh, Kingdom of Saudi Arabia were interviewed at random. They were asked questions pertaining to their beliefs and alternative therapies used to treat their infertility. Information was entered into an Excel sheet, and statistical analysis was carried out. Eighteen (35.3%) women believed their infertility was the result of the evil eye, and 13 (25.5%) believed it was due to envy. The first choices when the women realized they had difficulty conceiving were du'a (supplications) (n=23; 45.1%), visiting a doctor (n=22; 43.1%), and reading Qur'an (n=5; 9.8%), although most (n=37; 72.5%) ultimately turned to the Qur'an as a remedy. Herbal medicine was used by 35 (68.6%) patients. We believe healthcare professionals should consider the personal beliefs and alternatives that subfertile women resort to.

  1. Applications of Parametric and Nonparametric Tests for Event Studies on ISE

    OpenAIRE

    Handan YOLSAL

    2011-01-01

    In this study, we used the event study method to investigate whether share splits on the ISE-ON Index at the Istanbul Stock Exchange had an impact on the returns generated from shares between 2005 and 2011. This study is based on parametric tests, as well as on nonparametric tests developed as an alternative to them. It has been observed that, when cross-sectional variance adjustment is applied to the data set, such null hypothesis as “there is no average abnormal return at d...

  2. Nonparametric model reconstruction for stochastic differential equations from discretely observed time-series data.

    Science.gov (United States)

    Ohkubo, Jun

    2011-12-01

    A scheme is developed for estimating state-dependent drift and diffusion coefficients in a stochastic differential equation from time-series data. The scheme does not require parametric forms for the drift and diffusion coefficients to be specified in advance. In order to perform the nonparametric estimation, a maximum likelihood method is combined with a concept based on kernel density estimation. In order to deal with discrete observations or sparsity of the time-series data, a local linearization method is employed, which enables fast estimation.
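
    A minimal sketch of the standard kernel idea behind such estimators (Nadaraya-Watson averages of increments and squared increments; this is not the authors' local-linearization scheme, and the bandwidth and simulated Ornstein-Uhlenbeck data are illustrative):

      import numpy as np

      def kernel_drift_diffusion(x, dt, grid, h):
          """Nadaraya-Watson estimates of drift a(x) and diffusion b^2(x) from a time series."""
          x = np.asarray(x, dtype=float)
          dx = np.diff(x)
          xc = x[:-1]
          drift, diff2 = [], []
          for g in grid:
              w = np.exp(-0.5 * ((xc - g) / h) ** 2)     # Gaussian kernel weights
              w /= w.sum()
              drift.append(np.sum(w * dx) / dt)          # E[dX | X = g] / dt
              diff2.append(np.sum(w * dx ** 2) / dt)     # E[dX^2 | X = g] / dt
          return np.array(drift), np.array(diff2)

      # illustrative data: Ornstein-Uhlenbeck process dX = -X dt + 0.5 dW
      rng = np.random.default_rng(1)
      dt, n = 0.01, 20000
      x = np.zeros(n)
      for i in range(1, n):
          x[i] = x[i - 1] - x[i - 1] * dt + 0.5 * np.sqrt(dt) * rng.normal()
      grid = np.linspace(-1, 1, 5)
      a_hat, b2_hat = kernel_drift_diffusion(x, dt, grid, h=0.1)
      print(np.round(a_hat, 2), np.round(b2_hat, 2))     # roughly -grid and 0.25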

  3. Measuring the price responsiveness of gasoline demand: economic shape restrictions and nonparametric demand estimation

    OpenAIRE

    Blundell, Richard; Horowitz, Joel L.; Parey, Matthias

    2011-01-01

    This paper develops a new method for estimating a demand function and the welfare consequences of price changes. The method is applied to gasoline demand in the U.S. and is applicable to other goods. The method uses shape restrictions derived from economic theory to improve the precision of a nonparametric estimate of the demand function. Using data from the U.S. National Household Travel Survey, we show that the restrictions are consistent with the data on gasoline demand and remove the anom...

  4. Two new non-parametric tests to the distance duality relation with galaxy clusters

    CERN Document Server

    Costa, S S; Holanda, R F L

    2015-01-01

    The cosmic distance duality relation is a milestone of cosmology involving the luminosity and angular diameter distances. Any departure from the relation points to new physics or systematic errors in the observations; therefore, tests of the relation are extremely important to build a consistent cosmological framework. Here, two new tests are proposed based on galaxy cluster observations (angular diameter distance and gas mass fraction) and $H(z)$ measurements. By applying Gaussian Processes, a non-parametric method, we are able to derive constraints on departures from the relation; no evidence of deviation is found in either method, reinforcing the cosmological and astrophysical hypotheses adopted so far.
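
    For readers unfamiliar with the Gaussian Process machinery, a self-contained numpy sketch of GP regression with a squared-exponential kernel; the H(z)-like data points and the fixed hyperparameters are purely illustrative, not those used in the paper:

      import numpy as np

      def sq_exp(a, b, length, amp):
          """Squared-exponential (RBF) covariance between two 1-D input arrays."""
          d = a[:, None] - b[None, :]
          return amp ** 2 * np.exp(-0.5 * (d / length) ** 2)

      def gp_predict(x, y, sigma, xs, length=0.5, amp=50.0):
          """Posterior mean and standard deviation of a GP fitted to noisy data."""
          ym = y.mean()
          K = sq_exp(x, x, length, amp) + np.diag(sigma ** 2)
          Ks = sq_exp(xs, x, length, amp)
          Kss = sq_exp(xs, xs, length, amp)
          alpha = np.linalg.solve(K, y - ym)
          mean = ym + Ks @ alpha
          cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
          return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

      # illustrative H(z)-like data (km/s/Mpc); not the compilation used in the paper
      z = np.array([0.1, 0.3, 0.5, 0.9, 1.3, 1.75])
      H = np.array([69.0, 77.0, 88.0, 110.0, 135.0, 160.0])
      err = np.full_like(H, 5.0)
      zs = np.linspace(0.0, 2.0, 9)
      mu, sd = gp_predict(z, H, err, zs)
      print(np.round(mu, 1))
      print(np.round(sd, 1))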

  5. Nonparametric variance estimation in the analysis of microarray data: a measurement error approach.

    Science.gov (United States)

    Carroll, Raymond J; Wang, Yuedong

    2008-01-01

    This article investigates the effects of measurement error on the estimation of nonparametric variance functions. We show that either ignoring measurement error or direct application of the simulation extrapolation, SIMEX, method leads to inconsistent estimators. Nevertheless, the direct SIMEX method can reduce bias relative to a naive estimator. We further propose a permutation SIMEX method which leads to consistent estimators in theory. The performance of both SIMEX methods depends on approximations to the exact extrapolants. Simulations show that both SIMEX methods perform better than ignoring measurement error. The methodology is illustrated using microarray data from colon cancer patients.

  6. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray studies due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  7. A nonparametric approach to calculate critical micelle concentrations: the local polynomial regression method.

    Science.gov (United States)

    López Fontán, J L; Costa, J; Ruso, J M; Prieto, G; Sarmiento, F

    2004-02-01

    The application of a statistical method, the local polynomial regression method, (LPRM), based on a nonparametric estimation of the regression function to determine the critical micelle concentration (cmc) is presented. The method is extremely flexible because it does not impose any parametric model on the subjacent structure of the data but rather allows the data to speak for themselves. Good concordance of cmc values with those obtained by other methods was found for systems in which the variation of a measured physical property with concentration showed an abrupt change. When this variation was slow, discrepancies between the values obtained by LPRM and others methods were found.
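
    A minimal sketch of the local linear (degree-one local polynomial) building block of the LPRM, with the cmc read off as the concentration where the estimated slope changes most abruptly; the conductivity-like data, bandwidth and kink location are invented for illustration:

      import numpy as np

      def local_linear(x, y, x0, h):
          """Kernel-weighted least-squares line fit at x0; returns (value, slope)."""
          w = np.exp(-0.5 * ((x - x0) / h) ** 2)
          X = np.column_stack([np.ones_like(x), x - x0])
          W = np.diag(w)
          beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
          return beta[0], beta[1]

      # illustrative conductivity-vs-concentration data with a kink near c = 8 mM
      rng = np.random.default_rng(2)
      c = np.linspace(1, 15, 60)
      kappa = np.where(c < 8, 6.0 * c, 48 + 2.5 * (c - 8)) + rng.normal(0, 0.5, c.size)

      grid = np.linspace(2, 14, 121)
      slopes = np.array([local_linear(c, kappa, g, h=1.0)[1] for g in grid])
      cmc = grid[np.argmax(np.abs(np.gradient(slopes, grid)))]   # steepest slope change
      print(round(cmc, 2))                                        # close to 8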

  8. A nonparametric approach to calculate critical micelle concentrations: the local polynomial regression method

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Fontan, J.L.; Costa, J.; Ruso, J.M.; Prieto, G. [Dept. of Applied Physics, Univ. of Santiago de Compostela, Santiago de Compostela (Spain); Sarmiento, F. [Dept. of Mathematics, Faculty of Informatics, Univ. of A Coruna, A Coruna (Spain)

    2004-02-01

    The application of a statistical method, the local polynomial regression method, (LPRM), based on a nonparametric estimation of the regression function to determine the critical micelle concentration (cmc) is presented. The method is extremely flexible because it does not impose any parametric model on the subjacent structure of the data but rather allows the data to speak for themselves. Good concordance of cmc values with those obtained by other methods was found for systems in which the variation of a measured physical property with concentration showed an abrupt change. When this variation was slow, discrepancies between the values obtained by LPRM and others methods were found. (orig.)

  9. A sequential nonparametric pattern classification algorithm based on the Wald SPRT. [Sequential Probability Ratio Test

    Science.gov (United States)

    Poage, J. L.

    1975-01-01

    A sequential nonparametric pattern classification procedure is presented. The method presented is an estimated version of the Wald sequential probability ratio test (SPRT). This method utilizes density function estimates, and the density estimate used is discussed, including a proof of convergence in probability of the estimate to the true density function. The classification procedure proposed makes use of the theory of order statistics, and estimates of the probabilities of misclassification are given. The procedure was tested on discriminating between two classes of Gaussian samples and on discriminating between two kinds of electroencephalogram (EEG) responses.
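
    A hypothetical sketch of the general idea, with kernel density estimates standing in for the class-conditional densities and Wald's thresholds A = (1 - beta)/alpha and B = beta/(1 - alpha); the paper's order-statistics machinery and error-probability estimates are not reproduced here:

      import numpy as np
      from scipy.stats import gaussian_kde

      def kde_sprt(stream, train0, train1, alpha=0.05, beta=0.05):
          """Sequential test: decide class 1 when the log-LR crosses log A, class 0 at log B."""
          f0, f1 = gaussian_kde(train0), gaussian_kde(train1)
          logA, logB = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
          llr = 0.0
          for n, obs in enumerate(stream, start=1):
              llr += np.log(f1(obs)[0]) - np.log(f0(obs)[0])
              if llr >= logA:
                  return 1, n
              if llr <= logB:
                  return 0, n
          return None, len(stream)          # no decision within the stream

      rng = np.random.default_rng(3)
      train0 = rng.normal(0, 1, 300)        # training samples, class 0
      train1 = rng.normal(1, 1, 300)        # training samples, class 1
      stream = rng.normal(1, 1, 100)        # observations actually drawn from class 1
      print(kde_sprt(stream, train0, train1))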

  10. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    Science.gov (United States)

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  11. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    Modeling complex time-course patterns is a challenging issue in microarray study due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering...... the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature in the generalized correlation analysis could be an useful and efficient tool for analyzing microarray...... time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health....

  12. Nonparametric Bayesian Sparse Factor Models with application to Gene Expression modelling

    CERN Document Server

    Knowles, David

    2010-01-01

    A nonparametric Bayesian extension of Factor Analysis (FA) is proposed where observed data Y is modeled as a linear superposition, G, of a potentially infinite number of hidden factors, X. The Indian Buffet Process (IBP) is used as a prior on G to incorporate sparsity and to allow the number of latent features to be inferred. The model's utility for modeling gene expression data is investigated using randomly generated datasets based on a known sparse connectivity matrix for E. coli, and on three biological datasets of increasing complexity.
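
    To make the prior concrete, a short sketch of drawing a binary object-by-feature matrix from the Indian Buffet Process using the usual culinary generative scheme; alpha and the number of objects are illustrative choices:

      import numpy as np

      def sample_ibp(n_objects, alpha, rng):
          """Draw Z (objects x features) from the Indian Buffet Process prior."""
          columns = []                                   # one list of 0/1 entries per feature
          for i in range(1, n_objects + 1):
              # existing dishes: object i takes dish k with probability m_k / i
              for col in columns:
                  m_k = sum(col)
                  col.append(1 if rng.random() < m_k / i else 0)
              # new dishes: Poisson(alpha / i) features unique to object i
              for _ in range(rng.poisson(alpha / i)):
                  columns.append([0] * (i - 1) + [1])
          return np.array(columns, dtype=int).T if columns else np.zeros((n_objects, 0), int)

      rng = np.random.default_rng(4)
      Z = sample_ibp(n_objects=10, alpha=2.0, rng=rng)
      print(Z.shape)     # (10, K) with K random
      print(Z)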

  13. Non-parametric trend analysis of water quality data of rivers in Kansas

    Science.gov (United States)

    Yu, Y.-S.; Zou, S.; Whittemore, D.

    1993-01-01

    Surface water quality data for 15 sampling stations in the Arkansas, Verdigris, Neosho, and Walnut river basins inside the state of Kansas were analyzed to detect trends (or lack of trends) in 17 major constituents by using four different non-parametric methods. The results show that concentrations of specific conductance, total dissolved solids, calcium, total hardness, sodium, potassium, alkalinity, sulfate, chloride, total phosphorus, ammonia plus organic nitrogen, and suspended sediment generally have downward trends. Some of the downward trends are related to increases in discharge, while others could be caused by decreases in pollution sources. Homogeneity tests show that both station-wide trends and basinwide trends are non-homogeneous. © 1993.
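
    The abstract does not name the four methods; as one common member of this family, a brief Mann-Kendall trend-test sketch (illustrative series, no correction for ties or serial correlation) is given below:

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x):
          """Return the Mann-Kendall S statistic, Z score and two-sided p-value."""
          x = np.asarray(x, dtype=float)
          n = x.size
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0       # no tie correction
          if s > 0:
              z = (s - 1) / np.sqrt(var_s)
          elif s < 0:
              z = (s + 1) / np.sqrt(var_s)
          else:
              z = 0.0
          return s, z, 2 * norm.sf(abs(z))

      rng = np.random.default_rng(5)
      series = 50 - 0.3 * np.arange(60) + rng.normal(0, 2, 60)   # gently declining concentrations
      print(mann_kendall(series))     # negative S, small p-value -> downward trend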

  14. Noise and speckle reduction in synthetic aperture radar imagery by nonparametric Wiener filtering.

    Science.gov (United States)

    Caprari, R S; Goh, A S; Moffatt, E K

    2000-12-10

    We present a Wiener filter that is especially suitable for speckle and noise reduction in multilook synthetic aperture radar (SAR) imagery. The proposed filter is nonparametric, not being based on parametrized analytical models of signal statistics. Instead, the Wiener-Hopf equation is expressed entirely in terms of observed signal statistics, with no reference to the possibly unobservable pure signal and noise. This Wiener filter is simple in concept and implementation, exactly minimum mean-square error, and directly applicable to signal-dependent and multiplicative noise. We demonstrate the filtering of a genuine two-look SAR image and show how a nonnegatively constrained version of the filter substantially reduces ringing.
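
    As a toy illustration of a Wiener-type filter expressed purely in observed local statistics (a Lee-style local minimum mean-square-error filter, not the authors' exact formulation; the noise variance, window size and synthetic scene are made up):

      import numpy as np
      from scipy.ndimage import uniform_filter

      def local_wiener(img, noise_var, win=5):
          """Shrink each pixel toward the local mean according to the local signal variance."""
          mean = uniform_filter(img, win)
          mean_sq = uniform_filter(img ** 2, win)
          var = np.maximum(mean_sq - mean ** 2, 0.0)
          gain = np.maximum(var - noise_var, 0.0) / np.maximum(var, 1e-12)
          return mean + gain * (img - mean)

      rng = np.random.default_rng(6)
      clean = np.zeros((64, 64))
      clean[16:48, 16:48] = 1.0                                  # simple square "scene"
      noisy = clean + rng.normal(0, 0.3, clean.shape)
      filtered = local_wiener(noisy, noise_var=0.09)
      print(round(float(np.mean((noisy - clean) ** 2)), 4),
            round(float(np.mean((filtered - clean) ** 2)), 4))   # MSE drops after filtering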

  15. Comparison between scaling law and nonparametric Bayesian estimate for the recurrence time of strong earthquakes

    Science.gov (United States)

    Rotondi, R.

    2009-04-01

    According to the unified scaling theory the probability distribution function of the recurrence time T is a scaled version of a base function and the average value of T can be used as a scale parameter for the distribution. The base function must belong to the scale family of distributions: tested on different catalogues and for different scale levels, for Corral (2005) the (truncated) generalized gamma distribution is the best model, for German (2006) the Weibull distribution. The scaling approach should overcome the difficulty of estimating distribution functions over small areas, but theoretical limitations and partial instability of the estimated distributions have been pointed out in the literature. Our aim is to analyze the recurrence time of strong earthquakes that occurred in the Italian territory. To satisfy the hypotheses of independence and identical distribution we have evaluated the times between events that occurred in each area of the Database of Individual Seismogenic Sources and then we have gathered them by eight tectonically coherent regions, each of them dominated by a well characterized geodynamic process. To solve problems like the paucity of data, the presence of outliers and uncertainty in the choice of the functional expression for the distribution of T, we have followed a nonparametric approach (Rotondi (2009)) in which: (a) the maximum flexibility is obtained by assuming that the probability distribution is a random function belonging to a large function space, distributed as a stochastic process; (b) the nonparametric estimation method is robust when the data contain outliers; (c) Bayesian methodology allows one to exploit different information sources so that the model fitting may be good also for scarce samples. We have compared the hazard rates evaluated through the parametric and nonparametric approach. References Corral A. (2005). Mixing of rescaled data and Bayesian inference for earthquake recurrence times, Nonlin. Proces. Geophys., 12, 89

  16. A NONPARAMETRIC PROCEDURE OF THE SAMPLE SIZE DETERMINATION FOR SURVIVAL RATE TEST

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Objective This paper proposes a nonparametric procedure of the sample size determination for survival rate test. Methods Using the classical asymptotic normal procedure yields the required homogenetic effective sample size and using the inverse operation with the prespecified value of the survival function of censoring times yields the required sample size. Results It is matched with the rate test for censored data, does not involve survival distributions, and reduces to its classical counterpart when there is no censoring. The observed power of the test coincides with the prescribed power under usual clinical conditions. Conclusion It can be used for planning survival studies of chronic diseases.

  18. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  19. Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images

    Science.gov (United States)

    2010-04-01

    [The indexed excerpt for this report consists only of table and reference fragments: denoising PSNR results for KSVD and BPFA on standard test images (C.man, House, Peppers, Lena, Barbara, Boats, F.print, Couple, Hill), BPFA image-interpolation PSNR results with 8×8 and 7×7 patches, and a citation of T. Ferguson, "A Bayesian analysis of some nonparametric problems," Annals of Statistics, 1:209-230.]

  20. BOOTSTRAP WAVELET IN THE NONPARAMETRIC REGRESSION MODEL WITH WEAKLY DEPENDENT PROCESSES

    Institute of Scientific and Technical Information of China (English)

    林路; 张润楚

    2004-01-01

    This paper introduces a method of bootstrap wavelet estimation in a nonparametric regression model with weakly dependent processes for both fixed and random designs. The asymptotic bounds for the bias and variance of the bootstrap wavelet estimators are given in the fixed design model. The conditional normality for a modified version of the bootstrap wavelet estimators is obtained in the fixed model. The consistency for the bootstrap wavelet estimator is also proved in the random design model. These results show that the bootstrap wavelet method is valid for the model with weakly dependent processes.

  1. An adaptive nonparametric method in benchmark analysis for bioassay and environmental studies.

    Science.gov (United States)

    Bhattacharya, Rabi; Lin, Lizhen

    2010-12-01

    We present a novel nonparametric method for bioassay and benchmark analysis in risk assessment, which averages isotonic MLEs based on disjoint subgroups of dosages. The asymptotic theory for the methodology is derived, showing that the MISEs (mean integrated squared errors) of the estimates of both the dose-response curve $F$ and its inverse $F^{-1}$ achieve the optimal rate $O(N^{-4/5})$. Also, we compute the asymptotic distribution of the estimate $\hat{\zeta}_p$ of the effective dosage $\zeta(p) = F^{-1}(p)$, which is shown to have an optimally small asymptotic variance.
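
    A minimal illustration of the monotone building block: a single isotonic fit of the dose-response curve F and numerical inversion for the effective dosage ζ(p) = F^{-1}(p). The authors' averaging over disjoint dosage subgroups is not reproduced, and the bioassay data are invented:

      import numpy as np
      from sklearn.isotonic import IsotonicRegression

      # illustrative binary-response bioassay data: doses and observed response rates
      dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
      rate = np.array([0.05, 0.08, 0.22, 0.35, 0.70, 0.90])

      iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
      F_hat = iso.fit_transform(dose, rate)              # monotone estimate of F at the doses

      def effective_dose(p, dose, F_hat):
          """Numerically invert the isotonic fit: dose at which F_hat reaches p."""
          return float(np.interp(p, F_hat, dose))

      print(round(effective_dose(0.5, dose, F_hat), 2))  # estimated ED50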

  2. A Critical Evaluation of the Nonparametric Approach to Estimate Terrestrial Evaporation

    Directory of Open Access Journals (Sweden)

    Yongmin Yang

    2016-01-01

    Full Text Available Evapotranspiration (ET) estimation has been one of the most challenging problems in recent decades for hydrometeorologists. In this study, a nonparametric approach to estimate terrestrial evaporation was evaluated using both model simulation and measurements from three sites. Both the model simulation and the in situ evaluation at the Tiger Bush Site revealed that this approach would greatly overestimate ET under dry conditions (evaporative fraction smaller than 0.4). For the evaluation at the Tiger Bush Site, the difference between ET estimates and site observations could be as large as 130 W/m2. However, this approach provided good estimates over the two crop sites. The Nash-Sutcliffe coefficient (E) was 0.9 and 0.94, respectively, for WC06 and Yingke. A further theoretical analysis indicates the nonparametric approach is very close to the equilibrium evaporation equation under wet conditions, and this can explain the good performance of this approach at the two crop sites in this study. The evaluation indicates that this approach needs more careful appraisal and that its application in dry conditions should be avoided.

  3. Distributed Nonparametric and Semiparametric Regression on SPARK for Big Data Forecasting

    Directory of Open Access Journals (Sweden)

    Jelena Fiosina

    2017-01-01

    Full Text Available Forecasting in big datasets is a common but complicated task, which cannot be executed using the well-known parametric linear regression. However, nonparametric and semiparametric methods, which enable forecasting by building nonlinear data models, are computationally intensive and lack sufficient scalability to cope with big datasets to extract successful results in a reasonable time. We present distributed parallel versions of some nonparametric and semiparametric regression models. We used the MapReduce paradigm and describe the algorithms in terms of SPARK data structures to parallelize the calculations. The forecasting accuracy of the proposed algorithms is compared with the linear regression model, which is the only forecasting model currently having a parallel distributed realization within the SPARK framework to address big data problems. The advantages of the parallelization of the algorithm are also provided. We validate our models by conducting various numerical experiments: evaluating the goodness of fit, analyzing how increasing dataset size influences time consumption, and analyzing time consumption by varying the degree of parallelism (number of workers) in the distributed realization.

  4. Nonparametric Identification of Glucose-Insulin Process in IDDM Patient with Multi-meal Disturbance

    Science.gov (United States)

    Bhattacharjee, A.; Sutradhar, A.

    2012-12-01

    Modern closed-loop control for blood glucose level in a diabetic patient necessarily uses an explicit model of the process. A fixed-parameter full-order or reduced-order model does not characterize the inter-patient and intra-patient parameter variability. This paper deals with a frequency domain nonparametric identification of the nonlinear glucose-insulin process in an insulin dependent diabetes mellitus patient that captures the process dynamics in the presence of uncertainties and parameter variations. An online frequency domain kernel estimation method has been proposed that uses the input-output data from the 19th-order first-principles model of the patient in the intravenous route. Volterra equations up to second order kernels with an extended input vector for a Hammerstein model are solved online by the adaptive recursive least squares (ARLS) algorithm. The frequency domain kernels are estimated using the harmonic excitation input data sequence from the virtual patient model. A short filter memory length of M = 2 was found sufficient to yield acceptable accuracy with less computation time. The nonparametric models are useful for closed-loop control, where the frequency domain kernels can be directly used as the transfer function. The validation results show good fit both in frequency and time domain responses with the nominal patient as well as with parameter variations.
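
    A compact sketch of the recursive least squares update at the core of such an adaptive scheme, written for a generic linear-in-parameters model y = φᵀθ + e; the Volterra/Hammerstein regressor construction and the patient model are not reproduced, and the forgetting factor and data are illustrative:

      import numpy as np

      def rls(phi_stream, y_stream, n_params, lam=0.99, delta=100.0):
          """Recursive least squares with forgetting factor lam."""
          theta = np.zeros(n_params)
          P = delta * np.eye(n_params)
          for phi, y in zip(phi_stream, y_stream):
              phi = np.asarray(phi, dtype=float)
              k = P @ phi / (lam + phi @ P @ phi)        # gain vector
              theta = theta + k * (y - phi @ theta)      # parameter update
              P = (P - np.outer(k, phi @ P)) / lam       # covariance update
          return theta

      # illustrative system: y = 2*u + 0.5*u^2 with a tiny Hammerstein-like regressor [u, u^2]
      rng = np.random.default_rng(7)
      u = rng.uniform(-1, 1, 500)
      y = 2 * u + 0.5 * u ** 2 + rng.normal(0, 0.01, u.size)
      phi = np.column_stack([u, u ** 2])
      print(np.round(rls(phi, y, n_params=2), 3))        # approaches [2.0, 0.5]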

  5. A web application for evaluating Phase I methods using a non-parametric optimal benchmark.

    Science.gov (United States)

    Wages, Nolan A; Varhegyi, Nikole

    2017-06-01

    In evaluating the performance of Phase I dose-finding designs, simulation studies are typically conducted to assess how often a method correctly selects the true maximum tolerated dose under a set of assumed dose-toxicity curves. A necessary component of the evaluation process is to have some concept for how well a design can possibly perform. The notion of an upper bound on the accuracy of maximum tolerated dose selection is often omitted from the simulation study, and the aim of this work is to provide researchers with accessible software to quickly evaluate the operating characteristics of Phase I methods using a benchmark. The non-parametric optimal benchmark is a useful theoretical tool for simulations that can serve as an upper limit for the accuracy of maximum tolerated dose identification based on a binary toxicity endpoint. It offers researchers a sense of the plausibility of a Phase I method's operating characteristics in simulation. We have developed an R shiny web application for simulating the benchmark. The web application has the ability to quickly provide simulation results for the benchmark and requires no programming knowledge. The application is free to access and use on any device with an Internet browser. The application provides the percentage of correct selection of the maximum tolerated dose and an accuracy index, operating characteristics typically used in evaluating the accuracy of dose-finding designs. We hope this software will facilitate the use of the non-parametric optimal benchmark as an evaluation tool in dose-finding simulation.

  6. Application of the LSQR algorithm in non-parametric estimation of aerosol size distribution

    Science.gov (United States)

    He, Zhenzong; Qi, Hong; Lew, Zhongyuan; Ruan, Liming; Tan, Heping; Luo, Kun

    2016-05-01

    Based on the Least Squares QR decomposition (LSQR) algorithm, the aerosol size distribution (ASD) is retrieved in a non-parametric approach. The direct problem is solved by the Anomalous Diffraction Approximation (ADA) and the Lambert-Beer Law. An optimal wavelength selection method is developed to improve the retrieval accuracy of the ASD. The optimal wavelength set is selected so that the measurement signals are sensitive to wavelength and the ill-conditioning of the coefficient matrix of the linear system is reduced, which enhances the robustness of the retrieval results to interference. Two common kinds of monomodal and bimodal ASDs, log-normal (L-N) and Gamma distributions, are estimated, respectively. Numerical tests show that the LSQR algorithm can be successfully applied to retrieve the ASD with high stability in the presence of random noise and low susceptibility to the shape of distributions. Finally, the experimentally measured ASD over Harbin, China, is recovered reasonably. All the results confirm that the LSQR algorithm combined with the optimal wavelength selection method is an effective and reliable technique in the non-parametric estimation of the ASD.
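
    A toy illustration of the inversion step, solving a discretized linear system with SciPy's LSQR and a damping term; the kernel matrix below is a generic smoothing kernel, not the ADA/Lambert-Beer kernel of the paper, and the "true" distribution is invented:

      import numpy as np
      from scipy.sparse.linalg import lsqr

      rng = np.random.default_rng(8)
      m, n = 30, 50                                     # wavelengths x size bins
      centers = np.linspace(0, 1, m)
      bins = np.linspace(0, 1, n)
      A = np.exp(-((centers[:, None] - bins[None, :]) / 0.08) ** 2)   # toy kernel matrix

      f_true = np.exp(-0.5 * ((bins - 0.4) / 0.1) ** 2)               # monomodal "ASD"
      b = A @ f_true + rng.normal(0, 0.01, m)                          # noisy measurements

      f_hat = lsqr(A, b, damp=0.05)[0]                                 # damped LSQR solution
      print(round(float(np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)), 3))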

  7. Contribution to the Nonparametric Estimation of the Density of the Regression Errors (Doctoral Thesis)

    CERN Document Server

    LSTA, Rawane Samb

    2010-01-01

    This thesis deals with the nonparametric estimation of the density f of the regression error term E of the model Y=m(X)+E, assuming its independence from the covariate X. The difficulty linked to this study is the fact that the regression error E is not observed. In such a setup, it would be unwise, for estimating f, to use a conditional approach based upon the probability distribution function of Y given X. Indeed, this approach is affected by the curse of dimensionality, so that the resulting estimator of the residual term E would have a considerably slow rate of convergence if the dimension of X is very high. Two approaches are proposed in this thesis to avoid the curse of dimensionality. The first approach uses the estimated residuals, while the second integrates a nonparametric conditional density estimator of Y given X. While proceeding in this way can circumvent the curse of dimensionality, a challenging issue is to evaluate the impact of the estimated residuals on the final estimator of the density f. We will also at...

  8. Passenger Flow Prediction of Subway Transfer Stations Based on Nonparametric Regression Model

    Directory of Open Access Journals (Sweden)

    Yujuan Sun

    2014-01-01

    Full Text Available Passenger flow is increasing dramatically as subway network systems are completed in the big cities of China. As convergence nodes of subway lines, transfer stations must accommodate more passengers because of the large transfer demand among different lines, so transfer facilities face great pressure from pedestrian congestion and other abnormal situations. To avoid pedestrian congestion, or to warn management before it occurs, it is necessary to predict the transfer passenger flow. Thus, based on nonparametric regression theory, a transfer passenger flow prediction model was proposed. To test and illustrate the prediction model, one month of transfer passenger flow data from the XIDAN transfer station were used to calibrate and validate the model. Compared with a Kalman filter model and a support vector machine regression model, the results show that the nonparametric regression model has the advantages of high accuracy and strong transferability and can predict transfer passenger flow accurately for different intervals.

  9. Hadron energy reconstruction for the ATLAS calorimetry in the framework of the nonparametrical method

    CERN Document Server

    Akhmadaliev, S Z; Ambrosini, G; Amorim, A; Anderson, K; Andrieux, M L; Aubert, Bernard; Augé, E; Badaud, F; Baisin, L; Barreiro, F; Battistoni, G; Bazan, A; Bazizi, K; Belymam, A; Benchekroun, D; Berglund, S R; Berset, J C; Blanchot, G; Bogush, A A; Bohm, C; Boldea, V; Bonivento, W; Bosman, M; Bouhemaid, N; Breton, D; Brette, P; Bromberg, C; Budagov, Yu A; Burdin, S V; Calôba, L P; Camarena, F; Camin, D V; Canton, B; Caprini, M; Carvalho, J; Casado, M P; Castillo, M V; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Chadelas, R; Chalifour, M; Chekhtman, A; Chevalley, J L; Chirikov-Zorin, I E; Chlachidze, G; Citterio, M; Cleland, W E; Clément, C; Cobal, M; Cogswell, F; Colas, Jacques; Collot, J; Cologna, S; Constantinescu, S; Costa, G; Costanzo, D; Crouau, M; Daudon, F; David, J; David, M; Davidek, T; Dawson, J; De, K; de La Taille, C; Del Peso, J; Del Prete, T; de Saintignon, P; Di Girolamo, B; Dinkespiler, B; Dita, S; Dodd, J; Dolejsi, J; Dolezal, Z; Downing, R; Dugne, J J; Dzahini, D; Efthymiopoulos, I; Errede, D; Errede, S; Evans, H; Eynard, G; Fassi, F; Fassnacht, P; Ferrari, A; Ferrer, A; Flaminio, Vincenzo; Fournier, D; Fumagalli, G; Gallas, E; Gaspar, M; Giakoumopoulou, V; Gianotti, F; Gildemeister, O; Giokaris, N; Glagolev, V; Glebov, V Yu; Gomes, A; González, V; González de la Hoz, S; Grabskii, V; Graugès-Pous, E; Grenier, P; Hakopian, H H; Haney, M; Hébrard, C; Henriques, A; Hervás, L; Higón, E; Holmgren, Sven Olof; Hostachy, J Y; Hoummada, A; Huston, J; Imbault, D; Ivanyushenkov, Yu M; Jézéquel, S; Johansson, E K; Jon-And, K; Jones, R; Juste, A; Kakurin, S; Karyukhin, A N; Khokhlov, Yu A; Khubua, J I; Klioukhine, V I; Kolachev, G M; Kopikov, S V; Kostrikov, M E; Kozlov, V; Krivkova, P; Kukhtin, V V; Kulagin, M; Kulchitskii, Yu A; Kuzmin, M V; Labarga, L; Laborie, G; Lacour, D; Laforge, B; Lami, S; Lapin, V; Le Dortz, O; Lefebvre, M; Le Flour, T; Leitner, R; Leltchouk, M; Li, J; Liablin, M V; Linossier, O; Lissauer, D; Lobkowicz, F; Lokajícek, M; Lomakin, Yu F; López-Amengual, J M; Lund-Jensen, B; Maio, A; Makowiecki, D S; Malyukov, S N; Mandelli, L; Mansoulié, B; Mapelli, Livio P; Marin, C P; Marrocchesi, P S; Marroquim, F; Martin, P; Maslennikov, A L; Massol, N; Mataix, L; Mazzanti, M; Mazzoni, E; Merritt, F S; Michel, B; Miller, R; Minashvili, I A; Miralles, L; Mnatzakanian, E A; Monnier, E; Montarou, G; Mornacchi, Giuseppe; Moynot, M; Muanza, G S; Nayman, P; Némécek, S; Nessi, Marzio; Nicoleau, S; Niculescu, M; Noppe, J M; Onofre, A; Pallin, D; Pantea, D; Paoletti, R; Park, I C; Parrour, G; Parsons, J; Pereira, A; Perini, L; Perlas, J A; Perrodo, P; Pilcher, J E; Pinhão, J; Plothow-Besch, Hartmute; Poggioli, Luc; Poirot, S; Price, L; Protopopov, Yu; Proudfoot, J; Puzo, P; Radeka, V; Rahm, David Charles; Reinmuth, G; Renzoni, G; Rescia, S; Resconi, S; Richards, R; Richer, J P; Roda, C; Rodier, S; Roldán, J; Romance, J B; Romanov, V; Romero, P; Rossel, F; Rusakovitch, N A; Sala, P; Sanchis, E; Sanders, H; Santoni, C; Santos, J; Sauvage, D; Sauvage, G; Sawyer, L; Says, L P; Schaffer, A C; Schwemling, P; Schwindling, J; Seguin-Moreau, N; Seidl, W; Seixas, J M; Selldén, B; Seman, M; Semenov, A; Serin, L; Shaldaev, E; Shochet, M J; Sidorov, V; Silva, J; Simaitis, V J; Simion, S; Sissakian, A N; Snopkov, R; Söderqvist, J; Solodkov, A A; Soloviev, A; Soloviev, I V; Sonderegger, P; Soustruznik, K; Spanó, F; Spiwoks, R; Stanek, R; Starchenko, E A; Stavina, P; Stephens, R; Suk, M; Surkov, A; Sykora, I; Takai, H; Tang, F; Tardell, S; Tartarelli, F; Tas, P; Teiger, J; Thaler, J; Thion, 
J; Tikhonov, Yu A; Tisserant, S; Tokar, S; Topilin, N D; Trka, Z; Turcotte, M; Valkár, S; Varanda, M J; Vartapetian, A H; Vazeille, F; Vichou, I; Vinogradov, V; Vorozhtsov, S B; Vuillemin, V; White, A; Wielers, M; Wingerter-Seez, I; Wolters, H; Yamdagni, N; Yosef, C; Zaitsev, A; Zitoun, R; Zolnierowski, Y

    2002-01-01

    This paper discusses hadron energy reconstruction for the ATLAS barrel prototype combined calorimeter (consisting of a lead-liquid argon electromagnetic part and an iron-scintillator hadronic part) in the framework of the nonparametrical method. The nonparametrical method utilizes only the known e/h ratios and the electron calibration constants and does not require the determination of any parameters by a minimization technique. Thus, this technique lends itself to an easy use in a first level trigger. The reconstructed mean values of the hadron energies are within ±1% of the true values and the fractional energy resolution is [(58±3)%/√E + (2.5±0.3)%] ⊕ (1.7±0.2)/E. The value of the e/h ratio obtained for the electromagnetic compartment of the combined calorimeter is 1.74±0.04 and agrees with the prediction that e/h > 1.66 for this electromagnetic calorimeter. Results of a study of the longitudinal hadronic shower development are also presented. The data have been taken in the H8 beam...

  10. Non-parametric transformation for data correlation and integration: From theory to practice

    Energy Technology Data Exchange (ETDEWEB)

    Datta-Gupta, A.; Xue, Guoping; Lee, Sang Heon [Texas A&M Univ., College Station, TX (United States)

    1997-08-01

    The purpose of this paper is two-fold. First, we introduce the use of non-parametric transformations for correlating petrophysical data during reservoir characterization. Such transformations are completely data driven and do not require an a priori functional relationship between response and predictor variables, as is required in traditional multiple regression. The transformations are very general, computationally efficient and can easily handle mixed data types, for example continuous variables such as porosity and permeability, and categorical variables such as rock type and lithofacies. The power of the non-parametric transformation techniques for data correlation has been illustrated through synthetic and field examples. Second, we utilize these transformations to propose a two-stage approach for data integration during heterogeneity characterization. The principal advantages of our approach over traditional cokriging or cosimulation methods are: (1) it does not require a linear relationship between primary and secondary data, (2) it exploits the secondary information to its fullest potential by maximizing the correlation between the primary and secondary data, (3) it can be easily applied to cases where several types of secondary or soft data are involved, and (4) it significantly reduces variance function calculations and thus greatly facilitates non-Gaussian cosimulation. We demonstrate the data integration procedure using synthetic and field examples. The field example involves estimation of pore-footage distribution using well data and multiple seismic attributes.

  11. Bayesian Nonparametric Measurement of Factor Betas and Clustering with Application to Hedge Fund Returns

    Directory of Open Access Journals (Sweden)

    Urbi Garay

    2016-03-01

    Full Text Available We define a dynamic and self-adjusting mixture of Gaussian Graphical Models to cluster financial returns, and provide a new method for extraction of nonparametric estimates of dynamic alphas (excess return) and betas (to a choice set of explanatory factors) in a multivariate setting. This approach, as well as the outputs, has a dynamic, nonstationary and nonparametric form, which circumvents the problem of model risk and parametric assumptions that the Kalman filter and other widely used approaches rely on. The by-product of clusters, used for shrinkage and information borrowing, can be of use to determine relationships around specific events. This approach exhibits a smaller Root Mean Squared Error than traditionally used benchmarks in financial settings, which we illustrate through simulation. As an illustration, we use hedge fund index data, and find that our estimated alphas are, on average, 0.13% per month higher (1.6% per year) than alphas estimated through Ordinary Least Squares. The approach exhibits fast adaptation to abrupt changes in the parameters, as seen in our estimated alphas and betas, which exhibit high volatility, especially in periods which can be identified as times of stressful market events, a reflection of the dynamic positioning of hedge fund portfolio managers.

  12. Identification and well-posedness in a class of nonparametric problems

    CERN Document Server

    Zinde-Walsh, Victoria

    2010-01-01

    This is a companion note to Zinde-Walsh (2010), arXiv:1009.4217v1[MATH.ST], to clarify and extend results on identification in a number of problems that lead to a system of convolution equations. Examples include identification of the distribution of mismeasured variables, of a nonparametric regression function under Berkson type measurement error, some nonparametric panel data models, etc. The reason that identification in different problems can be considered in one approach is that they lead to the same system of convolution equations; moreover the solution can be given under more general assumptions than those usually considered, by examining these equations in spaces of generalized functions. An important issue that did not receive sufficient attention is that of well-posedness. This note gives conditions under which well-posedness obtains, an example that demonstrates that when well-posedness does not hold functions that are far apart can give rise to observable arbitrarily close functions and discusses ...

  13. Comparison of Parametric and Nonparametric Methods for Analyzing the Bias of a Numerical Model

    Directory of Open Access Journals (Sweden)

    Isaac Mugume

    2016-01-01

    Full Text Available Numerical models are presently applied in many fields for simulation and prediction, operation, or research. The output from these models normally has both systematic and random errors. The study compared January 2015 temperature data for Uganda as simulated using the Weather Research and Forecast model with actual observed station temperature data to analyze the bias using parametric methods (the root mean square error (RMSE), the mean absolute error (MAE), the mean error (ME), skewness, and the bias easy estimate (BES)) and a nonparametric method (the sign test, STM). The RMSE normally overestimates the error compared to the MAE. The RMSE and MAE are not sensitive to the direction of bias. The ME gives both the direction and magnitude of bias but can be distorted by extreme values, while the BES is insensitive to extreme values. The STM is robust for giving the direction of bias; it is not sensitive to extreme values but it does not give the magnitude of bias. The graphical tools (such as time series and cumulative curves) show the performance of the model with time. It is recommended to integrate parametric and nonparametric methods along with graphical methods for a comprehensive analysis of the bias of a numerical model.
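
    A compact sketch computing the parametric scores (RMSE, MAE, ME) and a two-sided sign test on the model-minus-observation differences; the BES and the exact STM variant used in the paper are not reproduced, and the temperature data are invented:

      import numpy as np
      from scipy.stats import binom

      def bias_scores(sim, obs):
          d = np.asarray(sim, float) - np.asarray(obs, float)
          rmse = float(np.sqrt(np.mean(d ** 2)))
          mae = float(np.mean(np.abs(d)))
          me = float(np.mean(d))                        # signed bias
          # sign test: number of positive differences vs Binomial(n, 0.5)
          pos, n = int(np.sum(d > 0)), int(np.sum(d != 0))
          p = 2 * min(binom.cdf(pos, n, 0.5), binom.sf(pos - 1, n, 0.5))
          return rmse, mae, me, min(p, 1.0)

      rng = np.random.default_rng(9)
      obs = rng.normal(25, 3, 31)                       # e.g. daily January temperatures
      sim = obs + 1.0 + rng.normal(0, 1, 31)            # model with a warm bias
      print(tuple(round(v, 3) for v in bias_scores(sim, obs)))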

  14. A nonparametric statistical method for image segmentation using information theory and curve evolution.

    Science.gov (United States)

    Kim, Junmo; Fisher, John W; Yezzi, Anthony; Cetin, Müjdat; Willsky, Alan S

    2005-10-01

    In this paper, we present a new information-theoretic approach to image segmentation. We cast the segmentation problem as the maximization of the mutual information between the region labels and the image pixel intensities, subject to a constraint on the total length of the region boundaries. We assume that the probability densities associated with the image pixel intensities within each region are completely unknown a priori, and we formulate the problem based on nonparametric density estimates. Due to the nonparametric structure, our method does not require the image regions to have a particular type of probability distribution and does not require the extraction and use of a particular statistic. We solve the information-theoretic optimization problem by deriving the associated gradient flows and applying curve evolution techniques. We use level-set methods to implement the resulting evolution. The experimental results based on both synthetic and real images demonstrate that the proposed technique can solve a variety of challenging image segmentation problems. Furthermore, our method, which does not require any training, performs as well as methods based on training.

  15. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  16. Bayesian nonparametric regression analysis of data with random effects covariates from longitudinal measurements.

    Science.gov (United States)

    Ryu, Duchwan; Li, Erning; Mallick, Bani K

    2011-06-01

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves.

  17. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population.

  18. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    Science.gov (United States)

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human Influenza A viruses. In both cases, we recover more believed aspects of the viral demographic histories than the GMRF approach. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method.

  19. A non-parametric approach to estimate the total deviation index for non-normal data.

    Science.gov (United States)

    Perez-Jaume, Sara; Carrasco, Josep L

    2015-11-10

    Concordance indices are used to assess the degree of agreement between different methods that measure the same characteristic. In this context, the total deviation index (TDI) is an unscaled concordance measure that quantifies the extent to which the readings from the same subject obtained by different methods may differ with a certain probability. Common approaches to estimate the TDI assume data are normally distributed and linearity between response and effects (subjects, methods and random error). Here, we introduce a new non-parametric methodology for estimation and inference of the TDI that can deal with any kind of quantitative data. The present study introduces this non-parametric approach and compares it with the already established methods in two real case examples that represent situations of non-normal data (more specifically, skewed data and count data). The performance of the already established methodologies and our approach in these contexts is assessed by means of a simulation study. Copyright © 2015 John Wiley & Sons, Ltd.
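
    The non-parametric point estimate itself is essentially an empirical quantile of the absolute paired differences; a minimal sketch (inference, e.g. bootstrap confidence bounds, is omitted, and the skewed data are invented):

      import numpy as np

      def tdi_nonparametric(method_a, method_b, p=0.90):
          """TDI(p): the p-quantile of |differences| between paired readings."""
          d = np.abs(np.asarray(method_a, float) - np.asarray(method_b, float))
          return float(np.quantile(d, p))

      rng = np.random.default_rng(10)
      a = rng.gamma(shape=2.0, scale=3.0, size=200)      # skewed "readings" from method A
      b = a + rng.normal(0.5, 1.0, size=200)             # method B with bias and noise
      print(round(tdi_nonparametric(a, b, p=0.90), 2))   # 90% of readings differ by less than this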

  20. A non-parametric Bayesian approach for clustering and tracking non-stationarities of neural spikes.

    Science.gov (United States)

    Shalchyan, Vahid; Farina, Dario

    2014-02-15

    Neural spikes from multiple neurons recorded in a multi-unit signal are usually separated by clustering. Drifts in the position of the recording electrode relative to the neurons over time cause gradual changes in the position and shapes of the clusters, challenging the clustering task. By dividing the data into short time intervals, Bayesian tracking of the clusters based on a Gaussian cluster model has been previously proposed. However, the Gaussian cluster model is often not verified for neural spikes. We present a Bayesian clustering approach that makes no assumptions on the distribution of the clusters and uses kernel-based density estimation of the clusters in every time interval as a prior for Bayesian classification of the data in the subsequent time interval. The proposed method was tested and compared to a Gaussian model-based approach for cluster tracking by using both simulated and experimental datasets. The results showed that the proposed non-parametric kernel-based density estimation of the clusters outperformed the sequential Gaussian model fitting in both simulated and experimental data tests. Using non-parametric kernel density-based clustering that makes no assumptions on the distribution of the clusters enhances the ability to track cluster non-stationarity over time relative to the Gaussian cluster modeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
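
    A schematic sketch of the tracking idea: fit a kernel density estimate per cluster on one time interval and use those densities, weighted by cluster priors, to label the spikes of the next interval, then refit. The spike feature extraction and drift handling of the paper are not reproduced; the 2-D features below are synthetic:

      import numpy as np
      from scipy.stats import gaussian_kde

      def track_clusters(intervals, init_labels):
          """Label each interval's points using KDEs fitted on the previous interval."""
          labels = [np.asarray(init_labels)]
          for prev, curr in zip(intervals[:-1], intervals[1:]):
              prev_lab = labels[-1]
              classes = np.unique(prev_lab)
              kdes = [gaussian_kde(prev[prev_lab == c].T) for c in classes]
              priors = [np.mean(prev_lab == c) for c in classes]
              scores = np.vstack([np.log(priors[i]) + kdes[i].logpdf(curr.T)
                                  for i in range(len(classes))])
              labels.append(classes[np.argmax(scores, axis=0)])
          return labels

      # two drifting 2-D clusters over three intervals (synthetic)
      rng = np.random.default_rng(11)
      def interval(shift):
          a = rng.normal([0 + shift, 0], 0.3, (100, 2))
          b = rng.normal([2 + shift, 2], 0.3, (100, 2))
          return np.vstack([a, b])
      intervals = [interval(s) for s in (0.0, 0.3, 0.6)]
      init_labels = np.array([0] * 100 + [1] * 100)
      labels = track_clusters(intervals, init_labels)
      print([np.bincount(l).tolist() for l in labels])   # roughly 100/100 in each interval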