WorldWideScience

Sample records for nonparametric density estimation

  1. Fusion of Hard and Soft Information in Nonparametric Density Estimation

    Science.gov (United States)

    Royset, Johannes O.; Wets, Roger J-B

    2015-06-10

    …estimation exploiting, in concert, hard and soft information. Although our development, theoretical and numerical, makes no distinction based on sample… …univariate density estimation in situations when the sample (hard information) is supplemented by "soft" information about the random phenomenon. These…

  2. Kernel bandwidth estimation for non-parametric density estimation: a comparative study

    CSIR Research Space (South Africa)

    Van der Walt, CM

    2013-12-01

    Full Text Available We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...
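
    The two usual baselines in such comparisons are plug-in rules and cross-validation. As a hedged illustration (not the CSIR study's code; the data and the candidate grid below are made up), this sketch contrasts Silverman's rule-of-thumb with least-squares cross-validation for a Gaussian-kernel KDE, using the closed-form LSCV criterion:

```python
import numpy as np

def lscv_score(h, x):
    """Least-squares cross-validation criterion for a Gaussian-kernel KDE."""
    n = len(x)
    d = x[:, None] - x[None, :]
    # integral of fhat^2 has a closed form: a sum of N(0, 2h^2) density terms
    int_f2 = np.exp(-d ** 2 / (4 * h ** 2)).sum() / (n ** 2 * 2 * h * np.sqrt(np.pi))
    k = np.exp(-d ** 2 / (2 * h ** 2)) / (h * np.sqrt(2 * np.pi))
    loo = (k.sum() - np.trace(k)) / (n * (n - 1))   # mean leave-one-out density
    return int_f2 - 2 * loo

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 150), rng.normal(2, 1.0, 150)])
iqr = np.subtract(*np.percentile(x, [75, 25]))
h_silverman = 0.9 * min(x.std(ddof=1), iqr / 1.34) * len(x) ** -0.2
candidates = np.linspace(0.05, 1.5, 60)
h_lscv = candidates[np.argmin([lscv_score(h, x) for h in candidates])]
print(h_silverman, h_lscv)   # the rule of thumb tends to oversmooth bimodal data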

  3. Nonparametric estimation of population density for line transect sampling using Fourier series

    Science.gov (United States)

    Crain, B.R.; Burnham, K.P.; Anderson, D.R.; Lake, J.L.

    1979-01-01

    A nonparametric, robust density estimation method is explored for the analysis of right-angle distances from a transect line to the objects sighted. The method is based on the Fourier series expansion of a probability density function over an interval. With only mild assumptions, a general population density estimator of wide applicability is obtained.
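
    For concreteness, here is a minimal sketch of the cosine-series estimator this abstract describes, assuming a density supported on [0, w] with coefficients estimated by sample means; the truncation point m, the synthetic sighting distances, and w = 100 are illustrative choices, not values from the paper:

```python
import numpy as np

def fourier_density(x_eval, data, w, m=4):
    """Cosine-series density estimate on [0, w] with m terms.

    Classic orthogonal-series form: f(x) = 1/w + sum_k a_k cos(k*pi*x/w),
    with a_k estimated by the sample mean of (2/w) cos(k*pi*X/w).
    """
    f = np.full_like(x_eval, 1.0 / w, dtype=float)
    for k in range(1, m + 1):
        a_k = (2.0 / (len(data) * w)) * np.cos(k * np.pi * data / w).sum()
        f += a_k * np.cos(k * np.pi * x_eval / w)
    return np.clip(f, 0.0, None)   # truncate small negative wiggles

# Example: right-angle sighting distances truncated at w = 100 m (synthetic).
rng = np.random.default_rng(0)
distances = np.abs(rng.normal(0, 40, size=200))
distances = distances[distances <= 100.0]
grid = np.linspace(0, 100, 201)
f_hat = fourier_density(grid, distances, w=100.0, m=4)
# In line-transect work the object density follows from f(0); f_hat[0] estimates it.
print(f_hat[0])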

  4. Photo-z Estimation: An Example of Nonparametric Conditional Density Estimation under Selection Bias

    CERN Document Server

    Izbicki, Rafael; Freeman, Peter E

    2016-01-01

    Redshift is a key quantity for inferring cosmological model parameters. In photometric redshift estimation, cosmologists use the coarse data collected from the vast majority of galaxies to predict the redshift of individual galaxies. To properly quantify the uncertainty in the predictions, however, one needs to go beyond standard regression and instead estimate the full conditional density f(z|x) of a galaxy's redshift z given its photometric covariates x. The problem is further complicated by selection bias: usually only the rarest and brightest galaxies have known redshifts, and these galaxies have characteristics and measured covariates that do not necessarily match those of more numerous and dimmer galaxies of unknown redshift. Unfortunately, there is not much research on how to best estimate complex multivariate densities in such settings. Here we describe a general framework for properly constructing and assessing nonparametric conditional density estimators under selection bias, and for combining two o...

  5. Contribution to the Nonparametric Estimation of the Density of the Regression Errors (Doctoral Thesis)

    CERN Document Server

    Samb, Rawane (LSTA)

    2010-01-01

    This thesis deals with the nonparametric estimation of the density f of the regression error term E in the model Y = m(X) + E, assuming E is independent of the covariate X. The difficulty in this study is that the regression error E is not observed. In such a setup, it would be unwise to estimate f through a conditional approach based upon the probability distribution function of Y given X. Indeed, this approach suffers from the curse of dimensionality, so that the resulting estimator of the residual term E would have a considerably slow rate of convergence if the dimension of X is very high. Two approaches are proposed in this thesis to avoid the curse of dimensionality. The first approach uses the estimated residuals, while the second integrates a nonparametric conditional density estimator of Y given X. While proceeding in this way can circumvent the curse of dimensionality, a challenging issue is to evaluate the impact of the estimated residuals on the final estimator of the density f. We will also at...

  6. Testing and Estimating Shape-Constrained Nonparametric Density and Regression in the Presence of Measurement Error

    KAUST Repository

    Carroll, Raymond J.

    2011-03-01

    In many applications we can expect that, or are interested to know if, a density function or a regression curve satisfies some specific shape constraints. For example, when the explanatory variable, X, represents the value taken by a treatment or dosage, the conditional mean of the response, Y , is often anticipated to be a monotone function of X. Indeed, if this regression mean is not monotone (in the appropriate direction) then the medical or commercial value of the treatment is likely to be significantly curtailed, at least for values of X that lie beyond the point at which monotonicity fails. In the case of a density, common shape constraints include log-concavity and unimodality. If we can correctly guess the shape of a curve, then nonparametric estimators can be improved by taking this information into account. Addressing such problems requires a method for testing the hypothesis that the curve of interest satisfies a shape constraint, and, if the conclusion of the test is positive, a technique for estimating the curve subject to the constraint. Nonparametric methodology for solving these problems already exists, but only in cases where the covariates are observed precisely. However in many problems, data can only be observed with measurement errors, and the methods employed in the error-free case typically do not carry over to this error context. In this paper we develop a novel approach to hypothesis testing and function estimation under shape constraints, which is valid in the context of measurement errors. Our method is based on tilting an estimator of the density or the regression mean until it satisfies the shape constraint, and we take as our test statistic the distance through which it is tilted. Bootstrap methods are used to calibrate the test. The constrained curve estimators that we develop are also based on tilting, and in that context our work has points of contact with methodology in the error-free case.

  7. Non-parametric kernel density estimation of species sensitivity distributions in developing water quality criteria of metals.

    Science.gov (United States)

    Wang, Ying; Wu, Fengchang; Giesy, John P; Feng, Chenglian; Liu, Yuedan; Qin, Ning; Zhao, Yujie

    2015-09-01

    Due to the use of different parametric models for establishing species sensitivity distributions (SSDs), comparison of water quality criteria (WQC) for metals of the same group or period in the periodic table is uncertain and results can be biased. To address this inadequacy, a new probabilistic model based on non-parametric kernel density estimation was developed, and optimal bandwidths and testing methods are proposed. Zinc (Zn), cadmium (Cd), and mercury (Hg) of group IIB of the periodic table are widespread in aquatic environments, mostly at small concentrations, but can exert detrimental effects on aquatic life and human health. With these metals as target compounds, the non-parametric kernel density estimation method and several conventional parametric density estimation methods were used to derive acute WQC of metals for protection of aquatic species in China, which were then compared and contrasted with WQC from other jurisdictions. HC5 values for protection of different types of species were derived for the three metals by use of non-parametric kernel density estimation. The newly developed probabilistic model was superior to conventional parametric density estimations for constructing SSDs and for deriving WQC for these metals. HC5 values for the three metals were inversely proportional to atomic number, which means that the heavier atoms were more potent toxicants. The proposed method provides a novel alternative approach for developing SSDs that could have wide application prospects in deriving WQC and in assessing risks to ecosystems.
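
    As a rough illustration of the workflow (not the paper's optimized-bandwidth implementation), the sketch below fits a Gaussian-kernel SSD to hypothetical log-transformed toxicity values using SciPy's default bandwidth and reads off HC5 as the 5th percentile of the fitted distribution:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical acute toxicity values (e.g., LC50 in ug/L) for a set of species.
toxicity = np.array([12.0, 25.0, 40.0, 55.0, 90.0, 130.0, 210.0, 400.0, 950.0])
log_tox = np.log10(toxicity)

kde = gaussian_kde(log_tox)                 # non-parametric SSD on the log scale
grid = np.linspace(log_tox.min() - 1, log_tox.max() + 1, 2000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]                              # numerical CDF of the fitted SSD

hc5 = 10 ** grid[np.searchsorted(cdf, 0.05)]   # concentration protecting 95% of species
print(f"HC5 = {hc5:.1f} ug/L")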

  8. Nonparametric estimate of spectral density functions of sample covariance matrices: A first step

    OpenAIRE

    2012-01-01

    The density function of the limiting spectral distribution of general sample covariance matrices is usually unknown. We propose to use kernel estimators which are proved to be consistent. A simulation study is also conducted to show the performance of the estimators.

  9. Low default credit scoring using two-class non-parametric kernel density estimation

    CSIR Research Space (South Africa)

    Rademeyer, E

    2016-12-01

    Full Text Available This paper investigates the performance of two-class classification credit scoring data sets with low default ratios. The standard two-class parametric Gaussian and non-parametric Parzen classifiers are extended, using Bayes’ rule, to include either...

  10. Density Estimation for Protein Conformation Angles Using a Bivariate von Mises Distribution and Bayesian Nonparametrics.

    Science.gov (United States)

    Lennox, Kristin P; Dahl, David B; Vannucci, Marina; Tsai, Jerry W

    2009-06-01

    Interest in predicting protein backbone conformational angles has prompted the development of modeling and inference procedures for bivariate angular distributions. We present a Bayesian approach to density estimation for bivariate angular data that uses a Dirichlet process mixture model and a bivariate von Mises distribution. We derive the necessary full conditional distributions to fit the model, as well as the details for sampling from the posterior predictive distribution. We show how our density estimation method makes it possible to improve current approaches for protein structure prediction by comparing the performance of the so-called "whole" and "half" position distributions. Current methods in the field are based on whole position distributions, as density estimation for the half positions requires techniques, such as ours, that can provide good estimates for small datasets. With our method we are able to demonstrate that half position data provides a better approximation for the distribution of conformational angles at a given sequence position, therefore providing increased efficiency and accuracy in structure prediction.
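
    For reference, the sine variant of the bivariate von Mises density commonly used for (phi, psi) backbone angles can be evaluated as below; the parameter values are hypothetical, and a Dirichlet process mixture as in the paper would average many such components with random weights:

```python
import numpy as np

def bvm_sine_unnorm(phi, psi, mu, nu, k1, k2, lam):
    """Unnormalized bivariate von Mises density, sine variant."""
    return np.exp(k1 * np.cos(phi - mu) + k2 * np.cos(psi - nu)
                  + lam * np.sin(phi - mu) * np.sin(psi - nu))

# Evaluate on a grid over the torus and normalize numerically.
grid = np.linspace(-np.pi, np.pi, 200)
phi, psi = np.meshgrid(grid, grid)
dens = bvm_sine_unnorm(phi, psi, mu=-1.0, nu=2.4, k1=8.0, k2=6.0, lam=2.0)
dens /= dens.sum() * (2 * np.pi / 200) ** 2    # cell area = (2*pi/200)^2
print(dens.sum() * (2 * np.pi / 200) ** 2)     # ~1 after normalization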

  11. Nonparametric estimation of ultrasound pulses

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Leeman, Sidney

    1994-01-01

    An algorithm for nonparametric estimation of 1D ultrasound pulses in echo sequences from human tissues is derived. The technique is a variation of the homomorphic filtering technique using the real cepstrum, and the underlying basis of the method is explained. The algorithm exploits a priori...

  12. Bootstrap Estimation for Nonparametric Efficiency Estimates

    OpenAIRE

    1995-01-01

    This paper develops a consistent bootstrap estimation procedure to obtain confidence intervals for nonparametric measures of productive efficiency. Although the methodology is illustrated in terms of technical efficiency measured by output distance functions, the technique can be easily extended to other consistent nonparametric frontier models. Variation in estimated efficiency scores is assumed to result from variation in empirical approximations to the true boundary of the production set. ...

  13. Central limit theorem of nonparametric estimate of spectral density functions of sample covariance matrices

    CERN Document Server

    Pan, Guangming; Zhou, Wang

    2010-01-01

    A consistent kernel estimator of the limiting spectral distribution of general sample covariance matrices was introduced in Jing, Pan, Shao and Zhou (2010). The central limit theorem of the kernel estimator is proved in this paper.

  14. Comparative study of species sensitivity distributions based on non-parametric kernel density estimation for some transition metals.

    Science.gov (United States)

    Wang, Ying; Feng, Chenglian; Liu, Yuedan; Zhao, Yujie; Li, Huixian; Zhao, Tianhui; Guo, Wenjing

    2017-02-01

    Transition metals in the fourth period of the periodic table of the elements are widely distributed in aquatic environments and often occur at concentrations sufficient to cause adverse effects on aquatic life and human health. Parametric models are generally used to construct species sensitivity distributions (SSDs), which means that comparisons of water quality criteria (WQC) for elements in the same period or group of the periodic table may be inaccurate and the results biased. To address this inadequacy, non-parametric kernel density estimation (NPKDE), with its optimal bandwidths and testing methods, was developed for establishing SSDs. The NPKDE provided a better fit, more robustness, and better predictions than conventional normal and logistic parametric density estimations for constructing SSDs and deriving acute HC5 values and WQC for transition metals in the fourth period of the periodic table. The decreasing sequence of HC5 values for these metals was Ti > Mn > V > Ni > Zn > Cu > Fe > Co > Cr(VI), which is not proportional to atomic number in the periodic table, and the relatively sensitive species differed from metal to metal. The results indicate that factors other than physical and chemical properties also affect the toxicity mechanisms of transition metals. The proposed method enriches the methodological foundation for WQC and provides a relatively innovative, accurate approach for WQC derivation and risk assessment of metals of the same group and period in aquatic environments, supporting the protection of aquatic organisms.

  15. Nonparametric k-nearest-neighbor entropy estimator.

    Science.gov (United States)

    Lombardi, Damiano; Pant, Sanjay

    2016-01-01

    A nonparametric k-nearest-neighbor-based entropy estimator is proposed. It improves on the classical Kozachenko-Leonenko estimator by considering nonuniform probability densities in the region of k-nearest neighbors around each sample point. It aims to improve the classical estimators in three situations: first, when the dimensionality of the random variable is large; second, when near-functional relationships leading to high correlation between components of the random variable are present; and third, when the marginal variances of random variable components vary significantly with respect to each other. Heuristics on the error of the proposed and classical estimators are presented. Finally, the proposed estimator is tested for a variety of distributions in successively increasing dimensions and in the presence of a near-functional relationship. Its performance is compared with a classical estimator, and a significant improvement is demonstrated.
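
    For context, here is a minimal implementation of the classical Kozachenko-Leonenko estimator that the paper improves upon (not the proposed estimator itself); it uses the digamma terms and unit-ball volume correction of the standard formula:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Classical Kozachenko-Leonenko k-NN entropy estimate (in nats)."""
    n, d = x.shape
    tree = cKDTree(x)
    # k+1 because the nearest neighbour of each point is the point itself
    eps = tree.query(x, k=k + 1)[0][:, -1]
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)   # log unit-ball volume
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

rng = np.random.default_rng(2)
x = rng.normal(size=(5000, 2))
print(kl_entropy(x))   # true value for a standard bivariate normal: ~2.8379 nats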

  16. Nonparametric estimation for hazard rate monotonously decreasing system

    Institute of Scientific and Technical Information of China (English)

    Han Fengyan; Li Weisong

    2005-01-01

    Estimation of the density and hazard rate is very important in the reliability analysis of a system. In order to estimate the density and hazard rate of a system whose hazard rate is monotonically decreasing, a new nonparametric estimator is put forward. The estimator is based on the kernel function method and an optimization algorithm. Numerical experiments show that the method is sufficiently accurate and can be used in many cases.

  17. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when the time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our results can be viewed as a nonstationary extension of some well-known uniform consistency results for stationary time series.

  18. Nonparametric estimation for contamination distribution

    Institute of Scientific and Technical Information of China (English)

    HUI Jun; MIAO Bai-qi; NING Jing; PENG Heng

    2008-01-01

    In this paper, for the contamination distribution model F(x) = (1-α)F1(x) + αF2(x), the estimates of α and F1(x) are studied in two different ways when F2(x) is known, and the strong consistency of the two estimates is proved. The convergence rate of the estimate of α is also given.

  19. Nonparametric estimation of Fisher information from real data

    Science.gov (United States)

    Har-Shemesh, Omri; Quax, Rick; Miñano, Borja; Hoekstra, Alfons G.; Sloot, Peter M. A.

    2016-02-01

    The Fisher information matrix (FIM) is a widely used measure for applications including statistical inference, information geometry, experiment design, and the study of criticality in biological systems. The FIM is defined for a parametric family of probability distributions and its estimation from data follows one of two paths: either the distribution is assumed to be known and the parameters are estimated from the data, or the parameters are known and the distribution is estimated from the data. We consider the latter case, which is applicable, for example, to experiments where the parameters are controlled by the experimenter and a complicated relation exists between the input parameters and the resulting distribution of the data. Since we assume that the distribution is unknown, we use nonparametric density estimation on the data and then compute the FIM directly from that estimate using a finite-difference approximation to estimate the derivatives in its definition. The accuracy of the estimate depends on both the method of nonparametric estimation and the difference Δθ between the densities used in the finite-difference formula. We develop an approach for choosing the optimal parameter difference Δθ based on large deviations theory and compare two nonparametric density estimation methods, the Gaussian kernel density estimator and a novel "density estimation using field theory" method. We also compare these two methods to a recently published approach that circumvents the need for density estimation by estimating a nonparametric f divergence and using it to approximate the FIM. We use the Fisher information of the normal distribution to validate our method and as a more involved example we compute the temperature component of the FIM in the two-dimensional Ising model and show that it obeys the expected relation to the heat capacity and therefore peaks at the phase transition at the correct critical temperature.
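
    A stripped-down sketch of this finite-difference idea, assuming a Gaussian KDE in place of the field-theory estimator and a fixed Δθ rather than the optimized one; samples are drawn from N(θ, 1), whose true Fisher information is 1:

```python
import numpy as np
from scipy.stats import gaussian_kde

def fim_fd(sample_minus, sample_center, sample_plus, dtheta, grid):
    """Finite-difference Fisher information from three KDE-smoothed samples.

    I(theta) ~ integral of (df/dtheta)^2 / f dx, with the theta-derivative
    replaced by a central difference of density estimates at theta +/- dtheta.
    """
    f0 = gaussian_kde(sample_center)(grid)
    dfdt = (gaussian_kde(sample_plus)(grid)
            - gaussian_kde(sample_minus)(grid)) / (2 * dtheta)
    mask = f0 > 1e-12                        # avoid dividing by near-zero tails
    return np.trapz(dfdt[mask] ** 2 / f0[mask], grid[mask])

# Validation on N(theta, 1), whose true Fisher information is exactly 1.
rng = np.random.default_rng(3)
theta, dtheta = 0.0, 0.3
grid = np.linspace(-6, 6, 1000)
draws = [rng.normal(theta + s, 1.0, 50_000) for s in (-dtheta, 0.0, dtheta)]
print(fim_fd(*draws, dtheta=dtheta, grid=grid))   # close to 1; KDE smoothing biases it slightly low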

  20. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and, finally, estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are pointed out, and methods to prevent bias are presented. The techniques are evaluated by comparing their speed and accuracy on the simple case of estimating auto-correlation functions for the response of a single degree-of-freedom system loaded with white noise.
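
    As an illustration of the FFT route and of the kind of bias correction the paper discusses, the sketch below zero-pads to avoid circular wrap-around and divides by (n - lag) to undo the triangular bias; the AR(1) test series is synthetic:

```python
import numpy as np

def autocorrelation_fft(x, max_lag):
    """Unbiased auto-correlation estimate via the FFT (Wiener-Khinchin).

    Zero-padding removes circular wrap-around; dividing by (n - lag)
    removes the triangular bias of the raw estimate.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    nfft = 2 ** int(np.ceil(np.log2(2 * n)))        # pad to at least 2n
    s = np.abs(np.fft.rfft(x, nfft)) ** 2           # periodogram
    r = np.fft.irfft(s)[: max_lag + 1]
    return r / (n - np.arange(max_lag + 1))

# Example: AR(1) response with phi = 0.9; normalized ACF should be ~0.9**lag.
rng = np.random.default_rng(4)
e = rng.normal(size=20_000)
x = np.zeros_like(e)
for t in range(1, e.size):
    x[t] = 0.9 * x[t - 1] + e[t]
r = autocorrelation_fft(x, 5)
print(r / r[0])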

  1. Non-parametric estimation of Fisher information from real data

    CERN Document Server

    Har-Shemesh, Omri; Miñano, Borja; Hoekstra, Alfons G; Sloot, Peter M A

    2015-01-01

    The Fisher Information matrix is a widely used measure for applications ranging from statistical inference, information geometry, experiment design, to the study of criticality in biological systems. Yet there is no commonly accepted non-parametric algorithm to estimate it from real data. In this rapid communication we show how to accurately estimate the Fisher information in a nonparametric way. We also develop a numerical procedure to minimize the errors by choosing the interval of the finite difference scheme necessary to compute the derivatives in the definition of the Fisher information. Our method uses the recently published "Density Estimation using Field Theory" algorithm to compute the probability density functions for continuous densities. We use the Fisher information of the normal distribution to validate our method and as an example we compute the temperature component of the Fisher Information Matrix in the two dimensional Ising model and show that it obeys the expected relation to the heat capa...

  2. Bayesian nonparametric estimation for Quantum Homodyne Tomography

    OpenAIRE

    Naulet, Zacharie; Barat, Eric

    2016-01-01

    We estimate the quantum state of a light beam from results of quantum homodyne tomography noisy measurements performed on identically prepared quantum systems. We propose two Bayesian nonparametric approaches. The first approach is based on mixture models and is illustrated through simulation examples. The second approach is based on random basis expansions. We study the theoretical performance of the second approach by quantifying the rate of contraction of the posterior distribution around ...

  3. NONPARAMETRIC ESTIMATION OF CHARACTERISTICS OF PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-10-01

    Full Text Available The article is devoted to the nonparametric point and interval estimation of the characteristics of a probability distribution (the expectation, median, variance, standard deviation, and coefficient of variation) from sample results. Sample values are regarded as realizations of independent and identically distributed random variables with an arbitrary distribution function having the required number of moments. Nonparametric analysis procedures are compared with parametric procedures based on the assumption that the sample values have a normal distribution. Point estimators are constructed in the obvious way, using sample analogs of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and functions of them. Nonparametric asymptotic confidence intervals are obtained through a special technique for deriving asymptotic relations in applied statistics. In the first step, this technique applies the multidimensional central limit theorem to sums of vectors whose coordinates are powers of the initial random variables. The second step converts the limiting multivariate normal vector to the vector of interest to the researcher; here linearization is used and infinitesimal quantities are discarded. The third step is a rigorous justification of the results at the standard level of mathematical-statistical reasoning, which usually requires necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are observations of the operating time to limit state of 50 cutting tools. Methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions in situations where the normality hypothesis fails. The practical recommendation is: for the analysis of real data we should use nonparametric confidence limits

  4. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.

    2012-12-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal assumptions regarding the form of the distribution functions of X and Y. We discuss an approach to the estimation problem that is based on asymptotic likelihood considerations. Our results enable us to provide a methodology that can be implemented easily and which yields estimators that are often near optimal when compared to fully parametric methods. We evaluate the performance of the estimators in a series of Monte Carlo simulations. © 2012 Elsevier B.V. All rights reserved.

  5. Nonparametric Maximum Entropy Estimation on Information Diagrams

    CERN Document Server

    Martin, Elliot A; Meinke, Alexander; Děchtěrenko, Filip; Davidsen, Jörn

    2016-01-01

    Maximum entropy estimation is of broad interest for inferring properties of systems across many different disciplines. In this work, we significantly extend a technique we previously introduced for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies. Specifically, we show how to apply the concept to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish a number of significant advantages of our approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases. In addition, we propose a nonparametric formulation of connected informations and give an illustrative example showing how this agrees with the existing parametric formulation in cases of interest. We furthe...

  6. Nonparametric estimation of employee stock options

    Institute of Scientific and Technical Information of China (English)

    FU Qiang; LIU Li-an; LIU Qian

    2006-01-01

    We propose a new model to price employee stock options (ESOs). The model is based on nonparametric statistical methods with market data. It incorporates a kernel estimator and employs a three-step method to modify the Black-Scholes formula. The model overcomes the limits of the Black-Scholes formula in handling option prices with varying volatility. It accounts for the effects of ESO-specific characteristics, such as non-tradability, long times to expiration, the early-exercise feature, restrictions on short selling, and the employee's risk aversion, on the risk-neutral pricing condition, and it can be applied to ESO valuation whether the explanatory variable is deterministic or random.

  7. Asymptotic theory of nonparametric regression estimates with censored data

    Institute of Scientific and Technical Information of China (English)

    施沛德; 王海燕; 张利华

    2000-01-01

    For regression analysis, some useful information may have been lost when the responses are right censored. To estimate nonparametric functions, several estimates based on censored data have been proposed and their consistency and convergence rates have been studied in the literature, but the optimal rates of global convergence have not been obtained yet. Because of the possible information loss, one may think that it is impossible for an estimate based on censored data to achieve the optimal rates of global convergence for nonparametric regression, which were established by Stone based on complete data. This paper constructs a regression spline estimate of a general nonparametric regression function based on right-censored response data, and proves, under some regularity conditions, that this estimate achieves the optimal rates of global convergence for nonparametric regression. Since the parameters for the nonparametric regression estimate have to be chosen based on a data driven criterion, we also obtain the asymptotic optimality of AIC, AICC, GCV, Cp and FPE criteria in the process of selecting the parameters.

  8. Nonparametric TOA estimators for low-resolution IR-UWB digital receiver

    Institute of Scientific and Technical Information of China (English)

    Yanlong Zhang; Weidong Chen

    2015-01-01

    Nonparametric time-of-arrival (TOA) estimators for impulse radio ultra-wideband (IR-UWB) signals are proposed. Nonparametric detection is obviously useful in situations where detailed information about the statistics of the noise is unavailable or not accurate. Such TOA estimators are obtained based on conditional statistical tests with only a symmetry distribution assumption on the noise probability density function. The nonparametric estimators are attractive choices for low-resolution IR-UWB digital receivers which can be implemented by fast comparators or high sampling rate low resolution analog-to-digital converters (ADCs), in place of high sampling rate high resolution ADCs which may not be available in practice. Simulation results demonstrate that nonparametric TOA estimators provide more effective and robust performance than typical energy detection (ED) based estimators.

  9. Nonparametric Bayesian drift estimation for multidimensional stochastic differential equations

    NARCIS (Netherlands)

    Gugushvili, S.; Spreij, P.

    2014-01-01

    We consider nonparametric Bayesian estimation of the drift coefficient of a multidimensional stochastic differential equation from discrete-time observations on the solution of this equation. Under suitable regularity conditions, we establish posterior consistency in this context.

  10. Portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    Mahsa Ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is deciding which stocks to invest in and how to select an optimal portfolio. This process is carried out by assessing risk and expected return. In the portfolio selection problem, if asset returns are normally distributed, variance and standard deviation are used as risk measures. But expected returns on assets are not necessarily normal and sometimes differ dramatically from a normal distribution. This paper, introducing conditional value at risk (CVaR) as a risk measure in a nonparametric framework, derives the optimal portfolio for a given expected return and compares this method with the linear programming method. The data used in this study consist of the monthly returns of 15 companies selected, in the winter of 1392 (Iranian calendar), from the top 50 companies on the Tehran Stock Exchange, with returns considered from April 1388 to June 1393. The results of this study show the superiority of the nonparametric method over the linear programming method, and the nonparametric method is much faster than the linear programming method.

  11. Asymptotic theory of nonparametric regression estimates with censored data

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    For regression analysis, some useful information may have been lost when the responses are right censored. To estimate nonparametric functions, several estimates based on censored data have been proposed and their consistency and convergence rates have been studied in literature, but the optimal rates of global convergence have not been obtained yet. Because of the possible information loss, one may think that it is impossible for an estimate based on censored data to achieve the optimal rates of global convergence for nonparametric regression, which were established by Stone based on complete data. This paper constructs a regression spline estimate of a general nonparametric regression function based on right_censored response data, and proves, under some regularity conditions, that this estimate achieves the optimal rates of global convergence for nonparametric regression. Since the parameters for the nonparametric regression estimate have to be chosen based on a data driven criterion, we also obtain the asymptotic optimality of AIC, AICC, GCV, Cp and FPE criteria in the process of selecting the parameters.

  12. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    Science.gov (United States)

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library.
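
    For orientation, the unsmoothed Good-Turing estimates at the heart of this comparison can be computed directly from frequency-of-frequency counts; the toy sample below stands in for EST data, and the smoothing step discussed in the paper is omitted:

```python
from collections import Counter

def good_turing_mass(sample):
    """Good-Turing estimates of the probability mass of frequency classes.

    Returns a dict: for each frequency r, the estimated total probability
    that the *next* observation belongs to a species currently seen r times.
    r = 0 is the celebrated missing-mass (discovery) estimate n1/n.
    """
    n = len(sample)
    freq_of_freq = Counter(Counter(sample).values())   # r -> number of species seen r times
    masses = {0: freq_of_freq.get(1, 0) / n}
    for r in sorted(freq_of_freq):
        masses[r] = (r + 1) * freq_of_freq.get(r + 1, 0) / n
    return masses

tags = list("aaabbcddddefg")          # toy EST-like sample
print(good_turing_mass(tags)[0])      # chance the next read is a brand-new species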

  13. Estimation of Stochastic Volatility Models by Nonparametric Filtering

    DEFF Research Database (Denmark)

    Kanaya, Shin; Kristensen, Dennis

    2016-01-01

    A two-step estimation method of stochastic volatility models is proposed: In the first step, we nonparametrically estimate the (unobserved) instantaneous volatility process. In the second step, standard estimation methods for fully observed diffusion processes are employed, but with the filtered/estimated volatility process replacing the latent process. Our estimation strategy is applicable to both parametric and nonparametric stochastic volatility models, and can handle both jumps and market microstructure noise. The resulting estimators of the stochastic volatility model will carry additional biases and variances due to the first-step estimation, but under regularity conditions we show that these vanish asymptotically and our estimators inherit the asymptotic properties of the infeasible estimators based on observations of the volatility process. A simulation study examines the finite-sample properties...

  14. Nonparametric estimation of a convex bathtub-shaped hazard function.

    Science.gov (United States)

    Jankowski, Hanna K; Wellner, Jon A

    2009-11-01

    In this paper, we study the nonparametric maximum likelihood estimator (MLE) of a convex hazard function. We show that the MLE is consistent and converges at a local rate of n^{2/5} at points x_0 where the true hazard function is positive and strictly convex. Moreover, we establish the pointwise asymptotic distribution theory of our estimator under these same assumptions. One notable feature of the nonparametric MLE studied here is that no arbitrary choice of tuning parameter (or complicated data-adaptive selection of the tuning parameter) is required.

  15. Nonparametric estimation of quantum states, processes and measurements

    Science.gov (United States)

    Lougovski, Pavel; Bennink, Ryan

    Quantum state, process, and measurement estimation methods traditionally use parametric models, in which the number and role of relevant parameters is assumed to be known. When such an assumption cannot be justified, a common approach in many disciplines is to fit the experimental data to multiple models with different sets of parameters and utilize an information criterion to select the best fitting model. However, it is not always possible to assume a model with a finite (countable) number of parameters. This typically happens when there are unobserved variables that stem from hidden correlations that can only be unveiled after collecting experimental data. How does one perform quantum characterization in this situation? We present a novel nonparametric method of experimental quantum system characterization based on the Dirichlet Process (DP) that addresses this problem. Using DP as a prior in conjunction with Bayesian estimation methods allows us to increase model complexity (number of parameters) adaptively as the number of experimental observations grows. We illustrate our approach for the one-qubit case and show how a probability density function for an unknown quantum process can be estimated.

  16. Nonparametric Estimation of Mean and Variance and Pricing of Securities

    Directory of Open Access Journals (Sweden)

    Akhtar R. Siddique

    2000-03-01

    Full Text Available This paper develops a filtering-based framework of non-parametric estimation of parameters of a diffusion process from the conditional moments of discrete observations of the process. This method is implemented for interest rate data in the Eurodollar and long term bond markets. The resulting estimates are then used to form non-parametric univariate and bivariate interest rate models and compute prices for the short term Eurodollar interest rate futures options and long term discount bonds. The bivariate model produces prices substantially closer to the market prices.

  17. Nonparametric Regression Estimation for Multivariate Null Recurrent Processes

    Directory of Open Access Journals (Sweden)

    Biqing Cai

    2015-04-01

    Full Text Available This paper discusses nonparametric kernel regression with the regressor being a d-dimensional β-null recurrent process in the presence of conditional heteroscedasticity. We show that the mean function estimator is consistent with convergence rate √(n(T)h^d), where n(T) is the number of regenerations for a β-null recurrent process, and that the limiting distribution (with proper normalization) is normal. Furthermore, we show that the two-step estimator for the volatility function is consistent. The finite sample performance of the estimate is quite reasonable when the leave-one-out cross validation method is used for bandwidth selection. We apply the proposed method to study the relationship of the Federal funds rate with the 3-month and 5-year T-bill rates and discover the existence of nonlinearity in the relationship. Furthermore, the in-sample and out-of-sample performance of the nonparametric model is far better than that of the linear model.
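
    A minimal stationary-case sketch of the kernel regression estimator and the leave-one-out cross-validation bandwidth rule the abstract mentions (synthetic data; the β-null recurrent asymptotics are what the paper adds on top of this standard machinery):

```python
import numpy as np

def nw_fit(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

def loo_cv_bandwidth(x, y, grid):
    """Pick h minimizing leave-one-out squared prediction error."""
    d2 = (x[:, None] - x[None, :]) ** 2
    best_h, best_err = None, np.inf
    for h in grid:
        w = np.exp(-0.5 * d2 / h ** 2)
        np.fill_diagonal(w, 0.0)                  # leave one out
        pred = (w * y).sum(axis=1) / w.sum(axis=1)
        err = np.mean((y - pred) ** 2)
        if err < best_err:
            best_h, best_err = h, err
    return best_h

rng = np.random.default_rng(5)
x = rng.uniform(-3, 3, 400)
y = np.sin(x) + 0.3 * rng.normal(size=400)
h = loo_cv_bandwidth(x, y, np.linspace(0.05, 1.0, 40))
print(h, nw_fit(x, y, np.array([0.0, 1.5]), h))   # ~ sin(0)=0 and sin(1.5)~1.0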

  18. Wavelet Estimators in Nonparametric Regression: A Comparative Simulation Study

    Directory of Open Access Journals (Sweden)

    Anestis Antoniadis

    2001-06-01

    Full Text Available Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature of wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study on a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints of how these estimators should be used to analyse real data sets, a detailed practical step-by-step illustration of a wavelet denoising analysis on electrical consumption is provided. Matlab codes are provided so that all figures and tables in this paper can be reproduced.
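
    As one concrete member of the family of estimators reviewed, here is a hedged Python sketch (the paper itself provides Matlab code) of Donoho-Johnstone soft thresholding with the universal (VisuShrink) threshold, assuming the PyWavelets package; the wavelet choice and decomposition level are illustrative:

```python
import numpy as np
import pywt   # assumes the PyWavelets package is available

def visushrink(signal, wavelet="sym8", level=5):
    """Soft-threshold wavelet denoising with the universal threshold."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale from finest detail
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 2048)
clean = np.sin(4 * np.pi * t) + (t > 0.5)            # smooth part plus a jump
noisy = clean + 0.2 * rng.normal(size=t.size)
print(np.mean((visushrink(noisy) - clean) ** 2))     # well below the 0.04 noise variance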

  19. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Full Text Available Abstract Genomic selection refers to the use of genomewide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to successfully cope with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the preceding generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A determination of predictor-specific degrees of smoothing increased the accuracy.

  20. A non-parametric framework for estimating threshold limit values

    Directory of Open Access Journals (Sweden)

    Ulm Kurt

    2005-11-01

    Full Text Available Abstract Background To estimate a threshold limit value for a compound known to have harmful health effects, an 'elbow' threshold model is usually applied. We are interested in flexible non-parametric alternatives. Methods We describe how a step function model fitted by isotonic regression can be used to estimate threshold limit values. This method returns a set of candidate locations, and we discuss two algorithms to select the threshold among them: the reduced isotonic regression and an algorithm considering the closed family of hypotheses. We assess the performance of these two alternative approaches under different scenarios in a simulation study. We illustrate the framework by analysing data from a study conducted by the German Research Foundation aiming to set a threshold limit value for exposure to total dust at the workplace, as a causal agent for developing chronic bronchitis. Results In the paper we demonstrate the use and the properties of the proposed methodology along with the results from an application. The method appears to detect the threshold with satisfactory success. However, its performance can be compromised by the low power to reject the constant risk assumption when the true dose-response relationship is weak. Conclusion The estimation of thresholds based on the isotonic framework is conceptually simple and sufficiently powerful. Given that in the threshold value estimation context there is no gold standard method, the proposed model provides a useful non-parametric alternative to the standard approaches and can corroborate or challenge their findings.
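
    A minimal sketch of the first stage of this framework, assuming scikit-learn's isotonic regression: the step-function fit yields the candidate threshold locations, among which the reduced isotonic regression or closed-testing algorithms described above would then select (the data are simulated with a true threshold at dose 4):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical data: exposure dose vs. binary disease indicator.
rng = np.random.default_rng(7)
dose = np.sort(rng.uniform(0, 10, 500))
risk = 0.1 + 0.25 * (dose > 4.0)              # true threshold at dose = 4
y = rng.binomial(1, risk)

iso = IsotonicRegression(increasing=True).fit(dose, y)
fit = iso.predict(dose)
# The isotonic (step-function) fit jumps at a set of candidate thresholds;
# selecting among them is the second stage described in the abstract.
candidates = dose[1:][np.diff(fit) > 0]
print(np.unique(np.round(candidates, 2)))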

  1. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  2. Varying kernel density estimation on ℝ+

    Science.gov (United States)

    Mnatsakanov, Robert; Sarkisian, Khachatur

    2015-01-01

    In this article a new nonparametric density estimator based on the sequence of asymmetric kernels is proposed. This method is natural when estimating an unknown density function of a positive random variable. The rates of Mean Squared Error, Mean Integrated Squared Error, and the L1-consistency are investigated. Simulation studies are conducted to compare a new estimator and its modified version with traditional kernel density construction. PMID:26740729
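
    The article proposes its own varying-kernel construction; as a hedged illustration of the general asymmetric-kernel idea on the positive half-line, the sketch below uses Chen-style gamma kernels, whose shape adapts to the evaluation point and which avoid boundary bias at zero (the bandwidth b and the exponential test data are arbitrary choices):

```python
import numpy as np
from scipy.stats import gamma

def gamma_kde(x_eval, data, b=0.2):
    """Asymmetric (gamma) kernel density estimate on the positive half-line.

    Chen-style construction: at each evaluation point x the kernel is a
    gamma density with shape x/b + 1 and scale b, evaluated at the data.
    """
    x_eval = np.atleast_1d(x_eval)
    return np.array([gamma.pdf(data, a=x / b + 1, scale=b).mean() for x in x_eval])

rng = np.random.default_rng(8)
data = rng.exponential(1.0, 1000)
print(gamma_kde(np.array([0.01]), data))   # near exp(-0.01) ~ 1, without boundary bias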

  3. Nonparametric Estimation of Cumulative Incidence Functions for Competing Risks Data with Missing Cause of Failure

    DEFF Research Database (Denmark)

    Effraimidis, Georgios; Dahl, Christian Møller

    In this paper, we develop a fully nonparametric approach for the estimation of the cumulative incidence function with Missing At Random right-censored competing risks data. We obtain results on the pointwise asymptotic normality as well as the uniform convergence rate of the proposed nonparametric...... estimator. A simulation study that serves two purposes is provided. First, it illustrates in details how to implement our proposed nonparametric estimator. Secondly, it facilitates a comparison of the nonparametric estimator to a parametric counterpart based on the estimator of Lu and Liang (2008...

  4. Nonparametric estimation of stochastic differential equations with sparse Gaussian processes

    Science.gov (United States)

    García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.

    2017-08-01

    The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working directly in a function-space view and thus the inference takes place directly in this space. To cope with the computational complexity that requires the use of Gaussian processes, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudosamples. The proposed method has been validated using both simulated data and real data from economy and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.

  5. Non-parametric Estimation approach in statistical investigation of nuclear spectra

    CERN Document Server

    Jafarizadeh, M A; Sabri, H; Maleki, B Rashidian

    2011-01-01

    In this paper, Kernel Density Estimation (KDE) as a non-parametric estimation method is used to investigate statistical properties of nuclear spectra. The deviation to regular or chaotic dynamics, is exhibited by closer distances to Poisson or Wigner limits respectively which evaluated by Kullback-Leibler Divergence (KLD) measure. Spectral statistics of different sequences prepared by nuclei corresponds to three dynamical symmetry limits of Interaction Boson Model(IBM), oblate and prolate nuclei and also the pairing effect on nuclear level statistics are analyzed (with pure experimental data). KD-based estimated density function, confirm previous predictions with minimum uncertainty (evaluated with Integrate Absolute Error (IAE)) in compare to Maximum Likelihood (ML)-based method. Also, the increasing of regularity degrees of spectra due to pairing effect is reveal.

  6. Pivotal Estimation of Nonparametric Functions via Square-root Lasso

    CERN Document Server

    Belloni, Alexandre; Wang, Lie

    2011-01-01

    In a nonparametric linear regression model we study a variant of LASSO, called square-root LASSO, which does not require knowledge of the scaling parameter σ of the noise or bounds for it. This work derives new finite sample upper bounds for the prediction norm rate of convergence, the ℓ1-rate of convergence, the ℓ∞-rate of convergence, and the sparsity of the square-root LASSO estimator. A lower bound for the prediction norm rate of convergence is also established. In many non-Gaussian noise cases, we rely on moderate deviation theory for self-normalized sums and on new data-dependent empirical process inequalities to achieve Gaussian-like results provided log p = o(n^{1/3}), improving upon results derived in the parametric case that required log p = O(log n). In addition, we derive finite sample bounds on the performance of ordinary least squares (OLS) applied to the model selected by square-root LASSO, accounting for possible misspecification of the selected model. In particular, we provide mild con...
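
    A small numerical sketch of the estimator, assuming the cvxpy package; the penalty level is the pivotal choice that needs no estimate of σ, with the constant 1.1 and α = 0.05 as conventional, not prescribed, values:

```python
import numpy as np
import cvxpy as cp               # assumes the cvxpy package is installed
from scipy.stats import norm

rng = np.random.default_rng(11)
n, p, s = 100, 200, 5
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:s] = 1.0
y = X @ beta_true + 0.5 * rng.normal(size=n)   # noise scale 0.5 is never used below

# Pivotal penalty level: independent of the unknown noise scale sigma.
lam = 1.1 * np.sqrt(n) * norm.ppf(1 - 0.05 / (2 * p))

b = cp.Variable(p)
obj = cp.Minimize(cp.norm(y - X @ b, 2) / np.sqrt(n) + lam / n * cp.norm1(b))
cp.Problem(obj).solve()
print(np.flatnonzero(np.abs(b.value) > 1e-3))  # should roughly recover indices 0..4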

  7. Strong Convergence of Partitioning Estimation for Nonparametric Regression Function under Dependence Samples

    Institute of Scientific and Technical Information of China (English)

    LING Neng-xiang; DU Xue-qiao

    2005-01-01

    In this paper, we study the strong consistency of the partitioning estimate of a regression function under samples that are φ-mixing sequences with identical distribution. Key words: nonparametric regression function; partitioning estimation; strong convergence; φ-mixing sequences.

  8. Estimation of Spatial Dynamic Nonparametric Durbin Models with Fixed Effects

    Science.gov (United States)

    Qian, Minghui; Hu, Ridong; Chen, Jianwei

    2016-01-01

    Spatial panel data models have been widely studied and applied in both scientific and social science disciplines, especially in the analysis of spatial influence. In this paper, we consider the spatial dynamic nonparametric Durbin model (SDNDM) with fixed effects, which takes nonlinear factors into account based on the spatial dynamic panel…

  9. A Modified Nonparametric Message Passing Algorithm for Soft Iterative Channel Estimation

    Directory of Open Access Journals (Sweden)

    Linlin Duan

    2013-08-01

    Full Text Available Based on the factor graph framework, we derive a Modified Nonparametric Message Passing Algorithm (MNMPA) for soft iterative channel estimation in a Low Density Parity-Check (LDPC) coded Bit-Interleaved Coded Modulation (BICM) system. The algorithm combines ideas from Particle Filtering (PF) with popular factor graph techniques. A Markov Chain Monte Carlo (MCMC) move step is added after typical Sequential Importance Sampling (SIS) resampling to prevent particle impoverishment and to improve channel estimation precision. To reduce complexity, a new max-sum rule for updating particle-based messages is reformulated and two proper update schedules are designed. Simulation results illustrate the effectiveness of MNMPA and its comparison with other sum-product algorithms in Gaussian and non-Gaussian noise environments. We also studied the effect of the particle number, pilot symbol spacing and different schedules on BER performance.

  10. Bayesian nonparametric estimation and consistency of mixed multinomial logit choice models

    CERN Document Server

    De Blasi, Pierpaolo; Lau, John W. (doi: 10.3150/09-BEJ233)

    2011-01-01

    This paper develops nonparametric estimation for discrete choice models based on the mixed multinomial logit (MMNL) model. It has been shown that MMNL models encompass all discrete choice models derived under the assumption of random utility maximization, subject to the identification of an unknown distribution G. Noting the mixture model description of the MMNL, we employ a Bayesian nonparametric approach, using nonparametric priors on the unknown mixing distribution G, to estimate choice probabilities. We provide an important theoretical support for the use of the proposed methodology by investigating consistency of the posterior distribution for a general nonparametric prior on the mixing distribution. Consistency is defined according to an L1-type distance on the space of choice probabilities and is achieved by extending to a regression model framework a recent approach to strong consistency based on the summability of square roots of prior probabilities. Moving to estimation, slightly different te...

  11. Parametrically guided estimation in nonparametric varying coefficient models with quasi-likelihood.

    Science.gov (United States)

    Davenport, Clemontina A; Maity, Arnab; Wu, Yichao

    2015-04-01

    Varying coefficient models allow us to generalize standard linear regression models to incorporate complex covariate effects by modeling the regression coefficients as functions of another covariate. For nonparametric varying coefficients, we can borrow the idea of parametrically guided estimation to improve asymptotic bias. In this paper, we develop a guided estimation procedure for the nonparametric varying coefficient models. Asymptotic properties are established for the guided estimators and a method of bandwidth selection via bias-variance tradeoff is proposed. We compare the performance of the guided estimator with that of the unguided estimator via both simulation and real data examples.

  12. Panel data nonparametric estimation of production risk and risk preferences

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We apply nonparametric panel data kernel regression to investigate production risk, output price uncertainty, and risk attitudes of Polish dairy farms based on a firm-level unbalanced panel data set that covers the period 2004–2010. We compare different model specifications and different approaches for obtaining firm-specific measures of risk attitudes. We found that Polish dairy farmers are risk averse regarding production risk and price uncertainty. According to our results, Polish dairy farmers perceive the production risk as being more significant than the risk related to output price

  13. Non-Parametric Inference in Astrophysics

    CERN Document Server

    Wasserman, Larry; Miller, Christopher J.; Nichol, Robert C.; Genovese, Chris; Jang, Woncheol; Connolly, Andrew J.; Moore, Andrew W.; Schneider, Jeff; the PICA group

    2001-01-01

    We discuss non-parametric density estimation and regression for astrophysics problems. In particular, we show how to compute non-parametric confidence intervals for the location and size of peaks of a function. We illustrate these ideas with recent data on the Cosmic Microwave Background. We also briefly discuss non-parametric Bayesian inference.

  14. Prior processes and their applications nonparametric Bayesian estimation

    CERN Document Server

    Phadia, Eswar G

    2016-01-01

    This book presents a systematic and comprehensive treatment of various prior processes that have been developed over the past four decades for dealing with Bayesian approach to solving selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in this area. After an overview of different prior processes, it examines the now pre-eminent Dirichlet process and its variants including hierarchical processes, then addresses new processes such as dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral to right type processes, including gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove to be very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and P...

  15. Determining the Mass of Kepler-78b with Nonparametric Gaussian Process Estimation

    Science.gov (United States)

    Grunblatt, Samuel Kai; Howard, Andrew; Haywood, Raphaëlle

    2016-01-01

    Kepler-78b is a transiting planet that is 1.2 times the radius of Earth and orbits a young, active K dwarf every 8 hr. The mass of Kepler-78b has been independently reported by two teams based on radial velocity (RV) measurements using the HIRES and HARPS-N spectrographs. Due to the active nature of the host star, a stellar activity model is required to distinguish and isolate the planetary signal in RV data. Whereas previous studies tested parametric stellar activity models, we modeled this system using nonparametric Gaussian process (GP) regression. We produced a GP regression of relevant Kepler photometry. We then use the posterior parameter distribution for our photometric fit as a prior for our simultaneous GP + Keplerian orbit models of the RV data sets. We tested three simple kernel functions for our GP regressions. Based on a Bayesian likelihood analysis, we selected a quasi-periodic kernel model with GP hyperparameters coupled between the two RV data sets, giving a Doppler amplitude of 1.86 ± 0.25 m s⁻¹ and supporting our belief that the correlated noise we are modeling is astrophysical. The corresponding mass of 1.87 (+0.27/-0.26) Earth masses is consistent with that measured in previous studies, and more robust due to our nonparametric signal estimation. Based on our mass and the radius measurement from transit photometry, Kepler-78b has a bulk density of 6.0 (+1.9/-1.4) g cm⁻³. We estimate that Kepler-78b is 32% ± 26% iron using a two-component rock-iron model. This is consistent with an Earth-like composition, with uncertainty spanning Moon-like to Mercury-like compositions.
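
    As a loose illustration of this modeling style (not the authors' actual pipeline), a quasi-periodic Gaussian process can be built in scikit-learn as the product of a periodic kernel and a squared-exponential kernel; the toy signal, period, and hyperparameter values below are hypothetical:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0.0, 30.0, 60))[:, None]     # observation times (days)
      rv = (3.0 * np.sin(2 * np.pi * t[:, 0] / 12.5)       # toy "activity" RV signal
            * np.exp(-((t[:, 0] - 15.0) / 20.0) ** 2)
            + 0.5 * rng.normal(size=60))

      # Quasi-periodic kernel: a periodic component whose shape is allowed to
      # evolve slowly, written as periodic x squared-exponential, plus white noise.
      kernel = (1.0 * ExpSineSquared(length_scale=1.0, periodicity=12.5)
                * RBF(length_scale=20.0)
                + WhiteKernel(noise_level=0.25))
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, rv)
      mean, std = gp.predict(np.linspace(0.0, 30.0, 200)[:, None], return_std=True)
      print(gp.kernel_)                                     # fitted hyperparameters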

  16. Nonparametric estimates of drift and diffusion profiles via Fokker-Planck algebra.

    Science.gov (United States)

    Lund, Steven P; Hubbard, Joseph B; Halter, Michael

    2014-11-06

    Diffusion processes superimposed upon deterministic motion play a key role in understanding and controlling the transport of matter, energy, momentum, and even information in physics, chemistry, material science, biology, and communications technology. Given functions defining these random and deterministic components, the Fokker-Planck (FP) equation is often used to model these diffusive systems. Many methods exist for estimating the drift and diffusion profiles from one or more identifiable diffusive trajectories; however, when many identical entities diffuse simultaneously, it may not be possible to identify individual trajectories. Here we present a method capable of simultaneously providing nonparametric estimates for both drift and diffusion profiles from evolving density profiles, requiring only the validity of Langevin/FP dynamics. This algebraic FP manipulation provides a flexible and robust framework for estimating stationary drift and diffusion coefficient profiles, is not based on fluctuation theory or solved diffusion equations, and may facilitate predictions for many experimental systems. We illustrate this approach on experimental data obtained from a model lipid bilayer system exhibiting free diffusion and electric field induced drift. The wide range over which this approach provides accurate estimates for drift and diffusion profiles is demonstrated through simulation.

  17. Stahel-Donoho kernel estimation for fixed design nonparametric regression models

    Institute of Scientific and Technical Information of China (English)

    LIN; Lu

    2006-01-01

    This paper reports a robust kernel estimation for fixed design nonparametric regression models. A Stahel-Donoho kernel estimation is introduced, in which the weight functions depend on both the depths of the data and the distances between the design points and the estimation points. Based on a local approximation, a computational technique is given to approximate the incomputable depths of the errors. As a result, the new estimator is computationally efficient. The proposed estimator attains a high breakdown point and has good asymptotic behavior, such as asymptotic normality and convergence in mean squared error. Unlike the depth-weighted estimator for parametric regression models, this depth-weighted nonparametric estimator has a simple variance structure, so its efficiency can be compared with that of the original estimator. Simulations show that the new method can smooth the regression estimate and achieve a desirable balance between robustness and efficiency.

  18. Nonparametric Least Squares Estimation of a Multivariate Convex Regression Function

    CERN Document Server

    Seijo, Emilio

    2010-01-01

    This paper deals with the consistency of the least squares estimator of a convex regression function when the predictor is multidimensional. We characterize and discuss the computation of such an estimator via the solution of certain quadratic and linear programs. Mild sufficient conditions for the consistency of this estimator and its subdifferentials in fixed and stochastic design regression settings are provided. We also consider a regression function which is known to be convex and componentwise nonincreasing and discuss the characterization, computation and consistency of its least squares estimator.
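
    For intuition, a one-dimensional version of the convex least squares estimator can be written as a small quadratic program; this sketch (hypothetical data, SLSQP standing in for a dedicated QP solver, and only the univariate case of the paper's multivariate setting) enforces convexity through nondecreasing fitted slopes:

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      x = np.sort(rng.uniform(-2.0, 2.0, 40))
      y = x ** 2 + 0.3 * rng.normal(size=x.size)       # convex truth plus noise

      def slopes(theta):
          """Slopes of the piecewise-linear interpolant of (x, theta)."""
          return np.diff(theta) / np.diff(x)

      # Least squares fit subject to convexity: slopes must be nondecreasing,
      # i.e. every second divided difference is nonnegative.
      res = minimize(
          fun=lambda th: 0.5 * np.sum((th - y) ** 2),
          x0=y.copy(),
          jac=lambda th: th - y,
          constraints=[{"type": "ineq", "fun": lambda th: np.diff(slopes(th))}],
          method="SLSQP",
      )
      theta_hat = res.x                                # convex fit at the design points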

  19. A Nonparametric Approach to Estimate Classification Accuracy and Consistency

    Science.gov (United States)

    Lathrop, Quinn N.; Cheng, Ying

    2014-01-01

    When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…

  20. Nonparametric estimation of the stationary M/G/1 workload distribution function

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted

    2005-01-01

    In this paper it is demonstrated how a nonparametric estimator of the stationary workload distribution function of the M/G/1-queue can be obtained by systematically sampling the workload process. Weak convergence results and bootstrap methods for empirical distribution functions for stationary associ...
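
    To make the sampling scheme concrete, the sketch below (a simulation illustration, not the paper's theory) generates an M/G/1 workload path, samples it at equally spaced epochs, and forms the empirical distribution function; the arrival rate, service law, and step size are all hypothetical:

      import numpy as np

      rng = np.random.default_rng(12)
      lam, T, delta = 0.5, 10_000.0, 1.0      # arrival rate, horizon, sampling step
      arrivals = np.cumsum(rng.exponential(1.0 / lam, int(2 * lam * T)))
      arrivals = arrivals[arrivals < T]
      services = rng.lognormal(mean=-0.5, sigma=0.7, size=arrivals.size)  # "G" part

      # Workload V(t): jumps by the service requirement at each arrival and
      # drains at unit rate in between. Systematic sampling records V at k*delta.
      samples, v, t_prev, j = [], 0.0, 0.0, 0
      for t in np.arange(delta, T, delta):
          while j < arrivals.size and arrivals[j] <= t:
              v = max(v - (arrivals[j] - t_prev), 0.0) + services[j]
              t_prev = arrivals[j]
              j += 1
          v = max(v - (t - t_prev), 0.0)
          t_prev = t
          samples.append(v)
      samples = np.array(samples)

      workload_cdf = lambda w: np.mean(samples <= w)   # empirical d.f. of the workload
      print(np.mean(samples == 0.0))                   # roughly 1 - rho when stable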

  1. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  2. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  3. Nonparametric estimation in an "illness-death" model when all transition times are interval censored

    DEFF Research Database (Denmark)

    Frydman, Halina; Gerds, Thomas; Grøn, Randi

    2013-01-01

    We develop nonparametric maximum likelihood estimation for the parameters of an irreversible Markov chain on states {0,1,2} from the observations with interval censored times of 0 → 1, 0 → 2 and 1 → 2 transitions. The distinguishing aspect of the data is that, in addition to all transition times ...

  4. A Hybrid Index for Characterizing Drought Based on a Nonparametric Kernel Estimator

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Shengzhi; Huang, Qiang; Leng, Guoyong; Chang, Jianxia

    2016-06-01

    This study develops a nonparametric multivariate drought index, namely, the Nonparametric Multivariate Standardized Drought Index (NMSDI), by considering the variations of both precipitation and streamflow. Building upon previous efforts in constructing multivariate drought indices, we use the nonparametric kernel estimator to derive the joint distribution of precipitation and streamflow, thus providing additional insights for drought index development. The proposed NMSDI is applied in the Wei River Basin (WRB), based on which the drought evolution characteristics are investigated. Results indicate: (1) generally, NMSDI captures the drought onset similar to the Standardized Precipitation Index (SPI) and drought termination and persistence similar to the Standardized Streamflow Index (SSFI). The drought events identified by NMSDI match well with historical drought records in the WRB. The performances are also consistent with those of an existing Multivariate Standardized Drought Index (MSDI) at various timescales, confirming the validity of the newly constructed NMSDI in drought detection; (2) an increasing risk of drought has been detected over the past decades and will persist to a certain extent in the future in most areas of the WRB; (3) the identified change points of annual NMSDI are mainly concentrated in the early 1970s and middle 1990s, coincident with extensive water use and soil conservation practices. This study highlights the nonparametric multivariate drought index, which can be used for drought detection and prediction efficiently and comprehensively.
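
    A minimal sketch of the kernel step (illustrative only; the index construction below follows the generic MSDI recipe of pushing the joint non-exceedance probability through the standard normal quantile function, and all data are synthetic) could look like:

      import numpy as np
      from scipy.stats import gaussian_kde, norm

      rng = np.random.default_rng(3)
      n = 240                                   # e.g. 20 years of monthly values
      precip = rng.gamma(shape=2.0, scale=30.0, size=n)
      flow = 0.5 * precip + rng.gamma(shape=2.0, scale=10.0, size=n)

      # Kernel estimate of the joint density of precipitation and streamflow.
      kde = gaussian_kde(np.vstack([precip, flow]))

      # Joint non-exceedance probability P(precip <= p, flow <= q), mapped
      # through the standard normal quantile function.
      lo = [-1e3, -1e3]                         # effectively -infinity for this KDE
      p_joint = np.array([kde.integrate_box(lo, [p, q])
                          for p, q in zip(precip, flow)])
      nmsdi = norm.ppf(np.clip(p_joint, 1e-6, 1 - 1e-6))  # negative values: drought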

  5. Maximum likelihood estimation for semiparametric density ratio model.

    Science.gov (United States)

    Diao, Guoqing; Ning, Jing; Qin, Jing

    2012-06-27

    In the statistical literature, the conditional density model specification is commonly used to study regression effects. One attractive model is the semiparametric density ratio model, under which the conditional density function is the product of an unknown baseline density function and a known parametric function containing the covariate information. This model has a natural connection with generalized linear models and is closely related to biased sampling problems. Despite the attractive features and importance of this model, most existing methods are too restrictive since they are based on multi-sample data or conditional likelihood functions. The conditional likelihood approach can eliminate the unknown baseline density but cannot estimate it. We propose efficient estimation procedures based on the nonparametric likelihood. The nonparametric likelihood approach allows for general forms of covariates and estimates the regression parameters and the baseline density simultaneously. Therefore, the nonparametric likelihood approach is more versatile than the conditional likelihood approach especially when estimation of the conditional mean or other quantities of the outcome is of interest. We show that the nonparametric maximum likelihood estimators are consistent, asymptotically normal, and asymptotically efficient. Simulation studies demonstrate that the proposed methods perform well in practical settings. A real example is used for illustration.

  6. Nonparametric methods for drought severity estimation at ungauged sites

    Science.gov (United States)

    Sadri, S.; Burn, D. H.

    2012-12-01

    The objective of frequency analysis is to estimate, for an extreme event such as drought severity or duration, the relationship between the event and the associated return period at a catchment. Neural networks and other artificial intelligence approaches to function estimation and regression analysis are relatively new techniques in engineering, providing an attractive alternative to traditional statistical models. There are, however, few applications of neural networks and support vector machines in the area of severity quantile estimation for drought frequency analysis. In this paper, we compare three methods for this task: multiple linear regression, radial basis function neural networks, and least squares support vector regression (LS-SVR). The area selected for this study includes 32 catchments in the Canadian Prairies. From each catchment, drought severities are extracted and fitted to a Pearson type III distribution, which provides the observed values. For each method-duration pair, we use a jackknife algorithm to produce estimated values at each site. The results from these three approaches are compared and analyzed, and it is found that LS-SVR provides the best quantile estimates and extrapolation capacity.

  7. Efficient robust nonparametric estimation in a semimartingale regression model

    CERN Document Server

    Konev, Victor

    2010-01-01

    The paper considers the problem of robustly estimating a periodic function in a continuous time regression model with dependent disturbances given by a general square integrable semimartingale with unknown distribution. An example of such a noise is a non-Gaussian Ornstein-Uhlenbeck process with a Lévy subordinator, which is used to model financial markets of Black-Scholes type with jumps. An adaptive model selection procedure based on weighted least squares estimates is proposed. Under general moment conditions on the noise distribution, sharp non-asymptotic oracle inequalities for the robust risks are derived and the robust efficiency of the model selection procedure is shown.

  8. A nonparametric approach to the estimation of diffusion processes, with an application to a short-term interest rate model

    NARCIS (Netherlands)

    Jiang, GJ; Knight, JL

    1997-01-01

    In this paper, we propose a nonparametric identification and estimation procedure for an Ito diffusion process based on discrete sampling observations. The nonparametric kernel estimator for the diffusion function developed in this paper deals with general Ito diffusion processes and avoids any

  9. Contingent kernel density estimation.

    Directory of Open Access Journals (Sweden)

    Scott Fortmann-Roe

    Full Text Available Kernel density estimation is a widely used method for estimating a distribution based on a sample of points drawn from that distribution. Generally, in practice some form of error contaminates the sample of observed points. Such error can be the result of imprecise measurements or observation bias. Often this error is negligible and may be disregarded in analysis. In cases where the error is non-negligible, estimation methods should be adjusted to reduce resulting bias. Several modifications of kernel density estimation have been developed to address specific forms of errors. One form of error that has not yet been addressed is the case where observations are nominally placed at the centers of areas from which the points are assumed to have been drawn, where these areas are of varying sizes. In this scenario, the bias arises because the size of the error can vary among points and some subset of points can be known to have smaller error than another subset or the form of the error may change among points. This paper proposes a "contingent kernel density estimation" technique to address this form of error. This new technique adjusts the standard kernel on a point-by-point basis in an adaptive response to changing structure and magnitude of error. In this paper, equations for our contingent kernel technique are derived, the technique is validated using numerical simulations, and an example using the geographic locations of social networking users is worked to demonstrate the utility of the method.
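
    The point-by-point bandwidth adjustment can be illustrated in a few lines; this sketch (our own toy construction, not the paper's exact estimator) inflates a Gaussian kernel's bandwidth by each observation's known positional error:

      import numpy as np

      def contingent_kde(grid, points, base_bw, extra_sd):
          """Gaussian KDE with a per-observation bandwidth.

          Each point's kernel is widened according to that point's known
          positional error (e.g. the size of the area whose center was
          reported), so imprecise points are smoothed more than precise ones."""
          bw = np.sqrt(base_bw ** 2 + np.asarray(extra_sd) ** 2)
          z = (np.asarray(grid)[:, None] - np.asarray(points)[None, :]) / bw[None, :]
          return (np.exp(-0.5 * z ** 2)
                  / (np.sqrt(2.0 * np.pi) * bw[None, :])).mean(axis=1)

      rng = np.random.default_rng(4)
      pts = rng.normal(0.0, 1.0, 300)
      err = rng.choice([0.05, 0.8], size=300)   # two different reporting resolutions
      dens = contingent_kde(np.linspace(-4, 4, 200), pts, base_bw=0.2, extra_sd=err)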

  10. Nonparametric Divergence Estimation with Applications to Machine Learning on Distributions

    CERN Document Server

    Poczos, Barnabas; Schneider, Jeff

    2012-01-01

    Low-dimensional embedding, manifold learning, clustering, classification, and anomaly detection are among the most important problems in machine learning. The existing methods usually consider the case when each instance has a fixed, finite-dimensional feature representation. Here we consider a different setting. We assume that each instance corresponds to a continuous probability distribution. These distributions are unknown, but we are given some i.i.d. samples from each distribution. Our goal is to estimate the distances between these distributions and use these distances to perform low-dimensional embedding, clustering/classification, or anomaly detection for the distributions. We present estimation algorithms, describe how to apply them for machine learning tasks on distributions, and show empirical results on synthetic data, real-world images, and astronomical data sets.
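
    As one concrete member of this family, a k-nearest-neighbor estimator of the Kullback-Leibler divergence between two sampled distributions can be written as follows (a standard kNN construction of the type studied here; the sample sizes and k are arbitrary):

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      def knn_kl_divergence(x, y, k=5):
          """k-NN estimate of KL(P||Q) from samples x ~ P and y ~ Q."""
          n, d = x.shape
          m = y.shape[0]
          # Distance from each x_i to its k-th neighbor among the other x's
          # (k+1 neighbors requested because the nearest one is x_i itself).
          rho = NearestNeighbors(n_neighbors=k + 1).fit(x).kneighbors(x)[0][:, k]
          # Distance from each x_i to its k-th neighbor among the y's.
          nu = NearestNeighbors(n_neighbors=k).fit(y).kneighbors(x)[0][:, k - 1]
          return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

      rng = np.random.default_rng(5)
      x = rng.normal(0.0, 1.0, size=(2000, 1))
      y = rng.normal(1.0, 1.0, size=(2000, 1))
      print(knn_kl_divergence(x, y))   # true KL between N(0,1) and N(1,1) is 0.5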

  11. Nonparametric spectral-based estimation of latent structures

    OpenAIRE

    Bonhomme, Stéphane; Jochmans, Koen; Robin, Jean-Marc

    2014-01-01

    We present a constructive identification proof of p-linear decompositions of q-way arrays. The analysis is based on the joint spectral decomposition of a set of matrices. It has applications in the analysis of a variety of latent-structure models, such as q-variate mixtures of p distributions. As such, our results provide a constructive alternative to Allman, Matias and Rhodes [2009]. The identification argument suggests a joint approximate-diagonalization estimator that is easy to implement ...

  12. MAP estimators and their consistency in Bayesian nonparametric inverse problems

    Science.gov (United States)

    Dashti, M.; Law, K. J. H.; Stuart, A. M.; Voss, J.

    2013-09-01

    We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics.

  13. MAP estimators and their consistency in Bayesian nonparametric inverse problems

    KAUST Repository

    Dashti, M.

    2013-09-01

    We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. © 2013 IOP Publishing Ltd.

  14. On the Choice of Difference Sequence in a Unified Framework for Variance Estimation in Nonparametric Regression

    KAUST Repository

    Dai, Wenlin

    2017-09-01

    Difference-based methods do not require estimating the mean function in nonparametric regression and are therefore popular in practice. In this paper, we propose a unified framework for variance estimation that systematically combines the linear regression method with higher-order difference estimators. The unified framework greatly enriches the existing literature on variance estimation and includes most existing estimators as special cases. More importantly, it also provides a principled way to solve the challenging difference-sequence selection problem, which has remained a controversial issue in nonparametric regression for several decades. Using both theory and simulations, we recommend using the ordinary difference sequence in the unified framework, regardless of whether the sample size is small or the signal-to-noise ratio is large. Finally, to meet practical demands, we have developed a unified R package, named VarED, that integrates the existing difference-based estimators and the unified estimators in nonparametric regression and have made it freely available at http://cran.r-project.org/web/packages/.
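
    The simplest member of this family, the ordinary first-order difference estimator, is a two-liner; the sketch below (toy data, first-order sequence only, not the paper's unified framework or the VarED package) shows why differencing sidesteps mean estimation:

      import numpy as np

      rng = np.random.default_rng(6)
      x = np.linspace(0.0, 1.0, 500)
      y = np.sin(4 * np.pi * x) + 0.2 * rng.normal(size=x.size)   # true sigma = 0.2

      # Successive differences of neighboring responses cancel the smooth mean
      # function, leaving (approximately) twice the noise variance.
      sigma2_hat = np.sum(np.diff(y) ** 2) / (2 * (y.size - 1))
      print(sigma2_hat)   # close to 0.04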

  15. Estimating Financial Risk Measures for Futures Positions: A Non-Parametric Approach

    OpenAIRE

    Cotter, John; dowd, kevin

    2011-01-01

    This paper presents non-parametric estimates of spectral risk measures applied to long and short positions in 5 prominent equity futures contracts. It also compares these to estimates of two popular alternative measures, the Value-at-Risk (VaR) and Expected Shortfall (ES). The spectral risk measures are conditioned on the coefficient of absolute risk aversion, and the latter two are conditioned on the confidence level. Our findings indicate that all risk measures increase dramatically and the...

  16. Non-Parametric Bayesian State Space Estimator for Negative Information

    Directory of Open Access Journals (Sweden)

    Guillaume de Chambrier

    2017-09-01

    Full Text Available Simultaneous Localization and Mapping (SLAM) is concerned with the development of filters to accurately and efficiently infer the state parameters (position, orientation, etc.) of an agent and aspects of its environment, commonly referred to as the map. A mapping system is necessary for the agent to achieve situatedness, which is a precondition for planning and reasoning. In this work, we consider an agent who is given the task of finding a set of objects. The agent has limited perception and can only sense the presence of objects if a direct contact is made; as a result, most of the sensing is negative information. In the absence of recurrent sightings or direct measurements of objects, there are no correlations from the measurement errors that can be exploited. This renders ineffective those SLAM estimators, such as EKF-SLAM, for which this fact is the backbone. In addition, for our setting, no assumptions are made with respect to the marginals (beliefs) of either the agent or the objects (map). Given the loose assumptions we stipulate regarding the marginals and measurements, we adopt a histogram parametrization. We introduce a Bayesian State Space Estimator (BSSE), which we name Measurement Likelihood Memory Filter (MLMF), in which the values of the joint distribution are not parametrized; instead, we directly apply changes from the measurement-integration step to the marginals. This is achieved by keeping track of the history of the likelihood functions' parameters. We demonstrate that the MLMF gives the same filtered marginals as a histogram filter and show two implementations: MLMF and scalable-MLMF, both of which have linear space complexity. The original MLMF retains an exponential time complexity (although an order of magnitude smaller than the histogram filter), while the scalable-MLMF introduces independence assumptions so as to have linear time complexity. We further quantitatively demonstrate the scalability of our algorithm with 25 beliefs having up to

  17. Airborne Crowd Density Estimation

    Science.gov (United States)

    Meynberg, O.; Kuschk, G.

    2013-10-01

    This paper proposes a new method for estimating human crowd densities from aerial imagery. Applications benefiting from an accurate crowd monitoring system are mainly found in the security sector. Normally, crowd density estimation is done through in-situ camera systems mounted at elevated locations, although this is not appropriate for very large crowds with thousands of people. Using airborne camera systems in these scenarios is a new research topic. Our method first filters the whole image space by suitable and fast interest point detection, resulting in a number of image regions that possibly contain human crowds. Validation of these candidates is done by transforming the corresponding image patches into a low-dimensional and discriminative feature space and classifying the results using a support vector machine (SVM). The feature space is spanned by texture features computed by applying a Gabor filter bank with varying scale and orientation to the image patches. For evaluation, we use 5 different image datasets acquired by the 3K+ aerial camera system of the German Aerospace Center during real mass events like concerts or football games. To evaluate the robustness and generality of our method, these datasets are taken from different flight heights between 800 m and 1500 m above ground (keeping a fixed focal length) and under varying daylight and shadow conditions. The results of our crowd density estimation are evaluated against a reference data set obtained by manually labeling tens of thousands of individual persons in the corresponding datasets, and they show that our method is able to estimate human crowd densities in challenging realistic scenarios.
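
    A skeletal version of the texture-feature stage (hypothetical patches and parameter values; the real system uses interest-point filtering and labeled aerial imagery) might pair a Gabor filter bank with an SVM like this, assuming scikit-image and scikit-learn are available:

      import numpy as np
      from skimage.filters import gabor
      from sklearn.svm import SVC

      def gabor_features(patch, frequencies=(0.1, 0.2, 0.4), n_orient=4):
          """Mean and variance of Gabor magnitude responses over scales/orientations."""
          feats = []
          for f in frequencies:
              for theta in np.arange(n_orient) * np.pi / n_orient:
                  real, imag = gabor(patch, frequency=f, theta=theta)
                  mag = np.hypot(real, imag)
                  feats += [mag.mean(), mag.var()]
          return np.array(feats)

      # Stand-in patches: "crowd" = high-frequency clutter, "background" = smooth ramp.
      rng = np.random.default_rng(7)
      crowd = [rng.random((32, 32)) for _ in range(20)]
      ramp = np.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
      background = [ramp + 0.05 * rng.random((32, 32)) for _ in range(20)]

      X = np.array([gabor_features(p) for p in crowd + background])
      labels = np.array([1] * 20 + [0] * 20)
      clf = SVC(kernel="rbf", gamma="scale").fit(X, labels)
      print(clf.score(X, labels))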

  1. Nonparametric variance estimation in the analysis of microarray data: a measurement error approach.

    Science.gov (United States)

    Carroll, Raymond J; Wang, Yuedong

    2008-01-01

    This article investigates the effects of measurement error on the estimation of nonparametric variance functions. We show that either ignoring measurement error or direct application of the simulation extrapolation, SIMEX, method leads to inconsistent estimators. Nevertheless, the direct SIMEX method can reduce bias relative to a naive estimator. We further propose a permutation SIMEX method which leads to consistent estimators in theory. The performance of both SIMEX methods depends on approximations to the exact extrapolants. Simulations show that both SIMEX methods perform better than ignoring measurement error. The methodology is illustrated using microarray data from colon cancer patients.

  2. Measuring the price responsiveness of gasoline demand: economic shape restrictions and nonparametric demand estimation

    OpenAIRE

    Blundell, Richard; Horowitz, Joel L.; Parey, Matthias

    2011-01-01

    This paper develops a new method for estimating a demand function and the welfare consequences of price changes. The method is applied to gasoline demand in the U.S. and is applicable to other goods. The method uses shape restrictions derived from economic theory to improve the precision of a nonparametric estimate of the demand function. Using data from the U.S. National Household Travel Survey, we show that the restrictions are consistent with the data on gasoline demand and remove the anom...

  3. Bayesian Nonparametric Mixture Estimation for Time-Indexed Functional Data in R

    Directory of Open Access Journals (Sweden)

    Terrance D. Savitsky

    2016-08-01

    Full Text Available We present growfunctions for R that offers Bayesian nonparametric estimation models for analysis of dependent, noisy time series data indexed by a collection of domains. This data structure arises from combining periodically published government survey statistics, such as are reported in the Current Population Study (CPS). The CPS publishes monthly, by-state estimates of employment levels, where each state expresses a noisy time series. Published state-level estimates from the CPS are composed from household survey responses in a model-free manner and express high levels of volatility due to insufficient sample sizes. Existing software solutions borrow information over a modeled time-based dependence to extract a de-noised time series for each domain. These solutions, however, ignore the dependence among the domains that may be additionally leveraged to improve estimation efficiency. The growfunctions package offers two fully nonparametric mixture models that simultaneously estimate both a time- and domain-indexed dependence structure for a collection of time series: (1) a Gaussian process (GP) construction, which is parameterized through the covariance matrix, estimates a latent function for each domain; the covariance parameters of the latent functions are indexed by domain under a Dirichlet process prior that permits estimation of the dependence among functions across the domains; (2) an intrinsic Gaussian Markov random field prior construction provides an alternative to the GP that expresses different computation and estimation properties. In addition to performing denoised estimation of latent functions from published domain estimates, growfunctions allows estimation of collections of functions for observation units (e.g., households), rather than aggregated domains, by accounting for an informative sampling design under which the probabilities for inclusion of observation units are related to the response variable. growfunctions includes plot

  4. Fault prediction of fighter based on nonparametric density estimation

    Institute of Scientific and Technical Information of China (English)

    Zhang Zhengdao; Hu Shousong

    2005-01-01

    Fighters and other complex engineering systems have characteristics such as difficult modeling and testing, multiple operating conditions, and high cost. Aiming at these issues, a new kind of real-time fault predictor is designed based on an improved k-nearest neighbor method, which requires neither a mathematical model of the system nor training data or prior knowledge. It can learn and predict while the system is running, thereby overcoming the difficulty of data acquisition. Moreover, the predictor is fast, and its false alarm and missed alarm rates can be adjusted freely. The method is simple and widely applicable. Simulation results on an F-16 fighter demonstrate its effectiveness.
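
    The paper's improved k-nearest-neighbor predictor is not spelled out in the abstract; as a baseline illustration of the idea, the distance to the k nearest previously seen samples can serve as an online fault score with a tunable alarm threshold (all data and parameters below are synthetic):

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      rng = np.random.default_rng(8)
      healthy = rng.normal(0.0, 1.0, size=(500, 3))    # past "healthy" sensor vectors
      model = NearestNeighbors(n_neighbors=5).fit(healthy)

      def fault_score(sample):
          """Mean distance to the k nearest stored samples; large = unfamiliar state."""
          dist, _ = model.kneighbors(sample[None, :])
          return dist.mean()

      # Threshold calibrated on further healthy data; moving it trades false
      # alarms against missed alarms, echoing the adjustable rates above.
      calib = np.array([fault_score(s) for s in rng.normal(0.0, 1.0, size=(200, 3))])
      threshold = np.quantile(calib, 0.99)
      print(fault_score(np.array([4.0, -4.0, 4.0])) > threshold)   # True -> alarm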

  5. A Critical Evaluation of the Nonparametric Approach to Estimate Terrestrial Evaporation

    Directory of Open Access Journals (Sweden)

    Yongmin Yang

    2016-01-01

    Full Text Available Evapotranspiration (ET) estimation has been one of the most challenging problems in recent decades for hydrometeorologists. In this study, a nonparametric approach to estimate terrestrial evaporation was evaluated using both model simulation and measurements from three sites. Both the model simulation and the in situ evaluation at the Tiger Bush Site revealed that this approach greatly overestimates ET under dry conditions (evaporative fraction smaller than 0.4). For the evaluation at the Tiger Bush Site, the difference between ET estimates and site observations could be as large as 130 W/m2. However, this approach provided good estimates over the two crop sites. The Nash-Sutcliffe coefficient (E) was 0.9 and 0.94 for WC06 and Yingke, respectively. A further theoretical analysis indicates that the nonparametric approach is very close to the equilibrium evaporation equation under wet conditions, which can explain the good performance of this approach at the two crop sites in this study. The evaluation indicates that this approach needs more careful appraisal and that its application in dry conditions should be avoided.

  6. Statistical inference based on the nonparametric maximum likelihood estimator under double-truncation.

    Science.gov (United States)

    Emura, Takeshi; Konno, Yoshihiko; Michimae, Hirofumi

    2015-07-01

    Doubly truncated data consist of samples whose observed values fall between the right- and left-truncation limits. With such samples, the distribution function of interest is estimated using the nonparametric maximum likelihood estimator (NPMLE), which is obtained through a self-consistency algorithm. Owing to the complicated asymptotic distribution of the NPMLE, the bootstrap method has been suggested for statistical inference. This paper proposes a closed-form estimator for the asymptotic covariance function of the NPMLE, which is a computationally attractive alternative to bootstrapping. Furthermore, we develop various statistical inference procedures, such as confidence intervals, goodness-of-fit tests, and confidence bands, to demonstrate the usefulness of the proposed covariance estimator. Simulations are performed to compare the proposed method with both the bootstrap and jackknife methods. The methods are illustrated using the childhood cancer dataset.

  7. Comparison between scaling law and nonparametric Bayesian estimate for the recurrence time of strong earthquakes

    Science.gov (United States)

    Rotondi, R.

    2009-04-01

    According to the unified scaling theory, the probability distribution function of the recurrence time T is a scaled version of a base function, and the average value of T can be used as a scale parameter for the distribution. The base function must belong to a scale family of distributions: tested on different catalogues and for different scale levels, the (truncated) generalized gamma distribution is the best model for Corral (2005), the Weibull distribution for German (2006). The scaling approach should overcome the difficulty of estimating distribution functions over small areas, but theoretical limitations and partial instability of the estimated distributions have been pointed out in the literature. Our aim is to analyze the recurrence time of strong earthquakes that occurred in the Italian territory. To satisfy the hypotheses of independence and identical distribution, we have evaluated the times between events that occurred in each area of the Database of Individual Seismogenic Sources and then gathered them into eight tectonically coherent regions, each dominated by a well-characterized geodynamic process. To deal with problems such as paucity of data, presence of outliers, and uncertainty in the choice of the functional expression for the distribution of T, we have followed a nonparametric approach (Rotondi (2009)) in which: (a) maximum flexibility is obtained by assuming that the probability distribution is a random function belonging to a large function space, distributed as a stochastic process; (b) the nonparametric estimation method is robust when the data contain outliers; (c) the Bayesian methodology allows exploiting different information sources, so that the model fitting may be good even for scarce samples. We have compared the hazard rates evaluated through the parametric and nonparametric approaches. References: Corral A. (2005). Mixing of rescaled data and Bayesian inference for earthquake recurrence times, Nonlin. Proces. Geophys., 12, 89

  8. A comparison of nonparametric estimators of survival under left-truncation and right-censoring motivated by a case study

    Directory of Open Access Journals (Sweden)

    Mauro Gasparini

    2013-05-01

    Full Text Available We present an application of nonparametric estimation of survival in the presence of left-truncated and right-censored data. We confirm the well-known unstable behavior of the survival estimates when the risk set is small and there are too few early deaths. However, in our real scenario, where necessarily only a few death times are available, the proper nonparametric maximum likelihood estimator, and its usual modification, behave less badly than alternative methods proposed in the literature. The relative merits of the different estimators are discussed in a simulation study extending the settings of the case study to more general scenarios.

  9. Application of the LSQR algorithm in non-parametric estimation of aerosol size distribution

    Science.gov (United States)

    He, Zhenzong; Qi, Hong; Lew, Zhongyuan; Ruan, Liming; Tan, Heping; Luo, Kun

    2016-05-01

    Based on the Least Squares QR decomposition (LSQR) algorithm, the aerosol size distribution (ASD) is retrieved in a non-parametric approach. The direct problem is solved by the Anomalous Diffraction Approximation (ADA) and the Lambert-Beer Law. An optimal wavelength selection method is developed to improve the retrieval accuracy of the ASD. The optimal wavelength set is selected so that the measurement signals are sensitive to wavelength and the ill-conditioning of the coefficient matrix of the linear system is reduced, which makes the retrieval results more robust to interference. Two common kinds of monomodal and bimodal ASDs, the log-normal (L-N) and Gamma distributions, are estimated. Numerical tests show that the LSQR algorithm can be successfully applied to retrieve the ASD with high stability in the presence of random noise and low susceptibility to the shape of the distribution. Finally, the ASD measured experimentally over Harbin, China, is recovered reasonably well. All the results confirm that the LSQR algorithm combined with the optimal wavelength selection method is an effective and reliable technique for non-parametric estimation of the ASD.
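
    A toy version of the retrieval step (with a hypothetical smooth kernel and noise level standing in for the ADA/Lambert-Beer forward model) shows how SciPy's LSQR solves the resulting ill-conditioned linear system, with damping for stability:

      import numpy as np
      from scipy.sparse.linalg import lsqr

      rng = np.random.default_rng(9)
      radii = np.linspace(0.1, 2.0, 40)             # particle radii (um)
      wavelengths = np.linspace(0.4, 1.0, 12)       # chosen measurement wavelengths (um)
      # Hypothetical smooth kernel relating size distribution to spectral extinction:
      K = np.array([[r ** 2 / (1.0 + (r / w) ** 2) for r in radii]
                    for w in wavelengths])

      n_true = np.exp(-0.5 * ((np.log(radii) - np.log(0.6)) / 0.4) ** 2)  # L-N-like ASD
      tau = K @ n_true + 1e-3 * rng.normal(size=wavelengths.size)         # noisy data

      # LSQR solves min ||K n - tau||^2 iteratively; `damp` adds Tikhonov-style
      # regularization that stabilizes the retrieval against noise.
      n_hat = lsqr(K, tau, damp=1e-3)[0]
      print(np.linalg.norm(n_hat - n_true) / np.linalg.norm(n_true))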

  10. A non-parametric approach to estimate the total deviation index for non-normal data.

    Science.gov (United States)

    Perez-Jaume, Sara; Carrasco, Josep L

    2015-11-10

    Concordance indices are used to assess the degree of agreement between different methods that measure the same characteristic. In this context, the total deviation index (TDI) is an unscaled concordance measure that quantifies to which extent the readings from the same subject obtained by different methods may differ with a certain probability. Common approaches to estimate the TDI assume data are normally distributed and linearity between response and effects (subjects, methods and random error). Here, we introduce a new non-parametric methodology for estimation and inference of the TDI that can deal with any kind of quantitative data. The present study introduces this non-parametric approach and compares it with the already established methods in two real case examples that represent situations of non-normal data (more specifically, skewed data and count data). The performance of the already established methodologies and our approach in these contexts is assessed by means of a simulation study. Copyright © 2015 John Wiley & Sons, Ltd.

  11. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Science.gov (United States)

    González, Adriana; Delouille, Véronique; Jacques, Laurent

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  12. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Directory of Open Access Journals (Sweden)

    González Adriana

    2016-01-01

    Full Text Available Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  13. Nonparametric estimation of the heterogeneity of a random medium using compound Poisson process modeling of wave multiple scattering

    Science.gov (United States)

    Le Bihan, Nicolas; Margerin, Ludovic

    2009-07-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.

  14. Nonparametric estimation of the heterogeneity of a random medium using Compound Poisson Process modeling of wave multiple scattering

    CERN Document Server

    Bihan, Nicolas Le

    2009-01-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using Compound Poisson Processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.

  15. A NEW DE-NOISING METHOD BASED ON 3-BAND WAVELET AND NONPARAMETRIC ADAPTIVE ESTIMATION

    Institute of Scientific and Technical Information of China (English)

    Li Li; Peng Yuhua; Yang Mingqiang; Xue Peijun

    2007-01-01

    Wavelet de-noising is well known as an important method of signal de-noising. Recently, most research efforts on wavelet de-noising have focused on how to select the threshold, where the Donoho method is widely applied. Compared with the traditional 2-band wavelet, the 3-band wavelet has advantages in many respects. Based on this theory, an adaptive signal de-noising method in the 3-band wavelet domain, built on nonparametric adaptive estimation, is proposed. The experimental results show that in the 3-band wavelet domain the proposed method preserves detail better than the Donoho method and improves the signal-to-noise ratio of the reconstructed signal.
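
    For reference, the classical 2-band baseline that this method is compared against (Donoho's universal soft threshold) can be reproduced with PyWavelets; the 3-band adaptive scheme itself is not available in standard libraries, so this sketch covers only the baseline, on a synthetic signal:

      import numpy as np
      import pywt

      rng = np.random.default_rng(10)
      t = np.linspace(0.0, 1.0, 1024)
      clean = (np.sin(6 * np.pi * t) * (t < 0.6)
               + np.sign(np.sin(20 * np.pi * t)) * (t >= 0.6))
      noisy = clean + 0.2 * rng.normal(size=t.size)

      # Donoho universal threshold with soft shrinkage on the detail coefficients.
      coeffs = pywt.wavedec(noisy, "db4", level=5)
      sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale, finest level
      thr = sigma * np.sqrt(2 * np.log(noisy.size))    # universal threshold
      denoised = pywt.waverec(
          [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]],
          "db4")
      snr = 10 * np.log10(np.sum(clean ** 2)
                          / np.sum((denoised[:t.size] - clean) ** 2))
      print(snr)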

  16. Nonparametric Estimates of Gene × Environment Interaction Using Local Structural Equation Modeling

    Science.gov (United States)

    Briley, Daniel A.; Harden, K. Paige; Bates, Timothy C.; Tucker-Drob, Elliot M.

    2017-01-01

    Gene × Environment (G×E) interaction studies test the hypothesis that the strength of genetic influence varies across environmental contexts. Existing latent variable methods for estimating G×E interactions in twin and family data specify parametric (typically linear) functions for the interaction effect. An improper functional form may obscure the underlying shape of the interaction effect and may lead to failures to detect a significant interaction. In this article, we introduce a novel approach to the behavior genetic toolkit, local structural equation modeling (LOSEM). LOSEM is a highly flexible nonparametric approach for estimating latent interaction effects across the range of a measured moderator. This approach opens up the ability to detect and visualize new forms of G×E interaction. We illustrate the approach by using LOSEM to estimate gene × socioeconomic status (SES) interactions for six cognitive phenotypes. Rather than continuously and monotonically varying effects as has been assumed in conventional parametric approaches, LOSEM indicated substantial nonlinear shifts in genetic variance for several phenotypes. The operating characteristics of LOSEM were interrogated through simulation studies where the functional form of the interaction effect was known. LOSEM provides a conservative estimate of G×E interaction with sufficient power to detect statistically significant G×E signal with moderate sample size. We offer recommendations for the application of LOSEM and provide scripts for implementing these biometric models in Mplus and in OpenMx under R. PMID:26318287

  17. Semi- and Nonparametric ARCH Processes

    Directory of Open Access Journals (Sweden)

    Oliver B. Linton

    2011-01-01

    Full Text Available ARCH/GARCH modelling has been successfully applied in empirical finance for many years. This paper surveys the semiparametric and nonparametric methods in univariate and multivariate ARCH/GARCH models. First, we introduce some specific semiparametric models and investigate the semiparametric and nonparametric estimation techniques applied to the error density, the functional form of the volatility function, the relationship between mean and variance, long memory processes, locally stationary processes, continuous time processes, and multivariate models. The second part of the paper covers the general properties of such processes, including stationarity, ergodicity, and mixing conditions. The last part covers the estimation methods in ARCH/GARCH processes.

  18. [Nonparametric method of estimating survival functions containing right-censored and interval-censored data].

    Science.gov (United States)

    Xu, Yonghong; Gao, Xiaohuan; Wang, Zhengxi

    2014-04-01

    Missing data represent a general problem in many scientific fields, especially in medical survival analysis. Interpolation is an important approach for dealing with censored data. However, most interpolation methods replace the censored data with exact data, which distorts the real distribution of the censored data and reduces the probability that the real data fall into the interpolation interval. To solve this problem, we propose in this paper a nonparametric method of estimating the survival function for right-censored and interval-censored data and compare its performance to the self-consistent (SC) algorithm. Compared with the average interpolation and the nearest-neighbor interpolation methods, the proposed method replaces right-censored data with interval-censored data, greatly improving the probability that the real data fall into the imputation interval. It then uses empirical distribution theory to estimate the survival function of right-censored and interval-censored data. The results of numerical examples and a real breast cancer data set demonstrate that the proposed method has higher accuracy and better robustness for different proportions of censored data. This paper provides a good method for comparing the performance of clinical treatments by estimating the survival of patients, which is helpful in medical survival data analysis.

  19. Nonparametric signal processing validation in T-wave alternans detection and estimation.

    Science.gov (United States)

    Goya-Esteban, R; Barquero-Pérez, O; Blanco-Velasco, M; Caamaño-Fernández, A J; García-Alberola, A; Rojo-Álvarez, J L

    2014-04-01

    Although a number of methods have been proposed for T-Wave Alternans (TWA) detection and estimation, their performance strongly depends on their signal processing stages and on the tuning of their free parameters. The dependence of system quality on the main signal processing stages in TWA algorithms has not yet been studied. This study seeks to optimize the final performance of the system by successive comparisons of pairs of TWA analysis systems with a single processing difference between them. For this purpose, a set of decision statistics is proposed to evaluate the performance, and a nonparametric hypothesis test (based on bootstrap resampling) is used to make systematic decisions. Both the temporal method (TM) and the spectral method (SM) are analyzed in this study. The experiments were carried out on two datasets: first, on semisynthetic signals with artificial alternant waves and added noise; second, on two public Holter databases with different documented risk of sudden cardiac death. For semisynthetic signals (SNR = 15 dB), the optimization procedure reduced the power of the TWA amplitude estimation errors by 34.0% (TM) and 5.2% (SM), and the power of the error probability by 74.7% (SM). For the Holter databases, appropriate tuning of several processing blocks led to a larger intergroup separation between the two populations for TWA amplitude estimation. Our proposal can be used as a systematic procedure for signal processing block optimization in TWA algorithmic implementations.

  20. Non-Parametric Evolutionary Algorithm for Estimating Root Zone Soil Moisture

    Science.gov (United States)

    Mohanty, B.; Shin, Y.; Ines, A. M.

    2013-12-01

    Prediction of root zone soil moisture is critical for water resources management. In this study, we explored a non-parametric evolutionary algorithm for estimating root zone soil moisture from a time series of spatially distributed rainfall across multiple weather locations in two different hydro-climatic regions. A new genetic algorithm-based hidden Markov model (HMMGA) was developed to estimate long-term root zone soil moisture dynamics at different soil depths. We also analyzed the rainfall occurrence probabilities and dry/wet spell lengths reproduced by this approach. The HMMGA was used to estimate the optimal state sequences (weather states) based on the precipitation history. Historical root zone soil moisture statistics were then determined based on the weather state conditions. To test the new approach, we selected two different soil moisture fields, Oklahoma (130 km x 130 km) and Illinois (300 km x 500 km), during 1995 to 2009 and 1994 to 2010, respectively. We found that the newly developed framework performed well in predicting root zone soil moisture dynamics at both spatial scales. The reproduced rainfall occurrence probabilities and dry/wet spell lengths also matched the observations well at the spatio-temporal scales considered. Since the proposed algorithm requires only precipitation and historical soil moisture data from existing, established weather stations, it can serve as an attractive alternative for predicting root zone soil moisture in the future using climate change scenarios and root zone soil moisture history.

  1. Adaptive Particle Filter for Nonparametric Estimation with Measurement Uncertainty in Wireless Sensor Networks.

    Science.gov (United States)

    Li, Xiaofan; Zhao, Yubin; Zhang, Sha; Fan, Xiaopeng

    2016-05-30

    Particle filters (PFs) are widely used for nonlinear signal processing in wireless sensor networks (WSNs). However, measurement uncertainty makes WSN observations deviate from the true state and degrades the estimation accuracy of PFs. Beyond algorithm design, few works focus on improving the likelihood calculation, since it is usually pre-assumed to follow a given distribution model. In this paper, we propose a novel PF method, based on a new likelihood fusion method for WSNs, that can further improve estimation performance. We first use a dynamic Gaussian model to describe the nonparametric features of the measurement uncertainty. Then, we propose a likelihood adaptation method that employs prior information and a belief factor to reduce the measurement noise. The optimal belief factor is attained by minimizing the Kullback-Leibler divergence. The likelihood adaptation method can be integrated into any PF, and we use it to develop three versions of adaptive PFs for a target tracking system using a wireless sensor network. The simulation and experimental results demonstrate that our likelihood adaptation method greatly improves the estimation performance of PFs in high-noise environments. In addition, the adaptive PFs are highly adaptable to the environment without imposing significant computational complexity.

  2. Posterior contraction rate for non-parametric Bayesian estimation of the dispersion coefficient of a stochastic differential equation

    NARCIS (Netherlands)

    Gugushvili, S.; Spreij, P.

    2016-01-01

    We consider the problem of non-parametric estimation of the deterministic dispersion coefficient of a linear stochastic differential equation based on discrete-time observations of its solution. We take a Bayesian approach to the problem and, under suitable regularity assumptions, derive the posterior contraction rate.

  3. Estimation of Subpixel Snow-Covered Area by Nonparametric Regression Splines

    Science.gov (United States)

    Kuter, S.; Akyürek, Z.; Weber, G.-W.

    2016-10-01

    Measurement of the areal extent of snow cover with high accuracy plays an important role in hydrological and climate modeling. Remotely sensed data acquired by earth-observing satellites offer great advantages for timely monitoring of snow cover. However, the main obstacle is the tradeoff between the temporal and spatial resolution of satellite imagery. Soft or subpixel classification of low- or moderate-resolution satellite images is a preferred technique to overcome this problem. The most frequently employed snow cover fraction methods applied to Moderate Resolution Imaging Spectroradiometer (MODIS) data have evolved from spectral unmixing and empirical Normalized Difference Snow Index (NDSI) methods to the latest machine learning-based artificial neural networks (ANNs). This study demonstrates the implementation of subpixel snow-covered area estimation based on the state-of-the-art nonparametric spline regression method, namely, Multivariate Adaptive Regression Splines (MARS). MARS models were trained by using MODIS top-of-atmosphere reflectance values of bands 1-7 as predictor variables. Reference percentage snow cover maps were generated from higher spatial resolution Landsat ETM+ binary snow cover maps. A multilayer feed-forward ANN with one hidden layer trained with backpropagation was also employed to estimate the percentage snow-covered area on the same data set. The results indicated that the developed MARS model performed better than the ANN model.
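
    A sketch of how such a MARS model might be fit in Python, assuming the third-party py-earth package (the study's actual implementation is not specified here); the band reflectances and snow fractions below are simulated placeholders.

        import numpy as np
        from pyearth import Earth  # third-party MARS implementation (assumed installed)

        # X: n x 7 matrix standing in for MODIS band 1-7 TOA reflectances;
        # y: reference snow fraction in [0, 1] (both placeholders)
        rng = np.random.default_rng(0)
        X = rng.random((500, 7))
        y = np.clip(X[:, :2].mean(axis=1) + 0.1 * rng.standard_normal(500), 0, 1)

        mars = Earth(max_degree=2)  # allow pairwise hinge interactions
        mars.fit(X, y)
        snow_fraction = np.clip(mars.predict(X), 0.0, 1.0)  # fractions live in [0, 1]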

  4. Comparison of density estimators. [Estimation of probability density functions

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.; Monahan, J.F.

    1977-09-01

    Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed, as are some simulations. The object is to compare the performance of the various methods in small samples and their sensitivity to change in their parameters, and to attempt to discover at what point a sample is so small that density estimation can no longer be worthwhile. (RWR)

  5. Density and hazard rate estimation for censored and α-mixing data using gamma kernels

    OpenAIRE

    2006-01-01

    In this paper we consider the nonparametric estimation of the density and hazard rate function for right-censored α-mixing survival time data using kernel smoothing techniques. Since survival times are positive with potentially a high concentration at zero, one has to take into account bias problems when the functions are estimated in the boundary region. In this paper, gamma kernel estimators of the density and the hazard rate function are proposed. The estimators use adaptive weights depe...

  6. A method for density estimation based on expectation identities

    Science.gov (United States)

    Peralta, Joaquín; Loyola, Claudia; Loguercio, Humberto; Davis, Sergio

    2017-06-01

    We present a simple and direct method for non-parametric estimation of a one-dimensional probability density, based on the application of the recent conjugate variables theorem. The method expands the logarithm of the probability density ln P(x|I) in terms of a complete basis and numerically solves for the coefficients of the expansion using a linear system of equations. No Monte Carlo sampling is needed. We present preliminary results that show the practical usefulness of the method for modeling statistical data.

  7. Nonparametric VSS-APA based on precise background noise power estimate

    Institute of Scientific and Technical Information of China (English)

    昊翔; 赖晓翰; 陈隆道; 蔡忠法

    2015-01-01

    The adaptive algorithm used in an echo cancellation (EC) system needs to provide 1) low misadjustment and 2) a high convergence rate. The affine projection algorithm (APA) is a better alternative than the normalized least mean square (NLMS) algorithm in EC applications where the input signal is highly correlated. Since an APA with a constant step-size has to trade off between performance criteria 1) and 2), a variable step-size APA (VSS-APA) provides a more reliable solution. A nonparametric VSS-APA (NPVSS-APA) is proposed that recovers the background noise within the error signal instead of cancelling the a posteriori errors. The most problematic term of its variable step-size formula is the value of the background noise power (BNP). The power difference between the desired signal and the output signal, which statistically equals the power of the error signal, has conventionally served as a rough BNP estimate. Considering that the error signal consists of background noise and misalignment noise, a precise BNP estimate is achieved by multiplying the rough estimate by a corrective factor. After analyzing the power ratio of misalignment noise to background noise in the APA, the corrective factor is formulated as a function of the projection order and the latest value of the variable step-size. The new algorithm, which does not require any a priori knowledge of the EC environment, has the advantage of easier controllability in practical applications. Simulation results in the EC context confirm the accuracy of the proposed BNP estimate and the more effective behavior of the proposed algorithm compared with other versions of the APA class.

  8. Quantiles, parametric-select density estimation, and bi-information parameter estimators

    Science.gov (United States)

    Parzen, E.

    1982-01-01

    A quantile-based approach to statistical analysis and probability modeling of data is presented which formulates statistical inference problems as functional inference problems in which the parameters to be estimated are density functions. Density estimators can be non-parametric (computed independently of model identified) or parametric-select (approximated by finite parametric models that can provide standard models whose fit can be tested). Exponential models and autoregressive models are approximating densities which can be justified as maximum entropy for respectively the entropy of a probability density and the entropy of a quantile density. Applications of these ideas are outlined to the problems of modeling: (1) univariate data; (2) bivariate data and tests for independence; and (3) two samples and likelihood ratios. It is proposed that bi-information estimation of a density function can be developed by analogy to the problem of identification of regression models.

  9. Parallel Multiscale Autoregressive Density Estimation

    OpenAIRE

    Reed, Scott; Oord, Aäron van den; Kalchbrenner, Nal; Colmenarejo, Sergio Gómez; Wang, Ziyu; Belov, Dan; de Freitas, Nando

    2017-01-01

    PixelCNN achieves state-of-the-art results in density estimation for natural images. Although training is fast, inference is costly, requiring one network evaluation per pixel; O(N) for N pixels. This can be sped up by caching activations, but still involves generating each pixel sequentially. In this work, we propose a parallelized PixelCNN that allows more efficient inference by modeling certain pixel groups as conditionally independent. Our new PixelCNN model achieves competitive density e...

  10. Research on Three-dimensional Spatial Diagram for CIE 1976 (L*a*b*) Color Space of Flue-cured Tobacco Leaf Color Based on Non-parametric Density Estimation

    Institute of Scientific and Technical Information of China (English)

    王浩雅; 马金晶; 王理珉; 张群芳; 孙力; 张涛; 石凤学; 和智君

    2013-01-01

    In nine tobacco-growing areas of Yunnan province, the color indicators of tobacco leaves from three stalk positions of the flue-cured variety K326 were measured with a color difference meter. Using CIE 1976 (L*a*b*) color space theory, non-parametric density estimates of the color data were computed, and with the three color indicators L*, a*, and b* taken as the Z, X, and Y axes of a three-dimensional space, three-dimensional peak plots and contour maps were drawn. The results showed that the three-dimensional plots clearly characterize the spatial, layered distribution of the three stalk positions; the peak plots display the color indicators with spatial depth; and the contour maps show the distribution of the color data for each position. This resolves the problem of characterizing a single property with three color indicators and provides a basis for the future construction of a color-indicator database.
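
    As a rough illustration of the workflow, the sketch below builds a kernel density estimate over two of the color coordinates and draws a contour map; the data are simulated placeholders, and scipy's Gaussian KDE stands in for whatever estimator the study used.

        import numpy as np
        from scipy import stats
        import matplotlib.pyplot as plt

        # Hypothetical a*/b* readings for one stalk position (the study used measured data)
        rng = np.random.default_rng(1)
        a_star = rng.normal(18, 3, 300)
        b_star = rng.normal(25, 4, 300)

        kde = stats.gaussian_kde(np.vstack([a_star, b_star]))
        ag, bg = np.mgrid[a_star.min():a_star.max():100j, b_star.min():b_star.max():100j]
        density = kde(np.vstack([ag.ravel(), bg.ravel()])).reshape(ag.shape)

        plt.contour(ag, bg, density)  # contour map; a 3-D surface gives the "peak" view
        plt.xlabel("a*"); plt.ylabel("b*")
        plt.show()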

  11. Essays on parametric and nonparametric modeling and estimation with applications to energy economics

    Science.gov (United States)

    Gao, Weiyu

    My dissertation research is composed of two parts: a theoretical part on semiparametric efficient estimation and an applied part in energy economics under different dynamic settings. The essays are related in terms of their applications as well as the way in which models are constructed and estimated. In the first essay, efficient estimation of the partially linear model is studied. We work out the efficient score functions and efficiency bounds under four stochastic restrictions: independence, conditional symmetry, conditional zero mean, and partially conditional zero mean. A feasible efficient estimation method for the linear part of the model is developed based on the efficient score. A battery of specification tests that allows for choosing between the alternative assumptions is provided. A Monte Carlo simulation is also conducted. The second essay presents a dynamic optimization model for a stylized oilfield resembling the largest developed light oil field in Saudi Arabia, Ghawar. We use data from different sources to estimate the oil production cost function and the revenue function. We pay particular attention to the dynamic aspect of oil production by employing petroleum-engineering software to simulate the interaction between control variables and reservoir state variables. Optimal solutions are studied under different scenarios to account for possible changes in the exogenous variables and the uncertainty about the forecasts. The third essay examines the effect of oil price volatility on the level of innovation displayed by the U.S. economy. A measure of innovation is calculated by decomposing an output-based Malmquist index. We also construct a nonparametric measure for oil price volatility. Technical change and oil price volatility are then placed in a VAR system with oil price and a variable indicative of monetary policy. The system is estimated and analyzed for significant relationships. We find that oil price volatility displays a significant

  12. Nonparametric Bayesian inference for multidimensional compound Poisson processes

    NARCIS (Netherlands)

    S. Gugushvili; F. van der Meulen; P. Spreij

    2015-01-01

    Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context.

  13. Estimating the Probability of Being the Best System: A Generalized Method and Nonparametric Hypothesis Test

    Science.gov (United States)

    2013-03-01

    Mendenhall, and Scheaffer [25]. For the remainder of this paper, however, we will make use of the Wilcoxon rank sum test for purposes of comparison with the... B. W. Silverman, Density Estimation for Statistics and Data Analysis, Chapman & Hall/CRC, 1986, p. 48. [25] D. D. Wackerly, W. Mendenhall III and R...

  14. Regularized Regression and Density Estimation based on Optimal Transport

    KAUST Repository

    Burger, M.

    2012-03-11

    The aim of this paper is to investigate a novel nonparametric approach for estimating and smoothing density functions as well as probability densities from discrete samples based on a variational regularization method with the Wasserstein metric as a data fidelity. The approach allows a unified treatment of discrete and continuous probability measures and is hence attractive for various tasks. In particular, the variational model for special regularization functionals yields a natural method for estimating densities and for preserving edges in the case of total variation regularization. In order to compute solutions of the variational problems, a regularized optimal transport problem needs to be solved, for which we discuss several formulations and provide a detailed analysis. Moreover, we compute special self-similar solutions for standard regularization functionals and we discuss several computational approaches and results. © 2012 The Author(s).
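
    In generic form, a variational model of this type combines a Wasserstein data-fidelity term with a regularization functional; the display below is a schematic statement under assumed notation, not the paper's exact formulation:

        \hat{\rho} \in \operatorname*{arg\,min}_{\rho \in \mathcal{P}(\Omega)}
            \frac{1}{2\lambda} W_2^2(\rho, \mu_n) + R(\rho),

    where \mu_n = \frac{1}{n}\sum_{i=1}^n \delta_{x_i} is the empirical measure of the sample, W_2 is the quadratic Wasserstein distance, R is a regularization functional (for example total variation), and \lambda > 0 balances fidelity against smoothness.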

  15. A novel nonparametric approach for estimating cut-offs in continuous risk indicators with application to diabetes epidemiology

    Directory of Open Access Journals (Sweden)

    Ferger Dietmar

    2009-09-01

    Background: Epidemiological and clinical studies, often including anthropometric measures, have established obesity as a major risk factor for the development of type 2 diabetes. Appropriate cut-off values for anthropometric parameters are necessary for prediction or decision purposes. The cut-off corresponding to the Youden index is often applied in the epidemiological and biomedical literature for dichotomizing a continuous risk indicator. Methods: Using data from a representative large multistage longitudinal epidemiological study in a primary care setting in Germany, this paper explores a novel approach for estimating optimal cut-offs of anthropometric parameters for predicting type 2 diabetes, based on a discontinuity of a regression function in a nonparametric regression framework. Results: The resulting cut-off corresponded to values obtained by the Youden index (maximum of the sum of sensitivity and specificity, minus one), often considered the optimal cut-off in epidemiological and biomedical research. The nonparametric regression-based estimator was compared with the established Receiver Operating Characteristic (ROC) plot method in various simulation scenarios and, in terms of bias and root mean square error, yielded excellent finite-sample properties. Conclusion: It is thus recommended that this nonparametric regression approach be considered as a valuable alternative when a continuous indicator has to be dichotomized at the Youden index for prediction or decision purposes.
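
    A minimal sketch of the classical empirical Youden-index cut-off that the paper uses as its benchmark, assuming a binary outcome and a continuous marker:

        import numpy as np
        from sklearn.metrics import roc_curve

        def youden_cutoff(y_true, marker):
            """Empirical cut-off maximizing J = sensitivity + specificity - 1."""
            fpr, tpr, thresholds = roc_curve(y_true, marker)
            j = tpr - fpr  # Youden index at each candidate threshold
            return thresholds[np.argmax(j)]

        # e.g. y_true: incident type 2 diabetes (0/1); marker: waist circumference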

  16. ks: Kernel Density Estimation and Kernel Discriminant Analysis for Multivariate Data in R

    Directory of Open Access Journals (Sweden)

    Tarn Duong

    2007-09-01

    Kernel smoothing is one of the most widely used non-parametric data smoothing techniques. We introduce a new R package ks for multivariate kernel smoothing. Currently it contains functionality for kernel density estimation and kernel discriminant analysis. It is a comprehensive package for bandwidth matrix selection, implementing a wide range of data-driven diagonal and unconstrained bandwidth selectors.
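
    ks is an R package; as a rough Python analogue of bivariate kernel density estimation with an adjustable bandwidth (illustrative only, not the ks API), one might write:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        data = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=400)

        kde = stats.gaussian_kde(data.T)     # bandwidth via Scott's rule by default
        kde.set_bandwidth(kde.factor * 0.8)  # manual rescaling of the bandwidth
        print(kde([[0.0], [0.0]]))           # density estimate at the origin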

  17. Estimation of Esfarayen Farmers Risk Aversion Coefficient and Its Influencing Factors (Nonparametric Approach

    Directory of Open Access Journals (Sweden)

    Z. Nematollahi

    2016-03-01

    Introduction: Due to the existence of risk and uncertainty in agriculture, risk management is crucial for agricultural management. The present study was therefore designed to determine the risk aversion coefficient for Esfarayen farmers. Materials and Methods: The following approaches have been utilized to assess risk attitudes: (1) direct elicitation of utility functions, (2) experimental procedures in which individuals are presented with hypothetical questionnaires regarding risky alternatives with or without real payments, and (3) inference from observation of economic behavior. In this paper, we focused on approach (3), based on the assumption of a relationship between the actual behavior of a decision maker and the behavior predicted from empirically specified models. A new non-parametric method and the QP method were used to calculate the coefficient of risk aversion. We maximized the decision maker's expected utility with the E-V formulation (Freund, 1956). Ideally, in constructing a QP model, the variance-covariance matrix should be formed for each individual farmer. For this purpose, a sample of 100 farmers was selected using random sampling, and their data on 14 products over the years 2008-2012 were assembled. The lowlands of Esfarayen were used since, within this area, production possibilities are rather homogeneous. Results and Discussion: The results of this study showed that there was low correlation between some of the activities, which implies opportunities for income stabilization through diversification. With respect to transitory income, Ra varies from 0.000006 to 0.000361, and the absolute coefficient of risk aversion in our sample was 0.00005. The estimated Ra values vary considerably from farm to farm. The results showed that the estimated Ra for the subsample of 'non-wealthy' farmers was 0.00010. The subsample with farmers in the 'wealthy' group had an

  18. Bayesian Nonparametric Estimation for Dynamic Treatment Regimes with Sequential Transition Times.

    Science.gov (United States)

    Xu, Yanxun; Müller, Peter; Wahed, Abdus S; Thall, Peter F

    2016-01-01

    We analyze a dataset arising from a clinical trial involving multi-stage chemotherapy regimes for acute leukemia. The trial design was a 2 × 2 factorial for frontline therapies only. Motivated by the idea that subsequent salvage treatments affect survival time, we model therapy as a dynamic treatment regime (DTR), that is, an alternating sequence of adaptive treatments or other actions and transition times between disease states. These sequences may vary substantially between patients, depending on how the regime plays out. To evaluate the regimes, mean overall survival time is expressed as a weighted average of the means of all possible sums of successive transitions times. We assume a Bayesian nonparametric survival regression model for each transition time, with a dependent Dirichlet process prior and Gaussian process base measure (DDP-GP). Posterior simulation is implemented by Markov chain Monte Carlo (MCMC) sampling. We provide general guidelines for constructing a prior using empirical Bayes methods. The proposed approach is compared with inverse probability of treatment weighting, including a doubly robust augmented version of this approach, for both single-stage and multi-stage regimes with treatment assignment depending on baseline covariates. The simulations show that the proposed nonparametric Bayesian approach can substantially improve inference compared to existing methods. An R program for implementing the DDP-GP-based Bayesian nonparametric analysis is freely available at https://www.ma.utexas.edu/users/yxu/.

  19. Nonparametric Fine Tuning of Mixtures: Application to Non-Life Insurance Claims Distribution Estimation

    Science.gov (United States)

    Sardet, Laure; Patilea, Valentin

    When pricing a specific insurance premium, the actuary needs to evaluate the claims cost distribution for the warranty. Traditional actuarial methods use parametric specifications to model the claims distribution, such as lognormal, Weibull, and Pareto laws. Mixtures of such distributions improve the flexibility of the parametric approach and seem to be quite well adapted to capture the skewness, the long tails, and the unobserved heterogeneity among the claims. In this paper, instead of looking for a finely tuned mixture with many components, we choose a parsimonious mixture model, typically with two or three components. Next, we use the mixture cumulative distribution function (CDF) to transform the data into the unit interval, where we apply a beta-kernel smoothing procedure. A bandwidth rule adapted to our methodology is proposed. Finally, the beta-kernel density estimate is back-transformed to recover an estimate of the original claims density. The beta-kernel smoothing provides an automatic fine-tuning of the parsimonious mixture and thus avoids inference in more complex mixture models with many parameters. We investigate the empirical performance of the new method in the estimation of quantiles with simulated nonnegative data and of the quantiles of the individual claims distribution in a non-life insurance application.
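
    A compact sketch of the transform-then-smooth idea, assuming a pre-fitted two-component lognormal mixture (the weights and parameters below are placeholders) and a Chen-type beta kernel on the unit interval; the paper's bandwidth rule is not reproduced.

        import numpy as np
        from scipy import stats

        def beta_kernel_density(u_grid, u_data, b):
            """Chen-type beta-kernel density estimate on [0, 1]."""
            est = np.empty_like(u_grid)
            for k, u in enumerate(u_grid):
                est[k] = stats.beta.pdf(u_data, u / b + 1, (1 - u) / b + 1).mean()
            return est

        # Two-component lognormal mixture assumed fitted beforehand (placeholders)
        w = [0.7, 0.3]
        mix = [stats.lognorm(0.5, scale=1e3), stats.lognorm(1.2, scale=1e4)]
        claims = np.concatenate([mix[0].rvs(700, random_state=1),
                                 mix[1].rvs(300, random_state=2)])

        F = lambda x: w[0] * mix[0].cdf(x) + w[1] * mix[1].cdf(x)  # mixture CDF
        f = lambda x: w[0] * mix[0].pdf(x) + w[1] * mix[1].pdf(x)  # mixture density
        x = np.linspace(claims.min(), claims.max(), 200)
        # change of variables: f_X(x) = f_U(F(x)) * f(x)
        fx = beta_kernel_density(F(x), F(claims), b=0.05) * f(x)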

  20. Nonparametric estimation of groundwater residence time distributions: What can environmental tracer data tell us about groundwater residence time?

    Science.gov (United States)

    McCallum, James L.; Engdahl, Nicholas B.; Ginn, Timothy R.; Cook, Peter. G.

    2014-03-01

    Residence time distributions (RTDs) have been used extensively for quantifying flow and transport in subsurface hydrology. In geochemical approaches, environmental tracer concentrations are used in conjunction with simple lumped parameter models (LPMs). Conversely, numerical simulation techniques require large amounts of parameterization and estimated RTDs are certainly limited by associated uncertainties. In this study, we apply a nonparametric deconvolution approach to estimate RTDs using environmental tracer concentrations. The model is based only on the assumption that flow is steady enough that the observed concentrations are well approximated by linear superposition of the input concentrations with the RTD; that is, the convolution integral holds. Even with large amounts of environmental tracer concentration data, the entire shape of an RTD remains highly nonunique. However, accurate estimates of mean ages and in some cases prediction of young portions of the RTD may be possible. The most useful type of data was found to be the use of a time series of tritium. This was due to the sharp variations in atmospheric concentrations and a short half-life. Conversely, the use of CFC compounds with smoothly varying atmospheric concentrations was more prone to nonuniqueness. This work highlights the benefits and limitations of using environmental tracer data to estimate whole RTDs with either LPMs or through numerical simulation. However, the ability of the nonparametric approach developed here to correct for mixing biases in mean ages appears promising.
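
    To make the convolution relationship concrete, the sketch below discretizes the convolution integral and recovers a nonnegative RTD by least squares; it is a bare-bones illustration (no radioactive decay correction, regularization, or uncertainty analysis, all of which matter in practice).

        import numpy as np
        from scipy.optimize import nnls

        def estimate_rtd(c_in, c_obs, dt):
            """Nonparametric RTD g from c_obs(t) = sum_tau g(tau) c_in(t - tau) dt,
            solved as a nonnegative least-squares problem; c_in and c_obs are
            equal-length concentration histories on a uniform time grid."""
            n = len(c_in)
            A = np.zeros((n, n))
            for t in range(n):
                for tau in range(t + 1):
                    A[t, tau] = c_in[t - tau] * dt
            g, _ = nnls(A, np.asarray(c_obs))
            return g  # discretized residence time distribution

        # c_in: atmospheric tracer input history; c_obs: observed groundwater series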

  1. Nonparametric Regression with Common Shocks

    Directory of Open Access Journals (Sweden)

    Eduardo A. Souza-Rodrigues

    2016-09-01

    This paper considers a nonparametric regression model for cross-sectional data in the presence of common shocks. Common shocks are allowed to be very general in nature; they do not need to be finite dimensional with a known (small) number of factors. I investigate the properties of the Nadaraya-Watson kernel estimator and determine how general the common shocks can be while still obtaining meaningful kernel estimates. Restrictions on the common shocks are necessary because kernel estimators typically manipulate conditional densities, and conditional densities do not necessarily exist in the present case. By appealing to disintegration theory, I provide sufficient conditions for the existence of such conditional densities and show that the estimator converges in probability to the Kolmogorov conditional expectation given the sigma-field generated by the common shocks. I also establish the rate of convergence and the asymptotic distribution of the kernel estimator.
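
    For reference, the Nadaraya-Watson estimator discussed above has a very short implementation; this minimal sketch uses a Gaussian kernel and a fixed bandwidth on simulated data (no common shocks are modeled here).

        import numpy as np

        def nadaraya_watson(x0, x, y, h):
            """Nadaraya-Watson kernel regression estimate of m(x0)."""
            w = np.exp(-0.5 * ((x0 - x) / h) ** 2)  # Gaussian kernel weights
            return np.sum(w * y) / np.sum(w)

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 1, 200)
        y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(200)
        grid = np.linspace(0, 1, 50)
        m_hat = np.array([nadaraya_watson(g, x, y, h=0.05) for g in grid])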

  2. Resource exploitation and cross-region growth trajectories: nonparametric estimates for Chile.

    Science.gov (United States)

    Mainardi, Stefano

    2007-10-01

    As a sector of primary concern for national development strategies, mining keeps stimulating an intensive debate in Chile regarding its role in long-term growth. Drawing partly on theoretical contributions to growth and mineral resource accounting, this analysis assesses patterns of economic growth across Chilean regions. The theoretical and methodological rationale for focusing on weak sustainability, by testing convergence across regions in a distribution dynamics perspective, is first discussed. This is followed by a brief review of policy issues and previous empirical findings relevant to Chile's mining and regional growth. Panel data over the period 1960-2001 are analysed, with growth measured both in terms of income per capita as such and in terms of sustainable measures of this variable. Kernel density and quantile regression estimates indicate a persistent bimodal (possibly even trimodal) distribution of nationally standardised regional incomes per capita, whereby conditions for cross-region convergence are matched only within the inner range of this distribution.

  3. Variational estimation of the drift for stochastic differential equations from the empirical density

    Science.gov (United States)

    Batz, Philipp; Ruttor, Andreas; Opper, Manfred

    2016-08-01

    We present a method for the nonparametric estimation of the drift function of certain types of stochastic differential equations from the empirical density. It is based on a variational formulation of the Fokker-Planck equation. The minimization of an empirical estimate of the variational functional using kernel based regularization can be performed in closed form. We demonstrate the performance of the method on second order, Langevin-type equations and show how the method can be generalized to other noise models.

  4. Variational estimation of the drift for stochastic differential equations from the empirical density

    CERN Document Server

    Batz, Philipp; Opper, Manfred

    2016-01-01

    We present a method for the nonparametric estimation of the drift function of certain types of stochastic differential equations from the empirical density. It is based on a variational formulation of the Fokker-Planck equation. The minimization of an empirical estimate of the variational functional using kernel based regularization can be performed in closed form. We demonstrate the performance of the method on second order, Langevin-type equations and show how the method can be generalized to other noise models.

  5. Estimation from PET data of transient changes in dopamine concentration induced by alcohol: support for a non-parametric signal estimation method

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, C C; Yoder, K K; Normandin, M D; Morris, E D [Department of Radiology, Indiana University School of Medicine, Indianapolis, IN (United States); Kareken, D A [Department of Neurology, Indiana University School of Medicine, Indianapolis, IN (United States); Bouman, C A [Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN (United States); O' Connor, S J [Department of Psychiatry, Indiana University School of Medicine, Indianapolis, IN (United States)], E-mail: emorris@iupui.edu

    2008-03-07

    We previously developed a model-independent technique (non-parametric ntPET) for extracting the transient changes in neurotransmitter concentration from paired (rest and activation) PET studies with a receptor ligand. To provide support for our method, we introduced three hypotheses of validation based on work by Endres and Carson (1998 J. Cereb. Blood Flow Metab. 18 1196-210) and Yoder et al (2004 J. Nucl. Med. 45 903-11), and tested them on experimental data. All three hypotheses describe relationships between the estimated free (synaptic) dopamine curves (F^DA(t)) and the change in binding potential (ΔBP). The veracity of the F^DA(t) curves recovered by nonparametric ntPET is supported when the data adhere to the following hypothesized behaviors: (1) ΔBP should decline with increasing DA peak time, (2) ΔBP should increase as the strength of the temporal correlation between F^DA(t) and the free raclopride (F^RAC(t)) curve increases, (3) ΔBP should decline linearly with the effective weighted availability of the receptor sites. We analyzed regional brain data from 8 healthy subjects who received two [11C]raclopride scans: one at rest, and one during which unanticipated IV alcohol was administered to stimulate dopamine release. For several striatal regions, nonparametric ntPET was applied to recover F^DA(t), and binding potential values were determined. Kendall rank-correlation analysis confirmed that the F^DA(t) data followed the expected trends for all three validation hypotheses. Our findings lend credence to our model-independent estimates of F^DA(t). Application of nonparametric ntPET may yield important insights into how alterations in the timing of dopaminergic neurotransmission are involved in the pathologies of addiction and other psychiatric disorders.

  6. Application of Kernel Density Estimation in Lamb Wave-Based Damage Detection

    Directory of Open Access Journals (Sweden)

    Long Yu

    2012-01-01

    The present work concerns the estimation of the probability density function (p.d.f.) of measured data in Lamb wave-based damage detection. Although a number of studies have focused on consensus algorithms for combining the results of individual sensors, the p.d.f. of the measured data, which is the fundamental ingredient of the probability-based method, has still been specified from experience in existing work. An analysis of the noise-induced errors in measured data shows that the type of distribution is related to the noise level. In the case of weak noise, the p.d.f. of measured data can be considered normal, and empirical methods give satisfactory estimates. However, in the case of strong noise, the p.d.f. is complex and does not belong to any common family of distribution functions, so nonparametric methods are needed. As the most popular nonparametric method, kernel density estimation was introduced. In order to demonstrate the performance of kernel density estimation methods, a numerical model was built to generate Lamb wave signals, and three levels of white Gaussian noise were intentionally added to the simulated signals. The estimation results showed that the nonparametric methods outperformed the empirical methods in terms of accuracy.

  7. A Non-parametric Approach to the Overall Estimate of Cognitive Load Using NIRS Time Series.

    Science.gov (United States)

    Keshmiri, Soheil; Sumioka, Hidenobu; Yamazaki, Ryuji; Ishiguro, Hiroshi

    2017-01-01

    We present a non-parametric approach to prediction of the n-back (n ∈ {1, 2}) task as a proxy measure of mental workload using Near Infrared Spectroscopy (NIRS) data. In particular, we focus on measuring mental workload through the hemodynamic responses in the brain induced by these tasks, thereby realizing the potential they offer for workload detection in real-world scenarios (e.g., the difficulty of a conversation). Our approach takes advantage of the intrinsic linearity that is inherent in the components of the NIRS time series to adopt a one-step regression strategy. We demonstrate the correctness of our approach through mathematical analysis. Furthermore, we study the performance of our model in an inter-subject setting in contrast with state-of-the-art techniques in the literature to show a significant improvement in the prediction of these tasks (82.50 and 86.40% for female and male participants, respectively). Moreover, our empirical analysis suggests a gender difference effect on the performance of the classifiers (with male data exhibiting higher non-linearity), along with left-lateralized activation in both genders with higher specificity in females.

  8. A Non-parametric Approach to Estimating Location Parameters under Simple Order

    Institute of Scientific and Technical Information of China (English)

    孙旭

    2005-01-01

    This paper deals with estimating parameters under a simple order restriction when samples come from location models. Based on the idea of the Hodges-Lehmann estimator (H-L estimator), a new approach to estimating the parameters is proposed, which differs from the classical L1 and L2 isotonic regressions. An algorithm to compute the estimators is given, and Monte Carlo simulations are used to compare the likelihood functions of the L1 estimators and the weighted isotonic H-L estimators.
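
    The H-L building block itself is simple: the one-sample Hodges-Lehmann location estimate is the median of all pairwise (Walsh) averages. A minimal sketch, without the isotonic weighting step described in the paper:

        import numpy as np

        def hodges_lehmann(sample):
            """One-sample Hodges-Lehmann location estimate:
            the median of all pairwise (Walsh) averages."""
            s = np.asarray(sample, dtype=float)
            i, j = np.triu_indices(len(s))  # all pairs with i <= j
            return np.median((s[i] + s[j]) / 2.0)

        print(hodges_lehmann([2.1, 2.4, 1.9, 3.2, 2.8]))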

  9. An Adaptive Background Subtraction Method Based on Kernel Density Estimation

    Directory of Open Access Journals (Sweden)

    Mignon Park

    2012-09-01

    In this paper, a pixel-based background modeling method, which uses nonparametric kernel density estimation, is proposed. To reduce the burden of image storage, we modify the original KDE method by using the first frame to initialize it and update it subsequently at every frame by controlling the learning rate according to the situations. We apply an adaptive threshold method based on image changes to effectively subtract the dynamic backgrounds. The devised scheme allows the proposed method to automatically adapt to various environments and effectively extract the foreground. The method presented here exhibits good performance and is suitable for dynamic background environments. The algorithm is tested on various video sequences and compared with other state-of-the-art background subtraction methods so as to verify its performance.
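
    A minimal per-pixel KDE background test along these lines might look as follows, assuming a grayscale frame and a stack of N previous frames; the adaptive thresholding and learning-rate control of the proposed method are omitted, and the parameter names are illustrative.

        import numpy as np

        def foreground_mask(frame, history, sigma=10.0, threshold=1e-3):
            """Per-pixel KDE background test: a pixel is foreground when its
            intensity is unlikely under a Gaussian-kernel density built from
            the last N frames (history has shape (N, H, W))."""
            diff = frame[None, :, :] - history  # (N, H, W) residuals
            kernels = np.exp(-0.5 * (diff / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
            density = kernels.mean(axis=0)      # KDE value per pixel
            return density < threshold          # low density => foreground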

  10. Nonparametric trend estimation in the presence of fractal noise: application to fMRI time-series analysis.

    Science.gov (United States)

    Afshinpour, Babak; Hossein-Zadeh, Gholam-Ali; Soltanian-Zadeh, Hamid

    2008-06-30

    Unknown low frequency fluctuations called "trend" are observed in noisy time-series measured for different applications. In some disciplines, they carry primary information while in other fields such as functional magnetic resonance imaging (fMRI) they carry nuisance effects. In all cases, however, it is necessary to estimate them accurately. In this paper, a method for estimating trend in the presence of fractal noise is proposed and applied to fMRI time-series. To this end, a partly linear model (PLM) is fitted to each time-series. The parametric and nonparametric parts of PLM are considered as contributions of hemodynamic response and trend, respectively. Using the whitening property of wavelet transform, the unknown components of the model are estimated in the wavelet domain. The results of the proposed method are compared to those of other parametric trend-removal approaches such as spline and polynomial models. It is shown that the proposed method improves activation detection and decreases variance of the estimated parameters relative to the other methods.

  11. Density estimation from local structure

    CSIR Research Space (South Africa)

    Van der Walt, Christiaan M

    2009-11-01

    ...Gaussian Mixture Model (GMM) density function of the data, and the log-likelihood scores are compared to the scores of a GMM trained with the expectation maximization (EM) algorithm on 5 real-world classification datasets (from the UCI collection). They show...

  12. Density Estimation Trees in High Energy Physics

    CERN Document Server

    Anderlini, Lucio

    2015-01-01

    Density Estimation Trees can play an important role in exploratory data analysis for multidimensional, multi-modal data models of large samples. I briefly discuss the algorithm, a self-optimization technique based on kernel density estimation, and some applications in High Energy Physics.

  13. Non-parametric estimation of the availability in a general repairable system

    Energy Technology Data Exchange (ETDEWEB)

    Gamiz, M.L. [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain)], E-mail: mgamiz@ugr.es; Roman, Y. [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain)

    2008-08-15

    This work deals with repairable systems with unknown failure and repair time distributions. We focus on the estimation of the instantaneous availability, that is, the probability that the system is functioning at a given time, which we consider as the most significant measure for evaluating the effectiveness of a repairable system. The estimation of the availability function is not, in general, an easy task, i.e., analytical techniques are difficult to apply. We propose a smooth estimation of the availability based on kernel estimator of the cumulative distribution functions (CDF) of the failure and repair times, for which the bandwidth parameters are obtained by bootstrap procedures. The consistency properties of the availability estimator are established by using techniques based on the Laplace transform.

  14. Poverty and life cycle effects: A nonparametric analysis for Germany

    OpenAIRE

    Stich, Andreas

    1996-01-01

    Most empirical studies on poverty consider the extent of poverty either for the entire society or for separate groups like elderly people. However, these papers do not show what the situation looks like for persons of a certain age. In this paper poverty measures depending on age are derived using the joint density of income and age. The density is nonparametrically estimated by weighted Gaussian kernel density estimation. Applying the conditional density of income to several poverty measures ...

  15. Type I Error Rates and Power Estimates of Selected Parametric and Nonparametric Tests of Scale.

    Science.gov (United States)

    Olejnik, Stephen F.; Algina, James

    1987-01-01

    Estimated Type I Error rates and power are reported for the Brown-Forsythe, O'Brien, Klotz, and Siegal-Tukey procedures. The effect of aligning the data using deviations from group means or group medians is investigated. (RB)

  17. Economic capacity estimation in fisheries: A non-parametric ray approach

    Energy Technology Data Exchange (ETDEWEB)

    Pascoe, Sean; Tingley, Diana [Centre for the Economics and Management of Aquatic Resources (CEMARE), University of Portsmouth, Boathouse No. 6, College Road, HM Naval Base, Portsmouth PO1 3LJ (United Kingdom)

    2006-05-15

    Data envelopment analysis (DEA) has generally been adopted as the most appropriate methodology for the estimation of fishing capacity, particularly in multi-species fisheries. More recently, economic DEA methods have been developed that incorporate the costs and benefits of increasing capacity utilisation. One such method was applied to estimate the capacity utilisation and output of the Scottish fleet. By comparing the results of the economic and traditional DEA approaches, it can be concluded that many fleet segments are operating at or close to full capacity, and that the vessels defining the frontier are operating consistent with profit maximising behaviour. (author)

  18. Estimation of floodplain aboveground biomass using multispectral remote sensing and nonparametric modeling

    Science.gov (United States)

    Güneralp, İnci; Filippi, Anthony M.; Randall, Jarom

    2014-12-01

    Floodplain forests serve a critical function in the global carbon cycle because floodplains constitute an important carbon sink compared with other terrestrial ecosystems. Forests on dynamic floodplain landscapes, such as those created by river meandering processes, are characterized by uneven-aged trees and exhibit high spatial variability, reflecting the influence of interacting fluvial, hydrological, and ecological processes. Detailed and accurate mapping of aboveground biomass (AGB) on floodplain landscapes characterized by uneven-aged forests is critical for improving estimates of floodplain-forest carbon pools, which is useful for greenhouse gas (GHG) life cycle assessment. It would also help improve our process understanding of biomorphodynamics of river-floodplain systems, as well as planning and monitoring of conservation, restoration, and management of riverine ecosystems. Using stochastic gradient boosting (SGB), multivariate adaptive regression splines (MARS), and Cubist, we remotely estimate AGB of a bottomland hardwood forest on a meander bend of a dynamic lowland river. As predictors, we use 30-m and 10-m multispectral image bands (Landsat 7 ETM+ and SPOT 5, respectively) and ancillary data. Our findings show that SGB and MARS significantly outperform Cubist, which is used for U.S. national-scale forest biomass mapping. Across all data-experiments and algorithms, at 10-m spatial resolution, SGB yields the best estimates (RMSE = 22.49 tonnes/ha; coefficient of determination (R2) = 0.96) when geomorphometric data are also included. On the other hand, at 30-m spatial resolution, MARS yields the best estimates (RMSE = 29.2 tonnes/ha; R2 = 0.94) when image-derived data are also included. By enabling more accurate AGB mapping of floodplains characterized by uneven-aged forests, SGB and MARS provide an avenue for improving operational estimates of AGB and carbon at local, regional/continental, and global scales.

  19. On Non-Parametric Field Estimation using Randomly Deployed, Noisy, Binary Sensors

    CERN Document Server

    Wang, Ye

    2007-01-01

    We consider the problem of reconstructing a deterministic data field from binary quantized noisy observations of sensors randomly deployed over the field domain. Our focus is on the extremes of lack of control in the sensor deployment, arbitrariness and lack of knowledge of the noise distribution, and low-precision and unreliability in the sensors. These adverse conditions are motivated by possible real-world scenarios where a large collection of low-cost, crudely manufactured sensors are mass-deployed in an environment where little can be assumed about the ambient noise. We propose a simple estimator that reconstructs the entire data field from these unreliable, binary quantized, noisy observations. Under the assumption of a bounded amplitude field, we prove almost sure and mean-square convergence of the estimator to the actual field as the number of sensors tends to infinity. For fields with bounded-variation, Sobolev differentiable, or finite-dimensionality properties, we derive specific mean squared error...

  20. [Non-parametric Bootstrap estimation on the intraclass correlation coefficient generated from quantitative hierarchical data].

    Science.gov (United States)

    Liang, Rong; Zhou, Shu-dong; Li, Li-xia; Zhang, Jun-guo; Gao, Yan-hui

    2013-09-01

    This paper aims to implement bootstrapping for hierarchical data and to provide a method for estimating the confidence interval (CI) of the intraclass correlation coefficient (ICC). First, we utilize mixed-effects models to estimate the ICC for repeated-measurement data and for two-stage sampling data. Then, we use the bootstrap method to estimate the CIs of the related ICCs. Finally, the influences of different bootstrapping strategies on the ICC CIs are compared. The repeated-measurement example shows that the CI from cluster bootstrapping contains the true ICC value, whereas random bootstrapping, which ignores the hierarchical structure of the data, yields an invalid CI. Results from the two-stage example show bias among the cluster-bootstrap ICC means, and the ICC of the original sample is the smallest but has a wide CI. It is necessary to take the structure of the data into account when hierarchical data are resampled; bootstrapping at higher levels appears preferable to bootstrapping at lower levels.
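
    A minimal sketch of the cluster (higher-level) bootstrap for an ICC confidence interval, assuming a balanced one-way layout with one row per cluster; resampling whole clusters preserves the hierarchy that row-level resampling destroys.

        import numpy as np

        def icc_oneway(data):
            """ICC(1) from a balanced one-way layout, data shape (clusters, k)."""
            n, k = data.shape
            grand = data.mean()
            msb = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
            msw = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
            return (msb - msw) / (msb + (k - 1) * msw)

        def cluster_bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=0):
            """Percentile CI for the ICC, resampling whole clusters (not rows)."""
            rng = np.random.default_rng(seed)
            boots = [icc_oneway(data[rng.integers(0, len(data), len(data))])
                     for _ in range(n_boot)]
            return np.quantile(boots, [alpha / 2, 1 - alpha / 2])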

  1. Modified Nonparametric Kernel Estimates of a Regression Function and their Consistencies with Rates.

    Science.gov (United States)

    1985-04-01

    estimates. In each case the speed of convergence is examined. An explicit bound for the mean square error, lacking to date in the literature for these estimates, is derived, and uniform almost-sure and in-probability consistency rates are established for the modified kernel regression estimates.

  2. msSurv: An R Package for Nonparametric Estimation of Multistate Models

    Directory of Open Access Journals (Sweden)

    Nicole Ferguson

    2012-09-01

    We present an R package, msSurv, to calculate the marginal (that is, not conditional on any covariates) state occupation probabilities, the state entry and exit time distributions, and the marginal integrated transition hazard for a general, possibly non-Markov, multistate system under left-truncation and right censoring. For a Markov model, msSurv also calculates and returns the transition probability matrix between any two states. Dependent censoring is handled via modeling the censoring hazard through observable covariates. Pointwise confidence intervals for the above mentioned quantities are obtained and returned for independent censoring from closed-form variance estimators and for dependent censoring using the bootstrap.

  3. Nonparametric Kernel Distribution Function Estimation with kerdiest: An R Package for Bandwidth Choice and Applications

    Directory of Open Access Journals (Sweden)

    Alejandro Quintela-del-Rio

    2012-08-01

    The R package kerdiest has been designed for computing kernel estimators of the distribution function and other related functions. Because of its usefulness in real applications, the bandwidth selection problem has been considered, and a cross-validation method and two plug-in type methods have been implemented. Moreover, three functions relevant to natural hazards have also been programmed. The package is completed with two interesting data sets, one of geological type (a complete catalogue of the earthquakes occurring in the northwest of the Iberian Peninsula) and another containing the maximum peak flow levels of a river in the United States of America.
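
    kerdiest is an R package; the underlying kernel distribution function estimator is easy to state in Python (an illustrative analogue, not the package's API):

        import numpy as np
        from scipy.stats import norm

        def kernel_cdf(x_grid, data, h):
            """Smoothed empirical CDF: F_hat(x) = mean(Phi((x - X_i) / h))."""
            x_grid, data = np.asarray(x_grid), np.asarray(data)
            return norm.cdf((x_grid[:, None] - data[None, :]) / h).mean(axis=1)

        # e.g. data: peak flow levels; exceedance prob = 1 - kernel_cdf(levels, data, h)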

  4. The binned bispectrum estimator: template-based and non-parametric CMB non-Gaussianity searches

    CERN Document Server

    Bucher, Martin; van Tent, Bartjan

    2015-01-01

    We describe the details of the binned bispectrum estimator as used for the official 2013 and 2015 analyses of the temperature and polarization CMB maps from the ESA Planck satellite. The defining aspect of this estimator is the determination of a map bispectrum (3-point correlator) that has been binned in harmonic space. For a parametric determination of the non-Gaussianity in the map (the so-called fNL parameters), one takes the inner product of this binned bispectrum with theoretically motivated templates. However, as a complementary approach one can also smooth the binned bispectrum using a variable smoothing scale in order to suppress noise and make coherent features stand out above the noise. This allows one to look in a model-independent way for any statistically significant bispectral signal. This approach is useful for characterizing the bispectral shape of the galactic foreground emission, for which a theoretical prediction of the bispectral anisotropy is lacking, and for detecting a serendipitous pr...

  5. Earthquake Risk Management Strategy Plan Using Nonparametric Estimation of Hazard Rate

    Directory of Open Access Journals (Sweden)

    Aflaton Amiri

    2008-01-01

    Earthquake risk is defined as the product of hazard and vulnerability. The main aims of earthquake risk management are to make and apply plans for reducing human losses and protecting property from earthquake hazards. Natural risk managers study how to identify and manage the earthquake risk for highly populated urban areas and to formulate strategic plans for this purpose, for which they need suitable input. This study attempts to provide predictions of earthquake events as an input for the preparation of earthquake risk management strategy plans. A Bayesian approach to earthquake hazard rate estimation is studied, and the magnitudes of historical earthquakes are used to predict the probability of occurrence of major earthquakes.

  6. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    CERN Document Server

    Gonzalez, Adriana; Jacques, Laurent

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. Optics are never perfect, and the non-ideal path through the telescope is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Other sources of noise (read-out, photon) also contaminate the image acquisition process. The problem of estimating both the PSF filter and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, it does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis image prior model and weak assumptions on the PSF filter's response. We use the observations from a celestial body transit, where such an object can be assumed to be a black disk. Such constraints limit the interchangeabil...

  7. Parametric Return Density Estimation for Reinforcement Learning

    CERN Document Server

    Morimura, Tetsuro; Kashima, Hisashi; Hachiya, Hirotaka; Tanaka, Toshiyuki

    2012-01-01

    Most conventional Reinforcement Learning (RL) algorithms aim to optimize decision-making rules in terms of the expected returns. However, especially for risk management purposes, other risk-sensitive criteria such as the value-at-risk or the expected shortfall are sometimes preferred in real applications. Here, we describe a parametric method for estimating the density of the returns, which allows us to handle various criteria in a unified manner. We first extend the Bellman equation for the conditional expected return to cover a conditional probability density of the returns. Then we derive an extension of the TD-learning algorithm for estimating the return densities in an unknown environment. As test instances, several parametric density estimation algorithms are presented for the Gaussian, Laplace, and skewed Laplace distributions. We show that these algorithms lead to risk-sensitive as well as robust RL paradigms through numerical experiments.

  8. Estimating stellar mean density through seismic inversions

    CERN Document Server

    Reese, D R; Goupil, M J; Thompson, M J; Deheuvels, S

    2012-01-01

    Determining the mass of stars is crucial both to improving stellar evolution theory and to characterising exoplanetary systems. Asteroseismology offers a promising way to estimate stellar mean density. When combined with accurate radii determinations, such as is expected from GAIA, this yields accurate stellar masses. The main difficulty is finding the best way to extract the mean density from a set of observed frequencies. We seek to establish a new method for estimating stellar mean density, which combines the simplicity of a scaling law while providing the accuracy of an inversion technique. We provide a framework in which to construct and evaluate kernel-based linear inversions which yield directly the mean density of a star. We then describe three different inversion techniques (SOLA and two scaling laws) and apply them to the sun, several test cases and three stars. The SOLA approach and the scaling law based on the surface correcting technique described by Kjeldsen et al. (2008) yield comparable result...

  9. Semi-automatic liver tumor segmentation with hidden Markov measure field model and non-parametric distribution estimation.

    Science.gov (United States)

    Häme, Yrjö; Pollari, Mika

    2012-01-01

    A novel liver tumor segmentation method for CT images is presented. The aim of this work was to reduce the manual labor and time required in the treatment planning of radiofrequency ablation (RFA), by providing accurate and automated tumor segmentations reliably. The developed method is semi-automatic, requiring only minimal user interaction. The segmentation is based on non-parametric intensity distribution estimation and a hidden Markov measure field model, with application of a spherical shape prior. A post-processing operation is also presented to remove the overflow to adjacent tissue. In addition to the conventional approach of using a single image as input data, an approach using images from multiple contrast phases was developed. The accuracy of the method was validated with two sets of patient data, and artificially generated samples. The patient data included preoperative RFA images and a public data set from "3D Liver Tumor Segmentation Challenge 2008". The method achieved very high accuracy with the RFA data, and outperformed other methods evaluated with the public data set, receiving an average overlap error of 30.3% which represents an improvement of 2.3% points to the previously best performing semi-automatic method. The average volume difference was 23.5%, and the average, the RMS, and the maximum surface distance errors were 1.87, 2.43, and 8.09 mm, respectively. The method produced good results even for tumors with very low contrast and ambiguous borders, and the performance remained high with noisy image data.

  10. Bayesian mixture models for spectral density estimation

    OpenAIRE

    Cadonna, Annalisa

    2017-01-01

    We introduce a novel Bayesian modeling approach to spectral density estimation for multiple time series. Considering first the case of non-stationary time series, the log-periodogram of each series is modeled as a mixture of Gaussian distributions with frequency-dependent weights and mean functions. The implied model for the log-spectral density is a mixture of linear mean functions with frequency-dependent weights. The mixture weights are built through successive differences of a logit-normal di...

  11. Particle Size Estimation Based on Edge Density

    Institute of Scientific and Technical Information of China (English)

    WANG Wei-xing

    2005-01-01

    Given image sequences of closely packed particles, the underlying aim is to estimate diameters without explicit segmentation. In a way, this is similar to the task of counting objects without directly counting them. Such calculations may, for example, be useful for fast estimation of particle size in different application areas. The topic is the estimation of the average size (average diameter) of packed particles from formulas involving edge density, where the edges are obtained by moment-based thresholding. An average shape factor is involved in the calculations, obtained for some frames from crude partial segmentation. Measurement results from about 80 frames have been analyzed.

  12. Nonparametric Maximum Penalized Likelihood Estimation of a Density from Arbitrarily Right-Censored Observations.

    Science.gov (United States)

    1984-06-01

    A Hilbert space H(Ω) is a reproducing kernel Hilbert space (RKHS) if point evaluation is a continuous operation, that is, v_n → v in H(Ω) implies that v_n(t) → v(t) for all t ∈ Ω. See Goffman and Pedrick.

  13. Nonparametric statistical methods using R

    CERN Document Server

    Kloke, John

    2014-01-01

    A Practical Guide to Implementing Nonparametric and Rank-Based ProceduresNonparametric Statistical Methods Using R covers traditional nonparametric methods and rank-based analyses, including estimation and inference for models ranging from simple location models to general linear and nonlinear models for uncorrelated and correlated responses. The authors emphasize applications and statistical computation. They illustrate the methods with many real and simulated data examples using R, including the packages Rfit and npsm.The book first gives an overview of the R language and basic statistical c

  14. Anisotropic Density Estimation in Global Illumination

    DEFF Research Database (Denmark)

    Schjøth, Lars

    2009-01-01

    Density estimation employed in multi-pass global illumination algorithms gives cause to a trade-off problem between bias and noise. The problem is seen most evident as blurring of strong illumination features. This thesis addresses the problem, presenting four methods that reduce both noise...... and bias in estimates. Good results are obtained by the use of anisotropic filtering. Two methods handle the most common case: filtering illumination reflected from object surfaces. One method extends filtering to the temporal domain, and one performs filtering on illumination from participating media...

  15. Improved nonparametric estimation of the optimal diagnostic cut-off point associated with the Youden index under different sampling schemes.

    Science.gov (United States)

    Yin, Jingjing; Samawi, Hani; Linder, Daniel

    2016-07-01

    A diagnostic cut-off point of a biomarker measurement is needed for classifying a random subject to be either diseased or healthy. However, the cut-off point is usually unknown and needs to be estimated by some optimization criteria. One important criterion is the Youden index, which has been widely adopted in practice. The Youden index, which is defined as the maximum of (sensitivity + specificity -1), directly measures the largest total diagnostic accuracy a biomarker can achieve. Therefore, it is desirable to estimate the optimal cut-off point associated with the Youden index. Sometimes, taking the actual measurements of a biomarker is very difficult and expensive, while ranking them without the actual measurement can be relatively easy. In such cases, ranked set sampling can give more precise estimation than simple random sampling, as ranked set samples are more likely to span the full range of the population. In this study, kernel density estimation is utilized to numerically solve for an estimate of the optimal cut-off point. The asymptotic distributions of the kernel estimators based on two sampling schemes are derived analytically and we prove that the estimators based on ranked set sampling are relatively more efficient than that of simple random sampling and both estimators are asymptotically unbiased. Furthermore, the asymptotic confidence intervals are derived. Intensive simulations are carried out to compare the proposed method using ranked set sampling with simple random sampling, with the proposed method outperforming simple random sampling in all cases. A real data set is analyzed for illustrating the proposed method.
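
    To make the kernel-density route to the cut-off concrete, here is a minimal sketch under simple random sampling (not the authors' code; the simulated biomarker values and the "diseased values are higher" orientation are assumptions): estimate both class densities with Gaussian KDEs, integrate them to smooth CDFs, and maximize J(c) = sensitivity(c) + specificity(c) - 1 over a grid of candidate cut-offs.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, 200)      # hypothetical biomarker values
diseased = rng.normal(1.5, 1.2, 200)

kde_h, kde_d = gaussian_kde(healthy), gaussian_kde(diseased)
grid = np.linspace(min(healthy.min(), diseased.min()),
                   max(healthy.max(), diseased.max()), 1000)
dc = grid[1] - grid[0]

# Smooth CDF estimates by integrating the KDEs along the grid.
spec = np.cumsum(kde_h(grid)) * dc         # P(healthy <= c)
sens = 1.0 - np.cumsum(kde_d(grid)) * dc   # P(diseased > c)
youden = sens + spec - 1.0

c_star = grid[np.argmax(youden)]
print(f"estimated optimal cut-off: {c_star:.3f}, J = {youden.max():.3f}")
```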

  16. Rigorous home range estimation with movement data: a new autocorrelated kernel density estimator.

    Science.gov (United States)

    Fleming, C H; Fagan, W F; Mueller, T; Olson, K A; Leimgruber, P; Calabrese, J M

    2015-05-01

    Quantifying animals' home ranges is a key problem in ecology and has important conservation and wildlife management applications. Kernel density estimation (KDE) is a workhorse technique for range delineation problems that is both statistically efficient and nonparametric. KDE assumes that the data are independent and identically distributed (IID). However, animal tracking data, which are routinely used as inputs to KDEs, are inherently autocorrelated and violate this key assumption. As we demonstrate, using realistically autocorrelated data in conventional KDEs results in grossly underestimated home ranges. We further show that the performance of conventional KDEs actually degrades as data quality improves, because autocorrelation strength increases as movement paths become more finely resolved. To remedy these flaws with the traditional KDE method, we derive an autocorrelated KDE (AKDE) from first principles to use autocorrelated data, making it perfectly suited for movement data sets. We illustrate the vastly improved performance of AKDE using analytical arguments, relocation data from Mongolian gazelles, and simulations based upon the gazelle's observed movement process. By yielding better minimum area estimates for threatened wildlife populations, we believe that future widespread use of AKDE will have significant impact on ecology and conservation biology.
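
    The IID problem described above is easy to reproduce: feed a strongly autocorrelated track to a conventional KDE and compare the resulting 95% home-range area with the truth. The Ornstein-Uhlenbeck-style AR(1) track below is a hypothetical stand-in for telemetry data, and the AKDE itself is not implemented here; with strong autocorrelation the conventional KDE area typically falls well below the true area.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
n, phi, sigma = 1000, 0.99, 1.0               # AR(1) ~ discretized OU track
eps = sigma * np.sqrt(1 - phi**2)
track = np.zeros((n, 2))
for t in range(1, n):
    track[t] = phi * track[t - 1] + eps * rng.normal(size=2)

kde = gaussian_kde(track.T)                   # conventional KDE (assumes IID)
xs = np.linspace(-4, 4, 200)
X, Y = np.meshgrid(xs, xs)
dens = kde(np.vstack([X.ravel(), Y.ravel()]))
cell = (xs[1] - xs[0]) ** 2

# Area of the smallest region holding 95% of the estimated mass.
order = np.sort(dens)[::-1]
mass = np.cumsum(order) * cell
est_area = (np.searchsorted(mass, 0.95) + 1) * cell

true_area = -2 * np.pi * sigma**2 * np.log(0.05)  # 95% area, isotropic Gaussian
print(f"KDE 95% area: {est_area:.2f}, true area: {true_area:.2f}")
```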

  17. Bird population density estimated from acoustic signals

    Science.gov (United States)

    Dawson, D.K.; Efford, M.G.

    2009-01-01

    Many animal species are detected primarily by sound. Although songs, calls and other sounds are often used for population assessment, as in bird point counts and hydrophone surveys of cetaceans, there are few rigorous methods for estimating population density from acoustic data. 2. The problem has several parts - distinguishing individuals, adjusting for individuals that are missed, and adjusting for the area sampled. Spatially explicit capture-recapture (SECR) is a statistical methodology that addresses jointly the second and third parts of the problem. We have extended SECR to use uncalibrated information from acoustic signals on the distance to each source. 3. We applied this extension of SECR to data from an acoustic survey of ovenbird Seiurus aurocapilla density in an eastern US deciduous forest with multiple four-microphone arrays. We modelled average power from spectrograms of ovenbird songs measured within a window of 0.7 s duration and frequencies between 4200 and 5200 Hz. 4. The resulting estimates of the density of singing males (0.19 ha⁻¹, SE 0.03 ha⁻¹) were consistent with estimates of the adult male population density from mist-netting (0.36 ha⁻¹, SE 0.12 ha⁻¹). The fitted model predicts sound attenuation of 0.11 dB m⁻¹ (SE 0.01 dB m⁻¹) in excess of losses from spherical spreading. 5. Synthesis and applications. Our method for estimating animal population density from acoustic signals fills a gap in the census methods available for visually cryptic but vocal taxa, including many species of bird and cetacean. The necessary equipment is simple and readily available; as few as two microphones may provide adequate estimates, given spatial replication. The method requires that individuals detected at the same place are acoustically distinguishable and all individuals vocalize during the recording interval, or that the per capita rate of vocalization is known. We believe these requirements can be met, with suitable field methods, for a significant

  18. Statistical Analysis of Photopyroelectric Signals using Histogram and Kernel Density Estimation for differentiation of Maize Seeds

    Science.gov (United States)

    Rojas-Lima, J. E.; Domínguez-Pacheco, A.; Hernández-Aguilar, C.; Cruz-Orea, A.

    2016-09-01

    Considering the necessity of photothermal alternative approaches for characterizing nonhomogeneous materials like maize seeds, the objective of this research work was to analyze statistically the amplitude variations of photopyroelectric signals, by means of nonparametric techniques such as the histogram and the kernel density estimator, and the probability density function of the amplitude variations of two genotypes of maize seeds with different pigmentations and structural components: crystalline and floury. To determine if the probability density function had a known parametric form, the histogram was determined, which did not present a known parametric form, so the kernel density estimator using the Gaussian kernel, with an efficiency of 95% in density estimation, was used to obtain the probability density function. The results obtained indicated that maize seeds could be differentiated in terms of the statistical values for floury and crystalline seeds such as the mean (93.11, 159.21), variance (1.64×10³, 1.48×10³), and standard deviation (40.54, 38.47) obtained from the amplitude variations of photopyroelectric signals in the case of the histogram approach. For the case of the kernel density estimator, seeds can be differentiated in terms of the kernel bandwidth or smoothing constant h of 9.85 and 6.09 for floury and crystalline seeds, respectively.
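
    A hedged sketch of the two nonparametric summaries used in the record follows. The amplitude arrays are simulated placeholders whose means and standard deviations are borrowed from the reported values; the actual seed data are not available here.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
floury = rng.normal(93.11, 40.54, 500)       # means/SDs taken from the record
crystalline = rng.normal(159.21, 38.47, 500)

for name, x in [("floury", floury), ("crystalline", crystalline)]:
    counts, edges = np.histogram(x, bins="auto", density=True)
    kde = gaussian_kde(x)                    # Gaussian kernel, Scott bandwidth
    h = kde.factor * x.std(ddof=1)           # effective smoothing constant
    print(f"{name}: mean={x.mean():.2f}, var={x.var(ddof=1):.2f}, "
          f"sd={x.std(ddof=1):.2f}, bandwidth h={h:.2f}")
```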

  19. CURRENT STATUS OF NONPARAMETRIC STATISTICS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-02-01

    Full Text Available Nonparametric statistics is one of the five growth points of applied mathematical statistics. Despite the large number of publications on specific issues of nonparametric statistics, the internal structure of this research direction has remained undeveloped. The purpose of this article is to consider its division into areas based on the existing practice of scientific activity, to characterize nonparametric statistics, and to classify investigations on nonparametric statistical methods. Nonparametric statistics allows one to make statistical inferences, in particular to estimate characteristics of a distribution and to test statistical hypotheses, without weakly substantiated assumptions that the distribution function of a sample belongs to a particular parametric family. For example, there is a widespread belief that statistical data often follow the normal distribution. Meanwhile, analysis of observation results, in particular of measurement errors, always leads to the same conclusion: in most cases the actual distribution differs significantly from the normal. Uncritical use of the hypothesis of normality often leads to significant errors, for example in the rejection of outlying observations (outliers), in statistical quality control, and in other cases. Therefore, it is advisable to use nonparametric methods, in which only weak requirements are imposed on the distribution functions of the observations; usually only their continuity is assumed. On the basis of a generalization of numerous studies it can be stated that, to date, nonparametric methods can solve almost the same range of problems as parametric methods. Certain statements in the literature, to the effect that nonparametric methods have less power or require larger sample sizes than parametric methods, are incorrect. Note that in nonparametric statistics, as in mathematical statistics in general, there remain a number of unresolved problems

  20. The Nonparametric Estimate of Exponential Premium Under Collective Risk Models

    Institute of Scientific and Technical Information of China (English)

    张林娜; 温利民; 方婧

    2016-01-01

    The exponential premium principle is one of the most important premium principles and is widely applied in non-life insurance actuarial science. In this paper, the nonparametric estimate of the exponential premium is investigated under collective risk models. In addition, the estimator is proved to be strongly consistent and asymptotically normal. Finally, a numerical simulation method is used to verify the convergence speed of the estimate, and the asymptotic normality of the estimator is checked in the simulations.
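
    Assuming the standard exponential premium principle π(X) = (1/a)·log E[exp(aX)], the nonparametric estimator replaces the expectation with an empirical mean over observed aggregate claims. The compound-Poisson simulation below is illustrative only, not taken from the paper.

```python
import numpy as np

def exponential_premium(samples, a):
    """Empirical plug-in estimate of the exponential premium."""
    # logsumexp-style computation for numerical stability
    m = (a * samples).max()
    return (m + np.log(np.mean(np.exp(a * samples - m)))) / a

rng = np.random.default_rng(3)
years = 500
claims = np.array([rng.exponential(1.0, rng.poisson(5)).sum()
                   for _ in range(years)])   # aggregate claims per period

print("estimated exponential premium:", exponential_premium(claims, a=0.2))
```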

  1. Variable kernel density estimation in high-dimensional feature spaces

    CSIR Research Space (South Africa)

    Van der Walt, Christiaan M

    2017-02-01

    Full Text Available Estimating the joint probability density function of a dataset is a central task in many machine learning applications. In this work we address the fundamental problem of kernel bandwidth estimation for variable kernel density estimation in high...

  2. Multivariate density estimation theory, practice, and visualization

    CERN Document Server

    Scott, David W

    2015-01-01

    David W. Scott, PhD, is Noah Harding Professor in the Department of Statistics at Rice University. The author of over 100 published articles, papers, and book chapters, Dr. Scott is also Fellow of the American Statistical Association (ASA) and the Institute of Mathematical Statistics. He is recipient of the ASA Founder's Award and the Army Wilks Award. His research interests include computational statistics, data visualization, and density estimation. Dr. Scott is also Coeditor of Wiley Interdisciplinary Reviews: Computational Statistics and previous Editor of the Journal of Computational and

  3. Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation.

    Science.gov (United States)

    Yamane, Ikko; Sasaki, Hiroaki; Sugiyama, Masashi

    2016-07-01

    Log-density gradient estimation is a fundamental statistical problem and possesses various practical applications such as clustering and measuring nongaussianity. A naive two-step approach of first estimating the density and then taking its log gradient is unreliable because an accurate density estimate does not necessarily lead to an accurate log-density gradient estimate. To cope with this problem, a method to directly estimate the log-density gradient without density estimation has been explored and demonstrated to work much better than the two-step method. The objective of this letter is to improve the performance of this direct method in multidimensional cases. Our idea is to regard the problem of log-density gradient estimation in each dimension as a task and apply regularized multitask learning to the direct log-density gradient estimator. We experimentally demonstrate the usefulness of the proposed multitask method in log-density gradient estimation and mode-seeking clustering.
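
    For orientation, here is the naive two-step baseline that the letter argues against: estimate the density with a Gaussian KDE, then differentiate its logarithm analytically. The direct multitask estimator proposed in the letter is more involved and is not shown; the data and bandwidth below are assumptions.

```python
import numpy as np

def kde_log_density_gradient(x, data, h):
    """d/dx log f_hat(x) for a Gaussian KDE with bandwidth h (1-D)."""
    diff = x[:, None] - data[None, :]          # (n_eval, n_data)
    w = np.exp(-0.5 * (diff / h) ** 2)         # unnormalized kernel weights
    # gradient of the log-KDE: sum_i w_i * (-(x - x_i)/h^2) / sum_i w_i
    return -(w * diff).sum(axis=1) / (h**2 * w.sum(axis=1))

rng = np.random.default_rng(4)
data = rng.normal(0.0, 1.0, 500)
xs = np.linspace(-2, 2, 9)
est = kde_log_density_gradient(xs, data, h=0.4)
print(np.round(est, 2))   # for N(0,1) the true log-density gradient is -x
```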

  4. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

    Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors

  5. Density Estimations in Laboratory Debris Flow Experiments

    Science.gov (United States)

    Queiroz de Oliveira, Gustavo; Kulisch, Helmut; Malcherek, Andreas; Fischer, Jan-Thomas; Pudasaini, Shiva P.

    2016-04-01

    Bulk density and its variation is an important physical quantity for estimating the solid-liquid fractions in two-phase debris flows. Here we present mass and flow depth measurements for experiments performed in a large-scale laboratory setup. Once the mixture is released and moves down the inclined channel, the measurements allow us to determine the bulk density evolution throughout the debris flow. Flow depths are determined by ultrasonic pulse reflection, and the mass is measured with a total normal force sensor. The data were obtained at 50 Hz. The initial two-phase material was composed of 350 kg of debris with a water content of 40%. A very fine pebble with a mean particle diameter of 3 mm, particle density of 2760 kg/m³ and bulk density of 1400 kg/m³ in dry condition was chosen as the solid material. Measurements reveal that the debris bulk density remains high from the head to the middle of the debris body, whereas it drops substantially at the tail. This indicates lower water content at the tail compared to the head and the middle portion of the debris body. This means that the solid and fluid fractions vary strongly, in a non-linear manner, along the flow path and from the head to the tail of the debris mass. Importantly, this spatial-temporal density variation plays a crucial role in determining the impact forces associated with the dynamics of the flow. Our setup allows for investigating different two-phase material compositions, including large fluid fractions, with high resolution. The considered experimental setup may enable us to transfer the observed phenomena to natural large-scale events. Furthermore, the measurement data allow evaluating results of numerical two-phase mass flow simulations. These experiments are part of the project avaflow.org, which intends to develop a GIS-based open source computational tool to describe a wide spectrum of rapid geophysical mass flows, including avalanches and real two-phase debris flows down complex natural

  6. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  7. Kernel density estimation using graphical processing unit

    Science.gov (United States)

    Sunarko, Su'ud, Zaki

    2015-09-01

    Kernel density estimation for particles distributed over a 2-dimensional space is calculated using a single graphical processing unit (GTX 660Ti GPU) and CUDA-C language. Parallel calculations are done for particles having bivariate normal distribution and by assigning calculations for equally-spaced node points to each scalar processor in the GPU. The number of particles, blocks and threads are varied to identify a favorable configuration. Comparisons are obtained by performing the same calculation using 1, 2 and 4 processors on a 3.0 GHz CPU using MPICH 2.0 routines. Speedups attained with the GPU are in the range of 88 to 349 times compared to the multiprocessor CPU. Blocks of 128 threads are found to be the optimum configuration for this case.
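
    The computation pattern is easy to re-express in NumPy (the original is CUDA-C; this sketch only mirrors the structure that makes the GPU mapping natural, namely one independent kernel sum per node point):

```python
import numpy as np

rng = np.random.default_rng(5)
particles = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], 10_000)

h = 0.2                                      # fixed kernel bandwidth
xs = np.linspace(-3, 3, 64)
nodes = np.stack(np.meshgrid(xs, xs), -1).reshape(-1, 2)

# One independent kernel sum per node point: this is the embarrassingly
# parallel step that the record assigns to one GPU scalar processor per node.
dens = np.empty(len(nodes))
for i in range(0, len(nodes), 256):          # chunked to bound memory use
    d2 = ((nodes[i:i+256, None, :] - particles[None, :, :]) ** 2).sum(-1)
    dens[i:i+256] = np.exp(-0.5 * d2 / h**2).sum(1)
dens /= len(particles) * 2 * np.pi * h**2
print("peak density estimate:", dens.max())
```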

  8. SOPIE: an R package for the non-parametric estimation of the off-pulse interval of a pulsar light curve

    Science.gov (United States)

    Schutte, Willem D.; Swanepoel, Jan W. H.

    2016-09-01

    An automated tool to derive the off-pulse interval of a light curve originating from a pulsar is needed. First, we derive a powerful and accurate non-parametric sequential estimation technique to estimate the off-pulse interval of a pulsar light curve in an objective manner. This is in contrast to the subjective `eye-ball' (visual) technique, and complementary to the Bayesian Block method which is currently used in the literature. The second aim involves the development of a statistical package, necessary for the implementation of our new estimation technique. We develop a statistical procedure to estimate the off-pulse interval in the presence of noise. It is based on a sequential application of p-values obtained from goodness-of-fit tests for uniformity. The Kolmogorov-Smirnov, Cramér-von Mises, Anderson-Darling and Rayleigh test statistics are applied. The details of the newly developed statistical package SOPIE (Sequential Off-Pulse Interval Estimation) are discussed. The developed estimation procedure is applied to simulated and real pulsar data. Finally, the SOPIE estimated off-pulse intervals of two pulsars are compared to the estimates obtained with the Bayesian Block method and yield very satisfactory results. We provide the code to implement the SOPIE package, which is publicly available at http://CRAN.R-project.org/package=SOPIE (Schutte).
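
    SOPIE itself is an R package; the toy Python sketch below only illustrates the underlying idea of sequentially extending a candidate off-pulse interval while goodness-of-fit tests fail to reject uniformity of the phases inside it. The step size, significance threshold and simulated light curve are assumptions, and only the Kolmogorov-Smirnov statistic is used here.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(6)
# Toy light curve: background phases are uniform; the pulse sits near 0.8.
phases = np.concatenate([rng.uniform(0, 1, 2000),
                         rng.normal(0.8, 0.03, 600) % 1.0])

def offpulse_end(phases, start=0.0, step=0.05, alpha=0.01):
    """Extend [start, end] while the contained phases look uniform."""
    end = start + step
    while end <= 1.0:
        inside = phases[(phases >= start) & (phases < end)]
        # rescale to [0, 1] and test uniformity with Kolmogorov-Smirnov
        p = kstest((inside - start) / (end - start), "uniform").pvalue
        if p < alpha:
            return end - step      # last interval that was not rejected
        end += step
    return 1.0

print("estimated off-pulse interval: [0.00, %.2f]" % offpulse_end(phases))
```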

  9. Bayesian Bandwidth Selection for a Nonparametric Regression Model with Mixed Types of Regressors

    Directory of Open Access Journals (Sweden)

    Xibin Zhang

    2016-04-01

    Full Text Available This paper develops a sampling algorithm for bandwidth estimation in a nonparametric regression model with continuous and discrete regressors under an unknown error density. The error density is approximated by the kernel density estimator of the unobserved errors, while the regression function is estimated using the Nadaraya-Watson estimator admitting continuous and discrete regressors. We derive an approximate likelihood and posterior for bandwidth parameters, followed by a sampling algorithm. Simulation results show that the proposed approach typically leads to better accuracy of the resulting estimates than cross-validation, particularly for smaller sample sizes. This bandwidth estimation approach is applied to nonparametric regression model of the Australian All Ordinaries returns and the kernel density estimation of gross domestic product (GDP growth rates among the organisation for economic co-operation and development (OECD and non-OECD countries.
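
    For reference, here is a compact sketch of the Nadaraya-Watson estimator named above, for a single continuous regressor; the paper's mixed continuous/discrete version and its Bayesian bandwidth sampler are beyond this illustration, and the data are simulated.

```python
import numpy as np

def nadaraya_watson(x_eval, x, y, h):
    """m_hat(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h)."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(7)
x = rng.uniform(-3, 3, 400)
y = np.sin(x) + rng.normal(0, 0.3, 400)
grid = np.linspace(-3, 3, 7)
print(np.round(nadaraya_watson(grid, x, y, h=0.3), 2))  # ~ sin(grid)
```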

  10. Bayesian semiparametric power spectral density estimation in gravitational wave data analysis

    CERN Document Server

    Edwards, Matthew C; Christensen, Nelson

    2015-01-01

    The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with non-stationary data by breaking longer data streams into smaller and locally stationary components.
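
    The Whittle likelihood used for the update has a simple form: with periodogram ordinates I_j and model PSD values S_j at the Fourier frequencies, log L ≈ -Σ_j [log S_j + I_j/S_j] up to a constant. A short sketch (periodogram normalization conventions vary, so treat the scaling as an assumption):

```python
import numpy as np

def whittle_loglik(ts, psd_model):
    """psd_model: callable mapping frequencies in (0, 0.5) to spectral values."""
    n = len(ts)
    freqs = np.fft.rfftfreq(n)[1:-1]                # drop DC and Nyquist
    pgram = np.abs(np.fft.rfft(ts))[1:-1] ** 2 / n  # periodogram ordinates
    S = psd_model(freqs)
    return -np.sum(np.log(S) + pgram / S)

rng = np.random.default_rng(8)
white = rng.normal(0, 1, 1024)
print(whittle_loglik(white, lambda f: np.ones_like(f)))  # flat PSD fits best
```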

  11. Nonparametric Bayes analysis of social science data

    Science.gov (United States)

    Kunihama, Tsuyoshi

    Social science data often contain complex characteristics that standard statistical methods fail to capture. Social surveys assign many questions to respondents, which often consist of mixed-scale variables. Each of the variables can follow a complex distribution outside parametric families, and associations among variables may have more complicated structures than standard linear dependence. Therefore, it is not straightforward to develop a statistical model which can approximate structures well in the social science data. In addition, many social surveys have collected data over time and therefore we need to incorporate dynamic dependence into the models. Also, it is standard to observe a massive number of missing values in the social science data. To address these challenging problems, this thesis develops flexible nonparametric Bayesian methods for the analysis of social science data. Chapter 1 briefly explains the backgrounds and motivations of the projects in the following chapters. Chapter 2 develops nonparametric Bayesian modeling of temporal dependence in large sparse contingency tables, relying on a probabilistic factorization of the joint pmf. Chapter 3 proposes nonparametric Bayes inference on conditional independence, with conditional mutual information used as a measure of the strength of conditional dependence. Chapter 4 proposes a novel Bayesian density estimation method in social surveys with complex designs where there is a gap between sample and population. We correct for the bias by adjusting mixture weights in Bayesian mixture models. Chapter 5 develops a nonparametric model for mixed-scale longitudinal surveys, in which various types of variables can be induced through latent continuous variables and dynamic latent factors lead to flexibly time-varying associations among variables.

  12. Mammographic density estimation with automated volumetric breast density measurement.

    Science.gov (United States)

    Ko, Su Yeon; Kim, Eun-Kyung; Kim, Min Jung; Moon, Hee Jung

    2014-01-01

    To compare automated volumetric breast density measurement (VBDM) with radiologists' evaluations based on the Breast Imaging Reporting and Data System (BI-RADS), and to identify the factors associated with technical failure of VBDM. In this study, 1129 women aged 19-82 years who underwent mammography from December 2011 to January 2012 were included. Breast density evaluations by radiologists based on BI-RADS and by VBDM (Volpara Version 1.5.1) were compared. The agreement in interpreting breast density between radiologists and VBDM was determined based on four density grades (D1, D2, D3, and D4) and a binary classification of fatty (D1-2) vs. dense (D3-4) breast using kappa statistics. The association between technical failure of VBDM and patient age, total breast volume, fibroglandular tissue volume, history of partial mastectomy, the frequency of mass > 3 cm, and breast density was analyzed. The agreement between breast density evaluations by radiologists and VBDM was fair (k value = 0.26) when the four density grades (D1/D2/D3/D4) were used and moderate (k value = 0.47) for the binary classification (D1-2/D3-4). Twenty-seven women (2.4%) showed failure of VBDM. Small total breast volume, history of partial mastectomy, and high breast density were significantly associated with technical failure of VBDM (p = 0.001 to 0.015). There is fair or moderate agreement in breast density evaluation between radiologists and VBDM. Technical failure of VBDM may be related to small total breast volume, a history of partial mastectomy, and high breast density.

  13. Mammography density estimation with automated volumetric breast density measurement

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Su Yeon; Kim, Eun Kyung; Kim, Min Jung; Moon, Hee Jung [Dept. of Radiology, Severance Hospital, Research Institute of Radiological Science, Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2014-06-15

    To compare automated volumetric breast density measurement (VBDM) with radiologists' evaluations based on the Breast Imaging Reporting and Data System (BI-RADS), and to identify the factors associated with technical failure of VBDM. In this study, 1129 women aged 19-82 years who underwent mammography from December 2011 to January 2012 were included. Breast density evaluations by radiologists based on BI-RADS and by VBDM (Volpara Version 1.5.1) were compared. The agreement in interpreting breast density between radiologists and VBDM was determined based on four density grades (D1, D2, D3, and D4) and a binary classification of fatty (D1-2) vs. dense (D3-4) breast using kappa statistics. The association between technical failure of VBDM and patient age, total breast volume, fibroglandular tissue volume, history of partial mastectomy, the frequency of mass > 3 cm, and breast density was analyzed. The agreement between breast density evaluations by radiologists and VBDM was fair (k value = 0.26) when the four density grades (D1/D2/D3/D4) were used and moderate (k value = 0.47) for the binary classification (D1-2/D3-4). Twenty-seven women (2.4%) showed failure of VBDM. Small total breast volume, history of partial mastectomy, and high breast density were significantly associated with technical failure of VBDM (p = 0.001 to 0.015). There is fair or moderate agreement in breast density evaluation between radiologists and VBDM. Technical failure of VBDM may be related to small total breast volume, a history of partial mastectomy, and high breast density.

  14. An asymptotically optimal nonparametric adaptive controller

    Institute of Scientific and Technical Information of China (English)

    郭雷; 谢亮亮

    2000-01-01

    For discrete-time nonlinear stochastic systems with unknown nonparametric structure, a kernel estimation-based nonparametric adaptive controller is constructed based on the truncated certainty equivalence principle. Global stability and asymptotic optimality of the closed-loop systems are established without resorting to any external excitations.

  15. Local Component Analysis for Nonparametric Bayes Classifier

    CERN Document Server

    Khademi, Mahmoud; safayani, Meharn

    2010-01-01

    The decision boundaries of the Bayes classifier are optimal because they lead to the maximum probability of correct decision. This means that if we knew the prior probabilities and the class-conditional densities, we could design a classifier which gives the lowest probability of error. However, in classification based on nonparametric density estimation methods such as Parzen windows, the decision regions depend on the choice of parameters such as the window width. Moreover, these methods suffer from the curse of dimensionality of the feature space and from the small-sample-size problem, which severely restrict their practical applications. In this paper, we address these problems by introducing a novel dimension reduction and classification method based on local component analysis. In this method, by adopting an iterative cross-validation algorithm, we simultaneously estimate the optimal transformation matrices (for dimension reduction) and classifier parameters based on local information. The proposed method can classify the data with co...

  16. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    Science.gov (United States)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but binning is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear if the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as the accuracy of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches it may be useful in various other applications of density estimation in astrostatistics.
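
    A toy sketch of the hash-table binning idea follows (the C++ internals and hash design of the paper will differ): only occupied bins are stored, so memory grows with the number of populated cells rather than exponentially with dimension. The data and bin width are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def bash_density(data, width):
    """Map points to integer bin tuples; store counts sparsely in a dict."""
    counts = Counter(map(tuple, np.floor(data / width).astype(int)))
    n, d = data.shape
    vol = width ** d

    def density(x):
        key = tuple(np.floor(np.asarray(x) / width).astype(int))
        return counts[key] / (n * vol)

    return counts, density

rng = np.random.default_rng(9)
colors = rng.normal(0, 1, (100_000, 5))      # e.g. five photometric colors
counts, density = bash_density(colors, width=0.5)
print(len(counts), "occupied bins; density at origin:", density(np.zeros(5)))
```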

  17. Application of non-parameter kernel density method to the estimation of individual loss distribution

    Institute of Scientific and Technical Information of China (English)

    谭英平

    2003-01-01

    As an exploratory research, this paper tries to present a new approach through which the individual loss distribution can be analyzed. Unlike the traditionally parametric statistics idea, the author introduces the whole procedure about how the nonparametric kernel density estimation can be utilized in the analysis of individual loss distribution. Further, the effect of the new estimation is testified.

  18. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  19. Rasch Model Parameter Estimation in the Presence of a Nonnormal Latent Trait Using a Nonparametric Bayesian Approach

    Science.gov (United States)

    Finch, Holmes; Edwards, Julianne M.

    2016-01-01

    Standard approaches for estimating item response theory (IRT) model parameters generally work under the assumption that the latent trait being measured by a set of items follows the normal distribution. Estimation of IRT parameters in the presence of nonnormal latent traits has been shown to generate biased person and item parameter estimates. A…

  20. Nonparametric estimation of transition probabilities in the non-Markov illness-death model: A comparative study.

    Science.gov (United States)

    de Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2015-06-01

    Multi-state models are often used for modeling complex event history data. In these models the estimation of the transition probabilities is of particular interest, since they allow for long-term predictions of the process. These quantities have been traditionally estimated by the Aalen-Johansen estimator, which is consistent if the process is Markov. Several non-Markov estimators have been proposed in the recent literature, and their superiority with respect to the Aalen-Johansen estimator has been proved in situations in which the Markov condition is strongly violated. However, the existing estimators have the drawback of requiring that the support of the censoring distribution contains the support of the lifetime distribution, which is not often the case. In this article, we propose two new methods for estimating the transition probabilities in the progressive illness-death model. Some asymptotic results are derived. The proposed estimators are consistent regardless of the Markov condition and of the aforementioned assumption about the censoring support. We explore the finite sample behavior of the estimators through simulations. The main conclusion of this piece of research is that the proposed estimators are much more efficient than the existing non-Markov estimators in most cases. An application to a clinical trial on colon cancer is included. Extensions to progressive processes beyond the three-state illness-death model are discussed.

  1. GPU Acceleration of Mean Free Path Based Kernel Density Estimators for Monte Carlo Neutronics Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Burke, Timothy P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Martin, William R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-11-19

    Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed source shielding applications. However, little work was done to obtain reaction rates using KDEs. This paper introduces a new form of the MFP KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad-hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.

  2. Nonparametric Inference for Periodic Sequences

    KAUST Repository

    Sun, Ying

    2012-02-01

    This article proposes a nonparametric method for estimating the period and values of a periodic sequence when the data are evenly spaced in time. The period is estimated by a "leave-out-one-cycle" version of cross-validation (CV) and complements the periodogram, a widely used tool for period estimation. The CV method is computationally simple and implicitly penalizes multiples of the smallest period, leading to a "virtually" consistent estimator of integer periods. This estimator is investigated both theoretically and by simulation. We also propose a nonparametric test of the null hypothesis that the data have constant mean against the alternative that the sequence of means is periodic. Finally, our methodology is demonstrated on three well-known time series: the sunspots and lynx trapping data, and the El Niño series of sea surface temperatures. © 2012 American Statistical Association and the American Society for Quality.
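
    The "leave-out-one-cycle" idea can be sketched in a few lines for integer candidate periods: predict each observation by the mean of the observations at the same phase in the other cycles, and pick the period minimizing the resulting CV error. Trend handling and the consistency theory follow the article, not this toy; the simulated series is an assumption.

```python
import numpy as np

def cv_score(y, p):
    """Leave-out-one-cycle CV error for integer candidate period p."""
    err, n = 0.0, 0
    for phase in range(p):
        v = y[phase::p]
        if len(v) < 2:
            continue
        loo = (v.sum() - v) / (len(v) - 1)   # leave-one-out phase means
        err += ((v - loo) ** 2).sum()
        n += len(v)
    return err / n

rng = np.random.default_rng(10)
t = np.arange(300)
y = np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)  # true period 12
scores = {p: cv_score(y, p) for p in range(2, 40)}
print("estimated period:", min(scores, key=scores.get))      # expect 12
```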

  3. Concrete density estimation by rebound hammer method

    Science.gov (United States)

    Ismail, Mohamad Pauzi bin; Jefri, Muhamad Hafizie Bin; Abdullah, Mahadzir Bin; Masenwat, Noor Azreen bin; Sani, Suhairy bin; Mohd, Shukri; Isa, Nasharuddin bin; Mahmud, Mohamad Haniza bin

    2016-01-01

    Concrete is the most common and cheap material for radiation shielding. Compressive strength is the main parameter checked for determining concrete quality. However, for shielding purposes density is the parameter that needs to be considered. X- and gamma-radiation is effectively absorbed by a material with a high atomic number and high density, such as concrete. High strength normally implies higher density in concrete, but this is not always true. This paper explains and discusses the correlation between rebound hammer testing and density for concrete containing hematite aggregates. A comparison is also made with normal concrete, i.e. concrete containing crushed granite.

  4. Concrete density estimation by rebound hammer method

    Energy Technology Data Exchange (ETDEWEB)

    Ismail, Mohamad Pauzi bin, E-mail: pauzi@nm.gov.my; Masenwat, Noor Azreen bin; Sani, Suhairy bin; Mohd, Shukri [NDT Group, Nuclear Malaysia, Bangi, Kajang, Selangor (Malaysia); Jefri, Muhamad Hafizie Bin; Abdullah, Mahadzir Bin [Material Technology Program, Faculty of Applied Sciences, UiTM, Shah Alam, Selangor (Malaysia); Isa, Nasharuddin bin; Mahmud, Mohamad Haniza bin [Pusat Penyelidikan Mineral, Jabatan Mineral dan Geosains, Ipoh, Perak (Malaysia)

    2016-01-22

    Concrete is the most common and cheap material for radiation shielding. Compressive strength is the main parameter checked for determining concrete quality. However, for shielding purposes density is the parameter that needs to be considered. X- and gamma-radiation is effectively absorbed by a material with a high atomic number and high density, such as concrete. High strength normally implies higher density in concrete, but this is not always true. This paper explains and discusses the correlation between rebound hammer testing and density for concrete containing hematite aggregates. A comparison is also made with normal concrete, i.e. concrete containing crushed granite.

  5. Minimax lower bound for kink location estimators in a nonparametric regression model with long-range dependence

    CERN Document Server

    Wishart, Justin Rory

    2011-01-01

    In this paper, a lower bound is determined in the minimax sense for change point estimators of the first derivative of a regression function in the fractional white noise model. Similar minimax results presented previously in the area focus on change points in the derivatives of a regression function in the white noise model or consider estimation of the regression function in the presence of correlated errors.

  6. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    Science.gov (United States)

    2015-09-30

    ... sensors, or both. The goal of this research is to develop and implement a new method for estimating blue and fin whale density that is effective over ... develop and implement a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse

  7. Application of non-parametric bootstrap methods to estimate confidence intervals for QTL location in a beef cattle QTL experimental population.

    Science.gov (United States)

    Jongjoo, Kim; Davis, Scott K; Taylor, Jeremy F

    2002-06-01

    Empirical confidence intervals (CIs) for the estimated quantitative trait locus (QTL) location from selective and non-selective non-parametric bootstrap resampling methods were compared for a genome scan involving an Angus × Brahman reciprocal fullsib backcross population. Genetic maps, based on 357 microsatellite markers, were constructed for 29 chromosomes using CRI-MAP V2.4. Twelve growth, carcass composition and beef quality traits (n = 527-602) were analysed to detect QTLs utilizing (composite) interval mapping approaches. CIs were investigated for 28 likelihood ratio test statistic (LRT) profiles for the one QTL per chromosome model. The CIs from the non-selective bootstrap method were largest (87.7 cM average, or 79.2% coverage of test chromosomes). The Selective II procedure produced the smallest CI size (42.3 cM average). However, CI sizes from the Selective II procedure were more variable than those produced by the two LOD drop method. CI ranges from the Selective II procedure were also asymmetrical (relative to the most likely QTL position) due to the bias caused by the tendency for the estimated QTL position to be at a marker position in the bootstrap samples and due to monotonicity and asymmetry of the LRT curve in the original sample.

  8. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  9. Limit Distribution Theory for Maximum Likelihood Estimation of a Log-Concave Density.

    Science.gov (United States)

    Balabdaoui, Fadoua; Rufibach, Kaspar; Wellner, Jon A

    2009-06-01

    We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, i.e. a density of the form f_0 = exp(φ_0) where φ_0 is a concave function on ℝ. Existence, form, characterizations and uniform rates of convergence of the MLE are given by Rufibach (2006) and Dümbgen and Rufibach (2007). The characterization of the log-concave MLE in terms of distribution functions is the same (up to sign) as the characterization of the least squares estimator of a convex density on [0, ∞) as studied by Groeneboom, Jongbloed and Wellner (2001b). We use this connection to show that the limiting distributions of the MLE and its derivative are, under comparable smoothness assumptions, the same (up to sign) as in the convex density estimation problem. In particular, changing the smoothness assumptions of Groeneboom, Jongbloed and Wellner (2001b) slightly by allowing some higher derivatives to vanish at the point of interest, we find that the pointwise limiting distributions depend on the second and third derivatives at 0 of H_k, the "lower invelope" of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of φ_0 = log f_0 at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode M(f_0) and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values.

  10. Ant-inspired density estimation via random walks.

    Science.gov (United States)

    Musco, Cameron; Su, Hsin-Hao; Lynch, Nancy A

    2017-09-19

    Many ant species use distributed population density estimation in applications ranging from quorum sensing, to task allocation, to appraisal of enemy colony strength. It has been shown that ants estimate local population density by tracking encounter rates: The higher the density, the more often the ants bump into each other. We study distributed density estimation from a theoretical perspective. We prove that a group of anonymous agents randomly walking on a grid are able to estimate their density within a small multiplicative error in few steps by measuring their rates of encounter with other agents. Despite dependencies inherent in the fact that nearby agents may collide repeatedly (and, worse, cannot recognize when this happens), our bound nearly matches what would be required to estimate density by independently sampling grid locations. From a biological perspective, our work helps shed light on how ants and other social insects can obtain relatively accurate density estimates via encounter rates. From a technical perspective, our analysis provides tools for understanding complex dependencies in the collision probabilities of multiple random walks. We bound the strength of these dependencies using local mixing properties of the underlying graph. Our results extend beyond the grid to more general graphs, and we discuss applications to size estimation for social networks, density estimation for robot swarms, and random walk-based sampling for sensor networks.
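
    The encounter-rate mechanism is easy to simulate: on a torus grid, the expected number of other agents sharing one's cell in a step equals the population density, so averaging encounter counts over steps yields a density estimate. A toy version follows (the grid size, agent count and step count are arbitrary; the paper's dependence analysis is of course not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(11)
side, agents, steps = 50, 250, 400
true_density = agents / side**2                  # 0.1 agents per cell

pos = rng.integers(0, side, size=(agents, 2))
moves = np.array([[0, 1], [0, -1], [1, 0], [-1, 0]])
encounters = 0
for _ in range(steps):
    pos = (pos + moves[rng.integers(0, 4, agents)]) % side   # torus walk
    _, counts = np.unique(pos, axis=0, return_counts=True)
    # each agent in a cell with c agents meets c-1 others this step
    encounters += (counts * (counts - 1)).sum()

estimate = encounters / (agents * steps)
print(f"true density {true_density:.3f}, estimated {estimate:.3f}")
```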

  11. SPECIES-SPECIFIC FOREST VARIABLE ESTIMATION USING NON-PARAMETRIC MODELING OF MULTI-SPECTRAL PHOTOGRAMMETRIC POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    J. Bohlin

    2012-07-01

    Full Text Available The recent development in software for automatic photogrammetric processing of multispectral aerial imagery, and the growing nation-wide availability of Digital Elevation Model (DEM) data, are about to revolutionize data capture for forest management planning in Scandinavia. Using only already available aerial imagery and ALS-assessed DEM data, raster estimates of the forest variables mean tree height, basal area, total stem volume, and species-specific stem volumes were produced and evaluated. The study was conducted at a coniferous hemi-boreal test site in southern Sweden (lat. 58° N, long. 13° E). Digital aerial images from the Zeiss/Intergraph Digital Mapping Camera system were used to produce 3D point-cloud data with spectral information. Metrics were calculated for 696 field plots (10 m radius) from point-cloud data and used in k-MSN to estimate forest variables. For these stands, the tree height ranged from 1.4 to 33.0 m (18.1 m mean), stem volume from 0 to 829 m³ ha⁻¹ (249 m³ ha⁻¹ mean) and basal area from 0 to 62.2 m² ha⁻¹ (26.1 m² ha⁻¹ mean), with a mean stand size of 2.8 ha. Estimates made using digital aerial images corresponding to the standard acquisition of the Swedish National Land Survey (Lantmäteriet) showed RMSEs (in percent of the surveyed stand mean) of 7.5% for tree height, 11.4% for basal area, 13.2% for total stem volume, 90.6% for pine stem volume, 26.4% for spruce stem volume, and 72.6% for deciduous stem volume. The results imply that photogrammetric matching of digital aerial images has significant potential for operational use in forestry.

  12. A sequential nonparametric pattern classification algorithm based on the Wald SPRT. [Sequential Probability Ratio Test

    Science.gov (United States)

    Poage, J. L.

    1975-01-01

    A sequential nonparametric pattern classification procedure is presented. The method presented is an estimated version of the Wald sequential probability ratio test (SPRT). This method utilizes density function estimates, and the density estimate used is discussed, including a proof of convergence in probability of the estimate to the true density function. The classification procedure proposed makes use of the theory of order statistics, and estimates of the probabilities of misclassification are given. The procedure was tested on discriminating between two classes of Gaussian samples and on discriminating between two kinds of electroencephalogram (EEG) responses.
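
    A sketch of an estimated SPRT in the spirit described follows: the class-conditional densities in Wald's likelihood ratio are replaced by kernel density estimates built from training samples, and observations accumulate a log-likelihood ratio until it crosses the Wald thresholds. The paper's order-statistics-based estimator and error analysis are not reproduced; the KDE, thresholds and data here are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(12)
train0 = rng.normal(0.0, 1.0, 300)       # training data, class 0
train1 = rng.normal(1.0, 1.0, 300)       # training data, class 1
f0, f1 = gaussian_kde(train0), gaussian_kde(train1)

alpha = beta = 0.05                      # target error probabilities
A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))

def estimated_sprt(stream):
    llr = 0.0
    for k, x in enumerate(stream, start=1):
        llr += np.log(f1([x])[0]) - np.log(f0([x])[0])
        if llr >= A:
            return 1, k                  # decide class 1
        if llr <= B:
            return 0, k                  # decide class 0
    return None, k                       # no decision within the stream

decision, used = estimated_sprt(rng.normal(1.0, 1.0, 100))
print(f"decided class {decision} after {used} samples")
```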

  13. Breast density estimation from high spectral and spatial resolution MRI.

    Science.gov (United States)

    Li, Hui; Weiss, William A; Medved, Milica; Abe, Hiroyuki; Newstead, Gillian M; Karczmar, Gregory S; Giger, Maryellen L

    2016-10-01

    A three-dimensional breast density estimation method is presented for high spectral and spatial resolution (HiSS) MR imaging. Twenty-two patients were recruited (under an Institutional Review Board-approved, Health Insurance Portability and Accountability Act-compliant protocol) for high-risk breast cancer screening. Each patient received standard-of-care clinical digital x-ray mammograms and MR scans, as well as HiSS scans. The algorithm for breast density estimation includes breast mask generation, breast skin removal, and breast percentage density calculation. The inter- and intra-user variabilities of the HiSS-based density estimation were determined using correlation analysis and limits of agreement. Correlation analysis was also performed between the HiSS-based density estimation and radiologists' breast imaging-reporting and data system (BI-RADS) density ratings. A correlation coefficient of 0.91 was obtained between left and right breast density estimations. An interclass correlation coefficient of 0.99 indicated high reliability for the inter-user variability of the HiSS-based breast density estimations. A moderate correlation coefficient of 0.55 was observed between HiSS-based breast density estimations and radiologists' BI-RADS. In summary, an objective density estimation method using HiSS spectral data from breast MRI was developed. The high reproducibility with low inter- and low intra-user variabilities shown in this preliminary study suggests that such a HiSS-based density metric may be potentially beneficial in programs requiring breast density such as in breast cancer risk assessment and monitoring effects of therapy.

  14. Application of a Bayesian nonparametric model to derive toxicity estimates based on the response of Antarctic microbial communities to fuel‐contaminated soil

    National Research Council Canada - National Science Library

    Arbel, Julyan; King, Catherine K; Raymond, Ben; Winsley, Tristrom; Mengersen, Kerrie L

    2015-01-01

    ...‐species toxicity tests. In this study, we apply a Bayesian nonparametric model to a soil microbial data set acquired across a hydrocarbon contamination gradient at the site of a fuel spill in Antarctica...

  15. Nonparametric Bayesian Filtering for Location Estimation, Position Tracking, and Global Localization of Mobile Terminals in Outdoor Wireless Environments

    Directory of Open Access Journals (Sweden)

    Mohamed Khalaf-Allah

    2008-01-01

    Full Text Available The mobile terminal positioning problem is categorized into three different types according to the availability of (1) initial accurate location information and (2) motion measurement data. Location estimation refers to the mobile positioning problem when both the initial location and motion measurement data are not available. If both are available, the positioning problem is referred to as position tracking. When only motion measurements are available, the problem is known as global localization. These positioning problems were solved within the Bayesian filtering framework. Filter derivation and implementation algorithms are provided, with emphasis on the mapping approach. The radio maps of the experimental area have been created by a 3D deterministic radio propagation tool with a grid resolution of 5 m. Real-world experimentation was conducted in a GSM network deployed in a semi-urban environment in order to investigate the performance of the different positioning algorithms.
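
    A minimal grid-based Bayes filter shows how the three problem types differ only in prior and inputs: a flat prior with motion data gives global localization, while a peaked prior gives position tracking. The 1-D corridor, motion noise model and synthetic radio map below are hypothetical stand-ins for the paper's 3D-propagation radio maps.

```python
import numpy as np

rng = np.random.default_rng(13)
cells = 100
radio_map = np.sin(np.arange(cells) / 7.0) * 20 - 70   # expected RSS per cell

def motion_update(belief, step=1, p_err=0.2):
    """Shift belief by the odometry step, blurring for motion noise."""
    moved = np.roll(belief, step)
    return (1 - p_err) * moved + p_err * 0.5 * (np.roll(moved, 1) + np.roll(moved, -1))

def measurement_update(belief, rss, sigma=4.0):
    like = np.exp(-0.5 * ((rss - radio_map) / sigma) ** 2)
    post = belief * like
    return post / post.sum()

belief = np.full(cells, 1.0 / cells)     # flat prior -> global localization
true_pos = 30
for _ in range(15):
    true_pos = (true_pos + 1) % cells
    belief = motion_update(belief)
    rss = radio_map[true_pos] + rng.normal(0, 4.0)
    belief = measurement_update(belief, rss)

print("true cell:", true_pos, "MAP estimate:", int(np.argmax(belief)))
```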

  16. Current Source Density Estimation for Single Neurons

    Directory of Open Access Journals (Sweden)

    Dorottya Cserpán

    2014-03-01

    Full Text Available Recent developments of multielectrode technology made it possible to measure the extracellular potential generated in the neural tissue with spatial precision on the order of tens of micrometers and on a submillisecond time scale. Combining such measurements with imaging of single neurons within the studied tissue opens up new experimental possibilities for estimating the distribution of current sources along a dendritic tree. In this work we show that if we are able to relate part of the recording of extracellular potential to a specific cell of known morphology, we can estimate the spatiotemporal distribution of transmembrane currents along it. We present here an extension of the kernel CSD method (Potworowski et al., 2012) applicable in such a case. We test it on several model neurons of progressively complicated morphologies, from ball-and-stick to realistic, up to analysis of simulated neuron activity embedded in a substantial working network (Traub et al., 2005). We discuss the caveats and possibilities of this new approach.

  17. Highway traffic model-based density estimation

    OpenAIRE

    Morarescu, Irinel - Constantin; CANUDAS DE WIT, Carlos

    2011-01-01

    The travel time spent in traffic networks is one of the main concerns of societies in developed countries. A major requirement for providing traffic control and services is continuous prediction, for several minutes into the future. This paper focuses on an important ingredient necessary for traffic forecasting, which is real-time traffic state estimation using only a limited amount of data. Simulation results illustrate the performance of the proposed ...

  18. Non-parametric seismic hazard analysis in the presence of incomplete data

    Science.gov (United States)

    Yazdani, Azad; Mirzaei, Sajjad; Dadkhah, Koroush

    2017-01-01

    The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distribution, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study aims to introduce an imputation procedure for completing earthquake catalog data that will allow the catalog to be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate earthquake magnitude distribution in Tehran, the capital city of Iran.

  19. Toward accurate and precise estimates of lion density.

    Science.gov (United States)

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2017-08-01

    Reliable estimates of animal density are fundamental to understanding ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation because wildlife authorities rely on estimates to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores, such as lions (Panthera leo). Although abundance indices for lions may produce poor inferences, they continue to be used to estimate density and inform management and policy. We used sighting data from a 3-month survey and adapted a Bayesian spatially explicit capture-recapture (SECR) model to estimate spatial lion density in the Maasai Mara National Reserve and surrounding conservancies in Kenya. Our unstructured spatial capture-recapture sampling design incorporated search effort to explicitly estimate detection probability and density on a fine spatial scale, making our approach robust in the context of varying detection probabilities. Overall posterior mean lion density was estimated to be 17.08 (posterior SD 1.310) lions >1 year old/100 km², and the sex ratio was estimated at 2.2 females to 1 male. Our modeling framework and narrow posterior SD demonstrate that SECR methods can produce statistically rigorous and precise estimates of population parameters, and we argue that they should be favored over less reliable abundance indices. Furthermore, our approach is flexible enough to incorporate different data types, which enables robust population estimates over relatively short survey periods in a variety of systems. Trend analyses are essential to guide conservation decisions but are frequently based on surveys of differing reliability. We therefore call for a unified framework to assess lion numbers in key populations to improve management and
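
    For context, the core ingredient of a spatially explicit capture-recapture (SECR) model is a detection function that decays with the distance between a detector and an animal's latent activity center. A commonly used half-normal form (assumed here for illustration, not quoted from the paper) is

$$
p_{ij} \;=\; p_0 \exp\!\left(-\frac{\lVert \mathbf{x}_j - \mathbf{s}_i \rVert^2}{2\sigma^2}\right),
$$

    where $p_{ij}$ is the probability that individual $i$ with activity center $\mathbf{s}_i$ is detected at detector location $\mathbf{x}_j$, $p_0$ is the baseline detection probability and $\sigma$ is a spatial scale parameter; density is then the intensity of the point process generating the activity centers.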

  20. Nonparametric regression with filtered data

    CERN Document Server

    Linton, Oliver; Nielsen, Jens Perch; Van Keilegom, Ingrid; 10.3150/10-BEJ260

    2011-01-01

    We present a general principle for estimating a regression function nonparametrically, allowing for a wide variety of data filtering, for example, repeated left truncation and right censoring. Both the mean and the median regression cases are considered. The method works by first estimating the conditional hazard function or conditional survivor function and then integrating. We also investigate improved methods that take account of model structure such as independent errors and show that such methods can improve performance when the model structure is true. We establish the pointwise asymptotic normality of our estimators.
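
    To make the "estimate the conditional survivor function and then integrate" step concrete, recall the standard identity for a nonnegative response (an illustration consistent with, though not quoted from, the paper):

$$
m(x) \;=\; \mathbb{E}[Y \mid X = x] \;=\; \int_0^{\infty} S(y \mid x)\,\mathrm{d}y,
\qquad S(y \mid x) = \Pr(Y > y \mid X = x),
$$

    so a filtered-data estimator $\hat S(y \mid x)$ yields $\hat m(x)$ by plug-in; the median regression function is obtained instead by solving $\hat S(m \mid x) = 1/2$.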

  1. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    ... the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where, e.g., one or more non-parametric terms are added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients ... For this purpose non-parametric methods together with additive models are suggested. Also, a new approach specifically designed to detect non-linearities is introduced. Confidence intervals are constructed by use of bootstrapping. As a link between non-parametric and parametric methods a paper dealing with neural ... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well-known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how ...

  2. Bayesian nonparametric duration model with censorship

    Directory of Open Access Journals (Sweden)

    Joseph Hakizamungu

    2007-10-01

    Full Text Available This paper is concerned with nonparametric i.i.d. duration models with censored observations, and we establish by a simple and unified approach the general structure of a Bayesian nonparametric estimator for a survival function S. For Dirichlet prior distributions, we describe completely the structure of the posterior distribution of the survival function. These results are essentially supported by prior and posterior independence properties.
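
    As a concrete instance of the posterior structure described, consider the uncensored case with a Dirichlet process prior $\mathrm{DP}(\alpha, S_0)$ on the survival function. The posterior mean is the familiar convex combination of prior guess and empirical survival function (the censored case refines this along the lines of Susarla and Van Ryzin):

$$
\mathbb{E}\left[S(t) \mid T_1, \dots, T_n\right] \;=\; \frac{\alpha}{\alpha + n}\, S_0(t) \;+\; \frac{n}{\alpha + n}\, \hat S_n(t),
\qquad \hat S_n(t) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{T_i > t\}.
$$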

  3. Effective dysphonia detection using feature dimension reduction and kernel density estimation for patients with Parkinson's disease.

    Directory of Open Access Journals (Sweden)

    Shanshan Yang

    Full Text Available Detection of dysphonia is useful for monitoring the progression of phonatory impairment for patients with Parkinson's disease (PD), and also helps assess the disease severity. This paper describes the statistical pattern analysis methods to study different vocal measurements of sustained phonations. The feature dimension reduction procedure was implemented by using the sequential forward selection (SFS) and kernel principal component analysis (KPCA) methods. Four selected vocal measures were projected by the KPCA onto the bivariate feature space, in which the class-conditional feature densities can be approximated with the nonparametric kernel density estimation technique. In the vocal pattern classification experiments, Fisher's linear discriminant analysis (FLDA) was applied to perform the linear classification of voice records for healthy control subjects and PD patients, and the maximum a posteriori (MAP) decision rule and support vector machine (SVM) with radial basis function kernels were employed for the nonlinear classification tasks. Based on the KPCA-mapped feature densities, the MAP classifier successfully distinguished 91.8% of voice records, with a sensitivity rate of 0.986, a specificity rate of 0.708, and an area value of 0.94 under the receiver operating characteristic (ROC) curve. The diagnostic performance provided by the MAP classifier was superior to those of the FLDA and SVM classifiers. In addition, the classification results indicated that gender is insensitive to dysphonia detection, and the sustained phonations of PD patients with minimal functional disability are more difficult to identify correctly.
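
    The KPCA projection, class-conditional KDE, and MAP rule combine naturally, as in the hedged sketch below. This is not the authors' code: the bandwidth, the KPCA kernel parameter, and the synthetic data standing in for the vocal measures are all placeholders.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KernelDensity

def fit_map_classifier(X, y, bandwidth=0.5):
    """Fit KPCA to 2-D plus a per-class Gaussian KDE; return a MAP decision rule."""
    kpca = KernelPCA(n_components=2, kernel="rbf", gamma=1.0)
    Z = kpca.fit_transform(X)
    classes = np.unique(y)
    kdes = {c: KernelDensity(bandwidth=bandwidth).fit(Z[y == c]) for c in classes}
    priors = {c: np.mean(y == c) for c in classes}

    def predict(X_new):
        Z_new = kpca.transform(X_new)
        # Log posterior up to a constant: log prior + log class-conditional density.
        scores = np.column_stack(
            [np.log(priors[c]) + kdes[c].score_samples(Z_new) for c in classes]
        )
        return classes[np.argmax(scores, axis=1)]

    return predict

# Usage with synthetic 4-dimensional "vocal measures":
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (100, 4)), rng.normal(1.5, 1.0, (100, 4))])
y = np.array([0] * 100 + [1] * 100)
predict = fit_map_classifier(X, y)
print(predict(X[:5]))
```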

  4. Nonparametric Econometrics: The np Package

    Directory of Open Access Journals (Sweden)

    Tristen Hayfield

    2008-07-01

    Full Text Available We describe the R np package via a series of applications that may be of interest to applied econometricians. The np package implements a variety of nonparametric and semiparametric kernel-based estimators that are popular among econometricians. There are also procedures for nonparametric tests of significance and consistent model specification tests for parametric mean regression models and parametric quantile regression models, among others. The np package focuses on kernel methods appropriate for the mix of continuous, discrete, and categorical data often found in applied settings. Data-driven methods of bandwidth selection are emphasized throughout, though we caution the user that data-driven bandwidth selection methods can be computationally demanding.

  5. A comparison of selected parametric and imputation methods for estimating snag density and snag quality attributes

    Science.gov (United States)

    Eskelson, Bianca N.I.; Hagar, Joan; Temesgen, Hailemariam

    2012-01-01

    Snags (standing dead trees) are an essential structural component of forests. Because wildlife use of snags depends on size and decay stage, snag density estimation without any information about snag quality attributes is of little value for wildlife management decision makers. Little work has been done to develop models that allow multivariate estimation of snag density by snag quality class. Using climate, topography, Landsat TM data, stand age and forest type collected for 2356 forested Forest Inventory and Analysis plots in western Washington and western Oregon, we evaluated two multivariate techniques for their ability to estimate density of snags by three decay classes. The density of live trees and snags in three decay classes (D1: recently dead, little decay; D2: decay, without top, some branches and bark missing; D3: extensive decay, missing bark and most branches) with diameter at breast height (DBH) ≥ 12.7 cm was estimated using a nonparametric random forest nearest neighbor imputation technique (RF) and a parametric two-stage model (QPORD), for which the number of trees per hectare was estimated with a Quasipoisson model in the first stage and the probability of belonging to a tree status class (live, D1, D2, D3) was estimated with an ordinal regression model in the second stage. The presence of large snags with DBH ≥ 50 cm was predicted using a logistic regression and RF imputation. Because of the more homogeneous conditions on private forest lands, snag density by decay class was predicted with higher accuracy on private forest lands than on public lands, while presence of large snags was more accurately predicted on public lands, owing to the higher prevalence of large snags on public lands. RF outperformed the QPORD model in terms of percent accurate predictions, while QPORD provided smaller root mean square errors in predicting snag density by decay class. The logistic regression model achieved more accurate presence/absence classification

  6. Density estimates of monarch butterflies overwintering in central Mexico.

    Science.gov (United States)

    Thogmartin, Wayne E; Diffendorfer, Jay E; López-Hoffman, Laura; Oberhauser, Karen; Pleasants, John; Semmens, Brice X; Semmens, Darius; Taylor, Orley R; Wiederholt, Ruscena

    2017-01-01

    Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha⁻¹. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha⁻¹ (95% CI [2.4–80.7] million ha⁻¹); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha⁻¹). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.
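
    The pooled-distribution step can be illustrated with a small Monte Carlo mixture. The six (mean, sd) pairs below are placeholders spanning the published 6.9–60.9 range, not the study's actual estimates.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-study (mean, sd) in millions of monarchs per hectare.
studies = [(6.9, 2.0), (60.9, 15.0), (21.0, 5.0), (28.0, 8.0), (11.0, 3.0), (34.0, 9.0)]

# Equal-weight mixture: draw the same number of samples from each study.
draws = np.concatenate([rng.normal(m, s, size=10_000) for m, s in studies])
draws = draws[draws > 0]  # density cannot be negative

print(f"mixture mean:   {draws.mean():.1f} million/ha")
print(f"mixture median: {np.median(draws):.1f} million/ha")
print(f"95% interval:   {np.percentile(draws, [2.5, 97.5])}")
```

    A skewed mixture like this is exactly why the record reports the median alongside the mean.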

  7. Density estimates of monarch butterflies overwintering in central Mexico

    Directory of Open Access Journals (Sweden)

    Wayne E. Thogmartin

    2017-04-01

    Full Text Available Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha⁻¹. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha⁻¹ (95% CI [2.4–80.7] million ha⁻¹); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha⁻¹). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.

  8. Density estimates of monarch butterflies overwintering in central Mexico

    Science.gov (United States)

    Thogmartin, Wayne E.; Diffendorfer, James E.; Lopez-Hoffman, Laura; Oberhauser, Karen; Pleasants, John M.; Semmens, Brice X.; Semmens, Darius J.; Taylor, Orley R.; Wiederholt, Ruscena

    2017-01-01

    Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9–60.9 million ha⁻¹. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha⁻¹ (95% CI [2.4–80.7] million ha⁻¹); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha⁻¹). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.

  9. Comparison of density estimation methods for astronomical datasets

    NARCIS (Netherlands)

    Ferdosi, B.J.; Buddelmeijer, H.; Trager, S.C.; Wilkinson, M.H.F.; Roerdink, J.B.T.M.

    2011-01-01

    Context. Galaxies are strongly influenced by their environment. Quantifying the galaxy density is a difficult but critical step in studying the properties of galaxies. Aims. We aim to determine differences in density estimation methods and their applicability in astronomical problems. We study the p

  10. Density estimators in particle hydrodynamics - DTFE versus regular SPH

    NARCIS (Netherlands)

    Pelupessy, FI; Schaap, WE; van de Weygaert, R

    2003-01-01

    We present the results of a study comparing density maps reconstructed by the Delaunay Tessellation Field Estimator (DTFE) and by regular SPH kernel-based techniques. The density maps are constructed from the outcome of an SPH particle hydrodynamics simulation of a multiphase interstellar medium. Th

  11. Estimating maritime snow density from seasonal climate variables

    Science.gov (United States)

    Bormann, K. J.; Evans, J. P.; Westra, S.; McCabe, M. F.; Painter, T. H.

    2013-12-01

    Snow density is a complex parameter that influences thermal, optical and mechanical snow properties and processes. Depth-integrated properties of snowpacks, including snow density, remain very difficult to obtain remotely. Observations of snow density are therefore limited to in-situ point locations. In maritime snowfields such as those in Australia and in parts of the western US, snow densification rates are enhanced and inter-annual variability is high compared to continental snow regions. In-situ snow observation networks in maritime climates often cannot characterise the variability in snowpack properties at spatial and temporal resolutions required for many modelling and observations-based applications. Regionalised density-time curves are commonly used to approximate snow densities over broad areas. However, these relationships have limited spatial applicability and do not allow for interannual variability in densification rates, which are important in maritime environments. Physically-based density models are relatively complex and rely on empirical algorithms derived from limited observations, which may not represent the variability observed in maritime snow. In this study, seasonal climate factors were used to estimate late season snow densities using multiple linear regressions. Daily snow density estimates were then obtained by projecting linearly to fresh snow densities at the start of the season. When applied spatially, the daily snow density fields compare well to in-situ observations across multiple sites in Australia, and provide a new method for extrapolating existing snow density datasets in maritime snow environments. While the relatively simple algorithm for estimating snow densities has been used in this study to constrain snowmelt rates in a temperature-index model, the estimates may also be used to incorporate variability in snow depth to snow water equivalent conversion.

  12. Quantal Response: Nonparametric Modeling

    Science.gov (United States)

    2017-01-01

    Fragmentary report front matter (figure and table-of-contents residue; Fig. 3 shows a logistic regression with spline and N-spline fits): the report develops nonparametric quantal response (QR) models relating stimulus level to probability of response; the Generalized Linear Model approach does not make use of the limit distribution but allows arbitrary functional forms. Listed contents include Nonparametric QR Models, Conclusions and Recommendations, and appendices on the Linear Model and the Generalized Linear Model.

  13. Analysis of quantization noise and state estimation with quantized measurements

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The approximate correction of the additive white noise model in the quantized Kalman filter is investigated under certain conditions. The probability density function of the error of quantized measurements is analyzed theoretically and experimentally. The analysis is based on probability theory and on a nonparametric density estimation technique, respectively. An approximator of the probability density function of the quantized measurement noise is given. The numerical results of the nonparametric density estimation algori...
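
    The "additive white noise model" being corrected is the classical idealization of uniform quantization: for a quantizer $Q$ with step size $\Delta$, the error is modeled as uniform (a standard result, not specific to this record):

$$
e = Q(y) - y \sim \mathcal{U}\!\left[-\tfrac{\Delta}{2}, \tfrac{\Delta}{2}\right],
\qquad \mathbb{E}[e] = 0, \qquad \operatorname{Var}(e) = \frac{\Delta^2}{12}.
$$

    The nonparametric density estimate described in the record then serves to quantify how far the true measurement-error density departs from this uniform approximation.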

  14. A Censored Nonparametric Software Reliability Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyses the effect of censoring on the estimation of failure rate, and presents a framework for a censored nonparametric software reliability model. The model is based on a nonparametric test of a monotonically decreasing failure rate and on weighted kernel failure rate estimation under the monotonicity constraint. Not only does the model require few assumptions and weak constraints, but the number of residual defects in the software system can also be estimated. The numerical experiment and real data analysis show that the model performs well with censored data.

  15. Diversity Sampling Based Kernel Density Estimation for Background Modeling

    Institute of Scientific and Technical Information of China (English)

    Mao Yanfen; Shi Pengfei

    2005-01-01

    A novel diversity-sampling based nonparametric multi-modal background model is proposed. Using the samples having more popular and various intensity values in the training sequence, a nonparametric model is built for background subtraction. According to the related intensities, different weights are given to the distinct samples in kernel density estimation. This avoids repeated computation using all samples, and makes computation more efficient in the evaluation phase. Experimental results show the validity of the diversity-sampling scheme and the robustness of the proposed model in moving object segmentation. The proposed algorithm can be used in outdoor surveillance systems.
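
    The weighted kernel density evaluation at the heart of such background models is simple to state; the sketch below scores one pixel's new intensity against its weighted training samples. The bandwidth, weights, and sample values are illustrative, and the paper's diversity-sampling scheme itself is not reproduced.

```python
import numpy as np

def background_probability(x, samples, weights, h=8.0):
    """Weighted Gaussian KDE over past intensity samples of a single pixel."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                    # normalize the weights
    d = (x - np.asarray(samples, dtype=float)) / h
    k = np.exp(-0.5 * d**2) / (h * np.sqrt(2.0 * np.pi))
    return float(np.sum(w * k))

samples = [100, 102, 98, 150, 101]   # intensities from the training sequence
weights = [1.0, 1.0, 1.0, 0.3, 1.0]  # down-weight the less representative sample
print(background_probability(103, samples, weights))   # high: likely background
print(background_probability(200, samples, weights))   # near zero: likely foreground
```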

  16. Application of the Vertex Exchange Method to estimate a semi-parametric mixture model for the MIC density of Escherichia coli isolates tested for susceptibility against ampicillin.

    Science.gov (United States)

    Jaspers, Stijn; Verbeke, Geert; Böhning, Dankmar; Aerts, Marc

    2016-01-01

    In the last decades, considerable attention has been paid to the collection of antimicrobial resistance data, with the aim of monitoring non-wild-type isolates. This monitoring is performed based on minimum inhibition concentration (MIC) values, which are collected through dilution experiments. We present a semi-parametric mixture model to estimate the entire MIC density on the continuous scale. The parametric first component is extended with a non-parametric second component and a new back-fitting algorithm, based on the Vertex Exchange Method, is proposed. Our data example shows how to estimate the MIC density for Escherichia coli tested for ampicillin and how to use this estimate for model-based classification. A simulation study was performed, showing the promising behavior of the new method, both in terms of density estimation and classification.

  17. A nonparametric dynamic additive regression model for longitudinal data

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  18. Multiatlas segmentation as nonparametric regression.

    Science.gov (United States)

    Awate, Suyash P; Whitaker, Ross T

    2014-09-01

    This paper proposes a novel theoretical framework to model and analyze the statistical characteristics of a wide range of segmentation methods that incorporate a database of label maps or atlases; such methods are termed label fusion or multiatlas segmentation. We model these multiatlas segmentation problems as nonparametric regression problems in the high-dimensional space of image patches. We analyze the nonparametric estimator's convergence behavior that characterizes expected segmentation error as a function of the size of the multiatlas database. We show that this error has an analytic form involving several parameters that are fundamental to the specific segmentation problem (determined by the chosen anatomical structure, imaging modality, registration algorithm, and label-fusion algorithm). We describe how to estimate these parameters and show that several human anatomical structures exhibit the trends modeled analytically. We use these parameter estimates to optimize the regression estimator. We show that the expected error for large database sizes is well predicted by models learned on small databases. Thus, a few expert segmentations can help predict the database sizes required to keep the expected error below a specified tolerance level. Such cost-benefit analysis is crucial for deploying clinical multiatlas segmentation systems.

  19. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    ... parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we ...

  20. Kernel density estimation of a multidimensional efficiency profile

    CERN Document Server

    Poluektov, Anton

    2014-01-01

    Kernel density estimation is a convenient way to estimate the probability density of a distribution given a sample of data points. However, it has certain drawbacks: proper description of the density using narrow kernels needs large data samples, whereas if the kernel width is large, boundaries and narrow structures tend to be smeared. Here, an approach to correct for such effects is proposed that uses an approximate density to describe narrow structures and boundaries. The approach is shown to be well suited for the description of the efficiency shape over a multidimensional phase space in a typical particle physics analysis. An example is given for the five-dimensional phase space of the $\Lambda_b^0 \to D^0 p \pi$ decay.
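
    One plausible way to realize the proposed correction, sketched here as an assumption rather than the paper's exact estimator: factor the density as $f(x) = f_0(x)\,c(x)$, where $f_0$ is an approximate density that already encodes boundaries and narrow structures, and estimate only the smooth correction factor $c$ nonparametrically:

$$
\hat{c}(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} \frac{K_h(x - x_i)}{f_0(x_i)},
\qquad \hat{f}(x) \;=\; f_0(x)\,\hat{c}(x).
$$

    Because $\hat c$ only needs to capture a slowly varying ratio, wide kernels can be used without smearing the structure carried by $f_0$.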

  1. A morpho-density approach to estimating neural connectivity.

    Directory of Open Access Journals (Sweden)

    Michael P McAssey

    Full Text Available Neuronal signal integration and information processing in cortical neuronal networks critically depend on the organization of synaptic connectivity. Because of the challenges involved in measuring a large number of neurons, synaptic connectivity is difficult to determine experimentally. Current computational methods for estimating connectivity typically rely on the juxtaposition of experimentally available neurons and applying mathematical techniques to compute estimates of neural connectivity. However, since the number of available neurons is very limited, these connectivity estimates may be subject to large uncertainties. We use a morpho-density field approach applied to a vast ensemble of model-generated neurons. A morpho-density field (MDF) describes the distribution of neural mass in the space around the neural soma. The estimated axonal and dendritic MDFs are derived from 100,000 model neurons that are generated by a stochastic phenomenological model of neurite outgrowth. These MDFs are then used to estimate the connectivity between pairs of neurons as a function of their inter-soma displacement. Compared with other density-field methods, our approach to estimating synaptic connectivity uses fewer restricting assumptions and produces connectivity estimates with a lower standard deviation. An important requirement is that the model-generated neurons reflect accurately the morphology and variation in morphology of the experimental neurons used for optimizing the model parameters. As such, the method remains subject to the uncertainties caused by the limited number of neurons in the experimental data set and by the quality of the model and the assumptions used in creating the MDFs and in estimating connectivity. In summary, MDFs are a powerful tool for visualizing the spatial distribution of axonal and dendritic densities, for estimating the number of potential synapses between neurons with low standard deviation, and for obtaining

  2. Optimization of volumetric breast density estimation in digital mammograms.

    Science.gov (United States)

    Holland, Katharina; Gubern-Mérida, Albert; Mann, Ritse M; Karssemeijer, Nico

    2017-05-07

    Fibroglandular tissue volume and percent density can be estimated in unprocessed mammograms using a physics-based method, which relies on an internal reference value representing the projection of fat only. However, pixels representing fat only may not be present in dense breasts, causing an underestimation of density measurements. In this work, we investigate alternative approaches for obtaining a tissue reference value to improve density estimations, particularly in dense breasts. Two of the three investigated reference values (F1, F2) are percentiles of the pixel value distribution in the breast interior (the contact area of breast and compression paddle). F1 is determined in a small breast interior, which minimizes the risk that peripheral pixels are included in the measurement at the cost of increasing the chance that no proper reference can be found. F2 is obtained using a larger breast interior. The new approach, which is developed for very dense breasts, does not require the presence of a fatty tissue region. As reference region we select the densest region in the mammogram and assume that this represents a projection of entirely dense tissue embedded between the subcutaneous fatty tissue layers. By measuring the thickness of the fat layers a reference (F3) can be computed. To obtain accurate breast density estimates irrespective of breast composition, we investigated a combination of the results of the three reference values. We collected 202 pairs of MRIs and digital mammograms from 119 women. We compared the percent dense volume estimates based on both modalities and calculated Pearson's correlation coefficients. With the references F1–F3 we found correlations of [Formula: see text], [Formula: see text] and [Formula: see text], respectively. Best results were obtained with the combination of the density estimations ([Formula: see text]). Results show that better volumetric density estimates can be obtained with the hybrid method, in particular for dense

  3. Nonparametric statistical methods

    CERN Document Server

    Hollander, Myles; Chicken, Eric

    2013-01-01

    Praise for the Second Edition: "This book should be an essential part of the personal library of every practicing statistician." (Technometrics) Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given situation.

  4. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  5. Green's function based density estimation

    Energy Technology Data Exchange (ETDEWEB)

    Kovesarki, Peter; Brock, Ian C.; Nuncio Quiroz, Adriana Elizabeth [Physikalisches Institut, Universitaet Bonn (Germany)]

    2012-07-01

    A method was developed based on Green's function identities to estimate probability densities. This can be used for likelihood estimation and for binary classification. It offers several advantages over neural networks, boosted decision trees, and other regression-based classifiers. For example, it is less prone to overtraining, and it is much easier to combine several samples. Some capabilities are demonstrated using ATLAS data.

  6. Density estimates of monarch butterflies overwintering in central Mexico

    OpenAIRE

    Thogmartin, Wayne; Diffendorfer, Jay E.; Lopez-Hoffman, Laura; Oberhauser, Karen; Pleasants, John; Semmens, Brice Xavier; Semmens, Darius; Taylor, Orley R.; Wiederholt, Ruscena

    2017-01-01

    Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There...

  7. Density Estimation in Several Populations With Uncertain Population Membership

    KAUST Repository

    Ma, Yanyuan

    2011-09-01

    We devise methods to estimate probability density functions of several populations using observations with uncertain population membership, meaning from which population an observation comes is unknown. The probability of an observation being sampled from any given population can be calculated. We develop general estimation procedures and bandwidth selection methods for our setting. We establish large-sample properties and study finite-sample performance using simulation studies. We illustrate our methods with data from a nutrition study.
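
    A natural estimator in this setting, stated as an illustration rather than the authors' exact construction, weights each observation by its membership probability:

$$
\hat{f}_k(x) \;=\; \frac{\sum_{i=1}^{n} \pi_{ik}\, K_h(x - X_i)}{\sum_{i=1}^{n} \pi_{ik}},
$$

    where $\pi_{ik}$ is the (computable) probability that observation $i$ was sampled from population $k$, and $K_h$ is a kernel with bandwidth $h$; with known memberships ($\pi_{ik} \in \{0, 1\}$) this reduces to the usual per-population kernel density estimator.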

  8. Estimation of volumetric breast density for breast cancer risk prediction

    Science.gov (United States)

    Pawluczyk, Olga; Yaffe, Martin J.; Boyd, Norman F.; Jong, Roberta A.

    2000-04-01

    Mammographic density (MD) has been shown to be a strong risk predictor for breast cancer. Compared to subjective assessment by a radiologist, computer-aided analysis of digitized mammograms provides a quantitative and more reproducible method for assessing breast density. However, the current methods of estimating breast density based on the area of bright signal in a mammogram do not reflect the true, volumetric quantity of dense tissue in the breast. A computerized method to estimate the amount of radiographically dense tissue in the overall volume of the breast has been developed to provide an automatic, user-independent tool for breast cancer risk assessment. The procedure for volumetric density estimation consists of first correcting the image for inhomogeneity, then performing a volume density calculation. First, optical sensitometry is used to convert all images to the logarithm of relative exposure (LRE), in order to simplify the image correction operations. The field non-uniformity correction, which takes into account heel effect, inverse square law, path obliquity and intrinsic field and grid non-uniformity, is obtained by imaging a spherical section PMMA phantom. The processed LRE image of the phantom is then used as a correction offset for actual mammograms. From information about the thickness and placement of the breast, as well as the parameters of a breast-like calibration step wedge placed in the mammogram, the MD of the breast is calculated. Post-processing and a simple calibration phantom enable user-independent, reliable and repeatable volumetric estimation of density in breast-equivalent phantoms. Initial results obtained on known density phantoms show the estimation to vary less than 5% in MD from the actual value. This can be compared to estimated mammographic density differences of 30% between the true and non-corrected values. Since a more simplistic breast density measurement based on the projected area has been shown to be a strong indicator

  9. Estimating neuronal connectivity from axonal and dendritic density fields

    Science.gov (United States)

    van Pelt, Jaap; van Ooyen, Arjen

    2013-01-01

    Neurons innervate space by extending axonal and dendritic arborizations. When axons and dendrites come in close proximity of each other, synapses between neurons can be formed. Neurons vary greatly in their morphologies and synaptic connections with other neurons. The size and shape of the arborizations determine the way neurons innervate space. A neuron may therefore be characterized by the spatial distribution of its axonal and dendritic "mass." A population mean "mass" density field of a particular neuron type can be obtained by averaging over the individual variations in neuron geometries. Connectivity in terms of candidate synaptic contacts between neurons can be determined directly on the basis of their arborizations but also indirectly on the basis of their density fields. To decide when a candidate synapse can be formed, we previously developed a criterion defining that axonal and dendritic line pieces should cross in 3D and have an orthogonal distance less than a threshold value. In this paper, we developed new methodology for applying this criterion to density fields. We show that estimates of the number of contacts between neuron pairs calculated from their density fields are fully consistent with the number of contacts calculated from the actual arborizations. However, the connection probability and the expected number of contacts per connection cannot be calculated directly from density fields, because density fields no longer carry the correlative structure in the spatial distribution of synaptic contacts. Alternatively, these two connectivity measures can be estimated from the expected number of contacts by using empirical mapping functions. The neurons used for the validation studies were generated by our neuron simulator NETMORPH. An example is given of the estimation of average connectivity and Euclidean pre- and postsynaptic distance distributions in a network of neurons represented by their population mean density

  10. Face Value: Towards Robust Estimates of Snow Leopard Densities.

    Directory of Open Access Journals (Sweden)

    Justine S Alexander

    Full Text Available When densities of large carnivores fall below certain thresholds, dramatic ecological effects can follow, leading to oversimplified ecosystems. Understanding the population status of such species remains a major challenge as they occur in low densities and their ranges are wide. This paper describes the use of non-invasive data collection techniques combined with recent spatial capture-recapture methods to estimate the density of snow leopards Panthera uncia. It also investigates the influence of environmental and human activity indicators on their spatial distribution. A total of 60 camera traps were systematically set up during a three-month period over a 480 km² study area in Qilianshan National Nature Reserve, Gansu Province, China. We recorded 76 separate snow leopard captures over 2,906 trap-days, representing an average capture success of 2.62 captures/100 trap-days. We identified a total of 20 unique individuals from photographs and estimated snow leopard density at 3.31 (SE = 1.01) individuals per 100 km². Results of our simulation exercise indicate that our estimates from the Spatial Capture Recapture models were not optimal with respect to bias and precision (RMSEs for density parameters less than or equal to 0.87). Our results underline the critical challenge in achieving sufficient sample sizes of snow leopard captures and recaptures. Possible performance improvements are discussed, principally by optimising effective camera capture and photographic data quality.

  11. Face Value: Towards Robust Estimates of Snow Leopard Densities.

    Science.gov (United States)

    Alexander, Justine S; Gopalaswamy, Arjun M; Shi, Kun; Riordan, Philip

    2015-01-01

    When densities of large carnivores fall below certain thresholds, dramatic ecological effects can follow, leading to oversimplified ecosystems. Understanding the population status of such species remains a major challenge as they occur in low densities and their ranges are wide. This paper describes the use of non-invasive data collection techniques combined with recent spatial capture-recapture methods to estimate the density of snow leopards Panthera uncia. It also investigates the influence of environmental and human activity indicators on their spatial distribution. A total of 60 camera traps were systematically set up during a three-month period over a 480 km² study area in Qilianshan National Nature Reserve, Gansu Province, China. We recorded 76 separate snow leopard captures over 2,906 trap-days, representing an average capture success of 2.62 captures/100 trap-days. We identified a total of 20 unique individuals from photographs and estimated snow leopard density at 3.31 (SE = 1.01) individuals per 100 km². Results of our simulation exercise indicate that our estimates from the Spatial Capture Recapture models were not optimal with respect to bias and precision (RMSEs for density parameters less than or equal to 0.87). Our results underline the critical challenge in achieving sufficient sample sizes of snow leopard captures and recaptures. Possible performance improvements are discussed, principally by optimising effective camera capture and photographic data quality.

  12. Corruption clubs: empirical evidence from kernel density estimates

    NARCIS (Netherlands)

    Herzfeld, T.; Weiss, Ch.

    2007-01-01

    A common finding of many analytical models is the existence of multiple equilibria of corruption. Countries characterized by the same economic, social and cultural background do not necessarily experience the same levels of corruption. In this article, we use Kernel Density Estimation techniques to

  14. Density estimation in tiger populations: combining information for strong inference.

    Science.gov (United States)

    Gopalaswamy, Arjun M; Royle, J Andrew; Delampady, Mohan; Nichols, James D; Karanth, K Ullas; Macdonald, David W

    2012-07-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture-recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km² [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km² and fecal DNA, 6.65 ± 2.37 tigers/100 km²). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.

  15. Density estimation in tiger populations: combining information for strong inference

    Science.gov (United States)

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Delampady, Mohan; Nichols, James D.; Karanth, K. Ullas; Macdonald, David W.

    2012-01-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture–recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km2 [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km2 and fecal DNA, 6.65 ± 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.

  16. Optimization of volumetric breast density estimation in digital mammograms

    NARCIS (Netherlands)

    Holland, K.; Gubern Merida, A.; Mann, R.M.; Karssemeijer, N.

    2017-01-01

    Fibroglandular tissue volume and percent density can be estimated in unprocessed mammograms using a physics-based method, which relies on an internal reference value representing the projection of fat only. However, pixels representing fat only may not be present in dense breasts, causing an

  17. State of the Art in Photon-Density Estimation

    DEFF Research Database (Denmark)

    Hachisuka, Toshiya; Jarosz, Wojciech; Georgiev, Iliyan

    2013-01-01

    Photon-density estimation techniques are a popular choice for simulating light transport in scenes with complicated geometry and materials. This class of algorithms can be used to accurately simulate inter-reflections, caustics, color bleeding, scattering in participating media, and subsurface scattering...
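
    For context, the classic photon-map radiance estimate (due to Jensen) that underlies this family of methods averages the flux of the $k$ nearest photons found within a radius $r(\mathbf{x})$ around the shading point:

$$
L_r(\mathbf{x}, \omega) \;\approx\; \frac{1}{\pi r(\mathbf{x})^2} \sum_{p=1}^{k} f_r(\mathbf{x}, \omega_p, \omega)\,\Delta\Phi_p,
$$

    where $f_r$ is the BRDF at $\mathbf{x}$ and $\Delta\Phi_p$ the flux carried by photon $p$; the bias-variance trade-off in choosing $r$ is exactly the kernel-bandwidth trade-off of density estimation.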

  18. State of the Art in Photon Density Estimation

    DEFF Research Database (Denmark)

    Hachisuka, Toshiya; Jarosz, Wojciech; Bouchard, Guillaume

    2012-01-01

    Photon-density estimation techniques are a popular choice for simulating light transport in scenes with complicated geometry and materials. This class of algorithms can be used to accurately simulate inter-reflections, caustics, color bleeding, scattering in participating media, and subsurface scattering...

  19. Estimation of the space density of low surface brightness galaxies

    NARCIS (Netherlands)

    Briggs, FH

    1997-01-01

    The space density of low surface brightness and tiny gas-rich dwarf galaxies is estimated for two recent catalogs: the Arecibo Survey of Northern Dwarf and Low Surface Brightness Galaxies and the Catalog of Low Surface Brightness Galaxies, List II. The goals are (1) to evaluate the additions to the

  20. State of the Art in Photon Density Estimation

    DEFF Research Database (Denmark)

    Hachisuka, Toshiya; Jarosz, Wojciech; Bouchard, Guillaume

    2012-01-01

    ... scattering. Since its introduction, photon-density estimation has been significantly extended in computer graphics with the introduction of: specialized techniques that intelligently modify the positions or bandwidths to reduce visual error using a small number of photons, approaches that eliminate error...

  1. Why preferring parametric forecasting to nonparametric methods?

    Science.gov (United States)

    Jabot, Franck

    2015-05-07

    A recent series of papers by Charles T. Perretti and collaborators has shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It will be argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed thanks to simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Simplified large African carnivore density estimators from track indices

    Directory of Open Access Journals (Sweden)

    Christiaan W. Winterbach

    2016-12-01

    Full Text Available Background The range, population size and trend of large carnivores are important parameters to assess their status globally and to plan conservation strategies. One can use linear models to assess population size and trends of large carnivores from track-based surveys on suitable substrates. The conventional approach of a linear model with intercept may not intercept at zero, but may fit the data better than a linear model through the origin. We assess whether a linear regression through the origin is more appropriate than a linear regression with intercept to model large African carnivore densities and track indices. Methods We did simple linear regression with intercept analysis and simple linear regression through the origin and used the confidence interval for β in the linear model y = αx + β, the Standard Error of Estimate, Mean Squares Residual and Akaike Information Criteria to evaluate the models. Results The Lion on Clay and Low Density on Sand models with intercept were not significant (P > 0.05). The other four models with intercept and the six models through the origin were all significant (P < 0.05). The models using linear regression with intercept all included zero in the confidence interval for β and the null hypothesis that β = 0 could not be rejected. All models showed that the linear model through the origin provided a better fit than the linear model with intercept, as indicated by the Standard Error of Estimate and Mean Square Residuals. Akaike Information Criteria showed that linear models through the origin were better and that none of the linear models with intercept had substantial support. Discussion Our results showed that linear regression through the origin is justified over the more typical linear regression with intercept for all models we tested. A general model can be used to estimate large carnivore densities from track densities across species and study areas. The formula observed track density = 3.26
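
    The model comparison described (through-origin versus with-intercept, judged by AIC) is easy to reproduce; below is a hedged sketch on simulated data. The 3.26 slope merely echoes the general model quoted in the abstract, and the data, noise level, and sample size are made up.

```python
import numpy as np

def fit_aic(X, y):
    """Ordinary least squares; return coefficients and Gaussian AIC."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, k = X.shape
    sigma2 = np.mean(resid**2)                        # ML variance estimate
    loglik = -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)
    return beta, 2 * (k + 1) - 2 * loglik             # +1 parameter for sigma^2

rng = np.random.default_rng(7)
track = rng.uniform(0.1, 5.0, 40)                     # track index (tracks/km)
carn = 3.26 * track + rng.normal(0.0, 0.8, 40)        # simulated carnivore density

X_origin = track[:, None]                             # model through the origin
X_inter = np.column_stack([track, np.ones_like(track)])  # model with intercept

for name, X in [("origin", X_origin), ("intercept", X_inter)]:
    beta, aic = fit_aic(X, carn)
    print(f"{name:>9}: coef={np.round(beta, 2)}, AIC={aic:.1f}")
```

    When the true intercept is zero, the intercept model wastes a parameter, so its AIC is typically (slightly) worse, matching the record's conclusion.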

  3. SVM for density estimation and application to medical image segmentation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhao; ZHANG Su; ZHANG Chen-xi; CHEN Ya-zhu

    2006-01-01

    A method of medical image segmentation based on support vector machine (SVM) density estimation is presented. We used this estimator to construct a prior model of the image intensity and curvature profile of the structure from training images. When segmenting a novel image similar to the training images, the narrow-band level set technique is used. The higher dimensional surface evolution metric is defined by the prior model instead of by an energy minimization function. This method offers several advantages. First, SVM for density estimation is consistent and its solution is sparse. Second, compared to the traditional level set methods, this method incorporates shape information on the object to be segmented into the segmentation process. Segmentation results are demonstrated on synthetic images, MR images and ultrasonic images.

  4. Evaluating lidar point densities for effective estimation of aboveground biomass

    Science.gov (United States)

    Wu, Zhuoting; Dye, Dennis G.; Stoker, Jason M.; Vogel, John M.; Velasco, Miguel G.; Middleton, Barry R.

    2016-01-01

    The U.S. Geological Survey (USGS) 3D Elevation Program (3DEP) was recently established to provide airborne lidar data coverage on a national scale. As part of a broader research effort of the USGS to develop an effective remote sensing-based methodology for the creation of an operational biomass Essential Climate Variable (Biomass ECV) data product, we evaluated the performance of airborne lidar data at various pulse densities against Landsat 8 satellite imagery in estimating aboveground biomass for forests and woodlands in a study area in east-central Arizona, U.S. High point density airborne lidar data were randomly sampled to produce five lidar datasets with reduced densities ranging from 0.5 to 8 points/m², corresponding to the point density range of 3DEP for providing national lidar coverage over time. Lidar-derived aboveground biomass estimate errors showed an overall decreasing trend as lidar point density increased from 0.5 to 8 points/m². Landsat 8-based aboveground biomass estimates produced errors larger than the lowest lidar point density of 0.5 point/m², and therefore Landsat 8 observations alone were ineffective relative to airborne lidar for generating a Biomass ECV product, at least for the forest and woodland vegetation types of the Southwestern U.S. While a national Biomass ECV product with optimal accuracy could potentially be achieved with 3DEP data at 8 points/m², our results indicate that even lower density lidar data could be sufficient to provide a national Biomass ECV product with accuracies significantly higher than that from Landsat observations alone.
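
    The random-thinning step that produces reduced-density datasets is straightforward; a minimal sketch follows, using a synthetic point cloud (tile size, point counts, and densities are illustrative, not the study's data).

```python
import numpy as np

def thin_to_density(points, area_m2, target_density, rng=None):
    """Randomly subsample an (N, 3) array of lidar returns to points/m^2."""
    rng = rng or np.random.default_rng()
    n_target = int(target_density * area_m2)
    if n_target >= len(points):
        return points                       # already at or below target density
    idx = rng.choice(len(points), size=n_target, replace=False)
    return points[idx]

rng = np.random.default_rng(3)
cloud = rng.uniform(0.0, 100.0, size=(80_000, 3))   # synthetic 100 m x 100 m tile
for d in [0.5, 1, 2, 4, 8]:                         # the 3DEP-like density range
    print(d, thin_to_density(cloud, 100 * 100, d, rng).shape)
```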

  5. Wavelet Detection and Estimation of Change Points in Nonparametric Regression Models under Random Design

    Institute of Scientific and Technical Information of China (English)

    Zhao Wenzhi; Tian Zheng; Xia Zhiming

    2009-01-01

    A wavelet method for the detection and estimation of change points in nonparametric regression models under random design is proposed. The confidence bound of our test is derived by using test statistics based on empirical wavelet coefficients, as obtained by wavelet transformation of the data, which are observed with noise. Moreover, the consistency of the test is proved and the rate of convergence is given. The method turns out to be effective after being tested on simulated examples and applied to IBM stock market data.

  6. Estimation of Enceladus Plume Density Using Cassini Flight Data

    Science.gov (United States)

    Wang, Eric K.; Lee, Allan Y.

    2011-01-01

    The Cassini spacecraft was launched on October 15, 1997 by a Titan 4B launch vehicle. After an interplanetary cruise of almost seven years, it arrived at Saturn on June 30, 2004. In 2005, Cassini completed three flybys of Enceladus, a small, icy satellite of Saturn. Observations made during these flybys confirmed the existence of water vapor plumes in the south polar region of Enceladus. Five additional low-altitude flybys of Enceladus were successfully executed in 2008-9 to better characterize these watery plumes. During some of these Enceladus flybys, the spacecraft attitude was controlled by a set of three reaction wheels. When the disturbance torque imparted on the spacecraft was predicted to exceed the control authority of the reaction wheels, thrusters were used to control the spacecraft attitude. Using telemetry data of reaction wheel rates or thruster on-times collected from four low-altitude Enceladus flybys (in 2008-10), one can reconstruct the time histories of the Enceladus plume jet density. The 1 sigma uncertainty of the estimated density is 5.9-6.7% (depending on the density estimation methodology employed). These plume density estimates could be used to confirm measurements made by other onboard science instruments and to support the modeling of Enceladus plume jets.

  7. Open-cluster density profiles derived using a kernel estimator

    CERN Document Server

    Seleznev, Anton F

    2016-01-01

    Surface and spatial radial density profiles in open clusters are derived using a kernel estimator method. Formulae are obtained for the contribution of every star into the spatial density profile. The evaluation of spatial density profiles is tested against open-cluster models from N-body experiments with N = 500. Surface density profiles are derived for seven open clusters (NGC 1502, 1960, 2287, 2516, 2682, 6819 and 6939) using Two-Micron All-Sky Survey data and for different limiting magnitudes. The selection of an optimal kernel half-width is discussed. It is shown that open-cluster radius estimates hardly depend on the kernel half-width. Hints of stellar mass segregation and structural features indicating cluster non-stationarity in the regular force field are found. A comparison with other investigations shows that the data on open-cluster sizes are often underestimated. The existence of an extended corona around the open cluster NGC 6939 was confirmed. A combined function composed of the King density pr...

  8. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural breaks in correlations. Only when correlations are constant does the parametric DCC model deliver the best outcome. The methodologies are illustrated by evaluating two interesting portfolios. The first portfolio consists of the equity sector SPDRs and the S&P 500, while the second one contains major currencies. Results show the nonparametric model generally dominates the others when evaluating in-sample. However, the semiparametric model is best for out-of-sample analysis.

  9. Bayesian error estimation in density-functional theory

    DEFF Research Database (Denmark)

    Mortensen, Jens Jørgen; Kaasbjerg, Kristen; Frederiksen, Søren Lund

    2005-01-01

    We present a practical scheme for performing error estimates for density-functional theory calculations. The approach, which is based on ideas from Bayesian statistics, involves creating an ensemble of exchange-correlation functionals by comparing with an experimental database of binding energies for molecules and solids. Fluctuations within the ensemble can then be used to estimate errors relative to experiment on calculated quantities such as binding energies, bond lengths, and vibrational frequencies. It is demonstrated that the error bars on energy differences may vary by orders of magnitude...

  11. Nonparametric Predictive Regression

    OpenAIRE

    Ioannis Kasparis; Elena Andreou; Phillips, Peter C.B.

    2012-01-01

    A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors including stationary as well as non-stationary fractional and near unit...

  12. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    Science.gov (United States)

    2014-09-30

    ... interactions with human activity requires knowledge of how many animals are present in an area during a specific time period. Many marine mammal species ... Ocean at Wake Island will then be applied to the same species in the Indian Ocean at the CTBTO location at Diego Garcia. ... 1. Develop and implement ... The proposed density estimation method is also highly dependent on call rate inputs, which are used in the development of species-specific multipliers for ...

  13. Some Bayesian statistical techniques useful in estimating frequency and density

    Science.gov (United States)

    Johnson, D.H.

    1977-01-01

    This paper presents some elementary applications of Bayesian statistics to problems faced by wildlife biologists. Bayesian confidence limits for frequency of occurrence are shown to be generally superior to classical confidence limits. Population density can be estimated from frequency data if the species is sparsely distributed relative to the size of the sample plot. For other situations, limits are developed based on the normal distribution and prior knowledge that the density is non-negative, which ensures that the lower confidence limit is non-negative. Conditions are described under which Bayesian confidence limits are superior to those calculated with classical methods; examples are also given on how prior knowledge of the density can be used to sharpen inferences drawn from a new sample.
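
    To make the frequency-to-density step concrete: under the common assumption that individuals are Poisson distributed over plots, occupancy probability p and density D are linked by p = 1 - exp(-D*a) for plot area a, and a Beta posterior on p transforms directly into a credible interval for D. The sketch below illustrates only that standard logic (uniform prior, made-up numbers); it is not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import beta

def density_credible_interval(occupied, n_plots, plot_area, level=0.95):
    """Bayesian credible interval for population density D from
    frequency-of-occurrence data, assuming individuals are Poisson
    distributed so that P(plot occupied) = 1 - exp(-D * plot_area).
    Uses a uniform Beta(1, 1) prior on the occupancy probability."""
    post = beta(1 + occupied, 1 + n_plots - occupied)
    lo, hi = post.ppf([(1 - level) / 2, (1 + level) / 2])
    # Invert p = 1 - exp(-D a); the map is monotone, so posterior
    # quantiles of p map directly to quantiles of D.
    to_density = lambda p: -np.log(1 - p) / plot_area
    return to_density(lo), to_density(hi)

# 12 of 50 plots of 0.1 ha occupied -> interval for animals per ha.
print(density_credible_interval(occupied=12, n_plots=50, plot_area=0.1))
```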

  14. Covariance and correlation estimation in electron-density maps.

    Science.gov (United States)

    Altomare, Angela; Cuocci, Corrado; Giacovazzo, Carmelo; Moliterni, Anna; Rizzi, Rosanna

    2012-03-01

    Quite recently two papers have been published [Giacovazzo & Mazzone (2011). Acta Cryst. A67, 210-218; Giacovazzo et al. (2011). Acta Cryst. A67, 368-382] which calculate the variance in any point of an electron-density map at any stage of the phasing process. The main aim of the papers was to associate a standard deviation to each pixel of the map, in order to obtain a better estimate of the map reliability. This paper deals with the covariance estimate between points of an electron-density map in any space group, centrosymmetric or non-centrosymmetric, no matter the correlation between the model and target structures. The aim is as follows: to verify if the electron density in one point of the map is amplified or depressed as an effect of the electron density in one or more other points of the map. High values of the covariances are usually connected with undesired features of the map. The phases are the primitive random variables of our probabilistic model; the covariance changes with the quality of the model and therefore with the quality of the phases. The conclusive formulas show that the covariance is also influenced by the Patterson map. Uncertainty on measurements may influence the covariance, particularly in the final stages of the structure refinement; a general formula is obtained taking into account both phase and measurement uncertainty, valid at any stage of the crystal structure solution.

  15. Accurate photometric redshift probability density estimation - method comparison and application

    CERN Document Server

    Rau, Markus Michael; Brimioulle, Fabrice; Frank, Eibe; Friedrich, Oliver; Gruen, Daniel; Hoyle, Ben

    2015-01-01

    We introduce an ordinal classification algorithm for photometric redshift estimation, which vastly improves the reconstruction of photometric redshift probability density functions (PDFs) for individual galaxies and galaxy samples. As a use case we apply our method to CFHTLS galaxies. The ordinal classification algorithm treats distinct redshift bins as ordered values, which improves the quality of photometric redshift PDFs, compared with non-ordinal classification architectures. We also propose a new single value point estimate of the galaxy redshift, that can be used to estimate the full redshift PDF of a galaxy sample. This method is competitive in terms of accuracy with contemporary algorithms, which stack the full redshift PDFs of all galaxies in the sample, but requires orders of magnitudes less storage space. The methods described in this paper greatly improve the log-likelihood of individual object redshift PDFs, when compared with a popular Neural Network code (ANNz). In our use case, this improvemen...

  16. Nonparametric model reconstruction for stochastic differential equations from discretely observed time-series data.

    Science.gov (United States)

    Ohkubo, Jun

    2011-12-01

    A scheme is developed for estimating state-dependent drift and diffusion coefficients in a stochastic differential equation from time-series data. The scheme does not require parametric forms for the drift and diffusion coefficients to be specified in advance. In order to perform the nonparametric estimation, a maximum likelihood method is combined with a concept based on kernel density estimation. In order to deal with discrete observations or sparsity of the time-series data, a local linearization method is employed, which enables fast estimation.
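
    For intuition, the kernel idea can be shown without the likelihood and local-linearization machinery: a Nadaraya-Watson smoother applied to the conditional moments of the increments already gives nonparametric drift and diffusion estimates. The sketch below illustrates only that simpler textbook estimator, not the paper's algorithm.

```python
import numpy as np

def kernel_drift_diffusion(x_series, dt, x_grid, h):
    """Nadaraya-Watson estimates of state-dependent drift a(x) and
    squared diffusion b^2(x) from a discretely observed path, using
    Gaussian-kernel-weighted conditional moments of the increments.
    Note diff2 estimates b^2 only up to an O(dt) bias."""
    x, dx = x_series[:-1], np.diff(x_series)
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    drift = w @ (dx / dt)
    diff2 = w @ (dx ** 2 / dt)
    return drift, diff2

# Example: Ornstein-Uhlenbeck path with true drift -x, diffusion^2 = 1.
rng = np.random.default_rng(1)
dt, n = 0.01, 50_000
x = np.empty(n); x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * dt + np.sqrt(dt) * rng.standard_normal()
grid = np.linspace(-2, 2, 9)
drift, diff2 = kernel_drift_diffusion(x, dt, grid, h=0.2)
print(drift.round(2))   # close to -grid
print(diff2.round(2))   # close to 1
```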

  17. A projection and density estimation method for knowledge discovery.

    Science.gov (United States)

    Stanski, Adam; Hellwich, Olaf

    2012-01-01

    A key ingredient to modern data analysis is probability density estimation. However, it is well known that the curse of dimensionality prevents a proper estimation of densities in high dimensions. The problem is typically circumvented by using a fixed set of assumptions about the data, e.g., by assuming partial independence of features, data on a manifold or a customized kernel. These fixed assumptions limit the applicability of a method. In this paper we propose a framework that uses a flexible set of assumptions instead. It allows to tailor a model to various problems by means of 1d-decompositions. The approach achieves a fast runtime and is not limited by the curse of dimensionality as all estimations are performed in 1d-space. The wide range of applications is demonstrated at two very different real world examples. The first is a data mining software that allows the fully automatic discovery of patterns. The software is publicly available for evaluation. As a second example an image segmentation method is realized. It achieves state of the art performance on a benchmark dataset although it uses only a fraction of the training data and very simple features.

  18. A projection and density estimation method for knowledge discovery.

    Directory of Open Access Journals (Sweden)

    Adam Stanski

    A key ingredient to modern data analysis is probability density estimation. However, it is well known that the curse of dimensionality prevents a proper estimation of densities in high dimensions. The problem is typically circumvented by using a fixed set of assumptions about the data, e.g., by assuming partial independence of features, data on a manifold or a customized kernel. These fixed assumptions limit the applicability of a method. In this paper we propose a framework that uses a flexible set of assumptions instead. It allows to tailor a model to various problems by means of 1d-decompositions. The approach achieves a fast runtime and is not limited by the curse of dimensionality as all estimations are performed in 1d-space. The wide range of applications is demonstrated at two very different real world examples. The first is a data mining software that allows the fully automatic discovery of patterns. The software is publicly available for evaluation. As a second example an image segmentation method is realized. It achieves state of the art performance on a benchmark dataset although it uses only a fraction of the training data and very simple features.

  19. Effect of Random Clustering on Surface Damage Density Estimates

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, M J; Feit, M D

    2007-10-29

    Identification and spatial registration of laser-induced damage relative to incident fluence profiles is often required to characterize the damage properties of laser optics near damage threshold. Of particular interest in inertial confinement laser systems are large aperture beam damage tests (>1 cm2) where the number of initiated damage sites for fluences φ > 14 J/cm2 can approach 10^5-10^6, requiring automatic microscopy counting to locate and register individual damage sites. However, as was shown for the case of bacteria counting in biology decades ago, random overlapping or 'clumping' prevents accurate counting of Poisson-distributed objects at high densities, and must be accounted for if the underlying statistics are to be understood. In this work we analyze the effect of random clumping on damage initiation density estimates at fluences above damage threshold. The parameter ψ = aρ = ρ/ρ0, where a = 1/ρ0 is the mean damage site area and ρ is the mean number density, is used to characterize the onset of clumping, and approximations based on a simple model are used to derive an expression for clumped damage density vs. fluence and damage site size. The influence of the uncorrected ρ vs. φ curve on damage initiation probability predictions is also discussed.

  20. A Concept of Approximated Densities for Efficient Nonlinear Estimation

    Directory of Open Access Journals (Sweden)

    Virginie F. Ruiz

    2002-10-01

    This paper presents the theoretical development of a nonlinear adaptive filter based on a concept of filtering by approximated densities (FAD). The most common procedures for nonlinear estimation apply the extended Kalman filter. As opposed to conventional techniques, the proposed recursive algorithm does not require any linearisation. The prediction uses a maximum entropy principle subject to constraints. Thus, the densities created are of an exponential type and depend on a finite number of parameters. The filtering yields recursive equations involving these parameters. The update applies the Bayes theorem. Through simulation on a generic exponential model, the proposed nonlinear filter is implemented and the results prove to be superior to those of the extended Kalman filter and a class of nonlinear filters based on partitioning algorithms.

  1. Estimation of probability densities using scale-free field theories.

    Science.gov (United States)

    Kinney, Justin B

    2014-07-01

    The question of how best to estimate a continuous probability density from finite data is an intriguing open problem at the interface of statistics and physics. Previous work has argued that this problem can be addressed in a natural way using methods from statistical field theory. Here I describe results that allow this field-theoretic approach to be rapidly and deterministically computed in low dimensions, making it practical for use in day-to-day data analysis. Importantly, this approach does not impose a privileged length scale for smoothness of the inferred probability density, but rather learns a natural length scale from the data due to the tradeoff between goodness of fit and an Occam factor. Open source software implementing this method in one and two dimensions is provided.

  2. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction but data rarely satisfy testable conditions. To overcome this we provide a way to estimate homothetic efficiency of ...

  3. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain) during June 11–16, 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  4. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    Science.gov (United States)

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and
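
    The core computation is compact enough to sketch. The following is a minimal, hypothetical illustration of the single-line Kendall-Theil model exactly as described above (slope as the median of all pairwise slopes, line through the data medians); it omits the program's multisegment models, confidence intervals, and bias-correction factor.

```python
import numpy as np
from itertools import combinations

def kendall_theil_line(x, y):
    """Kendall-Theil (Theil-Sen) robust line: slope is the median of
    all pairwise slopes; the intercept is chosen so the line passes
    through the medians of x and y."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)
              if x[j] != x[i]]
    slope = np.median(slopes)
    intercept = np.median(y) - slope * np.median(x)
    return slope, intercept

x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([1.1, 1.9, 3.2, 3.9, 12.0, 6.1])   # one gross outlier
print(kendall_theil_line(x, y))                 # slope stays near 1
```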

  5. Change-in-ratio density estimator for feral pigs is less biased than closed mark-recapture estimates

    Science.gov (United States)

    Hanson, L.B.; Grand, J.B.; Mitchell, M.S.; Jolley, D.B.; Sparklin, B.D.; Ditchkoff, S.S.

    2008-01-01

    Closed-population capture-mark-recapture (CMR) methods can produce biased density estimates for species with low or heterogeneous detection probabilities. In an attempt to address such biases, we developed a density-estimation method based on the change in ratio (CIR) of survival between two populations where survival, calculated using an open-population CMR model, is known to differ. We used our method to estimate density for a feral pig (Sus scrofa) population on Fort Benning, Georgia, USA. To assess its validity, we compared it to an estimate of the minimum density of pigs known to be alive and two estimates based on closed-population CMR models. Comparison of the density estimates revealed that the CIR estimator produced a density estimate with low precision that was reasonable with respect to minimum known density. By contrast, density point estimates using the closed-population CMR models were less than the minimum known density, consistent with biases created by low and heterogeneous capture probabilities for species like feral pigs that may occur in low density or are difficult to capture. Our CIR density estimator may be useful for tracking broad-scale, long-term changes in species, such as large cats, for which closed CMR models are unlikely to work. © CSIRO 2008.

  6. The method of separation for evolutionary spectral density estimation of multi-variate and multi-dimensional non-stationary stochastic processes

    KAUST Repository

    Schillinger, Dominik

    2013-07-01

    The method of separation can be used as a non-parametric estimation technique, especially suitable for evolutionary spectral density functions of uniformly modulated and strongly narrow-band stochastic processes. The paper at hand provides a consistent derivation of method of separation based spectrum estimation for the general multi-variate and multi-dimensional case. The validity of the method is demonstrated by benchmark tests with uniformly modulated spectra, for which convergence to the analytical solution is demonstrated. The key advantage of the method of separation is the minimization of spectral dispersion due to optimum time- or space-frequency localization. This is illustrated by the calibration of multi-dimensional and multi-variate geometric imperfection models from strongly narrow-band measurements in I-beams and cylindrical shells. Finally, the application of the method of separation based estimates for the stochastic buckling analysis of the example structures is briefly discussed. © 2013 Elsevier Ltd.

  7. Matrix Methods for Estimating the Coherence Functions from Estimates of the Cross-Spectral Density Matrix

    Directory of Open Access Journals (Sweden)

    D.O. Smallwood

    1996-01-01

    It is shown that the usual method for estimating the coherence functions (ordinary, partial, and multiple) for a general multiple-input/multiple-output problem can be expressed as a modified form of Cholesky decomposition of the cross-spectral density matrix of the input and output records. The results can be equivalently obtained using singular value decomposition (SVD) of the cross-spectral density matrix. Using SVD suggests a new form of fractional coherence. The formulation as an SVD problem also suggests a way to order the inputs when a natural physical order of the inputs is absent.
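
    As a concrete starting point, the cross-spectral density matrix itself is easy to estimate with standard tools, and the ordinary coherence falls out directly; the partial and multiple coherences discussed above then follow from a Cholesky (or SVD) factorization of G(f) at each frequency, which this sketch does not reproduce. Function names here are illustrative.

```python
import numpy as np
from scipy.signal import csd

def csd_matrix(records, fs, nperseg=256):
    """Estimate the full cross-spectral density matrix G(f) for a list
    of simultaneously sampled records; G has shape (nfreq, n, n)."""
    n = len(records)
    f, _ = csd(records[0], records[0], fs=fs, nperseg=nperseg)
    G = np.empty((f.size, n, n), dtype=complex)
    for i in range(n):
        for j in range(n):
            _, G[:, i, j] = csd(records[i], records[j], fs=fs, nperseg=nperseg)
    return f, G

def ordinary_coherence(G, i, j):
    """Ordinary coherence |Gij|^2 / (Gii * Gjj) from the CSD matrix."""
    return np.abs(G[:, i, j]) ** 2 / (G[:, i, i].real * G[:, j, j].real)

rng = np.random.default_rng(5)
x = rng.standard_normal(4096)
y = 0.7 * x + 0.3 * rng.standard_normal(4096)   # y partly driven by x
f, G = csd_matrix([x, y], fs=1.0)
print(ordinary_coherence(G, 0, 1).mean())        # high across frequencies
```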

  8. Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G A

    2004-09-21

    The work reported here shows the proof of principle (using a small data set) for a suite of algorithms designed to estimate the probability density function of hyperspectral background data and compute the appropriate Constant False Alarm Rate (CFAR) matched filter decision threshold for a chemical plume detector. Future work will provide a thorough demonstration of the algorithms and their performance with a large data set. The LASI (Large Aperture Search Initiative) Project involves instrumentation and image processing for hyperspectral images of chemical plumes in the atmosphere. The work reported here involves research and development on algorithms for reducing the false alarm rate in chemical plume detection and identification algorithms operating on hyperspectral image cubes. The chemical plume detection algorithms to date have used matched filters designed using generalized maximum likelihood ratio hypothesis testing algorithms [1, 2, 5, 6, 7, 12, 10, 11, 13]. One of the key challenges in hyperspectral imaging research is the high false alarm rate that often results from the plume detector [1, 2]. The overall goal of this work is to extend the classical matched filter detector to apply Constant False Alarm Rate (CFAR) methods to reduce the false alarm rate, or probability of false alarm P_FA, of the matched filter [4, 8, 9, 12]. A detector designer is interested in minimizing the probability of false alarm while simultaneously maximizing the probability of detection P_D. This is summarized by the Receiver Operating Characteristic Curve (ROC) [10, 11], which is actually a family of curves depicting P_D vs. P_FA, parameterized by varying levels of signal to noise (or clutter) ratio (SNR or SCR). Often, it is advantageous to be able to specify a desired P_FA and develop a ROC curve (P_D vs. decision threshold r_0) for that case. That is the purpose of this work. Specifically, this work develops a set of algorithms and MATLAB...

  9. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved, and real applications are illustrated using examples. Theory and exercises are provided. The incorrect use of many tests in most statistical software packages is highlighted and discussed.

  10. Asymptotic Normality of Quadratic Estimators.

    Science.gov (United States)

    Robins, James; Li, Lingling; Tchetgen, Eric; van der Vaart, Aad

    2016-12-01

    We prove conditional asymptotic normality of a class of quadratic U-statistics that are dominated by their degenerate second-order part and have kernels that change with the number of observations. These statistics arise in the construction of estimators in high-dimensional semi- and non-parametric models, and in the construction of nonparametric confidence sets. This is illustrated by estimation of the integral of a square of a density or regression function, and estimation of the mean response with missing data. We show that estimators are asymptotically normal even in the case that the rate is slower than the square root of the number of observations.

  11. Large Scale Density Estimation of Blue and Fin Whales: Utilizing Sparse Array Data to Develop and Implement a New Method for Estimating Blue and Fin Whale Density

    Science.gov (United States)

    2015-09-30

    Len Thomas & Danielle Harris, Centre ... The project aims to develop and implement a new method for estimating blue and fin whale density that is effective over large spatial scales and is designed to cope ...

  12. Some asymptotic results on density estimators by wavelet projections

    CERN Document Server

    Varron, Davit

    2012-01-01

    Let $(X_i)_{i\geq 1}$ be an i.i.d. sample on $\mathbb{R}^d$ having density $f$. Given a real function $\phi$ on $\mathbb{R}^d$ with finite variation and given an integer-valued sequence $(j_n)$, let $\hat{f}_n$ denote the estimator of $f$ by wavelet projection based on $\phi$ and with multiresolution level equal to $j_n$. We provide exact rates of almost sure convergence to 0 of the quantity $\sup_{x\in H} |\hat{f}_n(x)-\mathbb{E}(\hat{f}_n)(x)|$, when $n2^{-dj_n}/\log n \rightarrow \infty$ and $H$ is a given hypercube of $\mathbb{R}^d$. We then show that, if $n2^{-dj_n}/\log n \rightarrow c$ for a constant $c>0$, then the quantity $\sup_{x\in H} |\hat{f}_n(x)-f(x)|$ almost surely fails to converge to 0.

  13. Cortical cell and neuron density estimates in one chimpanzee hemisphere.

    Science.gov (United States)

    Collins, Christine E; Turner, Emily C; Sawyer, Eva Kille; Reed, Jamie L; Young, Nicole A; Flaherty, David K; Kaas, Jon H

    2016-01-19

    The density of cells and neurons in the neocortex of many mammals varies across cortical areas and regions. This variability is, perhaps, most pronounced in primates. Nonuniformity in the composition of cortex suggests regions of the cortex have different specializations. Specifically, regions with densely packed neurons contain smaller neurons that are activated by relatively few inputs, thereby preserving information, whereas regions that are less densely packed have larger neurons that have more integrative functions. Here we present the numbers of cells and neurons for 742 discrete locations across the neocortex in a chimpanzee. Using isotropic fractionation and flow fractionation methods for cell and neuron counts, we estimate that neocortex of one hemisphere contains 9.5 billion cells and 3.7 billion neurons. Primary visual cortex occupies 35 cm2 of surface, 10% of the total, and contains 737 million densely packed neurons, 20% of the total neurons contained within the hemisphere. Other areas of high neuron packing include secondary visual areas, somatosensory cortex, and prefrontal granular cortex. Areas of low levels of neuron packing density include motor and premotor cortex. These values reflect those obtained from more limited samples of cortex in humans and other primates.

  14. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent...

  15. Cheap DECAF: Density Estimation for Cetaceans from Acoustic Fixed Sensors Using Separate, Non-Linked Devices

    Science.gov (United States)

    2014-09-30

    ... cetaceans using passive fixed acoustics rely on large, dense arrays of cabled hydrophones and/or auxiliary information from animal tagging projects ... estimating cetacean density. Therefore, the goal of Cheap DECAF is to focus on the development of cetacean density estimation methods using sensors that ...

  16. Estimating the richness of a population when the maximum number of classes is fixed: a nonparametric solution to an archaeological problem.

    Directory of Open Access Journals (Sweden)

    Metin I Eren

    BACKGROUND: Estimating assemblage species or class richness from samples remains a challenging, but essential, goal. Though a variety of statistical tools for estimating species or class richness have been developed, they are all singly-bounded: assuming only a lower bound of species or classes. Nevertheless there are numerous situations, particularly in the cultural realm, where the maximum number of classes is fixed. For this reason, a new method is needed to estimate richness when both upper and lower bounds are known. METHODOLOGY/PRINCIPAL FINDINGS: Here, we introduce a new method for estimating class richness: doubly-bounded confidence intervals (both lower and upper bounds are known). We specifically illustrate our new method using the Chao1 estimator, rarefaction, and extrapolation, although any estimator of asymptotic richness can be used in our method. Using a case study of Clovis stone tools from the North American Lower Great Lakes region, we demonstrate that singly-bounded richness estimators can yield confidence intervals with upper bound estimates larger than the possible maximum number of classes, while our new method provides estimates that make empirical sense. CONCLUSIONS/SIGNIFICANCE: Application of the new method for constructing doubly-bounded richness estimates of Clovis stone tools permitted conclusions to be drawn that were not otherwise possible with singly-bounded richness estimates, namely, that Lower Great Lakes Clovis Paleoindians utilized a settlement pattern that was probably more logistical in nature than residential. However, our new method is not limited to archaeological applications. It can be applied to any set of data for which there is a fixed maximum number of classes, whether that be site occupancy models, commercial products (e.g. athletic shoes), or census information (e.g. nationality, religion, age, race).
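
    For orientation, the Chao1 point estimate mentioned above is a one-line formula, and the simplest way to respect a known maximum is to cap the estimate at it. The sketch below shows only that simplification; the paper's actual contribution is bounding the confidence interval, not just the point estimate.

```python
import numpy as np

def chao1_doubly_bounded(counts, max_classes):
    """Bias-corrected Chao1 richness estimate, capped at a known
    maximum number of classes (a hypothetical simplification of the
    doubly-bounded idea; the paper bounds the confidence interval)."""
    counts = np.asarray(counts)
    s_obs = (counts > 0).sum()          # observed classes
    f1 = (counts == 1).sum()            # singletons
    f2 = (counts == 2).sum()            # doubletons
    s_chao1 = s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))
    return min(s_chao1, max_classes)

# 7 observed classes, raw Chao1 = 10, but only 8 classes can exist.
print(chao1_doubly_bounded([5, 3, 1, 1, 1, 2, 1], max_classes=8))
```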

  17. Nonparametric Kernel Smoothing Methods. The sm library in Xlisp-Stat

    Directory of Open Access Journals (Sweden)

    Luca Scrucca

    2001-06-01

    In this paper we describe the Xlisp-Stat version of the sm library, a software for applying nonparametric kernel smoothing methods. The original version of the sm library was written by Bowman and Azzalini in S-Plus, and it is documented in their book Applied Smoothing Techniques for Data Analysis (1997). This is also the main reference for a complete description of the statistical methods implemented. The sm library provides kernel smoothing methods for obtaining nonparametric estimates of density functions and regression curves for different data structures. Smoothing techniques may be employed as a descriptive graphical tool for exploratory data analysis. Furthermore, they can also serve for inferential purposes as, for instance, when a nonparametric estimate is used for checking a proposed parametric model. The Xlisp-Stat version includes some extensions to the original sm library, mainly in the area of local likelihood estimation for generalized linear models. The Xlisp-Stat version of the sm library has been written following an object-oriented approach. This should allow experienced Xlisp-Stat users to implement easily their own methods and new research ideas into the built-in prototypes.

  18. Nonparametric tests for pathwise properties of semimartingales

    CERN Document Server

    Cont, Rama; 10.3150/10-BEJ293

    2011-01-01

    We propose two nonparametric tests for investigating the pathwise properties of a signal modeled as the sum of a Lévy process and a Brownian semimartingale. Using a nonparametric threshold estimator for the continuous component of the quadratic variation, we design a test for the presence of a continuous martingale component in the process and a test for establishing whether the jumps have finite or infinite variation, based on observations on a discrete-time grid. We evaluate the performance of our tests using simulations of various stochastic models and use the tests to investigate the fine structure of the DM/USD exchange rate fluctuations and SPX futures prices. In both cases, our tests reveal the presence of a non-zero Brownian component and a finite variation jump component.

  19. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.

  20. Two-step change point estimation in nonparametric regression models and an empirical analysis

    Institute of Scientific and Technical Information of China (English)

    赵文芝; 夏志明; 贺飞跃

    2016-01-01

    Two-step estimators for a change point in nonparametric regression are proposed. In the first step, an initial estimator is obtained by the local linear smoothing method. In the second step, the final estimator is obtained by the CUSUM method on a closed neighborhood of the initial estimator. A simulation study shows that the proposed estimator is efficient. An estimator for the jump size is also obtained. Furthermore, experimental results using historical data on Nile river discharges, exchange rate data of USD against RMB, and global temperature data for the northern hemisphere show that the proposed method is also practical in applications.
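
    The second (CUSUM) step is easy to illustrate in isolation: scan all split points and take the one maximizing the normalized cumulative-sum contrast between left and right means. The sketch below shows only that single-change-point step with made-up data; it does not reproduce the local linear initial estimate or the neighborhood restriction.

```python
import numpy as np

def cusum_change_point(y):
    """Locate a single mean change point by maximizing the normalized
    CUSUM statistic sqrt(k(n-k)/n) * |mean_left - mean_right|; also
    returns the estimated jump size at the selected split."""
    n = len(y)
    s = np.cumsum(y)
    k = np.arange(1, n)
    stat = np.abs(s[k - 1] / k - (s[-1] - s[k - 1]) / (n - k)) \
           * np.sqrt(k * (n - k) / n)
    k_hat = int(k[np.argmax(stat)])
    jump = (s[-1] - s[k_hat - 1]) / (n - k_hat) - s[k_hat - 1] / k_hat
    return k_hat, jump

rng = np.random.default_rng(0)
y = np.r_[rng.normal(0, 1, 300), rng.normal(1.5, 1, 200)]
print(cusum_change_point(y))   # change point near 300, jump near 1.5
```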

  1. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J.

    2017-06-01

    We propose a Bayesian nonparametric mixture model for the reconstruction and prediction, from observed time series data, of discretized stochastic dynamical systems, based on Markov Chain Monte Carlo methods. Our results can be used by researchers in physical modeling interested in a fast and accurate estimation of low dimensional stochastic models when the size of the observed time series is small and the noise process (perhaps) is non-Gaussian. The inference procedure is demonstrated specifically in the case of polynomial maps of an arbitrary degree and when a Geometric Stick Breaking mixture process prior over the space of densities is applied to the additive errors. Our method is parsimonious compared to Bayesian nonparametric techniques based on Dirichlet process mixtures, flexible and general. Simulations based on synthetic time series are presented.

  2. Estimating the mass density of neutral gas at $z < 1$

    CERN Document Server

    Natarajan, P; Natarajan, Priyamvada; Pettini, Max

    1997-01-01

    We use the relationships between galactic H I mass and B-band luminosity determined by Rao & Briggs to recalculate the mass density of neutral gas at the present epoch, based on more recent measures of the galaxy luminosity function than were available to those authors. We find a value of $\Omega_{gas}(z=0)$ ... suggesting that this quantity is now reasonably secure. We then show that, if the scaling between H I mass and B-band luminosity has remained approximately constant since $z = 1$, the evolution of the luminosity function found by the Canada-France redshift survey translates to an increase of ... obtained quite independently from consideration of the luminosity function of Mg II absorbers at $z = 0.65$. By combining these new estimates with data from damped Ly$\alpha$ systems at higher redshift, it is possible to assemble a rough sketch of the evolution of $\Omega_{gas}$ over the last 90% of the age of the universe. The consumption of H I gas with time is in broad agreement with models of chemical evolution which inclu...

  3. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.

  4. Comparison of non-parametric methods for ungrouping coarsely aggregated data

    DEFF Research Database (Denmark)

    Rizzi, Silvia; Thinggaard, Mikael; Engholm, Gerda

    2016-01-01

    Background: Histograms are a common tool to estimate densities non-parametrically. They are extensively encountered in health sciences to summarize data in a compact format. Examples are age-specific distributions of death or onset of diseases grouped in 5-year age classes with an open-ended age group. ... We compare the performance of two spline interpolation methods, two kernel density estimators and a penalized composite link model, first via a simulation study and then with empirical data obtained from the NORDCAN Database. All methods analyzed can be used to estimate ... the penalized composite link model performs the best. Conclusion: We give an overview and test different methods to estimate detailed distributions from grouped count data. Health researchers can benefit from these versatile methods, which are ready for use in the statistical software R. We recommend using the penalized composite link model.
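
    One of the spline-type approaches compared above can be sketched directly: interpolate the cumulative counts with a monotone spline and difference it at single years. This is an illustration of that generic idea only (made-up numbers); the penalized composite link model the authors recommend is not reproduced here.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def ungroup_counts(edges, counts):
    """Ungroup counts given in coarse age classes into single years by
    monotone (PCHIP) interpolation of the cumulative counts; being
    monotone, the interpolant cannot produce negative year counts."""
    cum = np.concatenate([[0.0], np.cumsum(counts)])
    F = PchipInterpolator(edges, cum)        # smooth cumulative curve
    years = np.arange(edges[0], edges[-1] + 1)
    return np.diff(F(years))                 # single-year counts

edges = np.array([0, 5, 10, 15, 20, 25])     # 5-year class boundaries
counts = np.array([10, 40, 90, 60, 20])
print(ungroup_counts(edges, counts).round(1))  # sums back to 220
```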

  5. Kernel density estimation and marginalized-particle based probability hypothesis density filter for multi-target tracking

    Institute of Scientific and Technical Information of China (English)

    张路平; 王鲁平; 李飚; 赵明

    2015-01-01

    In order to improve the performance of the particle filter (PF) implementation of the probability hypothesis density (PHD) filter in terms of number estimation and state extraction of multiple targets, a new PHD filter algorithm based on marginalized particles and kernel density estimation is proposed, which uses the idea of the marginalized particle filter to enhance the estimating performance of the PHD. The state variables are decomposed into linear and nonlinear parts. The particle filter is adopted to predict and estimate the nonlinear states of the multiple targets after dimensionality reduction, while the Kalman filter is applied to estimate the linear parts under the linear Gaussian condition. Embedding the information of the linear states into the estimated nonlinear states helps to reduce the estimation variance and improve the accuracy of target number estimation. Mean-shift kernel density estimation, which by its nature searches for peak values via adaptive gradient-ascent iterations, is introduced to cluster particles and extract target states; it is independent of the target number, and can converge to the local peaks of the PHD distribution while avoiding errors due to inaccuracy in modeling and parameter estimation. Experiments show that the proposed algorithm obtains higher tracking accuracy with fewer sampling particles and has lower computational complexity compared with the PF-PHD.
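
    The mean-shift step mentioned above is a generic mode-seeking iteration and can be sketched on its own: each seed point repeatedly moves to the kernel-weighted average of its neighbors until it sits on a local density peak. The sketch below is that generic weighted mean-shift, not the paper's full PF-PHD tracker.

```python
import numpy as np

def mean_shift_modes(particles, weights, h, n_iter=30):
    """Weighted mean-shift with a Gaussian kernel: every seed climbs to
    a local peak of the weighted kernel density, which is how peaks of
    a particle-represented intensity can be extracted without knowing
    the number of clusters in advance."""
    modes = particles.copy()
    for _ in range(n_iter):
        d2 = ((modes[:, None, :] - particles[None, :, :]) ** 2).sum(-1)
        k = weights * np.exp(-0.5 * d2 / h ** 2)   # kernel weights
        modes = (k @ particles) / k.sum(axis=1, keepdims=True)
    return modes

rng = np.random.default_rng(2)
pts = np.r_[rng.normal(0, 0.3, (100, 2)), rng.normal(3, 0.3, (100, 2))]
w = np.full(len(pts), 1.0 / len(pts))
# Seeds collapse onto roughly two distinct modes near (0,0) and (3,3).
print(np.unique(mean_shift_modes(pts, w, h=0.5).round(1), axis=0))
```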

  6. Semi-parametric regression: Efficiency gains from modeling the nonparametric part

    CERN Document Server

    Yu, Kyusang; Park, Byeong U; 10.3150/10-BEJ296

    2011-01-01

    It is widely acknowledged that structured nonparametric modeling that circumvents the curse of dimensionality is important in nonparametric estimation. In this paper we show that the same holds for semi-parametric estimation. We argue that estimation of the parametric component of a semi-parametric model can be improved essentially when more structure is put into the nonparametric part of the model. We illustrate this for the partially linear model, and investigate efficiency gains when the nonparametric part of the model has an additive structure. We present the semi-parametric Fisher information bound for estimating the parametric part of the partially linear additive model and provide semi-parametric efficient estimators, for which we use a smooth backfitting technique to deal with the additive nonparametric part. We also present the finite sample performances of the proposed estimators and analyze Boston housing data as an illustration.

  7. Multivariate density estimation using dimension reducing information and tail flattening transformations for truncated or censored data

    DEFF Research Database (Denmark)

    Buch-Kromann, Tine; Nielsen, Jens

    2012-01-01

    This paper introduces a multivariate density estimator for truncated and censored data with special emphasis on extreme values based on survival analysis. A local constant density estimator is considered. We extend this estimator by means of tail flattening transformation, dimension reducing prio...

  8. Spectral density estimation for symmetric stable p-adic processes

    Directory of Open Access Journals (Sweden)

    Rachid Sabre

    2013-05-01

    Applications of p-adic numbers are becoming increasingly important, especially in the field of applied physics. The objective of this work is to study the estimation of the spectral density of p-adic stable processes. An estimator formed by a smoothing periodogram is constructed. It is shown that this estimator is asymptotically unbiased and consistent. Rates of convergence are also examined.

  9. Learning Mixtures of Polynomials of Conditional Densities from Data

    DEFF Research Database (Denmark)

    L. López-Cruz, Pedro; Nielsen, Thomas Dyhre; Bielza, Concha;

    2013-01-01

    Mixtures of polynomials (MoPs) are a non-parametric density estimation technique for hybrid Bayesian networks with continuous and discrete variables. We propose two methods for learning MoP approximations of conditional densities from data. Both approaches are based on learning MoP approximations ... We compare the methods with the approach for learning mixtures of truncated basis functions from data.

  10. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan

    2014-06-01

    http://dx.doi.org/10.17014/ijog.vol3no3.20084
    The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of Bouguer gravity anomaly and Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under the assumption of a free-air anomaly consisting of an effect of topography, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano-Guadeloupe area (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except the edifice area where the average topography density estimate is 2.21 g/cm3, whereas the Bouguer density estimate from a previous gravity survey in 1975 is 2.67 g/cm3. The Bouguer density in La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
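
    The Nettleton approach lends itself to a compact sketch: scan candidate densities and keep the one whose Bouguer anomaly is least correlated with topography. The example below is schematic only (synthetic profile, simple Bouguer slab correction 2*pi*G*rho*h = 0.04193*rho*h in mGal, with rho in g/cm3 and h in m), not the survey's actual reduction.

```python
import numpy as np

def nettleton_density(free_air, heights,
                      densities=np.arange(1.8, 3.0, 0.01)):
    """Nettleton approach: choose the topographic (Bouguer) density that
    minimizes |correlation| between the Bouguer anomaly and topography."""
    return min(densities,
               key=lambda rho: abs(np.corrcoef(
                   free_air - 0.04193 * rho * heights, heights)[0, 1]))

# Synthetic profile with a "true" density of 2.4 g/cm3 plus noise:
rng = np.random.default_rng(3)
h = 500 + 200 * np.sin(np.linspace(0, 6, 80))
fa = 0.04193 * 2.4 * h + rng.normal(0, 0.5, 80)
print(nettleton_density(fa, h))   # recovers ~2.4
```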

  12. EuroMInd-D: A Density Estimate of Monthly Gross Domestic Product for the Euro Area

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Marczak, Martyna; Mazzi, Gianluigi

    EuroMInd-D is a density estimate of monthly gross domestic product (GDP) constructed according to a bottom-up approach, pooling the density estimates of eleven GDP components, by output and expenditure type. The component density estimates are obtained from a medium-size dynamic factor model ... parameters, and conditional simulation filters for simulating from the predictive distribution of GDP. Both algorithms process the data sequentially as they become available in real time. The GDP density estimates for the output and expenditure approach are combined using alternative weighting schemes ...

  13. A nonparametric and diversified portfolio model

    Science.gov (United States)

    Shirazi, Yasaman Izadparast; Sabiruzzaman, Md.; Hamzah, Nor Aishah

    2014-07-01

    Traditional portfolio models, like mean-variance (MV), suffer from estimation error and lack of diversity. Alternatives, like mean-entropy (ME) or mean-variance-entropy (MVE) portfolio models, focus independently on the issue of either a proper risk measure or diversity. In this paper, we propose an asset allocation model that compromises between the risk of historical data and future uncertainty. In the new model, entropy is presented as a nonparametric risk measure as well as an index of diversity. Our empirical evaluation with a variety of performance measures shows that this model has better out-of-sample performance and lower portfolio turnover than its competitors.
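
    For the diversity side of the story, the entropy of the weight vector is the standard index: it is maximal for equal weights and zero for a single-asset portfolio. The snippet below shows only that index (the paper also uses entropy as a nonparametric risk measure on returns, which is not reproduced here).

```python
import numpy as np

def portfolio_entropy(w):
    """Shannon entropy of portfolio weights, used as a diversity index
    in mean-entropy-type models; ln(n) for n equal weights, 0 for a
    fully concentrated portfolio."""
    w = np.asarray(w, dtype=float)
    w = w[w > 0]                      # 0*log(0) contributes nothing
    return -(w * np.log(w)).sum()

print(portfolio_entropy([0.25, 0.25, 0.25, 0.25]))  # ln(4) ~ 1.386
print(portfolio_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
```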

  14. Estimation of current density distribution under electrodes for external defibrillation

    Directory of Open Access Journals (Sweden)

    Papazov Sava P

    2002-12-01

    Background: Transthoracic defibrillation is the most common life-saving technique for the restoration of the heart rhythm of cardiac arrest victims. The procedure requires adequate application of large electrodes on the patient chest, to ensure low-resistance electrical contact. The current density distribution under the electrodes is non-uniform, leading to muscle contraction and pain, or risks of burning. The recent introduction of automatic external defibrillators and even wearable defibrillators presents new demanding requirements for the structure of electrodes. Method and Results: Using the pseudo-elliptic differential equation of Laplace type with appropriate boundary conditions and applying finite element method modeling, electrodes of various shapes and structure were studied. The non-uniformity of the current density distribution was shown to be moderately improved by adding a low resistivity layer between the metal and tissue and by a ring around the electrode perimeter. The inclusion of openings in long-term wearable electrodes additionally disturbs the current density profile. However, a number of small-size perforations may result in acceptable current density distribution. Conclusion: The current density distribution non-uniformity of circular electrodes is about 30% less than that of square-shaped electrodes. The use of an interface layer of intermediate resistivity, comparable to that of the underlying tissues, and a high-resistivity perimeter ring, can further improve the distribution. The inclusion of skin aeration openings disturbs the current paths, but an appropriate selection of number and size provides a reasonable compromise.

  15. Curve fitting of the corporate recovery rates: the comparison of Beta distribution estimation and kernel density estimation.

    Directory of Open Access Journals (Sweden)

    Rongda Chen

    Recovery rate is essential to the estimation of a portfolio's loss and economic capital. Neglecting the randomness of the distribution of recovery rates may underestimate the risk. This study introduces two models of the distribution, Beta distribution estimation and kernel density estimation, to simulate the distribution of recovery rates of corporate loans and bonds. Models based on the Beta distribution are common in daily usage, such as CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and LossCalc by Moody's. However, the Beta distribution has a fatal defect: it cannot fit bimodal or multimodal distributions, such as the recovery rates of corporate loans and bonds, as Moody's new data show. In order to overcome this flaw, kernel density estimation is introduced, and we compare the simulation results by histogram, Beta distribution estimation and kernel density estimation, reaching the conclusion that the Gaussian kernel density estimate better imitates the distribution of bimodal or multimodal data samples of corporate loans and bonds. Finally, a Chi-square test of the Gaussian kernel density estimate shows that it can fit the curve of recovery rates of loans and bonds. Using the kernel density estimate to delineate the bimodal recovery rates of bonds is therefore optimal in credit risk management.
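
    The contrast is easy to reproduce on simulated data: a fitted Beta density is unimodal or U-shaped, so it cannot track two interior humps, while a Gaussian kernel density estimate follows both. The sketch below uses a hypothetical bimodal sample as a stand-in for the paper's loan/bond data.

```python
import numpy as np
from scipy.stats import beta, gaussian_kde

# Simulated bimodal recovery rates: many near 0.15, many near 0.8.
rng = np.random.default_rng(4)
r = np.r_[rng.beta(2, 8, 600), rng.beta(9, 2, 400)]

# Maximum-likelihood Beta fit on [0, 1] vs. a Gaussian KDE.
a, b, loc, scale = beta.fit(r, floc=0, fscale=1)
kde = gaussian_kde(r)

grid = np.array([0.15, 0.8])                     # near the two true modes
print("beta pdf:", beta.pdf(grid, a, b, loc, scale).round(2))
print("kde  pdf:", kde(grid).round(2))           # clearly peaked at both
```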

  16. EXACT MINIMAX ESTIMATION OF THE PREDICTIVE DENSITY IN SPARSE GAUSSIAN MODELS.

    Science.gov (United States)

    Mukherjee, Gourab; Johnstone, Iain M

    We consider estimating the predictive density under Kullback-Leibler loss in an ℓ0 sparse Gaussian sequence model. Explicit expressions of the first order minimax risk along with its exact constant, asymptotically least favorable priors and optimal predictive density estimates are derived. Compared to the sparse recovery results involving point estimation of the normal mean, new decision theoretic phenomena are seen. Suboptimal performance of the class of plug-in density estimates reflects the predictive nature of the problem and optimal strategies need diversification of the future risk. We find that minimax optimal strategies lie outside the Gaussian family but can be constructed with threshold predictive density estimates. Novel minimax techniques involving simultaneous calibration of the sparsity adjustment and the risk diversification mechanisms are used to design optimal predictive density estimates.

  17. A non-parametric Bayesian approach for clustering and tracking non-stationarities of neural spikes.

    Science.gov (United States)

    Shalchyan, Vahid; Farina, Dario

    2014-02-15

    Neural spikes from multiple neurons recorded in a multi-unit signal are usually separated by clustering. Drifts in the position of the recording electrode relative to the neurons over time cause gradual changes in the position and shapes of the clusters, challenging the clustering task. By dividing the data into short time intervals, Bayesian tracking of the clusters based on Gaussian cluster model has been previously proposed. However, the Gaussian cluster model is often not verified for neural spikes. We present a Bayesian clustering approach that makes no assumptions on the distribution of the clusters and use kernel-based density estimation of the clusters in every time interval as a prior for Bayesian classification of the data in the subsequent time interval. The proposed method was tested and compared to Gaussian model-based approach for cluster tracking by using both simulated and experimental datasets. The results showed that the proposed non-parametric kernel-based density estimation of the clusters outperformed the sequential Gaussian model fitting in both simulated and experimental data tests. Using non-parametric kernel density-based clustering that makes no assumptions on the distribution of the clusters enhances the ability of tracking cluster non-stationarity over time with respect to the Gaussian cluster modeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, one often needs to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of H.-G. Müller and U. Stadtmüller [Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi:10.1214/aos/1018031100]

  19. Divisive latent class modeling as a density estimation method for categorical data

    NARCIS (Netherlands)

    van der Palm, D.W.; van der Ark, L.A.; Vermunt, J.K.

    2016-01-01

    Traditionally latent class (LC) analysis is used by applied researchers as a tool for identifying substantively meaningful clusters. More recently, LC models have also been used as a density estimation tool for categorical variables. We introduce a divisive LC (DLC) model as a density estimation tool

  20. Inference-less Density Estimation using Copula Bayesian Networks

    CERN Document Server

    Elidan, Gal

    2012-01-01

    We consider learning continuous probabilistic graphical models in the face of missing data. For non-Gaussian models, learning the parameters and structure of such models depends on our ability to perform efficient inference and can be prohibitive even for relatively modest domains. Recently, we introduced the Copula Bayesian Network (CBN) density model, a flexible framework that captures complex high-dimensional dependency structures while offering direct control over the univariate marginals, leading to improved generalization. In this work we show that the CBN model also offers significant computational advantages when training data is partially observed. Concretely, we leverage the specialized form of the model to derive a computationally amenable learning objective that is a lower bound on the log-likelihood function. Importantly, our energy-like bound circumvents the need for costly inference of an auxiliary distribution, thus facilitating practical learning of high-dimensional densities. We demonstr...

  1. Confidence estimates in simulation of phase noise or spectral density.

    Science.gov (United States)

    Ashby, Neil

    2017-02-13

    In this paper we apply the method of discrete simulation of power-law noise, developed in [1], [3], [4], to the problem of simulating phase noise for a combination of power-law noises. We derive analytic expressions for the probability of observing a value of phase noise L(f), or of any of the one-sided spectral densities S(f), Sy(f), or Sx(f), for arbitrary superpositions of power-law noise.
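
    The paper's own discrete simulation method is not reproduced here; the sketch below shows one common way to synthesize power-law noise (spectral shaping of white noise in the frequency domain), under the assumption that the target one-sided PSD scales as S(f) ∝ f^alpha.

```python
import numpy as np

def power_law_noise(n, alpha, rng=None):
    """Generate n samples whose one-sided PSD is roughly proportional to
    f**alpha (alpha = 0 white, -2 random walk), by shaping white noise
    in the frequency domain."""
    rng = rng or np.random.default_rng()
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    f = np.fft.rfftfreq(n, d=1.0)
    f[0] = f[1]                      # avoid division by zero at DC
    spec *= f ** (alpha / 2.0)       # amplitude scales as f**(alpha/2)
    return np.fft.irfft(spec, n)

x = power_law_noise(4096, alpha=-1.0)   # flicker-like noise
print(x.std())
```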

  2. Optimisation of in-situ dry density estimation

    Directory of Open Access Journals (Sweden)

    Morvan Mathilde

    2016-01-01

    Full Text Available Nowadays, field experiments are mostly used to determine the resistance and settlement of a soil before building. The devices traditionally needed are heavy, so they cannot be used in every situation. For this reason, Gourves et al. (1998) developed a light dynamic penetrometer called Panda. With this penetrometer, a standardized hammer is blown on the head of the piston; for each blow, the device measures the driving energy as well as the driving depth of the cone into the soil. The resulting penetrogram gives the variation of cone resistance with depth. For homogeneous soils, three parameters can be determined: the critical depth zc, the initial cone resistance qd0 and the cone resistance at depth qd1. In parallel with improvements to this apparatus, research was conducted to obtain a relationship between the dry density of a soil and the cone resistance at depth qd1. Knowing the dry density of a soil makes it possible, for example, to evaluate compaction efficiency. To this end, a database of soils was initiated. Each of these soils was tested and classified using laboratory tests, among others grain size distribution, Proctor results and Atterberg limits. Penetrometer tests were also performed for three to five densities and three to five water contents. Using this database, Chaigneau obtained a logarithmic relation linking qd1 and dry density; however, this relation varies with the water content. This article presents our recent research on a means of obtaining a unified relation using water content, degree of saturation or suction. To this end, we first studied the response of the CNR silt as a function of degree of saturation and water content. Its water retention curve was determined using the filter paper method, from which suction was obtained. We then verified the conclusions of this study on seven soils of the database to validate our hypotheses.
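
    The logarithmic qd1-to-dry-density relation is not given explicitly in the abstract; the sketch below shows the generic form rho_d = a + b·ln(qd1) with made-up coefficients, purely to illustrate how such a calibration would be applied.

```python
import numpy as np

# Hypothetical calibration coefficients (A, B are NOT the published values)
A, B = 1.45, 0.12   # g/cm^3 and g/cm^3 per ln(MPa), assumed for illustration

def dry_density(qd1_mpa):
    """Dry density from cone resistance at depth: rho_d = A + B * ln(qd1)."""
    return A + B * np.log(qd1_mpa)

print(dry_density(np.array([2.0, 5.0, 10.0])))  # g/cm^3
```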

  3. Volumetric breast density estimation from full-field digital mammograms.

    NARCIS (Netherlands)

    Engeland, S. van; Snoeren, P.R.; Huisman, H.J.; Boetes, C.; Karssemeijer, N.

    2006-01-01

    A method is presented for estimation of dense breast tissue volume from mammograms obtained with full-field digital mammography (FFDM). The thickness of dense tissue mapping to a pixel is determined by using a physical model of image acquisition. This model is based on the assumption that the breast

  4. Constructing valid density matrices on an NMR quantum information processor via maximum likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Harpreet; Arvind; Dorai, Kavita, E-mail: kavita@iisermohali.ac.in

    2016-09-07

    Estimation of quantum states is an important step in any quantum information processing experiment. A naive reconstruction of the density matrix from experimental measurements can often give density matrices which are not positive, and hence not physically acceptable. How do we ensure that at all stages of reconstruction, we keep the density matrix positive? Recently a method has been suggested based on maximum likelihood estimation, wherein the density matrix is guaranteed to be positive definite. We experimentally implement this protocol on an NMR quantum information processor. We discuss several examples and compare with the standard method of state estimation. - Highlights: • State estimation using maximum likelihood method was performed on an NMR quantum information processor. • Physically valid density matrices were obtained every time in contrast to standard quantum state tomography. • Density matrices of several different entangled and separable states were reconstructed for two and three qubits.
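
    A common way to enforce positivity in this kind of state reconstruction is to parameterize rho = T†T / Tr(T†T) with T lower-triangular, so every candidate is automatically positive semidefinite with unit trace. The sketch below fits a single-qubit state to hypothetical Pauli expectation values by least squares over that parameterization; the paper uses a likelihood objective and real NMR data, so this is only an illustration of the positivity trick.

```python
import numpy as np
from scipy.optimize import minimize

# Pauli measurement operators and hypothetical observed expectation values
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
obs = [(X, 0.62), (Y, -0.05), (Z, 0.71)]   # assumed data, not from the paper

def rho_from_params(t):
    """Map 4 real parameters to a positive semidefinite, unit-trace 2x2 matrix
    via the Cholesky-like form rho = T† T / Tr(T† T)."""
    T = np.array([[t[0], 0], [t[2] + 1j * t[3], t[1]]])
    M = T.conj().T @ T
    return M / np.trace(M).real

def loss(t):
    rho = rho_from_params(t)
    return sum((np.trace(rho @ op).real - v) ** 2 for op, v in obs)

res = minimize(loss, x0=[1.0, 0.5, 0.0, 0.0], method="Nelder-Mead")
rho = rho_from_params(res.x)
print(np.round(rho, 3), np.linalg.eigvalsh(rho))  # eigenvalues are >= 0
```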

  5. Choosing the best non-parametric richness estimator for benthic macroinvertebrates databases

    Directory of Open Access Journals (Sweden)

    Carola V. Basualdo

    2011-06-01

    Full Text Available Non-parametric estimators allow comparison of richness estimates among data sets from heterogeneous sources. However, since estimator performance depends on the species-abundance distribution of the sample, the preference for one estimator or another is a difficult issue. The present study recovers and revalues some criteria already present in the literature for choosing the most suitable estimator for stream macroinvertebrates, and provides some tools for applying them. Two abundance and four incidence estimators were applied to a regional database at the family and genus levels. They were evaluated under four criteria: sub-sample size required to estimate the observed richness; constancy of the sub-sample size; lack of erratic behavior; and similarity in curve shape across different data sets. Among incidence estimators, Jack1 had the best performance. Among abundance estimators, ACE was best when the observed richness was small and Chao1 when the observed richness was high. The uniformity of curve shapes made it possible to describe general sequences of curve behavior that can act as references for comparing estimates from small databases and for inferring the likely behavior of the curve (i.e., the expected richness if the sample were larger). These results can be very useful for environmental management and update the state of knowledge of regional macroinvertebrates.

  6. Estimated global nitrogen deposition using NO2 column density

    Science.gov (United States)

    Lu, Xuehe; Jiang, Hong; Zhang, Xiuying; Liu, Jinxun; Zhang, Zhen; Jin, Jiaxin; Wang, Ying; Xu, Jianhui; Cheng, Miaomiao

    2013-01-01

    Global nitrogen deposition has increased over the past 100 years. Monitoring and simulation studies have evaluated nitrogen deposition at both global and regional scales. With the development of remote-sensing instruments, tropospheric NO2 column density retrieved from the Global Ozone Monitoring Experiment (GOME) and Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) sensors now provides a new opportunity to understand changes in reactive nitrogen in the atmosphere. The concentration of NO2 in the atmosphere has a significant effect on atmospheric nitrogen deposition. Following the general nitrogen deposition calculation method, we use principal component regression to evaluate global nitrogen deposition from global NO2 column density and meteorological data. About 70% of the Earth's land area passed a significance test of the regression, and NO2 column density has a significant influence on the regression results over 44% of global land. The simulated results show that global average nitrogen deposition was 0.34 g m−2 yr−1 from 1996 to 2009 and is increasing at about 1% per year. China, Europe, and the USA are the three hotspots of nitrogen deposition, consistent with previous research findings; in this study, Southern Asia was found to be another hotspot (about 1.58 g m−2 yr−1 and maintaining a high growth rate). As nitrogen deposition increases, the number of regions threatened by high nitrogen deposition is also increasing. With N emissions continuing to rise, areas whose ecosystems are affected by high levels of nitrogen deposition will expand.
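
    As a hedged sketch of principal component regression of the kind the abstract describes, the snippet below regresses a deposition target on the leading principal components of NO2 column density plus meteorological covariates; all arrays are synthetic placeholders, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
# Columns: NO2 column density + meteorological covariates (synthetic)
X = rng.normal(size=(500, 6))
y = 0.8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.2, size=500)

# Principal component regression: PCA for decorrelation, then OLS
pcr = make_pipeline(PCA(n_components=3), LinearRegression())
pcr.fit(X, y)
print("R^2 on training data:", round(pcr.score(X, y), 3))
```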

  7. METAPHOR: Probability density estimation for machine learning based photometric redshifts

    Science.gov (United States)

    Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.

    2017-06-01

    We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as the internal engine to derive photometric galaxy redshifts, but giving the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDF. We present the results of a validation test of the workflow on galaxies from SDSS-DR9, also showing the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test also includes a comparison with the PDFs derived from a traditional SED template-fitting method (Le Phare).

  8. Nonparametric estimation and comparative analyses of ES in risk measure with applications

    Institute of Scientific and Technical Information of China (English)

    刘晓倩; 周勇

    2011-01-01

    The recently developed expected shortfall (ES) model is a powerful mathematical tool for measuring and controlling financial risk. In this paper, two-step kernel smoothing is used to develop a two-step nonparametric estimator of ES. The proposed two-step kernel-smoothed ES estimator is compared with the existing fully empirical ES estimator and the one-step kernel-smoothed ES estimator by calculating their expectations and variances. Interestingly, the two-step kernel-smoothed ES estimator is shown to increase the variance, in contrast to the existing result that kernel smoothing reduces both the variance and the mean squared error of the VaR estimator. Simulation results conform to the theoretical analysis. In the empirical analysis, closed-end funds in the Shanghai and Shenzhen stock markets were used to compute empirical and kernel-smoothed ES estimates for risk analysis, and the RAROC of the fund sample based on weekly returns and ES was computed to evaluate fund performance. The empirical results show that, with weekly returns, the ES-based method is more reliable than VaR-based methods.
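
    For illustration, the sketch below contrasts the fully empirical ES with a one-step kernel-smoothed version in which the tail indicator is replaced by a Gaussian kernel's survival function; the bandwidth rule and the synthetic returns are assumptions, not the paper's specification.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
losses = rng.standard_t(df=4, size=2000)   # synthetic weekly losses
alpha = 0.95

# Fully empirical ES: mean loss beyond the empirical VaR quantile
var_emp = np.quantile(losses, alpha)
es_emp = losses[losses >= var_emp].mean()

# One-step kernel-smoothed ES: smooth the tail indicator 1{X >= VaR}
h = 1.06 * losses.std() * len(losses) ** (-0.2)   # rule-of-thumb bandwidth
w = norm.sf((var_emp - losses) / h)               # smoothed tail weights
es_kern = np.sum(w * losses) / np.sum(w)

print(f"empirical ES = {es_emp:.3f}, kernel-smoothed ES = {es_kern:.3f}")
```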

  9. Estimating number density NV – a comparison of an improved Saltykov estimator and the disector method

    Directory of Open Access Journals (Sweden)

    Ashot Davtian

    2011-05-01

    Full Text Available Two methods for the estimation of the number per unit volume NV of spherical particles are discussed: the (physical) disector (Sterio, 1984) and Saltykov's estimator (Saltykov, 1950; Fullman, 1953). A modification of Saltykov's estimator is proposed which reduces the variance. Formulae for bias and variance are given for both the disector and the improved Saltykov estimator for the case of randomly positioned particles. They enable comparison of the two estimators with respect to their precision in terms of mean squared error.

  10. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move, but that this dependence vanishes after 2-3 years.

  11. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction, but data rarely satisfy testable conditions. To overcome this, we provide a way to estimate the homothetic efficiency of consumption

  12. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move...

  13. Efficient estimation of dynamic density functions with an application to outlier detection

    KAUST Repository

    Qahtan, Abdulhakim Ali Ali

    2012-01-01

    In this paper, we propose a new method to estimate the dynamic density over data streams, named KDE-Track as it is based on a conventional and widely used Kernel Density Estimation (KDE) method. KDE-Track can efficiently estimate the density with linear complexity by using interpolation on a kernel model, which is incrementally updated upon the arrival of streaming data. Both theoretical analysis and experimental validation show that KDE-Track outperforms traditional KDE and a baseline method, Cluster-Kernels, on estimation accuracy of complex density structures in data streams, computing time and memory usage. KDE-Track is also demonstrated to capture, in a timely manner, the dynamic density of synthetic and real-world data. In addition, KDE-Track is used to accurately detect outliers in sensor data and is compared with two existing methods developed for detecting outliers and cleaning sensor data. © 2012 ACM.

  14. Technical Note: Cortical thickness and density estimation from clinical CT using a prior thickness-density relationship

    Energy Technology Data Exchange (ETDEWEB)

    Humbert, Ludovic, E-mail: ludohumberto@gmail.com [Galgo Medical, Barcelona 08036 (Spain); Hazrati Marangalou, Javad; Rietbergen, Bert van [Orthopaedic Biomechanics, Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven 5600 MB (Netherlands); Río Barquero, Luis Miguel del [CETIR Centre Medic, Barcelona 08029 (Spain); Lenthe, G. Harry van [Biomechanics Section, KU Leuven–University of Leuven, Leuven 3001 (Belgium)

    2016-04-15

    Purpose: Cortical thickness and density are critical components in determining the strength of bony structures. Computed tomography (CT) is one possible modality for analyzing the cortex in 3D. In this paper, a model-based approach for measuring the cortical bone thickness and density from clinical CT images is proposed. Methods: Density variations across the cortex were modeled as a function of the cortical thickness and density, location of the cortex, density of surrounding tissues, and imaging blur. High resolution micro-CT data of cadaver proximal femurs were analyzed to determine a relationship between cortical thickness and density. This thickness-density relationship was used as prior information to be incorporated in the model to obtain accurate measurements of cortical thickness and density from clinical CT volumes. The method was validated using micro-CT scans of 23 cadaver proximal femurs. Simulated clinical CT images with different voxel sizes were generated from the micro-CT data. Cortical thickness and density were estimated from the simulated images using the proposed method and compared with measurements obtained using the micro-CT images to evaluate the effect of voxel size on the accuracy of the method. Then, 19 of the 23 specimens were imaged using a clinical CT scanner. Cortical thickness and density were estimated from the clinical CT images using the proposed method and compared with the micro-CT measurements. Finally, a case-control study including 20 patients with osteoporosis and 20 age-matched controls with normal bone density was performed to evaluate the proposed method in a clinical context. Results: Cortical thickness (density) estimation errors were 0.07 ± 0.19 mm (−18 ± 92 mg/cm³) using the simulated clinical CT volumes with the smallest voxel size (0.33 × 0.33 × 0.5 mm³), and 0.10 ± 0.24 mm (−10 ± 115 mg/cm³) using the volumes with the largest voxel size (1.0 × 1.0 × 3.0 mm³). A trend for the

  15. Impact of Building Heights on 3d Urban Density Estimation from Spaceborne Stereo Imagery

    Science.gov (United States)

    Peng, Feifei; Gong, Jianya; Wang, Le; Wu, Huayi; Yang, Jiansi

    2016-06-01

    In urban planning and design applications, visualization of built-up areas in three dimensions (3D) is critical for understanding building density, but the accurate building heights required for 3D density calculation are not always available. To solve this problem, spaceborne stereo imagery is often used to estimate building heights; however, estimated building heights might include errors. These errors vary between local areas within a study area and are related to the heights of the buildings themselves, distorting 3D density estimation. The impact of building height accuracy on 3D density estimation must therefore be determined across and within a study area. In our research, accurate planar information from city authorities is used as reference data during 3D density estimation, to avoid the errors inherent to planar information extracted from remotely sensed imagery. Our experimental results show that underestimation of building heights is correlated with underestimation of the Floor Area Ratio (FAR). At the local level, land-use blocks with low FAR values often have small errors, due to small height errors for the low buildings in those blocks, while blocks with high FAR values often have large errors, due to large height errors for the high buildings in those blocks. Our study reveals that the accuracy of 3D density estimated from spaceborne stereo imagery is correlated with the heights of buildings in a scene; building heights must therefore be considered when spaceborne stereo imagery is used to estimate 3D density, to improve precision.

  16. An Efficient Acoustic Density Estimation Method with Human Detectors Applied to Gibbons in Cambodia.

    Directory of Open Access Journals (Sweden)

    Darren Kidney

    Full Text Available Some animal species are hard to see but easy to hear. Standard visual methods for estimating population density for such species are often ineffective or inefficient, but methods based on passive acoustics show more promise. We develop spatially explicit capture-recapture (SECR) methods for territorial vocalising species, in which humans act as an acoustic detector array. We use SECR and estimated bearing data from a single-occasion acoustic survey of a gibbon population in northeastern Cambodia to estimate the density of calling groups. The properties of the estimator are assessed using a simulation study, in which a variety of survey designs are also investigated. We then present a new form of the SECR likelihood for multi-occasion data which accounts for the stochastic availability of animals. In the context of gibbon surveys this allows model-based estimation of the proportion of groups that produce territorial vocalisations on a given day, thereby enabling the density of groups, instead of the density of calling groups, to be estimated. We illustrate the performance of this new estimator by simulation. We show that it is possible to estimate density reliably from human acoustic detections of visually cryptic species using SECR methods. For gibbon surveys we also show that incorporating observers' estimates of bearings to detected groups substantially improves estimator performance. Using the new form of the SECR likelihood we demonstrate that estimates of availability, in addition to population density and detection function parameters, can be obtained from multi-occasion data, and that the detection function parameters are not confounded with the availability parameter. This acoustic SECR method provides a means of obtaining reliable density estimates for territorial vocalising species. It is also efficient in terms of data requirements since it only requires routine survey data. We anticipate that the low-tech field requirements will

  17. A generalized model for estimating the energy density of invertebrates

    Science.gov (United States)

    James, Daniel A.; Csargo, Isak J.; Von Eschen, Aaron; Thul, Megan D.; Baker, James M.; Hayer, Cari-Ann; Howell, Jessica; Krause, Jacob; Letvin, Alex; Chipps, Steven R.

    2012-01-01

    Invertebrate energy density (ED) values are traditionally measured using bomb calorimetry. However, many researchers rely on a few published literature sources for ED values because of the time and sampling constraints of measuring ED with bomb calorimetry. Literature values often do not account for the spatial or temporal variability associated with invertebrate ED, so these values can be unreliable for use in models and other ecological applications. We evaluated the generality of the relationship between invertebrate ED and the proportion of dry-to-wet mass (pDM). We then developed and tested a regression model to predict ED from pDM based on a taxonomically, spatially, and temporally diverse sample of invertebrates representing 28 orders in aquatic (freshwater, estuarine, and marine) and terrestrial (temperate and arid) habitats from 4 continents and 2 oceans. Samples included invertebrates collected in all seasons over the last 19 y. Evaluation of these data revealed a significant relationship between ED and pDM (r² = 0.96), offering a practical alternative to bomb calorimetry approaches. This model should prove useful for a wide range of ecological studies because it is unaffected by taxonomic, seasonal, or spatial variability.

  18. Astronomical Methods for Nonparametric Regression

    Science.gov (United States)

    Steinhardt, Charles L.; Jermyn, Adam

    2017-01-01

    I will discuss commonly used techniques for nonparametric regression in astronomy. We find that several of them, particularly running averages and running medians, are generically biased, asymmetric between dependent and independent variables, and perform poorly in recovering the underlying function, even when errors are present only in one variable. We then examine less-commonly used techniques such as Multivariate Adaptive Regressive Splines and Boosted Trees and find them superior in bias, asymmetry, and variance both theoretically and in practice under a wide range of numerical benchmarks. In this context the chief advantage of the common techniques is runtime, which even for large datasets is now measured in microseconds compared with milliseconds for the more statistically robust techniques. This points to a tradeoff between bias, variance, and computational resources which in recent years has shifted heavily in favor of the more advanced methods, primarily driven by Moore's Law. Along these lines, we also propose a new algorithm which has better overall statistical properties than all techniques examined thus far, at the cost of significantly worse runtime, in addition to providing guidance on choosing the nonparametric regression technique most suitable to any specific problem. We then examine the more general problem of errors in both variables and provide a new algorithm which performs well in most cases and lacks the clear asymmetry of existing non-parametric methods, which fail to account for errors in both variables.

  19. Nonparametric predictive inference for combining diagnostic tests with parametric copula

    Science.gov (United States)

    Muhammad, Noryanti; Coolen, F. P. A.; Coolen-Maturi, T.

    2017-09-01

    Measuring the accuracy of diagnostic tests is crucial in many application areas, including medicine and health care. The Receiver Operating Characteristic (ROC) curve is a popular statistical tool for describing the performance of diagnostic tests, and the area under the ROC curve (AUC) is often used as a measure of the overall performance of a diagnostic test. In this paper, we are interested in developing strategies for combining test results in order to increase diagnostic accuracy. We introduce nonparametric predictive inference (NPI) for combining two diagnostic test results while modeling the dependence structure with a parametric copula. NPI is a frequentist statistical framework for inference on a future observation based on past data observations; it uses lower and upper probabilities to quantify uncertainty and is based on only a few modelling assumptions. A copula is a joint distribution function whose marginals are all uniformly distributed, and it can be used to model the dependence separately from the marginal distributions. In this research, we estimate the copula density parametrically, using the maximum likelihood estimator (MLE). We investigate the performance of the proposed method on data sets from the literature and discuss the results to show how the method performs for different families of copulas. Finally, we briefly outline related challenges and opportunities for future research.
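
    A minimal sketch of fitting one parametric copula family by maximum likelihood, assuming a Clayton copula and pseudo-observations built from ranks; the data are synthetic and the family choice is illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

rng = np.random.default_rng(4)
# Synthetic dependent test scores; convert to pseudo-observations in (0, 1)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=400)
u = rankdata(z[:, 0]) / (len(z) + 1)
v = rankdata(z[:, 1]) / (len(z) + 1)

def neg_loglik(theta):
    """Negative log-likelihood of the Clayton copula density
    c(u,v) = (1+theta) (uv)^(-1-theta) (u^-theta + v^-theta - 1)^(-2-1/theta)."""
    s = u ** -theta + v ** -theta - 1
    logc = (np.log1p(theta) - (1 + theta) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(s))
    return -np.sum(logc)

res = minimize_scalar(neg_loglik, bounds=(1e-3, 20), method="bounded")
print("MLE of Clayton theta:", round(res.x, 3))
```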

  20. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference making as few assumptions about restrictive parametric models as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to the following two aspects. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models by one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  1. EnviroAtlas Estimated Intersection Density of Walkable Roads Web Service

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in each EnviroAtlas community....

  2. EnviroAtlas - Paterson, NJ - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  3. EnviroAtlas - Minneapolis/St. Paul, MN - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  4. EnviroAtlas - New Bedford, MA - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  5. EnviroAtlas - Pittsburgh, PA - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  6. Echolocation detections and digital video surveys provide reliable estimates of the relative density of harbour porpoises

    National Research Council Canada - National Science Library

    Williamson, Laura D; Brookes, Kate L; Scott, Beth E; Graham, Isla M; Bradbury, Gareth; Hammond, Philip S; Thompson, Paul M; McPherson, Jana

    2016-01-01

    ...‐based visual surveys. Surveys of cetaceans using acoustic loggers or digital cameras provide alternative methods to estimate relative density that have the potential to reduce cost and provide a verifiable record of all detections...

  7. EnviroAtlas - New York, NY - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  8. EnviroAtlas - Memphis, TN - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  9. EnviroAtlas - Cleveland, OH - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  10. EnviroAtlas - Fresno, CA - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  11. EnviroAtlas - Green Bay, WI - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  12. EnviroAtlas - Tampa, FL - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  13. EnviroAtlas - Portland, ME - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  14. EnviroAtlas - Phoenix, AZ - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  15. EnviroAtlas - Des Moines, IA - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  16. EnviroAtlas - Austin, TX - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  17. EnviroAtlas - Woodbine, IA - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  18. EnviroAtlas - Milwaukee, WI - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  19. EnviroAtlas - Portland, OR - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  20. EnviroAtlas - Durham, NC - Estimated Intersection Density of Walkable Roads

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...

  1. Cetacean Density Estimation from Novel Acoustic Datasets by Acoustic Propagation Modeling

    Science.gov (United States)

    2014-09-30

    The objectives of this research are to apply existing methods for cetacean density estimation from passive acoustic recordings made by single sensors to novel data sets and cetacean species, as well as to refine the existing techniques in order to develop a more generalized model that can be

  2. Compressive and Noncompressive Power Spectral Density Estimation from Periodic Nonuniform Samples

    CERN Document Server

    Lexa, Michael A; Thompson, John S

    2011-01-01

    This paper presents a novel power spectral density estimation technique for bandlimited, wide-sense stationary signals from sub-Nyquist sampled data. The technique employs multi-coset sampling and applies to spectrally sparse and nonsparse power spectra alike. For sparse density functions, we apply compressed sensing theory and the resulting compressive estimates exhibit better tradeoffs among the estimator's resolution, system complexity, and average sampling rate compared to their noncompressive counterparts. Both compressive and noncompressive estimates, however, can be computed at arbitrarily low sampling rates. The estimator does not require signal reconstruction and can be directly obtained from solving either a least squares or a nonnegative least squares problem. The estimates are piecewise constant approximations whose resolutions (width of the piecewise constant segments) are controlled by the periodicity of the multi-coset sampling. The estimates are also statistically consistent. This method is wi...

  3. The importance of spatial models for estimating the strength of density dependence

    DEFF Research Database (Denmark)

    Thorson, James T.; Skaug, Hans J.; Kristensen, Kasper;

    2014-01-01

    Identifying the existence and magnitude of density dependence is one of the oldest concerns in ecology. Ecologists have aimed to estimate density dependence in population and community data by fitting a simple autoregressive (Gompertz) model for density dependence to time series of abundance for an entire population. However, it is increasingly recognized that spatial heterogeneity in population densities has implications for population and community dynamics. We therefore adapt the Gompertz model to approximate local densities over continuous space instead of population-wide abundance, and to allow productivity to vary spatially. Using simulated data generated from a spatial model, we show that the conventional (nonspatial) Gompertz model will result in biased estimates of density dependence, e.g., identifying oscillatory dynamics when not present. By contrast, the spatial Gompertz model
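
    For reference, the nonspatial Gompertz model regresses log-density at time t+1 on log-density at time t; the slope measures the strength of density dependence. The sketch below simulates such a series and recovers the coefficient by ordinary least squares (the paper's spatial version is considerably more involved).

```python
import numpy as np

rng = np.random.default_rng(5)
a, b, sigma = 0.4, 0.8, 0.1      # intercept, density dependence, process noise
logN = np.empty(200)
logN[0] = a / (1 - b)            # start near equilibrium

# Simulate the Gompertz model: log N[t+1] = a + b * log N[t] + noise
for t in range(199):
    logN[t + 1] = a + b * logN[t] + rng.normal(scale=sigma)

# Estimate b by OLS of log N[t+1] on log N[t]
x, y = logN[:-1], logN[1:]
b_hat = np.polyfit(x, y, 1)[0]
print("true b =", b, "estimated b =", round(b_hat, 3))
```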

  4. The Visualization and Analysis of POI Features under Network Space Supported by Kernel Density Estimation

    Directory of Open Access Journals (Sweden)

    YU Wenhao

    2015-01-01

    Full Text Available The distribution pattern and density of urban facility POIs are of great significance in the fields of infrastructure planning and urban spatial analysis. Kernel density estimation, which is usually used to express these spatial characteristics, is superior to other density estimation methods (such as quadrat analysis and the Voronoi-based method) because it accounts for regional influence, in line with the first law of geography. However, traditional kernel density estimation is based on Euclidean space, ignoring the fact that the service function and interrelation of urban facilities operate over network path distance rather than conventional Euclidean distance. Hence, this research proposes a computational model of network kernel density estimation, and an extension of the model for the case of added constraints. This work also discusses the impact of the distance attenuation threshold and height extremes on the representation of kernel density. A large-scale experiment with real data, analyzing the distribution patterns of different POI types (random, sparse, regionally intensive and linearly intensive), discusses the spatial distribution characteristics, influence factors and service functions of POI infrastructure in the city.
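
    A toy sketch of kernel density over network distance, assuming networkx is available: the density at each node is the sum of kernel weights of POI nodes within a bandwidth, computed from shortest-path distances rather than Euclidean ones. The network, POI placement and kernel choice are all assumptions for illustration.

```python
import networkx as nx
import numpy as np

# Toy road network with edge lengths (hypothetical)
G = nx.Graph()
G.add_weighted_edges_from([(0, 1, 1.0), (1, 2, 1.5), (2, 3, 1.0),
                           (1, 4, 2.0), (4, 5, 1.0)], weight="length")
pois = [0, 2, 2, 5]          # POIs snapped to network nodes
h = 2.0                      # network-distance bandwidth

def network_kde(G, pois, h):
    """Density at every node: sum of Epanechnikov kernels of the
    shortest-path (network) distance to each POI."""
    dens = {v: 0.0 for v in G}
    for p in pois:
        dist = nx.single_source_dijkstra_path_length(G, p, cutoff=h,
                                                     weight="length")
        for v, d in dist.items():
            dens[v] += 0.75 * (1 - (d / h) ** 2) / h   # Epanechnikov kernel
    return dens

print(network_kde(G, pois, h))
```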

  5. Density estimation of Yangtze finless porpoises using passive acoustic sensors and automated click train detection.

    Science.gov (United States)

    Kimura, Satoko; Akamatsu, Tomonari; Li, Songhai; Dong, Shouyue; Dong, Lijun; Wang, Kexiong; Wang, Ding; Arai, Nobuaki

    2010-09-01

    A method is presented to estimate the density of finless porpoises using stationed passive acoustic monitoring. The number of click trains detected by stereo acoustic data loggers (A-tag) was converted to an estimate of the density of porpoises. First, an automated off-line filter was developed to detect a click train among noise, and the detection and false-alarm rates were calculated. Second, a density estimation model was proposed. The cue-production rate was measured by biologging experiments. The probability of detecting a cue and the area size were calculated from the source level, beam patterns, and a sound-propagation model. The effect of group size on the cue-detection rate was examined. Third, the proposed model was applied to estimate the density of finless porpoises at four locations from the Yangtze River to the inside of Poyang Lake. The estimated mean daily density of porpoises decreased from the main stream to the lake. Long-term monitoring over 466 days, from June 2007 to May 2009, showed the density varying between 0 and 4.79; however, the density was fewer than 1 porpoise/km² during 94% of the period. These results suggest a potential gap and seasonal migration of the population in the bottleneck of Poyang Lake.
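
    The abstract follows the general logic of cue counting: convert detected cues to animal density using the cue-production rate, the detection probability and the monitored area. The sketch below implements that generic estimator with invented numbers; it is not the paper's calibrated model.

```python
def cue_density(n_cues, hours, cue_rate_per_hour, det_prob, area_km2,
                false_alarm=0.0):
    """Animal density (animals/km^2) from passive acoustic cue counts:
    D = n * (1 - false_alarm) / (T * r * p * a)."""
    return (n_cues * (1.0 - false_alarm)
            / (hours * cue_rate_per_hour * det_prob * area_km2))

# Hypothetical values, for illustration only
print(cue_density(n_cues=5200, hours=24, cue_rate_per_hour=180,
                  det_prob=0.6, area_km2=1.2, false_alarm=0.05))
```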

  6. KDE-Track: An Efficient Dynamic Density Estimator for Data Streams

    KAUST Repository

    Qahtan, Abdulhakim Ali Ali

    2016-11-08

    Recent developments in sensors, global positioning system devices and smart phones have increased the availability of spatiotemporal data streams. Developing models for mining such streams is challenged by the huge amount of data that cannot be stored in the memory, the high arrival speed and the dynamic changes in the data distribution. Density estimation is an important technique in stream mining for a wide variety of applications. The construction of kernel density estimators is well studied and documented. However, existing techniques are either expensive or inaccurate and unable to capture the changes in the data distribution. In this paper, we present a method called KDE-Track to estimate the density of spatiotemporal data streams. KDE-Track can efficiently estimate the density function with linear time complexity using interpolation on a kernel model, which is incrementally updated upon the arrival of new samples from the stream. We also propose an accurate and efficient method for selecting the bandwidth value for the kernel density estimator, which increases its accuracy significantly. Both theoretical analysis and experimental validation show that KDE-Track outperforms a set of baseline methods on the estimation accuracy and computing time of complex density structures in data streams.
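
    A stripped-down sketch of the underlying idea of maintaining a kernel model on grid points and answering queries by interpolation; the simple incremental-mean update and fixed grid are simplifications assumed for illustration, not KDE-Track itself.

```python
import numpy as np

class GridKDE:
    """Running KDE on a fixed 1-D grid; queries use linear interpolation."""
    def __init__(self, lo, hi, m=256, h=0.1):
        self.grid = np.linspace(lo, hi, m)
        self.dens = np.zeros(m)
        self.h, self.n = h, 0

    def update(self, x):
        """Fold one streaming sample into the running average of kernels."""
        k = np.exp(-0.5 * ((self.grid - x) / self.h) ** 2)
        k /= self.h * np.sqrt(2 * np.pi)
        self.n += 1
        self.dens += (k - self.dens) / self.n   # incremental mean

    def pdf(self, x):
        return np.interp(x, self.grid, self.dens)

est = GridKDE(-4, 4)
for x in np.random.default_rng(6).standard_normal(5000):
    est.update(x)
print(est.pdf(0.0))   # close to the standard normal peak, ~0.399
```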

  7. Using gravity data to estimate the density of surface rocks of Taiwan region

    Science.gov (United States)

    Lo, Y. T.; Horng-Yen, Y.

    2016-12-01

    Surface rock density used in the terrain correction step is one of the important parameters for obtaining a Bouguer anomaly map. In past work, we obtained the Bouguer anomaly map using an average correction density over a wide range of the study area. In this study, we estimate a better correction density for each observation point, one consistent with the surface geology, in order to improve the accuracy of the Bouguer anomaly map. There are two main approaches to estimating the correction density from gravity data statistics: the g-H relationship and the Nettleton density profile method. These methods share the following advantages. First, density is estimated from existing gravity observations, avoiding the trouble of directly measuring rock density. Second, once the absolute gravity value, latitude, longitude and elevation of the measuring points are stored in a database, this information can be combined with terrain data to calculate the average rock density over any range. In addition, each measuring point and each terrain-mesh value is independent, so if more accurate gravity or terrain data become available, a single record can be updated without rebuilding the entire database. According to the resulting density distribution map, the trends broadly follow the geological divisions of Taiwan. The average density of the backbone mountain region is about 2.5 to 2.6 g/cm³, the average density of the eastern Central Mountain Range and the Hsuehshan Range is about 2.3 to 2.5 g/cm³, compared with 2.1-2.3 g/cm³ for the western foothills and 1.8-2.0 g/cm³ for the western plains.
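
    The Nettleton profile method chooses the correction density that minimizes the correlation between the Bouguer-corrected anomaly and topography. Below is a compact sketch under that textbook definition, with synthetic gravity and elevation profiles standing in for real survey data.

```python
import numpy as np

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2
rng = np.random.default_rng(7)
h = 100 + 50 * np.sin(np.linspace(0, 4 * np.pi, 80))   # elevation profile (m)

# Synthetic observed gravity: Bouguer slab of true density 2500 kg/m^3 + noise
true_rho = 2500.0
g_obs = 2 * np.pi * G * true_rho * h + rng.normal(scale=1e-6, size=h.size)

def nettleton_density(g_obs, h, rhos):
    """Pick the density whose Bouguer-corrected anomaly is least
    correlated with topography (Nettleton's criterion)."""
    return min(rhos, key=lambda rho: abs(np.corrcoef(
        g_obs - 2 * np.pi * G * rho * h, h)[0, 1]))

cands = np.arange(1800, 3000, 50.0)
print(nettleton_density(g_obs, h, cands))   # ~2500 kg/m^3
```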

  8. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo

    2013-06-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric and based on the asymptotic distribution of the empirical copula process.We perform simulation experiments to evaluate our test and conclude that our method is reliable and powerful for assessing common assumptions on the structure of copulas, particularly when the sample size is moderately large. We illustrate our testing approach on two datasets. © 2013 American Statistical Association.

  9. An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index

    DEFF Research Database (Denmark)

    Dierckx, Goedele; Goegebeur, Yuri; Guillou, Armelle

    2013-01-01

    We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency...

  10. Body Density Estimates from Upper-Body Skinfold Thicknesses Compared to Air-Displacement Plethysmography

    Science.gov (United States)

    Technical Summary Objectives: Determine the effect of body mass index (BMI) on the accuracy of body density (Db) estimated with skinfold thickness (SFT) measurements compared to air displacement plethysmography (ADP) in adults. Subjects/Methods: We estimated Db with SFT and ADP in 131 healthy men an...

  11. An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index

    DEFF Research Database (Denmark)

    Dierckx, G.; Goegebeur, Y.; Guillou, A.

    2013-01-01

    We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency and asymptotic normality are established, and the performance of the estimator is illustrated by a small simulation experiment involving both uncontaminated and contaminated samples. (C) 2013 Elsevier Inc. All rights reserved.

  12. Estimation of Star-Shaped Distributions

    Directory of Open Access Journals (Sweden)

    Eckhard Liebscher

    2016-11-01

    Full Text Available Scatter plots of multivariate data sets motivate modeling of star-shaped distributions beyond elliptically contoured ones. We study properties of estimators for the density generator function, the star-generalized radius distribution and the density in a star-shaped distribution model. For the generator function and the star-generalized radius density, we consider a non-parametric kernel-type estimator. This estimator is combined with a parametric estimator for the contours which are assumed to follow a parametric model. Therefore, the semiparametric procedure features the flexibility of nonparametric estimators and the simple estimation and interpretation of parametric estimators. Alternatively, we consider pure parametric estimators for the density. For the semiparametric density estimator, we prove rates of uniform, almost sure convergence which coincide with the corresponding rates of one-dimensional kernel density estimators when excluding the center of the distribution. We show that the standardized density estimator is asymptotically normally distributed. Moreover, the almost sure convergence rate of the estimated distribution function of the star-generalized radius is derived. A particular new two-dimensional distribution class is adapted here to agricultural and financial data sets.

  13. Sensitivity of fish density estimates to standard analytical procedures applied to Great Lakes hydroacoustic data

    Science.gov (United States)

    Kocovsky, Patrick M.; Rudstam, Lars G.; Yule, Daniel L.; Warner, David M.; Schaner, Ted; Pientka, Bernie; Deller, John W.; Waterfield, Holly A.; Witzel, Larry D.; Sullivan, Patrick J.

    2013-01-01

    Standardized methods of data collection and analysis ensure quality and facilitate comparisons among systems. We evaluated the importance of three recommendations from the Standard Operating Procedure for hydroacoustics in the Laurentian Great Lakes (GLSOP) on density estimates of target species: noise subtraction; setting volume backscattering strength (Sv) thresholds from user-defined minimum target strength (TS) of interest (TS-based Sv threshold); and calculations of an index for multiple targets (Nv index) to identify and remove biased TS values. Eliminating noise had the predictable effect of decreasing density estimates in most lakes. Using the TS-based Sv threshold decreased fish densities in the middle and lower layers in the deepest lakes with abundant invertebrates (e.g., Mysis diluviana). Correcting for biased in situ TS increased measured density up to 86% in the shallower lakes, which had the highest fish densities. The current recommendations by the GLSOP significantly influence acoustic density estimates, but the degree of importance is lake dependent. Applying GLSOP recommendations, whether in the Laurentian Great Lakes or elsewhere, will improve our ability to compare results among lakes. We recommend further development of standards, including minimum TS and analytical cell size, for reducing the effect of biased in situ TS on density estimates.

  14. A nonparametric statistical method for image segmentation using information theory and curve evolution.

    Science.gov (United States)

    Kim, Junmo; Fisher, John W; Yezzi, Anthony; Cetin, Müjdat; Willsky, Alan S

    2005-10-01

    In this paper, we present a new information-theoretic approach to image segmentation. We cast the segmentation problem as the maximization of the mutual information between the region labels and the image pixel intensities, subject to a constraint on the total length of the region boundaries. We assume that the probability densities associated with the image pixel intensities within each region are completely unknown a priori, and we formulate the problem based on nonparametric density estimates. Due to the nonparametric structure, our method does not require the image regions to have a particular type of probability distribution and does not require the extraction and use of a particular statistic. We solve the information-theoretic optimization problem by deriving the associated gradient flows and applying curve evolution techniques. We use level-set methods to implement the resulting evolution. The experimental results based on both synthetic and real images demonstrate that the proposed technique can solve a variety of challenging image segmentation problems. Furthermore, our method, which does not require any training, performs as well as methods based on training.

  15. A method to estimate plant density and plant spacing heterogeneity: application to wheat crops.

    Science.gov (United States)

    Liu, Shouyang; Baret, Fred; Allard, Denis; Jin, Xiuliang; Andrieu, Bruno; Burger, Philippe; Hemmerlé, Matthieu; Comar, Alexis

    2017-01-01

    Plant density and its non-uniformity drive the competition among plants as well as with weeds. They therefore need to be estimated with small uncertainty. An optimal sampling method is proposed to estimate the plant density in wheat crops from plant counts and to reach a given precision. Three experiments were conducted in 2014, resulting in 14 plots across varied sowing densities, cultivars and environmental conditions. The coordinates of the plants along the row were measured on high-resolution RGB images taken from ground level. Results show that the spacings between consecutive plants along the row direction are independent and follow a gamma distribution under the varied conditions experienced. A gamma count model was then derived to define the optimal sample size required to estimate plant density with a given precision. Results suggest that measuring the length of segments containing 90 plants achieves a precision better than 10%, independently of the plant density. This approach is more efficient than the usual method based on fixed-length segments in which the number of plants is counted: there, the optimal length for a given precision depends on the actual plant density. The gamma count model parameters may also be used to quantify the heterogeneity of plant spacing along the row by exploiting the variability between replicated samples. Results show that to achieve a 10% precision on the estimates of the two parameters of the gamma model, 200 elementary samples, corresponding to the spacing between two consecutive plants, should be measured. This method provides an optimal sampling strategy to estimate the plant density and quantify the plant spacing heterogeneity along the row.
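
    To illustrate the sampling logic, the sketch below simulates gamma-distributed plant spacings and checks the precision of the density estimate obtained from segments containing a fixed number of plants; the shape and mean-spacing values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(8)
shape, mean_gap = 2.0, 0.05          # assumed gamma shape, mean spacing (m)
k = 90                               # plants counted per segment

def density_estimates(n_rep):
    """Density estimate per replicate: k plants / length of k gamma gaps."""
    gaps = rng.gamma(shape, mean_gap / shape, size=(n_rep, k))
    return k / gaps.sum(axis=1)      # plants per metre of row

d = density_estimates(2000)
cv = d.std() / d.mean()
print(f"mean density = {d.mean():.1f} plants/m, CV = {cv:.3f}")
# With ~90 plants per segment the coefficient of variation stays below ~10%
```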

  16. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  17. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function; most often, the Cobb-Douglas or the Translog function is used. A misspecified functional form can bias not only the parameter estimates but also the results derived from them, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used. In our application, a non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function is consistent with the "true" relationship between the inputs and the output in our data set. We solve this problem by using non-parametric regression. This approach delivers reasonable results, which are on average not too different from the results of the parametric approach.

  18. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb-Douglas and the Translog are the most common choices. A misspecified functional form results not only in biased parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used, which we demonstrate by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function is consistent with the "true" relationship between the inputs and the output in our data set.

  19. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables. This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp...

  20. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-Miller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency...

  1. Learning Grasp Affordance Densities

    DEFF Research Database (Denmark)

    Detry, Renaud; Kraft, Dirk; Kroemer, Oliver

    2011-01-01

    We address the issue of learning and representing object grasp affordance models. We model grasp affordances with continuous probability density functions (grasp densities) which link object-relative grasp poses to their success probability. The underlying function representation is nonparametric and relies on kernel density estimation to provide a continuous model. Grasp densities are learned and refined from exploration, by letting a robot "play" with an object in a sequence of grasp-and-drop actions: the robot uses visual cues to generate a set of grasp hypotheses; it then executes these and records their outcomes. When a satisfactory number of grasp data is available, an importance-sampling algorithm turns these into a grasp density. We evaluate our method in a largely autonomous learning experiment run on three objects of distinct shapes. The experiment shows how learning increases success...

  2. Glacial density and GIA in Alaska estimated from ICESat, GPS and GRACE measurements

    Science.gov (United States)

    Jin, Shuanggen; Zhang, T. Y.; Zou, F.

    2017-01-01

    The density of glacial volume change in Alaska is a key factor in estimating glacier mass loss from altimetry observations. However, the density of Alaskan glaciers has large uncertainty due to the lack of in situ measurements. In this paper, using measurements from the Ice, Cloud, and land Elevation Satellite (ICESat), the Global Positioning System (GPS), and the Gravity Recovery and Climate Experiment (GRACE) from 2003 to 2009, an optimal density of glacial volume change of 750 kg/m3 is estimated for the first time by fitting the measurements. The glacier mass loss is -57.5 ± 6.5 Gt, obtained by converting the volumetric change from ICESat with the estimated density of 750 kg/m3. Based on the empirical relation, depth-density profiles are constructed, which show how glacial density varies with depth in Alaska. By separating the glacier mass loss from glacial isostatic adjustment (GIA) effects in GPS uplift rates and GRACE total water storage trends, the GIA uplift rates are estimated for Alaska. The best fitting model consists of a 60 km elastic lithosphere and a 110 km thick asthenosphere with a viscosity of 2.0 × 10^19 Pa s over a two-layer mantle.
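
    The conversion underlying the mass-loss figure is straightforward: 1 km3 of material of density ρ kg/m3 has a mass of ρ × 10^-3 Gt. A minimal sketch with first-order error propagation (the volume value and both uncertainties below are illustrative, chosen only to reproduce the order of magnitude quoted above):

    ```python
    def glacier_mass_change(dv_km3, rho, dv_err, rho_err):
        """Mass change in Gt from a volume change in km^3:
        1 km^3 * 1 kg/m^3 = 1e9 kg = 1e-3 Gt, so Gt = km^3 * rho / 1000.
        First-order (independent-error) propagation of the two uncertainties."""
        m = dv_km3 * rho / 1000.0
        rel = ((dv_err / dv_km3) ** 2 + (rho_err / rho) ** 2) ** 0.5
        return m, abs(m) * rel

    # Illustrative inputs giving roughly the -57.5 Gt quoted above:
    print(glacier_mass_change(-76.7, rho=750.0, dv_err=6.0, rho_err=60.0))
    ```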

  3. Estimating population density and connectivity of American mink using spatial capture-recapture

    Science.gov (United States)

    Fuller, Angela K.; Sutherland, Christopher S.; Royle, Andy; Hare, Matthew P.

    2016-01-01

    Estimating the abundance or density of populations is fundamental to the conservation and management of species, and as landscapes become more fragmented, maintaining landscape connectivity has become one of the most important challenges for biodiversity conservation. Yet these two issues have never been formally integrated in a model that simultaneously estimates abundance while accounting for landscape connectivity. We demonstrate an application of capture–recapture to develop a model of animal density using a least-cost path model for individual encounter probability that accounts for non-Euclidean connectivity in a highly structured network. We utilized scat detection dogs (Canis lupus familiaris) to collect non-invasive genetic samples of American mink (Neovison vison) individuals and used spatial capture–recapture (SCR) models to gain inferences about mink population density and connectivity. Density of mink was not constant across the landscape, but rather increased with increasing distance from city, town, or village centers, and mink activity was associated with water. The SCR model allowed us to estimate the density and spatial distribution of individuals across a 388 km2 area. The model was used to investigate patterns of space usage and to evaluate covariate effects on encounter probabilities, including differences between sexes. This study provides an application of capture–recapture models based on ecological distance, allowing us to directly estimate landscape connectivity. This approach should be widely applicable to provide simultaneous direct estimates of density, space usage, and landscape connectivity for many species.

  4. Wavelet Optimal Estimations for Density Functions under Severely Ill-Posed Noises

    Directory of Open Access Journals (Sweden)

    Rui Li

    2013-01-01

    Motivated by Lounici and Nickl's work (2011), this paper considers the problem of estimating a density f based on an independent and identically distributed sample Y1, ..., Yn from g = f*φ. We show a wavelet optimal estimation for a density function over a Besov ball B^s_{r,q}(L) under Lp risk (1 ≤ p < ∞) in the presence of severely ill-posed noises. A wavelet linear estimation is first presented. Then, we prove a lower bound, which shows that our wavelet estimator is optimal. In other words, nonlinear wavelet estimations are not needed in that case. It turns out that our results extend some theorems of Pensky and Vidakovic (1999), as well as of Fan and Koo (2002).

  5. Multiscale functional connectivity estimation on low-density neuronal cultures recorded by high-density CMOS Micro Electrode Arrays.

    Science.gov (United States)

    Maccione, Alessandro; Garofalo, Matteo; Nieus, Thierry; Tedesco, Mariateresa; Berdondini, Luca; Martinoia, Sergio

    2012-06-15

    We used electrophysiological signals recorded by CMOS Micro Electrode Arrays (MEAs) at high spatial resolution to estimate the functional-effective connectivity of sparse hippocampal neuronal networks in vitro, by applying a cross-correlation (CC) based method and ad hoc developed spatio-temporal filtering. Low-density cultures were recorded by a recently introduced CMOS-MEA device providing simultaneous multi-site acquisition at high spatial (21 μm inter-electrode separation) as well as high temporal resolution (8 kHz per channel). The method is applied to estimate functional connections in different cultures and is refined by applying spatio-temporal filters that allow pruning of those functional connections not compatible with signal propagation. This approach permits discrimination between possible causal influence and spurious co-activation, and yields detailed maps down to cellular resolution. Further, a thorough analysis of the link strengths and time delays (i.e., the amplitude and peak position of the CC function) allows characterization of the inferred interconnected networks and supports a possible discrimination of fast mono-synaptic propagations and slow poly-synaptic pathways. By focusing on specific regions of interest we could observe and analyze microcircuits involving connections among a few cells. Finally, the use of the high-density MEA with low-density cultures, analyzed with the proposed approach, enables comparison of the inferred effective links with the network structure obtained by staining procedures.
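
    The core CC step can be sketched compactly: bin the two spike trains, compute a normalised cross-correlogram over a window of lags, and read off the peak amplitude (link strength) and peak lag (putative propagation delay). This is a bare-bones version that omits the authors' spatio-temporal filtering stage; all parameter values are placeholders.

    ```python
    import numpy as np

    def cc_connectivity(spikes_a, spikes_b, bin_ms=1.0, max_lag_ms=25.0,
                        t_max_ms=60000.0):
        """Peak strength and lag of the cross-correlogram between two spike
        trains (spike times in ms)."""
        edges = np.arange(0.0, t_max_ms + bin_ms, bin_ms)
        a = np.histogram(spikes_a, edges)[0].astype(float)
        b = np.histogram(spikes_b, edges)[0].astype(float)
        a -= a.mean()
        b -= b.mean()
        max_lag = int(max_lag_ms / bin_ms)
        lags = np.arange(-max_lag, max_lag + 1)
        cc = np.array([np.dot(a[max(0, -k):len(a) - max(0, k)],
                              b[max(0, k):len(b) - max(0, -k)]) for k in lags])
        cc /= np.linalg.norm(a) * np.linalg.norm(b)        # normalise
        peak = int(np.argmax(np.abs(cc)))
        return cc[peak], lags[peak] * bin_ms               # (strength, delay ms)

    # Toy check: train B systematically fires ~3 ms after train A.
    rng = np.random.default_rng(0)
    a_times = np.sort(rng.uniform(0, 60000, 800))
    b_times = a_times + 3.0 + rng.normal(0, 0.5, a_times.size)
    print(cc_connectivity(a_times, b_times))               # strong peak near +3 ms
    ```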

  6. Spatial capture-recapture models for jointly estimating population density and landscape connectivity

    Science.gov (United States)

    Royle, J. Andrew; Chandler, Richard B.; Gazenski, Kimberly D.; Graves, Tabitha A.

    2013-01-01

    Population size and landscape connectivity are key determinants of population viability, yet no methods exist for simultaneously estimating density and connectivity parameters. Recently developed spatial capture–recapture (SCR) models provide a framework for estimating density of animal populations but thus far have not been used to study connectivity. Rather, all applications of SCR models have used encounter probability models based on the Euclidean distance between traps and animal activity centers, which implies that home ranges are stationary, symmetric, and unaffected by landscape structure. In this paper we devise encounter probability models based on “ecological distance,” i.e., the least-cost path between traps and activity centers, which is a function of both Euclidean distance and animal movement behavior in resistant landscapes. We integrate least-cost path models into a likelihood-based estimation scheme for spatial capture–recapture models in order to estimate population density and parameters of the least-cost encounter probability model. Therefore, it is possible to make explicit inferences about animal density, distribution, and landscape connectivity as it relates to animal movement from standard capture–recapture data. Furthermore, a simulation study demonstrated that ignoring landscape connectivity can result in negatively biased density estimators under the naive SCR model.
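
    The substitution at the heart of the model can be prototyped with a standard shortest-path routine: replace the Euclidean trap-to-activity-center distance with the least-cost distance across a resistance surface, then feed that distance into the encounter model, e.g. p = p0 * exp(-d_lcp^2 / (2 * sigma^2)). A minimal sketch on a hypothetical resistance grid (dedicated SCR software estimates the cost parameters inside the likelihood, which is omitted here):

    ```python
    import numpy as np
    from scipy.sparse import lil_matrix
    from scipy.sparse.csgraph import dijkstra

    def ecological_distance(cost, src_idx):
        """Least-cost distances from one cell to every cell of a resistance
        grid (4-neighbour moves; edge weight = mean of the two cell costs)."""
        nr, nc = cost.shape
        g = lil_matrix((nr * nc, nr * nc))
        for r in range(nr):
            for c in range(nc):
                i = r * nc + c
                for dr, dc in ((0, 1), (1, 0)):
                    rr, cc = r + dr, c + dc
                    if rr < nr and cc < nc:
                        j = rr * nc + cc
                        w = 0.5 * (cost[r, c] + cost[rr, cc])
                        g[i, j] = g[j, i] = w
        return dijkstra(g.tocsr(), indices=src_idx).reshape(nr, nc)

    # A low-resistance "river" crossing a high-resistance landscape:
    cost = np.full((20, 20), 10.0)
    cost[9:11, :] = 1.0
    d = ecological_distance(cost, src_idx=9 * 20)          # from a river cell
    p0, sigma = 0.2, 15.0
    p_detect = p0 * np.exp(-d ** 2 / (2 * sigma ** 2))     # Gaussian encounter model
    ```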

  7. Effect of compression paddle tilt correction on volumetric breast density estimation.

    Science.gov (United States)

    Kallenberg, Michiel G J; van Gils, Carla H; Lokate, Mariëtte; den Heeten, Gerard J; Karssemeijer, Nico

    2012-08-21

    For the acquisition of a mammogram, a breast is compressed between a compression paddle and a support table. When compression is applied with a flexible compression paddle, the upper plate may be tilted, which results in variation in breast thickness from the chest wall to the breast margin. Paddle tilt has been recognized as a major problem in volumetric breast density estimation methods. In previous work, we developed a fully automatic method to correct the image for the effect of compression paddle tilt. In this study, we investigated in three experiments the effect of paddle tilt and its correction on volumetric breast density estimation. Results showed that paddle tilt considerably affected the accuracy of volumetric breast density estimation, but that the effect could be reduced by tilt correction. By applying tilt correction, a significant increase in correspondence between mammographic density estimates and measurements on MRI was established. We argue that in volumetric breast density estimation, tilt correction is both feasible and essential when mammographic images are acquired with a flexible compression paddle.

  8. Estimation of tiger densities in India using photographic captures and recaptures

    Science.gov (United States)

    Karanth, U.; Nichols, J.D.

    1998-01-01

    Previously applied methods for estimating tiger (Panthera tigris) abundance using total counts based on tracks have proved unreliable. In this paper we use a field method proposed by Karanth (1995), combining camera-trap photography to identify individual tigers based on stripe patterns, with capture-recapture estimators. We developed a sampling design for camera-trapping and used the approach to estimate tiger population size and density in four representative tiger habitats in different parts of India. The field method worked well and provided data suitable for analysis using closed capture-recapture models. The results suggest the potential for applying this methodology for estimating abundances, survival rates and other population parameters in tigers and other low-density, secretive animal species with distinctive coat patterns or other external markings. Estimated probabilities of photo-capturing tigers present in the study sites ranged from 0.75 to 1.00. The estimated mean tiger densities ranged from 4.1 (SE = 1.31) to 11.7 (SE = 1.93) tigers/100 km2. The results support the previous suggestions of Karanth and Sunquist (1995) that densities of tigers and other large felids may be primarily determined by prey community structure at a given site.

  9. Estimating detection and density of the Andean cat in the high Andes

    Science.gov (United States)

    Reppucci, Juan; Gardner, Beth; Lucherini, Mauro

    2011-01-01

    The Andean cat (Leopardus jacobita) is one of the most endangered, yet least known, felids. Although the Andean cat is considered at risk of extinction, rigorous quantitative population studies are lacking. Because physical observations of the Andean cat are difficult to make in the wild, we used a camera-trapping array to photo-capture individuals. The survey was conducted in northwestern Argentina at an elevation of approximately 4,200 m during October–December 2006 and April–June 2007. In each year we deployed 22 pairs of camera traps, which were strategically placed. To estimate detection probability and density we applied models for spatial capture–recapture using a Bayesian framework. Estimated densities were 0.07 and 0.12 individual/km2 for 2006 and 2007, respectively. Mean baseline detection probability was estimated at 0.07. By comparison, densities of the Pampas cat (Leopardus colocolo), another poorly known felid that shares its habitat with the Andean cat, were estimated at 0.74–0.79 individual/km2 in the same study area for 2006 and 2007, and its detection probability was estimated at 0.02. Despite having greater detectability, the Andean cat is rarer in the study region than the Pampas cat. Properly accounting for the detection probability is important in making reliable estimates of density, a key parameter in conservation and management decisions for any species.

  10. A Bayesian nonparametric method for prediction in EST analysis

    Directory of Open Access Journals (Sweden)

    Prünster Igor

    2007-09-01

    Background: Expressed sequence tag (EST) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size, and to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed with sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results: In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: (a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; (b) the number of new unique genes to be observed in a future sample; (c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries previously studied with frequentist methods are analyzed in detail. Conclusion: The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample.
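
    To make the prediction step concrete under the simplest Bayesian nonparametric prior, the Dirichlet process with precision α (the paper works with a richer class of priors, so this is only the base case): after n reads, the expected number of new genes among m further reads is α[ψ(α+n+m) − ψ(α+n)], where ψ is the digamma function, and the discovery rate for the next read is α/(α+n+m).

    ```python
    from scipy.special import digamma

    def expected_new_genes(alpha, n, m):
        """E[# previously unseen genes in m further reads | n reads observed]
        under a Dirichlet process prior with precision alpha."""
        return alpha * (digamma(alpha + n + m) - digamma(alpha + n))

    def discovery_rate(alpha, n, m):
        """Probability that read n + m + 1 reveals a new gene under DP(alpha)."""
        return alpha / (alpha + n + m)

    # Illustrative values only; alpha would be estimated from the pilot sample.
    print(expected_new_genes(alpha=150.0, n=2000, m=2000))
    print(discovery_rate(alpha=150.0, n=2000, m=2000))
    ```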

  11. Robust Depth-Weighted Wavelet for Nonparametric Regression Models

    Institute of Scientific and Technical Information of China (English)

    Lu LIN

    2005-01-01

    In nonparametric regression models, the original regression estimators, including the kernel, Fourier series and wavelet estimators, are always constructed as weighted sums of the data, where the weights depend only on the distance between the design points and the estimation points. As a result, these estimators are not robust to perturbations in the data. To avoid this problem, a new nonparametric regression model, called the depth-weighted regression model, is introduced, and the depth-weighted wavelet estimation is then defined. The new estimator is robust to perturbations in the data and attains a very high breakdown value, close to 1/2. On the other hand, some asymptotic behaviours, such as asymptotic normality, are obtained. Simulations illustrate that the proposed wavelet estimator is more robust than the original wavelet estimator and that, as a price to pay for the robustness, the new method is slightly less efficient than the original method.

  12. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  13. Effects of tissue heterogeneity on the optical estimate of breast density

    Science.gov (United States)

    Taroni, Paola; Pifferi, Antonio; Quarto, Giovanna; Spinelli, Lorenzo; Torricelli, Alessandro; Abbate, Francesca; Balestreri, Nicola; Ganino, Serena; Menna, Simona; Cassano, Enrico; Cubeddu, Rinaldo

    2012-01-01

    Breast density is a recognized strong and independent risk factor for developing breast cancer. At present, breast density is assessed based on the radiological appearance of breast tissue, thus relying on the use of ionizing radiation. We have previously obtained encouraging preliminary results with our portable instrument for time domain optical mammography performed at 7 wavelengths (635–1060 nm). In that case, information was averaged over four images (cranio-caudal and oblique views of both breasts) available for each subject. In the present work, we tested the effectiveness of just one or few point measurements, to investigate if tissue heterogeneity significantly affects the correlation between optically derived parameters and mammographic density. Data show that parameters estimated through a single optical measurement correlate strongly with mammographic density estimated by using BIRADS categories. A central position is optimal for the measurement, but its exact location is not critical. PMID:23082283

  14. Multidimensional kernel estimation

    CERN Document Server

    Milosevic, Vukasin

    2015-01-01

    Kernel estimation is one of the non-parametric methods used for estimating a probability density function. Its first ROOT implementation, as part of the RooFit package, has one major issue: its evaluation time is extremely slow, making it almost unusable. The goal of this project was to create a new class (TKNDTree) which follows the original idea of kernel estimation, greatly improves the evaluation time (using the TKTree class for storing the data and creating different user-controlled modes of evaluation) and adds an interpolation option for the 2D case, with the help of the new Delaunay2D class.
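
    For context, the computation that makes a naive KDE slow is an O(n·m) pass over all sample-query pairs; tree structures like the k-d tree underlying TKNDTree exist precisely to prune it. A minimal dense NumPy version of the d-dimensional Gaussian KDE (not the ROOT implementation):

    ```python
    import numpy as np

    def kde(points, x, h):
        """Gaussian kernel density estimate: points (n, d), queries x (m, d),
        scalar bandwidth h. Cost is O(n * m * d) -- the slow baseline."""
        n, d = points.shape
        diff = (x[:, None, :] - points[None, :, :]) / h    # (m, n, d)
        k = np.exp(-0.5 * np.sum(diff ** 2, axis=-1))      # Gaussian kernel
        return k.sum(axis=1) / (n * (h * np.sqrt(2.0 * np.pi)) ** d)

    rng = np.random.default_rng(1)
    sample = rng.normal(size=(1000, 2))
    queries = np.array([[0.0, 0.0], [1.0, 1.0]])
    print(kde(sample, queries, h=0.3))   # roughly 0.16 and 0.06 for a 2-d normal
    ```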

  15. [Estimation of Hunan forest carbon density based on spectral mixture analysis of MODIS data].

    Science.gov (United States)

    Yan, En-ping; Lin, Hui; Wang, Guang-xing; Chen, Zhen-xiong

    2015-11-01

    With the fast development of remote sensing technology, combining forest inventory sample plot data and remotely sensed images has become a widely used method to map forest carbon density. However, the existence of mixed pixels often impedes the improvement of forest carbon density mapping, especially when low spatial resolution images such as MODIS are used. In this study, MODIS images and national forest inventory sample plot data were used to estimate forest carbon density. Linear spectral mixture analysis with and without constraint, and nonlinear spectral mixture analysis were compared to derive the fractions of different land use and land cover (LULC) types. Then the sequential Gaussian co-simulation algorithm, with and without the fraction images from the spectral mixture analyses, was employed to estimate the forest carbon density of Hunan Province. Results showed that 1) linear spectral mixture analysis with constraint, leading to a mean RMSE of 0.002, estimated the fractions of LULC types more accurately than linear spectral mixture analysis without constraint and nonlinear spectral mixture analysis; 2) integrating the spectral mixture analysis model and the sequential Gaussian co-simulation algorithm increased the estimation accuracy of forest carbon density from 74.1% to 81.5%, and decreased the RMSE from 7.26 to 5.18; and 3) the mean forest carbon density for the province was 30.06 t/hm2, ranging from 0.00 to 67.35 t/hm2. This implies that spectral mixture analysis provides great potential to increase the estimation accuracy of forest carbon density at regional and global levels.
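
    The constrained linear unmixing step compared here can be sketched as non-negative least squares with a heavily weighted sum-to-one row appended to the endmember matrix, a common device for enforcing the abundance constraints approximately. The endmember spectra below are random placeholders, not MODIS values:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def unmix(endmembers, spectrum, sum_weight=1e3):
        """Fully constrained unmixing of one pixel: min ||E f - s|| subject to
        f >= 0 (via NNLS) and sum(f) = 1 (via the appended weighted row)."""
        k = endmembers.shape[1]
        a = np.vstack([endmembers, sum_weight * np.ones((1, k))])
        b = np.append(spectrum, sum_weight)
        fractions, _ = nnls(a, b)
        return fractions

    # Hypothetical 7-band endmembers for three LULC classes:
    e = np.abs(np.random.default_rng(3).normal(0.3, 0.1, size=(7, 3)))
    mixed = 0.6 * e[:, 0] + 0.3 * e[:, 1] + 0.1 * e[:, 2]
    print(unmix(e, mixed).round(3))      # recovers ~[0.6, 0.3, 0.1]
    ```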

  16. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and to infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks, demonstrating how models for complex networks can be derived and pointing out relevant literature.

  17. Estimation Prospects of the Source Number Density of Ultra-high-energy Cosmic Rays

    OpenAIRE

    Takami, Hajime; Sato, Katsuhiko

    2007-01-01

    We discuss the possibility of accurately estimating the source number density of ultra-high-energy cosmic rays (UHECRs) using the small-scale anisotropy in their arrival distribution. The arrival distribution carries information on the UHECR sources and their distribution. We calculate the propagation of UHE protons in a structured extragalactic magnetic field (EGMF) and simulate their arrival distribution at the Earth using our previously developed method. The source number density that can best reproduce...

  18. Bulk density estimation using a 3-dimensional image acquisition and analysis system

    Directory of Open Access Journals (Sweden)

    Heyduk Adam

    2016-01-01

    The paper presents a concept for dynamic bulk density estimation of a particulate matter stream using a 3D image analysis system and a conveyor belt scale. The method of image acquisition should be adjusted to the type of scale. The paper presents laboratory results of static bulk density measurements using the MS Kinect time-of-flight camera and OpenCV/Matlab software. Measurements were made for several different size classes.

  19. Statistical Analysis of the Spectral Density Estimate Obtained via Coifman Scaling Function

    OpenAIRE

    2007-01-01

    Spectral density, built as the Fourier transform of the covariance sequence of a stationary random process, determines the process characteristics and allows analysis of its structure. Thus, one of the main problems in time series analysis is constructing consistent estimates of the spectral density from successive observations of a stationary random process taken at equal time intervals. This article is devoted to the investigation of problems dealing with the application of wavelet anal...

  20. Right-Censored Nonparametric Regression: A Comparative Simulation Study

    Directory of Open Access Journals (Sweden)

    Dursun Aydın

    2016-11-01

    This paper introduces the operation of selection criteria for right-censored nonparametric regression using smoothing splines. In order to transform the response variable into a variable that contains the right-censorship, we used the Kaplan-Meier weights proposed by [1] and [2]. The major problem in the smoothing spline method is determining a smoothing parameter to obtain nonparametric estimates of the regression function. In this study, the mentioned parameter is chosen based on censored data by means of criteria such as the improved Akaike information criterion (AICc), the Bayesian (or Schwarz) information criterion (BIC) and generalized cross-validation (GCV). For this purpose, a Monte Carlo simulation study is carried out to illustrate which selection criterion gives the best estimation for censored data.
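
    For any linear smoother with hat matrix H(λ), the GCV criterion used above has the compact form GCV(λ) = n·RSS(λ)/(n − tr H(λ))². A minimal uncensored sketch with a penalized truncated-power spline basis (the paper applies such criteria to the Kaplan-Meier-weighted response, which is omitted here):

    ```python
    import numpy as np

    def gcv_spline_lambda(x, y, lambdas, n_knots=10):
        """Pick the smoothing parameter of a penalized cubic spline by GCV."""
        knots = np.quantile(x, np.linspace(0.05, 0.95, n_knots))
        B = np.column_stack([np.ones_like(x), x, x**2, x**3] +
                            [np.maximum(x - k, 0.0) ** 3 for k in knots])
        n, p = B.shape
        pen = np.eye(p)
        pen[:4, :4] = 0.0                # leave the cubic polynomial unpenalized
        best_lam, best_gcv = None, np.inf
        for lam in lambdas:
            H = B @ np.linalg.solve(B.T @ B + lam * pen, B.T)
            rss = float(np.sum((y - H @ y) ** 2))
            gcv = n * rss / (n - np.trace(H)) ** 2
            if gcv < best_gcv:
                best_lam, best_gcv = lam, gcv
        return best_lam

    rng = np.random.default_rng(9)
    x = np.sort(rng.uniform(0, 1, 150))
    y = np.sin(4 * np.pi * x) + rng.normal(0, 0.3, 150)
    print(gcv_spline_lambda(x, y, lambdas=10.0 ** np.arange(-8, 2)))
    ```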

  1. Estimate of the density of Eucalyptus grandis W. Hill ex Maiden using near infrared spectroscopy

    Directory of Open Access Journals (Sweden)

    Silviana Rosso

    2013-12-01

    This study aimed to analyze the use of near infrared spectroscopy (NIRS) to estimate the wood density of Eucalyptus grandis. For that, 66 27-year-old trees were logged and central planks were removed from each log. Test pieces of 2.5 x 2.5 x 5.0 cm were removed from the base of each plank, in the pith-bark direction, and subjected to determination of bulk and basic density at 12% moisture (dry basis), followed by spectral readings in the radial, tangential and transverse directions using a Bruker Tensor 37 infrared spectrophotometer. The calibration to estimate wood density was developed based on the matrix of spectra obtained from the radial face, containing 216 samples. The partial least squares regression to estimate the bulk wood density of Eucalyptus grandis provided a coefficient of determination of validation of 0.74 and a ratio of performance to deviation of 2.29. The statistics of the predictive models had adequate magnitudes for estimating wood density from unknown samples, indicating that the technique has potential for use in replacement of conventional testing.
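
    A PLS calibration of this kind takes only a few lines in any statistics environment. The sketch below runs on synthetic stand-in spectra (the study's 216 real NIR spectra are not reproduced) and reports the two figures of merit quoted above: the cross-validated coefficient of determination and the ratio of performance to deviation (RPD).

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(7)
    spectra = rng.normal(size=(216, 500))            # fake absorbance matrix
    density = 0.55 + 0.05 * spectra[:, 100] + rng.normal(0, 0.02, 216)

    pls = PLSRegression(n_components=8)
    pred = cross_val_predict(pls, spectra, density, cv=10).ravel()

    ss_res = np.sum((density - pred) ** 2)
    ss_tot = np.sum((density - density.mean()) ** 2)
    r2_cv = 1.0 - ss_res / ss_tot                    # validation R^2
    rpd = density.std(ddof=1) / (density - pred).std(ddof=1)
    print(round(r2_cv, 2), round(rpd, 2))
    ```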

  2. Estimating the amount and distribution of radon flux density from the soil surface in China.

    Science.gov (United States)

    Zhuo, Weihai; Guo, Qiuju; Chen, Bo; Cheng, Guan

    2008-07-01

    Based on an idealized model, both the annual and the seasonal radon (222Rn) flux densities from the soil surface at 1099 sites in China were estimated by linking a database of soil 226Ra content with a global ecosystems database. Digital maps of the 222Rn flux density in China were constructed at a spatial resolution of 25 km x 25 km by interpolation among the estimated data. An area-weighted annual average 222Rn flux density from the soil surface across China was estimated to be 29.7 ± 9.4 mBq m-2 s-1. Both regional and seasonal variations in the 222Rn flux densities are significant in China. Annual average flux densities in southeastern and northwestern China are generally higher than those in other regions, because of the high soil 226Ra content in the southeastern area and the high soil aridity in the northwestern one. The seasonal average flux density is generally higher in summer/spring than in winter, since relatively higher soil temperatures and lower soil water saturation in summer/spring than in other seasons are common in China.

  3. PEDO-TRANSFER FUNCTIONS FOR ESTIMATING SOIL BULK DENSITY IN CENTRAL AMAZONIA

    Directory of Open Access Journals (Sweden)

    Henrique Seixas Barros

    2015-04-01

    Under field conditions in the Amazon forest, soil bulk density is difficult to measure. Rigorous methodological criteria must be applied to obtain reliable inventories of C stocks and soil nutrients, making this process expensive and sometimes unfeasible. This study aimed to generate models to estimate soil bulk density based on parameters that can be easily and reliably measured in the field and that are available in many soil-related inventories. Stepwise regression models to predict bulk density were developed using data on soil C content, clay content and pH in water from 140 permanent plots in terra firme (upland) forests near Manaus, Amazonas State, Brazil. The model results were interpreted according to the coefficient of determination (R2) and the Akaike information criterion (AIC), and were validated with a dataset consisting of 125 plots different from those used to generate the models. The model with the best performance in estimating soil bulk density under the conditions of this study included clay content and pH in water as independent variables and had R2 = 0.73 and AIC = -250.29. The performance of this model for predicting soil density was compared with that of models from the literature. The results showed that the locally calibrated equation was the most accurate for estimating soil bulk density in upland forests of the Manaus region.

  4. Trapping Elusive Cats: Using Intensive Camera Trapping to Estimate the Density of a Rare African Felid.

    Directory of Open Access Journals (Sweden)

    Eléanor Brassine

    Camera trapping studies have become increasingly popular for producing population estimates of individually recognisable mammals. Yet monitoring techniques for rare species which occur at extremely low densities are lacking. Additionally, species which have unpredictable movements may make obtaining reliable population estimates challenging due to low detectability. Our study explores the effectiveness of intensive camera trapping for estimating cheetah (Acinonyx jubatus) numbers. Using both a more traditional, systematic grid approach and pre-determined, targeted sites for camera placement, the cheetah population of the Northern Tuli Game Reserve, Botswana was sampled between December 2012 and October 2013. Placement of cameras in a regular grid pattern yielded very few (n = 9) cheetah images, and these were insufficient to estimate cheetah density. However, pre-selected cheetah scent-marking posts provided 53 images of seven adult cheetahs (0.61 ± 0.18 cheetahs/100 km²). While increasing the length of the camera trapping survey from 90 to 130 days increased the total number of cheetah images obtained (from 53 to 200), no new individuals were recorded and the estimated population density remained stable. Thus, our study demonstrates that targeted camera placement (irrespective of survey duration) is necessary for reliably assessing cheetah densities where populations are naturally very low or dominated by transient individuals. Significantly, our approach can easily be applied to other rare predator species.

  5. Productivity and population density estimates of the dengue vector mosquito Aedes aegypti (Stegomyia aegypti) in Australia.

    Science.gov (United States)

    Williams, C R; Johnson, P H; Ball, T S; Ritchie, S A

    2013-09-01

    New mosquito control strategies centred on modifying populations require knowledge of existing population densities at release sites and an understanding of breeding site ecology. Using a quantitative pupal survey method, we investigated production of the dengue vector Aedes aegypti (L.) (Stegomyia aegypti) (Diptera: Culicidae) in Cairns, Queensland, Australia, and found that garden accoutrements represented the most common container type. Deliberately placed 'sentinel' containers were set at seven houses and sampled for pupae over 10 weeks during the wet season. Pupal production was approximately constant; tyres and buckets represented the most productive container types. Sentinel tyres produced the largest female mosquitoes, but were relatively rare in the field survey. We then used field-collected data to make estimates of per-premises population density using three different approaches. Estimates of female Ae. aegypti abundance per premises made using the container-inhabiting mosquito simulation (CIMSiM) model [95% confidence interval (CI) 18.5-29.1 females] corresponded reasonably well with estimates obtained using a standing crop calculation based on pupal collections (95% CI 8.8-22.5) and using BG-Sentinel traps with a sampling rate correction factor (95% CI 6.2-35.2). By first describing local Ae. aegypti productivity, we were able to compare three separate population density estimates, which provided similar results. We anticipate that this will provide researchers and health officials with several tools with which to make estimates of population densities.

  6. Estimating cetacean population density using fixed passive acoustic sensors: an example with Blainville's beaked whales.

    Science.gov (United States)

    Marques, Tiago A; Thomas, Len; Ward, Jessica; DiMarzio, Nancy; Tyack, Peter L

    2009-04-01

    Methods are developed for estimating the size/density of cetacean populations using data from a set of fixed passive acoustic sensors. The methods convert the number of detected acoustic cues into animal density by accounting for (i) the probability of detecting cues, (ii) the rate at which animals produce cues, and (iii) the proportion of false positive detections. Additional information is often required for estimation of these quantities, for example from an acoustic tag applied to a sample of animals. The methods are illustrated with a case study: estimation of Blainville's beaked whale density over a 6-day period in spring 2005, using an 82-hydrophone wide-baseline array located in the Tongue of the Ocean, Bahamas. To estimate the required quantities, additional data are used from digital acoustic tags attached to five whales over 21 deep dives, where cues recorded on some of the dives are associated with those received on the fixed hydrophones. Estimated density was 25.3 or 22.5 animals/1000 km2, depending on assumptions about false positive detections, with 95% confidence intervals 17.3-36.9 and 15.4-32.9. These methods are potentially applicable to a wide variety of marine and terrestrial species that are hard to survey using conventional visual methods.
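
    Steps (i)-(iii) collapse into a single formula, D = n(1 − ĉ) / (P̂ · a · T · r̂): detected cues, corrected for the false-positive proportion ĉ, divided by the cue detection probability, the monitored area, the recording time and the per-animal cue rate. A sketch with illustrative inputs (not the study's actual values):

    ```python
    def cue_density(n_cues, false_pos, p_det, area_km2, t_hours, cue_rate):
        """Cue-counting density estimator: animals per km^2.
        n_cues: detected cues; false_pos: false-positive proportion c_hat;
        p_det: prob. of detecting a cue produced in the monitored area (km^2);
        t_hours: recording time; cue_rate: cues per animal per hour."""
        return n_cues * (1.0 - false_pos) / (p_det * area_km2 * t_hours * cue_rate)

    d = cue_density(n_cues=2940, false_pos=0.2, p_det=0.032,
                    area_km2=20.0, t_hours=24.0 * 6, cue_rate=1500.0)
    print(round(d * 1000, 1), "animals per 1000 km^2")   # illustrative output
    ```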

  7. High-order ionospheric effects on electron density estimation from Fengyun-3C GPS radio occultation

    Science.gov (United States)

    Li, Junhai; Jin, Shuanggen

    2017-03-01

    GPS radio occultation can estimate ionospheric electron density and total electron content (TEC) with high spatial resolution, e.g., China's recent Fengyun-3C GPS radio occultation. However, high-order ionospheric delays are normally ignored. In this paper, the high-order ionospheric effects on electron density estimation from the Fengyun-3C GPS radio occultation data are estimated and investigated using the NeQuick2 ionosphere model and the IGRF12 (International Geomagnetic Reference Field, 12th generation) geomagnetic model. Results show that the high-order ionospheric delays have large effects on electron density estimation, up to 800 el/cm3, which should be corrected in high-precision ionospheric density estimation and applications. The second-order ionospheric effects are more significant, particularly at 250-300 km, while third-order ionospheric effects are much smaller. Furthermore, the high-order ionospheric effects are related to location, local time, radio occultation azimuth and solar activity. Large high-order ionospheric effects are found in the low-latitude area and in the daytime, as well as during strong solar activity. The second-order ionospheric effects have a maximum positive value when the radio occultation azimuth is around 0-20°, and a maximum negative value when the radio occultation azimuth is around -180 to -160°. Moreover, geomagnetic storms also affect the high-order ionospheric delay, which should be carefully corrected.

  8. Trapping Elusive Cats: Using Intensive Camera Trapping to Estimate the Density of a Rare African Felid.

    Science.gov (United States)

    Brassine, Eléanor; Parker, Daniel

    2015-01-01

    Camera trapping studies have become increasingly popular to produce population estimates of individually recognisable mammals. Yet, monitoring techniques for rare species which occur at extremely low densities are lacking. Additionally, species which have unpredictable movements may make obtaining reliable population estimates challenging due to low detectability. Our study explores the effectiveness of intensive camera trapping for estimating cheetah (Acinonyx jubatus) numbers. Using both a more traditional, systematic grid approach and pre-determined, targeted sites for camera placement, the cheetah population of the Northern Tuli Game Reserve, Botswana was sampled between December 2012 and October 2013. Placement of cameras in a regular grid pattern yielded very few (n = 9) cheetah images and these were insufficient to estimate cheetah density. However, pre-selected cheetah scent-marking posts provided 53 images of seven adult cheetahs (0.61 ± 0.18 cheetahs/100 km²). While increasing the length of the camera trapping survey from 90 to 130 days increased the total number of cheetah images obtained (from 53 to 200), no new individuals were recorded and the estimated population density remained stable. Thus, our study demonstrates that targeted camera placement (irrespective of survey duration) is necessary for reliably assessing cheetah densities where populations are naturally very low or dominated by transient individuals. Significantly our approach can easily be applied to other rare predator species.

  9. A hierarchical model for estimating density in camera-trap studies

    Science.gov (United States)

    Royle, J. Andrew; Nichols, James D.; Karanth, K.Ullas; Gopalaswamy, Arjun M.

    2009-01-01

    Estimating animal density using capture–recapture data from arrays of detection devices such as camera traps has been problematic due to the movement of individuals and the heterogeneity in capture probability among them induced by differential exposure to trapping. We develop a spatial capture–recapture model for estimating density from camera-trapping data which contains explicit models for the spatial point process governing the distribution of individuals and their exposure to and detection by traps. We adopt a Bayesian approach to analysis of the hierarchical model using the technique of data augmentation. The model is applied to photographic capture–recapture data on tigers Panthera tigris in Nagarahole reserve, India. Using this model, we estimate the density of tigers to be 14.3 animals per 100 km2 during 2004. Synthesis and applications: our modelling framework largely overcomes several weaknesses in conventional approaches to the estimation of animal density from trap arrays. It effectively deals with key problems such as individual heterogeneity in capture probabilities, movement of individuals, presence of potential 'holes' in the array and ad hoc estimation of sample area. The formulation thus greatly enhances flexibility in the conduct of field surveys as well as in the analysis of data, from studies that may involve physical, photographic or DNA-based 'captures' of individual animals.

  10. Examples of the Application of Nonparametric Information Geometry to Statistical Physics

    Directory of Open Access Journals (Sweden)

    Giovanni Pistone

    2013-09-01

    We review a nonparametric version of Amari's information geometry, in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is used to discuss typical problems in machine learning and statistical physics, such as black-box optimization, Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.

  11. Estimation of current density distribution of PAFC by analysis of cell exhaust gas

    Energy Technology Data Exchange (ETDEWEB)

    Kato, S.; Seya, A. [Fuji Electric Co., Ltd., Ichihara-shi (Japan); Asano, A. [Fuji Electric Corporate, Ltd., Yokosuka-shi (Japan)

    1996-12-31

    Estimating the distributions of current densities, voltages, gas concentrations, etc., in phosphoric acid fuel cell (PAFC) stacks is very important for producing fuel cells of higher quality. In this work, we have developed a numerical simulation tool to map out these distributions in a PAFC stack. In particular, to study the current density distribution in the reaction area of the cell, we analyzed the gas composition at several positions inside a gas outlet manifold of the PAFC stack. By comparing these measured data with calculated data, the current density distribution in a cell plane calculated by the simulation was verified.

  12. Scent Lure Effect on Camera-Trap Based Leopard Density Estimates.

    Directory of Open Access Journals (Sweden)

    Alexander Richard Braczkowski

    Density estimates for large carnivores derived from camera surveys often have wide confidence intervals due to low detection rates. Such estimates are of limited value to authorities, which require precise population estimates to inform conservation strategies. Using lures can potentially increase detection, improving the precision of estimates. However, by altering the spatio-temporal patterning of individuals across the camera array, lures may violate closure, a fundamental assumption of capture-recapture. Here, we test the effect of scent lures on the precision and veracity of density estimates derived from camera-trap surveys of a protected African leopard population. We undertook two surveys (a 'control' and a 'treatment' survey) on Phinda Game Reserve, South Africa. Survey design remained consistent except that a scent lure was applied at camera-trap stations during the treatment survey. Lures did not affect the maximum movement distances (p = 0.96) or the temporal activity of female (p = 0.12) or male leopards (p = 0.79), and the assumption of geographic closure was met for both surveys (p > 0.05). The numbers of photographic captures were also similar for the control and treatment surveys (p = 0.90). Accordingly, density estimates were comparable between surveys, although estimates derived using non-spatial methods (7.28-9.28 leopards/100 km2) were considerably higher than estimates from spatially explicit methods (3.40-3.65 leopards/100 km2). The precision of estimates from the control and treatment surveys was also comparable, and this applied to both non-spatial and spatial methods of estimation. Our findings suggest that, at least in the context of leopard research in productive habitats, the use of lures is not warranted.

  13. Retrieval of mesospheric electron densities using an optimal estimation inverse method

    Science.gov (United States)

    Grant, J.; Grainger, R. G.; Lawrence, B. N.; Fraser, G. J.; von Biel, H. A.; Heuff, D. N.; Plank, G. E.

    2004-03-01

    We present a new method to determine mesospheric electron densities from partially reflected medium frequency radar pulses. The technique uses an optimal estimation inverse method and retrieves both an electron density profile and a gradient electron density profile. As well as accounting for the absorption of the two magnetoionic modes formed by ionospheric birefringence of each radar pulse, the forward model of the retrieval parameterises possible Fresnel scatter of each mode by fine electronic structure, phase changes of each mode due to Faraday rotation and the dependence of the amplitudes of the backscattered modes upon pulse width. Validation results indicate that known profiles can be retrieved and that χ2 tests upon retrieval parameters satisfy validity criteria. Application to measurements shows that retrieved electron density profiles are consistent with accepted ideas about seasonal variability of electron densities and their dependence upon nitric oxide production and transport.

  14. Application of Density Estimation Methods to Datasets Collected From a Glider

    Science.gov (United States)

    2015-09-30

    ... buoyancy. The methodology employed in this study to estimate the population density of marine mammals is based on the works of Zimmer et al. (2008), Marques ... estimation modalities (Thomas and Marques, 2012), such as individual or group counting. In this sense, bearings to received sounds on both hydrophones will ... the sea trial. (Figure 2 of the report shows the area of the REP14-MED sea trial in the context of the western Mediterranean Sea.)

  15. Breast percent density estimation from 3D reconstructed digital breast tomosynthesis images

    Science.gov (United States)

    Bakic, Predrag R.; Kontos, Despina; Carton, Ann-Katherine; Maidment, Andrew D. A.

    2008-03-01

    Breast density is an independent factor of breast cancer risk. In mammograms breast density is quantitatively measured as percent density (PD), the percentage of dense (non-fatty) tissue. To date, clinical estimates of PD have varied significantly, in part due to the projective nature of mammography. Digital breast tomosynthesis (DBT) is a 3D imaging modality in which cross-sectional images are reconstructed from a small number of projections acquired at different x-ray tube angles. Preliminary studies suggest that DBT is superior to mammography in tissue visualization, since superimposed anatomical structures present in mammograms are filtered out. We hypothesize that DBT could also provide a more accurate breast density estimation. In this paper, we propose to estimate PD from reconstructed DBT images using a semi-automated thresholding technique. Preprocessing is performed to exclude the image background and the area of the pectoral muscle. Threshold values are selected manually from a small number of reconstructed slices; a combination of these thresholds is applied to each slice throughout the entire reconstructed DBT volume. The proposed method was validated using images of women with recently detected abnormalities or with biopsy-proven cancers; only contralateral breasts were analyzed. The Pearson correlation and kappa coefficients between the breast density estimates from DBT and the corresponding digital mammogram indicate moderate agreement between the two modalities, comparable with our previous results from 2D DBT projections. Percent density appears to be a robust measure for breast density assessment in both 2D and 3D x-ray breast imaging modalities using thresholding.
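
    The PD computation itself reduces to voxel counting once thresholds are in hand. A simplified sketch, assuming a breast mask and a per-slice threshold array have already been derived (in the method above, thresholds are selected manually on a few slices and then propagated through the volume):

    ```python
    import numpy as np

    def percent_density(volume, breast_mask, thresholds):
        """PD = 100 * (# voxels above threshold) / (# breast voxels), summed
        over all slices of a reconstructed DBT volume."""
        dense = total = 0
        for z in range(volume.shape[0]):
            inside = breast_mask[z]
            dense += int(np.count_nonzero(volume[z][inside] > thresholds[z]))
            total += int(np.count_nonzero(inside))
        return 100.0 * dense / total

    # Synthetic 3-slice example (illustrative only):
    vol = np.random.default_rng(2).uniform(0.0, 1.0, (3, 64, 64))
    mask = np.ones(vol.shape, dtype=bool)
    print(percent_density(vol, mask, thresholds=[0.7, 0.7, 0.7]))   # ~30.0
    ```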

  16. Comparison of non-parametric methods for ungrouping coarsely aggregated data

    Directory of Open Access Journals (Sweden)

    Silvia Rizzi

    2016-05-01

    Background: Histograms are a common tool for estimating densities non-parametrically. They are extensively encountered in the health sciences to summarize data in a compact format. Examples are age-specific distributions of death or disease onset grouped in 5-year age classes with an open-ended age group at the highest ages. When histogram intervals are too coarse, information is lost and comparison between histograms with different boundaries is arduous. In these cases it is useful to estimate detailed distributions from the grouped data. Methods: From an extensive literature search we identified five methods for ungrouping count data. We compare the performance of two spline interpolation methods, two kernel density estimators and a penalized composite link model, first via a simulation study and then with empirical data obtained from the NORDCAN database. All the methods analyzed can be used to estimate differently shaped distributions, can handle unequal interval lengths, and allow stretches of 0 counts. Results: The methods show similar performance when the grouping scheme is relatively narrow, i.e. 5-year age classes. With coarser age intervals, i.e. in the presence of open-ended age groups, the penalized composite link model performs best. Conclusion: We give an overview of, and test, different methods to estimate detailed distributions from grouped count data. Health researchers can benefit from these versatile methods, which are ready for use in the statistical software R. We recommend using the penalized composite link model when data are grouped in wide age classes.
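
    The spline-interpolation route among the compared methods can be sketched in a few lines: interpolate the cumulative counts monotonically at the class boundaries, then difference at single-year ages. (This illustrates the interpolation approach, not the recommended penalized composite link model.)

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    def ungroup(edges, counts):
        """Split counts on coarse classes [edges[i], edges[i+1]) into single-year
        estimates via monotone (PCHIP) interpolation of the cumulative counts."""
        cum = np.concatenate([[0.0], np.cumsum(counts)])
        F = PchipInterpolator(edges, cum)          # monotone by construction
        ages = np.arange(edges[0], edges[-1] + 1)
        return np.diff(F(ages))                    # one estimate per single year

    edges = np.array([0, 5, 10, 15, 20, 25])       # 5-year classes
    counts = np.array([120.0, 80.0, 60.0, 90.0, 150.0])
    print(ungroup(edges, counts).round(1))         # sums back to the class counts
    ```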

  17. Estimating abundance and density of Amur tigers along the Sino-Russian border.

    Science.gov (United States)

    Xiao, Wenhong; Feng, Limin; Mou, Pu; Miquelle, Dale G; Hebblewhite, Mark; Goldberg, Joshua F; Robinson, Hugh S; Zhao, Xiaodan; Zhou, Bo; Wang, Tianming; Ge, Jianping

    2016-07-01

    As an apex predator, the Amur tiger (Panthera tigris altaica) could play a pivotal role in maintaining the integrity of forest ecosystems in Northeast Asia. Due to habitat loss and harvest over the past century, tigers rapidly declined in China and are now restricted to the Russian Far East and bordering habitat in nearby China. To facilitate restoration of the tiger in its historical range, reliable estimates of population size are essential to assess the effectiveness of conservation interventions. Here we used camera trap data collected in Hunchun National Nature Reserve from April to June 2013 and 2014 to estimate tiger density and abundance using both maximum likelihood and Bayesian spatially explicit capture-recapture (SECR) methods. A minimum of 8 individuals were detected in both sample periods, and the documentation of marking behavior and reproduction suggests the presence of a resident population. Using Bayesian SECR modeling within the 11,400 km2 state space, density estimates were 0.33 and 0.40 individuals/100 km2 in 2013 and 2014, respectively, corresponding to an estimated abundance of 38 and 45 animals for this transboundary Sino-Russian population. In a maximum likelihood framework, we estimated densities of 0.30 and 0.24 individuals/100 km2, corresponding to abundances of 34 and 27, in 2013 and 2014, respectively. These density estimates are comparable to other published estimates for resident Amur tiger populations in the Russian Far East. This study reveals promising signs of tiger recovery in Northeast China, and demonstrates the importance of connectivity between the Russian and Chinese populations for recovering tigers in Northeast China.

  18. Bandwidth selection in kernel density estimation: oracle inequalities and adaptive minimax optimality

    CERN Document Server

    Goldenshluger, Alexander

    2010-01-01

    We address the problem of density estimation with L_p-loss by selection of kernel estimators. We develop a selection procedure and derive corresponding L_p-risk oracle inequalities. It is shown that the proposed selection rule leads to a minimax estimator that is adaptive over a scale of anisotropic Nikol'ski classes. The main technical tools used in our derivations are uniform bounds on the L_p-norms of empirical processes developed recently in Goldenshluger and Lepski (2010).
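
    Bandwidth selection of this general flavour can be illustrated with the classical least-squares cross-validation rule for a one-dimensional Gaussian KDE, a far simpler selector than the procedure in the paper: minimize LSCV(h) = ∫f̂² − (2/n)Σᵢ f̂₋ᵢ(xᵢ), where both terms have closed forms for Gaussian kernels.

    ```python
    import numpy as np

    def lscv_score(x, h):
        """LSCV(h) for a 1-d Gaussian KDE; int f_hat^2 uses the identity that
        two N(0, h^2) kernels convolve to an N(0, 2 h^2) kernel."""
        n = len(x)
        d = x[:, None] - x[None, :]
        phi = lambda u, s: np.exp(-0.5 * (u / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
        int_f2 = phi(d, np.sqrt(2.0) * h).sum() / n ** 2
        loo_sum = (phi(d, h).sum() - n * phi(0.0, h)) / (n - 1)  # leave-one-out
        return int_f2 - 2.0 * loo_sum / n

    rng = np.random.default_rng(5)
    x = rng.normal(size=400)
    hs = np.linspace(0.05, 1.0, 40)
    h_star = hs[np.argmin([lscv_score(x, h) for h in hs])]
    print(h_star)   # near the optimal O(n^(-1/5)) scale for Gaussian data
    ```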

  19. Multi-objective mixture-based iterated density estimation evolutionary algorithms

    NARCIS (Netherlands)

    Thierens, D.; Bosman, P.A.N.

    2001-01-01

    We propose an algorithm for multi-objective optimization using a mixture-based iterated density estimation evolutionary algorithm (MIDEA). The MIDEA algorithm is a probabilistic model-building evolutionary algorithm that constructs at each generation a mixture of factorized probability distributions...

  20. Estimation of electrical conductivity distribution within the human head from magnetic flux density measurement.

    Science.gov (United States)

    Gao, Nuo; Zhu, S A; He, Bin

    2005-06-01

    We have developed a new algorithm for magnetic resonance electrical impedance tomography (MREIT), which uses only one component of the magnetic flux density to reconstruct the electrical conductivity distribution within the body. The radial basis function (RBF) network and simplex method are used in the present approach to estimate the conductivity distribution by minimizing the errors between the 'measured' and model-predicted magnetic flux densities. Computer simulations were conducted in a realistic-geometry head model to test the feasibility of the proposed approach. Single-variable and three-variable simulations were performed to estimate the brain-skull conductivity ratio and the conductivity values of the brain, skull and scalp layers. When SNR = 15 for magnetic flux density measurements with the target skull-to-brain conductivity ratio being 1/15, the relative error (RE) between the target and estimated conductivity was 0.0737 +/- 0.0746 in the single-variable simulations. In the three-variable simulations, the RE was 0.1676 +/- 0.0317. Effects of electrode position uncertainty were also assessed by computer simulations. The present promising results suggest the feasibility of estimating important conductivity values within the head from noninvasive magnetic flux density measurements.
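
    The estimation loop pairs a forward model with a derivative-free simplex (Nelder-Mead) search over the conductivities. The sketch below keeps that structure but substitutes a toy linear-in-log-conductivity map for the real electromagnetic forward model of the head, and omits the RBF-network surrogate; all numbers are placeholders.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(11)
    A = rng.normal(size=(500, 3))              # toy "sensitivity" of Bz samples

    def forward_bz(sigma):
        """Toy stand-in mapping (brain, skull, scalp) conductivities to Bz."""
        return A @ np.log(sigma)

    target = np.array([0.33, 0.022, 0.33])     # S/m, illustrative values
    bz_meas = forward_bz(target) + rng.normal(0.0, 0.01, 500)   # noisy "data"

    res = minimize(lambda s: np.sum((forward_bz(np.abs(s)) - bz_meas) ** 2),
                   x0=np.array([0.2, 0.05, 0.2]), method="Nelder-Mead")
    print(np.abs(res.x))                       # approaches the target values
    ```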

  1. ASYMPTOTIC NORMALITY OF KERNEL ESTIMATES OF A DENSITY FUNCTION UNDER ASSOCIATION DEPENDENCE

    Institute of Scientific and Technical Information of China (English)

    Zhengyan LIN

    2003-01-01

    Let {X_n, n ≥ 1} be a strictly stationary sequence of random variables which are either associated or negatively associated, and let f(·) be their common density. In this paper, the author shows a central limit theorem for a kernel estimate of f(·) under certain regularity conditions.

  2. Estimation of bone mineral density by digital X-ray radiogrammetry: theoretical background and clinical testing

    DEFF Research Database (Denmark)

    Rosholm, A; Hyldstrup, L; Backsgaard, L

    2002-01-01

    A new automated radiogrammetric method to estimate bone mineral density (BMD) from a single radiograph of the hand and forearm is described. Five regions of interest in radius, ulna and the three middle metacarpal bones are identified and approximately 1800 geometrical measurements from these bon...

  3. Topological Pressure and Coding Sequence Density Estimation in the Human Genome

    CERN Document Server

    Koslicki, David

    2011-01-01

    Inspired by concepts from ergodic theory, we give new insight into coding sequence (CDS) density estimation for the human genome. Our approach is based on the introduction and study of topological pressure: a numerical quantity assigned to any finite sequence based on an appropriate notion of 'weighted information content'. For human DNA sequences, each codon is assigned a suitable weight, and using a window size of approximately 60,000 bp, we obtain a very strong positive correlation between CDS density and topological pressure. The weights are selected by an optimization procedure, and can be interpreted as quantitative data on the relative importance of different codons for the density estimation of coding sequences. This gives new insight into codon usage bias, which is an important subject where long-standing questions remain open. Inspired again by ergodic theory, we use the weightings on the codons to define a probability measure on finite sequences. We demonstrate that this measure is effective in disti...

  4. Distributed Density Estimation Based on a Mixture of Factor Analyzers in a Sensor Network

    Directory of Open Access Journals (Sweden)

    Xin Wei

    2015-08-01

    Distributed density estimation in sensor networks has received much attention due to its broad applicability. When encountering high-dimensional observations, a mixture of factor analyzers (MFA) is taken to replace a mixture of Gaussians for describing the distributions of observations. In this paper, we study distributed density estimation based on a mixture of factor analyzers. Existing estimation algorithms for the MFA address the centralized case and are not suitable for distributed processing in sensor networks. We present distributed density estimation algorithms for the MFA and its extension, the mixture of Student's t-factor analyzers (MtFA). We first define an objective function as the linear combination of local log-likelihoods. Then, we give the derivation of the distributed estimation algorithms for the MFA and MtFA in detail. In these algorithms, the local sufficient statistics (LSS) are calculated first and diffused. Then, each node performs a linear combination of the LSS received from nodes in its neighborhood to obtain the combined sufficient statistics (CSS). Parameters of the MFA and the MtFA can be obtained by using the CSS. Finally, we evaluate the performance of these algorithms by numerical simulations and an application example. Experimental results validate the promising performance of the proposed algorithms.
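
    The combine step can be illustrated with a single Gaussian standing in for the factor-analyzer mixture: each node forms local sufficient statistics and linearly combines those received from its neighborhood. A simplified sketch with an illustrative graph and uniform weights:

        import numpy as np

        def local_stats(x):       # LSS for a Gaussian: (count, sum, scatter)
            return len(x), x.sum(axis=0), np.einsum('ni,nj->ij', x, x)

        rng = np.random.default_rng(2)
        nodes = [rng.normal(1.0, 2.0, size=(50, 2)) for _ in range(4)]
        neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2, 3], 3: [2, 3]}

        for k, nbrs in neighbors.items():
            stats = [local_stats(nodes[j]) for j in nbrs]
            n  = sum(s[0] for s in stats)              # CSS: combined count,
            s1 = sum(s[1] for s in stats)              # combined sum,
            s2 = sum(s[2] for s in stats)              # combined scatter
            mean = s1 / n
            cov = s2 / n - np.outer(mean, mean)
            print(f"node {k}: mean = {mean.round(2)}")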

  5. Importance of tree basic density in biomass estimation and associated uncertainties

    DEFF Research Database (Denmark)

    Njana, Marco Andrew; Meilby, Henrik; Eid, Tron

    2016-01-01

    Key message Aboveground and belowground tree basic densities varied between and within the three mangrove species. If appropriately determined and applied, basic density may be useful in estimation of tree biomass. Predictive accuracy of the common (i.e. multi-species) models including aboveground...... of sustainable forest management, conservation and enhancement of carbon stocks (REDD+) initiatives offer an opportunity for sustainable management of forests including mangroves. In carbon accounting for REDD+, it is required that carbon estimates prepared for monitoring reporting and verification schemes...... and examine uncertainties in estimation of tree biomass using indirect methods. Methods This study focused on three dominant mangrove species (Avicennia marina (Forssk.) Vierh, Sonneratia alba J. Smith and Rhizophora mucronata Lam.) in Tanzania. A total of 120 trees were destructively sampled for aboveground...

  6. Estimation of dislocation density from precession electron diffraction data using the Nye tensor.

    Science.gov (United States)

    Leff, A C; Weinberger, C R; Taheri, M L

    2015-06-01

    The Nye tensor offers a means to estimate the geometrically necessary dislocation density of a crystalline sample based on measurements of the orientation changes within individual crystal grains. In this paper, the Nye tensor theory is applied to precession electron diffraction automated crystallographic orientation mapping (PED-ACOM) data acquired using a transmission electron microscope (TEM). The resulting dislocation density values are mapped in order to visualize the dislocation structures present in a quantitative manner. These density maps are compared with other related methods of approximating local strain dependencies in dislocation-based microstructural transitions from orientation data. The effect of acquisition parameters on density measurements is examined. By decreasing the step size and spot size during data acquisition, an increasing fraction of the dislocation content becomes accessible. Finally, the method described herein is applied to the measurement of dislocation emission during in situ annealing of Cu in TEM in order to demonstrate the utility of the technique for characterizing microstructural dynamics.
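
    In heavily simplified form, the computation reduces to differentiating an orientation map: lattice curvatures feed the Nye tensor, and dividing by the Burgers vector gives a lower-bound GND density. A sketch for a 2-D map of small in-plane rotations (only two curvature components are kept, and all numbers are illustrative; the full analysis uses every measurable gradient of the rotation field):

        import numpy as np

        # Simplified GND density from an orientation map theta(x, y) of small
        # in-plane lattice rotations; curvatures approximate two Nye entries.
        b = 0.255e-9      # Burgers vector magnitude for Cu, meters (assumed)
        step = 20e-9      # acquisition step size, 20 nm (illustrative)

        rng = np.random.default_rng(3)
        theta = np.deg2rad(rng.normal(0.0, 0.1, size=(64, 64)))  # synthetic map
        d_dy, d_dx = np.gradient(theta, step)          # curvatures, rad/m
        rho_gnd = (np.abs(d_dx) + np.abs(d_dy)) / b    # dislocations per m^2
        print(f"mean GND density: {rho_gnd.mean():.2e} m^-2")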

  7. Mammographic density and estimation of breast cancer risk in intermediate risk population.

    Science.gov (United States)

    Tesic, Vanja; Kolaric, Branko; Znaor, Ariana; Kuna, Sanja Kusacic; Brkljacic, Boris

    2013-01-01

    It is not clear to what extent mammographic density represents a risk factor for breast cancer among women at moderate risk of the disease. We conducted a population-based study to estimate the independent effect of breast density on breast cancer risk and to evaluate the potential of breast density as a marker of risk in an intermediate-risk population. From November 2006 to April 2009, data that included American College of Radiology Breast Imaging Reporting and Data System (BI-RADS) breast density categories and risk information were collected on 52,752 women aged 50-69 years without previously diagnosed breast cancer who underwent screening mammography examination. A total of 257 screen-detected breast cancers were identified. Logistic regression was used to assess the effect of breast density on breast carcinoma risk and to control for other risk factors. The risk increased with density, and the odds ratio for breast cancer among women with dense breasts (heterogeneously and extremely dense breasts) was 1.9 (95% confidence interval, 1.3-2.8) compared with women with almost entirely fat breasts, after adjustment for age, body mass index, age at menarche, age at menopause, age at first childbirth, number of live births, use of oral contraceptives, family history of breast cancer, prior breast procedures, and hormone replacement therapy use, which were all significantly related to breast density. Breast density decreased with number of live births. Our finding that mammographic density is an independent risk factor for breast cancer indicates the importance of breast density measurements for breast cancer risk assessment also in moderate-risk populations. © 2012 Wiley Periodicals, Inc.

  8. A novel nonparametric confidence interval for differences of proportions for correlated binary data.

    Science.gov (United States)

    Duan, Chongyang; Cao, Yingshu; Zhou, Lizhi; Tan, Ming T; Chen, Pingyan

    2016-11-16

    Various confidence interval estimators have been developed for differences in proportions resulting from correlated binary data. However, the width of the most commonly recommended Tango's score confidence interval tends to be wide, and the computational burden of the exact methods recommended for small-sample data is intensive. A recently proposed rank-based nonparametric method, which treats proportions as special areas under the receiver operating characteristic curve, provides a new way to construct confidence intervals for proportion differences on paired data, but its complex computation limits its application in practice. In this article, we develop a new nonparametric method utilizing the U-statistics approach for comparing two or more correlated areas under receiver operating characteristic curves. The new confidence interval has a simple analytic form with a new estimate of the degrees of freedom of n - 1. It demonstrates good coverage properties and has shorter widths than Tango's interval. This new confidence interval, with the new estimate of degrees of freedom, also leads to coverage probabilities that improve on those of the rank-based nonparametric confidence interval. Compared with the approximate exact unconditional method, the new nonparametric confidence interval demonstrates good coverage properties even in small samples, and yet it is very easy to implement computationally. The procedure is evaluated using simulation studies and illustrated with three real examples. The simplified nonparametric confidence interval is an appealing choice in practice for its ease of use and good performance. © The Author(s) 2016.

  9. Nonparametric instrumental regression with non-convex constraints

    Science.gov (United States)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.

  10. Combined parametric-nonparametric identification of block-oriented systems

    CERN Document Server

    Mzyk, Grzegorz

    2014-01-01

    This book considers the problem of block-oriented nonlinear dynamic system identification in the presence of random disturbances. This class of systems includes various interconnections of linear dynamic blocks and static nonlinear elements, e.g., the Hammerstein system, the Wiener system, the Wiener-Hammerstein ("sandwich") system and additive NARMAX systems with feedback. Interconnecting signals are not accessible for measurement. The combined parametric-nonparametric algorithms proposed in the book can be selected depending on the prior knowledge of the system and signals. Most of them are based on the decomposition of the complex system identification task into simpler local sub-problems by using non-parametric (kernel or orthogonal) regression estimation. In the parametric stage, the generalized least squares or the instrumental variables technique is commonly applied to cope with correlated excitations. Limit properties of the algorithms have been shown analytically and illustrated in simple experiments.

  11. Estimation of energy density of Li-S batteries with liquid and solid electrolytes

    Science.gov (United States)

    Li, Chunmei; Zhang, Heng; Otaegui, Laida; Singh, Gurpreet; Armand, Michel; Rodriguez-Martinez, Lide M.

    2016-09-01

    With the exponential growth of technology in mobile devices and the rapid expansion of electric vehicles into the market, it appears that the energy density of the state-of-the-art Li-ion batteries (LIBs) cannot satisfy practical requirements. Sulfur has been one of the best cathode material choices due to its high charge storage (1675 mAh g⁻¹), natural abundance and easy accessibility. In this paper, calculations are performed for different cell design parameters, such as the active material loading, the amount/thickness of electrolyte and the sulfur utilization, to predict the energy density of Li-S cells based on liquid, polymeric and ceramic electrolytes. The results demonstrate that, with current technology, the Li-S battery is most likely to be competitive with LIBs in gravimetric energy density, but not in volumetric energy density. Furthermore, cells with polymer and thin ceramic electrolytes show promising potential in terms of high gravimetric energy density, especially cells with the polymer electrolyte. This estimation study of Li-S energy density can serve as guidance for controlling the key design parameters in order to obtain the desired energy density at cell level.
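
    The core of such an estimate is a mass balance: cell-level specific energy is the sulfur capacity actually delivered times the average voltage, divided by the total mass carried per gram of sulfur. A back-of-envelope sketch with purely illustrative component masses (not the paper's design parameters):

        # Cell-level gravimetric energy density, Wh/kg (illustrative numbers)
        q_sulfur = 1675        # mAh per g of sulfur (theoretical)
        utilization = 0.7      # fraction of capacity actually delivered (assumed)
        voltage = 2.1          # average discharge voltage, V (assumed)

        mass_per_g_S = {       # grams of component per gram of sulfur (assumed)
            "sulfur": 1.0, "carbon + binder": 0.5, "electrolyte": 1.3,
            "lithium": 0.3, "collectors + separator + packaging": 0.9,
        }
        wh_per_g_S = q_sulfur * utilization * voltage / 1000
        specific_energy = wh_per_g_S / sum(mass_per_g_S.values()) * 1000
        print(f"{specific_energy:.0f} Wh/kg")    # ~615 Wh/kg for these inputs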

  12. A method to estimate the neutral atmospheric density near the ionospheric main peak of Mars

    Science.gov (United States)

    Zou, Hong; Ye, Yu Guang; Wang, Jin Song; Nielsen, Erling; Cui, Jun; Wang, Xiao Dong

    2016-04-01

    A method to estimate the neutral atmospheric density near the ionospheric main peak of Mars is introduced in this study. The neutral densities at 130 km can be derived from the ionospheric and atmospheric measurements of the Radio Science experiment on board Mars Global Surveyor (MGS). The derived neutral densities cover a large longitude range at northern high latitudes from summer to late autumn during 3 Martian years, filling a gap in previous observations of the upper atmosphere of Mars. The simulations of the Laboratoire de Météorologie Dynamique Mars global circulation model can be corrected with a simple linear equation to fit the neutral densities derived from the first MGS/RS (Radio Science) data set (EDS1). The corrected simulations, with the same correction parameters as for EDS1, match the neutral densities derived from two other MGS/RS data sets (EDS2 and EDS3) very well. The neutral density derived from EDS3 shows a dust storm effect, which is in accord with the measurement by the Mars Express (MEX) Spectroscopy for Investigation of Characteristics of the Atmosphere of Mars instrument. The neutral densities derived from the MGS/RS measurements can be used to validate Martian atmospheric models. The method presented in this study can be applied to other radio occultation measurements, such as the results of the Radio Science experiment on board MEX.

  13. A novel deep learning-based approach to high accuracy breast density estimation in digital mammography

    Science.gov (United States)

    Ahn, Chul Kyun; Heo, Changyong; Jin, Heongmin; Kim, Jong Hyo

    2017-03-01

    Mammographic breast density is a well-established marker for breast cancer risk. However, accurate measurement of dense tissue is a difficult task due to faint contrast and significant variations in background fatty tissue. This study presents a novel method for automated mammographic density estimation based on a Convolutional Neural Network (CNN). A total of 397 full-field digital mammograms were selected from Seoul National University Hospital. Among them, 297 mammograms were randomly selected as a training set and the remaining 100 mammograms were used as a test set. We designed a CNN architecture suitable for learning the imaging characteristics from a multitude of sub-images and classifying them into dense and fatty tissues. To train the CNN, not only local statistics but also global statistics extracted from an image set were used. The image set was composed of the original mammogram and an eigen-image able to capture the X-ray characteristics, notwithstanding that CNNs are well known to extract features effectively from the original image. The 100 test images, which were not used in training the CNN, were used to validate the performance. The correlation coefficient between the breast density estimates by the CNN and those by the expert's manual measurement was 0.96. Our study demonstrates the feasibility of incorporating deep learning technology into radiology practice, especially for breast density estimation. The proposed method has the potential to be used as an automated and quantitative assessment tool for mammographic breast density in routine practice.

  14. Joint estimation of crown of thorns (Acanthaster planci) densities on the Great Barrier Reef

    Directory of Open Access Journals (Sweden)

    M. Aaron MacNeil

    2016-08-01

    Crown-of-thorns starfish (CoTS; Acanthaster spp.) are an outbreaking pest among many Indo-Pacific coral reefs that cause substantial ecological and economic damage. Despite ongoing CoTS research, there remain critical gaps in observing CoTS populations and accurately estimating their numbers, greatly limiting understanding of the causes and sources of CoTS outbreaks. Here we address two of these gaps by (1) estimating the detectability of adult CoTS on typical underwater visual count (UVC) surveys using covariates and (2) inter-calibrating multiple data sources to estimate CoTS densities within the Cairns sector of the Great Barrier Reef (GBR). We find that, on average, CoTS detectability is high at 0.82 [0.77, 0.87] (median highest posterior density (HPD) and [95% uncertainty intervals]), with CoTS disc width having the greatest influence on detection. Integrating this information with coincident surveys from alternative sampling programs, we estimate CoTS densities in the Cairns sector of the GBR averaged 44 [41, 48] adults per hectare in 2014.

  15. Density estimation in a wolverine population using spatial capture-recapture models

    Science.gov (United States)

    Royle, J. Andrew; Magoun, Audrey J.; Gardner, Beth; Valkenbury, Patrick; Lowell, Richard E.; McKelvey, Kevin

    2011-01-01

    Classical closed-population capture-recapture models do not accommodate the spatial information inherent in encounter history data obtained from camera-trapping studies. As a result, individual heterogeneity in encounter probability is induced, and it is not possible to estimate density objectively because trap arrays do not have a well-defined sample area. We applied newly developed capture-recapture models that accommodate the spatial attribute inherent in capture-recapture data to a population of wolverines (Gulo gulo) in Southeast Alaska in 2008. We used camera-trapping data collected from 37 cameras in a 2,140-km² area of forested and open habitats largely enclosed by ocean and glacial icefields. We detected 21 unique individuals 115 times. Wolverines exhibited a strong positive trap response, with an increased tendency to revisit previously visited traps. Under the trap-response model, we estimated wolverine density at 9.7 individuals/1,000 km² (95% Bayesian CI: 5.9-15.0). Our model provides a formal statistical framework for estimating density from wolverine camera-trapping studies that accounts for a behavioral response due to baited traps. Further, our model-based estimator does not have strict requirements about the spatial configuration of traps or length of trapping sessions, providing considerable operational flexibility in the development of field studies.

  16. Joint estimation of crown of thorns (Acanthaster planci) densities on the Great Barrier Reef.

    Science.gov (United States)

    MacNeil, M Aaron; Mellin, Camille; Pratchett, Morgan S; Hoey, Jessica; Anthony, Kenneth R N; Cheal, Alistair J; Miller, Ian; Sweatman, Hugh; Cowan, Zara L; Taylor, Sascha; Moon, Steven; Fonnesbeck, Chris J

    2016-01-01

    Crown-of-thorns starfish (CoTS; Acanthaster spp.) are an outbreaking pest among many Indo-Pacific coral reefs that cause substantial ecological and economic damage. Despite ongoing CoTS research, there remain critical gaps in observing CoTS populations and accurately estimating their numbers, greatly limiting understanding of the causes and sources of CoTS outbreaks. Here we address two of these gaps by (1) estimating the detectability of adult CoTS on typical underwater visual count (UVC) surveys using covariates and (2) inter-calibrating multiple data sources to estimate CoTS densities within the Cairns sector of the Great Barrier Reef (GBR). We find that, on average, CoTS detectability is high at 0.82 [0.77, 0.87] (median highest posterior density (HPD) and [95% uncertainty intervals]), with CoTS disc width having the greatest influence on detection. Integrating this information with coincident surveys from alternative sampling programs, we estimate CoTS densities in the Cairns sector of the GBR averaged 44 [41, 48] adults per hectare in 2014.

  17. Non-parametric system identification from non-linear stochastic response

    DEFF Research Database (Denmark)

    Rüdinger, Finn; Krenk, Steen

    2001-01-01

    An estimation method is proposed for identification of non-linear stiffness and damping of single-degree-of-freedom systems under stationary white noise excitation. Non-parametric estimates of the stiffness and damping along with an estimate of the white noise intensity are obtained by suitable p...

  18. Efficient Estimation of Dynamic Density Functions with Applications in Streaming Data

    KAUST Repository

    Qahtan, Abdulhakim

    2016-05-11

    Recent advances in computing technology allow for collecting vast amounts of data that arrive continuously in the form of streams. Mining data streams is challenged by the speed and volume of the arriving data. Furthermore, the underlying distribution of the data changes over time in unpredicted ways. To reduce the computational cost, data streams are often studied in forms of condensed representation, e.g., the Probability Density Function (PDF). This thesis aims at developing an online density estimator that builds a model called KDE-Track for characterizing the dynamic density of data streams. KDE-Track estimates the PDF of the stream at a set of resampling points and uses interpolation to estimate the density at any given point. To reduce the interpolation error and computational complexity, we introduce adaptive resampling, where more/fewer resampling points are used in high/low curved regions of the PDF. The PDF values at the resampling points are updated online to provide an up-to-date model of the data stream. Compared with other existing online density estimators, KDE-Track is often more accurate (as reflected by smaller error values) and more computationally efficient (as reflected by shorter running time). The anytime-available PDF estimated by KDE-Track can be applied to visualizing the dynamic density of data streams, outlier detection and change detection in data streams. In this thesis work, the first application is to visualize the taxi traffic volume in New York City. Utilizing KDE-Track allows for visualizing and monitoring the traffic flow in real time without extra overhead and provides insightful analysis of the pick-up demand that can be utilized by service providers to improve service availability. The second application is to detect outliers in data streams from sensor networks based on the estimated PDF. The method detects outliers accurately and outperforms baseline methods designed for detecting and cleaning outliers in sensor data.
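
    The core loop — hold PDF values at resampling points, update them online, interpolate for queries — fits in a few lines (a minimal sketch without KDE-Track's adaptive resampling or bandwidth updates):

        import numpy as np

        grid = np.linspace(-5, 5, 101)        # fixed resampling points
        pdf = np.zeros_like(grid)
        h, lam = 0.3, 0.995                   # bandwidth, forgetting factor

        def update(x_new):
            global pdf
            kernel = np.exp(-(grid - x_new)**2 / (2*h**2)) / (h*np.sqrt(2*np.pi))
            pdf = lam*pdf + (1 - lam)*kernel  # exponentially weighted average

        def query(x):
            return np.interp(x, grid, pdf)    # interpolate between resampling points

        rng = np.random.default_rng(4)
        for t in range(5000):                 # stream whose mode drifts at t = 2500
            update(rng.normal(loc=0.0 if t < 2500 else 2.0))
        print(f"density near the new mode: {query(2.0):.3f}")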

  19. Recursive Density Estimation of NA Samples

    Institute of Scientific and Technical Information of China (English)

    张冬霞; 梁汉营

    2008-01-01

    Let {Xn, n ≥ 1} be a strictly stationary sequence of negatively associated random variables with marginal probability density function f(x). In this paper, we discuss the pointwise asymptotic normality of the recursive kernel density estimator of f(x).
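
    The recursive kernel estimator referred to here, in its standard (Wolverton-Wagner) form, updates with each new observation without revisiting the whole sample:

        f_n(x) = \frac{1}{n} \sum_{i=1}^{n} \frac{1}{h_i} K\!\left(\frac{x - X_i}{h_i}\right)
               = \frac{n-1}{n}\, f_{n-1}(x) + \frac{1}{n h_n} K\!\left(\frac{x - X_n}{h_n}\right).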

  20. An Independent Component Analysis Algorithm through Solving Gradient Equation Combined with Kernel Density Estimation

    Institute of Scientific and Technical Information of China (English)

    XUE Yun-feng; WANG Yu-jia; YANG Jie

    2009-01-01

    A new algorithm for linear instantaneous independent component analysis is proposed, based on maximizing the log-likelihood contrast function, which can be transformed into a gradient equation. An iterative method is introduced to solve this equation efficiently. The unknown probability density functions, as well as their first and second derivatives, in the gradient equation are estimated by the kernel density method. Computer simulations on artificially generated signals and gray-scale natural scene images confirm the efficiency and accuracy of the proposed algorithm.

  1. ESTIMATION OF THE NUMBER OF CORRELATED SOURCES WITH COMMON FREQUENCIES BASED ON POWER SPECTRAL DENSITY

    Institute of Scientific and Technical Information of China (English)

    LI Ning; SHI Tielin

    2007-01-01

    Blind source separation and estimation of the number of sources usually demand that the number of sensors be greater than or equal to the number of sources, a condition that is very difficult to satisfy for complex systems. A new estimation method based on power spectral density (PSD) is presented. When the relation between the number of sensors and that of the sources is unknown, the PSD matrix is first obtained from the ratios of the PSDs of the observed signals, and then a bound on the number of correlated sources with common frequencies can be estimated by comparing the column vectors of the PSD matrix. The effectiveness of the proposed method is verified by theoretical analysis and experiments, and the influence of noise on the estimation of the number of sources is simulated.
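
    A small sketch of the PSD-ratio idea (mixing, frequencies and the band are illustrative; the paper's column-comparison rule is more formal):

        import numpy as np
        from scipy.signal import welch

        # Build the matrix of PSD ratios between channels; columns that stay
        # (nearly) constant across the common frequency bins point to sensors
        # dominated by the same sources.
        fs = 1000.0
        t = np.arange(0, 10, 1/fs)
        rng = np.random.default_rng(5)
        s1, s2 = np.sin(2*np.pi*50*t), np.sign(np.sin(2*np.pi*120*t))
        mix = np.array([[1.0, 0.3], [0.6, 0.8], [0.2, 1.0]])   # 3 sensors, 2 sources
        x = mix @ np.vstack([s1, s2]) + 0.05*rng.normal(size=(3, t.size))

        f, pxx = welch(x, fs=fs, nperseg=1024)                 # PSD per channel
        band = (f > 40) & (f < 130)                            # common frequencies
        ratio = pxx[:, band] / pxx[0, band]                    # ratios to channel 0
        print(np.round(ratio[:, ::20], 2))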

  2. METAPHOR: A machine learning based method for the probability density estimation of photometric redshifts

    CERN Document Server

    Cavuoti, Stefano; Brescia, Massimo; Vellucci, Civita; Tortora, Crescenzo; Longo, Giuseppe

    2016-01-01

    A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z's). A plethora of methods have been developed, based either on template-model fitting or on empirical explorations of the photometric parameter space. Machine learning based techniques are not explicitly dependent on physical priors and are able to produce accurate photo-z estimates within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z Probability Density Function (PDF), due to the fact that the analytical relation mapping the photometric parameters onto the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use...

  3. Technical factors influencing cone packing density estimates in adaptive optics flood illuminated retinal images.

    Science.gov (United States)

    Lombardo, Marco; Serrao, Sebastiano; Lombardo, Giuseppe

    2014-01-01

    To investigate the influence of various technical factors on the variation of cone packing density estimates in adaptive optics flood illuminated retinal images. Adaptive optics images of the photoreceptor mosaic were obtained in fifteen healthy subjects. The cone density and Voronoi diagrams were assessed in sampling windows of 320×320 µm, 160×160 µm and 64×64 µm at 1.5 degree temporal and superior eccentricity from the preferred locus of fixation (PRL). The technical factors that have been analyzed included the sampling window size, the corrected retinal magnification factor (RMFcorr), the conversion from radial to linear distance from the PRL, the displacement between the PRL and foveal center and the manual checking of cone identification algorithm. Bland-Altman analysis was used to assess the agreement between cone density estimated within the different sampling window conditions. The cone density declined with decreasing sampling area and data between areas of different size showed low agreement. A high agreement was found between sampling areas of the same size when comparing density calculated with or without using individual RMFcorr. The agreement between cone density measured at radial and linear distances from the PRL and between data referred to the PRL or the foveal center was moderate. The percentage of Voronoi tiles with hexagonal packing arrangement was comparable between sampling areas of different size. The boundary effect, presence of any retinal vessels, and the manual selection of cones missed by the automated identification algorithm were identified as the factors influencing variation of cone packing arrangements in Voronoi diagrams. The sampling window size is the main technical factor that influences variation of cone density. Clear identification of each cone in the image and the use of a large buffer zone are necessary to minimize factors influencing variation of Voronoi diagrams of the cone mosaic.

  4. Technical factors influencing cone packing density estimates in adaptive optics flood illuminated retinal images.

    Directory of Open Access Journals (Sweden)

    Marco Lombardo

    PURPOSE: To investigate the influence of various technical factors on the variation of cone packing density estimates in adaptive optics flood illuminated retinal images. METHODS: Adaptive optics images of the photoreceptor mosaic were obtained in fifteen healthy subjects. The cone density and Voronoi diagrams were assessed in sampling windows of 320×320 µm, 160×160 µm and 64×64 µm at 1.5 degree temporal and superior eccentricity from the preferred locus of fixation (PRL). The technical factors that have been analyzed included the sampling window size, the corrected retinal magnification factor (RMFcorr), the conversion from radial to linear distance from the PRL, the displacement between the PRL and foveal center and the manual checking of the cone identification algorithm. Bland-Altman analysis was used to assess the agreement between cone density estimated within the different sampling window conditions. RESULTS: The cone density declined with decreasing sampling area and data between areas of different size showed low agreement. A high agreement was found between sampling areas of the same size when comparing density calculated with or without using individual RMFcorr. The agreement between cone density measured at radial and linear distances from the PRL and between data referred to the PRL or the foveal center was moderate. The percentage of Voronoi tiles with hexagonal packing arrangement was comparable between sampling areas of different size. The boundary effect, presence of any retinal vessels, and the manual selection of cones missed by the automated identification algorithm were identified as the factors influencing variation of cone packing arrangements in Voronoi diagrams. CONCLUSIONS: The sampling window size is the main technical factor that influences variation of cone density. Clear identification of each cone in the image and the use of a large buffer zone are necessary to minimize factors influencing variation of Voronoi...

  5. A Non-parametric Approach to Estimating Location Parameters under Simple Order

    Institute of Scientific and Technical Information of China (English)

    孙旭

    2005-01-01

    This paper deals with estimating parameters under simple order when samples come from location models. Based on the idea of the Hodges-Lehmann estimator (H-L estimator), a new approach to estimating the parameters is proposed, which differs from the classical L1 isotonic regression and L2 isotonic regression. An algorithm to compute the estimators is given. Monte Carlo simulations are used to compare the likelihood functions with respect to the L1 estimators and the weighted isotonic H-L estimators.
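
    For reference, the unrestricted Hodges-Lehmann location estimator on which the proposal builds is the median of all pairwise Walsh averages:

        import numpy as np

        def hodges_lehmann(x):
            """Median of (x_i + x_j)/2 over all pairs i <= j."""
            x = np.asarray(x)
            i, j = np.triu_indices(len(x))
            return np.median((x[i] + x[j]) / 2.0)

        rng = np.random.default_rng(6)
        sample = rng.standard_cauchy(101) + 3.0  # heavy tails: the mean is unreliable
        print(f"H-L: {hodges_lehmann(sample):.2f}  mean: {sample.mean():.2f}")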

  6. Combining Breeding Bird Survey and distance sampling to estimate density of migrant and breeding birds

    Science.gov (United States)

    Somershoe, S.G.; Twedt, D.J.; Reid, B.

    2006-01-01

    We combined Breeding Bird Survey point count protocol and distance sampling to survey spring migrant and breeding birds in Vicksburg National Military Park on 33 days between March and June of 2003 and 2004. For 26 of 106 detected species, we used program DISTANCE to estimate detection probabilities and densities from 660 3-min point counts in which detections were recorded within four distance annuli. For most species, estimates of detection probability, and thereby density estimates, were improved through incorporation of the proportion of forest cover at point count locations as a covariate. Our results suggest Breeding Bird Surveys would benefit from the use of distance sampling and a quantitative characterization of habitat at point count locations. During spring migration, we estimated that the most common migrant species accounted for a population of 5000-9000 birds in Vicksburg National Military Park (636 ha). Species with average populations of 300 individuals during migration were: Blue-gray Gnatcatcher (Polioptila caerulea), Cedar Waxwing (Bombycilla cedrorum), White-eyed Vireo (Vireo griseus), Indigo Bunting (Passerina cyanea), and Ruby-crowned Kinglet (Regulus calendula). Of 56 species that bred in Vicksburg National Military Park, we estimated that the most common 18 species accounted for 8150 individuals. The six most abundant breeding species, Blue-gray Gnatcatcher, White-eyed Vireo, Summer Tanager (Piranga rubra), Northern Cardinal (Cardinalis cardinalis), Carolina Wren (Thryothorus ludovicianus), and Brown-headed Cowbird (Molothrus ater), accounted for 5800 individuals.
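
    In simplified form, the distance-sampling step for point counts with a half-normal detection function g(r) = exp(−r²/2σ²) is almost closed-form: detection distances follow a Rayleigh-type density, σ² has the MLE below, and the effective detection area per point is 2πσ². A sketch on synthetic distances (program DISTANCE fits far richer models, with covariates and binned distances):

        import numpy as np

        rng = np.random.default_rng(7)
        k = 660                                  # number of 3-min point counts
        r = rng.rayleigh(scale=45.0, size=800)   # detection distances, m (synthetic)

        sigma2 = (r**2).sum() / (2 * len(r))     # MLE of sigma^2 under half-normal g
        nu = 2 * np.pi * sigma2                  # effective area per point, m^2
        density_ha = len(r) / (k * nu) * 10_000  # birds per hectare
        print(f"sigma = {np.sqrt(sigma2):.1f} m, density = {density_ha:.2f}/ha")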

  7. Nearest neighbor density ratio estimation for large-scale applications in astronomy

    Science.gov (United States)

    Kremer, J.; Gieseke, F.; Steenstrup Pedersen, K.; Igel, C.

    2015-09-01

    In astronomical applications of machine learning, the distribution of objects used for building a model is often different from the distribution of the objects the model is later applied to. This is known as sample selection bias, which is a major challenge for statistical inference as one can no longer assume that the labeled training data are representative. To address this issue, one can re-weight the labeled training patterns to match the distribution of unlabeled data that are available already in the training phase. There are many examples in practice where this strategy yielded good results, but estimating the weights reliably from a finite sample is challenging. We consider an efficient nearest neighbor density ratio estimator that can exploit large samples to increase the accuracy of the weight estimates. To solve the problem of choosing the right neighborhood size, we propose to use cross-validation on a model selection criterion that is unbiased under covariate shift. The resulting algorithm is our method of choice for density ratio estimation when the feature space dimensionality is small and sample sizes are large. The approach is simple and, because of the model selection, robust. We empirically find that it is on a par with established kernel-based methods on relatively small regression benchmark datasets. However, when applied to large-scale photometric redshift estimation, our approach outperforms the state-of-the-art.
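
    A minimal sketch of such a nearest-neighbor density-ratio weight: take the radius of the K-th nearest unlabeled neighbor of each training point, count labeled points inside the same ball, and form the ratio of the two local densities, so the unknown ball volume cancels (K would be tuned by the unbiased cross-validation the authors describe; data here are synthetic):

        import numpy as np
        from scipy.spatial.distance import cdist
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(8)
        labeled = rng.normal(0.0, 1.0, size=(1000, 2))    # biased training set
        unlabeled = rng.normal(0.5, 1.2, size=(5000, 2))  # target distribution
        K = 50

        nn_u = NearestNeighbors(n_neighbors=K).fit(unlabeled)
        radius = nn_u.kneighbors(labeled)[0][:, -1]       # K-th NN distance

        d_ll = cdist(labeled, labeled)                    # brute force is fine here
        counts = (d_ll <= radius[:, None]).sum(axis=1)    # includes the point itself

        weights = (K / len(unlabeled)) / (counts / len(labeled))
        weights /= weights.mean()                         # normalize for reweighting
        print(round(float(weights.min()), 2), round(float(weights.max()), 2))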

  8. Pedotransfer functions for Irish soils - estimation of bulk density (ρb) per horizon type

    Science.gov (United States)

    Reidy, B.; Simo, I.; Sills, P.; Creamer, R. E.

    2016-01-01

    Soil bulk density is a key property in defining soil characteristics. It describes the packing structure of the soil and is also essential for the measurement of soil carbon stocks and nutrient assessment. In many older surveys this property was neglected, and in many modern surveys it is omitted due to cost, both in laboratory and labour, and in cases where the core method cannot be applied. To overcome these oversights, pedotransfer functions are applied, using other known soil properties to estimate bulk density. Pedotransfer functions have been derived from large international data sets across many studies, each with its own inherent biases, many ignoring horizonation and depth variances. Initially, pedotransfer functions from the literature were used to predict bulk densities for different horizon types using local known bulk density data sets. The best-performing pedotransfer functions were then selected, recalibrated and validated against the known data. The predicted coefficient of determination was 0.5 or greater in 12 of the 17 horizon types studied. These new equations allowed gap-filling where bulk density data were missing in part or whole soil profiles. This in turn allowed the development of an indicative soil bulk density map for Ireland at 0-30 and 30-50 cm horizon depths. In general, the horizons with the largest known data sets had the best predictions using the recalibrated and validated pedotransfer functions.

  9. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb-Douglas ... not only in biased parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used to estimate production functions without the specification of a functional form. Therefore, they avoid possible misspecification errors due to the use of an unsuitable functional form. In this paper, we use parametric and non-parametric methods to identify the optimal size of Polish crop farms...

  10. A hybrid model of kernel density estimation and quantile regression for GEFCom2014 probabilistic load forecasting

    CERN Document Server

    Haben, Stephen

    2016-01-01

    We present a model for generating probabilistic forecasts by combining kernel density estimation (KDE) and quantile regression techniques, as part of the probabilistic load forecasting track of the Global Energy Forecasting Competition 2014. The KDE method is initially implemented with a time-decay parameter. We later improve this method by conditioning on the temperature or the period-of-the-week variables to provide more accurate forecasts. Secondly, we develop a simple but effective quantile regression forecast. The novel aspects of our methodology are two-fold. First, we introduce symmetry into the time-decay parameter of the kernel density estimation based forecast. Secondly, we combine three probabilistic forecasts with different weights for different periods of the month.

  11. Estimating risk aversion, Risk-Neutral and Real-World Densities using Brazilian Real currency options

    Directory of Open Access Journals (Sweden)

    José Fajardo

    2012-12-01

    This paper uses the approach of Liu et al. (2007) to estimate the option-implied Risk-Neutral Densities (RND), Real-World Densities (RWD), and relative risk aversion from the Brazilian Real/US Dollar exchange rate distribution. Our empirical application uses a sample of exchange-traded Brazilian Real currency options from 1999 to 2011. Our estimated value of the relative risk aversion is around 2.7, which is in line with other articles for the Brazilian economy. Our out-of-sample results showed that the RND has some ability to forecast the Brazilian Real exchange rate, but when we incorporate risk aversion, the out-of-sample performance improves substantially.

  12. Uncertainty quantification techniques for population density estimates derived from sparse open source data

    Science.gov (United States)

    Stewart, Robert; White, Devin; Urban, Marie; Morton, April; Webster, Clayton; Stoyanov, Miroslav; Bright, Eddie; Bhaduri, Budhendra L.

    2013-05-01

    The Population Density Tables (PDT) project at Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life based largely on information available in open source. Currently, activity-based density estimates are based on simple summary data statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach, knowledge about population density may be encoded through the process of expert elicitation. Due to the scale of the PDT effort which considers over 250 countries, spans 50 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require that the contributor have minimal statistical knowledge, require minimal input by a statistician or facilitator, consider human difficulties in expressing qualitative knowledge in a quantitative setting, and provide methods by which the contributor can appraise whether their understanding and associated uncertainty was well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution as the prior for the Beta distribution. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated method for encoding is developed that responds to these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.
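
    As a simpler stand-in for the paper's encoding algorithm, moment matching shows the flavor of turning two non-statistical answers into a Beta prior (this is not the bivariate-Gaussian construction described above):

        # Moment-matching a Beta prior from two non-statistical answers:
        # a "typical" proportion and a value the contributor is nearly sure
        # is not exceeded.
        def beta_from_answers(typical, almost_sure_max, z=2.0):
            mean = typical
            sd = max((almost_sure_max - typical) / z, 1e-6)  # spread from the bound
            var = min(sd**2, mean * (1 - mean) * 0.999)      # keep (a, b) feasible
            nu = mean * (1 - mean) / var - 1                 # Beta "sample size"
            return mean * nu, (1 - mean) * nu                # alpha, beta

        a, b = beta_from_answers(typical=0.15, almost_sure_max=0.40)
        print(f"Beta prior: alpha={a:.2f}, beta={b:.2f}")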

  13. Uncertainty Quantification Techniques for Population Density Estimates Derived from Sparse Open Source Data

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Robert N [ORNL; White, Devin A [ORNL; Urban, Marie L [ORNL; Morton, April M [ORNL; Webster, Clayton G [ORNL; Stoyanov, Miroslav K [ORNL; Bright, Eddie A [ORNL; Bhaduri, Budhendra L [ORNL

    2013-01-01

    The Population Density Tables (PDT) project at the Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life based largely on information available in open source. Currently, activity based density estimates are based on simple summary data statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach knowledge about population density may be encoded through the process of expert elicitation. Due to the scale of the PDT effort which considers over 250 countries, spans 40 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require that the contributor have minimal statistical knowledge, require minimal input by a statistician or facilitator, consider human difficulties in expressing qualitative knowledge in a quantitative setting, and provide methods by which the contributor can appraise whether their understanding and associated uncertainty was well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution as the prior for the Beta distribution. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated method for encoding is developed that responds to these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.

  14. On the rate of convergence of the maximum likelihood estimator of a k-monotone density

    Institute of Scientific and Technical Information of China (English)

    GAO FuChang; WELLNER Jon A

    2009-01-01

    Bounds for the bracketing entropy of the classes of bounded k-monotone functions on [0, A] are obtained under both the Hellinger distance and the Lp(Q) distance, where 1 ≤ p < ∞ and Q is a probability measure on [0, A]. The result is then applied to obtain the rate of convergence of the maximum likelihood estimator of a k-monotone density.

  15. On the rate of convergence of the maximum likelihood estimator of a K-monotone density

    Institute of Scientific and Technical Information of China (English)

    GAO FuChang; WELLNER Jon A

    2009-01-01

    Bounds for the bracketing entropy of the classes of bounded K-monotone functions on [0, A] are obtained under both the Hellinger distance and the Lp(Q) distance, where 1 ≤ p < ∞ and Q is a probability measure on [0, A]. The result is then applied to obtain the rate of convergence of the maximum likelihood estimator of a K-monotone density.

  16. Estimation of bone mineral density by digital X-ray radiogrammetry: theoretical background and clinical testing

    DEFF Research Database (Denmark)

    Rosholm, A; Hyldstrup, L; Backsgaard, L

    2002-01-01

    A new automated radiogrammetric method to estimate bone mineral density (BMD) from a single radiograph of the hand and forearm is described. Five regions of interest in radius, ulna and the three middle metacarpal bones are identified and approximately 1800 geometrical measurements from these bones ... X-ray absorptiometry (r = 0.86, p ...). Relative to this age-related loss, the reported short ... sites and a precision that potentially allows for relatively short observation intervals.

  17. Cheap DECAF: Density Estimation for Cetaceans from Acoustic Fixed Sensors Using Separate, Non-Linked Devices

    Science.gov (United States)

    2014-06-29

    Centro de Geofisica, Universidade de Lisboa, Lisbon, Portugal. Award Number: N00014-11-1-0615. This project was a collaborative project between ... (submitted or in prep) from the University of St Andrews (UStA) and Universidade de Lisboa (UL) research effort. The work has also generated multiple ... routines. Task 1.4. Use the distance sampling software Distance (Thomas et al. 2010) to estimate seasonal density, incorporating covariates affecting ...

  18. Stochastic estimation of level density in nuclear shell-model calculations

    Directory of Open Access Journals (Sweden)

    Shimizu Noritaka

    2016-01-01

    A method for stochastically estimating the nuclear level density on the basis of nuclear shell-model calculations is introduced. In order to count the number of eigenvalues of the shell-model Hamiltonian matrix, we perform a contour integral of the matrix elements of the resolvent. The shifted block Krylov subspace method enables its efficient computation. Utilizing this method, the contamination of center-of-mass motion is clearly removed.
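
    The counting device is the standard eigenvalue count via the resolvent: for a closed contour Γ in the complex plane,

        N_\Gamma = \frac{1}{2\pi i} \oint_\Gamma \operatorname{Tr}\,(zI - H)^{-1}\, dz

    equals the number of eigenvalues of H inside Γ; the shifted block Krylov subspace method makes the many resolvent solves affordable.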

  19. Effect of Broadband Nature of Marine Mammal Echolocation Clicks on Click-Based Population Density Estimates

    Science.gov (United States)

    2015-09-30

    ... modeled for different marine mammal species and detectors, and assess the magnitude of error in the estimated density due to various commonly used ... noise limited (von Benda-Beckmann et al. 2010). A three-hour segment, previously audited by human operators to ensure no marine mammals were present in ...

  20. Use of spatial capture-recapture modeling and DNA data to estimate densities of elusive animals

    Science.gov (United States)

    Kery, Marc; Gardner, Beth; Stoeckle, Tabea; Weber, Darius; Royle, J. Andrew

    2011-01-01

    Assessment of abundance, survival, recruitment rates, and density (i.e., population assessment) is especially challenging for elusive species most in need of protection (e.g., rare carnivores). Individual identification methods, such as DNA sampling, provide ways of studying such species efficiently and noninvasively. Additionally, statistical methods that correct for undetected animals and account for locations where animals are captured are available to efficiently estimate density and other demographic parameters. We collected hair samples of European wildcat (Felis silvestris) from cheek-rub lure sticks, extracted DNA from the samples, and identified each animal's genotype. To estimate the density of wildcats, we used Bayesian inference in a spatial capture-recapture model. We used WinBUGS to fit a model that accounted for differences in detection probability among individuals and seasons and between two lure arrays. We detected 21 individual wildcats (including possible hybrids) 47 times. Wildcat density was estimated at 0.29/km² (SE 0.06), and 95% of the activity of wildcats was estimated to occur within 1.83 km from their home-range center. Lures located systematically were associated with a greater number of detections than lures placed in a cell on the basis of expert opinion. Detection probability of individual cats was greatest in late March. Our model is a generalized linear mixed model; hence, it can be easily extended, for instance, to incorporate trap- and individual-level covariates. We believe that the combined use of noninvasive sampling techniques and spatial capture-recapture models will improve population assessments, especially for rare and elusive animals.
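
    Under the bivariate normal home-range model commonly used in such SECR analyses, the radius containing a fraction q of an individual's activity is r_q = σ·sqrt(−2·ln(1 − q)), so the reported 95% radius pins down the spatial scale parameter (a quick check, not from the paper):

        import numpy as np

        # Invert the 95% activity radius to the SECR scale parameter sigma,
        # assuming a bivariate normal (half-normal) activity model.
        r95 = 1.83                                   # km, from the abstract
        sigma = r95 / np.sqrt(-2 * np.log(1 - 0.95))
        print(f"sigma ~ {sigma:.2f} km")             # about 0.75 km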

  1. Kernel Density Feature Points Estimator for Content-Based Image Retrieval

    CERN Document Server

    Zuva, Tranos; Ojo, Sunday O; Ngwira, Seleman M

    2012-01-01

    Research is taking place to find effective algorithms for content-based image representation and description. There is a substantial number of algorithms available that use visual features (color, shape, texture). The shape feature has attracted so much attention from researchers that there are many shape representation and description algorithms in the literature. These shape representation and description algorithms are usually not application-independent or robust, making them undesirable for generic shape description. This paper presents an object shape representation using a Kernel Density Feature Points Estimator (KDFPE). In this method, the density of feature points within defined rings around the centroid of the image is obtained. The KDFPE is then applied to the vector of the image. KDFPE is invariant to translation, scale and rotation. This method of image representation shows an improved retrieval rate when compared to the Density Histogram Feature Points (DHFP) method. Analytic analysis is done to justify our m...

  2. Estimation of Plasma Density by Surface Plasmons for Surface-Wave Plasmas

    Institute of Scientific and Technical Information of China (English)

    CHEN Zhao-Quan; LIU Ming-Hai; LAN Chao-Hui; CHEN Wei; LUO Zhi-Qing; HU Xi-Wei

    2008-01-01

    An estimation method for plasma density based on surface plasmon theory for surface-wave plasmas is proposed. In this method, the number of standing waves is obtained directly from the discharge image and the propagation constant is calculated from the trim size of the apparatus; the plasma density is then determined to be 9.1 × 10¹⁷ m⁻³. The plasma density measured using a Langmuir probe is 8.1 × 10¹⁷ m⁻³, which is very close to the value predicted by surface plasmon theory. Numerical simulation with the finite-difference time-domain (FDTD) method is also used to check the number of standing waves. All results are consistent between the theoretical analysis and the experimental measurement.

  3. A new approach on seismic mortality estimations based on average population density

    Science.gov (United States)

    Zhu, Xiaoxin; Sun, Baiqing; Jin, Zhanyong

    2016-12-01

    This study examines a new methodology to predict the final seismic mortality from earthquakes in China. Most studies have established an association between mortality estimation and seismic intensity without considering population density. In China, however, the data are not always available, especially in the very urgent relief situation of a disaster, and population density varies greatly from region to region. This motivates the development of empirical models that use historical death data to provide a path for analyzing the death tolls of earthquakes. The present paper employs the average population density to predict final death tolls in earthquakes using a case-based reasoning model from a realistic perspective. To validate the forecasting results, historical data from 18 large-scale earthquakes that occurred in China are used to estimate the seismic mortality of each case. A typical earthquake case that occurred in the northwest of Sichuan Province is employed to demonstrate the estimation of the final death toll. The strength of this paper is that it provides scientific methods with overall forecast errors lower than 20%, and opens the door for conducting final death forecasts with a qualitative and quantitative approach. Limitations and future research are also analyzed and discussed in the conclusion.

  4. Wavelet-based density estimation for noise reduction in plasma simulations using particles

    Science.gov (United States)

    van yen, Romain Nguyen; del-Castillo-Negrete, Diego; Schneider, Kai; Farge, Marie; Chen, Guangye

    2010-04-01

    For given computational resources, the accuracy of plasma simulations using particles is mainly limited by the noise due to limited statistical sampling in the reconstruction of the particle distribution function. A method based on wavelet analysis is proposed and tested to reduce this noise. The method, known as wavelet-based density estimation (WBDE), was previously introduced in the statistical literature to estimate probability densities given a finite number of independent measurements. Its novel application to plasma simulations can be viewed as a natural extension of the finite size particles (FSP) approach, with the advantage of estimating more accurately distribution functions that have localized sharp features. The proposed method preserves the moments of the particle distribution function to a good level of accuracy, has no constraints on the dimensionality of the system, does not require an a priori selection of a global smoothing scale, and is able to adapt locally to the smoothness of the density based on the given discrete particle data. Moreover, the computational cost of the denoising stage is of the same order as one time step of an FSP simulation. The method is compared with a recently proposed method based on proper orthogonal decomposition, and it is tested with three particle data sets involving different levels of collisionality and interaction with external and self-consistent fields.
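
    A binned sketch of the idea — transform, threshold the detail coefficients, invert — using PyWavelets with a single universal threshold (WBDE proper works from empirical wavelet coefficients with scale-dependent thresholds):

        import numpy as np
        import pywt

        rng = np.random.default_rng(9)
        particles = np.concatenate([rng.normal(-2, 0.3, 20000),
                                    rng.normal(1, 1.0, 20000)])
        hist, edges = np.histogram(particles, bins=1024, density=True)

        coeffs = pywt.wavedec(hist, "db4", level=6)
        # universal threshold with a MAD noise estimate from the finest scale
        thr = np.sqrt(2*np.log(hist.size)) * np.median(np.abs(coeffs[-1])) / 0.6745
        denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="hard")
                                  for c in coeffs[1:]]
        smooth = pywt.waverec(denoised, "db4")[:hist.size]

        kept = sum(int(np.count_nonzero(c)) for c in denoised[1:])
        print(f"detail coefficients kept: {kept}/{sum(c.size for c in coeffs[1:])}")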

  5. Use of prediction methods to estimate true density of active pharmaceutical ingredients.

    Science.gov (United States)

    Cao, Xiaoping; Leyva, Norma; Anderson, Stephen R; Hancock, Bruno C

    2008-05-01

    True density is a fundamental and important property of active pharmaceutical ingredients (APIs). Using prediction methods to estimate the API true density can be very beneficial in pharmaceutical research and development, especially when experimental measurements cannot be made due to lack of material or sample handling restrictions. In this paper, two empirical prediction methods developed by Girolami and Immirzi and Perini were used to estimate the true density of APIs, and the estimation results were compared with experimentally measured values by helium pycnometry. The Girolami method is simple and can be used for both liquids and solids. For the tested APIs, the Girolami method had a maximum error of -12.7% and an average percent error of -3.0% with a 95% CI of (-3.8, -2.3%). The Immirzi and Perini method is more involved and is mainly used for solid crystals. In general, it gives better predictions than the Girolami method. For the tested APIs, the Immirzi and Perini method had a maximum error of 9.6% and an average percent error of 0.9% with a 95% CI of (0.3, 1.6%).
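
    The Girolami method is simple enough to sketch: density in g/cm³ is molar mass over five times the summed per-atom volume increments. The increments below are as we recall them from Girolami (1994) and should be verified against the original; the method's ring and hydrogen-bonding corrections are omitted here, so rigid ring systems come out low:

        # Per-atom volume increments (recalled from Girolami 1994 -- verify!)
        VOLUME = {"H": 1, "C": 2, "N": 2, "O": 2, "F": 2,   # Li-F row
                  "Na": 4, "S": 4, "Cl": 4,                 # Na-Cl row
                  "K": 5, "Br": 5}                          # K-Br row

        def girolami_density(formula, molar_mass):
            """formula: dict element -> atom count; molar_mass in g/mol."""
            v = sum(VOLUME[el] * n for el, n in formula.items())
            return molar_mass / (5.0 * v)

        # caffeine, C8H10N4O2, M = 194.19 g/mol; measured density ~1.23 g/cm^3
        rho = girolami_density({"C": 8, "H": 10, "N": 4, "O": 2}, 194.19)
        print(f"{rho:.2f} g/cm^3")   # ~1.02 without the fused-ring correction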

  6. New Density Estimation Methods for Charged Particle Beams With Applications to Microbunching Instability

    Energy Technology Data Exchange (ETDEWEB)

    Balsa Terzic, Gabriele Bassi

    2011-07-01

In this paper we discuss representations of charged particle densities in particle-in-cell (PIC) simulations, analyze the sources and profiles of the intrinsic numerical noise, and present efficient methods for its removal. We devise two alternative estimation methods for the charged particle distribution which represent a significant improvement over the Monte Carlo cosine expansion used in the 2D code of Bassi, designed to simulate coherent synchrotron radiation (CSR) in charged particle beams. The improvement is achieved by employing an alternative beam density estimate to the Monte Carlo cosine expansion. The representation is first binned onto a finite grid, after which two grid-based methods are employed to approximate particle distributions: (i) truncated fast cosine transform (TFCT); and (ii) thresholded wavelet transform (TWT). We demonstrate that these alternative methods represent a substantial upgrade over the original Monte Carlo cosine expansion in terms of efficiency, while the TWT approximation also provides an appreciable improvement in accuracy. The improvement in accuracy comes from a judicious removal of the numerical noise enabled by the wavelet formulation. The TWT method is then integrated into Bassi's CSR code, and benchmarked against the original version. We show that the new density estimation method provides superior performance in terms of efficiency and spatial resolution, thus enabling high-fidelity simulations of CSR effects, including the microbunching instability.
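
    A minimal sketch of the grid-based, truncated-cosine-transform idea; the cutoff, grid size, and test density are assumptions for illustration, not the settings of the paper. Requires numpy and scipy.

        import numpy as np
        from scipy.fft import dct, idct

        rng = np.random.default_rng(1)
        x = rng.beta(2, 5, 100_000)                  # stand-in particle coordinates in [0, 1]

        # Bin onto a finite grid, as described above.
        counts, _ = np.histogram(x, bins=256, range=(0.0, 1.0), density=True)

        # Cosine-transform the gridded density and truncate the high modes,
        # which carry most of the sampling noise.
        modes = dct(counts, type=2, norm="ortho")
        modes[32:] = 0.0                             # keep only the 32 lowest modes
        smooth = idct(modes, type=2, norm="ortho")   # denoised grid-based density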

  7. Examining the impact of the precision of address geocoding on estimated density of crime locations

    Science.gov (United States)

    Harada, Yutaka; Shimada, Takahito

    2006-10-01

This study examines the impact of the precision of address geocoding on the estimated density of crime locations in a large urban area of Japan. The data consist of two separate sets of the same Penal Code offenses known to the police that occurred during the nine-month period of April 1, 2001 through December 31, 2001 in the central 23 wards of Tokyo. These two data sets are derived from the older and newer recording systems of the Tokyo Metropolitan Police Department (TMPD), which revised its crime reporting system in that year so that more precise location information than in previous years could be recorded. Each of these data sets was address-geocoded onto a large-scale digital map using our hierarchical address-geocoding scheme, and we examined how such differences in the precision of address information, and the resulting differences in geocoded incident locations, affect the patterns in kernel density maps. An analysis using 11,096 pairs of incidents of residential burglary (each pair consists of the same incident geocoded using older and newer address information, respectively) indicates that kernel density estimation with a cell size of 25×25 m and a bandwidth of 500 m may work quite well in absorbing the poorer precision of geocoded locations based on data from the older recording system, whereas in several areas where the older recording system resulted in a very poor precision level, the inaccuracy of incident locations may produce artifactual and potentially misleading patterns in kernel density maps.
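
    A minimal sketch of such a kernel density surface: incident points binned to 25 m cells and smoothed with a Gaussian kernel, treating the 500 m bandwidth as the Gaussian scale. The coordinates and study-area extent below are synthetic stand-ins for geocoded incident locations. Requires numpy and scipy.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        cell, bandwidth, extent = 25.0, 500.0, 10_000.0   # metres; extent is assumed
        rng = np.random.default_rng(2)
        xy = rng.uniform(0, extent, size=(5_000, 2))      # stand-in incident coordinates

        nbins = int(extent / cell)                        # 400 x 400 grid of 25 m cells
        counts, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=nbins,
                                      range=[[0, extent], [0, extent]])
        density = gaussian_filter(counts, sigma=bandwidth / cell)  # incidents per cell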

  8. Review on Vehicular Speed, Density Estimation and Classification Using Acoustic Signal

    Directory of Open Access Journals (Sweden)

    Prashant Borkar

    2013-09-01

Full Text Available Traffic monitoring and parameter estimation from urban to non-urban (battlefield) environments is a fast-emerging field based on acoustic signals. We present here a comprehensive review of the state of the art in acoustic-signal-based vehicular speed estimation, density estimation and classification, a critical analysis, and an outlook on future research directions. This field is of increasing relevance for intelligent transport systems (ITSs). In recent years video monitoring and surveillance systems have been widely used in traffic management, and traffic parameters can be obtained using such systems, but the installation, operational and maintenance costs associated with these approaches are relatively high compared to the use of acoustic signals, which have very low installation and maintenance costs. The classification process includes the sensing unit, class definition, feature extraction, classifier application and system evaluation. The acoustic classification system is part of a multi-sensor real-time environment for traffic surveillance and monitoring. The classification accuracy achieved by the various studied algorithms shows very good performance for the ‘Heavy Weight’ class of vehicles compared to the ‘Light Weight’ category, and performance degrades slightly as vehicle speed increases. Vehicular speed estimation corresponds to average speed and traffic density measurement, and can be substantially used for traffic signal timing optimization.

  9. Integrated Bayesian Estimation of Zeff in the TEXTOR Tokamak from Bremsstrahlung and CX Impurity Density Measurements

    Science.gov (United States)

    Verdoolaege, G.; Von Hellermann, M. G.; Jaspers, R.; Ichir, M. M.; Van Oost, G.

    2006-11-01

The validation of diagnostic data from a nuclear fusion experiment is an important issue. The concept of Integrated Data Analysis (IDA) allows the consistent estimation of plasma parameters from heterogeneous data sets. Here, the determination of the ion effective charge (Zeff) is considered. Several diagnostic methods exist for the determination of Zeff, but the results are in general not in agreement. In this work, the problem of Zeff estimation on the TEXTOR tokamak is approached from the perspective of IDA, in the framework of Bayesian probability theory. The ultimate goal is the estimation of a full Zeff profile that is consistent both with measured bremsstrahlung emissivities and with individual impurity spectral line intensities obtained from Charge Exchange Recombination Spectroscopy (CXRS). We present an overview of the various uncertainties that enter the calculation of a Zeff profile from bremsstrahlung data on the one hand, and line intensity data on the other hand. We discuss a simple linear and nonlinear Bayesian model permitting the estimation of a central value for Zeff and the electron density ne on TEXTOR from bremsstrahlung emissivity measurements in the visible, and carbon densities derived from CXRS. Both the central Zeff and ne are sampled using an MCMC algorithm. An outlook is given towards possible model improvements.
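
    A toy Metropolis sampler illustrating the kind of joint (Zeff, ne) inference described above. The forward models and data below are simplified stand-ins (emissivity proportional to ne²·Zeff, and a carbon-only impurity balance), not the actual TEXTOR models. Requires numpy.

        import numpy as np

        rng = np.random.default_rng(3)
        eps_obs, sig_eps = 2.0e3, 1.0e2      # bremsstrahlung emissivity datum (arb. units)
        nc_obs, sig_nc = 1.5e18, 2.0e17      # CXRS carbon density (m^-3)

        def log_post(zeff, ne):
            if not (1.0 <= zeff <= 6.0 and ne > 0.0):
                return -np.inf               # flat prior on a physical box
            eps = 1.0e-36 * ne**2 * zeff             # toy emissivity model
            nc = ne * (zeff - 1.0) / 30.0            # carbon only: Zeff = 1 + 30*nc/ne
            return (-0.5 * ((eps - eps_obs) / sig_eps) ** 2
                    - 0.5 * ((nc - nc_obs) / sig_nc) ** 2)

        theta, chain = np.array([2.0, 3.0e19]), []
        for _ in range(20_000):
            prop = theta + rng.normal(0.0, [0.05, 2.0e17])
            if np.log(rng.uniform()) < log_post(*prop) - log_post(*theta):
                theta = prop                 # Metropolis accept
            chain.append(theta.copy())
        zeff_hat, ne_hat = np.mean(chain[5_000:], axis=0)   # posterior means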

  10. Estimation of dislocation density from precession electron diffraction data using the Nye tensor

    Energy Technology Data Exchange (ETDEWEB)

    Leff, A.C. [Department of Materials Science & Engineering, Drexel University, Philadelphia, PA (United States); Weinberger, C.R. [Department of Mechanical Engineering and Mechanics, Drexel University, Philadelphia, PA (United States); Taheri, M.L., E-mail: mtaheri@coe.drexel.edu [Department of Materials Science & Engineering, Drexel University, Philadelphia, PA (United States)

    2015-06-15

    The Nye tensor offers a means to estimate the geometrically necessary dislocation density of a crystalline sample based on measurements of the orientation changes within individual crystal grains. In this paper, the Nye tensor theory is applied to precession electron diffraction automated crystallographic orientation mapping (PED-ACOM) data acquired using a transmission electron microscope (TEM). The resulting dislocation density values are mapped in order to visualize the dislocation structures present in a quantitative manner. These density maps are compared with other related methods of approximating local strain dependencies in dislocation-based microstructural transitions from orientation data. The effect of acquisition parameters on density measurements is examined. By decreasing the step size and spot size during data acquisition, an increasing fraction of the dislocation content becomes accessible. Finally, the method described herein is applied to the measurement of dislocation emission during in situ annealing of Cu in TEM in order to demonstrate the utility of the technique for characterizing microstructural dynamics. - Highlights: • Developed a method of mapping GND density using orientation mapping data from TEM. • As acquisition length-scale is decreased, all dislocations are considered GNDs. • Dislocation emission and corresponding grain rotation quantified.
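
    A minimal sketch of the simplest one-parameter version of such an estimate: the local lattice-rotation gradient divided by the Burgers vector magnitude, treated as a GND density. The full Nye-tensor analysis in the paper resolves individual tensor components; the orientation field and step size below are synthetic assumptions.

        import numpy as np

        step = 2.0e-9        # acquisition step size in metres (assumed)
        b = 0.2556e-9        # Burgers vector magnitude for Cu (m)

        rng = np.random.default_rng(4)
        # Synthetic in-plane lattice rotation field (radians), 256 x 256 map.
        theta = np.deg2rad(rng.normal(0.0, 0.05, (256, 256))).cumsum(axis=0)

        dt_dy, dt_dx = np.gradient(theta, step)          # rotation gradients (rad/m)
        rho_gnd = np.sqrt(dt_dx**2 + dt_dy**2) / b       # GND density (m^-2)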

  11. Extension of the statistical modal energy distribution analysis for estimating energy density in coupled subsystems

    Science.gov (United States)

    Totaro, N.; Guyader, J. L.

    2012-06-01

The present article deals with an extension of the Statistical modal Energy distribution Analysis (SmEdA) method to estimate kinetic and potential energy density in coupled subsystems. The SmEdA method uses the modal bases of the uncoupled subsystems and focuses on modal energies rather than the global subsystem energies used in SEA (Statistical Energy Analysis). This method permits extending SEA to subsystems with low modal overlap or to localized excitations, as it does not assume the existence of modal energy equipartition. We demonstrate that, by using the modal energies of subsystems computed by SmEdA, it is possible to estimate the energy distribution within subsystems. This approach has the same advantages as standard SEA, as it uses very short calculations to analyze damping effects. The estimation of energy distribution from SmEdA is applied to an academic case and an industrial example.

  12. ANNz2 - Photometric redshift and probability density function estimation using machine learning methods

    CERN Document Server

    Sadeh, Iftach; Lahav, Ofer

    2015-01-01

    We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister and Lahav (2004). Large photometric galaxy surveys are important for cosmological studies, and in particular for characterizing the nature of dark energy. The success of such surveys greatly depends on the ability to measure photo-zs, based on limited spectral data. ANNz2 utilizes multiple machine learning methods, such as artificial neural networks, boosted decision/regression trees and k-nearest neighbours. The objective of the algorithm is to dynamically optimize the performance of the photo-z estimation, and to properly derive the associated uncertainties. In addition to single-value solutions, the new code also generates full probability density functions (PDFs) in two different ways. In addition, estimators are incorporated to mitigate possible problems of spectroscopic training samples which are not representative or are incomplete. ANNz2 is also adapted to provide optimized solution...
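
    As a flavour of the machine-learning ingredients mentioned above, a minimal k-nearest-neighbours photo-z sketch in which the neighbour spread serves as a crude per-galaxy PDF width. The photometry and redshifts are synthetic, and this is not the ANNz2 pipeline itself. Requires numpy and scikit-learn.

        import numpy as np
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(5)
        mags_train = rng.normal(22.0, 1.0, (10_000, 5))   # 5-band training photometry
        colour = mags_train[:, 0] - mags_train[:, 3]
        z_train = np.abs(colour) * 0.2 + rng.normal(0, 0.02, 10_000)   # toy redshifts
        mags_test = rng.normal(22.0, 1.0, (100, 5))

        nn = NearestNeighbors(n_neighbors=30).fit(mags_train)
        _, idx = nn.kneighbors(mags_test)
        z_phot = z_train[idx].mean(axis=1)        # point estimate per test galaxy
        z_sig = z_train[idx].std(axis=1)          # crude width of the photo-z PDF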

  13. Optimal diffusion MRI acquisition for fiber orientation density estimation: an analytic approach.

    Science.gov (United States)

    White, Nathan S; Dale, Anders M

    2009-11-01

An important challenge in the design of diffusion MRI experiments is how to optimize statistical efficiency, i.e., the accuracy with which parameters can be estimated from the diffusion data in a given amount of imaging time. In model-based spherical deconvolution analysis, the quantity of interest is the fiber orientation density (FOD). Here, we demonstrate how the spherical harmonics (SH) can be used to form an explicit analytic expression for the efficiency of the minimum variance (maximally efficient) linear unbiased estimator of the FOD. Using this expression, we calculate optimal b-values for maximum FOD estimation efficiency with SH expansion orders of L = 2, 4, 6, and 8 to be approximately b = 1,500, 3,000, 4,600, and 6,200 s/mm², respectively. However, the arrangement of diffusion directions and scanner-specific hardware limitations also play a role in determining the realizable efficiency of the FOD estimator that can be achieved in practice. We show how some commonly used methods for selecting diffusion directions are sometimes inefficient, and propose a new method for selecting diffusion directions in MRI based on maximizing the statistical efficiency. We further demonstrate how scanner-specific hardware limitations generally lead to optimal b-values that are slightly lower than the ideal b-values. In summary, the analytic expression for the statistical efficiency of the unbiased FOD estimator provides important insight into the fundamental tradeoff between angular resolution, b-value, and FOD estimation accuracy.

  14. THE USE OF MATHEMATICAL MODELS FOR ESTIMATING WOOD BASIC DENSITY OF Eucalyptus sp CLONES

    Directory of Open Access Journals (Sweden)

    Cláudio Roberto Thiersch

    2006-09-01

Full Text Available This study aimed at identifying at what point in the stem, in the longitudinal and cardinal directions, the Pilodyn penetration depth should be measured for determining wood basic density, envisaging forestry inventory. The database comprised 36 plots of 400 m². Around the plots, 216 trees were scaled: two clones (hybrids of E. grandis and E. urophylla) at the ages of 3, 4, 5 and 6 years, belonging to three different sites in eastern Brazil, encompassing the east and northeast of Espírito Santo state and the south of Bahia state. At each height where a diameter was measured, the penetration depth of the Pilodyn (in mm) was also recorded. The average basic density of the scaled trees was determined from chips using the immersion method. The main conclusions were the following. The density equation as a function of the Pilodyn measures, age, site, diameter at 1.3 m above ground and total height was more precise, exact and stable than the density equation as a function of the Pilodyn measures, age, site and diameter; this equation was precise and exact for all ages and sites, independently of whether the Pilodyn measurements were taken on the south or north face, or at the average position between them. The height for measurement with the Pilodyn can also be taken at the more ergonomic position of 1.3 m. The density estimation as a function of the Pilodyn measures, age, average dominant tree height and diameter at 1.3 m above ground, for both clones, was more precise when the measure with the Pilodyn was taken at the north face. The average tree basic density must always be obtained from a specific equation for each clone, given that these equations differ statistically.

  15. mBEEF-vdW: Robust fitting of error estimation density functionals

    Science.gov (United States)

    Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes; Jacobsen, Karsten W.; Bligaard, Thomas

    2016-06-01

We propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework [J. Wellendorff et al., Phys. Rev. B 85, 235149 (2012), 10.1103/PhysRevB.85.235149; J. Wellendorff et al., J. Chem. Phys. 140, 144107 (2014), 10.1063/1.4870397]. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10% improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.

  16. msBP: An R Package to Perform Bayesian Nonparametric Inference Using Multiscale Bernstein Polynomials Mixtures

    Directory of Open Access Journals (Sweden)

    Antonio Canale

    2017-06-01

Full Text Available msBP is an R package that implements a new method to perform Bayesian multiscale nonparametric inference introduced by Canale and Dunson (2016). The method, based on mixtures of multiscale beta dictionary densities, overcomes the drawbacks of Pólya trees and inherits many of the advantages of Dirichlet process mixture models. The key idea is that an infinitely-deep binary tree is introduced, with a beta dictionary density assigned to each node of the tree. Using a multiscale stick-breaking characterization, stochastically decreasing weights are assigned to each node. The result is an infinite mixture model. The package msBP implements a series of basic functions to deal with this family of priors, such as random density and number generation, creation and manipulation of binary tree objects, and generic functions to plot and print the results. In addition, it implements the Gibbs samplers for posterior computation to perform multiscale density estimation and multiscale testing of group differences described in Canale and Dunson (2016).

  17. Heterogeneous occupancy and density estimates of the pathogenic fungus Batrachochytrium dendrobatidis in waters of North America

    Science.gov (United States)

    Chestnut, Tara E.; Anderson, Chauncey; Popa, Radu; Blaustein, Andrew R.; Voytek, Mary; Olson, Deanna H.; Kirshtein, Julie

    2014-01-01

Biodiversity losses are occurring worldwide due to a combination of stressors. For example, by one estimate, 40% of amphibian species are vulnerable to extinction, and disease is one threat to amphibian populations. The emerging infectious disease chytridiomycosis, caused by the aquatic fungus Batrachochytrium dendrobatidis (Bd), is a contributor to amphibian declines worldwide. Bd research has focused on the dynamics of the pathogen in its amphibian hosts, with little emphasis on investigating the dynamics of free-living Bd. Therefore, we investigated patterns of Bd occupancy and density in amphibian habitats using occupancy models, powerful tools for estimating site occupancy and detection probability. Occupancy models have been used to investigate diseases where the focus was on pathogen occurrence in the host. We applied occupancy models to investigate free-living Bd in North American surface waters to determine Bd seasonality, relationships between Bd site occupancy and habitat attributes, and probability of detection from water samples as a function of the number of samples, sample volume, and water quality. We also report on the temporal patterns of Bd density from a 4-year case study of a Bd-positive wetland. We provide evidence that Bd occurs in the environment year-round. Bd exhibited temporal and spatial heterogeneity in density, but did not exhibit seasonality in occupancy. Bd was detected in all months, typically at less than 100 zoospores L−1. The highest density observed was ∼3 million zoospores L−1. We detected Bd in 47% of sites sampled, but estimated that Bd occupied 61% of sites, highlighting the importance of accounting for imperfect detection. When Bd was present, there was a 95% chance of detecting it with four samples of 600 mL of water or five samples of 60 mL. Our findings provide important baseline information to advance the study of Bd disease ecology, and advance our understanding of amphibian exposure to free-living Bd in aquatic habitats.
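
    A back-of-the-envelope check of the quoted detection figures: if four 600 mL samples give a 95% chance of at least one detection at an occupied site, the implied per-sample detection probability p solves 1 − (1 − p)⁴ = 0.95.

        p = 1 - 0.05 ** (1 / 4)   # per 600 mL sample
        print(f"implied per-sample detection probability ≈ {p:.2f}")   # ≈ 0.53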

  20. A long-term evaluation of biopsy darts and DNA to estimate cougar density

    Science.gov (United States)

    Beausoleil, Richard A.; Clark, Joseph D.; Maletzke, Benjamin T.

    2016-01-01

Accurately estimating cougar (Puma concolor) density usually requires long-term research with intensive capture and Global Positioning System collaring efforts and may cost hundreds of thousands of dollars annually. Because wildlife agency budgets rarely accommodate this approach, most infer cougar density from published literature, rely on short-term studies, or use hunter harvest data as a surrogate in their jurisdictions, all of which may limit accuracy and increase the risk of management actions. In an effort to develop a more cost-effective long-term strategy, we evaluated a research approach using citizen scientists with trained hounds to tree cougars and collect tissue samples with biopsy darts. We then used the DNA to individually identify cougars and employed spatially explicit capture-recapture models to estimate cougar densities. Overall, 240 tissue samples were collected in northeastern Washington, USA, producing 166 genotypes (including recaptures and excluding dependent kittens) of 133 different cougars (8-25/yr) from 2003 to 2011. Mark-recapture analyses revealed a mean density of 2.2 cougars/100 km² (95% CI = 1.1-4.3) and stable to decreasing population trends (β = −0.048, 95% CI = −0.106 to 0.011) over the 9 years of study, with an average annual harvest rate of 14% (range = 7-21%). The average annual cost for field sampling and genotyping was US$11,265 ($422.24/sample or $610.73/successfully genotyped sample). Our results demonstrated that long-term biopsy sampling using citizen scientists can increase capture success and provide reliable cougar-density information at a reasonable cost.

  1. An Analytical Planning Model to Estimate the Optimal Density of Charging Stations for Electric Vehicles.

    Science.gov (United States)

    Ahn, Yongjun; Yeo, Hwasoo

    2015-01-01

The charging infrastructure location problem is becoming more significant due to the extensive adoption of electric vehicles. Efficient charging station planning can solve deeply rooted problems, such as driving-range anxiety and the stagnation of new electric vehicle consumers. In the initial stage of introducing electric vehicles, the allocation of charging stations is difficult to determine due to the uncertainty of candidate sites and unidentified charging demands, which are determined by diverse variables. This paper introduces the Estimating the Required Density of EV Charging (ERDEC) stations model, which is an analytical approach to estimating the optimal density of charging stations for certain urban areas, which are subsequently aggregated to city level planning. The optimal charging station density is derived to minimize the total cost. A numerical study is conducted to obtain the correlations among the various parameters in the proposed model, such as regional parameters, technological parameters and coefficient factors. To investigate the effect of technological advances, the corresponding changes in the optimal density and total cost are also examined by various combinations of technological parameters. Daejeon city in South Korea is selected for the case study to examine the applicability of the model to real-world problems. With real taxi trajectory data, the optimal density map of charging stations is generated. These results can provide the optimal number of chargers for driving without driving-range anxiety. In the initial planning phase of installing charging infrastructure, the proposed model can be applied to a relatively extensive area to encourage the usage of electric vehicles, especially areas that lack information, such as exact candidate sites for charging stations and other data related to electric vehicles. The methods and results of this paper can serve as a planning guideline to facilitate the extensive adoption of electric vehicles.

  2. Near infrared spectroscopy for estimating wood basic density in Eucalyptus urophylla and Eucalyptus grandis

    Directory of Open Access Journals (Sweden)

    Paulo Ricardo Gherardi Hein

    2009-06-01

Full Text Available Wood basic density is indicative of several other wood properties and is considered a key feature for many industrial applications. Near infrared spectroscopy (NIRS) is a fast, efficient technique that is capable of estimating that property. However, it should be improved in order to complement the often time-consuming and costly conventional method. Research on wood technological properties using near infrared spectroscopy has shown promising results. Thus the aim of this study was to evaluate the efficiency of near infrared spectroscopy for estimating wood basic density in both Eucalyptus urophylla and Eucalyptus grandis. The coefficients of determination of the predictive models for cross validation ranged between 0.74 and 0.86 and the ratio performance of deviation (RPD) ranged between 1.9 and 2.7. The application of spectral filters, detection and removal of outlier samples, and selection of variables (wavelengths) improved the adjustment of calibrations, thereby reducing the standard error of calibration (SEC) and cross validation (SECV) as well as increasing the coefficient of determination (R²) and the RPD value. The technique of near infrared spectroscopy can therefore be used for predicting wood basic density in Eucalyptus urophylla and Eucalyptus grandis.
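
    A minimal sketch of the usual NIRS calibration workflow behind such numbers: partial least squares regression of basic density on spectra, scored by cross-validated R², SECV and RPD. The spectra and densities below are synthetic placeholders. Requires numpy and scikit-learn.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(6)
        X = rng.normal(size=(120, 700))                              # absorbance spectra
        density = 450.0 + 30.0 * X[:, 100] + rng.normal(0, 10, 120)  # kg/m^3, synthetic

        pred = cross_val_predict(PLSRegression(n_components=6), X, density, cv=10).ravel()

        resid = density - pred
        secv = np.sqrt(np.mean(resid**2))          # standard error of cross validation
        r2 = 1.0 - resid.var() / density.var()     # cross-validated R^2
        rpd = density.std() / secv                 # ratio of performance to deviation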

  3. Nonparametric Estimation of a Recurrent Survival Function with Long-Term Survivors

    Institute of Scientific and Technical Information of China (English)

    Xiao Cuiliu; Zhao Xiaobing; Wang Jinglong

    2010-01-01

Recurrent event data are often encountered in longitudinal follow-up studies related to biomedical science, econometrics, insurance claims, reliability and demography. The longitudinal pattern of the gaps between successive recurrent events is often of great research interest. In this paper, we propose an alternative model for the gaps between recurrent events which can accommodate a "cure fraction" of long-term survivors, and a nonparametric method is employed for the statistical inference. The proposed model and estimation method are evaluated with a small simulation study and illustrated with tumor recurrence data.

  4. On the method of logarithmic cumulants for parametric probability density function estimation.

    Science.gov (United States)

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
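
    As a concrete instance, a sketch of MoLC for the ordinary gamma family: with shape k and scale s, E[ln X] = ψ(k) + ln s and Var[ln X] = ψ′(k), so the first two sample log-cumulants determine (k, s) through an inversion of the trigamma function. Requires numpy and scipy.

        import numpy as np
        from scipy.special import psi, polygamma
        from scipy.optimize import brentq

        x = np.random.default_rng(7).gamma(shape=3.0, scale=2.0, size=100_000)

        c1 = np.mean(np.log(x))          # first log-cumulant
        c2 = np.var(np.log(x))           # second log-cumulant

        # Solve psi'(k) = c2 (trigamma is strictly decreasing), then back out s.
        k_hat = brentq(lambda k: float(polygamma(1, k)) - c2, 1e-3, 1e3)
        s_hat = np.exp(c1 - psi(k_hat))
        print(k_hat, s_hat)              # close to the true (3.0, 2.0)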

  5. Estimating black bear population density and genetic diversity at Tensas River, Louisiana using microsatellite DNA markers

    Science.gov (United States)

    Boersen, Mark R.; Clark, Joseph D.; King, Tim L.

    2003-01-01

The Recovery Plan for the federally threatened Louisiana black bear (Ursus americanus luteolus) mandates that remnant populations be estimated and monitored. In 1999 we obtained genetic material with barbed-wire hair traps to estimate bear population size and genetic diversity at the 329-km² Tensas River Tract, Louisiana. We constructed and monitored 122 hair traps, which produced 1,939 hair samples. Of those, we randomly selected 116 subsamples for genetic analysis and used up to 12 microsatellite DNA markers to obtain multilocus genotypes for 58 individuals. We used Program CAPTURE to compute estimates of population size using multiple mark-recapture models. The area of study was almost entirely circumscribed by agricultural land, thus the population was geographically closed. Also, study-area boundaries were biologically discrete, enabling us to accurately estimate population density. Using model Chao Mh to account for possible effects of individual heterogeneity in capture probabilities, we estimated the population size to be 119 (SE = 29.4) bears, or 0.36 bears/km². We were forced to examine a substantial number of loci to differentiate between some individuals because of low genetic variation. Despite the probable introduction of genes from Minnesota bears in the 1960s, the isolated population at Tensas exhibited characteristics consistent with inbreeding and genetic drift. Consequently, the effective population size at Tensas may be as few as 32, which warrants continued monitoring or possibly genetic augmentation.

  6. Soil Organic Carbon Density in Hebei Province, China: Estimates and Uncertainty

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yong-Cun; SHI Xue-Zheng; YU Dong-Sheng; T. F. PAGELLA; SUN Wei-Xia; XU Xiang-Hua

    2005-01-01

In order to improve the precision of soil organic carbon (SOC) estimates, the sources of uncertainty in soil organic carbon density (SOCD) estimates and SOC stocks were examined using 363 soil profiles in Hebei Province, China, with three methods: soil profile statistics (SPS), GIS-based soil type (GST), and kriging interpolation (KI). The GST method, utilizing both pedological professional knowledge and GIS technology, was considered the most accurate of the three estimations, with SOCD estimates for SPS 10% lower and for KI 10% higher. The SOCD range for GST was 84% wider than for KI, as the smoothing effect of KI narrowed the SOCD range. Nevertheless, the coefficient of variation of SOCD with KI (41.7%) was less than with GST and SPS. For the lower SOCD estimates of SPS relative to GST, the major source of uncertainty was the conflicting areas of the proportional relations, while for KI the smaller number of soil profiles and the necessity of the smoothing effect were the main sources of uncertainty. Moreover, for local detailed variations of SOCD, GST was more advantageous than KI in reflecting the distribution pattern.

  7. The Effects of Surfactants on the Estimation of Bacterial Density in Petroleum Samples

    Science.gov (United States)

    Luna, Aderval Severino; da Costa, Antonio Carlos Augusto; Gonçalves, Márcia Monteiro Machado; de Almeida, Kelly Yaeko Miyashiro

The effect of the surfactants polyoxyethylene monostearate (Tween 60), polyoxyethylene monooleate (Tween 80), cetyl trimethyl ammonium bromide (CTAB), and sodium dodecyl sulfate (SDS) on the estimation of bacterial density (sulfate-reducing bacteria [SRB] and general anaerobic bacteria [GAnB]) was examined in petroleum samples. Three different compositions of oil and water were selected to be representative of real samples: the first contained a high content of oil, the second a medium content of oil, and the last a low content of oil. The most probable number (MPN) method was used to estimate bacterial density. The results showed that the addition of surfactants did not improve SRB quantification for the high or medium oil content petroleum samples. On the other hand, Tween 60 and Tween 80 promoted a significant increase in GAnB quantification at 0.01% or 0.03% m/v concentrations, respectively. CTAB increased SRB and GAnB estimation for the sample with a low oil content at 0.00005% and 0.0001% m/v, respectively.

  8. Large-sample study of the kernel density estimators under multiplicative censoring

    CERN Document Server

    Asgharian, Masoud; Fakoor, Vahid; 10.1214/11-AOS954

    2012-01-01

The multiplicative censoring model introduced in Vardi [Biometrika 76 (1989) 751--761] is an incomplete data problem whereby two independent samples from the lifetime distribution $G$, $\mathcal{X}_m=(X_1,\ldots,X_m)$ and $\mathcal{Z}_n=(Z_1,\ldots,Z_n)$, are observed subject to a form of coarsening. Specifically, sample $\mathcal{X}_m$ is fully observed while $\mathcal{Y}_n=(Y_1,\ldots,Y_n)$ is observed instead of $\mathcal{Z}_n$, where $Y_i=U_iZ_i$ and $(U_1,\ldots,U_n)$ is an independent sample from the standard uniform distribution. Vardi [Biometrika 76 (1989) 751--761] showed that this model unifies several important statistical problems, such as the deconvolution of an exponential random variable, estimation under a decreasing density constraint and an estimation problem in renewal processes. In this paper, we establish the large-sample properties of kernel density estimators under the multiplicative censoring model. We first construct a strong approximation for the process $\sqrt{k}(\hat{G}-G)$, where $\hat{G}$ is...

  9. A pdf-Free Change Detection Test Based on Density Difference Estimation.

    Science.gov (United States)

    Bu, Li; Alippi, Cesare; Zhao, Dongbin

    2016-11-16

    The ability to detect online changes in stationarity or time variance in a data stream is a hot research topic with striking implications. In this paper, we propose a novel probability density function-free change detection test, which is based on the least squares density-difference estimation method and operates online on multidimensional inputs. The test does not require any assumption about the underlying data distribution, and is able to operate immediately after having been configured by adopting a reservoir sampling mechanism. Thresholds requested to detect a change are automatically derived once a false positive rate is set by the application designer. Comprehensive experiments validate the effectiveness in detection of the proposed method both in terms of detection promptness and accuracy.
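
    A minimal sketch of the least-squares density-difference idea: f = p − p′ is modelled as a Gaussian-kernel expansion whose coefficients have a closed-form regularized solution. The reservoir sampling and automatic thresholding of the published test are omitted, and the kernel width and regularization are fixed by hand. Requires numpy.

        import numpy as np

        def lsdd(x, xp, sigma=1.0, lam=1e-3):
            """Estimate f = p(x) - p'(x) as a Gaussian-kernel expansion."""
            c = np.vstack([x, xp])                            # kernel centres
            d2 = ((c[:, None, :] - c[None, :, :]) ** 2).sum(-1)
            # Closed-form integral of a product of two Gaussian kernels.
            H = (np.pi * sigma**2) ** (c.shape[1] / 2) * np.exp(-d2 / (4 * sigma**2))
            def phi(z):                                       # kernel design matrix
                dz = ((z[:, None, :] - c[None, :, :]) ** 2).sum(-1)
                return np.exp(-dz / (2 * sigma**2))
            h = phi(x).mean(axis=0) - phi(xp).mean(axis=0)
            theta = np.linalg.solve(H + lam * np.eye(len(c)), h)
            return theta, float(theta @ h)                    # coefficients, L2 estimate

        rng = np.random.default_rng(8)
        _, l2 = lsdd(rng.normal(0, 1, (200, 2)), rng.normal(0.5, 1, (200, 2)))
        print(l2)    # grows with the separation between the two distributions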

  10. Probabilistic density function estimation of geotechnical shear strength parameters using the second Chebyshev orthogonal polynomial

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

A method to estimate the probability density function (PDF) of shear strength parameters was proposed. The second Chebyshev orthogonal polynomial (SCOP) combined with sample moments (the origin moments) was used to approximate the PDF of the parameters. A χ2 test was adopted to verify the availability of the method. It is distribution-free because no classical theoretical distributions are assumed in advance, and the inference result provides a universal form of probability density curve. The six most commonly used theoretical distributions, namely the normal, lognormal, extreme value I, gamma, beta and Weibull distributions, were used to verify the SCOP method. An example based on observed data of the cohesion c of a silty clay is presented for illustrative purposes. The results show that the acceptance levels in SCOP are all smaller than those in the classical finite comparative method and that the SCOP function is more accurate and effective in the reliability analysis of geotechnical engineering.
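
    A minimal sketch of an orthogonal-series density estimate with Chebyshev polynomials of the second kind, U_j, which are orthogonal on [−1, 1] with weight √(1 − x²); the paper's moment-based construction differs in detail. The coefficients follow from b_j = (2/π) E[U_j(X)] after mapping the sample to (−1, 1). Requires numpy and scipy.

        import numpy as np
        from scipy.special import eval_chebyu

        rng = np.random.default_rng(9)
        raw = rng.normal(20.0, 4.0, 5_000)         # e.g. cohesion measurements (kPa)
        lo, hi = raw.min() - 1e-6, raw.max() + 1e-6
        x = 2.0 * (raw - lo) / (hi - lo) - 1.0     # map sample to (-1, 1)

        order = 8
        b = [(2.0 / np.pi) * eval_chebyu(j, x).mean() for j in range(order + 1)]

        grid = np.linspace(-0.99, 0.99, 400)
        f = np.sqrt(1.0 - grid**2) * sum(bj * eval_chebyu(j, grid)
                                         for j, bj in enumerate(b))
        f_raw = f * 2.0 / (hi - lo)                # density on the original scale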

  11. Detection of Bistability in Phase Space of a Real Galaxy, using a New Non-parametric Bayesian Test of Hypothesis

    CERN Document Server

    Chakrabarty, Dalia

    2013-01-01

    In lieu of direct detection of dark matter, estimation of the distribution of the gravitational mass in distant galaxies is of crucial importance in Astrophysics. Typically, such estimation is performed using small samples of noisy, partially missing measurements - only some of the three components of the velocity and location vectors of individual particles that live in the galaxy are measurable. Such limitations of the available data in turn demands that simplifying model assumptions be undertaken. Thus, assuming that the phase space of a galaxy manifests simple symmetries - such as isotropy - allows for the learning of the density of the gravitational mass in galaxies. This is equivalent to assuming that the phase space $pdf$ from which the velocity and location vectors of galactic particles are sampled from, is an isotropic function of these vectors. We present a new non-parametric test of hypothesis that tests for relative support in two or more measured data sets of disparate sizes, for the undertaken m...

  12. Population density estimation of the European wildcat (Felis silvestris silvestris) in Sicily using camera trapping

    Directory of Open Access Journals (Sweden)

    Stefano Anile

    2012-07-01

Full Text Available The wildcat is an elusive species that is threatened with extinction in many areas of its European distribution. In Sicily the wildcat lives in a wide range of habitats; this study was done on Mount Etna. Previous camera-trap monitoring was conducted in 2006 (a pilot study) and 2007 (a first estimation of wildcat population size using camera trapping with capture-recapture analyses) in the same study area. In 2009 digital camera traps were used in pairs at each station with the aim of obtaining photographs of the wildcat. Experience and data collected from the previous studies were used to develop a protocol to estimate the density of the wildcat population using capture-recapture analyses and the coat-colour and markings system to recognize individuals. Two trap-lines adjacent to each other were run in two consecutive data collection periods. Camera traps worked together for 1080 trap-days and we obtained 42 pictures of wildcats from 32 photographic capture events, from which 10 individuals (excluding four kittens) were identified. The capture history of each individual was constructed and the software CAPTURE was used to generate an estimate of the population density (0.22 to 0.44 wildcats/100 ha) for our study area, using two different approaches for the calculation of the effective area sampled. The wildcat population density on Mount Etna is higher than densities found throughout Europe, and is favoured by the habitat structure, prey availability, Mediterranean climate and the protection status provided by the park.

  13. Axonal and dendritic density field estimation from incomplete single-slice neuronal reconstructions

    Directory of Open Access Journals (Sweden)

    Jaap van Pelt

    2014-06-01

Full Text Available Neuronal information processing in cortical networks critically depends on the organization of synaptic connectivity. Synaptic connections can form when axons and dendrites come into close proximity of each other. The spatial innervation of neuronal arborizations can be described by their axonal and dendritic density fields. Recently we showed that potential locations of synapses between neurons can be estimated from their overlapping axonal and dendritic density fields. However, deriving density fields from single-slice neuronal reconstructions is hampered by incompleteness because of cut branches. Here, we describe a method for recovering the lost axonal and dendritic mass. This so-called completion method is based on an estimation of the mass inside the slice and an extrapolation to the space outside the slice, assuming axial symmetry in the mass distribution. We validated the method using a set of neurons generated with our NETMORPH simulator. The model-generated neurons were artificially sliced and subsequently recovered by the completion method. Depending on slice thickness and arbor extent, branches that have lost their parent branches outside the slice (orphan branches) may occur inside the slice. No longer connected to the contiguous structure of the sliced neuron, orphan branches result in an underestimation of neurite mass. For 300 μm thick slices, however, the validation showed a full recovery of dendritic and an almost full recovery of axonal mass. The completion method was applied to three experimental data sets of reconstructed rat cortical L2/3 pyramidal neurons. The results showed that in 300 μm thick slices intracortical axons lost about 50% and dendrites about 16% of their mass. The completion method can be applied to single-slice reconstructions as long as axial symmetry can be assumed in the mass distribution. This opens up the possibility of using incomplete neuronal reconstructions from open-access databases to determine population mean density fields.

  14. "Prospecting Asteroids: Indirect technique to estimate overall density and inner composition"

    Science.gov (United States)

    Such, Pamela

    2016-07-01

Spectroscopic studies of asteroids make it possible to obtain some information on their surface composition, but say little about the innermost material, porosity and density of the object. In addition, spectroscopic observations are affected by the effects of "space weathering" produced by the bombardment of charged particles, which for certain materials changes their chemical structure, albedo and other physical properties, partly altering the chances of identification. Data such as the mass, size and density of asteroids are essential when proposing space missions, in order to determine the best candidates for space exploration, and it is of great importance to estimate any of them a priori, remotely from Earth. For many years the masses of the largest asteroids have been determined by studying the gravitational effects they have on smaller asteroids that approach them (see Davis and Bender 1977; Schubart and Matson 1979; Scholl et al. 1987; Hoffman 1989b, among others), but estimating the masses of the smallest objects is limited to the effects produced in extremely close encounters with other asteroids of similar size. This paper presents the results of a search for close approaches of pairs of asteroids to within less than 0.0004 AU (50,000 km) of each other, in order to study their masses through the astrometric method and to estimate, in the future, their densities and internal composition. References: Davis, D. R., and D. F. Bender 1977. Asteroid mass determinations: search for further encounter opportunities. Bull. Am. Astron. Soc. 9, 502-503. Hoffman, M. 1989b. Asteroid mass determination: Present situation and perspectives. In Asteroids II (R. P. Binzel, T. Gehrels, and M. S. Matthews, Eds.), pp. 228-239. Univ. Arizona Press, Tucson. Scholl, H., L. D. Schmadel, and S. Röser 1987. The mass of the asteroid (10) Hygiea derived from observations of (829) Academia. Astron. Astrophys. 179, 311-316. Schubart, J., and D. L. Matson 1979. Masses and

  15. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High-resolution methods a...

  16. Testing for a constant coefficient of variation in nonparametric regression

    OpenAIRE

    Dette, Holger; Marchlewski, Mareen; Wagener, Jens

    2010-01-01

In the common nonparametric regression model $Y_i = m(X_i) + \sigma(X_i)\varepsilon_i$ we consider the problem of testing the hypothesis that the coefficient of variation, i.e. the ratio of the scale and location functions, is constant. The test is based on a comparison of the observations $Y_i/\hat{\sigma}(X_i)$ with their mean by a smoothed empirical process, where $\hat{\sigma}$ denotes the local linear estimate of the scale function. We show weak convergence of a centered version of this process to a Gaussian process under the null ...

  17. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb-Douglas or the Translog production function is used. However, the specification of a functional form for the production function involves the risk of specifying a functional form that is not similar to the "true" relationship between the inputs and the output. This misspecification might result in biased estimation results, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used...

  18. Efficient 3D movement-based kernel density estimator and application to wildlife ecology

    Science.gov (United States)

    Tracey-PR, Jeff; Sheppard, James K.; Lockwood, Glenn K.; Chourasia, Amit; Tatineni, Mahidhar; Fisher, Robert N.; Sinkovits, Robert S.

    2014-01-01

    We describe an efficient implementation of a 3D movement-based kernel density estimator for determining animal space use from discrete GPS measurements. This new method provides more accurate results, particularly for species that make large excursions in the vertical dimension. The downside of this approach is that it is much more computationally expensive than simpler, lower-dimensional models. Through a combination of code restructuring, parallelization and performance optimization, we were able to reduce the time to solution by up to a factor of 1000x, thereby greatly improving the applicability of the method.
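
    For contrast with the movement-based estimator described above, a plain product-Gaussian 3D KDE over GPS fixes takes only a few lines; the fixes below are synthetic stand-ins. Requires numpy and scipy.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(10)
        fixes = np.vstack([rng.normal(0, 300, 1_000),     # easting (m)
                           rng.normal(0, 300, 1_000),     # northing (m)
                           rng.normal(50, 20, 1_000)])    # altitude (m)

        kde = gaussian_kde(fixes)                         # bandwidth via Scott's rule
        u = kde([[0.0], [0.0], [50.0]])                   # utilization density at a point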

  19. Effects of Percent Tree Canopy Density and DEM Misregistration on SRTM/NED Vegetation Height Estimates

    Directory of Open Access Journals (Sweden)

    George Miliaresis

    2009-04-01

Full Text Available The U.S. National Elevation Dataset (NED) and the NLCD 2001 land cover data were used to test the correlation between SRTM elevation values and the height of evergreen forest vegetation in the Klamath Mountains of California. Vegetation height estimates (SRTM − NED) are valid only for two of the eight geographic directions (N, NE, E, SE, S, SW, W, NW), due to NED and SRTM grid data misregistration. Penetration depths of the SRTM radar were found to correlate linearly with tree percent canopy density.

  20. Cetacean Density Estimation from Novel Acoustic Datasets by Acoustic Propagation Modeling

    Science.gov (United States)

    2013-09-30

The approach to estimating the density of false killer whales off Kona, Hawai'i, is based on the works of Zimmer et al. (2008), Marques et al. (2009), and Küsel et al. (2011). The density estimator formula given by Marques et al. (2009) is applied here for the case of one (k = 1) sensor, yielding a single-sensor form of their density estimator. [Figure: average spectrum of 2124 manually labeled false killer whale clicks, calculated in 1 kHz band intervals from 0 to 90 kHz.]

  1. Recent Advances and Trends in Nonparametric Statistics

    CERN Document Server

    Akritas, MG

    2003-01-01

The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection of...

  2. Correlated Non-Parametric Latent Feature Models

    CERN Document Server

    Doshi-Velez, Finale

    2012-01-01

We are often interested in explaining data through a set of hidden factors or features. When the number of hidden features is unknown, the Indian Buffet Process (IBP) is a nonparametric latent feature model that does not bound the number of active features in a dataset. However, the IBP assumes that all latent features are uncorrelated, making it inadequate for many real-world problems. We introduce a framework for correlated nonparametric feature models, generalising the IBP. We use this framework to generate several specific models and demonstrate applications on real-world datasets.

  3. Estimation of population firing rates and current source densities from laminar electrode recordings.

    Science.gov (United States)

    Pettersen, Klas H; Hagen, Espen; Einevoll, Gaute T

    2008-06-01

This model study investigates the validity of methods used to interpret linear (laminar) multielectrode recordings. In computer experiments extracellular potentials from a synaptically activated population of about 1,000 pyramidal neurons are calculated using biologically realistic compartmental neuron models combined with electrostatic forward modeling. The somas of the pyramidal neurons are located in a 0.4 mm high and wide columnar cylinder, mimicking a stimulus-evoked layer-5 population in a neocortical column. Current-source density (CSD) analysis of the low-frequency part of the potentials (the local field potential, LFP) is found to give good estimates of the true underlying CSD. The high-frequency part (>750 Hz) of the potentials (multi-unit activity, MUA) is found to scale approximately as the population firing rate to the power 3/4 and to give excellent estimates of the underlying population firing rate for trial-averaged data. The MUA signal is found to decay much more sharply outside the columnar population than the LFP.
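
    A minimal sketch of the standard (double spatial derivative) CSD estimate used with laminar probes, which is the kind of estimator evaluated in the model study above; the conductivity, spacing and data are hypothetical.

        import numpy as np

        sigma = 0.3      # extracellular conductivity (S/m), assumed
        h = 100e-6       # electrode spacing (m), assumed

        rng = np.random.default_rng(11)
        lfp = rng.normal(size=(23, 4_000))     # channels x time samples (V), stand-in

        # Second difference along depth; the two edge channels are lost.
        csd = -sigma * (lfp[2:] - 2.0 * lfp[1:-1] + lfp[:-2]) / h**2   # A/m^3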

  4. A more appropriate white blood cell count for estimating malaria parasite density in Plasmodium vivax patients in northeastern Myanmar.

    Science.gov (United States)

    Liu, Huaie; Feng, Guohua; Zeng, Weilin; Li, Xiaomei; Bai, Yao; Deng, Shuang; Ruan, Yonghua; Morris, James; Li, Siman; Yang, Zhaoqing; Cui, Liwang

    2016-04-01

The conventional method of estimating parasite densities employs an assumption of 8000 white blood cells (WBCs)/μl. However, due to leucopenia in malaria patients, this number appears to overestimate parasite densities. In this study, we assessed the accuracy of parasite densities estimated using this assumed WBC count in eastern Myanmar, where Plasmodium vivax has become increasingly prevalent. From 256 patients with uncomplicated P. vivax malaria, we estimated parasite density and counted WBCs by using an automated blood cell counter. It was found that WBC counts were not significantly different between patients of different gender, axillary temperature, and body mass index levels, whereas they were significantly different between age groups of patients and the time points of measurement. The median parasite densities calculated with the actual WBC counts (1903/μl) and the assumed WBC count of 8000/μl (2570/μl) were significantly different. We demonstrated that using the assumed WBC count of 8000 cells/μl to estimate parasite densities of P. vivax malaria patients in this area would lead to an overestimation. For P. vivax patients aged five years and older, an assumed WBC count of 5500/μl best estimated parasite densities. This study provides more realistic assumed WBC counts for estimating parasite densities in P. vivax patients from low-endemicity areas of Southeast Asia.
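
    The arithmetic behind the comparison is the standard thick-film calculation: parasites are counted against 200 WBCs and scaled by the assumed (or measured) WBC count per microlitre; the microscopy count of 64 parasites below is hypothetical.

        parasites_per_200_wbc = 64          # hypothetical microscopy count
        for wbc_per_ul in (8000, 5500):     # conventional assumption vs. study's value
            density = parasites_per_200_wbc * wbc_per_ul / 200
            print(f"assumed WBC {wbc_per_ul}/µl -> {density:.0f} parasites/µl")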

  5. Management of deep brain stimulator battery failure: battery estimators, charge density, and importance of clinical symptoms.

    Directory of Open Access Journals (Sweden)

    Kaihan Fakhar

    Full Text Available OBJECTIVE: We aimed in this investigation to study deep brain stimulation (DBS) battery drain with special attention directed toward patient symptoms prior to and following battery replacement. BACKGROUND: Previously our group developed web-based calculators and smart phone applications to estimate DBS battery life (http://mdc.mbi.ufl.edu/surgery/dbs-battery-estimator). METHODS: A cohort of 320 patients undergoing DBS battery replacement from 2002-2012 was included in an IRB-approved study. Statistical analysis was performed using SPSS 20.0 (IBM, Armonk, NY). RESULTS: The mean charge density for treatment of Parkinson's disease was 7.2 µC/cm²/phase (SD = 3.82), for dystonia was 17.5 µC/cm²/phase (SD = 8.53), for essential tremor was 8.3 µC/cm²/phase (SD = 4.85), and for OCD was 18.0 µC/cm²/phase (SD = 4.35). There was a significant relationship between charge density and battery life (r = -.59, p<.001), as well as between total power and battery life (r = -.64, p<.001). The UF estimator (r = .67, p<.001) and the Medtronic helpline (r = .74, p<.001) predictions of battery life were significantly positively associated with actual battery life. Battery status indicators on Soletra and Kinetra were poor predictors of battery life. In 38 cases, the symptoms improved following a battery change, suggesting that the neurostimulator was likely responsible for symptom worsening. For these cases, both the UF estimator and the Medtronic helpline were significantly correlated with battery life (r = .65 and r = .70, respectively, both p<.001). CONCLUSIONS: Battery estimations, charge density, total power and clinical symptoms were important factors. The observation of clinical worsening that was rescued following neurostimulator replacement reinforces the notion that changes in clinical symptoms can be associated with battery drain.

  6. Constrained Kalman Filtering Via Density Function Truncation for Turbofan Engine Health Estimation

    Science.gov (United States)

    Simon, Dan; Simon, Donald L.

    2006-01-01

    Kalman filters are often used to estimate the state variables of a dynamic system. However, in the application of Kalman filters some known signal information is often either ignored or dealt with heuristically. For instance, state variable constraints (which may be based on physical considerations) are often neglected because they do not fit easily into the structure of the Kalman filter. This paper develops an analytic method of incorporating state variable inequality constraints in the Kalman filter. The resultant filter truncates the PDF (probability density function) of the Kalman filter estimate at the known constraints and then computes the constrained filter estimate as the mean of the truncated PDF. The incorporation of state variable constraints increases the computational effort of the filter but significantly improves its estimation accuracy. The improvement is demonstrated via simulation results obtained from a turbofan engine model. The turbofan engine model contains 3 state variables, 11 measurements, and 10 component health parameters. It is also shown that the truncated Kalman filter may be a more accurate way of incorporating inequality constraints than other constrained filters (e.g., the projection approach to constrained filtering).
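
    In one dimension the core step reduces to taking the mean of a truncated normal distribution. A minimal sketch with SciPy (scalar case only; the paper's filter handles the full multivariate state):

    ```python
    from scipy.stats import truncnorm

    def constrained_estimate(mean, std, lower, upper):
        """Mean of a Gaussian state estimate after truncating its PDF to
        the known inequality constraints [lower, upper]."""
        a, b = (lower - mean) / std, (upper - mean) / std  # standardized bounds
        return truncnorm.mean(a, b, loc=mean, scale=std)

    # An unconstrained estimate of -0.5 (std 1.0) for a state known to be
    # non-negative is pulled to roughly 0.64, inside the feasible region.
    print(constrained_estimate(-0.5, 1.0, 0.0, float("inf")))
    ```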

  7. Density-preserving sampling: robust and efficient alternative to cross-validation for error estimation.

    Science.gov (United States)

    Budka, Marcin; Gabrys, Bogdan

    2013-01-01

    Estimation of the generalization ability of a classification or regression model is an important issue, as it indicates the expected performance on previously unseen data and is also used for model selection. Currently used generalization error estimation procedures, such as cross-validation (CV) or bootstrap, are stochastic and, thus, require multiple repetitions in order to produce reliable results, which can be computationally expensive, if not prohibitive. The correntropy-inspired density-preserving sampling (DPS) procedure proposed in this paper eliminates the need for repeating the error estimation procedure by dividing the available data into subsets that are guaranteed to be representative of the input dataset. This allows the production of low-variance error estimates with an accuracy comparable to 10 times repeated CV at a fraction of the computations required by CV. This method can also be used for model ranking and selection. This paper derives the DPS procedure and investigates its usability and performance using a set of public benchmark datasets and standard classifiers.
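
    The paper defines its own correntropy-inspired procedure; purely as an illustration of the density-preserving idea (and not the authors' DPS algorithm), one can rank points by an estimated density and deal them round-robin into folds so that every fold spans the full density range:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def density_ranked_folds(X, k=2, seed=None):
        """Split X (n_samples, n_features) into k density-balanced folds."""
        rng = np.random.default_rng(seed)
        density = gaussian_kde(X.T)(X.T)         # density score per point
        order = np.argsort(density)              # sparse regions first
        folds = [order[i::k] for i in range(k)]  # deal out round-robin
        return [rng.permutation(fold) for fold in folds]

    X = np.random.default_rng(1).normal(size=(200, 2))
    train_idx, test_idx = density_ranked_folds(X, k=2, seed=1)
    ```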

  8. Structuring feature space: a non-parametric method for volumetric transfer function generation.

    Science.gov (United States)

    Maciejewski, Ross; Woo, Insoo; Chen, Wei; Ebert, David S

    2009-01-01

    The use of multi-dimensional transfer functions for direct volume rendering has been shown to be an effective means of extracting materials and their boundaries for both scalar and multivariate data. The most common multi-dimensional transfer function consists of a two-dimensional (2D) histogram with axes representing a subset of the feature space (e.g., value vs. value gradient magnitude), with each entry in the 2D histogram being the number of voxels at a given feature space pair. Users then assign color and opacity to the voxel distributions within the given feature space through the use of interactive widgets (e.g., box, circular, triangular selection). Unfortunately, such tools lead users through a trial-and-error approach as they assess which data values within the feature space map to a given area of interest within the volumetric space. In this work, we propose the addition of non-parametric clustering within the transfer function feature space in order to extract patterns and guide transfer function generation. We apply a non-parametric kernel density estimation to group voxels of similar features within the 2D histogram. These groups are then binned and colored based on their estimated density, and the user may interactively grow and shrink the binned regions to explore feature boundaries and extract regions of interest. We also extend this scheme to temporal volumetric data in which time steps of 2D histograms are composited into a histogram volume. A three-dimensional (3D) density estimation is then applied, and users can explore regions within the feature space across time without adjusting the transfer function at each time step. Our work enables users to effectively explore the structures found within a feature space of the volume and provide a context in which the user can understand how these structures relate to their volumetric data. We provide tools for enhanced exploration and manipulation of the transfer function, and we show that the initial
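
    A minimal sketch of the feature-space density estimation step on a synthetic volume (all names are illustrative; a real pipeline would bin and color the density modes and expose interactive widgets on top of this):

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    volume = rng.normal(size=(32, 32, 32))           # stand-in scalar volume

    # Classic 2D feature space: data value vs. gradient magnitude per voxel.
    value = volume.ravel()
    grad_mag = np.linalg.norm(np.gradient(volume), axis=0).ravel()
    features = np.vstack([value, grad_mag])[:, ::8]  # subsample for speed

    kde = gaussian_kde(features)                     # non-parametric density
    vi, gi = np.mgrid[value.min():value.max():64j, 0:grad_mag.max():64j]
    density = kde(np.vstack([vi.ravel(), gi.ravel()])).reshape(64, 64)
    # High-density regions of `density` suggest material clusters in the
    # 2D histogram that can be grown or shrunk interactively.
    ```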

  9. The Wegner Estimate and the Integrated Density of States for some Random Operators

    Indian Academy of Sciences (India)

    J M Combes; P D Hislop; Frédéric Klopp; Shu Nakamura

    2002-02-01

    The integrated density of states (IDS) for random operators is an important function describing many physical characteristics of a random system. Properties of the IDS are derived from the Wegner estimate that describes the influence of finite-volume perturbations on a background system. In this paper, we present a simple proof of the Wegner estimate applicable to a wide variety of random perturbations of deterministic background operators. The proof yields the correct volume dependence of the upper bound. This implies the local Hölder continuity of the integrated density of states at energies in the unperturbed spectral gap. The proof depends on the $L^p$-theory of the spectral shift function (SSF), for $p \geq 1$, applicable to pairs of self-adjoint operators whose difference is in the trace ideal $\mathcal{I}_p$, for $0 < p \leq 1$. We present this and other results on the SSF due to other authors. Under an additional condition on the single-site potential, local Hölder continuity is proved at all energies. Finally, we present extensions of this work to random potentials with non-sign-definite single-site potentials.
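
    For orientation, a Wegner estimate in its most common form bounds the expected number of eigenvalues of the finite-volume restriction $H_\Lambda$ in an energy interval $I$:

    $$\mathbb{E}\big[\mathrm{tr}\,\chi_I(H_\Lambda)\big] \;\le\; C_W\,|I|^\alpha\,|\Lambda|, \qquad 0 < \alpha \le 1,$$

    where $\chi_I(H_\Lambda)$ is the spectral projection onto $I$. The linear dependence on the volume $|\Lambda|$ is what allows dividing by $|\Lambda|$ in the thermodynamic limit, and the modulus of continuity in $|I|$ translates into the (local Hölder) continuity of the IDS discussed above.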

  10. Power spectral density of velocity fluctuations estimated from phase Doppler data

    Directory of Open Access Journals (Sweden)

    Jicha Miroslav

    2012-04-01

    Full Text Available Laser Doppler Anemometry (LDA) and its modifications, such as Phase Doppler Particle Anemometry (P/DPA), are point-wise methods for optical non-intrusive measurement of particle velocity with a high data rate. Conversion of the LDA velocity data from the temporal to the frequency domain, i.e. calculation of the power spectral density (PSD) of velocity fluctuations, is a non-trivial task due to non-equidistant sampling in time. We briefly discuss possibilities for PSD estimation and specify limitations caused by seeding density and other factors of the flow and the LDA setup. Selected results of LDA measurements are compared with corresponding Hot Wire Anemometry (HWA) data in the frequency domain. The slot correlation (SC) method, implemented in the software program Kern by Nobach (2006), is used for the PSD estimation. The influence of several input parameters on the resulting PSDs is described, and the optimum setup of the software for our data of particle-laden air flow in a realistic human airway model is documented. The typical character of the flow is described using PSD plots of velocity fluctuations, with comments on specific properties of the flow. Some recommendations for improving future experiments to acquire better PSD results are given.
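
    As a rough illustration of the slot-correlation idea (implemented far more carefully in Nobach's Kern), pairwise products of velocity fluctuations are averaged over discrete lag-time slots to form an autocorrelation estimate, and the PSD follows from a cosine-type transform. A naive O(N²) sketch, without the local normalization and self-product handling that real implementations add:

    ```python
    import numpy as np

    def slot_correlation_psd(t, u, max_lag, n_slots):
        """Naive slot-correlation PSD estimate for non-equidistant samples."""
        up = u - u.mean()                        # velocity fluctuations
        dtau = max_lag / n_slots                 # slot width [s]
        lags = np.abs(t[:, None] - t[None, :])   # all pairwise lag times
        prods = up[:, None] * up[None, :]        # all pairwise products
        idx = (lags / dtau).astype(int)          # slot index of each pair
        mask = idx < n_slots
        sums = np.zeros(n_slots)
        counts = np.zeros(n_slots)
        np.add.at(sums, idx[mask], prods[mask])
        np.add.at(counts, idx[mask], 1.0)
        acf = np.where(counts > 0, sums / np.maximum(counts, 1.0), 0.0)
        # One-sided PSD via an FFT of the symmetrically extended (even) ACF.
        freqs = np.fft.rfftfreq(2 * n_slots, d=dtau)
        psd = 2.0 * dtau * np.fft.rfft(np.concatenate([acf, acf[::-1]])).real
        return freqs, psd

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 10.0, 1000))    # irregular sample times [s]
    u = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.normal(size=t.size)
    freqs, psd = slot_correlation_psd(t, u, max_lag=1.0, n_slots=200)
    ```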

  11. Error estimates for density-functional theory predictions of surface energy and work function

    Science.gov (United States)

    De Waele, Sam; Lejaeghere, Kurt; Sluydts, Michael; Cottenier, Stefaan

    2016-12-01

    Density-functional theory (DFT) predictions of materials properties are becoming ever more widespread. With increased use comes the demand for estimates of the accuracy of DFT results. In view of the importance of reliable surface properties, this work calculates surface energies and work functions for a large and diverse test set of crystalline solids. They are compared to experimental values by performing a linear regression, which results in a measure of the predictable and material-specific error of the theoretical result. Two of the most prevalent functionals, the local density approximation (LDA) and the Perdew-Burke-Ernzerhof parametrization of the generalized gradient approximation (PBE-GGA), are evaluated and compared. Both LDA and PBE-GGA are found to yield accurate work functions with error bars below 0.3 eV, rivaling the experimental precision. LDA also provides satisfactory estimates for the surface energy with error bars smaller than 10%, but PBE-GGA significantly underestimates the surface energy for materials with a large correlation energy.
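
    The error-estimation idea itself is a plain linear regression: fit experimental values against DFT predictions, use the fit as a systematic correction, and read the error bar off the residual scatter. A sketch with made-up numbers (not data from the paper):

    ```python
    import numpy as np

    dft_work_fn = np.array([4.3, 4.6, 5.1, 4.0, 4.8])  # eV, hypothetical
    exp_work_fn = np.array([4.4, 4.5, 5.2, 4.1, 4.9])  # eV, hypothetical

    slope, intercept = np.polyfit(dft_work_fn, exp_work_fn, 1)
    corrected = slope * dft_work_fn + intercept
    error_bar = np.std(exp_work_fn - corrected, ddof=2)  # 2 fitted parameters
    print(f"corrected = {slope:.2f}*DFT + {intercept:.2f}, "
          f"error bar ~ {error_bar:.2f} eV")
    ```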

  12. Density and Biomass Estimates by Removal for an Amazonian Crocodilian, Paleosuchus palpebrosus.

    Directory of Open Access Journals (Sweden)

    Zilca Campos

    Full Text Available Direct counts of crocodilians are rarely feasible, and it is difficult to meet the assumptions of mark-recapture methods for most species in most habitats. Catch-out experiments are also usually not logistically or morally justifiable, because it would be necessary to destroy the habitat in order to be confident that most individuals had been captured. We took advantage of the draining and filling of a large area of flooded forest during the building of the Santo Antônio dam on the Madeira River to obtain accurate estimates of the density and biomass of Paleosuchus palpebrosus. The density, 28.4 non-hatchling individuals per km², is one of the highest reported for any crocodilian, except for species that are temporarily concentrated in small areas during dry-season drought. The biomass estimate of 63.15 kg/km² is higher than that for most, or even all, mammalian carnivores in tropical forest. P. palpebrosus may be one of the world's most abundant crocodilians.

  13. A Semianalytical Model Using MODIS Data to Estimate Cell Density of Red Tide Algae (Aureococcus anophagefferens).

    Directory of Open Access Journals (Sweden)

    Lingling Jiang

    2016-01-01

    Full Text Available A multiband and a single-band semianalytical model were developed to predict algae cell density distribution. The models were based on cell density (N)-dependent parameterizations of the spectral backscattering coefficients, bb(λ), obtained from in situ measurements. There was a strong relationship between bb(λ) and N, with a minimum regression coefficient of 0.97 at 488 nm and a maximum value of 0.98 at other bands. The cell density calculated by the multiband inversion model was similar to the field measurements of the coastal waters (the average relative error was only 8.9%), but it could not accurately discern the red tide from mixed pixels, and this led to overestimation of the area affected by the red tide. While the single-band inversion model is less precise than the former model in high-chlorophyll water, it could eliminate the impact of the suspended sediments and make more accurate estimates of the red tide area. We concluded that the two models both have advantages and disadvantages; these methods lay the foundation for developing a remote sensing forecasting system for red tides.
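
    A hedged sketch of the single-band inversion idea: if the backscattering coefficient follows a power law in cell density, bb(488) = a·N^b, the density is recovered by inverting that relation. The coefficients below are placeholders, not the paper's fitted values:

    ```python
    a, b = 1.2e-7, 0.9   # hypothetical power-law coefficients for bb(488)

    def cell_density(bb_488):
        """Invert a satellite-derived backscattering coefficient to N."""
        return (bb_488 / a) ** (1.0 / b)

    print(cell_density(2.4e-3))  # cells/ml for a hypothetical bb value
    ```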

  14. Diurnal and seasonal variability in radial distribution of sap flux density: Implications for estimating stand transpiration.

    Science.gov (United States)

    Fiora, Alessandro; Cescatti, Alessandro

    2006-09-01

    Daily and seasonal patterns in radial distribution of sap flux density were monitored in six trees differing in social position in a mixed coniferous stand dominated by silver fir (Abies alba Miller) and Norway spruce (Picea abies (L.) Karst) in the Alps of northeastern Italy. Radial distribution of sap flux was measured with arrays of 1-cm-long Granier probes. The radial profiles were either Gaussian or decreased monotonically toward the tree center, and seemed to be related to social position and crown distribution of the trees. The ratio between sap flux estimated with the most external sensor and the mean flux, weighted with the corresponding annulus areas, was used as a correction factor (CF) to express diurnal and seasonal radial variation in sap flow. During sunny days, the diurnal radial profile of sap flux changed with time and accumulated photosynthetically active radiation (PAR), with an increasing contribution of sap flux in the inner sapwood during the day. Seasonally, the contribution of sap flux in the inner xylem increased with daily cumulative PAR, and the variation of CF was proportional to the tree diameter, ranging from 29% for suppressed trees up to 300% for dominant trees. Two models were developed, relating CF with PAR and tree diameter at breast height (DBH), to correct daily and seasonal estimates of whole-tree and stand sap flow obtained by assuming uniform sap flux density over the sapwood. If the variability in the radial profile of sap flux density was not accounted for, total stand transpiration would be overestimated by 32% during sunny days and 40% for the entire season.
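
    The correction factor described here is straightforward to compute once per-annulus fluxes are available; a sketch with hypothetical numbers (the sensor layout and radii are assumptions, not the study's data):

    ```python
    import numpy as np

    def sap_flow_correction_factor(flux_by_annulus, radii):
        """CF = flux at the outermost sensor / area-weighted mean flux.

        flux_by_annulus : sap flux density per annulus, outermost first
        radii           : annulus boundary radii, bark to pith, length n+1 [m]
        """
        flux = np.asarray(flux_by_annulus, dtype=float)
        r = np.asarray(radii, dtype=float)
        areas = np.pi * (r[:-1] ** 2 - r[1:] ** 2)   # annulus areas
        return flux[0] / np.average(flux, weights=areas)

    # Outer sensor reads 40 units while the inner xylem carries less flux:
    print(sap_flow_correction_factor([40, 30, 15], [0.15, 0.10, 0.05, 0.0]))
    ```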

  15. Effects of LiDAR point density, sampling size and height threshold on estimation accuracy of crop biophysical parameters.

    Science.gov (United States)

    Luo, Shezhou; Chen, Jing M; Wang, Cheng; Xi, Xiaohuan; Zeng, Hongcheng; Peng, Dailiang; Li, Dong

    2016-05-30

    Vegetation leaf area index (LAI), height, and aboveground biomass are key biophysical parameters. Corn is an important and globally distributed crop, and reliable estimations of these parameters are essential for corn yield forecasting, health monitoring and ecosystem modeling. Light Detection and Ranging (LiDAR) is considered an effective technology for estimating vegetation biophysical parameters. However, the estimation accuracies of these parameters are affected by multiple factors. In this study, we first estimated corn LAI, height and biomass (R² = 0.80, 0.874 and 0.838, respectively) using the original LiDAR data (7.32 points/m²), and the results showed that LiDAR data could accurately estimate these biophysical parameters. Second, comprehensive research was conducted on the effects of LiDAR point density, sampling size and height threshold on the estimation accuracy of LAI, height and biomass. Our findings indicated that LiDAR point density had an important effect on the estimation accuracy for vegetation biophysical parameters; however, high point density did not always produce highly accurate estimates, and reduced point density could deliver reasonable estimation results. Furthermore, the results showed that sampling size and height threshold were additional key factors that affect the estimation accuracy of biophysical parameters. Therefore, the optimal sampling size and the height threshold should be determined to improve the estimation accuracy of biophysical parameters. Our results also implied that a higher LiDAR point density, larger sampling size and height threshold were required to obtain accurate corn LAI estimation when compared with height and biomass estimations. In general, our results provide valuable guidance for LiDAR data acquisition and estimation of vegetation biophysical parameters using LiDAR data.

  16. Line Transect and Triangulation Surveys Provide Reliable Estimates of the Density of Kloss' Gibbons (Hylobates klossii) on Siberut Island, Indonesia.

    Science.gov (United States)

    Höing, Andrea; Quinten, Marcel C; Indrawati, Yohana Maria; Cheyne, Susan M; Waltert, Matthias

    2013-02-01

    Estimating population densities of key species is crucial for many conservation programs. Density estimates provide baseline data and enable monitoring of population size. Several different survey methods are available, and the choice of method depends on the species and study aims. Few studies have compared the accuracy and efficiency of different survey methods for large mammals, particularly for primates. Here we compare estimates of density and abundance of Kloss' gibbons (Hylobates klossii) using two of the most common survey methods: line transect distance sampling and triangulation. Line transect surveys (survey effort: 155.5 km) produced a total of 101 auditory and visual encounters and a density estimate of 5.5 gibbon clusters (groups or subgroups of primate social units)/km². Triangulation conducted from 12 listening posts during the same period revealed a similar density estimate of 5.0 clusters/km². Coefficients of variation of cluster density estimates were slightly higher from triangulation (0.24) than from line transects (0.17), resulting in somewhat lower precision in detecting changes in cluster densities with triangulation. Given the similarity of the two estimates, the triangulation method also may be appropriate.
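
    For reference, the basic distance-sampling estimator behind line transect surveys is D = n / (2·L·ESW). The sketch below assumes a half-normal detection function fitted by maximum likelihood, which is only one of several standard choices and not necessarily the study's exact analysis:

    ```python
    import numpy as np

    def line_transect_density(perp_distances_km, total_line_km):
        """Density from perpendicular sighting distances, half-normal g(x)."""
        x = np.asarray(perp_distances_km, dtype=float)
        sigma = np.sqrt(np.mean(x ** 2))      # MLE of the half-normal scale
        esw = sigma * np.sqrt(np.pi / 2.0)    # effective strip (half-)width
        return len(x) / (2.0 * total_line_km * esw)

    # Synthetic example sized like the survey above: 101 encounters, 155.5 km.
    rng = np.random.default_rng(3)
    x = np.abs(rng.normal(scale=0.05, size=101))   # perpendicular distances
    print(line_transect_density(x, 155.5))         # clusters per km^2
    ```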

  17. Comparing adaptive and fixed bandwidth-based kernel density estimates in spatial cancer epidemiology.

    Science.gov (United States)

    Lemke, Dorothea; Mattauch, Volkmar; Heidinger, Oliver; Pebesma, Edzer; Hense, Hans-Werner

    2015-03-31

    Monitoring spatial disease risk (e.g. identifying risk areas) is of great relevance in public health research, especially in cancer epidemiology. A common strategy uses case-control studies and estimates a spatial relative risk function (sRRF) via kernel density estimation (KDE). This study was set up to evaluate the sRRF estimation methods, comparing fixed with adaptive bandwidth-based KDE, and how they were able to detect 'risk areas' with case data from a population-based cancer registry. The sRRFs were estimated within a defined area, using locational information on incident cancer cases and on a spatial sample of controls, drawn from a high-resolution population grid recognized as underestimating the resident population in urban centers. The spatial extensions of these areas with underestimated resident population were quantified with population reference data and used in this study as 'true risk areas'. Sensitivity and specificity analyses were conducted by spatial overlay of the 'true risk areas' and the significant (α=.05) p-contour lines obtained from the sRRF. We observed that the fixed bandwidth-based sRRF was distinguished by a conservative behavior in identifying these urban 'risk areas', that is, a reduced sensitivity but increased specificity due to oversmoothing as compared to the adaptive risk estimator. In contrast, the latter appeared more competitive through variance stabilization, resulting in a higher sensitivity, while the specificity was equal as compared to the fixed risk estimator. Halving the originally determined bandwidths led to a simultaneous improvement of sensitivity and specificity of the adaptive sRRF, while the specificity was reduced for the fixed estimator. The fixed risk estimator thus combines an oversmoothing tendency in urban areas with an overestimation of risk in rural areas. The use of an adaptive bandwidth regime attenuated this pattern but led, in general, to a higher false positive rate because, in our study design
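
    The fixed-bandwidth variant of the sRRF is simply the (log) ratio of case and control kernel density estimates evaluated on a grid; a minimal sketch with synthetic coordinates (the adaptive variant compared in the study rescales the bandwidth locally instead):

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def log_relative_risk(cases, controls, grid_xy):
        """Log ratio of case and control KDEs; > 0 where cases concentrate."""
        f_case = gaussian_kde(cases.T)(grid_xy)      # case density
        f_ctrl = gaussian_kde(controls.T)(grid_xy)   # control density
        return np.log(f_case / f_ctrl)

    rng = np.random.default_rng(7)
    cases = rng.normal(loc=[0.4, 0.0], scale=0.8, size=(150, 2))
    controls = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(400, 2))
    gx, gy = np.mgrid[-2:2:40j, -2:2:40j]
    rr = log_relative_risk(cases, controls,
                           np.vstack([gx.ravel(), gy.ravel()])).reshape(40, 40)
    ```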

  18. Thirty years of nonparametric item response theory

    NARCIS (Netherlands)

    Molenaar, W.

    2001-01-01

    Relationships between a mathematical measurement model and its real-world applications are discussed. A distinction is made between large data matrices commonly found in educational measurement and smaller matrices found in attitude and personality measurement. Nonparametric methods are evaluated for both settings.

  19. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  20. How Are Teachers Teaching? A Nonparametric Approach

    Science.gov (United States)

    De Witte, Kristof; Van Klaveren, Chris

    2014-01-01

    This paper examines which configuration of teaching activities maximizes student performance. For this purpose a nonparametric efficiency model is formulated that accounts for (1) self-selection of students and teachers in better schools and (2) complementary teaching activities. The analysis distinguishes both individual teaching (i.e., a…