Orthogonal projections and bootstrap resampling procedures in the study of infraspecific variation
Directory of Open Access Journals (Sweden)
Luiza Carla Duarte
1998-12-01
The effect of an increase in quantitative continuous characters resulting from indeterminate growth upon the analysis of population differentiation was investigated using, as an example, a set of continuous characters measured as distance variables in 10 populations of a rodent species. The data before and after correction for allometric size effects using orthogonal projections were analyzed with a parametric bootstrap resampling procedure applied to canonical variate analysis. The variance component of the distance measures attributable to indeterminate growth within the populations was found to be substantial, although the ordination of the populations was not affected, as evidenced by the relative and absolute positions of the centroids. The covariance pattern of the distance variables used to infer the nature of the morphological differences was strongly influenced by indeterminate growth. The uncorrected data produced a misleading picture of morphological differentiation by indicating that groups of populations differed in size. However, the data corrected for allometric effects clearly demonstrated that populations differed morphologically in both size and shape. These results are discussed in terms of the analysis of morphological differentiation among populations and the definition of infraspecific geographic units.
Directory of Open Access Journals (Sweden)
Heussen Nicole
2010-03-01
Background: Various perinatal factors influencing neuromotor development are known from cross-sectional studies. The factors influencing the age at which distinct abilities are acquired are uncertain. We hypothesized that the Cox regression model might identify these factors. Methods: Neonates treated at Aachen University Hospital in 2000/2001 were identified retrospectively (n = 796). Outcome data, based on a structured interview, were available for 466 children, as were perinatal data. Factors possibly related to outcome were identified by bootstrap selection and then included in a multivariate Cox regression model. To evaluate whether the parental assessment might change with the time elapsed since birth, we studied five age cohorts of 163 normally developed children. Results: Birth weight, gestational age, congenital cardiac disease and periventricular leukomalacia were significantly related to outcome in the multivariate analysis. Conclusions: Combined application of the bootstrap resampling procedure and multivariate Cox regression analysis effectively identifies perinatal factors influencing the age at which distinct abilities are acquired. These were similar to those known from previous cross-sectional studies. Retrospective data acquisition may lead to bias because parental memories change with time. This suggests applying this statistical approach in larger prospective trials.
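The bootstrap selection step can be sketched as follows. This is a minimal illustration, not the authors' pipeline: it screens covariates by a simple correlation rule (the rule, the threshold, and all function names are assumptions made for the sketch) and reports how often each covariate survives across bootstrap resamples; covariates with a high selection frequency would then enter the multivariate Cox model.

```python
import random

def pearson(x, y):
    """Pearson correlation; returns 0.0 for a constant column."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    if sxx == 0 or syy == 0:
        return 0.0
    return sxy / (sxx * syy) ** 0.5

def bootstrap_selection_frequency(rows, outcome_idx, threshold=0.3,
                                  n_boot=200, seed=1):
    """For each covariate, count the fraction of bootstrap resamples in
    which it passes a screening rule (|correlation with outcome| > threshold)."""
    rng = random.Random(seed)
    n = len(rows)
    cov_cols = [j for j in range(len(rows[0])) if j != outcome_idx]
    counts = [0] * len(cov_cols)
    for _ in range(n_boot):
        sample = [rows[rng.randrange(n)] for _ in range(n)]
        y = [r[outcome_idx] for r in sample]
        for k, j in enumerate(cov_cols):
            if abs(pearson([r[j] for r in sample], y)) > threshold:
                counts[k] += 1
    return [c / n_boot for c in counts]
```

A covariate that drives the outcome should be selected in nearly every resample, while pure noise columns rarely pass the threshold.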
Assessing Uncertainty in LULC Classification Accuracy by Using Bootstrap Resampling
Directory of Open Access Journals (Sweden)
Lin-Hsuan Hsiao
2016-08-01
Supervised land-use/land-cover (LULC) classifications are typically conducted using class assignment rules derived from a set of multiclass training samples. Consequently, classification accuracy varies with the training data set and is thus associated with uncertainty. In this study, we propose a bootstrap resampling and reclassification approach that can be applied to assess not only the uncertainty in the classification results of the bootstrap training data sets, but also the classification uncertainty of individual pixels in the study area. Two measures of pixel-specific classification uncertainty, namely the maximum class probability and the Shannon entropy, were derived from the class probability vector of individual pixels and used for the identification of unclassified pixels. Unclassified pixels that are identified using the traditional chi-square threshold technique represent outliers of individual LULC classes, but they are not necessarily associated with higher classification uncertainty. By contrast, unclassified pixels identified using the equal-likelihood technique are associated with higher classification uncertainty, and they mostly occur on or near the borders of different land-cover classes.
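The two pixel-level uncertainty measures are straightforward to compute from bootstrap class votes. A minimal sketch (the function name and the vote encoding are assumptions for illustration): each pixel's class probability vector is estimated from the labels assigned by classifiers trained on the bootstrap-resampled training sets.

```python
import math
from collections import Counter

def pixel_uncertainty(votes):
    """Given the class labels assigned to one pixel by classifiers trained
    on B bootstrap-resampled training sets, return (maximum class
    probability, Shannon entropy in bits) of the empirical class
    probability vector."""
    counts = Counter(votes)
    total = len(votes)
    probs = [c / total for c in counts.values()]
    p_max = max(probs)
    entropy = -sum(p * math.log2(p) for p in probs)
    return p_max, entropy
```

A pixel voted "water" 9 times out of 10 has high maximum class probability and low entropy; a pixel whose votes are spread evenly across classes has maximal entropy and would be flagged as uncertain.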
A nonparametric hypothesis test via the Bootstrap resampling
Temel, Tugrul
2011-01-01
This paper adapts an existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test allows approximation of the errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica code for the test algorithm.
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
Genetic divergence among cupuaçu accessions by multiscale bootstrap resampling
Directory of Open Access Journals (Sweden)
Vinicius Silva dos Santos
2015-06-01
This study aimed at investigating the genetic divergence of eighteen accessions of cupuaçu trees based on fruit morphometric traits, and at comparing the usual methods of cluster analysis with the proposed multiscale bootstrap resampling methodology. The data were obtained from an experiment conducted in Tomé-Açu city (PA, Brazil), arranged in a completely randomized design with eighteen cupuaçu accessions and 10 repetitions, from 2004 to 2011. Genetic parameters were estimated by the restricted maximum likelihood/best linear unbiased prediction (REML/BLUP) methodology. The predicted breeding values were used in the study of genetic divergence through Unweighted Pair Group Method with Arithmetic Mean (UPGMA) hierarchical clustering and Tocher's optimization method based on the standardized Euclidean distance. Clustering consistency and the optimal number of clusters in the UPGMA method were verified by the cophenetic correlation coefficient (CCC) and Mojena's criterion, respectively, in addition to the multiscale bootstrap resampling technique. The UPGMA clustering method with and without multiscale bootstrap resulted in four and five clusters, respectively, while Tocher's method resulted in seven clusters. The multiscale bootstrap resampling technique proved efficient for assessing the consistency of clustering in hierarchical methods and, consequently, the optimal number of clusters.
Bootstrap re-sampling and cross-validation for neural network learning
Dupret, Georges; Koda, Masato
2000-01-01
A technical framework is presented to assess the impact of re-sampling on the ability of a neural network to correctly learn a classification problem. We use the bootstrap expression of the prediction error to identify the optimal re-sampling proportions in a numerical experiment with binary classes and propose a new, simple method to estimate this optimal proportion. Upper and lower bounds for the optimal proportion are derived based on Bayes' decision rule. The analytical considerations to ...
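The bootstrap estimate of prediction error rests on out-of-bag evaluation: train on each resample, test on the items the resample missed. A toy sketch with a 1-nearest-neighbour rule standing in for the neural network (the classifier, data shape, and names are assumptions, not the paper's setup):

```python
import random

def oob_error(data, labels, classify, n_boot=100, seed=0):
    """Bootstrap (out-of-bag) estimate of prediction error: train on each
    bootstrap resample, test on the items left out of that resample."""
    rng = random.Random(seed)
    n = len(data)
    errs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        oob = set(range(n)) - set(idx)
        if not oob:
            continue
        train = [(data[i], labels[i]) for i in idx]
        wrong = sum(classify(train, data[i]) != labels[i] for i in oob)
        errs.append(wrong / len(oob))
    return sum(errs) / len(errs)

def nn1(train, x):
    """1-nearest-neighbour rule on scalar features."""
    return min(train, key=lambda t: abs(t[0] - x))[1]
```

On well-separated classes the out-of-bag error is close to zero; the resampling proportion of the training set could then be varied to trace its effect on this estimate.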
A new resampling method for sampling designs without replacement: the doubled half bootstrap
Antal, Erika; Tillé, Yves
2016-01-01
A new and very fast method of bootstrap for sampling without replacement from a finite population is proposed. This method can be used to estimate the variance in sampling with unequal inclusion probabilities and does not require artificial populations or utilization of bootstrap weights. The bootstrap samples are directly selected from the original sample. The bootstrap procedure contains two steps: in the first step, units are selected once with Poisson sampling using the same inclusion pro...
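The Poisson-sampling first step is simple to state in code. A sketch under the assumption that one inclusion probability is given per sampled unit (the function name is ours):

```python
import random

def poisson_sample(pik, rng):
    """Poisson sampling: include each unit independently with its own
    inclusion probability pik[i]. In the doubled half bootstrap this step
    is applied directly to the original sample, so no artificial
    population needs to be constructed."""
    return [i for i, p in enumerate(pik) if rng.random() < p]
```

The realized sample size is random, concentrated around the sum of the inclusion probabilities, and units with higher probabilities appear more often, which is what allows variance estimation under unequal inclusion probabilities.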
International Nuclear Information System (INIS)
We report on a broader evaluation of statistical bootstrap resampling methods as a tool for pixel-level calibration and imaging fidelity assessment in radio interferometry. Pixel-level imaging fidelity assessment is a challenging problem, important for the value it holds in robust scientific interpretation of interferometric images, enhancement of automated pipeline reduction systems needed to broaden the user community for these instruments, and understanding leading-edge direction-dependent calibration and imaging challenges for future telescopes such as the Square Kilometre Array. This new computational approach is now possible because of advances in statistical resampling for data with long-range dependence and the available performance of contemporary high-performance computing resources. We expand our earlier numerical evaluation to span a broader domain subset in simulated image fidelity and source brightness distribution morphologies. As before, we evaluate the statistical performance of the bootstrap resampling methods against direct Monte Carlo simulation. We find that both model-based and subsample bootstrap methods continue to show significant promise for the challenging problem of interferometric imaging fidelity assessment when evaluated over the broader domain subset. We report on their measured statistical performance and guidelines for their use and application in practice. We also examine the performance of the underlying polarization self-calibration algorithm used in this study over a range of parallactic angle coverage.
Pareto, Deborah; Aguiar, Pablo; Pavía, Javier; Gispert, Juan Domingo; Cot, Albert; Falcón, Carles; Benabarre, Antoni; Lomeña, Francisco; Vieta, Eduard; Ros, Domènec
2008-07-01
Statistical parametric mapping (SPM) has become the technique of choice to statistically evaluate positron emission tomography (PET), functional magnetic resonance imaging (fMRI), and single photon emission computed tomography (SPECT) functional brain studies. Nevertheless, only a few methodological studies have been carried out to assess the performance of SPM in SPECT. The aim of this paper was to study the performance of SPM in detecting changes in regional cerebral blood flow (rCBF) in hypo- and hyperperfused areas in brain SPECT studies. The paper seeks to determine the relationship between the group size and the rCBF changes, and the influence of the correction for degradations. The assessment was carried out using simulated brain SPECT studies. Projections were obtained with Monte Carlo techniques, and a fan-beam collimator was considered in the simulation process. Reconstruction was performed by using the ordered subsets expectation maximization (OSEM) algorithm with and without compensation for attenuation, scattering, and spatial variant collimator response. Significance probability maps were obtained with SPM2 by using a one-tailed two-sample t-test. A bootstrap resampling approach was used to determine the sample size for SPM to detect the between-group differences. Our findings show that the correction for degradations results in a diminution of the sample size, which is more significant for small regions and low-activation factors. Differences in sample size were found between hypo- and hyperperfusion. These differences were larger for small regions and low-activation factors, and when no corrections were included in the reconstruction algorithm.
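Bootstrap determination of the sample size amounts to estimating, for each candidate group size, the probability that the between-group test rejects. A simplified sketch using a one-tailed two-sample z-test as a stand-in for the SPM voxel-wise t-test (the normal approximation and all names are assumptions of the sketch, not the paper's implementation):

```python
import random
import statistics

def bootstrap_power(group_a, group_b, n, alpha=0.05, n_boot=500, seed=0):
    """Estimate, for group size n, the probability that a one-tailed
    two-sample z-test detects the between-group difference, by
    bootstrap-resampling n subjects per group from the available data."""
    rng = random.Random(seed)
    z_crit = statistics.NormalDist().inv_cdf(1 - alpha)
    hits = 0
    for _ in range(n_boot):
        a = [group_a[rng.randrange(len(group_a))] for _ in range(n)]
        b = [group_b[rng.randrange(len(group_b))] for _ in range(n)]
        se = (statistics.pvariance(a) / n + statistics.pvariance(b) / n) ** 0.5
        if se > 0 and (statistics.fmean(b) - statistics.fmean(a)) / se > z_crit:
            hits += 1
    return hits / n_boot
```

Scanning n upward and stopping at the smallest group size whose estimated power exceeds a target (say 0.8) gives the required sample size for the comparison.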
Hilmer, Christiana E.; Holt, Matthew T.
2000-01-01
This paper compares the finite sample performance of subsample bootstrap and subsample jackknife techniques to the traditional bootstrap method when parameters are constrained to be on some boundary. To assess how these three methods perform in an empirical application, a negative semi-definite translog cost function is estimated using U.S. manufacturing data.
Unbiased Estimates of Variance Components with Bootstrap Procedures
Brennan, Robert L.
2007-01-01
This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…
A Bootstrap Procedure of Propensity Score Estimation
Bai, Haiyan
2013-01-01
Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…
Fingerprint resampling: A generic method for efficient resampling
Mestdagh, Merijn; Verdonck, Stijn; Duisters, Kevin; Tuerlinckx, Francis
2015-01-01
In resampling methods, such as bootstrapping or cross validation, a very similar computational problem (usually an optimization procedure) is solved over and over again for a set of very similar data sets. If it is computationally burdensome to solve this computational problem once, the whole resampling method can become unfeasible. However, because the computational problems and data sets are so similar, the speed of the resampling method may be increased by taking advantage of these similarities in method and data. As a generic solution, we propose to learn the relation between the resampled data sets and their corresponding optima. Using this learned knowledge, we are then able to predict the optima associated with new resampled data sets. First, these predicted optima are used as starting values for the optimization process. Once the predictions become accurate enough, the optimization process may even be omitted completely, thereby greatly decreasing the computational burden. The suggested method is validated using two simple problems (where the results can be verified analytically) and two real-life problems (i.e., the bootstrap of a mixed model and a generalized extreme value distribution). The proposed method led on average to a tenfold increase in speed of the resampling method. PMID:26597870
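The warm-start idea can be shown on a toy problem. Here the "learned" prediction of each resample's optimum is simply the resample mean of a squared-loss problem, so the predicted starting value is already near the optimum and gradient descent needs almost no iterations. Everything in this sketch is an illustrative assumption, far simpler than the mixed-model and extreme-value examples in the paper:

```python
import random

def minimize(grad, x0, lr=0.1, tol=1e-8, max_iter=10000):
    """Plain gradient descent; returns (optimum, iterations used)."""
    x = x0
    for it in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            return x, it
        x -= lr * g
    return x, max_iter

def bootstrap_warm_start(data, n_boot=50, seed=0):
    """Solve the same least-squares problem on each bootstrap resample,
    once from a cold start (0.0) and once from a cheaply predicted
    optimum (the resample mean), and compare total iteration counts."""
    rng = random.Random(seed)
    n = len(data)
    cold = warm = 0
    optima = []
    for _ in range(n_boot):
        s = [data[rng.randrange(n)] for _ in range(n)]
        grad = lambda x, s=s: sum(2 * (x - v) for v in s) / len(s)
        predicted = sum(s) / len(s)   # stand-in for the learned prediction
        _, it_cold = minimize(grad, 0.0)
        opt, it_warm = minimize(grad, predicted)
        cold += it_cold
        warm += it_warm
        optima.append(opt)
    return cold, warm, optima
```

Because the resampled problems are so similar, the predicted start cuts the iteration count drastically, which is the mechanism behind the reported speed-up.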
Institute of Scientific and Technical Information of China (English)
Fang-Ling Tao; Shi-Fan Min; Wei-Jian Wu; Guang-Wen Liang; Ling Zeng
2008-01-01
Taking a published natural-population life table of the rice leaf roller, Cnaphalocrocis medinalis (Lepidoptera: Pyralidae), as an example, we estimated the population trend index, I, via re-sampling methods (jackknife and bootstrap), determined its statistical properties, and illustrated the application of these methods in determining the control effectiveness of bio-agents and chemical insecticides. Based on the simulation outputs, the smoothed distribution pattern of the estimates of I produced by the delete-1 jackknife is visually distinguishable from the normal density, whereas the smoothed pattern produced by the delete-d jackknife, and the logarithm-transformed smoothed patterns produced by both the empirical and parametric bootstraps, matched the corresponding normal density well. Thus, the estimates of I produced by the delete-1 jackknife were not used to determine the suppressive effect of wasps and insecticides. The 95 percent confidence intervals, or the narrowest 95 percentiles, and the Z-test criterion were employed to compare the effectiveness of Trichogramma japonicum Ashmead and insecticides (powder, 1.5% mevinphos + 3% alpha-hexachlorocyclohexane) against the rice leaf roller based on the estimates of I produced by the delete-d jackknife and bootstrap techniques. At the α = 0.05 level, there were statistical differences between the wasp treatment and the control, and between the wasp and insecticide treatments, if normality was ensured, or by the narrowest 95 percentiles. However, there was still no difference between the insecticide treatment and the control. By the Z-test criterion, the wasp treatment was better than the control and the insecticide treatment with P-value < 0.01. The insecticide treatment was similar to the control with P-value > 0.2, indicating that the 95% confidence interval procedure is more conservative. Although similar conclusions may be drawn by re-sampling techniques, such as the delta method, about the suppressive effect of Trichogramma and insecticides, the normality of the estimates can be checked and guaranteed
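The delete-1 jackknife used for the trend index I follows the generic recipe: recompute the statistic with each observation removed, then combine the leave-one-out values into a bias-corrected estimate and a standard error. A generic sketch (applied here to the mean rather than to I; the function name is ours):

```python
def jackknife(values, stat):
    """Delete-1 jackknife: recompute the statistic with each observation
    left out, then return (bias-corrected estimate, standard error)."""
    n = len(values)
    full = stat(values)
    leave_one_out = [stat(values[:i] + values[i + 1:]) for i in range(n)]
    mean_loo = sum(leave_one_out) / n
    bias = (n - 1) * (mean_loo - full)
    se = ((n - 1) / n * sum((v - mean_loo) ** 2 for v in leave_one_out)) ** 0.5
    return full - bias, se
```

For the sample mean the jackknife is exact: the bias correction vanishes and the standard error reduces to the usual s / sqrt(n), which is a convenient sanity check before applying the same recipe to a nonlinear statistic such as I.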
The wild tapered block bootstrap
DEFF Research Database (Denmark)
Hounyo, Ulrich
In this paper, a new resampling procedure, called the wild tapered block bootstrap, is introduced as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on dependent heterogeneous data. The method consists in tapering each overlapping block of the series first, then applying the standard wild bootstrap for independent and heteroscedastically distributed observations to the overlapping tapered blocks in an appropriate way. It preserves the favorable bias and mean squared error properties of the tapered block bootstrap, which is the state-of-the-art block bootstrap method. The first-order asymptotic validity of the tapered block bootstrap, as well as of the wild tapered block bootstrap approximation to the actual distribution of the sample mean, is also established when the data are assumed to satisfy a near-epoch-dependence condition. The consistency of the bootstrap variance estimator for the sample mean is also shown.
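The wild-bootstrap ingredient, in isolation, multiplies each residual by an independent random sign so that unit-specific (heteroskedastic) residual variances are preserved. A sketch for the slope of a simple regression; note that this shows only the wild step for independent data, not the tapering or blocking machinery for dependent series, and the names are ours:

```python
import random

def wild_bootstrap_se(x, y, n_boot=500, seed=0):
    """Wild bootstrap standard error of the OLS slope: keep the regressors
    fixed and perturb each fitted residual by an independent Rademacher
    sign (+1/-1), which preserves each unit's residual variance."""
    def ols_slope(xs, ys):
        mx = sum(xs) / len(xs)
        my = sum(ys) / len(ys)
        sxx = sum((v - mx) ** 2 for v in xs)
        return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / sxx

    rng = random.Random(seed)
    b = ols_slope(x, y)
    a = sum(y) / len(y) - b * sum(x) / len(x)
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    slopes = []
    for _ in range(n_boot):
        y_star = [a + b * xi + ri * rng.choice((-1.0, 1.0))
                  for xi, ri in zip(x, resid)]
        slopes.append(ols_slope(x, y_star))
    m = sum(slopes) / n_boot
    return (sum((s - m) ** 2 for s in slopes) / (n_boot - 1)) ** 0.5
```

The tapered-block variant applies the same sign perturbation to whole tapered blocks rather than to individual residuals, so that within-block dependence is carried over into the bootstrap series.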
Bootstrap determination of the cointegration rank in heteroskedastic VAR models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
2014-01-01
In a recent paper, Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates of the underlying vector autoregressive (VAR) model which obtain under the reduced-rank null hypothesis. They propose methods based on an independent and identically distributed (i.i.d.) bootstrap resampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap resampling scheme, when time-varying behavior is present in either the conditional or unconditional variance of the innovations. We show that the bootstrap PLR tests are asymptotically...
Directory of Open Access Journals (Sweden)
Osmir José Lavoranti
2010-06-01
Reliable evaluation of the stability of genotypes and environments is of prime concern to plant breeders, but the lack of a comprehensive analysis of the structure of the GE interaction has been a stumbling block to the recommendation of varieties. The Additive Main Effects and Multiplicative Interaction (AMMI) model currently offers a good approach to the interpretation and understanding of the GE interaction, but lacks a way of assessing the stability of its estimates. The present contribution proposes the use of bootstrap resampling in the AMMI model, and applies it to obtain both a graphical and a numerical analysis of the phenotypic stability of 20 Eucalyptus grandis progenies from Australia that were planted in seven environments in the Southern and Southeastern regions of Brazil. The results showed distinct behaviors of genotypes and environments, and the genotype x environment interaction was significant (p value < 0.01). The bootstrap coefficient of stability based on the squared Mahalanobis distance of the scores showed that genotypes and environments can be differentiated in terms of their stabilities. Graphical analysis of the AMMI biplot provided a better understanding of the interpretation of phenotypic stability. The proposed AMMI bootstrap eliminated the uncertainties regarding the identification of low scores in traditional analyses.
Mitterpach, Róbert
2012-01-01
The aim of this thesis is to introduce the reader to the basic bootstrap techniques used in econometrics and to present their variations and importance. Results of the ordinary least squares model, the residual bootstrap and the case-resampling bootstrap are presented and compared on cross-sectional data and time series drawn from a small random subsample of the available data. The bootstrap was shown to improve the numerical performance of the ordinary least squares model.
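The two schemes compared in the thesis differ in what is resampled: the residual bootstrap keeps the design fixed and resamples residuals around the fitted line, while case resampling redraws whole (x, y) pairs. A minimal sketch for the OLS slope (the function names are assumptions of the sketch):

```python
import random

def ols(xs, ys):
    """Return (intercept, slope) of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((a - mx) * (c - my) for a, c in zip(xs, ys))
         / sum((a - mx) ** 2 for a in xs))
    return my - b * mx, b

def _sd(v):
    m = sum(v) / len(v)
    return (sum((s - m) ** 2 for s in v) / (len(v) - 1)) ** 0.5

def bootstrap_slope_ses(x, y, n_boot=400, seed=0):
    """Slope standard errors under the residual bootstrap (resample
    residuals, keep x fixed) and under case resampling (resample pairs)."""
    rng = random.Random(seed)
    a, b = ols(x, y)
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    n = len(x)
    res_slopes, case_slopes = [], []
    for _ in range(n_boot):
        y_star = [a + b * xi + resid[rng.randrange(n)] for xi in x]
        res_slopes.append(ols(x, y_star)[1])
        idx = [rng.randrange(n) for _ in range(n)]
        case_slopes.append(ols([x[i] for i in idx], [y[i] for i in idx])[1])
    return _sd(res_slopes), _sd(case_slopes)
```

Under homoskedastic errors the two standard errors agree closely; case resampling is the more robust choice when the error variance depends on x.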
New resampling method for evaluating stability of clusters
Directory of Open Access Journals (Sweden)
Neuhaeuser Markus
2008-01-01
Background: Hierarchical clustering is a widely applied tool in the analysis of microarray gene expression data. The assessment of cluster stability is a major challenge in clustering procedures. Statistical methods are required to distinguish between real and random clusters. Several methods for assessing cluster stability have been published, including resampling methods such as the bootstrap. We propose a new resampling method based on continuous weights to assess the stability of clusters in hierarchical clustering. While in bootstrapping approximately one third of the original items is lost, continuous weights avoid zero elements and instead allow non-integer diagonal elements, which leads to retention of the full dimensionality of the space, i.e., each variable of the original data set is represented in the resampled data. Results: Comparison of continuous weights and bootstrapping using real datasets and simulation studies reveals the advantage of continuous weights, especially when the dataset has only few observations, few differentially expressed genes, and the fold change of differentially expressed genes is low. Conclusion: We recommend the use of continuous weights in small as well as in large datasets, because according to our results they produce at least the same results as conventional bootstrapping and in some cases surpass it.
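The contrast between integer bootstrap weights and continuous weights is easy to demonstrate: multinomial bootstrap counts leave roughly a third (about e^-1) of the items with weight zero, while continuous weights keep every item in every resample. A sketch using normalized exponential weights as one common continuous choice (the paper's exact weighting scheme may differ):

```python
import random

def bootstrap_counts(n, rng):
    """Ordinary bootstrap: multinomial counts over n items; on average
    about a third of the items receive weight zero."""
    counts = [0] * n
    for _ in range(n):
        counts[rng.randrange(n)] += 1
    return counts

def continuous_weights(n, rng):
    """Continuous-weight resampling: i.i.d. positive weights (here
    exponential, as one common choice) rescaled to sum to n, so no item
    is ever dropped from the resample."""
    w = [rng.expovariate(1.0) for _ in range(n)]
    s = sum(w)
    return [n * v / s for v in w]
```

Because every weight stays strictly positive, each gene and each array of the original data set contributes to every reclustering, which is the dimensionality-retention property the abstract describes.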
Wild Bootstrap Versus Moment-Oriented Bootstrap
Sommerfeld, Volker
1997-01-01
We investigate the relative merits of a "moment-oriented" bootstrap method of Bunke (1997) in comparison with the classical wild bootstrap of Wu (1986) in nonparametric heteroscedastic regression situations. The "moment-oriented" bootstrap is a wild bootstrap based on local estimators of higher-order error moments that are smoothed by kernel smoothers. In this paper we perform an asymptotic comparison of these two different bootstrap procedures. We show that the moment-oriented bootstrap is in ...
A bootstrap procedure to select hyperspectral wavebands related to tannin content
Ferwerda, J.G.; Skidmore, A.K.; Stein, A.
2006-01-01
Detection of hydrocarbons in plants with hyperspectral remote sensing is hampered by overlapping absorption pits, while the 'optimal' wavebands for detecting some surface characteristics (e.g. chlorophyll, lignin, tannin) may shift. We combined a phased regression with a bootstrap procedure to find the wavebands related to tannin content.
Introductory statistics and analytics a resampling perspective
Bruce, Peter C
2014-01-01
A concise, thoroughly class-tested primer that features basic statistical concepts in the context of analytics, resampling, and the bootstrap. A uniquely developed presentation of key statistical topics, Introductory Statistics and Analytics: A Resampling Perspective provides an accessible approach to statistical analytics, resampling, and the bootstrap for readers with various levels of exposure to basic probability and statistics. Originally class-tested at one of the first online learning companies in the discipline, www.statistics.com, the book primarily focuses on applications.
Efficient p-value evaluation for resampling-based tests
Yu, K.
2011-01-05
The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated calculation of the test statistic on a large number of simulated data sets for its significance level assessment, and thus it can become very computationally intensive. Here, we propose an efficient p-value evaluation procedure by adapting the stochastic approximation Markov chain Monte Carlo algorithm. The new procedure can be used easily to estimate the p-value for any resampling-based test. We show through numerical simulations that the proposed procedure can be 100 to 500,000 times as efficient (in terms of computing time) as the standard resampling-based procedure when evaluating a test statistic with a small p-value (e.g. less than 10^-6). With its computational burden reduced by the proposed procedure, the versatile resampling-based test becomes computationally feasible for a much wider range of applications. We demonstrate the application of the new method by applying it to a large-scale genetic association study of prostate cancer.
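For reference, the standard resampling-based p-value that the proposed procedure accelerates is a simple counting estimate. A permutation-test sketch for a difference in means (all names are ours); the paper's point is that this brute-force count becomes infeasible when the true p-value is tiny, since the number of permutations must exceed roughly 1/p:

```python
import random

def permutation_pvalue(x, y, n_perm=2000, seed=0):
    """Standard resampling-based p-value for a difference in means: pool
    the two samples, re-split them at random n_perm times, and count how
    often the permuted statistic is at least as extreme as the observed
    one. The +1 terms give the usual conservative estimate."""
    rng = random.Random(seed)
    obs = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        xs, ys = pooled[:len(x)], pooled[len(x):]
        if abs(sum(xs) / len(xs) - sum(ys) / len(ys)) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

With n_perm = 2000 the smallest reportable p-value is about 5e-4, which is why genome-scale studies chasing p < 1e-6 need either vastly more permutations or a shortcut such as the one proposed here.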
Generalized bootstrap for estimating equations
Chatterjee, Snigdhansu; Bose, Arup
2005-01-01
We introduce a generalized bootstrap technique for estimators obtained by solving estimating equations. Some special cases of this generalized bootstrap are the classical bootstrap of Efron, the delete-d jackknife and variations of the Bayesian bootstrap. The use of the proposed technique is discussed in some examples. Distributional consistency of the method is established and an asymptotic representation of the resampling variance estimator is obtained.
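The generalized bootstrap replaces resampling by random weights in the estimating equation. For the mean, the estimating equation sum_i w_i (x_i - theta) = 0 solves to a weighted mean; a sketch with i.i.d. exponential weights, one member of the family (the names and the particular weight choice are illustrative; multinomial counts would recover Efron's classical bootstrap):

```python
import random

def generalized_bootstrap_mean(data, weight_draw, n_boot=500, seed=0):
    """Generalized bootstrap for the estimating equation
    sum_i w_i * (x_i - theta) = 0: each replicate solves the weighted
    equation with freshly drawn random weights."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        w = weight_draw(len(data), rng)
        reps.append(sum(wi * xi for wi, xi in zip(w, data)) / sum(w))
    return reps

def exp_weights(n, rng):
    """I.i.d. exponential weights, a Bayesian-bootstrap-like choice."""
    return [rng.expovariate(1.0) for _ in range(n)]
```

The spread of the replicates estimates the sampling variability of the solution of the estimating equation, and different weight laws (multinomial, Dirichlet-type, delete-d patterns) reproduce the familiar special cases mentioned in the abstract.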
Del Barrio, Eustasio; Lescornel, Hélène; Loubes, Jean-Michel
2016-01-01
Wasserstein barycenters and a variance-like criterion based on the Wasserstein distance are used in many problems to analyze the homogeneity of collections of distributions and structural relationships between the observations. We propose the estimation of the quantiles of the empirical process of the Wasserstein variation using a bootstrap procedure. We then use these results for statistical inference on a distribution registration model for general deformation functions. The tests are based on th...
MacKinnon, James G.
2007-01-01
This paper surveys bootstrap and Monte Carlo methods for testing hypotheses in econometrics. Several different ways of computing bootstrap P values are discussed, including the double bootstrap and the fast double bootstrap. It is emphasized that there are many different procedures for generating bootstrap samples for regression models and other types of model. As an illustration, a simulation experiment examines the performance of several methods of bootstrapping the supF test for structural...
A Neurocomputational Theory of how Explicit Learning Bootstraps Early Procedural Learning
Directory of Open Access Journals (Sweden)
Erick Joseph Paul
2013-12-01
It is widely accepted that human learning and memory is mediated by multiple memory systems that are each best suited to different requirements and demands. Within the domain of categorization, at least two systems are thought to facilitate learning: an explicit (declarative) system depending largely on the prefrontal cortex, and a procedural (non-declarative) system depending on the basal ganglia. Substantial evidence suggests that each system is optimally suited to learn particular categorization tasks. However, it remains unknown precisely how these systems interact to produce optimal learning and behavior. In order to investigate this issue, the present research evaluated the progression of learning through simulation of categorization tasks using COVIS, a well-known model of human category learning that includes both explicit and procedural learning systems. Specifically, the model's parameter space was thoroughly explored in procedurally learned categorization tasks across a variety of conditions and architectures to identify plausible interaction architectures. The simulation results support the hypothesis that one-way interaction between the systems occurs such that the explicit system "bootstraps" learning early on in the procedural system. Thus, the procedural system initially learns a suboptimal strategy employed by the explicit system and later refines its strategy. This bootstrapping could be from cortical-striatal projections that originate in premotor or motor regions of cortex, or possibly by the explicit system's control of motor responses through basal ganglia-mediated loops.
A neurocomputational theory of how explicit learning bootstraps early procedural learning.
Paul, Erick J; Ashby, F Gregory
2013-01-01
It is widely accepted that human learning and memory are mediated by multiple memory systems that are each best suited to different requirements and demands. Within the domain of categorization, at least two systems are thought to facilitate learning: an explicit (declarative) system depending largely on the prefrontal cortex, and a procedural (non-declarative) system depending on the basal ganglia. Substantial evidence suggests that each system is optimally suited to learn particular categorization tasks. However, it remains unknown precisely how these systems interact to produce optimal learning and behavior. In order to investigate this issue, the present research evaluated the progression of learning through simulation of categorization tasks using COVIS, a well-known model of human category learning that includes both explicit and procedural learning systems. Specifically, the model's parameter space was thoroughly explored in procedurally learned categorization tasks across a variety of conditions and architectures to identify plausible interaction architectures. The simulation results support the hypothesis that one-way interaction between the systems occurs such that the explicit system "bootstraps" learning early on in the procedural system. Thus, the procedural system initially learns a suboptimal strategy employed by the explicit system and later refines its strategy. This bootstrapping could be from cortical-striatal projections that originate in premotor or motor regions of cortex, or possibly by the explicit system's control of motor responses through basal ganglia-mediated loops. PMID:24385962
Survey bootstrap and bootstrap weights
Stas Kolenikov
2008-01-01
In this presentation, I will review the bootstrap for complex surveys with designs featuring stratification, clustering, and unequal probability weights. I will present the Stata module bsweights, which creates the bootstrap weights for designs specified through and supported by svy. I will also provide simple demonstrations highlighting the use of the procedure and its syntax. I will discuss various tuning parameters and their impact on the performance of the procedure, and I will give argum...
DEFF Research Database (Denmark)
Hounyo, Ulrich; Varneskov, Rasmus T.
We provide a new resampling procedure - the local stable bootstrap - that is able to mimic the dependence properties of realized power variations for pure-jump semimartingales observed at different frequencies. This allows us to propose a bootstrap estimator and inference procedure for the activity index of the underlying process, β, as well as a bootstrap test for whether it obeys a jump-diffusion or a pure-jump process, that is, of the null hypothesis H₀: β = 2 against the alternative H₁: β < 2.
A Bayesian approach to efficient differential allocation for resampling-based significance testing
Directory of Open Access Journals (Sweden)
Soi Sameer
2009-06-01
Full Text Available Abstract Background Large-scale statistical analyses have become hallmarks of post-genomic era biological research due to advances in high-throughput assays and the integration of large biological databases. One accompanying issue is the simultaneous estimation of p-values for a large number of hypothesis tests. In many applications, a parametric assumption in the null distribution such as normality may be unreasonable, and resampling-based p-values are the preferred procedure for establishing statistical significance. Using resampling-based procedures for multiple testing is computationally intensive and typically requires large numbers of resamples. Results We present a new approach to more efficiently assign resamples (such as bootstrap samples or permutations) within a nonparametric multiple testing framework. We formulated a Bayesian-inspired approach to this problem, and devised an algorithm that adapts the assignment of resamples iteratively with negligible space and running time overhead. In two experimental studies, a breast cancer microarray dataset and a genome-wide association study dataset for Parkinson's disease, we demonstrated that our differential allocation procedure is substantially more accurate compared to the traditional uniform resample allocation. Conclusion Our experiments demonstrate that using a more sophisticated allocation strategy can improve our inference for hypothesis testing without a drastic increase in the amount of computation on randomized data. Moreover, we gain more improvement in efficiency when the number of tests is large. R code for our algorithm and the shortcut method are available at http://people.pcbi.upenn.edu/~lswang/pub/bmc2009/.
Variance estimation in neutron coincidence counting using the bootstrap method
Energy Technology Data Exchange (ETDEWEB)
Dubi, C., E-mail: chendb331@gmail.com [Physics Department, Nuclear Research Center of the Negev, P.O.B. 9001 Beer Sheva (Israel); Ocherashvilli, A.; Ettegui, H. [Physics Department, Nuclear Research Center of the Negev, P.O.B. 9001 Beer Sheva (Israel); Pedersen, B. [Nuclear Security Unit, Institute for Transuranium Elements, Via E. Fermi, 2749 JRC, Ispra (Italy)
2015-09-11
In the study, we demonstrate the implementation of the “bootstrap” method for a reliable estimation of the statistical error in Neutron Multiplicity Counting (NMC) on plutonium samples. The “bootstrap” method estimates the variance of a measurement through a re-sampling process, in which a large number of pseudo-samples are generated, from which the so-called bootstrap distribution is generated. The purpose of the present study is to give a full description of the bootstrapping procedure, and to validate, through experimental results, the reliability of the estimated variance. Results indicate both a very good agreement between the measured variance and the variance obtained through the bootstrap method, and a robustness of the method with respect to the duration of the measurement and the bootstrap parameters.
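The generic idea behind the variance estimation described above can be sketched in a few lines of Python. This is a minimal illustration of bootstrap variance estimation in general, not the authors' NMC-specific implementation; the count data below are made up.

```python
import random
import statistics

def bootstrap_variance(data, statistic, n_resamples=1000, seed=0):
    """Estimate the variance of a statistic by resampling `data` with
    replacement and recomputing the statistic on each pseudo-sample."""
    rng = random.Random(seed)
    n = len(data)
    replicates = []
    for _ in range(n_resamples):
        resample = [data[rng.randrange(n)] for _ in range(n)]
        replicates.append(statistic(resample))
    # Variance of the bootstrap distribution of the statistic.
    return statistics.variance(replicates)

# Hypothetical per-interval counts standing in for multiplicity data.
counts = [12, 15, 11, 14, 13, 16, 12, 15, 14, 13]
var_of_mean = bootstrap_variance(counts, statistics.mean)
```

Because the statistic here is the sample mean, the bootstrap variance should come out close to the classical s²/n, which is one quick sanity check of such an implementation.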
Del Barrio, Eustasio; Lescornel, Hélène; Loubes, Jean-Michel
2016-01-01
Wasserstein barycenters and variance-like criterion using Wasserstein distance are used in many problems to analyze the homogeneity of collections of distributions and structural relationships between the observations. We propose the estimation of the quantiles of the empirical process of the Wasserstein's variation using a bootstrap procedure. Then we use these results for statistical inference on a distribution registration model for general deformation functions. The tests are based on th...
Bootstrap, Wild Bootstrap and Generalized Bootstrap
Mammen, Enno
1995-01-01
Some modifications and generalizations of the bootstrap procedure have been proposed. In this note we will consider the wild bootstrap and the generalized bootstrap and we will give two arguments why it makes sense to use these modifications instead of the original bootstrap. The first argument is that there exist examples where generalized and wild bootstrap work, but where the original bootstrap fails and breaks down. The second argument will be based on higher order considerations. We will show...
Directory of Open Access Journals (Sweden)
Ettore Marubini
2014-01-01
Full Text Available This paper presents a robust two-stage procedure for identification of outlying observations in regression analysis. The exploratory stage identifies leverage points and vertical outliers through a robust distance estimator based on the Minimum Covariance Determinant (MCD). After deletion of these points, the confirmatory stage carries out an Ordinary Least Squares (OLS) analysis on the remaining subset of data and investigates the effect of adding back in the previously deleted observations. Cut-off points pertinent to different diagnostics are generated by bootstrapping and the cases are definitively labelled as good-leverage, bad-leverage, vertical outliers and typical cases. The procedure is applied to four examples.
Inferences of Coordinates in Multidimensional Scaling by a Bootstrapping Procedure in R
Kim, Donghoh; Kim, Se-Kang; Park, Soyeon
2015-01-01
Recently, MDS has been utilized to identify and evaluate cognitive ability latent profiles in a population. However, dimension coordinates do not carry any statistical properties. To address this statistical limitation of MDS, we investigated the common aspects of various studies utilizing bootstrapping, and provided an R function for its…
On constructing accurate confidence bands for ROC curves through smooth resampling
Bertail, Patrice; Clémençon, Stéphan; Vayatis, Nicolas
2008-01-01
This paper is devoted to thoroughly investigating how to bootstrap the ROC curve, a widely used visual tool for evaluating the accuracy of test/scoring statistics s(X) in the bipartite setup. The issue of confidence bands for the ROC curve is considered and a resampling procedure based on a smooth version of the empirical distribution called the "smoothed bootstrap" is introduced. Theoretical arguments and simulation results are presented to show that the "smoothed bootstrap" is prefer...
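The core device in a smoothed bootstrap is easy to state: resample with replacement as usual, then jitter each draw with kernel noise so the resamples come from a smooth density estimate rather than the discrete empirical atoms. A minimal Python sketch of that device (the scoring values and bandwidth are made up, and this is not the paper's ROC-band construction itself):

```python
import random

def smoothed_bootstrap(data, bandwidth=0.05, n_resamples=200, seed=0):
    """Smoothed bootstrap: resample with replacement, then add Gaussian
    kernel noise of scale `bandwidth` to every drawn value."""
    rng = random.Random(seed)
    n = len(data)
    return [[data[rng.randrange(n)] + rng.gauss(0.0, bandwidth)
             for _ in range(n)]
            for _ in range(n_resamples)]

# Hypothetical scoring-statistic values s(X) on the unit interval.
scores = [0.20, 0.35, 0.40, 0.55, 0.60, 0.70, 0.75, 0.90]
samples = smoothed_bootstrap(scores)
```

In a confidence-band application, each smoothed resample would be turned into an ROC curve and the pointwise spread of those curves would define the band.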
Resampling methods in Microsoft Excel® for estimating reference intervals.
Theodorsson, Elvar
2015-01-01
Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes native functions which lend themselves well to this purpose, including recommended interpolation procedures for estimating the 2.5th and 97.5th percentiles. The purpose of this paper is to introduce the reader to resampling estimation techniques in general, and to the use of Microsoft Excel® 2010 for estimating reference intervals in particular. Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples.
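The same bootstrap percentile estimate of a 2.5th/97.5th reference interval can be sketched outside Excel as well. Here is a rough Python version under stated assumptions: roughly 40 made-up reference values, and `statistics.quantiles(..., n=40)`, whose first and last cut points fall at the 2.5th and 97.5th percentiles; this is an illustration of the technique, not the paper's spreadsheet recipe.

```python
import random
import statistics

def bootstrap_reference_interval(values, n_resamples=1000, seed=1):
    """Bootstrap estimate of the 2.5th and 97.5th percentile reference limits."""
    rng = random.Random(seed)
    n = len(values)
    lows, highs = [], []
    for _ in range(n_resamples):
        resample = sorted(values[rng.randrange(n)] for _ in range(n))
        # n=40 gives cut points at 2.5%, 5%, ..., 97.5%.
        q = statistics.quantiles(resample, n=40)
        lows.append(q[0])    # 2.5th percentile of this resample
        highs.append(q[-1])  # 97.5th percentile of this resample
    return statistics.mean(lows), statistics.mean(highs)

# Hypothetical analyte values for ~40 reference individuals.
ref = [4.1, 4.5, 3.9, 5.0, 4.2, 4.8, 4.4, 4.6, 4.0, 4.7,
       4.3, 4.9, 3.8, 5.1, 4.2, 4.5, 4.1, 4.6, 4.4, 4.8,
       4.0, 4.7, 4.3, 4.9, 4.2, 4.6, 4.1, 4.5, 4.4, 4.7,
       3.9, 5.0, 4.3, 4.8, 4.2, 4.6, 4.0, 4.5, 4.4, 4.9]
low, high = bootstrap_reference_interval(ref)
```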
Applications of the Fast Double Bootstrap
MacKinnon, James G.
2006-01-01
The fast double bootstrap, or FDB, is a procedure for calculating bootstrap P values that is much more computationally efficient than the double bootstrap itself. In many cases, it can provide more accurate results than ordinary bootstrap tests. For the fast double bootstrap to be valid, the test statistic must be asymptotically independent of the random parts of the bootstrap data generating process. This paper presents simulation evidence on the performance of FDB tests in three cases of in...
Directory of Open Access Journals (Sweden)
Larissa Ribeiro de Andrade
2014-01-01
Full Text Available The bootstrap method is generally performed by presupposing that each sample unit has the same probability of being re-sampled. However, when a sample with outliers is taken into account, the empirical distribution generated by this method may be influenced, or rather, it may not accurately represent the original sample. The current study proposes a bootstrap algorithm that allows the use of measures of influence in the calculation of re-sampling probabilities. The method was reproduced in simulation scenarios taking into account the logistic growth curve model and the CovRatio measure to evaluate the impact of an influential observation on the determination of the covariance matrix of parameter estimates. In most cases, bias estimates were reduced. Consequently, the method is suitable for use in non-linear models and allows the researcher to apply other measures for better bias reduction.
Niska, Christoffer
2014-01-01
Practical and instruction-based, this concise book will take you from understanding what Bootstrap is, to creating your own Bootstrap theme in no time! If you are an intermediate front-end developer or designer who wants to learn the secrets of Bootstrap, this book is perfect for you.
Bootstrap and Wild Bootstrap for High Dimensional Linear Models
Mammen, Enno
1993-01-01
In this paper two bootstrap procedures are considered for the estimation of the distribution of linear contrasts and of F-test statistics in high dimensional linear models. An asymptotic approach will be chosen where the dimension p of the model may increase for sample size $n\\rightarrow\\infty$. The range of validity will be compared for the normal approximation and for the bootstrap procedures. Furthermore, it will be argued that the rates of convergence are different for the bootstrap proce...
Asymptotic properties of robust three-stage procedure based on bootstrap for m-estimator
Hlávka, Zdeněk
2000-01-01
The paper concerns the fixed-width confidence intervals for location based on M- estimators in the location model. A robust three-stage procedure is proposed and its asymptotic properties are studied. The performance of the procedure depends on some tuning parameters. Their effect on the proposed confidence interval is checked together with the overall behaviour of the procedure in a simulation study.
Improving the Reliability of Bootstrap Tests
Russell Davidson; MacKinnon, James G.
2000-01-01
We first propose procedures for estimating the rejection probabilities for bootstrap tests in Monte Carlo experiments without actually computing a bootstrap test for each replication. These procedures are only about twice as expensive as estimating rejection probabilities for asymptotic tests. We then propose procedures for computing modified bootstrap P values that will often be more accurate than ordinary ones. These procedures are closely related to the double bootstrap, but they are far ...
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
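The resampling scheme described above is mechanically simple: draw whole clusters with replacement, never individual rows, so the within-cluster dependence survives intact in every resample. A minimal Python sketch of the scheme, applied to the overall mean of hypothetical longitudinal data (this illustrates the resampling step only, not GEE estimation):

```python
import random

def cluster_bootstrap_means(clusters, n_resamples=500, seed=2):
    """Cluster bootstrap: resample whole clusters (subjects) with
    replacement and recompute the overall mean each time."""
    rng = random.Random(seed)
    k = len(clusters)
    means = []
    for _ in range(n_resamples):
        resampled = [clusters[rng.randrange(k)] for _ in range(k)]
        flat = [x for cluster in resampled for x in cluster]
        means.append(sum(flat) / len(flat))
    return means

# Hypothetical repeated measures: one list per subject.
subjects = [[1.0, 1.2, 1.1], [2.0, 2.1], [0.9, 1.0, 0.8, 1.1], [1.5, 1.6, 1.4]]
boot_means = cluster_bootstrap_means(subjects)
```

In the GEE setting, one would refit the estimating equations on each cluster resample instead of taking a mean, and read confidence sets off the distribution of the refitted coefficients.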
Bootstrapping heteroskedastic regression models: wild bootstrap vs. pairs bootstrap
Flachaire, Emmanuel
2005-01-01
In regression models, appropriate bootstrap methods for inference robust to heteroskedasticity of unknown form are the wild bootstrap and the pairs bootstrap. The finite sample performance of a heteroskedastic-robust test is investigated with Monte Carlo experiments. The simulation results suggest that one specific version of the wild bootstrap outperforms the other versions of the wild bootstrap and of the pairs bootstrap. It is the only one for which the bootstrap ...
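The wild bootstrap mentioned above keeps the regressors fixed and rebuilds each response as fitted value plus residual times an independent random sign, which preserves whatever heteroskedasticity the residuals carry. A rough Python sketch for simple OLS, using Rademacher (+1/−1) weights (one common version; the data are made up, and this is an illustration rather than the paper's exact experimental design):

```python
import random

def ols_slope(x, y):
    """Slope of a simple OLS fit through the means."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def wild_bootstrap_slopes(x, y, n_resamples=500, seed=3):
    """Wild bootstrap: keep x fixed; multiply each residual by an
    independent Rademacher sign when rebuilding y."""
    rng = random.Random(seed)
    n = len(x)
    b = ols_slope(x, y)
    a = sum(y) / n - b * sum(x) / n
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    slopes = []
    for _ in range(n_resamples):
        y_star = [a + b * xi + ri * rng.choice((-1.0, 1.0))
                  for xi, ri in zip(x, resid)]
        slopes.append(ols_slope(x, y_star))
    return b, slopes

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [1.1, 2.3, 2.8, 4.5, 4.9, 6.4]
slope, boot_slopes = wild_bootstrap_slopes(x, y)
```

The pairs bootstrap would instead resample (x, y) pairs jointly; the contrast between the two is exactly what the Monte Carlo experiments in the paper evaluate.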
On Bootstrap Tests of Symmetry About an Unknown Median.
Zheng, Tian; Gastwirth, Joseph L
2010-07-01
It is important to examine the symmetry of an underlying distribution before applying some statistical procedures to a data set. For example, in the Zuni School District case, a formula originally developed by the Department of Education trimmed 5% of the data symmetrically from each end. The validity of this procedure was questioned at the hearing by Chief Justice Roberts. Most tests of symmetry (even nonparametric ones) are not distribution free in finite sample sizes. Hence, using the asymptotic distribution may not yield an accurate type I error rate and/or may lose power in small samples. Bootstrap resampling from a symmetric empirical distribution function fitted to the data is proposed to improve the accuracy of the calculated p-value of several tests of symmetry. The results show that the bootstrap method is superior to previously used approaches relying on the asymptotic distribution of the tests, which assumed the data come from a normal distribution. Incorporating the bootstrap estimate in a recently proposed test due to Miao, Gel and Gastwirth (2006) preserves its level and shows that it has reasonable power properties for the family of distributions evaluated.
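The resampling idea above can be sketched concretely: reflect the data about the sample median to get a symmetric empirical distribution, resample from that symmetrized set under the null, and compare the observed asymmetry statistic with its resampled null distribution. The sketch below uses |mean − median| as a toy asymmetry statistic and made-up data; the paper's tests are more refined, so treat this only as the shape of the procedure.

```python
import random
import statistics

def symmetry_pvalue(data, n_resamples=1000, seed=4):
    """Bootstrap test of symmetry about an unknown median: resample from
    the sample symmetrized about its median and compare |mean - median|."""
    rng = random.Random(seed)
    med = statistics.median(data)
    # Symmetrized sample: the data plus its reflection about the median.
    sym = data + [2 * med - x for x in data]
    t_obs = abs(statistics.mean(data) - med)
    n = len(data)
    count = 0
    for _ in range(n_resamples):
        resample = [sym[rng.randrange(len(sym))] for _ in range(n)]
        t_star = abs(statistics.mean(resample) - statistics.median(resample))
        if t_star >= t_obs:
            count += 1
    return count / n_resamples

symmetric_data = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
skewed_data = [0.1, 0.2, 0.3, 0.5, 1.0, 2.5, 6.0]
p_sym = symmetry_pvalue(symmetric_data)
p_skew = symmetry_pvalue(skewed_data)
```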
Confidence Intervals for Effect Sizes: Applying Bootstrap Resampling
Banjanovic, Erin S.; Osborne, Jason W.
2016-01-01
Confidence intervals for effect sizes (CIES) provide readers with an estimate of the strength of a reported statistic as well as the relative precision of the point estimate. These statistics offer more information and context than null hypothesis significance testing. Although confidence intervals have been recommended by scholars for many years,…
A comparison of four different block bootstrap methods
Boris Radovanov; Aleksandra Marcikić
2014-01-01
The paper contains a description of four different block bootstrap methods, i.e., the non-overlapping block bootstrap, the overlapping block bootstrap (moving block bootstrap), the stationary block bootstrap and subsampling. Furthermore, the basic goal of this paper is to quantify the relative efficiency of each mentioned block bootstrap procedure and then to compare those methods. To achieve this goal, we measure the mean squared errors of the variance estimates of returns. The returns are calculated from 1250 daily ob...
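Of the four methods compared, the moving (overlapping) block bootstrap is the easiest to sketch: rebuild a series of the original length by concatenating randomly chosen overlapping blocks, so serial dependence up to the block length survives inside each block. A minimal Python version with made-up returns (the block length and data are illustrative assumptions):

```python
import random

def moving_block_bootstrap(series, block_len, n_resamples=200, seed=5):
    """Moving block bootstrap: concatenate randomly chosen overlapping
    blocks until the resampled series reaches the original length."""
    rng = random.Random(seed)
    n = len(series)
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    out = []
    for _ in range(n_resamples):
        sample = []
        while len(sample) < n:
            sample.extend(rng.choice(blocks))
        out.append(sample[:n])  # trim the last, possibly partial, block
    return out

# Hypothetical daily returns.
returns = [0.01, -0.02, 0.00, 0.03, -0.01, 0.02, -0.03, 0.01, 0.00, 0.02]
resamples = moving_block_bootstrap(returns, block_len=3)
```

The non-overlapping variant restricts `blocks` to disjoint segments, and the stationary bootstrap draws random (geometric) block lengths; subsampling uses a single shorter block per replicate.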
RIO: Analyzing proteomes by automated phylogenomics using resampled inference of orthologs
Directory of Open Access Journals (Sweden)
Eddy Sean R
2002-05-01
Full Text Available Abstract Background When analyzing protein sequences using sequence similarity searches, orthologous sequences (that diverged by speciation) are more reliable predictors of a new protein's function than paralogous sequences (that diverged by gene duplication). The utility of phylogenetic information in high-throughput genome annotation ("phylogenomics") is widely recognized, but existing approaches are either manual or not explicitly based on phylogenetic trees. Results Here we present RIO (Resampled Inference of Orthologs), a procedure for automated phylogenomics using explicit phylogenetic inference. RIO analyses are performed over bootstrap resampled phylogenetic trees to estimate the reliability of orthology assignments. We also introduce supplementary concepts that are helpful for functional inference. RIO has been implemented as a Perl pipeline connecting several C and Java programs. It is available at http://www.genetics.wustl.edu/eddy/forester/. A web server is at http://www.rio.wustl.edu/. RIO was tested on the Arabidopsis thaliana and Caenorhabditis elegans proteomes. Conclusion The RIO procedure is particularly useful for the automated detection of first representatives of novel protein subfamilies. We also describe how some orthologies can be misleading for functional inference.
Temperature Corrected Bootstrap Algorithm
Comiso, Joey C.; Zwally, H. Jay
1997-01-01
A temperature corrected Bootstrap Algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived using the current Bootstrap algorithm but using brightness temperatures from 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperatures, which in turn are used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as with the Bootstrap algorithm but using emissivities instead of brightness temperatures. The results show significant improvement in the areas where ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
Bhaumik, Snig
2015-01-01
If you are a web developer who designs and develops websites and pages using HTML, CSS, and JavaScript, but have very little familiarity with Bootstrap, this is the book for you. Previous experience with HTML, CSS, and JavaScript will be helpful, while knowledge of jQuery would be an extra advantage.
Pfiffner, H. J.
1969-01-01
Circuit can sample a number of transducers in sequence without drawing current from them. This bootstrap unloader uses a differential amplifier with one input connected to a circuit which is the equivalent of the circuit to be unloaded, and the other input delivering the proper unloading currents.
The Local Fractional Bootstrap
DEFF Research Database (Denmark)
Bennedsen, Mikkel; Hounyo, Ulrich; Lunde, Asger;
We introduce a bootstrap procedure for high-frequency statistics of Brownian semistationary processes. More specifically, we focus on a hypothesis test on the roughness of sample paths of Brownian semistationary processes, which uses an estimator based on a ratio of realized power variations. In simulations we observe that the bootstrap-based hypothesis test provides considerable finite-sample improvements over an existing test that is based on a central limit theorem. This is important when studying the roughness properties of time series data; we illustrate this by applying the bootstrap method to two empirical data sets: we assess the roughness of a time series of high-frequency asset prices and we test the validity of Kolmogorov's scaling law in atmospheric turbulence data.
Ding, Cody S
2005-02-01
Although multidimensional scaling (MDS) profile analysis is widely used to study individual differences, there is no objective way to evaluate the statistical significance of the estimated scale values. In the present study, a resampling technique (bootstrapping) was used to construct confidence limits for scale values estimated from MDS profile analysis. These bootstrap confidence limits were used, in turn, to evaluate the significance of marker variables of the profiles. The results from analyses of both simulation data and real data suggest that the bootstrap method may be valid and may be used to evaluate hypotheses about the statistical significance of marker variables of MDS profiles. PMID:16097342
Resampling Methods Revisited: Advancing the Understanding and Applications in Educational Research
Bai, Haiyan; Pan, Wei
2008-01-01
Resampling methods including randomization test, cross-validation, the jackknife and the bootstrap are widely employed in the research areas of natural science, engineering and medicine, but they lack appreciation in educational research. The purpose of the present review is to revisit and highlight the key principles and developments of…
The Chopthin Algorithm for Resampling
Gandy, Axel; Lau, F. Din-Houn
2016-08-01
Resampling is a standard step in particle filters and more generally sequential Monte Carlo methods. We present an algorithm, called chopthin, for resampling weighted particles. In contrast to standard resampling methods the algorithm does not produce a set of equally weighted particles; instead it merely enforces an upper bound on the ratio between the weights. Simulation studies show that the chopthin algorithm consistently outperforms standard resampling methods. The algorithms chops up particles with large weight and thins out particles with low weight, hence its name. It implicitly guarantees a lower bound on the effective sample size. The algorithm can be implemented efficiently, making it practically useful. We show that the expected computational effort is linear in the number of particles. Implementations for C++, R (on CRAN), Python and Matlab are available.
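For contrast with chopthin, the baseline it is compared against is standard multinomial resampling, which returns equally weighted particles, and the quantity it implicitly bounds is the effective sample size. Both fit in a few lines of Python; chopthin's own chop-and-thin logic is more involved and is not reproduced here, so the sketch below covers only the baseline and the ESS diagnostic, with made-up particles.

```python
import random

def effective_sample_size(weights):
    """ESS = (sum w)^2 / (sum w^2); chopthin keeps this bounded below
    by enforcing an upper bound on the ratio between weights."""
    s = sum(weights)
    return s * s / sum(w * w for w in weights)

def multinomial_resample(particles, weights, seed=6):
    """Baseline multinomial resampling: draw N particles with probability
    proportional to their weights; output particles are equally weighted."""
    rng = random.Random(seed)
    n = len(particles)
    resampled = rng.choices(particles, weights=weights, k=n)
    return resampled, [1.0] * n

particles = [0.0, 1.0, 2.0, 3.0]
weights = [0.1, 0.1, 0.1, 3.7]   # one dominant particle -> low ESS
ess_before = effective_sample_size(weights)
resampled, new_weights = multinomial_resample(particles, weights)
ess_after = effective_sample_size(new_weights)
```

After equal-weight resampling the ESS is restored to N, at the cost of duplicating heavy particles; chopthin instead only chops heavy particles and thins light ones, leaving weights unequal but bounded.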
Schick, Simon; Rössler, Ole; Weingartner, Rolf
2016-10-01
Based on a hindcast experiment for the period 1982-2013 in 66 sub-catchments of the Swiss Rhine, the present study compares two approaches of building a regression model for seasonal streamflow forecasting. The first approach selects a single "best guess" model, which is tested by leave-one-out cross-validation. The second approach implements the idea of bootstrap aggregating, where bootstrap replicates are employed to select several models, and out-of-bag predictions provide model testing. The target value is mean streamflow for durations of 30, 60 and 90 days, starting with the 1st and 16th day of every month. Compared to the best guess model, bootstrap aggregating reduces the mean squared error of the streamflow forecast by seven percent on average. Thus, if resampling is anyway part of the model building procedure, bootstrap aggregating seems to be a useful strategy in statistical seasonal streamflow forecasting. Since the improved accuracy comes at the cost of a less interpretable model, the approach might be best suited for pure prediction tasks, e.g. as in operational applications.
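The bootstrap-aggregating loop described above, fitting one model per bootstrap replicate, averaging predictions, and using the left-out (out-of-bag) rows for testing, can be sketched generically. The "model" below is deliberately trivial (it predicts the training mean) and the streamflow numbers are made up; the study itself fits regression models.

```python
import random

def fit_mean_model(sample):
    """A deliberately trivial 'model': always predict the training mean."""
    m = sum(sample) / len(sample)
    return lambda: m

def bag(data, n_models=100, seed=7):
    """Bootstrap aggregating: fit one model per bootstrap replicate and
    average predictions; out-of-bag rows test each model."""
    rng = random.Random(seed)
    n = len(data)
    preds, oob_errors = [], []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]
        model = fit_mean_model([data[i] for i in idx])
        preds.append(model())
        present = set(idx)
        # Out-of-bag rows: observations this replicate never drew.
        for i in range(n):
            if i not in present:
                oob_errors.append((data[i] - model()) ** 2)
    return sum(preds) / len(preds), sum(oob_errors) / len(oob_errors)

# Hypothetical 30-day mean streamflow values for eight past seasons.
flows = [30.0, 45.0, 38.0, 52.0, 41.0, 35.0, 48.0, 44.0]
bagged_pred, oob_mse = bag(flows)
```

The out-of-bag mean squared error plays the role that leave-one-out cross-validation plays for the single best-guess model in the comparison above.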
Beran, Rudolf
1994-01-01
This essay is organized around the theoretical and computational problem of constructing bootstrap confidence sets, with forays into related topics. The seven section headings are: Introduction; The Bootstrap World; Bootstrap Confidence Sets; Computing Bootstrap Confidence Sets; Quality of Bootstrap Confidence Sets; Iterated and Two-step Bootstrap; Further Resources.
On the M fewer than N bootstrap approximation to the trimmed mean
Gribkova, N.; Helmers, R.
2008-01-01
We show that the M fewer than N (N is the real data sample size, M denotes the size of the bootstrap resample; M/N → 0 as M → ∞) bootstrap approximation to the distribution of the trimmed mean is consistent without any conditions on the population distribution F, whereas Efron's naive (i.e. M = N)
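Mechanically, the "M fewer than N" bootstrap differs from the naive one only in the resample size: each pseudo-sample has m observations drawn from the n available, with m much smaller than n. A minimal Python sketch applied to a trimmed mean (the data, trimming fraction and choice m = 8 are made-up illustrations, not the paper's setting):

```python
import random

def trimmed_mean(values, trim_frac=0.2):
    """Mean after symmetrically discarding a fraction of each tail."""
    s = sorted(values)
    k = int(len(s) * trim_frac)
    core = s[k:len(s) - k] if k > 0 else s
    return sum(core) / len(core)

def m_out_of_n_bootstrap(data, m, statistic, n_resamples=500, seed=8):
    """'M fewer than N' bootstrap: each resample has size m < len(data)."""
    rng = random.Random(seed)
    n = len(data)
    return [statistic([data[rng.randrange(n)] for _ in range(m)])
            for _ in range(n_resamples)]

data = [1.0, 1.5, 2.0, 2.2, 2.5, 2.8, 3.0, 3.2, 3.5, 3.8,
        4.0, 4.2, 4.5, 4.8, 5.0, 5.2, 5.5, 6.0, 8.0, 9.0]
reps = m_out_of_n_bootstrap(data, m=8, statistic=trimmed_mean)
```

Setting `m = len(data)` recovers Efron's naive bootstrap, which is exactly the case the abstract contrasts against.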
A Score Based Approach to Wild Bootstrap Inference
Patrick M. Kline; Andres Santos
2010-01-01
We propose a generalization of the wild bootstrap of Wu (1986) and Liu (1988) based upon perturbing the scores of M-estimators. This "score bootstrap" procedure avoids recomputing the estimator in each bootstrap iteration, making it substantially less costly to compute than the conventional nonparametric bootstrap, particularly in complex nonlinear models. Despite this computational advantage, in the linear model, the score bootstrap studentized test statistic is equivalent to that of the con...
Improving the Reliability of Bootstrap Tests with the Fast Double Bootstrap
Davidson, Russell; MacKinnon, James
2006-01-01
Two procedures are proposed for estimating the rejection probabilities of bootstrap tests in Monte Carlo experiments without actually computing a bootstrap test for each replication. These procedures are only about twice as expensive (per replication) as estimating rejection probabilities for asymptotic tests. Then a new procedure is proposed for computing bootstrap P values that will often be more accurate than ordinary ones. This “fast double bootstrap” is closely related to the double boot...
The wild bootstrap for multilevel models
Modugno, Lucia; Giannerini, Simone
2015-01-01
In this paper we study the performance of the most popular bootstrap schemes for multilevel data. Also, we propose a modified version of the wild bootstrap procedure for hierarchical data structures. The wild bootstrap does not require homoscedasticity or assumptions on the distribution of the error processes. Hence, it is a valuable tool for robust inference in a multilevel framework. We assess the finite size performances of the schemes through a Monte Carlo study. The results show that for...
Magno, Alexandre
2013-01-01
A practical, step-by-step tutorial on developing websites for mobile using Bootstrap. This book is for anyone who wants to get acquainted with the new features available in Bootstrap 3 and who wants to develop websites with the mobile-first feature of Bootstrap. The reader should have a basic knowledge of Bootstrap as a frontend framework.
Investigations of dipole localization accuracy in MEG using the bootstrap.
Darvas, F; Rautiainen, M; Pantazis, D; Baillet, S; Benali, H; Mosher, J C; Garnero, L; Leahy, R M
2005-04-01
We describe the use of the nonparametric bootstrap to investigate the accuracy of current dipole localization from magnetoencephalography (MEG) studies of event-related neural activity. The bootstrap is well suited to the analysis of event-related MEG data since the experiments are repeated tens or even hundreds of times and averaged to achieve acceptable signal-to-noise ratios (SNRs). The set of repetitions or epochs can be viewed as a set of independent realizations of the brain's response to the experiment. Bootstrap resamples can be generated by sampling with replacement from these epochs and averaging. In this study, we applied the bootstrap resampling technique to MEG data from somatotopic experimental and simulated data. Four fingers of the right and left hand of a healthy subject were electrically stimulated, and about 400 trials per stimulation were recorded and averaged in order to measure the somatotopic mapping of the fingers in the S1 area of the brain. Based on single-trial recordings for each finger we performed 5000 bootstrap resamples. We reconstructed dipoles from these resampled averages using the Recursively Applied and Projected (RAP)-MUSIC source localization algorithm. We also performed a simulation for two dipolar sources with overlapping time courses embedded in realistic background brain activity generated using the prestimulus segments of the somatotopic data. To find correspondences between multiple sources in each bootstrap, sample dipoles with similar time series and forward fields were assumed to represent the same source. These dipoles were then clustered by a Gaussian Mixture Model (GMM) clustering algorithm using their combined normalized time series and topographies as feature vectors. The mean and standard deviation of the dipole position and the dipole time series in each cluster were computed to provide estimates of the accuracy of the reconstructed source locations and time series. PMID:15784414
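The resampling unit in the study above is the epoch: bootstrap resamples are built by drawing whole trials with replacement and averaging them, mimicking repetitions of the averaged evoked response. That step (without the source localization or clustering that follows it in the paper) can be sketched with made-up single-channel epochs:

```python
import random

def bootstrap_epoch_averages(epochs, n_resamples=300, seed=9):
    """Bootstrap for event-related data: resample whole epochs (trials)
    with replacement and average them across trials."""
    rng = random.Random(seed)
    n = len(epochs)
    t = len(epochs[0])
    averages = []
    for _ in range(n_resamples):
        picked = [epochs[rng.randrange(n)] for _ in range(n)]
        averages.append([sum(e[j] for e in picked) / n for j in range(t)])
    return averages

# Hypothetical single-channel data: 5 trials x 4 time samples,
# with an evoked peak at the second sample.
epochs = [[0.1, 0.9, 0.4, 0.0],
          [0.0, 1.1, 0.5, 0.1],
          [0.2, 0.8, 0.3, 0.0],
          [0.1, 1.0, 0.6, 0.1],
          [0.0, 1.2, 0.4, 0.0]]
boot_avgs = bootstrap_epoch_averages(epochs)
```

Each bootstrap average would then be fed to the source localization algorithm, and the spread of the reconstructed dipole parameters across resamples estimates their accuracy.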
Kim, Jae-In; Kim, Taejung
2016-01-01
Epipolar resampling is the procedure of eliminating vertical disparity between stereo images. Due to its importance, many methods have been developed in the computer vision and photogrammetry field. However, we argue that epipolar resampling of image sequences, instead of a single pair, has not been studied thoroughly. In this paper, we compare epipolar resampling methods developed in both fields for handling image sequences. Firstly we briefly review the uncalibrated and calibrated epipolar resampling methods developed in computer vision and photogrammetric epipolar resampling methods. While it is well known that epipolar resampling methods developed in computer vision and in photogrammetry are mathematically identical, we also point out differences in parameter estimation between them. Secondly, we tested representative resampling methods in both fields and performed an analysis. We showed that for epipolar resampling of a single image pair all uncalibrated and photogrammetric methods tested could be used. More importantly, we also showed that, for image sequences, all methods tested, except the photogrammetric Bayesian method, showed significant variations in epipolar resampling performance. Our results indicate that the Bayesian method is favorable for epipolar resampling of image sequences. PMID:27011186
Efficient bootstrap with weakly dependent processes
Bravo, Francesco; Crudu, Federico
2012-01-01
The efficient bootstrap methodology is developed for overidentified moment condition models with weakly dependent observations. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics, the J-statistic for overidentifying restrictions...
Polyphase antialiasing in resampling of images.
Seidner, Daniel
2005-11-01
Changing resolution of images is a common operation. It is also common to use simple, i.e., small, interpolation kernels satisfying some "smoothness" qualities that are determined in the spatial domain. Typical applications use linear interpolation or piecewise cubic interpolation. These are popular since the interpolation kernels are small and the results are acceptable. However, since the interpolation kernel, i.e., impulse response, has a finite and small length, the frequency domain characteristics are not good. Therefore, when we enlarge the image by a rational factor of (L/M), two effects usually appear and cause a noticeable degradation in the quality of the image. The first is jagged edges and the second is low-frequency modulation of high-frequency components, such as sampling noise. Both effects result from aliasing. Enlarging an image by a factor of (L/M) is represented by first interpolating the image on a grid L times finer than the original sampling grid, and then resampling it every M grid points. While the usual treatment of the aliasing created by the resampling operation is aimed toward improving the interpolation filter in the frequency domain, this paper suggests reducing the aliasing effects using a polyphase representation of the interpolation process and treating the polyphase filters separately. The suggested procedure is simple. A considerable reduction in the aliasing effects is obtained for a small interpolation kernel size. We discuss separable interpolation and so the analysis is conducted for the one-dimensional case. PMID:16279186
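The L/M enlargement described above (interpolate on a grid L times finer, then keep every M-th sample) can be sketched with a plain windowed-sinc anti-aliasing filter. The filter length and cutoff below are illustrative choices, not the paper's polyphase-optimized design, and the one-dimensional case mirrors the paper's separable analysis.

```python
import numpy as np

def resample_rational(x, L, M, taps=101):
    # Windowed-sinc low-pass on the fine grid; cutoff normalized to its Nyquist.
    cutoff = 1.0 / max(L, M)
    n = np.arange(taps) - (taps - 1) / 2
    h = cutoff * np.sinc(cutoff * n) * np.hamming(taps) * L  # gain L restores amplitude
    up = np.zeros(len(x) * L)
    up[::L] = x                       # interpolate on a grid L times finer
    y = np.convolve(up, h, mode="same")
    return y[::M]                     # resample every M grid points

t = np.linspace(0, 1, 200, endpoint=False)
x = np.sin(2 * np.pi * 5 * t)         # 5 Hz tone sampled at 200 Hz
y = resample_rational(x, L=3, M=2)    # enlarge by a factor of 3/2
```

The paper's contribution is to treat the L polyphase branches of `h` separately rather than as one long filter; the sketch above is the conventional baseline it improves on.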
Shaar, R.; Ron, H.; Tauxe, L.; Kessel, R.; Agnon, A.
2011-12-01
constraints for the 'true' value. We introduce a new bootstrap procedure to calculate a 95% confidence interval of the result. We substantiate the new procedure by conducting two independent tests. The first uses synthetic re-melted slag produced under known field intensities - 3 SD samples and 4 non-SD samples. The second compares paleointensity determinations from archaeological slag samples of the same age - 34 SD samples and 10 non-SD samples. The two tests suggest that the bootstrap technique is an optimal approach for non-ideal datasets.
Evaluating Neural Network Predictors by Bootstrapping
Blake LeBaron; Andreas S. Weigend
1994-01-01
We present a new method, inspired by the bootstrap, whose goal is to determine the quality and reliability of a neural network predictor. Our method leads to more robust forecasting along with a large amount of statistical information on forecast performance that we exploit. We exhibit the method in the context of multi-variate time series prediction on financial data from the New York Stock Exchange. It turns out that the variation due to different resamplings (i.e., splits between traini...
Non-Parametric Data Dependent Bootstrap for Conditional Moment Model
Bruce E. Hansen
2000-01-01
A new non-parametric bootstrap is introduced for dependent data. The bootstrap is based on a weighted empirical-likelihood estimate of the one-step-ahead conditional distribution, imposing the conditional moment restrictions implied by the model. This is the first dependent-data bootstrap procedure which imposes conditional moment restrictions on a bootstrap distribution. The method can be applied to form confidence intervals and p-values from hypothesis tests in Generalized Method of Moments...
Chain ladder method: Bayesian bootstrap versus classical bootstrap
Peters, Gareth W.; Wüthrich, Mario V.; Shevchenko, Pavel V.
2010-01-01
The intention of this paper is to estimate a Bayesian distribution-free chain ladder (DFCL) model using approximate Bayesian computation (ABC) methodology. We demonstrate how to estimate quantities of interest in claims reserving and compare the estimates to those obtained from classical and credibility approaches. In this context, a novel numerical procedure utilising Markov chain Monte Carlo (MCMC), ABC and a Bayesian bootstrap procedure was developed in a truly distribution-free setting. T...
Bootstrap Methods in Econometrics
MacKinnon, James G.
2006-01-01
There are many bootstrap methods that can be used for econometric analysis. In certain circumstances, such as regression models with independent and identically distributed error terms, appropriately chosen bootstrap methods generally work very well. However, there are many other cases, such as regression models with dependent errors, in which bootstrap methods do not always work well. This paper discusses a large number of bootstrap methods that can be useful in econometrics. Applications to...
A Direct Bootstrap Method for Complex Sampling Designs From a Finite Population
Antal, Erika; Tillé, Yves
2016-01-01
In complex designs, classical bootstrap methods result in a biased variance estimator when the sampling design is not taken into account. Resampled units are usually rescaled or weighted in order to achieve unbiasedness in the linear case. In the present article, we propose novel resampling methods that may be directly applied to variance estimation. These methods consist of selecting subsamples under a completely different sampling scheme from that which generated the original sample, whic...
Bootstrapping structured page segmentation
Ma, Huanfeng; Doermann, David S.
2003-01-01
In this paper, we present an approach to the bootstrap learning of a page segmentation model. The idea evolves from attempts to segment dictionaries that often have a consistent page structure, and is extended to the segmentation of more general structured documents. In cases of highly regular structure, the layout can be learned from examples of only a few pages. The system is first trained using a small number of samples, and a larger test set is processed based on the training result. After making corrections to a selected subset of the test set, these corrected samples are combined with the original training samples to generate bootstrap samples. The newly created samples are used to retrain the system, refine the learned features and resegment the test samples. This procedure is applied iteratively until the learned parameters are stable. Using this approach, we do not need to initially provide a large set of training samples. We have applied this segmentation to many structured documents such as dictionaries, phone books, spoken language transcripts, and obtained satisfying segmentation performance.
Approximate regenerative-block bootstrap for Markov chains: some simulation studies
Bertail, Patrice; Clémençon, Stéphan
2007-01-01
Abstract: In Bertail & Clémençon (2005a) a novel methodology for bootstrapping general Harris Markov chains has been proposed, which crucially exploits their renewal properties (when eventually extended via the Nummelin splitting technique) and has theoretical properties that surpass other existing methods within the Markovian framework (moving block bootstrap, sieve bootstrap, etc.). This paper is devoted to discussing practical issues related to the implementation of this specific resampling met...
Fixed-b Subsampling and Block Bootstrap: Improved Confidence Sets Based on P-value Calibration
Shao, Xiaofeng; Politis, Dimitris N.
2012-01-01
Subsampling and block-based bootstrap methods have been used in a wide range of inference problems for time series. To accommodate the dependence, these resampling methods involve a bandwidth parameter, such as subsampling window width and block size in the block-based bootstrap. In empirical work, using different bandwidth parameters could lead to different inference results, but the traditional first order asymptotic theory does not capture the choice of the bandwidth. In this article, we p...
Iterated smoothed bootstrap confidence intervals for population quantiles
Lee, SMS; Ho, YHS
2005-01-01
This paper investigates the effects of smoothed bootstrap iterations on coverage probabilities of smoothed bootstrap and bootstrap-t confidence intervals for population quantiles, and establishes the optimal kernel bandwidths at various stages of the smoothing procedures. The conventional smoothed bootstrap and bootstrap-t methods have been known to yield one-sided coverage errors of orders O(n−1/2) and o(n−2/3), respectively, for intervals based on the sample quantile of a random sample of s...
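A minimal sketch of the smoothed bootstrap for a quantile: each resample is perturbed by Gaussian kernel noise before the quantile is taken. The Silverman-style bandwidth used here is a common rule of thumb, not the stage-wise optimal bandwidths derived in the paper, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(size=500)          # synthetic sample; true median is ln 2
q, n_boot = 0.5, 2000
h = 1.06 * x.std() * len(x) ** (-1 / 5)  # rule-of-thumb kernel bandwidth

boot_quantiles = np.empty(n_boot)
for b in range(n_boot):
    resample = rng.choice(x, size=len(x), replace=True)
    smoothed = resample + h * rng.normal(size=len(x))  # the smoothing step
    boot_quantiles[b] = np.quantile(smoothed, q)

ci = np.percentile(boot_quantiles, [2.5, 97.5])  # percentile-method interval
```

Iterating the procedure (bootstrapping the bootstrap to calibrate coverage) is what the paper analyzes; the single-level sketch above is its starting point.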
Change-point in stochastic design regression and the bootstrap
Seijo, Emilio; Sen, Bodhisattva
2011-01-01
In this paper we study the consistency of different bootstrap procedures for constructing confidence intervals (CIs) for the unique jump discontinuity (change-point) in an otherwise smooth regression function in a stochastic design setting. This problem exhibits nonstandard asymptotics and we argue that the standard bootstrap procedures in regression fail to provide valid confidence intervals for the change-point. We propose a version of smoothed bootstrap, illustrate its remarkable finite sa...
Nonparametric bootstrap prediction
Fushiki, Tadayoshi; Komaki, Fumiyasu; Aihara, Kazuyuki
2005-01-01
Ensemble learning has recently been intensively studied in the field of machine learning. `Bagging' is a method of ensemble learning and uses bootstrap data to construct various predictors. The required prediction is then obtained by averaging the predictors. Harris proposed using this technique with the parametric bootstrap predictive distribution to construct predictive distributions, and showed that the parametric bootstrap predictive distribution gives asymptotically better prediction tha...
A Resampling Based Clustering Algorithm for Replicated Gene Expression Data.
Li, Han; Li, Chun; Hu, Jie; Fan, Xiaodan
2015-01-01
In gene expression data analysis, clustering is a fruitful exploratory technique to reveal the underlying molecular mechanism by identifying groups of co-expressed genes. To reduce the noise, usually multiple experimental replicates are performed. An integrative analysis of the full replicate data, instead of reducing the data to the mean profile, carries the promise of yielding more precise and robust clusters. In this paper, we propose a novel resampling based clustering algorithm for genes with replicated expression measurements. Assuming those replicates are exchangeable, we formulate the problem in the bootstrap framework, and aim to infer the consensus clustering based on the bootstrap samples of replicates. In our approach, we adopt the mixed effect model to accommodate the heterogeneous variances and implement a quasi-MCMC algorithm to conduct statistical inference. Experiments demonstrate that by taking advantage of the full replicate data, our algorithm produces more reliable clusters and has robust performance in diverse scenarios, especially when the data is subject to multiple sources of variance. PMID:26671802
A comparison of four different block bootstrap methods
Directory of Open Access Journals (Sweden)
Boris Radovanov
2014-12-01
Full Text Available The paper contains a description of four different block bootstrap methods, i.e., non-overlapping block bootstrap, overlapping block bootstrap (moving block bootstrap), stationary block bootstrap and subsampling. Furthermore, the basic goal of this paper is to quantify the relative efficiency of each mentioned block bootstrap procedure and then to compare those methods. To achieve this goal, we measure the mean square errors of the variance estimates of returns. The returns are calculated from 1250 daily observations of the Serbian stock market index BELEX15 from April 2009 to April 2014. Considering the effects of potential changes in decisions due to variations in the sample length and the purposes of use, this paper also introduces a stability analysis comprising robustness tests across different sample sizes and block lengths. Testing results indicate some changes in bootstrap method efficiencies when altering the sample size or the block length.
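Of the four methods compared above, the moving (overlapping) block bootstrap is easy to sketch: blocks of consecutive returns are drawn with replacement and concatenated, preserving short-range dependence. The AR(1) returns, block length, and replication count below are illustrative stand-ins for the BELEX15 series and the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(2)
n, block_len, n_boot = 1250, 20, 500
returns = np.empty(n)
returns[0] = rng.normal()
for t in range(1, n):                  # mild serial dependence, AR(1)
    returns[t] = 0.3 * returns[t - 1] + rng.normal()

starts = np.arange(n - block_len + 1)  # all overlapping block start positions
boot_vars = np.empty(n_boot)
for b in range(n_boot):
    picks = rng.choice(starts, size=n // block_len, replace=True)
    series = np.concatenate([returns[s:s + block_len] for s in picks])
    boot_vars[b] = series.var()        # variance of returns, the paper's target

se_of_variance = boot_vars.std()       # bootstrap SE of the variance estimate
```

Repeating this for several block lengths and sample sizes, and comparing mean square errors across the four schemes, reproduces the shape of the paper's stability analysis.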
Jongjoo, Kim; Davis, Scott K; Taylor, Jeremy F
2002-06-01
Empirical confidence intervals (CIs) for the estimated quantitative trait locus (QTL) location from selective and non-selective non-parametric bootstrap resampling methods were compared for a genome scan involving an Angus x Brahman reciprocal fullsib backcross population. Genetic maps, based on 357 microsatellite markers, were constructed for 29 chromosomes using CRI-MAP V2.4. Twelve growth, carcass composition and beef quality traits (n = 527-602) were analysed to detect QTLs utilizing (composite) interval mapping approaches. CIs were investigated for 28 likelihood ratio test statistic (LRT) profiles for the one QTL per chromosome model. The CIs from the non-selective bootstrap method were largest (87.7 cM average, or 79.2% coverage of test chromosomes). The Selective II procedure produced the smallest CI size (42.3 cM average). However, CI sizes from the Selective II procedure were more variable than those produced by the two-LOD drop method. CI ranges from the Selective II procedure were also asymmetrical (relative to the most likely QTL position) due to the bias caused by the tendency for the estimated QTL position to be at a marker position in the bootstrap samples and due to monotonicity and asymmetry of the LRT curve in the original sample. PMID:12220133
Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach
Rodrigues, D. B. B.
2015-12-01
Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision making process.
Tiwari, Mukesh K.; Adamowski, Jan
2013-10-01
A new hybrid wavelet-bootstrap-neural network (WBNN) model is proposed in this study for short term (1, 3, and 5 day; 1 and 2 week; and 1 and 2 month) urban water demand forecasting. The new method was tested using data from the city of Montreal in Canada. The performance of the WBNN method was compared with the autoregressive integrated moving average (ARIMA) and autoregressive integrated moving average model with exogenous input variables (ARIMAX), traditional NNs, wavelet analysis-based NNs (WNN), bootstrap-based NNs (BNN), and a simple naïve persistence index model. The WBNN model was developed as an ensemble of several NNs built using bootstrap resamples of wavelet subtime series instead of raw data sets. The results demonstrated that the hybrid WBNN and WNN models produced significantly more accurate forecasting results than the traditional NN, BNN, ARIMA, and ARIMAX models. It was also found that the WBNN model reduces the uncertainty associated with the forecasts, and the performance of WBNN forecasted confidence bands was found to be more accurate and reliable than BNN forecasted confidence bands. It was found in this study that maximum temperature and total precipitation improved the accuracy of water demand forecasts using wavelet analysis. The performance of WBNN models was also compared for different numbers of bootstrap resamples (i.e., 25, 50, 100, 200, and 500) and it was found that WBNN models produced optimum results with different numbers of bootstrap resamples for different lead time forecasts with considerable variability.
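The bootstrap-ensemble idea underlying the BNN/WBNN models can be sketched with a much simpler base learner: fit one model per bootstrap resample and combine the predictions into a point forecast and a confidence band. Ordinary least squares stands in for the neural networks here, and the demand-like data are synthetic; the wavelet decomposition step is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])  # intercept + temperature proxy
y = X @ np.array([5.0, 2.0]) + rng.normal(scale=1.0, size=n)  # demand-like target

n_boot = 100
preds = np.empty((n_boot, n))
for b in range(n_boot):
    idx = rng.integers(0, n, size=n)   # bootstrap resample of the training rows
    coef, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    preds[b] = X @ coef                # each resample yields one ensemble member

ensemble_mean = preds.mean(axis=0)                         # point forecast
lower, upper = np.percentile(preds, [2.5, 97.5], axis=0)   # confidence band
```

In the WBNN model the same loop would run over bootstrap resamples of wavelet subseries, with a neural network in place of the least-squares fit.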
Model Based Bootstrap Methods for Interval Censored Data
Sen, Bodhisattva; Xu, Gongjun
2013-01-01
We investigate the performance of model based bootstrap methods for constructing point-wise confidence intervals around the survival function with interval censored data. We show that bootstrapping from the nonparametric maximum likelihood estimator of the survival function is inconsistent for both the current status and case 2 interval censoring models. A model based smoothed bootstrap procedure is proposed and shown to be consistent. In addition, simulation studies are conducted to illustra...
Echeverri, Alejandro Castedo; von Harling, Benedict; Serone, Marco
2016-09-01
We study the numerical bounds obtained using a conformal-bootstrap method — advocated in ref. [1] but never implemented so far — where different points in the plane of conformal cross ratios z and z̄ are sampled. In contrast to the most used method based on derivatives evaluated at the symmetric point z = z̄ = 1/2, we can consistently "integrate out" higher-dimensional operators and get a reduced, simpler, and faster to solve set of bootstrap equations. We test this "effective" bootstrap by studying the 3D Ising and O(n) vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the O(n) vector models, with n = 2, 3, 4, which have not yet been computed using bootstrap techniques.
Bootstrapping phylogenies inferred from rearrangement data
Directory of Open Access Journals (Sweden)
Lin Yu
2012-08-01
Full Text Available Abstract Background Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. Results We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its
Bootstrap percolation with inhibition
Einarsson, Hafsteinn; Lengler, Johannes; Panagiotou, Konstantinos; Mousset, Frank; Steger, Angelika
2014-01-01
Bootstrap percolation is a prominent framework for studying the spreading of activity on a graph. We begin with an initial set of active vertices. The process then proceeds in rounds, and further vertices become active as soon as they have a certain number of active neighbors. A recurring feature in bootstrap percolation theory is an `all-or-nothing' phenomenon: either the size of the starting set is so small that the process stops very soon, or it percolates (almost) completely. Motivated by...
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
Prabodh Shukla
2008-08-01
Bootstrap percolation transition may be first order or second order, or it may have a mixed character where a first-order drop in the order parameter is preceded by critical fluctuations. Recent studies have indicated that the mixed transition is characterized by power-law avalanches, while the continuous transition is characterized by truncated avalanches in a related sequential bootstrap process. We explain this behaviour on the basis of an analytical and numerical study of the avalanche distributions on a Bethe lattice.
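The basic process behind these avalanche results is straightforward to simulate. Below is a sketch of r-neighbor bootstrap percolation on a 2-D square lattice rather than the paper's Bethe lattice: a site activates once at least r of its nearest neighbors are active, and the rule is iterated to a fixed point. Grid size, r, and the initial density are illustrative choices.

```python
import numpy as np

def bootstrap_percolation(active, r=2, max_rounds=10_000):
    active = active.copy()
    for _ in range(max_rounds):
        # Count active von Neumann neighbors via zero-padded shifts.
        p = np.pad(active, 1).astype(int)
        nbrs = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
        new = active | (nbrs >= r)       # activation is irreversible
        if np.array_equal(new, active):  # fixed point reached
            return new
        active = new
    return active

rng = np.random.default_rng(4)
init = rng.random((50, 50)) < 0.10       # 10% of sites initially active
final = bootstrap_percolation(init, r=2)
```

Tracking the number of newly activated sites per round in this loop gives the avalanche sizes whose distribution distinguishes the mixed from the continuous transition.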
Wild cluster bootstrap confidence intervals
MacKinnon, James G.
2014-01-01
Confidence intervals based on cluster-robust covariance matrices can be constructed in many ways. In addition to conventional intervals obtained by inverting Wald (t) tests, the paper studies intervals obtained by inverting LM tests, studentized bootstrap intervals based on the wild cluster bootstrap, and restricted bootstrap intervals obtained by inverting bootstrap Wald and LM tests. It also studies the choice of an auxiliary distribution for the wild bootstrap, a modified covariance matrix...
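The wild cluster bootstrap at the core of these intervals can be sketched as follows: regression residuals are multiplied by a single Rademacher sign per cluster, so within-cluster dependence is preserved in each resample. The clustered data-generating process and the plain percentile interval below are illustrative simplifications of the paper's studentized and restricted variants.

```python
import numpy as np

rng = np.random.default_rng(5)
G, m = 30, 20                                   # clusters, observations per cluster
cluster = np.repeat(np.arange(G), m)
x = rng.normal(size=G * m)
u = rng.normal(size=G)[cluster] + rng.normal(size=G * m)  # cluster effect + noise
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(G * m), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

n_boot = 999
boot_slopes = np.empty(n_boot)
for b in range(n_boot):
    signs = rng.choice([-1.0, 1.0], size=G)[cluster]  # one Rademacher sign per cluster
    y_star = X @ beta + signs * resid                 # wild-bootstrap response
    bb, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    boot_slopes[b] = bb[1]

ci = np.percentile(boot_slopes, [2.5, 97.5])
```

The paper's preferred intervals replace the raw slope with a cluster-robust t-statistic and invert bootstrap tests; the resampling mechanism is the same.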
Breakdown theory for bootstrap quantiles
Singh, Kesar
1998-01-01
A general formula for computing the breakdown point in robustness for the $t$th bootstrap quantile of a statistic $T_n$ is obtained. The answer depends on $t$ and the breakdown point of $T_n$. Since the bootstrap quantiles are vital ingredients of bootstrap confidence intervals, the theory has implications pertaining to robustness of bootstrap confidence intervals. For certain $L$ and $M$ estimators, a robustification of bootstrap is suggested via the notion of Winsorization.
Simulation-Optimization via Kriging and Bootstrapping: A Survey (Revision of CentER DP 2011-064)
Kleijnen, Jack P.C.
2013-01-01
Abstract: This article surveys optimization of simulated systems. The simulation may be either deterministic or random. The survey reflects the author’s extensive experience with simulation-optimization through Kriging (or Gaussian process) metamodels. The analysis of these metamodels may use parametric bootstrapping for deterministic simulation or distribution-free bootstrapping (or resampling) for random simulation. The survey covers: (1) Simulation-optimization through "efficient global op...
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and they require specialised statistical methods to properly account for their particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations originated from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
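The key device described above, an imputation step inside the resampling loop, can be sketched in a univariate setting: nondetects are flagged, and each bootstrap resample replaces them with a random draw below the detection limit. Uniform imputation is a deliberately simple stand-in for the model-based log-ratio imputations studied in the paper, and the lognormal concentrations are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
true_conc = rng.lognormal(mean=0.0, sigma=1.0, size=300)
dl = 0.4                                          # detection limit
observed = np.where(true_conc < dl, np.nan, true_conc)  # nondetects flagged as NaN

n_boot = 1000
boot_means = np.empty(n_boot)
for b in range(n_boot):
    sample = rng.choice(observed, size=len(observed), replace=True)
    nd = np.isnan(sample)
    # Re-imputing inside each resample propagates the nondetect uncertainty.
    sample[nd] = rng.uniform(0.0, dl, size=nd.sum())
    boot_means[b] = sample.mean()

ci = np.percentile(boot_means, [2.5, 97.5])       # interval reflecting both sources
```

Because the imputation is redone in every resample, the resulting interval is wider than one computed after a single one-off imputation, which is exactly the propagation the scheme is designed to achieve.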
Quantitative evaluation of PET image using event information bootstrap
Song, Hankyeol; Kwak, Shin Hye; Kim, Kyeong Min; Kang, Joo Hyun; Chung, Yong Hyun; Woo, Sang-Keun
2016-04-01
The purpose of this study was to evaluate the improvement in PET image quality achievable through event-level bootstrapping of small-animal PET data. In order to investigate the time difference condition, realigned sinograms were generated from randomly sampled data sets using the bootstrap. List-mode data were obtained from a small-animal PET scanner for Ge-68 (30 sec), Y-90 (20 min) and Y-90 (60 min). PET images were reconstructed by Ordered Subset Expectation Maximization (OSEM) 2D from the list-mode format. Image analysis examined the signal-to-noise ratio (SNR) of the Ge-68 and Y-90 images. The SNR percent change of the non-parametrically resampled PET image for Ge-68 30 sec, Y-90 60 min, and Y-90 20 min was 1.69%, 7.03%, and 4.78%, respectively. The SNR percent change of the non-parametrically resampled PET image with the time difference condition was 1.08% for Ge-68 30 sec, 6.74% for Y-90 60 min and 10.94% for Y-90 20 min. The results indicate that the bootstrap with the time difference condition has the potential to improve noisy Y-90 PET image quality. This method is expected to reduce Y-90 PET measurement time and to enhance its accuracy.
Using Commonly Available Software for Conducting Bootstrap Analyses.
Fan, Xitao
Bootstrap analysis, both for nonparametric statistical inference and for describing sample results stability and replicability, has been gaining prominence among quantitative researchers in educational and psychological research. Procedurally, however, it is often quite a challenge for quantitative researchers to implement bootstrap analysis in…
Quilty, John; Adamowski, Jan; Khalil, Bahaa; Rathinasamy, Maheswaran
2016-03-01
The input variable selection problem has recently garnered much interest in the time series modeling community, especially within water resources applications, where information-theoretic (nonlinear) input variable selection algorithms such as partial mutual information (PMI) selection (PMIS) have been shown to provide an improved representation of the modeled process when compared to linear alternatives such as partial correlation input selection (PCIS). PMIS is a popular algorithm for water resources modeling problems considering nonlinear input variable selection; however, it requires the specification of two nonlinear regression models, each with parametric settings that greatly influence the selected input variables. Other attempts to develop input variable selection methods using conditional mutual information (CMI) (an analog to PMI) have been formulated under different parametric assumptions, such as k-nearest-neighbor (KNN) statistics or kernel density estimates (KDE). In this paper, we introduce a new input variable selection method based on CMI that uses a nonparametric multivariate continuous probability estimator based on Edgeworth approximations (EA). We improve the EA method by accounting for the uncertainty in the input variable selection procedure, introducing a bootstrap resampling procedure that uses rank statistics to order the selected input sets; we name the proposed method bootstrap rank-ordered CMI (broCMI). We demonstrate the superior performance of broCMI when compared to CMI-based alternatives (EA, KDE, and KNN), PMIS, and PCIS input variable selection algorithms on a set of seven synthetic test problems and a real-world urban water demand (UWD) forecasting experiment in Ottawa, Canada.
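A drastically simplified illustration of the bootstrap rank-ordering step (using absolute Pearson correlation as a stand-in for the CMI estimator, which is far more involved): each candidate input is scored on every bootstrap resample, the candidates are ranked per resample, and the ranks are averaged.

```python
import random
import statistics

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0

def bootstrap_rank_inputs(X, y, B=200, seed=31):
    """For each bootstrap resample, score every candidate input series in X
    against the output y and rank them; return the mean rank per input
    (lower mean rank = more consistently selected)."""
    rng = random.Random(seed)
    n, p = len(y), len(X)
    mean_rank = [0.0] * p
    for _ in range(B):
        idx = [rng.randrange(n) for _ in range(n)]
        scores = [abs(corr([X[j][i] for i in idx], [y[i] for i in idx]))
                  for j in range(p)]
        order = sorted(range(p), key=lambda j: -scores[j])
        for r, j in enumerate(order):
            mean_rank[j] += r / B
    return mean_rank
```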
Fourier transform resampling: Theory and application
International Nuclear Information System (INIS)
One of the most challenging problems in medical imaging is the development of reconstruction algorithms for nonstandard geometries. This work focuses on the application of Fourier analysis to the problem of resampling or rebinning. Conventional resampling methods utilizing some form of interpolation almost always result in a loss of resolution in the tomographic image. Fourier Transform Resampling (FTRS) offers potential improvement because the Modulation Transfer Function (MTF) of the process behaves like an ideal low-pass filter. The MTF, however, is nonstationary if the coordinate transformation is nonlinear. FTRS may be viewed as a generalization of the linear coordinate transformations of standard Fourier analysis. Simulated MTFs were obtained by projecting point sources at different transverse positions in the flat fan beam detector geometry. These MTFs were compared to the closed-form expression for FTRS. Excellent agreement was obtained for frequencies at or below the estimated cutoff frequency. The resulting FTRS algorithm is applied to simulations with symmetric fan beam geometry, an elliptical orbit and uniform attenuation, with a normalized root mean square error (NRMSE) of 0.036. Also, a Tc-99m point source study (1 cm dia., placed in air 10 cm from the COR) for a circular fan beam acquisition was reconstructed with a hybrid resampling method. The FWHM of the hybrid resampling method was 11.28 mm and compares favorably with a direct reconstruction (FWHM: 11.03 mm)
BoCluSt: Bootstrap Clustering Stability Algorithm for Community Detection.
Garcia, Carlos
2016-01-01
The identification of modules or communities in sets of related variables is a key step in the analysis and modeling of biological systems. Procedures for this identification are usually designed to allow fast analyses of very large datasets and may produce suboptimal results when these sets are of a small to moderate size. This article introduces BoCluSt, a new, somewhat more computationally intensive, community detection procedure that is based on combining a clustering algorithm with a measure of stability under bootstrap resampling. Both computer simulation and analyses of experimental data showed that BoCluSt can outperform current procedures in the identification of multiple modules in data sets with a moderate number of variables. In addition, the procedure provides users with a null distribution of results to evaluate the support for the existence of community structure in the data. BoCluSt takes individual measures for a set of variables as input, and may be a valuable and robust exploratory tool of network analysis, as it provides 1) an estimation of the best partition of variables into modules, 2) a measure of the support for the existence of modular structures, and 3) an overall description of the whole structure, which may reveal hierarchical modular situations, in which modules are composed of smaller sub-modules.
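The core stability idea can be sketched in a few lines (a toy 1-D two-means version of my own, not the BoCluSt algorithm itself): cluster the full sample, then count how often bootstrap resamples reproduce the same partition of the original points.

```python
import random

def two_means_1d(xs, iters=20):
    """Tiny 2-means for 1-D values; returns a 0/1 label per value."""
    c0, c1 = min(xs), max(xs)
    labels = [0] * len(xs)
    for _ in range(iters):
        labels = [0 if abs(x - c0) <= abs(x - c1) else 1 for x in xs]
        g0 = [x for x, lb in zip(xs, labels) if lb == 0]
        g1 = [x for x, lb in zip(xs, labels) if lb == 1]
        if g0: c0 = sum(g0) / len(g0)
        if g1: c1 = sum(g1) / len(g1)
    return labels

def bootstrap_stability(xs, B=200, seed=3):
    """Fraction of bootstrap replicates whose fitted 2-cluster partition
    reproduces the full-sample partition (up to label swap)."""
    rng = random.Random(seed)
    base = two_means_1d(xs)
    hits = 0
    for _ in range(B):
        res = [xs[rng.randrange(len(xs))] for _ in xs]
        lab = two_means_1d(res)
        g0 = [x for x, lb in zip(res, lab) if lb == 0]
        g1 = [x for x, lb in zip(res, lab) if lb == 1]
        c0 = sum(g0) / len(g0) if g0 else min(res)
        c1 = sum(g1) / len(g1) if g1 else max(res)
        # relabel the ORIGINAL points with the resample's centroids
        relab = [0 if abs(x - c0) <= abs(x - c1) else 1 for x in xs]
        hits += (relab == base) or ([1 - lb for lb in relab] == base)
    return hits / B
```

Well-separated clusters give stability near 1; data without modular structure give much lower values, which is the signal BoCluSt exploits.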
Bootstrap position analysis for forecasting low flow frequency
Tasker, Gary D.; Dunne, P.
1997-01-01
A method of random resampling of residuals from stochastic models is used to generate a large number of 12-month-long traces of natural monthly runoff for use in a position analysis model of a water-supply storage and delivery system. Position analysis uses the traces to forecast the likelihood of specified outcomes, such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows, conditioned on the current reservoir levels and streamflows. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality, fewer parameters need to be estimated directly from the data, and parameter uncertainty is easily accounted for. For a given set of operating rules and water-use requirements for a system, water managers can use such a model as a decision-making tool to evaluate different operating rules.
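A minimal sketch of the residual-resampling step, assuming a plain AR(1) model rather than the seasonal stochastic models used in practice:

```python
import random
import statistics

def bootstrap_traces(flows, n_traces=500, horizon=12, seed=11):
    """Fit an AR(1) to a monthly flow record, then build `horizon`-month
    traces by resampling the model residuals with replacement, so no
    normality assumption is needed for the innovations."""
    mu = statistics.mean(flows)
    dev = [f - mu for f in flows]
    phi = sum(a * b for a, b in zip(dev[1:], dev[:-1])) / sum(d * d for d in dev[:-1])
    resid = [dev[t] - phi * dev[t - 1] for t in range(1, len(flows))]
    rng = random.Random(seed)
    traces = []
    for _ in range(n_traces):
        x, trace = flows[-1] - mu, []   # condition on the current flow
        for _ in range(horizon):
            x = phi * x + rng.choice(resid)
            trace.append(mu + x)
        traces.append(trace)
    return traces
```

The fraction of traces whose minimum falls below a statutory passing flow then estimates the probability of that outcome over the coming year.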
Bootstrapping Density-Weighted Average Derivatives
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael
Employing the "small bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of a variety of bootstrap-based inference procedures associated with the kernel-based density-weighted averaged derivative estimator proposed by Powell, Stock, and Stoker...
Maximum a posteriori resampling of noisy, spatially correlated data
Goff, John A.; Jenkins, Chris; Calder, Brian
2006-08-01
is typical for such data, affected by speckle noise. Compared to filtering, maximum a posteriori resampling provides an objective and optimal method for reducing noise, and better preservation of the statistical properties of the sampled field. The primary disadvantage is that maximum a posteriori resampling is a computationally expensive procedure.
Bootstrapped models for intrinsic random functions
Energy Technology Data Exchange (ETDEWEB)
Campbell, K.
1988-08-01
Use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process. The fact that this function has to be estimated from data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the bootstrap in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as their kriging variance, provide a reasonable picture of variability introduced by imperfect estimation of the generalized covariance function.
Rejon-Barrera, Fernando
2015-01-01
We work out all of the details required for implementation of the conformal bootstrap program applied to the four-point function of two scalars and two vectors in an abstract conformal field theory in arbitrary dimension. This includes a review of which tensor structures make appearances, a construction of the projectors onto the required mixed symmetry representations, and a computation of the conformal blocks for all possible operators which can be exchanged. These blocks are presented as differential operators acting upon the previously known scalar conformal blocks. Finally, we set up the bootstrap equations which implement crossing symmetry. Special attention is given to the case of conserved vectors, where several simplifications occur.
Introduction to the Bootstrap World
Boos, Dennis D.
2003-01-01
The bootstrap has made a fundamental impact on how we carry out statistical inference in problems without analytic solutions. This fact is illustrated with examples and comments that emphasize the parametric bootstrap and hypothesis testing.
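A parametric bootstrap hypothesis test of a mean, in the spirit of the examples the article emphasizes (a generic sketch, assuming a normal model under the null):

```python
import random
import statistics

def parametric_bootstrap_pvalue(sample, null_mean, B=2000, seed=5):
    """Parametric bootstrap test of H0: mean == null_mean under a normal
    model: simulate data from the null-constrained model and compare the
    observed |mean - null_mean| with its bootstrap distribution."""
    rng = random.Random(seed)
    sd = statistics.stdev(sample)
    obs = abs(statistics.mean(sample) - null_mean)
    n = len(sample)
    count = 0
    for _ in range(B):
        sim = [rng.gauss(null_mean, sd) for _ in range(n)]
        if abs(statistics.mean(sim) - null_mean) >= obs:
            count += 1
    return (count + 1) / (B + 1)   # add-one rule avoids a zero p-value
```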
An approximate analytical approach to resampling averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, M.
2004-01-01
Using a novel reformulation, we develop a framework to compute approximate resampling data averages analytically. The method avoids multiple retraining of statistical models on the samples. Our approach uses a combination of the replica "trick" of statistical physics and the TAP approach for appr...
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also synthetic but follows a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from the bootstrap. © 2010 International Association for Mathematical Geosciences.
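The LU-decomposition resampling idea can be sketched as follows: build the covariance matrix implied by a covariance/semivariogram model, factor it, and multiply the lower-triangular factor by white noise to obtain one spatially correlated resample. The 1-D exponential covariance below is an illustrative assumption, not the paper's setup:

```python
import math
import random

def cholesky(C):
    """Lower-triangular L with L L^T = C (plain-Python Cholesky)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(C[i][i] - s) if i == j else (C[i][j] - s) / L[j][j]
    return L

def spatial_resample(coords, sill=1.0, corr_range=10.0, seed=13):
    """Draw one spatially correlated Gaussian resample on 1-D coordinates
    using an exponential covariance C(h) = sill * exp(-3h / range)."""
    rng = random.Random(seed)
    n = len(coords)
    C = [[sill * math.exp(-3.0 * abs(coords[i] - coords[j]) / corr_range)
          for j in range(n)] for i in range(n)]
    L = cholesky(C)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]
```

Empirical semivariograms computed on many such resamples give the bootstrap distribution from which percentile confidence intervals are read off.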
Testing for heteroscedasticity in jumpy and noisy high-frequency data: A resampling approach
DEFF Research Database (Denmark)
Christensen, Kim; Hounyo, Ulrich; Podolskij, Mark
in the presence of a heteroscedastic volatility term (and has a standard normal distribution otherwise). The test is inspected in a general Monte Carlo simulation setting, where we note that in finite samples the asymptotic theory is severely distorted by infinite-activity price jumps. To improve inference, we suggest a bootstrap approach to test the null of homoscedasticity. We prove the first-order validity of this procedure, while in simulations the bootstrap leads to almost correctly sized tests. As an illustration, we apply the bootstrapped version of our t-statistic to a large cross-section of equity high...
Poland, David; Simmons-Duffin, David
2016-06-01
The conformal bootstrap was proposed in the 1970s as a strategy for calculating the properties of second-order phase transitions. After spectacular success elucidating two-dimensional systems, little progress was made on systems in higher dimensions until a recent renaissance beginning in 2008. We report on some of the main results and ideas from this renaissance, focusing on new determinations of critical exponents and correlation functions in the three-dimensional Ising and O(N) models.
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' estimations, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil, monitored by one daily streamflow gage (24-year period). The original streamflow time series has been randomly resampled for different times or sample sizes (N = 500; ...; 1000), then applied to the conventional bootstrap approach and variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five Environmental Flow Requirement methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those 'Environmental Flow Requirement' (EFR) methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods and using the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods. The uncertainties arisen during EFR methods assessment will be
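For one of the EFR statistics, a percentile-bootstrap confidence interval might be sketched like this (plain i.i.d. resampling for brevity; the nearest-neighbor and moving-blocks variants mentioned above would replace the resampling step to respect serial dependence):

```python
import random

def q_exceed(flows, p):
    """Flow exceeded p% of the time (empirical flow-duration quantile)."""
    s = sorted(flows, reverse=True)
    k = max(0, min(len(s) - 1, int(round(p / 100 * len(s))) - 1))
    return s[k]

def bootstrap_ci_q90(flows, B=1000, alpha=0.1, seed=17):
    """Percentile-bootstrap confidence interval for the Q90 statistic."""
    rng = random.Random(seed)
    reps = []
    for _ in range(B):
        res = [flows[rng.randrange(len(flows))] for _ in flows]
        reps.append(q_exceed(res, 90))
    reps.sort()
    return reps[int(alpha / 2 * B)], reps[int((1 - alpha / 2) * B) - 1]
```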
Bo E. Honoré; Hu, Luojia
2015-01-01
The bootstrap is a convenient tool for calculating standard errors of the parameters of complicated econometric models. Unfortunately, the fact that these models are complicated often makes the bootstrap extremely slow or even practically infeasible. This paper proposes an alternative to the bootstrap that relies only on the estimation of one-dimensional parameters. The paper contains no new difficult math. But we believe that it can be useful.
Detrending bootstrap unit root tests
Smeekes, S.
2009-01-01
The role of detrending in bootstrap unit root tests is investigated. When bootstrapping, detrending must not only be done for the construction of the test statistic, but also in the first step of the bootstrap algorithm. It is argued that the two points should be treated separately. Asymptotic validity of sieve bootstrap ADF unit root tests is shown for test statistics based on full sample and recursive OLS and GLS detrending. It is also shown that the detrending method in the first step of t...
Bootstrapping and Bartlett corrections in the cointegrated VAR model
Omtzigt, P.H.; Fachin, S.
2002-01-01
The small sample properties of tests on long-run coefficients in cointegrated systems are still a matter of concern to applied econometricians. We compare the performance of the Bartlett correction, the bootstrap and the fast double bootstrap for tests on cointegration parameters in the maximum likelihood framework. We show by means of a theoretical result and simulations that all three procedures should be based on the unrestricted estimate of the cointegration vectors. The fast double boot...
Chester, Shai M
2016-01-01
We initiate the conformal bootstrap study of Quantum Electrodynamics in $2+1$ space-time dimensions (QED$_{3}$) with $N$ flavors of charged fermions by focusing on the 4-point function of four monopole operators with the lowest unit of topological charge. We obtain upper bounds on the scaling dimension of the doubly-charged monopole operator, with and without assuming other gaps in the operator spectrum. Intriguingly, we find a (gap-dependent) kink in these bounds that comes reasonably close to the large $N$ extrapolation of the scaling dimensions of the singly-charged and doubly-charged monopole operators down to $N=4$ and $N=6$.
Iliesiu, Luca; Kos, Filip; Poland, David; Pufu, Silviu S.; Simmons-Duffin, David; Yacoby, Ran
2016-03-01
We study the conformal bootstrap for a 4-point function of fermions in 3D. We first introduce an embedding formalism for 3D spinors and compute the conformal blocks appearing in fermion 4-point functions. Using these results, we find general bounds on the dimensions of operators appearing in the ψ × ψ OPE, and also on the central charge C_T. We observe features in our bounds that coincide with scaling dimensions in the Gross-Neveu models at large N. We also speculate that other features could coincide with a fermionic CFT containing no relevant scalar operators.
Fixed-b Subsampling and Block Bootstrap: Improved Confidence Sets Based on P-value Calibration
Shao, Xiaofeng
2012-01-01
Subsampling and block-based bootstrap methods have been used in a wide range of inference problems for time series. To accommodate the dependence, these resampling methods involve a bandwidth parameter, such as subsampling window width and block size in the block-based bootstrap. In empirical work, using different bandwidth parameters could lead to different inference results, but the traditional first order asymptotic theory does not capture the choice of the bandwidth. In this article, we propose to adopt the fixed-b approach, as advocated by Kiefer and Vogelsang (2005) in the heteroscedasticity-autocorrelation robust testing context, to account for the influence of the bandwidth on the inference. Under the fixed-b asymptotic framework, we derive the asymptotic null distribution of the p-values for subsampling and the moving block bootstrap, and further propose a calibration of the traditional small-b based confidence intervals (regions, bands) and tests. Our treatment is fairly general as it includes both ...
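The moving-blocks resampling step itself is simple to sketch; the block length is exactly the bandwidth parameter whose influence the fixed-b calibration above is designed to account for:

```python
import random

def moving_blocks_bootstrap(x, block_len, B=500, seed=19):
    """Moving-blocks bootstrap: concatenate randomly chosen overlapping
    blocks of length `block_len` until a series of length len(x) is
    built; return the bootstrap replicates of the sample mean."""
    rng = random.Random(seed)
    n = len(x)
    starts = n - block_len + 1       # number of admissible block starts
    reps = []
    for _ in range(B):
        series = []
        while len(series) < n:
            s = rng.randrange(starts)
            series.extend(x[s:s + block_len])
        reps.append(sum(series[:n]) / n)
    return reps
```

Rerunning with different `block_len` values shows directly how the bandwidth choice shifts the resulting confidence intervals.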
On sieve bootstrap prediction intervals.
Andrés M. Alonso; Peña, Daniel; Romo Urroz, Juan
2003-01-01
In this paper we consider a sieve bootstrap method for constructing nonparametric prediction intervals for a general class of linear processes. We show that the sieve bootstrap provides consistent estimators of the conditional distribution of future values given the observed data.
Ultrafast Approximation for Phylogenetic Bootstrap
Bui Quang Minh; Nguyen, Thi; von Haeseler, Arndt
2013-01-01
Nonparametric bootstrap has been a widely used tool in phylogenetic analysis to assess the clade support of phylogenetic trees. However, with the rapidly growing amount of data, this task remains a computational bottleneck. Recently, approximation methods such as the RAxML rapid bootstrap (RBS) and
Explorations in Statistics: the Bootstrap
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
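The basic recipe the installment explores can be written in a few lines of Python (a generic sketch):

```python
import random
import statistics

def bootstrap_se(sample, stat=statistics.mean, B=2000, seed=23):
    """Empirical bootstrap standard error: resample the data with
    replacement B times and take the standard deviation of the
    replicated statistic."""
    rng = random.Random(seed)
    reps = [stat([sample[rng.randrange(len(sample))] for _ in sample])
            for _ in range(B)]
    return statistics.stdev(reps)
```

For the sample mean this should land close to the textbook s/√n, which makes it a good first exploration before moving to statistics without analytic standard errors.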
Second Thoughts on the Bootstrap
Efron, Bradley
2003-01-01
This brief review article is appearing in the issue of Statistical Science that marks the 25th anniversary of the bootstrap. It concerns some of the theoretical and methodological aspects of the bootstrap and how they might influence future work in statistics.
Breakdown Point Theory for Implied Probability Bootstrap
Lorenzo Camponovo; Taisuke Otsu
2011-01-01
This paper studies robustness of bootstrap inference methods under moment conditions. In particular, we compare the uniform weight and implied probability bootstraps by analyzing behaviors of the bootstrap quantiles when outliers take arbitrarily large values, and derive the breakdown points for those bootstrap quantiles. The breakdown point properties characterize the situation where the implied probability bootstrap is more robust than the uniform weight bootstrap against outliers. Simulati...
Bootstrapped models for intrinsic random functions
Energy Technology Data Exchange (ETDEWEB)
Campbell, K.
1987-01-01
The use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process, and the fact that this function has to be estimated from the data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the "bootstrap" in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as of their "kriging variance," provide a reasonable picture of the variability introduced by imperfect estimation of the generalized covariance function.
Bootstrapping pre-averaged realized volatility under market microstructure noise
DEFF Research Database (Denmark)
Hounyo, Ulrich; Goncalves, Sílvia; Meddahi, Nour
The main contribution of this paper is to propose a bootstrap method for inference on integrated volatility based on the pre-averaging approach of Jacod et al. (2009), where the pre-averaging is done over all possible overlapping blocks of consecutive observations. The overlapping nature of the pre-averaged returns implies that these are kn-dependent with kn growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the "blocks of blocks" bootstrap method suggested by Politis and Romano (1992) (and further studied by Bühlmann and Künsch (1995)) is valid only when volatility is constant. The failure of the blocks of blocks bootstrap is due to the heterogeneity of the squared pre-averaged returns when volatility is stochastic. To preserve both the dependence and the heterogeneity of squared pre-averaged returns, we propose a novel procedure that...
Collier, Scott; Yin, Xi
2016-01-01
We constrain the spectrum of two-dimensional unitary, compact conformal field theories with central charge c > 1 using modular bootstrap. Upper bounds on the gap in the dimension of primary operators of any spin, as well as in the dimension of scalar primaries, are computed numerically as functions of the central charge using semi-definite programming. Our bounds refine those of Hellerman and Friedan-Keller, and are in some cases saturated by known CFTs. In particular, we show that unitary CFTs with c < 8 must admit relevant deformations, and that a nontrivial bound on the gap of scalar primaries exists for c < 25. We also study bounds on the dimension gap in the presence of twist gaps, bounds on the degeneracy of operators, and demonstrate how "extremal spectra" which maximize the degeneracy at the gap can be determined numerically.
Building Confidence Intervals with Block Bootstraps for the Variance Ratio Test of Predictability
Eduardo José Araújo Lima; Benjamin Miranda Tabak
2007-01-01
This paper compares different versions of the multiple variance ratio test based on bootstrap techniques for the construction of empirical distributions. It also analyzes the crucial issue of selecting optimal block sizes when block bootstrap procedures are used, by applying the methods developed by Hall et al. (1995) and by Politis and White (2004). By comparing the results of the different methods using Monte Carlo simulations, we conclude that methodologies using block bootstrap methods pr...
Bootstrapping Realized Multivariate Volatility Measures.
Donovon, Prosper; Goncalves, Silvia; Meddahi, Nour
2013-01-01
We study bootstrap methods for statistics that are a function of multivariate high frequency returns such as realized regression coefficients and realized covariances and correlations. For these measures of covariation, the Monte Carlo simulation results of Barndorff-Nielsen and Shephard (2004) show that finite sample distortions associated with their feasible asymptotic theory approach may arise if sampling is not too frequent. This motivates our use of the bootstrap as an altern...
Bootstrap Dynamical Symmetry Breaking
Directory of Open Access Journals (Sweden)
Wei-Shu Hou
2013-01-01
Despite the emergence of a 125 GeV Higgs-like particle at the LHC, we explore the possibility of dynamical electroweak symmetry breaking by strong Yukawa coupling of very heavy new chiral quarks Q. Taking the 125 GeV object to be a dilaton with suppressed couplings, we note that the Goldstone bosons G exist as longitudinal modes V_L of the weak bosons and would couple to Q with Yukawa coupling λ_Q. With m_Q ≳ 700 GeV from the LHC, the strong λ_Q ≳ 4 could lead to deeply bound QQ̄ states. We postulate that the leading "collapsed state," the color-singlet (heavy isotriplet) pseudoscalar QQ̄ meson π_1, is G itself, and a gap equation without a Higgs is constructed. Dynamical symmetry breaking is effected via strong λ_Q, generating m_Q while self-consistently justifying treating G as massless in the loop; hence, "bootstrap." Solving such a gap equation, we find that m_Q should be several TeV, or λ_Q ≳ 4π, and would become much heavier if there is a light Higgs boson. For such heavy chiral quarks, we find analogy with the π–N system, by which we conjecture possible annihilation phenomena of QQ̄ → nV_L with high multiplicity, the search for which might be aided by Yukawa-bound QQ̄ resonances.
Bootstrapping Time Dilation Decoherence
Gooding, Cisco; Unruh, William G.
2015-10-01
We present a general relativistic model of a spherical shell of matter with a perfect fluid on its surface coupled to an internal oscillator, which generalizes a model recently introduced by the authors to construct a self-gravitating interferometer (Gooding and Unruh in Phys Rev D 90:044071, 2014). The internal oscillator evolution is defined with respect to the local proper time of the shell, allowing the oscillator to serve as a local clock that ticks differently depending on the shell's position and momentum. A Hamiltonian reduction is performed on the system, and an approximate quantum description is given to the reduced phase space. If we focus only on the external dynamics, we must trace out the clock degree of freedom, and this results in a form of intrinsic decoherence that shares some features with a proposed "universal" decoherence mechanism attributed to gravitational time dilation (Pikovski et al in Nat Phys, 2015). We note that the proposed decoherence remains present in the (gravity-free) limit of flat spacetime, emphasizing that the effect can be attributed entirely to proper time differences, and thus is not necessarily related to gravity. Whereas the effect described in (Pikovski et al in Nat Phys, 2015) vanishes in the absence of an external gravitational field, our approach bootstraps the gravitational contribution to the time dilation decoherence by including self-interaction, yielding a fundamentally gravitational intrinsic decoherence effect.
Colin, Yannick; Goñi-Urriza, Marisol; Caumette, Pierre; Guyoneaud, Rémy
2015-03-01
The development of new high-throughput cultivation methods aims to increase isolation efficiency as compared to standard techniques, which often require enrichment procedures to compensate for low microbial recovery. In the current study, estuarine sulfate-reducing bacteria were isolated using an anaerobic isolation procedure in 384-well microplates. Ninety-nine strains were recovered from the initial sediments. Isolates were identified according to their partial 16S rRNA sequences and clustered into 13 phylotypes. In addition, the increase in species richness obtained through enrichment or resampling was investigated. Forty-four enrichment procedures were conducted, and shifts in sulfate-reducing bacterial communities were investigated through dsrAB gene fingerprinting. Despite efforts in conducting numerous enrichment conditions, only a few of them were statistically different from the initial sample. The cultivable diversity obtained from 3 of the most divergent enrichments, as well as from resampled sediments, equally contributed to raising the sulfate-reducing diversity to 22 phylotypes. Enrichment (selection of metabolism) or resampling (transient populations and micro-heterogeneity) may still be helpful to assess new microbial phylotypes. Nevertheless, all the newly cultivated strains were representatives of minor Operational Taxonomic Units and could eventually be recovered by maintaining a high-throughput isolation effort on the initial sediments.
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
Double-bootstrap methods that use a single double-bootstrap simulation
Chang, Jinyuan; Hall, Peter
2014-01-01
We show that, when the double bootstrap is used to improve performance of bootstrap methods for bias correction, techniques based on using a single double-bootstrap sample for each single-bootstrap sample can be particularly effective. In particular, they produce third-order accuracy for much less computational expense than is required by conventional double-bootstrap methods. However, this improved level of performance is not available for the single double-bootstrap methods that have been s...
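For contrast, a single-level bootstrap bias correction looks like this (the double bootstrap would re-run this estimate on each resample to correct the correction itself):

```python
import random
import statistics

def bootstrap_bias_corrected(sample, stat, B=1000, seed=29):
    """Single-level bootstrap bias correction: estimate the bias as
    mean(stat over resamples) - stat(sample) and subtract it from the
    plug-in estimate."""
    rng = random.Random(seed)
    theta = stat(sample)
    reps = [stat([sample[rng.randrange(len(sample))] for _ in sample])
            for _ in range(B)]
    bias = statistics.mean(reps) - theta
    return theta - bias
```

Applied to the plug-in (divide-by-n) variance, the correction pushes the estimate back toward the unbiased value, which is the standard textbook check for this construction.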
A bootstrap evaluation of the effect of data splitting on financial time series.
LeBaron, B; Weigend, A S
1998-01-01
Exposes problems of the commonly used technique of splitting the available data into training, validation, and test sets that are held fixed, warns about drawing too strong conclusions from such static splits, and shows potential pitfalls of ignoring variability across splits. Using a bootstrap or resampling method, we compare the uncertainty in the solution stemming from the data splitting with neural-network specific uncertainties (parameter initialization, choice of number of hidden units, etc.). We present two results on data from the New York Stock Exchange. First, the variation due to different resamplings is significantly larger than the variation due to different network conditions. This result implies that it is important to not over-interpret a model (or an ensemble of models) estimated on one specific split of the data. Second, on each split, the neural-network solution with early stopping is very close to a linear model; no significant nonlinearities are extracted. PMID:18252443
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead conditioned on the current reservoir levels and streamflows. The large number of possible flow sequences are generated using a stochastic streamflow model with a random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and it allows incorporation of long-range weather forecasts into the analysis.
Mir, Tasika; Bernstein, Mark
2016-06-01
Background This is a qualitative study designed to examine neurosurgeons' and neuro-oncologists' perceptions of resampling surgery for glioblastoma multiforme electively, post-therapy or at asymptomatic relapse. Methods Twenty-six neurosurgeons, three radiation oncologists and one neuro-oncologist were selected using convenience sampling and interviewed. Participants were presented with hypothetical scenarios in which resampling surgery was offered within a clinical trial and another in which the surgery was offered on a routine basis. Results Over half of the participants were interested in doing this within a clinical trial. About a quarter of the participants would be willing to consider routine resampling surgery if: (1) a resection were done rather than a simple biopsy; (2) they could wait until the patient becomes symptomatic and (3) there was a preliminary in vitro study with existing tumour samples to be able to offer patients some trial drugs. The remaining quarter of participants was entirely against the trial. Participants also expressed concerns about resource allocation, financial barriers, possibilities of patient coercion and the fear of patients' inability to offer true informed consent. Conclusion Overall, if surgeons are convinced of the benefits of the trial from their information from scientists, and they feel that patients are providing truly informed consent, then the majority would be willing to consider performing the surgery. Many surgeons would still feel uncomfortable with the procedure unless they are able to offer the patient some benefit from the procedure such that the risk to benefit ratio is balanced. PMID:26760112
Medical Image Retrieval Based on Multi-Layer Resampling Template
Institute of Scientific and Technical Information of China (English)
WANG Xin-rui; YANG Yun-feng
2014-01-01
Medical imaging is used ever more widely in clinical diagnosis and treatment; how to manage the large number of images in an image management system, and how to help doctors analyze and diagnose, are therefore important issues. This paper studies medical image retrieval based on a multi-layer resampling template built on the idea of wavelet decomposition. The retrieval method consists of two stages: coarse and fine retrieval. The coarse stage retrieves medical images based on image contour features. The fine stage retrieves them based on the multi-layer resampling template: a multi-layer sampling operator extracts a resampled image at each layer, and these resampled images are then retrieved step by step, completing the progression from coarse to fine retrieval.
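The layer-by-layer idea can be illustrated with a simple block-averaging pyramid. This is a sketch of coarse-to-fine resampling only, not the paper's wavelet-based template.

```python
import numpy as np

def resample_pyramid(img, levels):
    """Multi-layer resampling sketch: halve the image at each layer by
    2x2 block averaging; coarse-to-fine retrieval compares coarsest first."""
    pyramid = [img]
    for _ in range(levels - 1):
        h, w = pyramid[-1].shape
        top = pyramid[-1][:h - h % 2, :w - w % 2]   # trim to even size
        pyramid.append((top[0::2, 0::2] + top[1::2, 0::2]
                        + top[0::2, 1::2] + top[1::2, 1::2]) / 4.0)
    return pyramid

img = np.arange(64, dtype=float).reshape(8, 8)
pyr = resample_pyramid(img, levels=3)
print([p.shape for p in pyr])   # prints "[(8, 8), (4, 4), (2, 2)]"
```

Retrieval then matches a query against the smallest layer first and descends only for promising candidates, which is what makes the coarse-to-fine scheme cheap.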
Resampling Algorithms for Particle Filters: A Computational Complexity Perspective
Directory of Open Access Journals (Sweden)
Miodrag Bolić
2004-11-01
Full Text Available Newly developed resampling algorithms for particle filters suitable for real-time implementation are described and their analysis is presented. The new algorithms reduce the complexity of both hardware and DSP realization through addressing common issues such as decreasing the number of operations and memory access. Moreover, the algorithms allow for use of higher sampling frequencies by overlapping in time the resampling step with the other particle filtering steps. Since resampling is not dependent on any particular application, the analysis is appropriate for all types of particle filters that use resampling. The performance of the algorithms is evaluated on particle filters applied to bearings-only tracking and joint detection and estimation in wireless communications. We have demonstrated that the proposed algorithms reduce the complexity without performance degradation.
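One standard low-complexity scheme in this family is systematic resampling, which needs only a single uniform draw and O(N) work; a minimal sketch (not the specific hardware-oriented algorithms of the paper):

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw places N evenly spaced
    pointers on the cumulative weight distribution; O(N) overall."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0               # guard against rounding error
    return np.searchsorted(cumulative, positions)

rng = np.random.default_rng(0)
w = np.array([0.1, 0.2, 0.3, 0.4])     # normalized particle weights
idx = systematic_resample(w, rng)
print(idx)   # ancestor indices; heavier particles are copied more often
```

Because the pointers are evenly spaced, the memory access pattern is sequential, which is one reason schemes of this kind suit the hardware and DSP realizations the abstract discusses.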
A Bootstrap Cointegration Rank Test for Panels of VAR Models
DEFF Research Database (Denmark)
Callot, Laurent
functions of the individual Cointegrated VARs (CVAR) models. A bootstrap based procedure is used to compute empirical distributions of the trace test statistics for these individual models. From these empirical distributions two panel trace test statistics are constructed. The satisfying small sample...
Bootstrap confidence intervals for three-way methods
Kiers, Henk A.L.
2004-01-01
Results from exploratory three-way analysis techniques such as CANDECOMP/PARAFAC and Tucker3 analysis are usually presented without giving insight into uncertainties due to sampling. Here a bootstrap procedure is proposed that produces percentile intervals for all output parameters. Special adjustme
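The percentile-interval mechanics can be sketched for a simple scalar statistic; the paper applies this to CANDECOMP/PARAFAC and Tucker3 output parameters, but a plain mean on hypothetical skewed data illustrates the procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.exponential(scale=2.0, size=150)   # hypothetical skewed sample

# Percentile bootstrap: resample, recompute the statistic, and take
# empirical percentiles of the replicates as the interval endpoints.
stats = np.array([np.mean(rng.choice(data, size=data.size, replace=True))
                  for _ in range(2000)])
lo, hi = np.percentile(stats, [2.5, 97.5])
print(f"95% percentile interval for the mean: [{lo:.2f}, {hi:.2f}]")
```

For three-way models the resampling unit is a row (e.g. a subject) and the recomputed "statistic" is the full set of component loadings, but the percentile step is the same.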
A Note on the Particle Filter with Posterior Gaussian Resampling
Xiong, X; Navon, I.M.; Uzunoglu, B.
2011-01-01
Particle filter (PF) is a fully non-linear filter with Bayesian conditional probability estimation, compared here with the well-known ensemble Kalman filter (EnKF). A Gaussian resampling (GR) method is proposed to generate the posterior analysis ensemble in an effective and efficient way. The Lorenz model is used to test the proposed method. The PF with Gaussian resampling (PFGR) can approximate more accurately the Bayesian analysis. The present work demonstrates that the proposed PFGR posses...
Bootstrap planning: Theory and application
Energy Technology Data Exchange (ETDEWEB)
Chen, P.C.
1994-02-01
We identify a general framework for weak planning called bootstrap planning, which is defined as global planning using only a local planner along with some memory for learning intermediate subgoals. We present a family of algorithms for bootstrap planning, and provide some initial theory on their performance. In our theoretical analysis, we develop a random digraph problem model and use it to make some performance predictions and comparisons of these algorithms. We also use it to provide some techniques for approximating the optimal resource bound on the local planner to achieve the best global planner. We validate our theoretical results with empirical demonstration on the 15-puzzle. We show how to reduce the planning cost of a global planner by 2 orders of magnitude using bootstrap planning. We also demonstrate a natural but not widely recognized connection between search costs and the lognormal distribution.
Einecke, Sabrina; Bissantz, Nicolai; Clevermann, Fabian; Rhode, Wolfgang
2016-01-01
Astroparticle experiments such as IceCube or MAGIC require a deconvolution of their measured data with respect to the response function of the detector to provide the distributions of interest, e.g. energy spectra. In this paper, appropriate uncertainty limits that also allow to draw conclusions on the geometric shape of the underlying distribution are determined using bootstrap methods, which are frequently applied in statistical applications. Bootstrap is a collective term for resampling methods that can be employed to approximate unknown probability distributions or features thereof. A clear advantage of bootstrap methods is their wide range of applicability. For instance, they yield reliable results, even if the usual normality assumption is violated. The use, meaning and construction of uncertainty limits to any user-specific confidence level in the form of confidence intervals and levels are discussed. The precise algorithms for the implementation of these methods, applicable for any deconvolution algor...
Bootstrap bias-adjusted GMM estimators
Ramalho, Joaquim J.S.
2005-01-01
The ability of six alternative bootstrap methods to reduce the bias of GMM parameter estimates is examined in an instrumental variable framework using Monte Carlo analysis. Promising results were found for the two bootstrap estimators suggested in the paper.
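The basic bootstrap bias adjustment, shown here for the plug-in variance estimator rather than GMM, works as follows: estimate the bias as the mean of the bootstrap replicates minus the original estimate, then subtract it.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=0.0, scale=2.0, size=30)

# The plug-in variance estimator (divides by n) is biased downward.
theta_hat = float(np.var(x))

# Bootstrap bias estimate: mean of the bootstrap replicates minus theta_hat.
reps = np.array([np.var(rng.choice(x, size=x.size, replace=True))
                 for _ in range(2000)])
bias = float(reps.mean() - theta_hat)
theta_bc = theta_hat - bias     # bias-adjusted estimate

print(f"plug-in: {theta_hat:.3f}  bias-adjusted: {theta_bc:.3f}")
```

The six bootstrap variants compared in the paper differ mainly in how the replicates are generated in the instrumental-variable setting, not in this adjustment step.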
Analytical bootstrap methods for censored data
Alan D. Hutson
2002-01-01
Analytic bootstrap estimators for the moments of survival quantities are derived. By using these expressions recommendations can be made as to the appropriateness of bootstrap estimation under censored data conditions.
Unsupervised model compression for multilayer bootstrap networks
ZHANG, XIAO-LEI
2015-01-01
Recently, multilayer bootstrap network (MBN) has demonstrated promising performance in unsupervised dimensionality reduction. It can learn compact representations in standard data sets, i.e. MNIST and RCV1. However, as a bootstrap method, the prediction complexity of MBN is high. In this paper, we propose an unsupervised model compression framework for this general problem of unsupervised bootstrap methods. The framework compresses a large unsupervised bootstrap model into a small model by ta...
The bootstrap and edgeworth expansion
Hall, Peter
1992-01-01
This monograph addresses two quite different topics, in the belief that each can shed light on the other. Firstly, it lays the foundation for a particular view of the bootstrap. Secondly, it gives an account of Edgeworth expansion. Chapter 1 is about the bootstrap, with almost no mention of Edgeworth expansion; Chapter 2 is about Edgeworth expansion, with scarcely a word about the bootstrap; and Chapters 3 and 4 bring these two themes together, using Edgeworth expansion to explore and develop the properties of the bootstrap. The book is aimed at a graduate-level audience with some exposure to the methods of theoretical statistics. However, technical details are delayed until the last chapter (entitled "Details of Mathematical Rigour"), and so a mathematically able reader without knowledge of the rigorous theory of probability will have no trouble understanding the first four-fifths of the book. The book simultaneously fills two gaps in the literature; it provides a very readable graduate level account of t...
On the Asymptotic Accuracy of Efron's Bootstrap
Singh, Kesar
1981-01-01
In the non-lattice case it is shown that the bootstrap approximation of the distribution of the standardized sample mean is asymptotically more accurate than approximation by the limiting normal distribution. The exact convergence rate of the bootstrap approximation of the distributions of sample quantiles is obtained. A few other convergence rates regarding the bootstrap method are also studied.
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.…
Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A.M. Robert
empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... we fill this gap in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....
Bootstrap-Based Regularization for Low-Rank Matrix Estimation
Josse, Julie; Wager, Stefan
2014-01-01
We develop a flexible framework for low-rank matrix estimation that allows us to transform noise models into regularization schemes via a simple bootstrap algorithm. Effectively, our procedure seeks an autoencoding basis for the observed matrix that is stable with respect to the specified noise model; we call the resulting procedure a stable autoencoder. In the simplest case, with an isotropic noise model, our method is equivalent to a classical singular value shrinkage estimator. For non-iso...
A Bootstrap Cointegration Rank Test for Panels of VAR Models
Callot, Laurent
2010-01-01
This paper proposes a sequential procedure to determine the common cointegration rank of panels of cointegrated VARs. It shows how a panel of cointegrated VARs can be transformed in a set of independent individual models. The likelihood function of the transformed panel is the sum of the likelihood functions of the individual Cointegrated VARs (CVAR) models. A bootstrap based procedure is used to compute empirical distributions of the trace test statistics for these individual models. From th...
Feti, Andreea; Dudele, Aiga
2012-01-01
Bootstrapping plays a vital role in the life of small and medium-sized enterprises. By providing a large variety of financing alternatives, bootstrapping ensures the existence of entrepreneurship, even though too little attention is paid to bootstrapping in the literature. Therefore, the master thesis strives to eliminate the gaps in the theory by bringing new insights to the field of bootstrapping. The purpose of the master thesis is to investigate the usage of bootstrapping methods ...
A two-stage productivity analysis using bootstrapped Malmquist index and quantile regression
Kaditi, Eleni A.; Nitsi, Elisavet I.
2009-01-01
This paper examines the effects of farm characteristics and government policies in enhancing productivity growth for a sample of Greek farms, using a two-stage procedure. In the 1st-stage, non-parametric estimates of Malmquist index and its decompositions are computed, while a bootstrapping procedure is applied to provide their statistical precision. In the 2nd-stage, the productivity growth estimates are regressed on various covariates using a bootstrapped quantile regression approach. The e...
Bootstrapping N=2 chiral correlators
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Liendo, Pedro [Humboldt-Univ. Berlin (Germany). IMIP
2015-12-15
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
Horn, D.
2015-03-01
The quark model emerged from the Gell-Mann-Ne'eman flavor SU(3) symmetry. Its development, in the context of strong interactions, took place in a heuristic theoretical framework, referred to as the Bootstrap Era. Setting the background for the dominant ideas in strong interactions in the early 1960s, we outline some aspects of the constituent quark model. An independent theoretical development was the emergence of hadron duality in 1967, leading to a realization of the Bootstrap idea by relating hadron resonances (in the s-channel) with Regge pole trajectories (in the t- and u-channels). The synthesis of duality with the quark model was achieved by duality diagrams, which served as a conceptual framework for discussing many aspects of hadron dynamics toward the end of the 1960s.
On a generalized bootstrap principle
International Nuclear Information System (INIS)
The S-matrices for non-simply-laced affine Toda field theories are considered in the context of a generalized bootstrap principle. The S-matrices, and in particular their poles, depend on a parameter whose range lies between the Coxeter numbers of dual pairs of the corresponding non-simply-laced algebras. It is proposed that only odd order poles in the physical strip with positive coefficients throughout this range should participate in the bootstrap. All other singularities have an explanation in principle in terms of a generalized Coleman-Thun mechanism. Besides the S-matrices introduced by Delius, Grisaru and Zanon, the missing case (F4(1), e6(2)), is also considered and provides many interesting examples of pole generation. (author)
Bootstrap Current in Spherical Tokamaks
Institute of Scientific and Technical Information of China (English)
王中天; 王龙
2003-01-01
A variational principle for the neoclassical theory has been developed by including a momentum restoring term in the electron-electron collisional operator, which gives an additional free parameter maximizing the heat production rate. All transport coefficients are obtained, including the bootstrap current. The essential feature of the study is that the aspect ratio affects the function of the electron-electron collision operator through a geometrical factor. When the aspect ratio approaches unity, the fraction of circulating particles goes to zero and the contribution to particle flux from the electron-electron collision vanishes. The resulting diffusion coefficient is in rough agreement with Hazeltine. When the aspect ratio approaches infinity, the results are in agreement with Rosenbluth. The formalism thus connects the two extreme cases. The theory is particularly important for the calculation of the bootstrap current in spherical tokamaks and in present tokamaks, in which the square root of the inverse aspect ratio, in general, is not small.
Conformal Bootstrap in Mellin Space
Gopakumar, Rajesh; Sen, Kallol; Sinha, Aninda
2016-01-01
We propose a new approach towards analytically solving for the dynamical content of Conformal Field Theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the epsilon expansion of the Wilson-Fisher fixed point by computing operator dimensions and, strikingly, OPE coefficients to higher orders in epsilon than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement of certain observables in the 3d Ising model, with the precise numerical values that...
On adaptive resampling strategies for sequential Monte Carlo methods
Del Moral, Pierre; Jasra, Ajay; 10.3150/10-BEJ335
2012-01-01
Sequential Monte Carlo (SMC) methods are a class of techniques to sample approximately from any sequence of probability distributions using a combination of importance sampling and resampling steps. This paper is concerned with the convergence analysis of a class of SMC methods where the times at which resampling occurs are computed online using criteria such as the effective sample size. This is a popular approach amongst practitioners but there are very few convergence results available for these methods. By combining semigroup techniques with an original coupling argument, we obtain functional central limit theorems and uniform exponential concentration estimates for these algorithms.
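The adaptive criterion mentioned here, the effective sample size (ESS), can be sketched as follows; the degenerate weight vector is a contrived illustration, not an SMC run.

```python
import numpy as np

def ess(weights):
    """Effective sample size of normalized importance weights."""
    return 1.0 / float(np.sum(weights ** 2))

rng = np.random.default_rng(5)
n = 100
w = np.ones(n)
w[0] = 1000.0        # one dominant particle: a highly degenerate weight set
w /= w.sum()

# Adaptive rule: resample only when the ESS falls below a fraction
# (here one half) of the particle count.
if ess(w) < n / 2:
    idx = rng.choice(n, size=n, p=w)   # multinomial resampling by weight
    w = np.full(n, 1.0 / n)            # weights are reset after resampling

print(f"ESS after step: {ess(w):.1f} of {n}")  # prints "ESS after step: 100.0 of 100"
```

Because resampling times depend on the random weights themselves, the convergence analysis is harder than for fixed-schedule resampling, which is the gap the paper addresses.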
Conformal Bootstrap in Embedding Space
Fortin, Jean-François
2016-01-01
It is shown how to obtain conformal blocks from embedding space with the help of the operator product expansion. The minimal conformal block originates from scalar exchange in a four-point correlation function of four scalars. All remaining conformal blocks are simple derivatives of the minimal conformal block. With the help of the orthogonality properties of the conformal blocks, the analytic conformal bootstrap can be implemented directly in embedding space, leading to a Jacobi-like definition of conformal field theories.
Conformal bootstrap in embedding space
Fortin, Jean-François; Skiba, Witold
2016-05-01
It is shown how to obtain conformal blocks from embedding space with the help of the operator product expansion. The minimal conformal block originates from scalar exchange in a four-point correlation function of four scalars. All remaining conformal blocks are simple derivatives of the minimal conformal block. With the help of the orthogonality properties of the conformal blocks, the analytic conformal bootstrap can be implemented directly in embedding space, leading to a Jacobi-like definition of conformal field theories.
Bootstrapping High Dimensional Time Series
Zhang, Xianyang; Cheng, Guang
2014-01-01
This article studies bootstrap inference for high dimensional weakly dependent time series in a general framework of approximately linear statistics. The following high dimensional applications are covered: (1) uniform confidence band for mean vector; (2) specification testing on the second order property of time series such as white noise testing and bandedness testing of covariance matrix; (3) specification testing on the spectral property of time series. In theory, we first derive a Gaussi...
Modified Bootstrap Sensitometry In Radiography
Bednarek, Daniel R.; Rudin, Stephen
1981-04-01
A new modified bootstrap approach to sensitometry is presented which provides H and D curves that show almost exact agreement with those obtained using conventional methods. Two bootstrap techniques are described; both involve a combination of inverse-square and stepped-wedge modulation of the radiation field and provide intensity-scale sensitometric curves as appropriate for medical radiography. H and D curves obtained with these modified techniques are compared with those obtained for screen-film combinations using inverse-square sensitometry as well as with those obtained for direct x-ray film using time-scale sensitometry. The stepped wedge of the Wisconsin X-Ray Test Cassette was used in the bootstrap approach since it provides sufficient exposure latitude to encompass the useful density range of medical x-ray film. This approach makes radiographic sensitometry quick and convenient, allowing accurate characteristic curves to be obtained for any screen-film cassette using standard diagnostic x-ray equipment.
PARTICLE FILTER BASED VEHICLE TRACKING APPROACH WITH IMPROVED RESAMPLING STAGE
Directory of Open Access Journals (Sweden)
Wei Leong Khong
2014-02-01
Full Text Available Optical-sensor-based vehicle tracking can be widely implemented in traffic surveillance and flow control. The vast development of video surveillance infrastructure in recent years has drawn the current research focus towards vehicle tracking using high-end, low-cost optical sensors. However, tracking vehicles via such sensors can be challenging due to the high probability of changes in vehicle appearance and illumination, besides occlusion and overlapping incidents. The particle filter has been proven to be an approach which can overcome the nonlinear and non-Gaussian situations caused by cluttered backgrounds and occlusion incidents. Unfortunately, the conventional particle filter approach encounters particle degeneracy, especially during and after occlusion. Sampling importance resampling (SIR) is an important step in overcoming this drawback of the particle filter, but SIR faces the problem of sample impoverishment when heavy particles are statistically selected many times. In this work, a genetic algorithm is proposed for the particle filter resampling stage, so that the estimated position converges faster to the real position of the target vehicle under various occlusion incidents. The experimental results show that the improved particle filter with genetic-algorithm resampling increases the tracking accuracy while reducing the particle sample size in the resampling stage.
Introduction to Permutation and Resampling-Based Hypothesis Tests
LaFleur, Bonnie J.; Greevy, Robert A.
2009-01-01
A resampling-based method of inference--permutation tests--is often used when distributional assumptions are questionable or unmet. Not only are these methods useful for obvious departures from parametric assumptions (e.g., normality) and small sample sizes, but they are also more robust than their parametric counterparts in the presence of…
Theoretical comparisons of block bootstrap methods
Lahiri, S. N.
1999-01-01
In this paper, we compare the asymptotic behavior of some common block bootstrap methods based on nonrandom as well as random block lengths. It is shown that, asymptotically, bootstrap estimators derived using any of the methods considered in the paper have the same amount of bias to the first order. However, the variances of these bootstrap estimators may be different even in the first order. Expansions for the bias, the variance and the mean-squared error of different bloc...
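A minimal moving-block bootstrap sketch (the block length and the AR(1) series are illustrative choices): overlapping blocks are resampled with replacement so that within-block serial dependence is preserved, which i.i.d. resampling would destroy.

```python
import numpy as np

def moving_block_bootstrap(x, block_len, rng):
    """One moving-block bootstrap resample: concatenate randomly chosen
    overlapping blocks until the original series length is reached."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

rng = np.random.default_rng(6)
# AR(1) series: serial dependence makes i.i.d. resampling invalid here.
x = np.empty(300)
x[0] = 0.0
for t in range(1, 300):
    x[t] = 0.7 * x[t - 1] + rng.normal()

reps = [moving_block_bootstrap(x, 20, rng).mean() for _ in range(500)]
print(f"block-bootstrap SE of the mean: {np.std(reps):.3f}")
```

The variants compared in the paper (non-overlapping, circular, random block lengths) differ in how `starts` and the block boundaries are chosen, which is what drives the first-order variance differences.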
Energy Technology Data Exchange (ETDEWEB)
Sohn, S.Y
1999-12-01
We consider a robust parameter design of the process for forming contact windows in complementary metal-oxide semiconductor circuits. Robust design is often used to find the optimal levels of process conditions that provide output of consistent quality, as close as possible to a target value. In this paper, we analyze the results of a fractional factorial design of nine factors: mask dimension, viscosity, bake temperature, spin speed, bake time, aperture, exposure time, developing time, and etch time, where the outcome of the experiment is measured in terms of a categorized window size with five categories. Random effect analysis is employed to model both the mean and variance of the categorized window size as functions of some controllable factors as well as random errors. Empirical Bayes procedures are then utilized to fit both models, and eventually to find the robust design of the CMOS circuit process by means of a bootstrap resampling approach.
Statistical bootstrap model and annihilations
Möhring, H J
1974-01-01
The statistical bootstrap model (SBM) describes the decay of single, high-mass, hadronic states (fireballs, clusters) into stable particles. Coupling constants B, one for each isospin multiplet of stable particles, are the only free parameters of the model. They are related to the maximum temperature parameter T/sub 0/. The various versions of the SBM can be classified into two groups: full statistical bootstrap models and linear ones. The main results of the model are the following: i) All momentum spectra are isotropic; in particular, the exclusive ones are described by invariant phase space. The inclusive and semi-inclusive single-particle distributions are asymptotically of pure exponential shape; the slope is governed by T/sub 0/ only. ii) The model parameter B for pions has been obtained by fitting the multiplicity distribution in pp and pn at rest, and corresponds to T/sub 0/=0.167 GeV in the full SBM with exotics. The average pi /sup -/ multiplicity for the linear and the full SBM (both with exotics) is c...
Bootstrap percolation on spatial networks
Gao, Jian; Zhou, Tao; Hu, Yanqing
2015-10-01
Bootstrap percolation is a general representation of some networked activation process, which has found applications in explaining many important social phenomena, such as the propagation of information. Inspired by some recent findings on spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks, with the probability density function of long-range links’ lengths being a power law with tunable exponent. Setting the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, mixed of a second-order phase transition and a hybrid phase transition with two varying critical points, otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points for the double phase transition are almost constant. To our surprise, this critical value -1 is just equal or very close to the values of many real online social networks, including LiveJournal, HP Labs email network, Belgian mobile phone network, etc. This work helps us in better understanding the self-organization of spatial structure of online social networks, in terms of the effective function for information spreading.
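The activation rule of bootstrap percolation can be sketched on a tiny graph (the graph and threshold are illustrative; the paper studies spatial networks with power-law-distributed link lengths):

```python
def bootstrap_percolation(adj, seeds, k):
    """A node activates once at least k of its neighbours are active;
    iterate to a fixed point starting from the seed set."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbours in adj.items():
            if node not in active and sum(nb in active for nb in neighbours) >= k:
                active.add(node)
                changed = True
    return active

# Tiny illustrative graph as adjacency lists; activation threshold k = 2.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}
active = bootstrap_percolation(adj, seeds=[0, 1], k=2)
print(sorted(active))   # prints "[0, 1, 2, 3]": node 4 lacks enough active neighbours
```

The order parameter in the paper, the size of the giant active component, corresponds here to `len(active)` measured on large random graphs as the seed density and link-length exponent vary.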
Comparison of resampling method applied to censored data
Directory of Open Access Journals (Sweden)
Claude Arrabal
2014-06-01
Full Text Available This paper presents a comparison study of the performance of variance estimators of certain parameters, using resampling techniques such as the bootstrap and the jackknife. The comparison is made across several situations of simulated censored data, relating the observed values of the estimates to the real values. For real data, the Stanford heart transplant dataset, analyzed by Cho et al. (2009) using the Cox regression model (Cox, 1972) for adjustment, is considered. It is noted that the jackknife residual is efficient for identifying influential data points in the response variable. Keywords: bootstrap, jackknife, simulation, Cox regression model, censored data.
A Bayesian Bootstrap for a Finite Population
Lo, Albert Y.
1988-01-01
A Bayesian bootstrap for a finite population is introduced; its small-sample distributional properties are discussed and compared with those of the frequentist bootstrap for a finite population. It is also shown that the two are first-order asymptotically equivalent.
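The standard (infinite-population) Bayesian bootstrap replaces resampling counts with Dirichlet(1, ..., 1) weights over the observations; a short sketch on hypothetical data (the finite-population variant of the abstract differs in its weight distribution, which drives the small-sample differences discussed there):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(loc=5.0, scale=1.0, size=50)

# Bayesian bootstrap: draw Dirichlet(1, ..., 1) weights over the
# observations and compute the weighted statistic for each draw.
n_rep = 2000
weights = rng.dirichlet(np.ones(x.size), size=n_rep)   # each row sums to 1
post_means = weights @ x                               # weighted means

lo, hi = np.percentile(post_means, [2.5, 97.5])
print(f"Bayesian-bootstrap 95% interval for the mean: [{lo:.2f}, {hi:.2f}]")
```

The frequentist bootstrap instead draws integer multinomial counts; the first-order asymptotic equivalence noted in the abstract reflects that both weighting schemes have mean 1/n and variance of order 1/n².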
Bootstrapping Phylogenetic Trees: Theory and Methods
Holmes, Susan
2003-01-01
This is a survey of the use of the bootstrap in the area of systematic and evolutionary biology. I present the current usage by biologists of the bootstrap as a tool both for making inferences and for evaluating robustness, and propose a framework for thinking about these problems in terms of mathematical statistics.
Coefficient Alpha Bootstrap Confidence Interval under Nonnormality
Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew
2012-01-01
Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…
Using re-sampling methods in mortality studies.
Directory of Open Access Journals (Sweden)
Igor Itskovich
Full Text Available Traditional methods of computing standardized mortality ratios (SMR in mortality studies rely upon a number of conventional statistical propositions to estimate confidence intervals for obtained values. Those propositions include a common but arbitrary choice of the confidence level and the assumption that observed number of deaths in the test sample is a purely random quantity. The latter assumption may not be fully justified for a series of periodic "overlapping" studies. We propose a new approach to evaluating the SMR, along with its confidence interval, based on a simple re-sampling technique. The proposed method is most straightforward and requires neither the use of above assumptions nor any rigorous technique, employed by modern re-sampling theory, for selection of a sample set. Instead, we include all possible samples that correspond to the specified time window of the study in the re-sampling analysis. As a result, directly obtained confidence intervals for repeated overlapping studies may be tighter than those yielded by conventional methods. The proposed method is illustrated by evaluating mortality due to a hypothetical risk factor in a life insurance cohort. With this method used, the SMR values can be forecast more precisely than when using the traditional approach. As a result, the appropriate risk assessment would have smaller uncertainties.
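The core computation, an SMR with a resampling-based interval, can be sketched as follows. The cohort, the expected-death rates, and the simple with-replacement resampling are illustrative assumptions, not the overlapping-window scheme of the article.

```python
import numpy as np

rng = np.random.default_rng(9)
cohort = 5000

# Hypothetical per-person expected deaths (e.g. from life tables) and
# observed death indicators generated with a true SMR of 1.2 built in.
expected = rng.uniform(0.001, 0.02, size=cohort)
observed = rng.random(cohort) < 1.2 * expected

smr = float(observed.sum() / expected.sum())

# Re-sampling the cohort gives an empirical distribution for the SMR
# without assuming the observed death count is Poisson.
reps = []
for _ in range(1000):
    idx = rng.integers(0, cohort, size=cohort)
    reps.append(observed[idx].sum() / expected[idx].sum())
lo, hi = np.percentile(reps, [2.5, 97.5])
print(f"SMR = {smr:.2f}, 95% resampling interval [{lo:.2f}, {hi:.2f}]")
```

The article's method replaces the with-replacement draw by enumerating all samples compatible with the study's time window, which is why its intervals can be tighter than conventional ones.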
Generic Hardware Architectures for Sampling and Resampling in Particle Filters
Directory of Open Access Journals (Sweden)
Petar M. Djurić
2005-10-01
Full Text Available Particle filtering is a statistical signal processing methodology that has recently gained popularity in solving several problems in signal processing and communications. Particle filters (PFs) have been shown to outperform traditional filters in important practical scenarios. However, their computational complexity and the lack of dedicated hardware for real-time processing have adversely affected their use in real-time applications. In this paper, we present generic architectures for the implementation of the most commonly used PF, namely, the sampling importance resampling filter (SIRF). These provide a generic framework for the hardware realization of the SIRF applied to any model. The proposed architectures significantly reduce the memory requirement of the filter in hardware as compared to a straightforward implementation based on the traditional algorithm. We propose two architectures, each based on a different resampling mechanism. Further, modifications of these architectures to accelerate the resampling process are presented. We evaluate these schemes based on resource usage and latency. The platform used for the evaluations is the Xilinx Virtex-II Pro FPGA. The architectures presented here have led to the development of the first hardware (FPGA) prototype for the particle filter applied to the bearings-only tracking problem.
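The SIRF that these architectures implement is a propagate-weight-resample loop. A minimal software sketch in plain Python, using systematic resampling; the model functions and parameters are invented for illustration and say nothing about the hardware design itself:

```python
import math
import random

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw, n evenly spaced pointers."""
    n = len(weights)
    total = sum(weights)
    positions = [(rng.random() + i) / n for i in range(n)]
    indices, cum, i = [], weights[0] / total, 0
    for p in positions:
        while p > cum and i < n - 1:
            i += 1
            cum += weights[i] / total
        indices.append(i)
    return indices

def sir_step(particles, weights, transition, likelihood, z, rng):
    """One sampling-importance-resampling (SIR) iteration of the filter."""
    particles = [transition(x, rng) for x in particles]                    # propagate
    weights = [w * likelihood(z, x) for x, w in zip(particles, weights)]   # weight
    idx = systematic_resample(weights, rng)                                # resample
    n = len(particles)
    return [particles[i] for i in idx], [1.0 / n] * n

# toy demo: track a constant true state 1.0 from repeated noisy measurements
rng = random.Random(1)
n = 500
particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
weights = [1.0 / n] * n
transition = lambda x, rng: x + rng.gauss(0.0, 0.1)             # random-walk dynamics
likelihood = lambda z, x: math.exp(-0.5 * ((z - x) / 0.2) ** 2)  # Gaussian sensor
for z in [1.0] * 20:
    particles, weights = sir_step(particles, weights, transition, likelihood, z, rng)
estimate = sum(particles) / n
```

The paper's contribution is precisely how to realize the resampling loop above in hardware with reduced memory traffic; the sketch only shows the algorithm being implemented.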
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
Kumar, Sricharan; Srivastava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
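One common way to build such bootstrap prediction intervals is residual resampling around a nonparametric fit. The sketch below uses a k-nearest-neighbour smoother as a stand-in for the paper's regression model; all names, data, and settings are invented for illustration.

```python
import random
import statistics

def knn_smooth(x_train, y_train, x0, k=5):
    """Simple k-nearest-neighbour regression estimate at x0."""
    nearest = sorted(range(len(x_train)), key=lambda i: abs(x_train[i] - x0))[:k]
    return statistics.mean(y_train[i] for i in nearest)

def bootstrap_pi(x_train, y_train, x0, n_boot=500, alpha=0.10, seed=0):
    """Residual-bootstrap prediction interval for a new output at x0."""
    rng = random.Random(seed)
    fitted = [knn_smooth(x_train, y_train, x) for x in x_train]
    residuals = [y - f for y, f in zip(y_train, fitted)]
    preds = []
    for _ in range(n_boot):
        # refit on a bootstrapped response, then add a fresh residual draw
        y_star = [f + rng.choice(residuals) for f in fitted]
        preds.append(knn_smooth(x_train, y_star, x0) + rng.choice(residuals))
    preds.sort()
    return preds[int(alpha / 2 * n_boot)], preds[int((1 - alpha / 2) * n_boot) - 1]

rng0 = random.Random(7)
x_train = [i / 3.0 for i in range(30)]
y_train = [x + rng0.gauss(0.0, 0.3) for x in x_train]   # y = x + noise
lo, hi = bootstrap_pi(x_train, y_train, x0=5.0)
```

For anomaly detection, an observed output falling outside `(lo, hi)` for its input would be flagged as anomalous.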
The N=2 superconformal bootstrap
Beem, Christopher; Lemos, Madalena; Liendo, Pedro; Rastelli, Leonardo; van Rees, Balt C.
2016-03-01
In this work we initiate the conformal bootstrap program for N=2 super-conformal field theories in four dimensions. We promote an abstract operator-algebraic viewpoint in order to unify the description of Lagrangian and non-Lagrangian theories, and formulate various conjectures concerning the landscape of theories. We analyze in detail the four-point functions of flavor symmetry current multiplets and of N=2 chiral operators. For both correlation functions we review the solution of the superconformal Ward identities and describe their superconformal block decompositions. This provides the foundation for an extensive numerical analysis discussed in the second half of the paper. We find a large number of constraints for operator dimensions, OPE coefficients, and central charges that must hold for any N=2 superconformal field theory.
Particle filter based on iterated importance density function and parallel resampling
Institute of Scientific and Technical Information of China (English)
武勇; 王俊; 曹运合
2015-01-01
The design, analysis and parallel implementation of the particle filter (PF) were investigated. First, to tackle the particle degeneracy problem in the PF, an iterated importance density function (IIDF) was proposed, in which a new term associated with the current measurement information (CMI) was introduced into the expression for the sampled particles. Through repeated use of the least squares estimate, the CMI can be integrated into the sampling stage in an iterative manner, leading to greatly improved sampling quality. Running the IIDF yields an iterated PF (IPF). Subsequently, a parallel resampling (PR) scheme was proposed for the parallel implementation of the IPF; its main idea is the same as that of systematic resampling (SR), but it is performed differently. The PR directly uses the integer part of the product of the particle weight and the particle number as the number of times a particle is replicated, and it simultaneously eliminates the particles with the smallest weights; these are the two key differences from the SR. Detailed procedures for implementing the PR-based IPF on a graphics processing unit are presented last. The performance of the IPF, the PR and their parallel implementations is illustrated via a one-dimensional numerical simulation and a practical application to passive radar target tracking.
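The replication rule described for the PR step can be sketched directly from the abstract: replicate particle i floor(N·w_i) times and give the leftover slots to the largest weights, so the smallest-weight particles are eliminated. A sequential sketch of that rule (the paper's actual contribution is its GPU-parallel implementation, which is not reproduced here):

```python
import math

def parallel_resample(weights):
    """Replication rule from the abstract, written sequentially: particle i is
    replicated floor(N * w_i) times; remaining slots go to the largest weights,
    eliminating the smallest-weight particles."""
    n = len(weights)
    total = sum(weights)
    counts = [math.floor(n * w / total) for w in weights]
    deficit = n - sum(counts)
    # hand the leftover slots to the heaviest particles
    for i in sorted(range(n), key=lambda j: weights[j], reverse=True)[:deficit]:
        counts[i] += 1
    return [i for i, c in enumerate(counts) for _ in range(c)]

idx = parallel_resample([0.5, 0.3, 0.15, 0.05])
print(len(idx), idx)   # → 4 [0, 0, 0, 1]: the two lightest particles are dropped
```

Unlike SR, each particle's replication count here depends only on its own weight, which is what makes the rule embarrassingly parallel.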
Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... that the probability of selecting a rank smaller than (equal to) the true co-integrating rank will converge to zero (one minus the marginal significance level), as the sample size diverges, for general I(1) processes. No such likelihood-based procedure is currently known to be available. In this paper...... we fill this gap in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....
A model study in hadron statistical bootstrap
Hagedorn, Rolf
1973-01-01
In the framework of the statistical bootstrap the decay of a fireball is considered as an exact inverse of its statistical composition. This assumption leads to a bootstrap formulated in terms of integral equations for all kinds of distributions of the fireball's decay products. Solutions of the equations are obtained in terms of power series and of K-transforms and determine in the general case their asymptotic behaviour for large fireball mass. Relations to a thermodynamical description are established and illustrated by effective temperatures. The approach to the asymptotic limits is easy to investigate in a simplified linear bootstrap where the K-transforms can be more explicitly calculated. (30 refs).
iDESWEB: Frameworks CSS: Bootstrap
Yuste Torregrosa, Álvaro; Luján Mora, Sergio
2012-01-01
CSS frameworks (tools and guidelines), the most famous frameworks (BluePrint, 960 Grid System, YUI), Twitter Bootstrap, a button example, an example of using the grid. Course website: http://idesweb.es/
Bootstrapping under constraint for the assessment of group behavior in human contact networks
Tremblay, Nicolas; Forest, Cary; Nornberg, Mark; Pinton, Jean-François; Borgnat, Pierre
2012-01-01
The increasing availability of time- and space-resolved data describing human activities and interactions gives insights into both static and dynamic properties of human behavior. In practice, however, real-world datasets can often be considered as only one realisation of a particular event. This highlights a key issue in social network analysis: the statistical significance of estimated properties. In this context, we focus here on the assessment of quantitative features of a specific subset of nodes in empirical networks. We present a resampling method based on bootstrapping groups of nodes under constraints within the empirical network. The method enables us to define confidence intervals for various null hypotheses concerning relevant properties of the subset of nodes under consideration, in order to characterize its behavior as "normal" or not. We apply this method to a high-resolution dataset describing the face-to-face proximity of individuals during two co-located scientific conferences. As a ca...
Bootstrapping Reflective Systems: The Case of Pharo
Polito, Guillermo; Ducasse, Stéphane; Fabresse, Luc; Bouraqadi, Noury; Van Ryseghem, Benjamin
2014-01-01
Bootstrapping is a technique commonly known by its usage in language definition by the introduction of a compiler written in the same language it compiles. This process is important to understand and modify the definition of a given language using the same language, taking benefit of the abstractions and expression power it provides. A bootstrap, then, supports the evolution of a language. However, the infrastructure of reflective systems like Smalltalk includes, in addition to a compiler, an...
Investigating Mortality Uncertainty Using the Block Bootstrap
Directory of Open Access Journals (Sweden)
Xiaoming Liu; W. John Braun
2010-01-01
Full Text Available This paper proposes a block bootstrap method for measuring mortality risk under the Lee-Carter model framework. In order to take account of all sources of risk (the process risk, the parameter risk, and the model risk) properly, a block bootstrap is needed to cope with the spatial dependence found in the residuals. As a result, the prediction intervals we obtain for life expectancy are more accurate than the ones obtained from other similar methods.
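The moving-block idea, resampling contiguous blocks of values so that the dependence within each block is preserved, can be sketched generically. This is not the Lee-Carter pipeline, only the resampling core; the series, block length, and statistic are invented for the example.

```python
import random

def block_bootstrap(series, block_len, n_boot, stat, seed=0):
    """Moving-block bootstrap: resample overlapping blocks with replacement so
    the dependence within each block survives, then recompute the statistic."""
    rng = random.Random(seed)
    n = len(series)
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    reps = []
    for _ in range(n_boot):
        sample = []
        while len(sample) < n:
            sample.extend(rng.choice(blocks))   # draw whole blocks at random
        reps.append(stat(sample[:n]))
    return reps

series = [0.1, 0.3, 0.2, 0.5, 0.4, 0.6, 0.5, 0.8, 0.7, 0.9]
reps = block_bootstrap(series, block_len=3, n_boot=200,
                       stat=lambda s: sum(s) / len(s))
```

In the mortality setting the resampled objects would be blocks of Lee-Carter residuals rather than the raw series, but the mechanics are the same.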
Theoretical Comparison of Bootstrap Confidence Intervals
Hall, Peter
1988-01-01
We develop a unified framework within which many commonly used bootstrap critical points and confidence intervals may be discussed and compared. In all, seven different bootstrap methods are examined, each being usable in both parametric and nonparametric contexts. Emphasis is on the way in which the methods cope with first- and second-order departures from normality. Percentile-$t$ and accelerated bias-correction emerge as the most promising of existing techniques. Certain other methods are ...
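Of the methods compared, percentile-t studentizes each bootstrap replicate by its own standard error estimate before reading off quantiles. A minimal sketch for the mean (the data values are invented):

```python
import math
import random
import statistics

def percentile_t_ci(data, n_boot=999, alpha=0.05, seed=0):
    """Percentile-t (bootstrap-t) CI for the mean: studentize each bootstrap
    replicate with its own standard error before taking quantiles."""
    rng = random.Random(seed)
    n = len(data)
    mean = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(n)
    t_stats = []
    for _ in range(n_boot):
        s = [data[rng.randrange(n)] for _ in range(n)]
        sd = statistics.stdev(s)
        if sd == 0:                     # skip degenerate resamples
            continue
        t_stats.append((statistics.mean(s) - mean) / (sd / math.sqrt(n)))
    t_stats.sort()
    m = len(t_stats)
    t_lo, t_hi = t_stats[int(alpha / 2 * m)], t_stats[int((1 - alpha / 2) * m) - 1]
    return mean - t_hi * se, mean - t_lo * se

lo, hi = percentile_t_ci([2.1, 2.9, 3.2, 1.8, 2.5, 3.0, 2.2, 2.7])
```

The studentization is what buys the second-order accuracy Hall emphasizes: the bootstrap-t distribution tracks skewness that the plain percentile method misses.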
On the estimation of the extremal index based on scaling and resampling
Hamidieh, Kamal; Michailidis, George
2010-01-01
The extremal index parameter theta characterizes the degree of local dependence in the extremes of a stationary time series and has important applications in a number of areas, such as hydrology, telecommunications, finance and environmental studies. In this study, a novel estimator for theta based on the asymptotic scaling of block-maxima and resampling is introduced. It is shown to be consistent and asymptotically normal for a large class of m-dependent time series. Further, a procedure for the automatic selection of its tuning parameter is developed and different types of confidence intervals that prove useful in practice proposed. The performance of the estimator is examined through simulations, which show its highly competitive behavior. Finally, the estimator is applied to three real data sets of daily crude oil prices, daily returns of the S&P 500 stock index, and high-frequency, intra-day traded volumes of a stock. These applications demonstrate additional diagnostic features of statistical plots ...
A new approach to bootstrap inference in functional coefficient models
Herwartz, Helmut; Xu, Fang
2007-01-01
We introduce a new, factor based bootstrap approach which is robust under heteroskedastic error terms for inference in functional coefficient models. Modeling the functional coefficient parametrically, the bootstrap approximation of an F statistic is shown to hold asymptotically. In simulation studies with both parametric and nonparametric functional coefficients, factor based bootstrap inference outperforms the wild bootstrap and pairs bootstrap approach according to its size features. Apply...
The Bootstrap of Mean for Dependent Heterogeneous Arrays.
GONÇALVES, Silvia; White, Halbert
2001-01-01
Presently, conditions ensuring the validity of bootstrap methods for the sample mean of (possibly heterogeneous) near epoch dependent (NED) functions of mixing processes are unknown. Here we establish the validity of the bootstrap in this context, extending the applicability of bootstrap methods to a class of processes broadly relevant for applications in economics and finance. Our results apply to two block bootstrap methods: the moving blocks bootstrap of Künsch (1989) and Liu and Singh ( 9...
Maximum Likelihood and the Bootstrap for Nonlinear Dynamic Models
Goncalves, Silvia; White, Halbert
2002-01-01
The bootstrap is an increasingly popular method for performing statistical inference. This paper provides the theoretical foundation for using the bootstrap as a valid tool of inference for quasi-maximum likelihood estimators (QMLE). We provide a unified framework for analyzing bootstrapped extremum estimators of nonlinear dynamic models for heterogeneous dependent stochastic processes. We apply our results to two block bootstrap methods, the moving blocks bootstrap of Künsch (1989) and Liu a...
Loop calculus and bootstrap-belief propagation for perfect matchings on arbitrary graphs
Chertkov, M.; Gelfand, A.; Shin, J.
2013-12-01
This manuscript discusses computation of the Partition Function (PF) and the Minimum Weight Perfect Matching (MWPM) on arbitrary, non-bipartite graphs. We present two novel problem formulations - one for computing the PF of a Perfect Matching (PM) and one for finding MWPMs - that build upon the inter-related Bethe Free Energy (BFE), Belief Propagation (BP), Loop Calculus (LC), Integer Linear Programming and Linear Programming frameworks. First, we describe an extension of the LC framework to the PM problem. The resulting formulas, coined (fractional) Bootstrap-BP, express the PF of the original model via the BFE of an alternative PM problem. We then study the zero-temperature version of this Bootstrap-BP formula for approximately solving the MWPM problem. We do so by leveraging the Bootstrap-BP formula to construct a sequence of MWPM problems, where each new problem in the sequence is formed by contracting odd-sized cycles (or blossoms) from the previous problem. This Bootstrap-and-Contract procedure converges reliably and generates an empirically tight upper bound for the MWPM. We conclude by discussing the relationship between our iterative procedure and the famous Blossom Algorithm of Edmonds '65 and demonstrate the performance of the Bootstrap-and-Contract approach on a variety of weighted PM problems.
Maximum likelihood resampling of noisy, spatially correlated data
Goff, J.; Jenkins, C.
2005-12-01
In any geologic application, noisy data are sources of consternation for researchers, inhibiting interpretability and marring images with unsightly and unrealistic artifacts. Filtering is the typical solution to dealing with noisy data. However, filtering commonly suffers from ad hoc (i.e., uncalibrated, ungoverned) application, which runs the risk of erasing high-variability components of the field in addition to the noise components. We present here an alternative to filtering: a newly developed methodology for correcting noise in data by finding the "best" value given the data value, its uncertainty, and the data values and uncertainties at proximal locations. The motivating rationale is that data points that are close to each other in space cannot differ by "too much", where how much is "too much" is governed by the field correlation properties. Data with large uncertainties will frequently violate this condition, and in such cases need to be corrected, or "resampled." The best solution for resampling is determined by the maximum of the likelihood function defined by the intersection of two probability density functions (pdfs): (1) the data pdf, with mean and variance determined by the data value and squared uncertainty, respectively, and (2) the geostatistical pdf, whose mean and variance are determined by the kriging algorithm applied to proximal data values. A Monte Carlo sampling of the data probability space eliminates non-uniqueness and weights the solution toward data values with lower uncertainties. A test with a synthetic data set sampled from a known field demonstrates quantitatively and qualitatively the improvement provided by the maximum likelihood resampling algorithm. The method is also applied to three marine geology/geophysics data examples: (1) three generations of bathymetric data on the New Jersey shelf with disparate data uncertainties; (2) mean grain size data from the Adriatic Sea, which is a combination of both analytic (low uncertainty
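With Gaussian pdfs, the maximum of the product of the data pdf and the geostatistical pdf has a closed form: a precision-weighted average of the two means. A sketch of that core step (the kriging that supplies the geostatistical mean and variance is assumed to have been done already; the numbers are invented):

```python
def ml_resample(data_value, data_sigma, kriged_value, kriged_sigma):
    """Maximum of the product of two Gaussian pdfs: the data pdf
    N(data_value, data_sigma^2) and the geostatistical (kriging) pdf
    N(kriged_value, kriged_sigma^2). The maximizer is a precision-weighted
    mean, so uncertain data are pulled strongly toward the kriged estimate."""
    w_d = 1.0 / data_sigma ** 2
    w_k = 1.0 / kriged_sigma ** 2
    return (w_d * data_value + w_k * kriged_value) / (w_d + w_k)

# a noisy sounding (sigma = 2.0) against a confident kriged value (sigma = 0.5)
corrected = ml_resample(10.0, 2.0, 4.0, 0.5)
print(round(corrected, 2))   # → 4.35
```

The Monte Carlo sampling described in the abstract replaces the fixed `data_value` with draws from the data pdf, which is what weights the final solution toward low-uncertainty observations.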
Bootstrap inversion for Pn wave velocity in North-Western Italy
Directory of Open Access Journals (Sweden)
C. Eva
1997-06-01
Full Text Available An inversion of Pn arrival times from regional distance earthquakes (180-800 km), recorded by 94 seismic stations operating in North-Western Italy and surrounding areas, was carried out to image lateral variations of P-wave velocity at the crust-mantle boundary, and to estimate the static delay time at each station. The reliability of the obtained results was assessed using both synthetic tests and the bootstrap Monte Carlo resampling technique. Numerical simulations demonstrated the existence of a trade-off between cell velocities and estimated station delay times along the edge of the model. Bootstrap inversions were carried out to determine the standard deviation of velocities and time terms. Low Pn velocity anomalies are detected beneath the outer side of the Alps (-6%) and the Western Po plain (-4%), in correspondence with two regions of strong crustal thickening and negative Bouguer anomaly. In contrast, high Pn velocities are imaged beneath the inner side of the Alps (+4%), indicating the presence of high velocity and density lower crust-upper mantle. The Ligurian sea shows high Pn velocities close to the Ligurian coastlines (+3%) and low Pn velocities (-1.5%) in the middle of the basin, in agreement with the upper mantle velocity structure revealed by seismic refraction profiles.
Directory of Open Access Journals (Sweden)
Ana Severiano
Full Text Available Several research fields frequently deal with the analysis of diverse classification results for the same entities, which calls for an objective way to detect overlaps and divergences between the formed clusters. The congruence between classifications can be quantified by clustering agreement measures, including pairwise agreement measures. Several measures have been proposed, and the importance of obtaining confidence intervals for the point estimate when comparing these measures has been highlighted. A broad range of methods can be used for the estimation of confidence intervals. However, evidence is lacking about which methods are appropriate for calculating confidence intervals for most clustering agreement measures. Here we evaluate the resampling techniques of bootstrap and jackknife for the calculation of confidence intervals for clustering agreement measures. Contrary to what has been shown for some statistics, simulations showed that the jackknife performs better than the bootstrap at accurately estimating confidence intervals for pairwise agreement measures, especially when the agreement between partitions is low. The coverage of the jackknife confidence interval is robust to changes in cluster number and cluster size distribution.
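A generic leave-one-out jackknife confidence interval, of the kind evaluated here, looks as follows. In this sketch `statistics.mean` stands in for a pairwise agreement measure computed on the classified entities; the data are invented.

```python
import math
import statistics

def jackknife_ci(items, stat, z=1.96):
    """Leave-one-out jackknife CI for a statistic computed on a list of items."""
    n = len(items)
    full = stat(items)
    loo = [stat(items[:i] + items[i + 1:]) for i in range(n)]   # leave-one-out values
    mean_loo = statistics.mean(loo)
    se = math.sqrt((n - 1) / n * sum((v - mean_loo) ** 2 for v in loo))
    return full - z * se, full + z * se

lo, hi = jackknife_ci([2.0, 2.4, 1.9, 2.2, 2.6, 2.1], statistics.mean)
```

For an agreement measure, `items` would be the entities and `stat` would recompute the measure from the two partitions restricted to the retained entities.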
Aptamer Affinity Maturation by Resampling and Microarray Selection.
Kinghorn, Andrew B; Dirkzwager, Roderick M; Liang, Shaolin; Cheung, Yee-Wai; Fraser, Lewis A; Shiu, Simon Chi-Chin; Tang, Marco S L; Tanner, Julian A
2016-07-19
Aptamers have significant potential as affinity reagents, but better approaches are critically needed to discover higher affinity nucleic acids to widen the scope for their diagnostic, therapeutic, and proteomic application. Here, we report aptamer affinity maturation, a novel aptamer enhancement technique, which combines bioinformatic resampling of aptamer sequence data and microarray selection to navigate the combinatorial chemistry binding landscape. Aptamer affinity maturation is shown to improve aptamer affinity by an order of magnitude in a single round. The novel aptamers exhibited significant adaptation, the complexity of which precludes discovery by other microarray based methods. Honing aptamer sequences using aptamer affinity maturation could help optimize a next generation of nucleic acid affinity reagents. PMID:27346322
A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis
Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu
2007-01-01
Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…
Bootstrap Percolation on Random Geometric Graphs
Bradonjić, Milan
2012-01-01
Bootstrap percolation has been used effectively to model phenomena as diverse as emergence of magnetism in materials, spread of infection, diffusion of software viruses in computer networks, adoption of new technologies, and emergence of collective action and cultural fads in human societies. It is defined on an (arbitrary) network of interacting agents whose state is determined by the state of their neighbors according to a threshold rule. In a typical setting, bootstrap percolation starts by random and independent "activation" of nodes with a fixed probability $p$, followed by a deterministic process for additional activations based on the density of active nodes in each neighborhood (at least $\theta$ activated nodes). Here, we study bootstrap percolation on random geometric graphs in the regime when the latter are (almost surely) connected. Random geometric graphs provide an appropriate model in settings where the neighborhood structure of each node is determined by geographical distance, as in wireless ad hoc ...
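The threshold rule is easy to state in code. The sketch below runs only the deterministic activation phase on a toy cycle graph from a fixed seed set; the paper seeds nodes randomly with probability p and works on random geometric graphs, neither of which is reproduced here.

```python
def bootstrap_percolation(neighbors, seeds, threshold):
    """Bootstrap percolation: start from a seed set of active nodes, then
    repeatedly activate any node with at least `threshold` active neighbours."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v in neighbors:
            if v not in active and sum(u in active for u in neighbors[v]) >= threshold:
                active.add(v)
                changed = True
    return active

# 10-node cycle graph; activation threshold 2 (both neighbours must be active)
cycle = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
final = bootstrap_percolation(cycle, seeds={0, 2, 4}, threshold=2)
print(sorted(final))   # → [0, 1, 2, 3, 4]: the gaps between seeds fill in
```

On the cycle the cascade stalls once no inactive node has two active neighbours; whether such cascades stall or cover the whole graph is exactly the question studied for random geometric graphs.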
Early Stop Criterion from the Bootstrap Ensemble
DEFF Research Database (Denmark)
Hansen, Lars Kai; Larsen, Jan; Fog, Torben L.
1997-01-01
This paper addresses the problem of generalization error estimation in neural networks. A new early stop criterion based on a Bootstrap estimate of the generalization error is suggested. The estimate does not require the network to be trained to the minimum of the cost function, as required...... by other methods based on asymptotic theory. Moreover, in contrast to methods based on cross-validation which require data left out for testing, and thus biasing the estimate, the Bootstrap technique does not have this disadvantage. The potential of the suggested technique is demonstrated on various time...
Bootstrap percolation: a renormalisation group approach
International Nuclear Information System (INIS)
In bootstrap percolation, sites are occupied at random with probability p, but each site is considered active only if at least m of its neighbours are also active. Within an approximate position-space renormalization group framework on a square lattice we obtain the behaviour of the critical concentration p_c and of the critical exponents ν and β for m = 0 (ordinary percolation), 1, 2 and 3. We find that the bootstrap percolation problem can be cast into different universality classes, characterized by the values of m. (author)
BOOTSTRAPPING FOR EXTRACTING RELATIONS FROM LARGE CORPORA
Institute of Scientific and Technical Information of China (English)
Anonymous
2008-01-01
A new approach to relation extraction is described in this paper. It adopts a bootstrapping model with a novel iteration strategy, which generates more precise examples of a specific relation. Compared with previous methods, the proposed method has three main advantages: first, it needs less manual intervention; second, richer and more reasonable information is introduced to represent a relation pattern; third, it reduces the risk of circular dependency arising during bootstrapping. A scalable evaluation methodology and metrics are developed for our task with comparable techniques over the TianWang 100G corpus. The experimental results show that it achieves 90% precision and has excellent scalability.
Conference on Bootstrapping and Related Techniques
Rothe, Günter; Sendler, Wolfgang
1992-01-01
This book contains 30 selected, refereed papers from an international conference on bootstrapping and related techniques held in Trier in 1990. The purpose of the book is to inform about recent research in the area of bootstrap, jackknife and Monte Carlo tests. Addressing both the novice and the expert, it covers theoretical as well as practical aspects of these statistical techniques. Potential users in different disciplines such as biometry, epidemiology, computer science, economics and sociology, but also theoretical researchers, should consult the book to be informed on the state of the art in this area.
Bootstrap Method for Dependent Data Structure and Measure of Statistical Precision
Directory of Open Access Journals (Sweden)
T. O. Olatayo
2010-01-01
Full Text Available Problem statement: This article emphasized the construction of valid inferential procedures for an estimator θ^ and measures of its statistical precision for dependent data structures. Approach: The truncated geometric bootstrap estimates of standard error and other measures of statistical precision, such as bias, coefficient of variation, ratio and root mean square error, are considered. Results: We extend these to other measures of statistical precision, such as a bootstrap confidence interval for an estimator θ^, and illustrate with real geological data. Conclusion/Recommendations: The bootstrap estimates of standard error and the other measures of statistical accuracy (bias, ratio, coefficient of variation and root mean square error) reveal the suitability of the method for dependent data structures.
A Large Sample Study of the Bayesian Bootstrap
Lo, Albert Y.
1987-01-01
An asymptotic justification of the Bayesian bootstrap is given. Large-sample Bayesian bootstrap probability intervals for the mean, the variance and bands for the distribution, the smoothed density and smoothed rate function are also provided.
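Operationally, the Bayesian bootstrap replaces integer resampling counts by continuous Dirichlet(1, ..., 1) weights, drawn here as normalised exponentials. A sketch for posterior intervals for the mean (the data are invented):

```python
import random

def bayesian_bootstrap_means(data, n_rep=1000, seed=0):
    """Bayesian bootstrap: each replicate draws Dirichlet(1, ..., 1) weights
    (normalised Exp(1) draws) and forms the corresponding weighted mean."""
    rng = random.Random(seed)
    n = len(data)
    reps = []
    for _ in range(n_rep):
        g = [rng.expovariate(1.0) for _ in range(n)]
        total = sum(g)
        reps.append(sum(gi / total * x for gi, x in zip(g, data)))
    return sorted(reps)

reps = bayesian_bootstrap_means([1.0, 2.0, 3.0, 4.0, 5.0])
lo, hi = reps[25], reps[974]   # central 95% posterior interval for the mean
```

Unlike Efron's bootstrap, no observation ever receives exactly zero weight, which is why the replicate distribution is smooth.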
The use of the bootstrap in the analysis of case-control studies with missing data
DEFF Research Database (Denmark)
Siersma, Volkert Dirk; Johansen, Christoffer
2004-01-01
nonparametric bootstrap, bootstrap confidence intervals, missing values, multiple imputation, matched case-control study
Wild bootstrap of the mean in the infinite variance case
Giuseppe Cavaliere; Iliyan Georgiev; A. M. Robert Taylor
2011-01-01
It is well known that the standard i.i.d. bootstrap of the mean is inconsistent in a location model with infinite variance (α-stable) innovations. This occurs because the bootstrap distribution of a normalised sum of infinite variance random variables tends to a random distribution. Consistent bootstrap algorithms based on subsampling methods have been proposed but have the drawback that they deliver much wider confidence sets than those generated by the i.i.d. bootstrap owing to the fact ...
Uniform Bootstrap Confidence Bands for Bounded Influence Curve Estimators
Härdle, Wolfgang Karl; Ritov, Ya‘acov; Wang, Weining
2013-01-01
We consider theoretical bootstrap "coupling" techniques for nonparametric robust smoothers and quantile regression, and verify the bootstrap improvement. To cope with the curse of dimensionality, a variant of the "coupling" bootstrap techniques is developed for additive models with symmetric error distributions, with a further extension to the quantile regression framework. Our bootstrap method can be used in many situations, such as constructing confidence intervals and bands. We demonstrate the bootst...
How to Bootstrap a Human Communication System
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Pulling Econometrics Students up by Their Bootstraps
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Automatic bootstrapping and tracking of object contours.
Chiverton, John; Xie, Xianghua; Mirmehdi, Majid
2012-03-01
A new fully automatic object tracking and segmentation framework is proposed. The framework consists of a motion-based bootstrapping algorithm concurrent to a shape-based active contour. The shape-based active contour uses finite shape memory that is automatically and continuously built from both the bootstrap process and the active-contour object tracker. A scheme is proposed to ensure that the finite shape memory is continuously updated but forgets unnecessary information. Two new ways of automatically extracting shape information from image data given a region of interest are also proposed. Results demonstrate that the bootstrapping stage provides important motion and shape information to the object tracker. This information is found to be essential for good (fully automatic) initialization of the active contour. Further results also demonstrate convergence properties of the content of the finite shape memory and similar object tracking performance in comparison with an object tracker with unlimited shape memory. Tests with an active contour using a fixed-shape prior also demonstrate superior performance for the proposed bootstrapped finite-shape-memory framework and similar performance when compared with a recently proposed active contour that uses an alternative online learning model. PMID:21908256
Weak Convergence of Smoothed and Nonsmoothed Bootstrap Quantile Estimates
Falk, M; Reiss, R.-D.
1989-01-01
Under fairly general assumptions on the underlying distribution function, the bootstrap process, pertaining to the sample $q$-quantile, converges weakly in $D_\\mathbb{R}$ to the standard Brownian motion. Furthermore, weak convergence of a smoothed bootstrap quantile estimate is proved which entails that in this particular case the smoothed bootstrap estimate outperforms the nonsmoothed one.
Testing for asymmetry in economic time series using bootstrap methods
Claudio Lupi; Patrizia Ordine
2001-01-01
In this paper we show that phase-scrambling bootstrap offers a natural framework for asymmetry testing in economic time series. A comparison with other bootstrap schemes is also sketched. A Monte Carlo analysis is carried out to evaluate the size and power properties of the phase-scrambling bootstrap-based test.
Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives.
Zabala, Aiora; Pascual, Unai
2016-01-01
Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate to address questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in recent decades and has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives, and none of these measures is associated with individual statements, on which the interpretation is based. This paper presents an innovative approach to bootstrapping Q to obtain additional and more detailed measures of variability, which helps researchers better understand their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may strengthen or weaken particular arguments used to describe the perspectives. We illustrate and show the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design.

Bandpass-resampling effects for the retrieval of surface emissivity.
Richter, Rudolf; Coll, Cesar
2002-06-20
The retrieval of surface emissivity in the 8-14-μm region from remotely sensed thermal imagery requires channel-averaged values of atmospheric transmittance, path radiance, and downwelling sky flux. Band-pass resampling introduces inherent retrieval errors that depend on atmospheric conditions, spectral region, bandwidth, flight altitude, and surface temperature. This simulation study is performed for clear sky conditions and moderate atmospheric water vapor contents. It shows that relative emissivity retrieval errors can reach as much as 3% for broadband sensors (1-2-μm bandwidth) and 0.8% for narrowband instruments (0.15 μm), even for constant surface emissivity. For spectrally varying surface emissivities the relative retrieval error increases for the broadband instrument by approximately 2% in channels with strong emissivity changes of 0.05-0.1. The corresponding retrieval errors for narrowband sensors increase by approximately 3-4%. The channels in the atmospheric window regions with lower transmittance, i.e., 8-8.5 and 12.5-14 μm, are most sensitive to retrieval errors.
The Block-block Bootstrap: Improved Asymptotic Refinements
Donald W.K. Andrews
2002-01-01
The asymptotic refinements attributable to the block bootstrap for time series are not as large as those of the nonparametric iid bootstrap or the parametric bootstrap. One reason is that the independence between the blocks in the block bootstrap sample does not mimic the dependence structure of the original sample. This is the join-point problem. In this paper, we propose a method of solving this problem. The idea is not to alter the block bootstrap. Instead, we alter the original sample sta...
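For context, the plain moving-block bootstrap that the paper improves upon can be sketched as follows. Overlapping blocks preserve dependence within blocks, but consecutive resampled blocks are independent, which is the join-point problem the abstract describes. This is a minimal stdlib sketch, not the block-block modification proposed in the paper.

```python
import random

def moving_block_bootstrap(series, block_len, rng):
    """Resample overlapping blocks of length block_len with replacement
    and concatenate them until the original length is reached."""
    n = len(series)
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    out = []
    while len(out) < n:
        out.extend(rng.choice(blocks))  # independently chosen blocks join here
    return out[:n]  # trim to the original sample size

rng = random.Random(0)
x = [0.1, -0.4, 0.3, 0.2, -0.1, 0.5, -0.2, 0.0, 0.4, -0.3]
resampled = moving_block_bootstrap(x, block_len=3, rng=rng)
```

The dependence structure of `x` is mimicked inside each length-3 block but broken at every block boundary.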
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial applications and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking into account these aspects by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping and specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces and the flow operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Lahiri, S. N.
2005-01-01
Efron [J. Roy. Statist. Soc. Ser. B 54 (1992) 83--111] proposed a computationally efficient method, called the jackknife-after-bootstrap, for estimating the variance of a bootstrap estimator for independent data. For dependent data, a version of the jackknife-after-bootstrap method has been recently proposed by Lahiri [Econometric Theory 18 (2002) 79--98]. In this paper it is shown that the jackknife-after-bootstrap estimators of the variance of a bootstrap quantile are consistent for both de...
Usery, E.L.; Finn, M.P.; Scheidt, D.J.; Ruhl, S.; Beard, T.; Bearden, M.
2004-01-01
Researchers have been coupling geographic information systems (GIS) data handling and processing capability to watershed and water-quality models for many years. This capability is suited for the development of databases appropriate for water modeling. However, it is rare for GIS to provide direct inputs to the models. To demonstrate the logical procedure of coupling GIS for model parameter extraction, we selected the Agricultural Non-Point Source (AGNPS) pollution model. Investigators can generate data layers at various resolutions and resample to pixel sizes to support models at particular scales. We developed databases of elevation, land cover, and soils at various resolutions in four watersheds. The ability to use multiresolution databases for the generation of model parameters is problematic for grid-based models. We used database development procedures and observed the effects of resolution and resampling on GIS input datasets and parameters generated from those inputs for AGNPS. Results indicate that elevation values at specific points compare favorably between 3- and 30-m raster datasets. Categorical data analysis indicates that land cover classes vary significantly. Derived parameters parallel the results of the base GIS datasets. Analysis of data resampled from 30-m to 60-, 120-, 210-, 240-, 480-, 960-, and 1920-m pixels indicates a general degradation of both elevation and land cover correlations as resolution decreases. Initial evaluation of model output values for soluble nitrogen and phosphorous indicates similar degradation with resolution. © Springer-Verlag 2004.
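The kind of categorical resampling this abstract evaluates can be sketched as a majority-vote coarsening of a land-cover grid: fine pixels are aggregated into larger blocks, and minority classes within each block are lost, which is one driver of the correlation degradation reported at coarser pixel sizes. The grid below is a toy example, not data from the study.

```python
from collections import Counter

def coarsen_majority(grid, k):
    """Aggregate a square categorical grid into non-overlapping k x k
    blocks, assigning each block the most frequent class it contains.
    Assumes the grid size is divisible by k."""
    n = len(grid)
    out = []
    for bi in range(0, n, k):
        row = []
        for bj in range(0, n, k):
            cells = [grid[i][j] for i in range(bi, bi + k)
                                for j in range(bj, bj + k)]
            row.append(Counter(cells).most_common(1)[0][0])
        out.append(row)
    return out

# Toy land-cover classes: 1 = forest, 2 = cropland, 3 = water
land = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 2, 2],
    [3, 1, 2, 2],
]
coarse = coarsen_majority(land, 2)
print(coarse)  # the isolated class-1 pixel in the bottom-left block is lost
```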
Extremal bootstrapping: go with the flow
El-Showk, Sheer
2016-01-01
The extremal functional method determines approximate solutions to the constraints of crossing symmetry, which saturate bounds on the space of unitary CFTs. We show that such solutions are characterized by extremality conditions, which may be used to flow continuously along the boundaries of parameter space. Along the flow there is generically no further need for optimization, which dramatically reduces computational requirements, bringing calculations from the realm of computing clusters to laptops. Conceptually, extremality sheds light on possible ways to bootstrap without positivity, extending the method to non-unitary theories, and implies that theories saturating bounds, and especially those sitting at kinks, have unusually sparse spectra. We discuss several applications, including the first high-precision bootstrap of a non-unitary CFT.
The $(2,0)$ superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
We develop the conformal bootstrap program for six-dimensional conformal field theories with $(2,0)$ supersymmetry, focusing on the universal four-point function of stress tensor multiplets. We review the solution of the superconformal Ward identities and describe the superconformal block decomposition of this correlator. We apply numerical bootstrap techniques to derive bounds on OPE coefficients and scaling dimensions from the constraints of crossing symmetry and unitarity. We also derive analytic results for the large spin spectrum using the lightcone expansion of the crossing equation. Our principal result is strong evidence that the $A_1$ theory realizes the minimal allowed central charge $(c=25)$ for any interacting $(2,0)$ theory. This implies that the full stress tensor four-point function of the $A_1$ theory is the unique unitary solution to the crossing symmetry equation at $c=25$. For this theory, we estimate the scaling dimensions of the lightest unprotected operators appearing in the stress tenso...
Bootstrapping Deep Lexical Resources: Resources for Courses
Baldwin, Timothy
2007-01-01
We propose a range of deep lexical acquisition methods which make use of morphological, syntactic and ontological language resources to model word similarity and bootstrap from a seed lexicon. The different methods are deployed in learning lexical items for a precision grammar, and shown to each have strengths and weaknesses over different word classes. A particular focus of this paper is the relative accessibility of different language resource types, and predicted "bang for the buck" associated with each in deep lexical acquisition applications.
TASI Lectures on the Conformal Bootstrap
Simmons-Duffin, David
2016-01-01
These notes are from courses given at TASI and the Advanced Strings School in summer 2015. Starting from principles of quantum field theory and the assumption of a traceless stress tensor, we develop the basics of conformal field theory, including conformal Ward identities, radial quantization, reflection positivity, the operator product expansion, and conformal blocks. We end with an introduction to numerical bootstrap methods, focusing on the 2d and 3d Ising models.
DEFF Research Database (Denmark)
Huang, Shaojun; Mathe, Laszlo; Teodorescu, Remus
2013-01-01
the proper switching instances needed for the resampling modulation technique. The software implementation of the proposed phase shifted PWM (PS-PWM) method, and its application in a distributed control system for MMC, are fully discussed in this paper. Simulation and experiment results show that the proposed solution can realize the resampled uniform PWM and provide high effective sampling frequency and low time delay, which is critical for the distributed control of MMC.
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they also do not share the same critical range for the item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size.
Moreno, Claudia E.; Guevara, Roger; Sánchez-Rojas, Gerardo; Téllez, Dianeis; Verdú, José R.
2008-01-01
Environmental assessment at the community level in highly diverse ecosystems is limited by taxonomic constraints and statistical methods requiring true replicates. Our objective was to show how diverse systems can be studied at the community level using higher taxa as biodiversity surrogates, and re-sampling methods to allow comparisons. To illustrate this we compared the abundance, richness, evenness and diversity of the litter fauna in a pine-oak forest in central Mexico among seasons, sites and collecting methods. We also assessed changes in the abundance of trophic guilds and evaluated the relationships between community parameters and litter attributes. With the direct search method we observed differences in the rate of taxa accumulation between sites. Bootstrap analysis showed that abundance varied significantly between seasons and sampling methods, but not between sites. In contrast, diversity and evenness were significantly higher at the managed than at the non-managed site. Tree regression models show that abundance varied mainly between seasons, whereas taxa richness was affected by litter attributes (composition and moisture content). The abundance of trophic guilds varied among methods and seasons, but overall we found that parasitoids, predators and detritivores decreased under management. Therefore, although our results suggest that management has positive effects on the richness and diversity of litter fauna, the analysis of trophic guilds revealed a contrasting story. Our results indicate that functional groups and re-sampling methods may be used as tools for describing community patterns in highly diverse systems. Also, the higher taxa surrogacy could be seen as a preliminary approach when it is not possible to identify the specimens at a low taxonomic level in a reasonable period of time and in a context of limited financial resources, but further studies are needed to test whether the results are specific to a system or whether they are general.
Is Bootstrap Really Helpful in Point Process Statistics?
Snethlage, Martin
2000-01-01
There are some papers which describe the use of bootstrap techniques in point process statistics. The aim of the present paper is to show that the form in which bootstrap is used there is dubious. In case of variance estimation of pair correlation function estimators the used bootstrap techniques lead to results which can be obtained simpler without simulation; furthermore, they differ from the desired results. The problem to obtain confidence regions for the intensity function of inhomogeneo...
A PARAMETRIC BOOTSTRAP USING THE FIRST FOUR MOMENTS OF THE RESIDUALS
Pierre-Eric Treyens
2007-01-01
We consider linear regression models and we suppose that disturbances are either Gaussian or non-Gaussian. Until now, within the framework of the bootstrap, we thought that the error in rejection probability (ERP) had the same rate of convergence with the parametric bootstrap or the nonparametric bootstrap. For linear data generating processes (DGP) we show in this paper that this assertion is false if the skewness and/or kurtosis coefficients of the distribution of the disturbances are non-null. ...
Stationary bootstrapping realized volatility under market microstructure noise
Hwang, Eunju; Shin, Dong Wan
2013-01-01
Large-sample validity is proved for stationary bootstrapping of a bias-corrected realized volatility under market microstructure noise, which enables us to construct a bootstrap confidence interval of integrated volatility. A finite-sample simulation shows that the stationary bootstrapping confidence interval outperforms existing ones which are constructed ignoring market microstructure noise or using asymptotic normality for the bias-corrected realized volatility.
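The resampling scheme the abstract builds on, the stationary bootstrap of Politis and Romano, can be sketched as follows: blocks start at random positions, their lengths are geometric with mean 1/p, and indices wrap around the end of the series so the resampled process is itself stationary. The bias correction and realized-volatility details of the paper are not reproduced; the return series below is illustrative.

```python
import random

def stationary_bootstrap(series, p, rng):
    """One stationary-bootstrap resample of the same length as series."""
    n = len(series)
    out = []
    i = rng.randrange(n)
    while len(out) < n:
        out.append(series[i])
        if rng.random() < p:        # with probability p, start a new block
            i = rng.randrange(n)
        else:                       # otherwise continue the current block,
            i = (i + 1) % n         # wrapping around the end of the series
    return out

rng = random.Random(1)
returns = [0.01, -0.02, 0.015, 0.005, -0.01, 0.02, -0.005, 0.0]
sample = stationary_bootstrap(returns, p=0.3, rng=rng)
```

With p = 0.3 the expected block length is 1/0.3 ≈ 3.3 observations; repeating this resample B times and recomputing the statistic on each draw yields the bootstrap confidence interval.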
Bootstrap Results From the State Space From Representation of the Heath-Jarrow-Morton Model
Ram Bhar; Carl Chiarella
1996-01-01
This paper builds upon the authors' previous work on transformation of the Heath-Jarrow-Morton (HJM) model of the term structure of interest rates to state space form for a fairly general class of volatility specification including stochastic variables. Estimation of this volatility function is at the heart of the identification of the HJM model. The paper develops a bootstrap procedure for the HJM model cast into the non-linear filtering framework to analyse the statistical significance of t...
Testing for time-varying fractional cointegration using the bootstrap approach
Simwaka, Kisu
2012-01-01
Fractional cointegration has attracted interest in time series econometrics in recent years (see among others, Dittmann 2004). According to Engle and Granger (1987), the concept of fractional cointegration was introduced to generalize the traditional cointegration to the long memory framework. Although cointegration tests have been developed for the traditional cointegration framework, these tests do not take into account fractional cointegration. This paper proposes a bootstrap procedure to ...
Bootstrap confidence intervals for the process capability index under half-logistic distribution
Wararit Panichkitkosolkul
2012-01-01
This study concerns the construction of bootstrap confidence intervals for the process capability index in the case of half-logistic distribution. The bootstrap confidence intervals applied consist of the standard bootstrap confidence interval, percentile bootstrap confidence interval and bias-corrected percentile bootstrap confidence interval. Using Monte Carlo simulations, the estimated coverage probabilities and average widths of bootstrap confidence intervals are compared, with results showing ...
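Two of the interval types the study compares can be sketched generically. The code below applies the standard (normal-approximation) and percentile bootstrap intervals to the sample mean rather than to the capability index itself, and is a stdlib illustration, not the study's code; the data and the 95% level are assumptions.

```python
import random
import statistics

def bootstrap_cis(data, stat, B=2000, alpha=0.05, seed=0):
    """Return (standard, percentile) bootstrap CIs for stat(data)."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(B))
    theta = stat(data)
    se = statistics.stdev(reps)
    z = 1.96  # approximate normal quantile for a 95% interval
    standard = (theta - z * se, theta + z * se)
    # Percentile interval: empirical quantiles of the bootstrap replicates
    lo = reps[int((alpha / 2) * B)]
    hi = reps[int((1 - alpha / 2) * B) - 1]
    return standard, (lo, hi)

data = [4.1, 3.8, 4.5, 4.0, 4.2, 3.9, 4.4, 4.1, 3.7, 4.3]
standard, percentile = bootstrap_cis(data, statistics.mean)
```

The bias-corrected percentile interval additionally shifts the quantile levels by the estimated median bias of the replicates; it is omitted here for brevity.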
Accidental Symmetries and the Conformal Bootstrap
Chester, Shai M; Iliesiu, Luca V; Klebanov, Igor R; Pufu, Silviu S; Yacoby, Ran
2015-01-01
We study an ${\\cal N} = 2$ supersymmetric generalization of the three-dimensional critical $O(N)$ vector model that is described by $N+1$ chiral superfields with superpotential $W = g_1 X \\sum_i Z_i^2 + g_2 X^3$. By combining the tools of the conformal bootstrap with results obtained through supersymmetric localization, we argue that this model exhibits a symmetry enhancement at the infrared superconformal fixed point due to $g_2$ flowing to zero. This example is special in that the existence of an infrared fixed point with $g_1, g_2$…
A bootstrap approach to bump hunting
Silverman, B. W.
1982-01-01
An important question in cluster analysis and pattern recognition is the determination of the number of clusters into which a given population should be divided. Frequently, particularly when certain specific clustering methods are being used, the number of clusters is taken to be equal to the number of modes, or local maxima, in the probability density function underlying the given data set. The use of kernel density estimates in mode estimation is discussed. The test statistic to be used is defined and a bootstrap technique for assessing significance is given. An illustrative application is followed by an examination of the asymptotic behavior of the test statistic.
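The mode-counting step underlying this test can be sketched with a Gaussian kernel density estimate evaluated on a grid. In the full procedure one finds the smallest bandwidth yielding k modes and bootstraps from the corresponding density to assess significance; the sketch below only shows how the mode count depends on bandwidth, using simulated two-cluster data.

```python
import math
import random

def kde(x, data, h):
    """Gaussian kernel density estimate at point x with bandwidth h."""
    const = len(data) * h * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data) / const

def count_modes(data, h, grid_size=200):
    """Count local maxima of the KDE on a regular grid."""
    lo, hi = min(data) - 3 * h, max(data) + 3 * h
    xs = [lo + i * (hi - lo) / grid_size for i in range(grid_size + 1)]
    ys = [kde(x, data, h) for x in xs]
    return sum(1 for i in range(1, len(ys) - 1)
               if ys[i] > ys[i - 1] and ys[i] > ys[i + 1])

random.seed(3)
# Two well-separated simulated clusters
data = [random.gauss(0, 0.3) for _ in range(50)] + \
       [random.gauss(4, 0.3) for _ in range(50)]
m_small = count_modes(data, h=0.3)  # small bandwidth resolves the clusters
m_large = count_modes(data, h=3.0)  # large bandwidth smooths them into one
print(m_small, m_large)
```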
Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap
Calzada, Maria E.; Gardner, Holly
2011-01-01
The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric the Student's t confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10, the results…
Enders, Craig K.
2002-01-01
Proposed a method for extending the Bollen-Stine bootstrap model (K. Bollen and R. Stine, 1992) fit to structural equation models with missing data. Developed a Statistical Analysis System macro program to implement this procedure, and assessed its usefulness in a simulation. The new method yielded model rejection rates close to the nominal 5%…
van de Schoot, Rens; Strohmeier, Dagmar
2011-01-01
In the present paper, the application of a parametric bootstrap procedure, as described by van de Schoot, Hoijtink, and Dekovic (2010), will be applied to demonstrate that a direct test of an informative hypothesis offers more informative results compared to testing traditional null hypotheses against catch-all rivals. Also, more power can be…
Learning web development with Bootstrap and AngularJS
Radford, Stephen
2015-01-01
Whether you know a little about Bootstrap or AngularJS, or you're a complete beginner, this book will enhance your capabilities in both frameworks and you'll build a fully functional web app. A working knowledge of HTML, CSS, and JavaScript is required to fully get to grips with Bootstrap and AngularJS.
On bootstrap sample size in extreme value theory
J.L. Geluk (Jaap); L.F.M. de Haan (Laurens)
2002-01-01
It has been known for a long time that for bootstrapping the probability distribution of the maximum of a sample consistently, the bootstrap sample size needs to be of smaller order than the original sample size. See Jun Shao and Dongsheng Tu (1995), Ex. 3.9, p. 123. We show that the same
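The failure of the full-size bootstrap for the sample maximum has a simple numeric illustration: in an n-out-of-n resample, the bootstrap maximum equals the observed maximum with probability 1 - (1 - 1/n)^n ≈ 1 - 1/e ≈ 0.63, regardless of the underlying distribution, so the bootstrap distribution of the maximum is badly degenerate; taking a resample of size m = o(n) relieves this. The simulation below (an illustration, not from the paper) checks that probability.

```python
import random

random.seed(7)
n = 200
data = [random.random() for _ in range(n)]  # any continuous distribution works
obs_max = max(data)

# Fraction of full-size (n-out-of-n) resamples whose maximum coincides
# with the observed sample maximum.
B = 1000
hits = sum(
    1 for _ in range(B)
    if max(random.choice(data) for _ in range(n)) == obs_max
)
ratio = hits / B
print(f"P(bootstrap max == sample max) ≈ {ratio:.2f}")  # near 1 - 1/e
```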
Bootstrap Estimates of Standard Errors in Generalizability Theory
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Bootstrapping and Bartlett corrections in the cointegrated VAR model
P.H. Omtzigt; S. Fachin
2003-01-01
The small sample properties of tests on long-run coefficients in cointegrated systems are still a matter of concern to applied econometricians. We compare the performance of the Bartlett correction, the bootstrap and the fast double bootstrap for tests on cointegration parameters in the maximum lik
Resampling of an Image by Block-Based Interpolation or Decimation with Compensation
Directory of Open Access Journals (Sweden)
M. Kapinos
2000-06-01
Full Text Available Due to multiple standards on digital coding of image it is expected that conversion from one picture format to another will be quite necessary for display or recording of different format sources. Conventional approach of 2-D sampling rate conversion by polyphase filters requires relatively large memory and computational power. Therefore, a new efficient method for image resampling has been presented. The proposed approach performs resampling block by block with overlap. To minimize the overlap, special block interpolation kernels are used; a one-pixel overlap gives satisfactory results for most practical images. The proposed method can be efficiently applied to image communication systems where block transforms are used for data compression.
A robust Kalman framework with resampling and optimal smoothing.
Kautz, Thomas; Eskofier, Bjoern M
2015-02-27
The Kalman filter (KF) is an extremely powerful and versatile tool for signal processing that has been applied extensively in various fields. We introduce a novel Kalman-based analysis procedure that encompasses robustness towards outliers, Kalman smoothing and real-time conversion from non-uniformly sampled inputs to a constant output rate. These features have been mostly treated independently, so that not all of their benefits could be exploited at the same time. Here, we present a coherent analysis procedure that combines the aforementioned features and their benefits. To facilitate utilization of the proposed methodology and to ensure optimal performance, we also introduce a procedure to calculate all necessary parameters. Thereby, we substantially expand the versatility of one of the most widely-used filtering approaches, taking full advantage of its most prevalent extensions. The applicability and superior performance of the proposed methods are demonstrated using simulated and real data. The possible areas of applications for the presented analysis procedure range from movement analysis over medical imaging, brain-computer interfaces to robot navigation or meteorological studies.
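The predict/update recursion that this framework extends can be shown in its simplest form: a one-dimensional constant-level model. This is a minimal sketch of the basic Kalman filter only; the robustness to outliers, resampling to a constant output rate and optimal smoothing proposed in the paper are not reproduced, and the noise variances below are assumed values.

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a constant-level state.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict: state assumed constant, uncertainty grows
        k = p / (p + r)          # Kalman gain balances prior vs. measurement
        x = x + k * (z - x)      # update with the measurement residual
        p = (1 - k) * p
        estimates.append(x)
    return estimates

random.seed(0)
true_level = 1.0
zs = [true_level + random.gauss(0, 0.2) for _ in range(100)]
est = kalman_1d(zs)
print(f"final estimate: {est[-1]:.2f}")
```

With a small q the filter averages heavily over past measurements, so the estimate settles near the true level despite the noisy observations.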
Building Intuitions about Statistical Inference Based on Resampling
Watson, Jane; Chance, Beth
2012-01-01
Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on for…
Conformal bootstrap, universality and gravitational scattering
Directory of Open Access Journals (Sweden)
Steven Jackson
2015-12-01
Full Text Available We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large c and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
Conformal Bootstrap, Universality and Gravitational Scattering
Jackson, Steven; Verlinde, Herman
2014-01-01
We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large $c$, a sparse light spectrum and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches with the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
Resampling-based approaches to study variation in morphological modularity.
Directory of Open Access Journals (Sweden)
Carmelo Fruciano
Full Text Available Modularity has been suggested to be connected to evolvability because a higher degree of independence among parts allows them to evolve as separate units. Recently, the Escoufier RV coefficient has been proposed as a measure of the degree of integration between modules in multivariate morphometric datasets. However, it has been shown, using randomly simulated datasets, that the value of the RV coefficient depends on sample size. Also, so far there is no statistical test for the difference in the RV coefficient between a priori defined groups of observations. Here, we (1) use a rarefaction analysis to show that the value of the RV coefficient depends on sample size in real geometric morphometric datasets as well; (2) propose a permutation procedure to test for the difference in the RV coefficient between a priori defined groups of observations; (3) show, through simulations, that such a permutation procedure has an appropriate Type I error; (4) suggest that a rarefaction procedure could be used to obtain sample-size-corrected values of the RV coefficient; and (5) propose a nearest-neighbor procedure that could be used when studying the variation of modularity in geographic space. The approaches outlined here, readily extendable to non-morphometric datasets, allow study of the variation in the degree of integration between a priori defined modules. A Java application that performs the proposed test through a graphical user interface has also been developed and is available at the Morphometrics at Stony Brook Web page (http://life.bio.sunysb.edu/morph/).
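The RV coefficient and a label-shuffling permutation test of the kind proposed can be sketched as follows; this is a generic illustration on simulated data (sample sizes, group labels and permutation count are assumptions), not the Java application described in the abstract:

```python
import numpy as np

def rv_coefficient(X, Y):
    """Escoufier's RV coefficient between two blocks of variables."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxy = Xc.T @ Yc
    Sxx = Xc.T @ Xc
    Syy = Yc.T @ Yc
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))

def rv_group_permutation_test(X, Y, groups, n_perm=499, seed=0):
    """Permutation p-value for the difference in RV between two groups:
    group labels are shuffled across observations under the null."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(groups)
    g0, g1 = np.unique(labels)

    def stat(lab):
        return abs(rv_coefficient(X[lab == g0], Y[lab == g0])
                   - rv_coefficient(X[lab == g1], Y[lab == g1]))

    observed = stat(labels)
    count = sum(stat(rng.permutation(labels)) >= observed for _ in range(n_perm))
    return observed, (count + 1) / (n_perm + 1)

# Two hypothetical groups sharing the same block structure (null is true).
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 3))
Y = X @ rng.standard_normal((3, 2)) + 0.1 * rng.standard_normal((40, 2))
groups = np.repeat([0, 1], 20)
obs, pval = rv_group_permutation_test(X, Y, groups)
```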
Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'
DEFF Research Database (Denmark)
de Nijs, Robin
2015-01-01
In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed...
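The Poisson resampling idea referred to in the comment can be sketched by binomial thinning, using the fact that keeping each of N ~ Poisson(lam) events independently with probability p yields a Poisson(p*lam) count; the image size and count level below are illustrative assumptions:

```python
import numpy as np

def poisson_resample(counts, fraction=0.5, seed=0):
    """Simulate a reduced-count acquisition by binomial thinning.

    If a pixel value is Poisson(lam) and each recorded event is kept
    independently with probability `fraction`, the result is
    Poisson(fraction * lam), which is the basis of the resampling idea.
    """
    rng = np.random.default_rng(seed)
    return rng.binomial(counts, fraction)

# Simulated full-count image (lam = 100 counts per pixel, illustrative).
rng = np.random.default_rng(2)
image = rng.poisson(lam=100.0, size=(64, 64))
half = poisson_resample(image, 0.5)
```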
Using Resampling To Estimate the Precision of an Empirical Standard-Setting Method.
Muijtjens, Arno M. M.; Kramer, Anneke W. M.; Kaufman, David M.; Van der Vleuten, Cees P. M.
2003-01-01
Developed a method to estimate the cutscore precisions for empirical standard-setting methods by using resampling. Illustrated the method with two actual datasets consisting of 86 Dutch medical residents and 155 Canadian medical students taking objective structured clinical examinations. Results show the applicability of the method. (SLD)
A Steady-State Genetic Algorithm with Resampling for Noisy Inventory Control
Prestwich, S.; Tarim, S.A.; Rossi, R.; Hnich, B.
2008-01-01
Noisy fitness functions occur in many practical applications of evolutionary computation. A standard technique for solving these problems is fitness resampling, but this may be inefficient or require a large population, and combined with elitism it may overvalue chromosomes or reduce genetic diversity.
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang
2010-10-01
Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general purpose approach to statistical inferences, the bootstrap has found wide applications in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to the inference approach based on the asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of the bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of the Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
Properties of bootstrap tests for N-of-1 studies
Lin, Sharon Xiaowen; Morrison, Leanne; Smith, Peter; Hargood, Charlie; Weal, Mark; Yardley, Lucy
2016-01-01
N-of-1 study designs involve the collection and analysis of repeated measures data from an individual not using an intervention and using an intervention. This study explores the use of semi-parametric and parametric bootstrap tests in the analysis of N-of-1 studies under a single time series framework in the presence of autocorrelation. When the Type I error rates of bootstrap tests are compared to Wald tests, our results show that the bootstrap tests have more desirable properties. We compa...
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
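A sketch of the proposed procedure (distance statistics on normalized histograms plus a bootstrap significance level) might look as follows; the pooled-resampling null, sample sizes and binning are illustrative assumptions, not the cloud-object pipeline itself:

```python
import numpy as np

def euclidean(p, q):
    return np.sqrt(np.sum((p - q) ** 2))

def kuiper(p, q):
    d = np.cumsum(p) - np.cumsum(q)   # difference of cumulative histograms
    return d.max() - d.min()

def jeffries_matusita(p, q):
    return np.sqrt(2.0 * (1.0 - np.sum(np.sqrt(p * q))))

def bootstrap_pvalue(x, y, bins, distance, n_boot=999, seed=0):
    """Significance of a distance between two normalized histograms,
    with the null distribution obtained by resampling the pooled data."""
    rng = np.random.default_rng(seed)
    hist = lambda a: np.histogram(a, bins=bins)[0] / len(a)
    observed = distance(hist(x), hist(y))
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_boot):
        bx = rng.choice(pooled, size=len(x), replace=True)
        by = rng.choice(pooled, size=len(y), replace=True)
        count += distance(hist(bx), hist(by)) >= observed
    return observed, (count + 1) / (n_boot + 1)

# Two samples drawn from visibly different distributions.
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 500)
y = rng.normal(1.0, 1.0, 500)
bins = np.linspace(-4.0, 5.0, 21)
dist, pval = bootstrap_pvalue(x, y, bins, euclidean)
```

Any of the three distance functions can be passed in place of `euclidean`, mirroring the comparison made in the paper.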
Filter and modified stepwedge bootstrap sensitometry in medical radiography
Energy Technology Data Exchange (ETDEWEB)
Yoshida, Akira
1988-05-01
Two new bootstrap methods for determining characteristic curves of radiographic screen/film systems are presented: filter bootstrap sensitometry and modified stepwedge bootstrap sensitometry. Both are intensity-scale sensitometries, since the radiation intensity can be varied through use of a combination of inverse square and metal filters. Characteristic curves obtained by these methods are compared with those from screen/film systems using inverse square sensitometry as a reference standard of accuracy. The precision of all three methods is better than ±2%, with agreement among the methods generally being within 3% over the useful density range. These bootstrap methods make it possible to obtain characteristic curves that agree well with those from the inverse square method over a relatively short distance, making radiographic sensitometry practical and convenient at most medical institutions.
'Bootstrap' Configuration for Multistage Pulse-Tube Coolers
Nguyen, Bich; Nguyen, Lauren
2008-01-01
A bootstrap configuration has been proposed for multistage pulse-tube coolers that, for instance, provide final-stage cooling to temperatures as low as 20 K. The bootstrap configuration supplants the conventional configuration, in which customarily the warm heat exchangers of all stages reject heat at ambient temperature. In the bootstrap configuration, the warm heat exchanger, the inertance tube, and the reservoir of each stage would be thermally anchored to the cold heat exchanger of the next warmer stage. The bootstrapped configuration is superior to the conventional setup, in some cases increasing the 20 K cooler's coefficient of performance two-fold over that of an otherwise equivalent conventional layout. The increased efficiency could translate into less power consumption, less cooler mass, and/or lower cost for a given amount of cooling.
On the range of validity of the autoregressive sieve bootstrap
Kreiss, Jens-Peter; Politis, Dimitris N; 10.1214/11-AOS900
2012-01-01
We explore the limits of the autoregressive (AR) sieve bootstrap, and show that its applicability extends well beyond the realm of linear time series as has been previously thought. In particular, for appropriate statistics, the AR-sieve bootstrap is valid for stationary processes possessing a general Wold-type autoregressive representation with respect to a white noise; in essence, this includes all stationary, purely nondeterministic processes, whose spectral density is everywhere positive. Our main theorem provides a simple and effective tool in assessing whether the AR-sieve bootstrap is asymptotically valid in any given situation. In effect, the large-sample distribution of the statistic in question must only depend on the first and second order moments of the process; prominent examples include the sample mean and the spectral density. As a counterexample, we show how the AR-sieve bootstrap is not always valid for the sample autocovariance even when the underlying process is linear.
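The AR-sieve bootstrap for a statistic such as the sample mean can be sketched as follows; the least-squares fit, sieve order and series length are illustrative choices, and the burn-in is omitted for brevity:

```python
import numpy as np

def ar_sieve_bootstrap(x, p, n_boot=500, seed=0):
    """AR(p)-sieve bootstrap replicates of the sample mean.

    Fits AR(p) by least squares, resamples the centered residuals and
    regenerates series of the same length.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    # Least-squares AR(p) fit: xc[t] ~ phi[0]*xc[t-1] + ... + phi[p-1]*xc[t-p]
    Z = np.column_stack([xc[p - k - 1:n - k - 1] for k in range(p)])
    phi, *_ = np.linalg.lstsq(Z, xc[p:], rcond=None)
    resid = xc[p:] - Z @ phi
    resid = resid - resid.mean()
    means = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.choice(resid, size=n, replace=True)
        xb = np.zeros(n + p)
        for t in range(p, n + p):
            # phi pairs with the p most recent values, newest first.
            xb[t] = phi @ xb[t - p:t][::-1] + e[t - p]
        means[b] = x.mean() + xb[p:].mean()
    return means

# AR(1) series with coefficient 0.5; the sieve uses p = 2.
rng = np.random.default_rng(4)
eps = rng.standard_normal(500)
x = np.empty(500)
x[0] = eps[0]
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + eps[t]
boot_means = ar_sieve_bootstrap(x, p=2)
```

The spread of `boot_means` estimates the sampling distribution of the mean, which is exactly the kind of first-and-second-moment statistic the theorem covers.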
Robust parametric bootstrap test with MOM estimator: An alternative to independent sample t-test
Harun, Nurul Hanis; Yusof, Zahayu Md
2014-12-01
Normality and homogeneity are two major assumptions that need to be fulfilled when using the independent sample t-test. However, not all data satisfy these assumptions, and the result produced by the independent sample t-test then becomes invalid. Therefore, the alternative is to use a robust statistical procedure to handle the problems of nonnormality and variance heterogeneity. This study proposes the Parametric Bootstrap test with the popular robust estimators MADn and Tn, which empirically determine the amount of trimming. The Type I error rates produced by each procedure were examined and compared with a classical parametric test and a nonparametric test, namely the independent sample t-test and the Mann-Whitney test, respectively. 5000 simulated data sets were used to generate the Type I error for each procedure. The findings indicate that the Parametric Bootstrap test with MADn and Tn produces the best Type I error control compared to the independent sample t-test and the Mann-Whitney test under nonnormal distributions, heterogeneous variances and unbalanced designs. Finally, the performance of each procedure was demonstrated using real data.
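A generic bootstrap two-sample test of this flavor can be sketched with a trimmed mean standing in for the MADn/Tn-based criteria of the paper (which are not reproduced here); groups are recentered so the null hypothesis holds before resampling:

```python
import numpy as np

def trimmed_mean(x, prop=0.2):
    x = np.sort(x)
    g = int(prop * len(x))
    return x[g:len(x) - g].mean()

def bootstrap_two_sample_test(x, y, n_boot=999, prop=0.2, seed=0):
    """Bootstrap test of equal trimmed means (illustrative stand-in for
    the robust-estimator tests in the paper).

    Each group is recentered at its own trimmed mean so the null holds,
    then resampled with replacement.
    """
    rng = np.random.default_rng(seed)
    observed = abs(trimmed_mean(x, prop) - trimmed_mean(y, prop))
    xc = x - trimmed_mean(x, prop)
    yc = y - trimmed_mean(y, prop)
    count = 0
    for _ in range(n_boot):
        bx = rng.choice(xc, size=len(x), replace=True)
        by = rng.choice(yc, size=len(y), replace=True)
        count += abs(trimmed_mean(bx, prop) - trimmed_mean(by, prop)) >= observed
    return (count + 1) / (n_boot + 1)

# Clearly shifted groups versus identical groups.
rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, 40)
y = rng.normal(2.0, 1.0, 40)
p_shift = bootstrap_two_sample_test(x, y)
p_null = bootstrap_two_sample_test(x, x)
```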
Bootstrap tests in linear models with many regressors
Patrick Richard
2014-01-01
This paper is concerned with bootstrap hypothesis testing in high dimensional linear regression models. Using a theoretical framework recently introduced by Anatolyev (2012), we show that bootstrap F, LR and LM tests are asymptotically valid even when the numbers of estimated parameters and tested restrictions are not asymptotically negligible fractions of the sample size. These results are derived for models with iid error terms, but Monte Carlo evidence suggests that they extend to the wild...
Spectrum of local boundary operators from boundary form factor bootstrap
Szots, M
2007-01-01
Using the recently introduced boundary form factor bootstrap equations, we map the complete space of their solutions for the boundary version of the scaling Lee-Yang model and sinh-Gordon theory. We show that the complete space of solutions, graded by the ultraviolet behaviour of the form factors, can be brought into correspondence with the spectrum of local boundary operators expected from boundary conformal field theory, which is major evidence for the correctness of the boundary form factor bootstrap framework.
On the range of validity of the autoregressive sieve bootstrap
Kreiss, Jens-Peter; Paparoditis, Efstathios; Politis, Dimitris N.
2012-01-01
We explore the limits of the autoregressive (AR) sieve bootstrap, and show that its applicability extends well beyond the realm of linear time series as has been previously thought. In particular, for appropriate statistics, the AR-sieve bootstrap is valid for stationary processes possessing a general Wold-type autoregressive representation with respect to a white noise; in essence, this includes all stationary, purely nondeterministic processes, whose spectral density is everywhere positive....
Bootstrap transition to high beta equilibrium in helical system
International Nuclear Information System (INIS)
It is shown theoretically and computationally that a helical magnetic field, produced by continuously wound helical coils without a toroidal coil, can sustain MHD-stable high-beta plasma. The pressure-driven toroidal current (bootstrap current) cancels the external magnetic field and reduces the MHD potential energy, depending on the plasma beta values. Ramp-up of the heating power input induces a bootstrap transition to higher-beta plasmas with flat-top pressure profiles. The helical pitch parameter dependence of MHD stability is analyzed. (author)
Bootstrapping the statistical uncertainties of NN scattering data
Perez, R Navarro; Arriola, E Ruiz
2014-01-01
We use the Monte Carlo bootstrap as a method to simulate pp and np scattering data below pion production threshold from an initial set of over 6700 experimental mutually $3\\sigma$ consistent data. We compare the results of the bootstrap, with 1020 statistically generated samples of the full database, with the standard covariance matrix method of error propagation. No significant differences in scattering observables and phase shifts are found. This suggests alternative strategies for propagating errors of nuclear forces in nuclear structure calculations.
Bootstrapped Multinomial Logistic Regression on Apnea Detection Using ECG Data
Sanabila, Hadaiq R.; Fanany, Mohamad Ivan; Jatmiko, Wisnu; Arymurthy, Aniati Murni
2010-01-01
In designing a classification system, one of the most important considerations is how well the classifier will adapt and generalize when it is given data from an unknown model distribution. Unlike linear regression, logistic regression has no simple formula to assess its generalization ability. In such cases, bootstrapping offers an advantage over analytical methods thanks to its simplicity. This paper presents an analysis of bootstrapped multinomial logistic regression appli...
BOOTSTRAP-BASED STATISTICAL THRESHOLDING FOR MEG SOURCE RECONSTRUCTION IMAGES
Sekihara, Kensuke; Sahani, Maneesh; Nagarajan, Srikantan S.
2004-01-01
This paper proposes a bootstrap-based statistical method for extracting target source activities from MEG/EEG source reconstruction results. The method requires measurements in a control condition, which contains only non-target source activities. The method derives, at each pixel location, an empirical probability distribution of the non-target source activity using bootstrapped reconstruction obtained from the control period. The statistical threshold that can extract the target source acti...
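The thresholding idea (derive an empirical null distribution per pixel from bootstrapped control-period reconstructions, then threshold at its upper quantile) can be sketched as follows; the data shapes and noise model are illustrative assumptions, not the MEG pipeline itself:

```python
import numpy as np

def bootstrap_threshold(control, n_boot=1000, alpha=0.05, seed=0):
    """Per-pixel statistical threshold from control-condition data.

    control: (n_samples, n_pixels) array of reconstructions containing
    only non-target activity. The empirical null distribution of the
    mean activity is built by bootstrap resampling; pixels whose task
    activity exceeds the (1 - alpha) quantile would be declared targets.
    """
    rng = np.random.default_rng(seed)
    n = control.shape[0]
    stats = np.empty((n_boot, control.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample time samples
        stats[b] = control[idx].mean(axis=0)      # bootstrap mean activity
    return np.quantile(stats, 1.0 - alpha, axis=0)

# Hypothetical control period: 200 samples of pure noise at 10 pixels.
rng = np.random.default_rng(6)
control = rng.standard_normal((200, 10))
thresholds = bootstrap_threshold(control)
```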
Quantum bootstrapping via compressed quantum Hamiltonian learning
International Nuclear Information System (INIS)
A major problem facing the development of quantum computers or large-scale quantum simulators is that general methods for characterizing and controlling them are intractable. We provide a new approach to this problem that uses small quantum simulators to efficiently characterize and learn control models for larger devices. Our protocol achieves this by using Bayesian inference in concert with Lieb–Robinson bounds and interactive quantum learning methods to achieve compressed simulations for characterization. We also show that the Lieb–Robinson velocity is epistemic for our protocol, meaning that information propagates at a rate that depends on the uncertainty in the system Hamiltonian. We illustrate the efficiency of our bootstrapping protocol by showing numerically that an 8 qubit Ising model simulator can be used to calibrate and control a 50 qubit Ising simulator while using only about 750 kilobits of experimental data. Finally, we provide upper bounds for the Fisher information that show that the number of experiments needed to characterize a system rapidly diverges as the duration of the experiments used in the characterization shrinks, which motivates the use of methods such as ours that do not require short evolution times. (fast track communication)
Bootstrapping Object Coreferencing on the Semantic Web
Institute of Scientific and Technical Information of China (English)
Wei Hu; Yu-Zhong Qu; Xing-Zhi Sun
2011-01-01
An object on the Semantic Web is likely to be denoted with several URIs by different parties. Object coreferencing is a process to identify "equivalent" URIs of objects for achieving a better Data Web. In this paper, we propose a bootstrapping approach for object coreferencing on the Semantic Web. For an object URI, we firstly establish a kernel that consists of semantically equivalent URIs from the same-as, (inverse) functional properties and (max-)cardinalities, and then extend the kernel with respect to the textual descriptions (e.g., labels and local names) of URIs. We also propose a trustworthiness-based method to rank the coreferent URIs in the kernel, as well as a similarity-based method for ranking the URIs in the extension of the kernel. We implement the proposed approach, called ObjectCoref, on a large-scale dataset that contains 76 million URIs collected by the Falcons search engine until 2008. The evaluation on precision, relative recall and response time demonstrates the feasibility of our approach. Additionally, we apply the proposed approach to investigate the popularity of the URI alias phenomenon on the current Semantic Web.
Conformal Collider Physics from the Lightcone Bootstrap
Li, Daliang; Poland, David
2015-01-01
We analytically study the lightcone limit of the conformal bootstrap equations for 4-point functions containing global symmetry currents and the stress tensor in 3d CFTs. We show that the contribution of the stress tensor to the anomalous dimensions of large spin double-twist states is negative if and only if the conformal collider physics bounds are satisfied. In the context of AdS/CFT these results indicate a relation between the attractiveness of AdS gravity and positivity of the CFT energy flux. We also study the contribution of non-Abelian conserved currents to the anomalous dimensions of double-twist operators, corresponding to the gauge binding energy of 2-particle states in AdS. We show that the representation of the double-twist state determines the sign of the gauge binding energy if and only if the coefficient appearing in the current 3-point function satisfies a similar bound, which is equivalent to an upper bound on the charge flux asymmetry of the CFT.
Bootstrap inference longitudinal semiparametric regression model
Pane, Rahmawati; Otok, Bambang Widjanarko; Zain, Ismaini; Budiantara, I. Nyoman
2016-02-01
Semiparametric regression contains two components, i.e. a parametric and a nonparametric component. The semiparametric regression model is represented by $y_{ti} = \mu(x'_{ti}, z_{ti}) + \varepsilon_{ti}$, where $\mu(x'_{ti}, z_{ti}) = x'_{ti}\beta + g(z_{ti})$ and $y_{ti}$ is the response variable. It is assumed to have a linear relationship with the predictor variables $x'_{ti} = (x_{ti1}, x_{ti2}, \ldots, x_{tir})$. The random error $\varepsilon_{ti}$, $i = 1, \ldots, n$, $t = 1, \ldots, T$, is normally distributed with zero mean and variance $\sigma^2$, and $g(z_{ti})$ is the nonparametric component. The results of this study show that the PLS approach on longitudinal semiparametric regression models obtains the estimators $\hat{\beta} = [X'H(\lambda)X]^{-1}X'H(\lambda)y$ and $\hat{g}_{\lambda}(z) = M(\lambda)y$. The results also show that the bootstrap is valid on the longitudinal semiparametric regression model with $\hat{g}_{\lambda}^{(b)}(z)$ as the nonparametric component estimator.
Control of bootstrap current in the pedestal region of tokamaks
Energy Technology Data Exchange (ETDEWEB)
Shaing, K. C. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China); Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796 (United States); Lai, A. L. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China)
2013-12-15
The high confinement mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field and a sonic poloidal flow U_{p,m} that consists of the poloidal components of the E×B flow and the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current that is important for the equilibrium and stability of the pedestal of H-mode plasmas is shown to have an expression different from that in the conventional theory. In the limit where ‖U_{p,m}‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled through manipulating U_{p,m} and the gradient of the radial electric field. This, in turn, can control plasma stability such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite when ‖U_{p,m}‖ approaches infinity and to provide indications of how to control the bootstrap current. Approximate analytic expressions for the viscous coefficients that join the results in the banana and plateau-Pfirsch-Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
Tie the straps: Uniform bootstrap confidence bands for bounded influence curve estimators
Härdle, Wolfgang Karl; Ritov, Ya'Acov; Wang, Weining
2013-01-01
We consider theoretical bootstrap coupling techniques for nonparametric robust smoothers and quantile regression, and verify the bootstrap improvement. To cope with the curse of dimensionality, a variant of the coupling bootstrap techniques is developed for additive models with symmetric error distributions, with a further extension to the quantile regression framework. Our bootstrap method can be used in many situations, such as constructing confidence intervals and bands. We demonstrate the bootstrap ...
Classified-edge guided depth resampling for multi-view coding
Lu, Yu; Zhou, Yang; Chen, Hua-hua
2016-01-01
A new depth resampling method for multi-view coding is proposed in this paper. First, the depth video is downsampled by median filtering before encoding. After decoding, the classified edges, including credible edges and probable edges from the aligned texture image and the depth image, are interpolated by the selected diagonal pair, whose intensity difference is the minimum among the four diagonal pairs around the edge pixel. According to the category of edge, the intensity difference is measured by either real depth or percentage depth without any parameter setting. Finally, the resampled depth video and the decoded full-resolution texture video are synthesized into virtual views for the performance evaluation. Experiments on the platform of multi-view high efficiency video coding (HEVC) demonstrate that the proposed method is superior to the contrastive methods in terms of visual quality and rate distortion (RD) performance.
Quasi-Epipolar Resampling of High Resolution Satellite Stereo Imagery for Semi Global Matching
Tatar, N.; Saadatseresht, M.; Arefi, H.; Hadavand, A.
2015-12-01
Semi-global matching is a well-known stereo matching algorithm in the photogrammetric and computer vision communities. Epipolar images are assumed as the input of this algorithm. The epipolar geometry of linear array scanners is not a straight line, as it is in the case of frame cameras. Traditional epipolar resampling algorithms demand rational polynomial coefficients (RPCs), a physical sensor model or ground control points. In this paper we propose a new epipolar resampling method which works without the need for this information. In the proposed method, automatic feature extraction algorithms are employed to generate corresponding features for registering stereo pairs. The original images are also divided into small tiles. By omitting the need for extra information, the speed of the matching algorithm is increased and the memory requirements are decreased. Our experiments on a GeoEye-1 stereo pair captured over Qom city in Iran demonstrate that the epipolar images are generated with sub-pixel accuracy.
Compression, restoration, resampling, ‘compressive sensing’: fast transforms in digital imaging
International Nuclear Information System (INIS)
Transform image processing methods are methods that work in the domains of image transforms, such as the discrete Fourier, discrete cosine, wavelet and similar transforms. They are the basic tools in image compression, image restoration, image resampling and geometrical transformations, and can be traced back to the early 1970s. The paper presents a review of these methods with emphasis on their comparison and relationships, from the very first steps of transform image compression methods, to adaptive and local adaptive transform domain filters for image restoration, to methods of precise image resampling and image reconstruction from sparse samples, and up to the 'compressive sensing' approach that has gained popularity in the last few years. The review has a tutorial character and purpose. (topical review)
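Precise resampling in the transform domain, one of the methods reviewed, can be sketched for the simplest case of a band-limited periodic signal via DFT zero-padding; this is a generic illustration, not the review's specific algorithms:

```python
import numpy as np

def fft_resample(x, m):
    """Resample a uniformly sampled periodic signal to m points by
    zero-padding (m > len(x)) or truncating (m < len(x)) its spectrum.

    Assumes the signal is band-limited with negligible energy at the
    Nyquist frequency.
    """
    n = len(x)
    X = np.fft.rfft(x)
    if m > n:
        X = np.concatenate([X, np.zeros(m // 2 + 1 - len(X), dtype=complex)])
    else:
        X = X[:m // 2 + 1]
    # Rescale so amplitudes are preserved under the changed length.
    return np.fft.irfft(X, m) * (m / n)

# A sine at 3 cycles per record, sampled at 32 points, resampled to 128.
x = np.sin(2 * np.pi * 3 * np.arange(32) / 32)
y = fft_resample(x, 128)
```

For a signal that truly is band-limited and periodic, this reproduces the ideal sinc-interpolated values exactly, which is why transform-domain resampling can be "precise" in the sense the review describes.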
Automotive FMCW Radar-enhanced Range Estimation via a Local Resampling Fourier Transform
Cailing Wang; Huajun Liu; Guang Han; Xiaoyuan Jing
2016-01-01
In complex traffic scenarios, more accurate measurement and discrimination for an automotive frequency-modulated continuous-wave (FMCW) radar is required for intelligent robots, driverless cars and driver-assistant systems. A more accurate range estimation method based on a local resampling Fourier transform (LRFT) for a FMCW radar is developed in this paper. Radar signal correlation in the phase space achieves a higher signal-to-noise ratio (SNR) for more accurate ranging, and the LRFT - whi...
A Bootstrap Approach to Martian Manufacturing
Dorais, Gregory A.
2004-01-01
In-Situ Resource Utilization (ISRU) is an essential element of any affordable strategy for a sustained human presence on Mars. Ideally, Martian habitats would be extremely massive to allow plenty of room to comfortably live and work, as well as to protect the occupants from the environment. Moreover, transportation and power generation systems would also require significant mass if affordable. For our approach to ISRU, we use the industrialization of the U.S. as a metaphor. The 19th century started with small blacksmith shops and ended with massive steel mills primarily accomplished by blacksmiths increasing their production capacity and product size to create larger shops, which produced small mills, which produced the large steel mills that industrialized the country. Most of the mass of a steel mill is comprised of steel in simple shapes, which are produced and repaired with few pieces of equipment also mostly made of steel in basic shapes. Due to this simplicity, we expect that the 19th century manufacturing growth can be repeated on Mars in the 21st century using robots as the primary labor force. We suggest a "bootstrap" approach to manufacturing on Mars that uses a "seed" manufacturing system that uses regolith to create major structural components and spare parts. The regolith would be melted, foamed, and sintered as needed to fabricate parts using casting and solid freeform fabrication techniques. Complex components, such as electronics, would be brought from Earth and integrated as needed. These parts would be assembled to create additional manufacturing systems, which can be both more capable and higher capacity. These subsequent manufacturing systems could refine vast amounts of raw materials to create large components, as well as assemble equipment, habitats, pressure vessels, cranes, pipelines, railways, trains, power generation stations, and other facilities needed to economically maintain a sustained human presence on Mars.
Stability of response characteristics of a Delphi panel: application of bootstrap data expansion
Directory of Open Access Journals (Sweden)
Cole Bryan R
2005-12-01
Full Text Available Abstract Background Delphi surveys with panels of experts in a particular area of interest have been widely utilized in the fields of clinical medicine, nursing practice, medical education and healthcare services. Despite this wide applicability of the Delphi methodology, there is no clear identification of what constitutes a sufficient number of Delphi survey participants to ensure stability of results. Methods The study analyzed the response characteristics from the first round of a Delphi survey conducted with 23 experts in healthcare quality and patient safety. The panel members had similar training and subject matter understanding of the Malcolm Baldrige Criteria for Performance Excellence in Healthcare. The raw data from the first round sampling, which usually contains the largest diversity of responses, were augmented via bootstrap sampling to obtain computer-generated results for two larger samples obtained by sampling with replacement. Response characteristics (mean, trimmed mean, standard deviation and 95% confidence intervals) for 54 survey items were compared for the responses of the 23 actual study participants and two computer-generated samples of 1000 and 2000 resampling iterations. Results The results from this study indicate that the response characteristics of a small expert panel in a well-defined knowledge area are stable in light of augmented sampling. Conclusion Panels of similarly trained experts (who possess a general understanding in the field of interest) provide effective and reliable utilization of a small sample from a limited number of experts in a field of study to develop reliable criteria that inform judgment and support effective decision-making.
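The bootstrap expansion of a single survey item's responses can be sketched as follows; the Likert values, trim fraction and iteration count below are illustrative assumptions, not the study's data:

```python
import numpy as np

def bootstrap_item_stats(responses, n_iter=1000, trim=0.1, seed=0):
    """Bootstrap expansion of one survey item's responses.

    Returns the mean, trimmed mean, standard deviation and a 95%
    percentile CI of the mean over resampled panels.
    """
    rng = np.random.default_rng(seed)
    responses = np.asarray(responses, dtype=float)
    n = len(responses)
    means = np.empty(n_iter)
    for b in range(n_iter):
        # Resample a panel of the same size with replacement.
        means[b] = rng.choice(responses, size=n, replace=True).mean()
    g = int(trim * n)
    srt = np.sort(responses)
    return {
        "mean": responses.mean(),
        "trimmed_mean": srt[g:n - g].mean(),
        "sd": responses.std(ddof=1),
        "ci95": tuple(np.quantile(means, [0.025, 0.975])),
    }

# Hypothetical 5-point Likert responses from a 23-member panel.
item = np.array([4, 5, 3, 4, 4, 5, 2, 4, 3, 5, 4, 4, 3, 5, 4, 2, 4, 5, 3, 4, 4, 5, 3])
stats = bootstrap_item_stats(item)
```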
Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping
Directory of Open Access Journals (Sweden)
Irene Borra-Serrano
2015-08-01
Full Text Available Unmanned aerial vehicles (UAVs) combined with different spectral range sensors are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds, at very early phenological stages, are similar spectrally and in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique to create a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based image analysis (OBIA) implemented for early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images taken at an altitude of 30 m, and that the RS-image data at altitudes of 60 and 100 m were able to provide accurate weed cover and herbicide application maps compared with UAV-images from real flights.
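The resampling operation itself, producing a coarser image as a stand-in for a higher-altitude flight, can be sketched as block-mean downsampling. This is a simplification: the paper's exact resampling kernel is not specified here, and the factor-of-2 example is illustrative.

```python
import numpy as np

def resample_block_mean(img, factor):
    """Downsample a 2-D array by averaging factor x factor blocks,
    mimicking the coarser ground sampling of a higher-altitude image."""
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor  # trim to a multiple of factor
    blocks = img[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))
```

For example, downsampling a 30 m-altitude image by a factor of 2 approximates the ground sampling of a 60 m flight.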
Energy Technology Data Exchange (ETDEWEB)
Klan, M.S.; Shankle, S.A.; Kellogg, M.A.
1990-06-01
In this study a set of multicomponent case weights applicable to residential survey information was prepared for the Bonneville Power Administration (BPA) by the Pacific Northwest Laboratory (PNL). These case weights were prepared for the 1985 resample of respondents of an earlier BPA residential energy survey -- the original 1983 survey and the subsequent surveys administered to the 1985 PNWRES resample were designed to gather information from households concerning their use of energy and related data. The PNWRES samples were drawn using stratified random sampling techniques that allow the survey results to represent the characteristics of the overall Pacific Northwest population of residential utility accounts. In order to determine the characteristics of the population, however, the survey results must be appropriately weighted. Case weights were developed for the 1983 PNWRES by Lou Harris and Associates, Inc. This report briefly documents PNL's extension of the weighting methodology to the subsequent 1985 PNWRES resample, and describes the resulting case weights generated by PNL. 9 refs., 5 tabs.
He, Qingbo; Wang, Jun; Hu, Fei; Kong, Fanrang
2013-10-01
The diagnosis of train bearing defects plays a significant role in maintaining the safety of railway transport. Among various defect detection techniques, acoustic diagnosis is capable of detecting incipient defects of a train bearing and is also suitable for wayside monitoring. However, the wayside acoustic signal will be corrupted by the Doppler effect and by surrounding heavy noise. This paper proposes a solution to overcome these two difficulties in wayside acoustic diagnosis. In the solution, a dynamic resampling method is first presented to reduce the Doppler effect, and then an adaptive stochastic resonance (ASR) method is proposed to enhance the defective characteristic frequency automatically with the aid of noise. The resampling method is based on a frequency variation curve extracted from the time-frequency distribution (TFD) of an acoustic signal by dynamically minimizing local cost functions. For the ASR method, a genetic algorithm is introduced to adaptively select the optimal parameter of the multiscale noise tuning (MST)-based stochastic resonance (SR) method. The proposed wayside acoustic diagnostic scheme combines signal resampling and information enhancement, and is thus expected to be effective in wayside defective bearing detection. The experimental study verifies the effectiveness of the proposed solution.
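The core of the resampling step, warping time so that a Doppler-shifted frequency curve becomes constant, can be sketched as follows. This is a simplified stand-in for the paper's TFD-based method: the instantaneous-frequency curve is assumed to be already extracted, and the time grid uniform.

```python
import numpy as np

def doppler_resample(signal, t, inst_freq):
    """Resample a signal so that a time-varying (Doppler-shifted)
    frequency curve becomes constant. inst_freq is the perceived
    frequency curve, e.g. extracted from a time-frequency distribution."""
    # Cumulative phase of the perceived tone (t assumed uniformly spaced).
    phase = np.cumsum(inst_freq) * (t[1] - t[0])
    # New sample times: uniform in phase instead of uniform in time.
    uniform_phase = np.linspace(phase[0], phase[-1], len(t))
    warped_t = np.interp(uniform_phase, phase, t)
    return np.interp(warped_t, t, signal)
```

For a constant instantaneous frequency the warp is the identity, so the signal passes through unchanged; a frequency sweep is straightened into a fixed tone.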
Wendt, Herwig; Abry, Patrice
2007-01-01
Scaling analysis is nowadays becoming a standard tool in statistical signal processing. It mostly consists of estimating scaling attributes which in turn are involved in standard tasks such as detection, identification or classification. Recently, we proposed that confidence interval or hypothesis test design for scaling analysis could be based on nonparametric bootstrap approaches. We showed that such procedures are efficient for deciding whether data are better modeled with Gaussian fraction...
Plaza, D. A.; Keyser, R.; De Lannoy, G. J. M.; Giustarini, L.; Matgen, P.; Pauwels, V. R. N.
2012-01-01
The Sequential Importance Sampling with Resampling (SISR) particle filter and the SISR with parameter resampling particle filter (SISR-PR) are evaluated for their performance in soil moisture assimilation and the consequent effect on baseflow generation. With respect to the resulting soil moisture time series, both filters perform appropriately. However, the SISR filter has a negative effect on the baseflow due to inconsistency between the parameter values and the states after the assimilatio...
Locality, bulk equations of motion and the conformal bootstrap
Kabat, Daniel
2016-01-01
We develop an approach to construct local bulk operators in a CFT to order 1/N^2. Since 4-point functions are not fixed by conformal invariance we use the OPE to categorize possible forms for a bulk operator. Using previous results on 3-point functions we construct a local bulk operator in each OPE channel. We then impose the condition that the bulk operators constructed in different channels agree, and hence give rise to a well-defined bulk operator. We refer to this condition as the "bulk bootstrap." We argue and explicitly show in some examples that the bulk bootstrap leads to some of the same results as the regular conformal bootstrap. In fact the bulk bootstrap provides an easier way to determine some CFT data, since it does not require knowing the form of the conformal blocks. This analysis clarifies previous results on the relation between bulk locality and the bootstrap for theories with a 1/N expansion, and it identifies a simple and direct way in which OPE coefficients and anomalous dimensions deter...
Adaptive wavelet detection of transients using the bootstrap
Hewer, Gary A.; Kuo, Wei; Peterson, Lawrence A.
1996-03-01
A Daubechies wavelet-based bootstrap detection strategy based on the research of Carmona was applied to a set of test signals. The detector was a function of the d-scales. The adaptive detection statistics were derived using Efron's bootstrap methodology, which relieved us from having to make parametric assumptions about the underlying noise and offered a method of overcoming the constraints of modeling the detector statistics. The test set of signals used to evaluate the Daubechies/bootstrap pulse detector were generated with a Hewlett-Packard Fast Agile Signal Simulator (FASS). These video pulses, with varying signal-to-noise ratios (SNRs), included unmodulated, linear chirp, and Barker phase-code modulations baseband (IF) video pulses mixed with additive white Gaussian noise. Simulated examples illustrating the bootstrap methodology are presented, along with a complete set of constant false alarm rate (CFAR) detection statistics for the test signals. The CFAR curves clearly show that the wavelet bootstrap can adaptively detect transient pulses at low SNRs.
Point Set Denoising Using Bootstrap-Based Radial Basis Function
Ramli, Ahmad; Abd. Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study. PMID:27315105
Design and Implementation of a Bootstrap Trust Chain
Institute of Scientific and Technical Information of China (English)
YU Fajiang; ZHANG Huanguo
2006-01-01
The chain of trust in the bootstrap process is the basis of whole-system trust in the trusted computing group (TCG) definition. This paper presents a design and implementation of a bootstrap trust chain in a PC based on Windows and today's commodity hardware, depending merely on the availability of an embedded security module (ESM). The ESM and a security-enhanced BIOS form the root of trust; PMBR (Pre-MBR), a checking agent stored in the ESM, checks the integrity of the boot data and the Windows kernel. In the end, the paper analyzes the mathematical expression of the chain of trust and the runtime performance compared with the common booting process. The trust chain bootstrap greatly strengthens the security of the personal computer system, and affects runtime performance by adding only about 12% to booting time.
A Bootstrap Algebraic Multilevel method for Markov Chains
Bolten, M; Brannick, J; Frommer, A; Kahl, K; Livshits, I
2010-01-01
This work concerns the development of an Algebraic Multilevel method for computing stationary vectors of Markov chains. We present an efficient Bootstrap Algebraic Multilevel method for this task. In our proposed approach, we employ a multilevel eigensolver, with interpolation built using ideas based on compatible relaxation, algebraic distances, and least squares fitting of test vectors. Our adaptive variational strategy for computation of the state vector of a given Markov chain is then a combination of this multilevel eigensolver and associated multilevel preconditioned GMRES iterations. We show that the Bootstrap AMG eigensolver by itself can efficiently compute accurate approximations to the state vector. An additional benefit of the Bootstrap approach is that it yields an accurate interpolation operator for many other eigenmodes. This in turn allows for the use of the resulting AMG hierarchy to accelerate the MLE steps using standard multigrid correction steps. The proposed approach is applied to a rang...
Addressing the P2P Bootstrap Problem for Small Networks
Wolinsky, David Isaac; Boykin, P Oscar; Figueiredo, Renato
2010-01-01
P2P overlays provide a framework for building distributed applications consisting of few to many resources with features including self-configuration, scalability, and resilience to node failures. Such systems have been successfully adopted in large-scale services for content delivery networks, file sharing, and data storage. In small-scale systems, they can be useful to address privacy concerns and for network applications that lack dedicated servers. The bootstrap problem, finding an existing peer in the overlay, remains a challenge to enabling these services for small-scale P2P systems. In large networks, the solution to the bootstrap problem has been the use of dedicated services, though creating and maintaining these systems requires expertise and resources, which constrain their usefulness and make them unappealing for small-scale systems. This paper surveys and summarizes requirements that allow peers potentially constrained by network connectivity to bootstrap small-scale overlays through the use of e...
No unitary bootstrap for the fractal Ising model
Golden, John
2015-01-01
We consider the conformal bootstrap for spacetime dimension $1
Zhu, Feng; Feng, Weiyue; Wang, Huajian; Huang, Shaosen; Lv, Yisong; Chen, Yong
2013-01-01
X-ray spectral imaging provides quantitative imaging of trace elements in biological samples with high sensitivity. We propose a novel algorithm to improve the signal-to-noise ratio (SNR) of X-ray spectral images that have low photon counts. First, we estimate the image data area that belongs to the homogeneous parts through confidence interval testing. Then, we apply Poisson regression through its maximum likelihood estimation on this area to estimate the true photon counts from the Poisson-noise-corrupted data. Unlike other denoising methods based on regression analysis, we use bootstrap resampling methods to ensure the accuracy of the regression estimation. Finally, we use a robust local nonparametric regression method to estimate the baseline and subsequently subtract it from the X-ray spectral data to further improve the SNR of the data. Experiments on several real samples show that the proposed method performs better than some state-of-the-art approaches to ensure accuracy and precision for quantit...
van de Water, S; Kraan, A C; Breedveld, S; Schillemans, W; Teguh, D N; Kooy, H M; Madden, T M; Heijmen, B J M; Hoogeman, M S
2013-10-01
This study investigates whether 'pencil beam resampling', i.e. iterative selection and weight optimization of randomly placed pencil beams (PBs), reduces optimization time and improves plan quality for multi-criteria optimization in intensity-modulated proton therapy, compared with traditional modes in which PBs are distributed over a regular grid. Resampling consisted of repeatedly performing: (1) random selection of candidate PBs from a very fine grid, (2) inverse multi-criteria optimization, and (3) exclusion of low-weight PBs. The newly selected candidate PBs were added to the PBs in the existing solution, causing the solution to improve with each iteration. Resampling and traditional regular grid planning were implemented into our in-house developed multi-criteria treatment planning system 'Erasmus iCycle'. The system optimizes objectives successively according to their priorities as defined in the so-called 'wish-list'. For five head-and-neck cancer patients and two PB widths (3 and 6 mm sigma at 230 MeV), treatment plans were generated using: (1) resampling, (2) anisotropic regular grids and (3) isotropic regular grids, while using varying sample sizes (resampling) or grid spacings (regular grid). We assessed differences in optimization time (for comparable plan quality) and in plan quality parameters (for comparable optimization time). Resampling reduced optimization time by a factor of 2.8 and 5.6 on average (7.8 and 17.0 at maximum) compared with the use of anisotropic and isotropic grids, respectively. Doses to organs-at-risk were generally reduced when using resampling, with median dose reductions ranging from 0.0 to 3.0 Gy (maximum: 14.3 Gy, relative: 0%-42%) compared with anisotropic grids and from -0.3 to 2.6 Gy (maximum: 11.4 Gy, relative: -4%-19%) compared with isotropic grids. Resampling was especially effective when using thin PBs (3 mm sigma). Resampling plans contained on average fewer PBs, energy layers and protons than anisotropic grid
Bootstrapped efficiency measures of oil blocks in Angola
International Nuclear Information System (INIS)
This paper investigates the technical efficiency of Angola oil blocks over the period 2002-2007. A double bootstrap data envelopment analysis (DEA) model is adopted composed in the first stage of a DEA-variable returns to scale (VRS) model and then followed in the second stage by a bootstrapped truncated regression. Results showed that on average, the technical efficiency has fluctuated over the period of study, but deep and ultradeep oil blocks have generally maintained a consistent efficiency level. Policy implications are derived.
Bootstrapping Critical Ising Model on Three Dimensional Real Projective Space
Nakayama, Yu
2016-04-01
Given conformal data on a flat Euclidean space, we use crosscap conformal bootstrap equations to numerically solve the Lee-Yang model as well as the critical Ising model on a three dimensional real projective space. We check the rapid convergence of our bootstrap program in two dimensions from the exact solutions available. Based on the comparison, we estimate that our systematic error on the numerically solved one-point functions of the critical Ising model on a three dimensional real projective space is less than 1%. Our method opens up a novel way to solve conformal field theories on nontrivial geometries.
PyCFTBoot: A flexible interface for the conformal bootstrap
Behan, Connor
2016-01-01
We introduce PyCFTBoot, a wrapper designed to reduce the barrier to entry in conformal bootstrap calculations that require semidefinite programming. Symengine and SDPB are used for the most intensive symbolic and numerical steps respectively. After reviewing the built-in algorithms for conformal blocks, we explain how to use the code through a number of examples that verify past results. As an application, we show that the multi-correlator bootstrap still appears to single out the Wilson-Fisher fixed points as special theories in dimensions between 3 and 4 despite the recent proof that they violate unitarity.
Bootstrap Tests and Confidence Regions for Functions of a Covariance Matrix
Beran, Rudolf; Srivastava, Muni S.
1985-01-01
Bootstrap tests and confidence regions for functions of the population covariance matrix have the desired asymptotic levels, provided model restrictions, such as multiple eigenvalues in the covariance matrix, are taken into account in designing the bootstrap algorithm.
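A minimal sketch of such a bootstrap confidence region, here for the largest eigenvalue of a covariance matrix, is given below. The data are synthetic and the population covariance has a distinct largest eigenvalue, so the multiple-eigenvalue restriction mentioned in the abstract does not need special handling in this toy case.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data whose population covariance has a distinct largest
# eigenvalue (3.0), so no multiple-eigenvalue adjustment is needed.
X = rng.multivariate_normal([0.0, 0.0, 0.0], np.diag([3.0, 1.0, 1.0]), size=200)

def largest_eig(sample):
    # eigvalsh returns eigenvalues in ascending order.
    return np.linalg.eigvalsh(np.cov(sample, rowvar=False))[-1]

# Nonparametric bootstrap: resample rows, recompute the statistic.
boot = np.array([largest_eig(X[rng.integers(0, len(X), len(X))])
                 for _ in range(2000)])
ci = np.percentile(boot, [2.5, 97.5])  # percentile confidence interval
```

When eigenvalues coincide, the statistic is not smooth and the naive resampling above would need the design adjustments the paper describes.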
Il bootstrap. Un'applicazione informatica per un problema di ricampionamento
Morana, Maria Teresa; Porcu, Mariano
2002-01-01
The aim of this paper is to give a simple introduction to bootstrap techniques by showing a basic computer algorithm. The algorithm displays, step by step, how to determine a bootstrap confidence interval.
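A basic percentile-bootstrap confidence-interval algorithm of the kind described can be sketched in a few steps. This is a generic version, not the authors' code:

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=0):
    """Step-by-step percentile bootstrap confidence interval."""
    rng = np.random.default_rng(seed)
    n = len(data)
    # Step 1: draw n_boot resamples of size n with replacement.
    # Step 2: compute the statistic on each resample.
    reps = np.array([stat(rng.choice(data, n, replace=True))
                     for _ in range(n_boot)])
    # Step 3: take the empirical alpha/2 and 1 - alpha/2 quantiles.
    lo, hi = np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi
```

Called as `bootstrap_ci(sample, np.mean)`, it returns a 95% interval for the mean without any normality assumption.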
Jackknife resampling technique on mocks: an alternative method for covariance matrix estimation
Escoffier, S; Tilquin, A; Pisani, A; Aguichine, A; de la Torre, S; Ealet, A; Gillard, W; Jullo, E
2016-01-01
We present a fast and robust alternative method to compute covariance matrices for cosmology studies. Our method is based on jackknife resampling applied to simulation mock catalogues. Using a set of 600 BOSS DR11 mock catalogues as a reference, we find that the jackknife technique gives a similar galaxy clustering covariance matrix estimate while requiring a smaller number of mocks. A comparison of convergence rates shows that $\sim$7 times fewer simulations are needed to get a similar accuracy on the variance. We expect this technique to be applicable in any analysis where the number of available N-body simulations is low.
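The delete-one jackknife covariance estimate underlying this approach can be sketched as follows. This is the generic formula; a synthetic array stands in for the mock catalogues and clustering statistics.

```python
import numpy as np

def jackknife_covariance(measurements):
    """Delete-one jackknife covariance estimate.

    measurements: (n_sub, n_bins) array, one row per jackknife region
    (or mock) of some binned statistic, e.g. a correlation function."""
    n = len(measurements)
    # Delete-one means: average of all rows except row i.
    total = measurements.sum(axis=0)
    deleted = (total - measurements) / (n - 1)
    mean = deleted.mean(axis=0)
    diff = deleted - mean
    # Jackknife scaling factor (n - 1) / n.
    return (n - 1) / n * diff.T @ diff
```

The result is a symmetric `n_bins x n_bins` matrix estimating the covariance of the mean statistic.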
Siana Halim; Herman Mallian
2006-01-01
The bootstrap is a lively research area. A lot of ideas are around and have led to quite different proposals. In this paper we briefly sketch some bootstrap methods for independent and dependent data. Finally we give a bootstrap example for constructing a confidence interval in forecasting for stationary data. Abstract in Bahasa Indonesia (translated): The bootstrap is a research area that keeps developing. Many ideas and different proposals have been given by researchers. However, ...
First and second order analysis for periodic random arrays using block bootstrap methods
Dudek, Anna E.
2016-01-01
In this paper, row-wise periodically correlated triangular arrays are considered. The period length is assumed to grow in time. The Fourier decomposition of the mean and autocovariance functions for each row of the matrix is presented. To construct bootstrap estimators of the Fourier coefficients, two block bootstrap techniques are used: the circular version of the Generalized Seasonal Block Bootstrap and the Circular Block Bootstrap. Consistency results for both methods are presented....
Bootstrapping Rapidity Anomalous Dimension for Transverse-Momentum Resummation
Energy Technology Data Exchange (ETDEWEB)
Li, Ye [Fermilab]; Zhu, Hua Xing [MIT, Cambridge, CTP]
2016-04-05
The soft function relevant for transverse-momentum resummation for Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of a bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. An intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
A neural network based reputation bootstrapping approach for service selection
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers. However, with this kind of method, either newcomers or existing services will be favoured. In this paper, we present a novel reputation bootstrapping approach, where correlations between the features and performance of existing services are learned through an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider are also incorporated for reputation bootstrapping if available. The proposed reputation bootstrapping approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach are presented at the end.
Automatic bootstrapping of a morphable face model using multiple components
Haar, F.B. ter; Veltkamp, R.C.
2009-01-01
We present a new bootstrapping algorithm to automatically enhance a 3D morphable face model with new face data. Our algorithm is based on a morphable model fitting method that uses a set of predefined face components. This fitting method produces accurate model fits to 3D face data with noise and ho
More on analytic bootstrap for O(N) models
Dey, Parijat; Sen, Kallol
2016-01-01
This note is an extension of a recent work on the analytical bootstrapping of $O(N)$ models. An additional feature of the $O(N)$ model is that the OPE contains trace and antisymmetric operators apart from the symmetric-traceless objects appearing in the OPE of the singlet sector. This in addition to the stress tensor $(T_{\mu\
Metastability Thresholds for Anisotropic Bootstrap Percolation in Three Dimensions
Van Enter, A.C.D.; Fey, A.
2012-01-01
In this paper we analyze several anisotropic bootstrap percolation models in three dimensions. We present the order of magnitude for the metastability thresholds for a fairly general class of models. In our proofs, we use an adaptation of the technique of dimensional reduction. We find that the orde
Bootstrapping the energy flow in the beginning of life
Hengeveld, R.; Fedonkin, M.A.
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in t
Sidecoin: a snapshot mechanism for bootstrapping a blockchain
Krug, Joseph; Peterson, Jack
2015-01-01
Sidecoin is a mechanism that allows a snapshot to be taken of Bitcoin's blockchain. We compile a list of Bitcoin's unspent transaction outputs, then use these outputs and their corresponding balances to bootstrap a new blockchain. This allows the preservation of Bitcoin's economic state in the context of a new blockchain, which may provide new features and technical innovations.
A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages ob...
Normal Limits, Nonnormal Limits, and the Bootstrap for Quantiles of Dependent Data
Sharipov, O. Sh.; Wendler, M.
2012-01-01
We will show under very weak conditions on differentiability and dependence that the central limit theorem for quantiles holds and that the block bootstrap is weakly consistent. Under slightly stronger conditions, the bootstrap is strongly consistent. Without the differentiability condition, quantiles might have a non-normal asymptotic distribution and the bootstrap might fail.
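A circular block bootstrap for a quantile of dependent data, the scheme whose consistency is studied above, can be sketched as follows. The AR(1)-style series and the block length are illustrative choices, not taken from the paper.

```python
import numpy as np

def circular_block_bootstrap(x, block_len, rng):
    """One circular-block-bootstrap resample of a dependent series:
    wrap the series around, draw random starting points, and
    concatenate fixed-length blocks until n observations are collected."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n, size=n_blocks)
    idx = (starts[:, None] + np.arange(block_len)) % n  # wrap-around indexing
    return x[idx].ravel()[:n]

# Bootstrap distribution of the median of an AR(1)-like series.
rng = np.random.default_rng(0)
e = rng.normal(size=500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + e[t]
medians = [np.median(circular_block_bootstrap(x, 25, rng)) for _ in range(500)]
```

Blocks, rather than single observations, are resampled so that the short-range dependence of the series is preserved within each block.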
Spinella, Sarah
2011-01-01
Because result replicability is essential to science and difficult to achieve through external replication, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
Noncentral limit theorem and the bootstrap for quantiles of dependent data
Sharipov, Olimjon S.; Wendler, Martin
2012-01-01
We will show under minimal conditions on differentiability and dependence that the central limit theorem for quantiles holds and that the block bootstrap is weakly consistent. Under slightly stronger conditions, the bootstrap is strongly consistent. Without the differentiability condition, quantiles might have a non-normal asymptotic distribution and the bootstrap might fail.
Random Quadratic Forms and the Bootstrap for U-Statistics
Dehling, H.; Mikosch, T.
1994-01-01
We study the bootstrap distribution for U-statistics with special emphasis on the degenerate case. For the Efron bootstrap we give a short proof of the consistency using Mallows' metrics. We also study the i.i.d. weighted bootstrap [GRAPHICS] where (X(i)) and (xi(i)) are two i.i.d. sequences, indepe
Resampling nucleotide sequences with closest-neighbor trimming and its comparison to other methods.
Directory of Open Access Journals (Sweden)
Kouki Yonezawa
Full Text Available A large number of nucleotide sequences of various pathogens are available in public databases. The growth of the datasets has resulted in an enormous increase in computational costs. Moreover, due to differences in surveillance activities, the number of sequences found in databases varies from one country to another and from year to year. Therefore, it is important to study resampling methods to reduce the sampling bias. A novel algorithm, called the closest-neighbor trimming method, that resamples a given number of sequences from a large nucleotide sequence dataset was proposed. The performance of the proposed algorithm was compared with that of other algorithms by using the nucleotide sequences of human H3N2 influenza viruses. We compared the closest-neighbor trimming method with the naive hierarchical clustering algorithm and the k-medoids clustering algorithm. Genetic information accumulated in public databases contains sampling bias. The closest-neighbor trimming method can thin out densely sampled sequences from a given dataset. Since nucleotide sequences are among the most widely used materials for life sciences, we anticipate that applying our algorithm to various datasets will help reduce sampling bias.
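The closest-neighbor trimming idea can be sketched with a naive implementation on toy sequences. This is O(n^2) per step and purely illustrative: the published algorithm is more efficient, and Hamming distance stands in for whatever distance the paper uses.

```python
from itertools import combinations

def hamming(a, b):
    """Number of positions at which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def closest_neighbor_trim(seqs, n_keep):
    """Repeatedly find the closest pair of sequences and drop one of
    them, thinning densely sampled regions until n_keep remain."""
    seqs = list(seqs)
    while len(seqs) > n_keep:
        i, j = min(combinations(range(len(seqs)), 2),
                   key=lambda p: hamming(seqs[p[0]], seqs[p[1]]))
        del seqs[j]  # drop one member of the closest pair (choice arbitrary)
    return seqs
```

Near-duplicate sequences are removed first, so sparsely sampled lineages survive the trimming.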
A genetic resampling particle filter for freeway traffic-state estimation
Institute of Scientific and Technical Information of China (English)
Bi Jun; Guan Wei; Qi Long-Tao
2012-01-01
On-line estimation of the state of traffic based on data sampled by electronic detectors is important for intelligent traffic management and control. Because a nonlinear feature exists in the traffic state, and because particle filters have good characteristics when it comes to solving nonlinear problems, a genetic resampling particle filter is proposed to estimate the state of freeway traffic. In this paper, a freeway section of the northern third ring road in the city of Beijing in China is considered as the experimental object. By analysing the traffic-state characteristics of the freeway, the traffic is modeled based on the second-order validated macroscopic traffic flow model. In order to solve the particle degeneration issue in the performance of the particle filter, a genetic mechanism is introduced into the resampling process. The realization of a genetic particle filter for freeway traffic-state estimation is discussed in detail, and the filter's estimation performance is validated and evaluated against the acquired experimental data.
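For context, the resampling step that the paper augments with a genetic mechanism is, in its standard form, multinomial resampling: draw N particles with replacement in proportion to their importance weights and reset the weights. The sketch below shows only that baseline step, not the genetic variant.

```python
import random

def multinomial_resample(particles, weights, rng=random):
    # Standard particle-filter resampling: draw N particles with
    # probability proportional to the importance weights, then reset
    # all weights to uniform. Degeneracy arises because low-weight
    # particles are rarely drawn; the paper's genetic operators act here.
    n = len(particles)
    new_particles = rng.choices(particles, weights=weights, k=n)
    return new_particles, [1.0 / n] * n
```

A particle with zero weight is never drawn, so after resampling only the high-weight states survive.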
ENSO-conditioned weather resampling method for seasonal ensemble streamflow prediction
Beckers, Joost V. L.; Weerts, Albrecht H.; Tijdeman, Erik; Welles, Edwin
2016-08-01
Oceanic-atmospheric climate modes, such as El Niño-Southern Oscillation (ENSO), are known to affect the local streamflow regime in many rivers around the world. A new method is proposed to incorporate climate mode information into the well-known ensemble streamflow prediction (ESP) method for seasonal forecasting. The ESP is conditioned on an ENSO index in two steps. First, a number of original historical ESP traces are selected based on similarity between the index value in the historical year and the index value at the time of forecast. In the second step, additional ensemble traces are generated by a stochastic ENSO-conditioned weather resampler. These resampled traces compensate for the reduction of ensemble size in the first step and prevent degradation of skill at forecasting stations that are less affected by ENSO. The skill of the ENSO-conditioned ESP is evaluated over 50 years of seasonal hindcasts of streamflows at three test stations in the Columbia River basin in the US Pacific Northwest. An improvement in forecast skill of 5 to 10 % is found for two test stations. The streamflows at the third station are less affected by ENSO and no change in forecast skill is found here.
A bootstrapping soft shrinkage approach for variable selection in chemical modeling.
Deng, Bai-Chuan; Yun, Yong-Huan; Cao, Dong-Sheng; Yin, Yu-Long; Wang, Wei-Ting; Lu, Hong-Mei; Luo, Qian-Yi; Liang, Yi-Zeng
2016-02-18
In this study, a new variable selection method called bootstrapping soft shrinkage (BOSS) is developed. It is derived from the idea of weighted bootstrap sampling (WBS) and model population analysis (MPA). The weights of variables are determined based on the absolute values of regression coefficients. WBS is applied according to the weights to generate sub-models, and MPA is used to analyze the sub-models to update the weights of the variables. The optimization procedure follows the rule of soft shrinkage, in which less important variables are not eliminated directly but are assigned smaller weights. The algorithm runs iteratively and terminates when the number of variables reaches one. The optimal variable set with the lowest root mean squared error of cross-validation (RMSECV) is selected. The method was tested on three groups of near infrared (NIR) spectroscopic datasets, i.e. corn, diesel fuel and soy datasets. Three high-performing variable selection methods, i.e. Monte Carlo uninformative variable elimination (MCUVE), competitive adaptive reweighted sampling (CARS) and genetic algorithm partial least squares (GA-PLS), are used for comparison. The results show that BOSS is promising with improved prediction performance. The Matlab codes for implementing BOSS are freely available on the website: http://www.mathworks.com/matlabcentral/fileexchange/52770-boss.
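The WBS step described above can be illustrated as weighted sampling of variable indices with replacement; the weight vector and sample size below are hypothetical, and this is a sketch of the sampling idea rather than the published BOSS implementation.

```python
import random

def weighted_bootstrap_sample(weights, k, seed=0):
    # WBS step: draw k variable indices with replacement, each index
    # sampled with probability proportional to its current weight
    # (|regression coefficient| in BOSS). Variables whose weight has
    # shrunk to zero are never drawn -- the "soft shrinkage" effect.
    rng = random.Random(seed)
    return rng.choices(range(len(weights)), weights=weights, k=k)
```

In a full BOSS-style loop, the sub-models built from these draws would be analyzed to update the weights before the next iteration.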
BOOTSTRAP WAVELET IN THE NONPARAMETRIC REGRESSION MODEL WITH WEAKLY DEPENDENT PROCESSES
Institute of Scientific and Technical Information of China (English)
林路; 张润楚
2004-01-01
This paper introduces a method of bootstrap wavelet estimation in a nonparametric regression model with weakly dependent processes for both fixed and random designs. The asymptotic bounds for the bias and variance of the bootstrap wavelet estimators are given in the fixed design model. The conditional normality of a modified version of the bootstrap wavelet estimators is obtained in the fixed design model. The consistency of the bootstrap wavelet estimator is also proved in the random design model. These results show that the bootstrap wavelet method is valid for the model with weakly dependent processes.
Directory of Open Access Journals (Sweden)
Rohin Anhal
2013-10-01
Full Text Available The aim of this paper is to examine the direction of causality between real GDP on the one hand and final energy and coal consumption on the other in India, for the period from 1970 to 2011. The methodology adopted is the non-parametric bootstrap procedure, which is used to construct the critical values for the hypothesis of causality. The results of the bootstrap tests show that for total energy consumption, there exists no causal relationship in either direction with the GDP of India. However, if coal consumption is considered, we find evidence in support of unidirectional causality running from coal consumption to GDP. This clearly has important implications for the Indian economy. The most important implication is that curbing coal consumption in order to reduce carbon emissions would in turn have a limiting effect on economic growth. Our analysis contributes to the literature in three distinct ways. First, this is the first paper to use the bootstrap method to examine the growth-energy connection for the Indian economy. Second, we analyze data for the time period 1970 to 2011, thereby utilizing recently available data that has not been used by others. Finally, in contrast to recent studies, we adopt a disaggregated approach for the analysis of the growth-energy nexus by considering not only aggregate energy consumption, but coal consumption as well.
Conformal bootstrap: non-perturbative QFT's under siege
CERN. Geneva
2016-01-01
[Exceptionally in Council Chamber] Originally formulated in the 70's, the conformal bootstrap is the ambitious idea that one can use internal consistency conditions to carve out, and eventually solve, the space of conformal field theories. In this talk I will review recent developments in the field which have boosted this program to a new level. I will present a method to extract quantitative information from strongly interacting theories, such as the 3D Ising model, the O(N) vector model and even systems without a Lagrangian formulation. I will explain how these techniques have led to the world-record determination of several critical exponents. Finally, I will review exact analytical results obtained using bootstrap techniques.
Bootstrap bound for conformal multi-flavor QCD on lattice
Nakayama, Yu
2016-01-01
The recent work by Iha et al. shows an upper bound on the mass anomalous dimension $\gamma_m$ of multi-flavor massless QCD at the renormalization group fixed point from the conformal bootstrap in $SU(N_F)_V$ symmetric conformal field theories under the assumption that the fixed point is realizable with the lattice regularization based on staggered fermions. We show that an almost identical but slightly stronger bound applies to the regularization based on Wilson fermions (or domain wall fermions) by studying the conformal bootstrap in $SU(N_f)_L \times SU(N_f)_R$ symmetric conformal field theories. For $N_f=8$, our bound implies $\gamma_m < 1.31$ to avoid dangerously irrelevant operators that are not compatible with the lattice symmetry.
A bootstrap lunar base: Preliminary design review 2
1987-01-01
A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.
Bootstrap bound for conformal multi-flavor QCD on lattice
Nakayama, Yu
2016-07-01
The recent work by Iha et al. shows an upper bound on the mass anomalous dimension γ_m of multi-flavor massless QCD at the renormalization group fixed point from the conformal bootstrap in SU(N_F)_V symmetric conformal field theories under the assumption that the fixed point is realizable with the lattice regularization based on staggered fermions. We show that the almost identical but slightly stronger bound applies to the regularization based on Wilson fermions (or domain wall fermions) by studying the conformal bootstrap in SU(N_f)_L × SU(N_f)_R symmetric conformal field theories. For N_f = 8, our bound implies γ_m < 1.31 to avoid dangerously irrelevant operators that are not compatible with the lattice symmetry.
Voice Conversion Using Pitch Shifting Algorithm by Time Stretching with PSOLA and Re-Sampling
Mousa, Allam
2010-01-01
Voice changing has many applications in the industrial and commercial fields. This paper emphasizes voice conversion using a pitch shifting method which depends on detecting the pitch of the signal (fundamental frequency) using Simplified Inverse Filter Tracking (SIFT) and changing it according to the target pitch period using time stretching with the Pitch Synchronous Overlap-Add (PSOLA) algorithm, then resampling the signal in order to have the same play rate. The same study was performed to see the effect of voice conversion when some Arabic speech signal is considered. Treatment of certain Arabic voiced vowels and the conversion between male and female speech has shown some expansion or compression in the resulting speech. A comparison in terms of pitch shifting is presented here. Analysis was performed for a single frame and a full segmentation of speech.
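The final resampling step mentioned above (restoring the play rate after PSOLA time-stretching) can be sketched with simple linear interpolation. This is an illustrative stand-in: production pitch shifters typically use band-limited interpolation, and the function name and `ratio` parameter are assumptions, not from the paper.

```python
def resample_linear(signal, ratio):
    # Resample a 1-D signal by linear interpolation. After PSOLA
    # time-stretching by `ratio`, resampling by the same ratio restores
    # the original play rate while keeping the shifted pitch.
    n_out = int(len(signal) * ratio)
    out = []
    for i in range(n_out):
        pos = i / ratio          # position in the input signal
        j = int(pos)
        frac = pos - j
        nxt = signal[min(j + 1, len(signal) - 1)]
        out.append(signal[j] * (1 - frac) + nxt * frac)
    return out
```

Resampling with ratio 2 doubles the number of samples, interpolating halfway values between neighbours.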
A Novel Approach To Detection and Evaluation of Resampled Tampered Images
Directory of Open Access Journals (Sweden)
Amrit Hanuman
2015-08-01
Full Text Available Most digital forgeries use an interpolation function, affecting the underlying statistical distribution of the image pixel values, which, when detected, can be used as evidence of tampering. This paper provides a comparison of interpolation techniques, similar to Lehmann [1], using analyses of the Fourier transform of the image signal, and a quantitative assessment of the interpolation quality after applying selected interpolation functions, alongside an appraisal of computational performance using runtime measurements. A novel algorithm is proposed for detecting locally tampered regions, taking the averaged discrete Fourier transform of the zero-crossing of the second difference of the resampled signal (ADZ). The algorithm was contrasted using precision, recall and specificity metrics against those found in the literature, with comparable results. The interpolation comparison results were similar to those of [1]. The results of the detection algorithm showed that it performed well for determining authentic images, and better than previously proposed algorithms for determining tampered regions.
International Nuclear Information System (INIS)
A method to prepare a set of four climate scenarios for the Netherlands is presented. These scenarios for climate change in 2050 and 2085 (compared to present-day) are intended for general use in climate change adaptation in the Netherlands. An ensemble of eight simulations with the global model EC-Earth and the regional climate model RACMO2 (run at 12 km resolution) is used. For each scenario time horizon, two target values of the global mean temperature rise are chosen based on the spread in the CMIP5 simulations. Next, the corresponding time periods in the EC-Earth/RACMO2 simulations are selected in which these target values of the global temperature rise are reached. The model output for these periods is then resampled using blocks of 5 yr periods. The rationale of resampling is that natural variations in the EC-Earth/RACMO2 ensemble are used to represent (part of the) uncertainty in the CMIP5 projections. Samples are then chosen with the aim of reconstructing the spread in seasonal temperature and precipitation changes in CMIP5 for the Netherlands. These selected samples form the basis of the scenarios. The resulting four scenarios represent 50–80% of the CMIP5 spread for summer and winter changes in seasonal means as well as a limited number of monthly statistics (warm, cold, wet and dry months). The strong point of the method—also in relation to the previous set of the climate scenarios for the Netherlands issued in 2006—is that it preserves nearly all physical inter-variable consistencies as they exist in the original model output in both space and time. (paper)
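The 5 yr block resampling described above amounts to reassembling a model time series from fixed-length blocks taken in a chosen order; the scenario construction then searches for block orders whose statistics match the CMIP5 targets. The helper below is an illustrative sketch of the reassembly step only, with hypothetical names.

```python
def block_resample(series, block_len, block_order):
    # Split the series into consecutive fixed-length blocks, then
    # concatenate the blocks in the requested order. Sampling many
    # block orders yields resampled climates that preserve the
    # within-block inter-variable and temporal structure.
    blocks = [series[i:i + block_len]
              for i in range(0, len(series), block_len)]
    out = []
    for k in block_order:
        out.extend(blocks[k])
    return out
```

For example, a six-step series split into blocks of two can be recombined from the third and first blocks.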
Farmer, W. H.; Over, T. M.; Vogel, R. M.; Archfield, S. A.; Kiang, J. E.
2014-12-01
In ungaged basins, predictions of daily streamflow are essential to responsible and effective management and design of water resources systems. Transfer-based methods are widely used for prediction in ungaged basins (PUB) within a gaged network. Such methods rely on the transfer of information from an index gage to an ungaged site. In what is known as the nearest-neighbor algorithm, the index gage is selected based on geospatial proximity. The predictions offered by any PUB method can be highly uncertain, and it is often difficult to characterize this uncertainty. In the development of predicted streamflow records, understanding the uncertainty of estimates would greatly improve water resources management in ungaged basins. It is proposed that by resampling the sites of the gaged network, with replacement, a set of equally-probable streamflow predictions can be produced for any ungaged site. For a particular day in the record, the percentiles of the distribution of the resampled, predicted streamflows can be used to estimate confidence intervals of the original daily streamflow predictions. This approach is explored in the Southeast United States with a nearest-neighbor application of non-linear spatial interpolation using flow duration curves (QPPQ), a common PUB method. Though some interval re-centering is required to ensure that the best-case prediction falls within the confidence intervals, it is shown that this technique provides a reasonable first-order approximation of prediction uncertainty. Still, the best estimated confidence intervals are shown to consistently under-estimate the nominal confidence. It is hypothesized that this interval contraction is a result of temporal and spatial correlation within the gaged network. Additionally, implications of prediction uncertainty are explored and alternative estimators are considered.
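The percentile-based interval construction described above follows the generic percentile bootstrap. The sketch below shows that generic recipe on a plain data vector, not the authors' network-resampling QPPQ workflow; the statistic, replicate count and seed are assumptions.

```python
import random
import statistics

def bootstrap_percentile_ci(values, stat=statistics.mean,
                            n_boot=2000, alpha=0.05, seed=42):
    # Percentile bootstrap: resample the data with replacement many
    # times, recompute the statistic on each replicate, and read the
    # interval off the sorted replicate distribution.
    rng = random.Random(seed)
    n = len(values)
    reps = sorted(stat(rng.choices(values, k=n)) for _ in range(n_boot))
    return (reps[int(n_boot * alpha / 2)],
            reps[int(n_boot * (1 - alpha / 2)) - 1])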
Non-critical string, Liouville theory and geometric bootstrap hypothesis
Hadasz, L; Hadasz, Leszek; Jaskolski, Zbigniew
2003-01-01
Based on the standard construction of critical string amplitudes, we analyze properties of the longitudinal sector of the non-critical Nambu-Goto string. We demonstrate that it cannot be described by standard (in the sense of BPZ) conformal field theory. As an alternative we propose a new version of the geometric approach to Liouville theory and formulate its basic consistency condition - the geometric bootstrap equation.
'Bootstrap' charging of surfaces composed of multiple materials
Stannard, P. R.; Katz, I.; Parks, D. E.
1981-01-01
The paper examines the charging of a checkerboard array of two materials, only one of which tends to acquire a negative potential alone, using the NASA Charging Analyzer Program (NASCAP). The influence of the charging material's field causes the otherwise 'non-charging' material to acquire a negative potential due to the suppression of its secondary emission ('bootstrap' charging). The NASCAP predictions for the equilibrium potential difference between the two materials are compared to results based on an analytical model.
Higgs Critical Exponents and Conformal Bootstrap in Four Dimensions
DEFF Research Database (Denmark)
Antipin, Oleg; Mølgaard, Esben; Sannino, Francesco
2015-01-01
We investigate relevant properties of composite operators emerging in nonsupersymmetric, four-dimensional gauge-Yukawa theories with interacting conformal fixed points within a precise framework. The theories investigated in this work are structurally similar to the standard model of particle int...... bootstrap results are then compared to precise four dimensional conformal field theoretical results. To accomplish this, it was necessary to calculate explicitly the crossing symmetry relations for the global symmetry group SU($N$)$\\times$SU($N$)....
Spectrum of local boundary operators from boundary form factor bootstrap
Szots, M.; Takacs, G.
2007-01-01
Using the recently introduced boundary form factor bootstrap equations, we map the complete space of their solutions for the boundary version of the scaling Lee-Yang model and sinh-Gordon theory. We show that the complete space of solutions, graded by the ultraviolet behaviour of the form factors can be brought into correspondence with the spectrum of local boundary operators expected from boundary conformal field theory, which is a major evidence for the correctness of the boundary form fact...
Bootstrap and the physical values of $\\pi N$ resonance parameters
Semenov-Tian-Shansky, Kirill M.; Vereshagin, Alexander V.; Vereshagin, Vladimir V.
2007-01-01
This is the 6th paper in the series developing the formalism to manage the effective scattering theory of strong interactions. Relying on the theoretical scheme suggested in our previous publications we concentrate here on the practical aspect and apply our technique to the elastic pion-nucleon scattering amplitude. We test numerically the pion-nucleon spectrum sum rules that follow from the tree level bootstrap constraints. We show how these constraints can be used to estimate the tensor and...
Study and Integrate Bootstrap 3 for OpixManager
Tapani, Zhejia
2010-01-01
ABSTRACT This bachelor thesis describes the study and integration of Bootstrap 3 into OpixManager, with the purpose of improving the user interface of the OpixManager application. OpixManager is built with CodeIgniter and the Model-View-Controller (MVC) framework. The application is used for project management: it includes staff augmentation, customer management, report management and so on, and supports both scrum and traditional software development processes. There are two major parts...
Addressing the P2P Bootstrap Problem for Small Networks
Wolinsky, David Isaac; Juste, Pierre St.; Boykin, P. Oscar; Figueiredo, Renato
2010-01-01
P2P overlays provide a framework for building distributed applications consisting of few to many resources with features including self-configuration, scalability, and resilience to node failures. Such systems have been successfully adopted in large-scale services for content delivery networks, file sharing, and data storage. In small-scale systems, they can be useful to address privacy concerns and for network applications that lack dedicated servers. The bootstrap problem, finding an existi...
Bootstrapping a Five-Loop Amplitude from Steinmann Relations
Caron-Huot, Simon; McLeod, Andrew; von Hippel, Matt
2016-01-01
The analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.
A conformal bootstrap approach to critical percolation in two dimensions
Picco, Marco; Santachiara, Raoul
2016-01-01
We study four-point functions of critical percolation in two dimensions, and more generally of the Potts model. We propose an exact ansatz for the spectrum: an infinite, discrete and non-diagonal combination of representations of the Virasoro algebra. Based on this ansatz, we compute four-point functions using a numerical conformal bootstrap approach. The results agree with Monte-Carlo computations of connectivities of random clusters.
Comparison Of Modified Bootstrap And Conventional Sensitometry In Medical Radiography
Bednarek, Daniel R.; Rudin, Stephen
1980-08-01
A new modified bootstrap approach to sensitometry is presented which provides H and D curves that show almost exact agreement with those obtained using conventional methods. Two bootstrap techniques are described; both involve a combination of inverse-square and stepped wedge modulation of the radiation field and provide intensity-scale sensitometric curves as appropriate for medical radiography. H and D curves obtained with these modified techniques are compared with those obtained for screen-film combinations using inverse-square sensitometry as well as with those obtained for direct x-ray film using time-scale sensitometry. The stepped-wedge of the Wisconsin X-Ray Test Cassette was used in the bootstrap approach since it provides sufficient exposure latitude to encompass the useful density range of medical x-ray film. This approach makes radiographic sensitometry quick and convenient, allowing accurate characteristic curves to be obtained for any screen-film cassette using standard diagnostic equipment.
Necessary Condition for Emergent Symmetry from the Conformal Bootstrap
Nakayama, Yu; Ohtsuki, Tomoki
2016-09-01
We use the conformal bootstrap program to derive the necessary conditions for emergent symmetry enhancement from discrete symmetry [e.g., Z_n] to continuous symmetry [e.g., U(1)] under the renormalization group flow. In three dimensions, in order for Z_2 symmetry to be enhanced to U(1) symmetry, the conformal bootstrap program predicts that the scaling dimension of the order parameter field at the infrared conformal fixed point must satisfy Δ_1 > 1.08. We also obtain the similar necessary conditions for Z_3 symmetry with Δ_1 > 0.580 and Z_4 symmetry with Δ_1 > 0.504 from the simultaneous conformal bootstrap analysis of multiple four-point functions. As applications, we show that our necessary conditions impose severe constraints on the nature of the chiral phase transition in QCD, the deconfinement criticality in Néel valence bond solid transitions, and anisotropic deformations in critical O(n) models. We prove that some fixed points proposed in the literature are unstable under the perturbation that cannot be forbidden by the discrete symmetry. In these situations, the second-order phase transition with enhanced symmetry cannot happen.
Truncatable bootstrap equations in algebraic form and critical surface exponents
Gliozzi, Ferdinando
2016-01-01
We describe examples of drastic truncations of conformal bootstrap equations encoding much more information than that obtained by a direct numerical approach. A three-term truncation of the four point function of a free scalar in any space dimensions provides algebraic identities among conformal block derivatives which generate the exact spectrum of the infinitely many primary operators contributing to it. In boundary conformal field theories, we point out that the appearance of free parameters in the solutions of bootstrap equations is not an artifact of truncations, rather it reflects a physical property of permeable conformal interfaces which are described by the same equations. Surface transitions correspond to isolated points in the parameter space. We are able to locate them in the case of 3d Ising model, thanks to a useful algebraic form of 3d boundary bootstrap equations. It turns out that the low-lying spectra of the surface operators in the ordinary and the special transitions of 3d Ising model form...
Enders, Craig K.
2005-01-01
The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…
Adjorlolo, Clement; Mutanga, Onisimo; Cho, Moses A.; Ismail, Riyad
2013-04-01
In this paper, a user-defined inter-band correlation filter function was used to resample hyperspectral data and thereby mitigate the problem of multicollinearity in classification analysis. The proposed resampling technique convolves the spectral dependence information between a chosen band-centre and its shorter- and longer-wavelength neighbours. A weighting threshold of inter-band correlation (WTC, Pearson's r) was calculated, with r = 1 at the band-centre. Various WTC values (r = 0.99, r = 0.95 and r = 0.90) were assessed, and bands with coefficients beyond a chosen threshold were assigned r = 0. The resultant data were used in the random forest analysis to classify in situ C3 and C4 grass canopy reflectance. The respective WTC datasets yielded improved classification accuracies (kappa = 0.82, 0.79 and 0.76) with less correlated wavebands when compared to resampled Hyperion bands (kappa = 0.76). Overall, the results obtained from this study suggested that resampling of hyperspectral data should account for the spectral dependence information to improve overall classification accuracy as well as reducing the problem of multicollinearity.
Using minimum bootstrap support for splits to construct confidence regions for trees
Directory of Open Access Journals (Sweden)
Edward Susko
2006-01-01
Full Text Available Many of the estimated topologies in phylogenetic studies are presented with the bootstrap support for each of the splits in the topology indicated. If phylogenetic estimation is unbiased, high bootstrap support for a split suggests that there is a good deal of certainty that the split actually is present in the tree and low bootstrap support suggests that one or more of the taxa on one side of the estimated split might in reality be located with taxa on the other side. In the latter case the follow-up questions about how many and which of the taxa could reasonably be incorrectly placed as well as where they might alternatively be placed are not addressed through the presented bootstrap support. We present here an algorithm that finds the set of all trees with minimum bootstrap support for their splits greater than some given value. The output is a ranked list of trees, ranked according to the minimum bootstrap supports for splits in the trees. The number of such trees and their topologies provides useful supplementary information in bootstrap analyses about the reasons for low bootstrap support for splits. We also present ways of quantifying low bootstrap support by considering the set of all topologies with minimum bootstrap greater than some quantity as providing a confidence region of topologies. Using a double bootstrap we are able to choose a cutoff so that the set of topologies with minimum bootstrap support for a split greater than that cutoff gives an approximate 95% confidence region. As with bootstrap support one advantage of the methods is that they are generally applicable to the wide variety of phylogenetic estimation methods.
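The ranking statistic used above (the minimum bootstrap support over the splits of a tree) can be sketched directly once splits are represented as sets of taxa. The representation below is an illustrative assumption; real phylogenetic software works with bipartitions of the full taxon set.

```python
def min_split_support(tree_splits, bootstrap_trees):
    # A split is represented as a frozenset of taxa on one side of a
    # branch; its bootstrap support is the fraction of bootstrap trees
    # containing it. A candidate tree is ranked by the minimum support
    # over all of its splits.
    m = len(bootstrap_trees)
    return min(sum(s in t for t in bootstrap_trees) / m
               for s in tree_splits)
```

A tree whose weakest split appears in only half of the bootstrap trees gets a minimum support of 0.5, regardless of how well supported its other splits are.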
DEFF Research Database (Denmark)
Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.
2012-01-01
A new way of parametric bootstrapping allows similar construction of confidence ellipses applicable on all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach will be applied to Multiple Factor Analysis...... results regardless of the origin of data and the nature of the original variables. The approach is suitable for getting an overview of product confidence intervals and also applicable for data obtained from ‘one repetition’ evaluations. Furthermore, it is a convenient way to get an overview of variations...
Directory of Open Access Journals (Sweden)
Bryan R Conroy
Full Text Available Multivariate decoding models are increasingly being applied to functional magnetic resonance imaging (fMRI) data to interpret the distributed neural activity in the human brain. These models are typically formulated to optimize an objective function that maximizes decoding accuracy. For decoding models trained on full-brain data, this can result in multiple models that yield the same classification accuracy, though some may be more reproducible than others--i.e. small changes to the training set may result in very different voxels being selected. This issue of reproducibility can be partially controlled by regularizing the decoding model. Regularization, along with the cross-validation used to estimate decoding accuracy, typically requires retraining many (often on the order of thousands) of related decoding models. In this paper we describe an approach that uses a combination of bootstrapping and permutation testing to construct both a measure of cross-validated prediction accuracy and model reproducibility of the learned brain maps. This requires re-training our classification method on many re-sampled versions of the fMRI data. Given the size of fMRI datasets, this is normally a time-consuming process. Our approach leverages an algorithm called fast simultaneous training of generalized linear models (FaSTGLZ) to create a family of classifiers in the space of accuracy vs. reproducibility. The convex hull of this family of classifiers can be used to identify a subset of Pareto optimal classifiers, with a single-optimal classifier selectable based on the relative cost of accuracy vs. reproducibility. We demonstrate our approach using full-brain analysis of elastic-net classifiers trained to discriminate stimulus type in an auditory and visual oddball event-related fMRI design. Our approach and results argue for a computational approach to fMRI decoding models in which the value of the interpretation of the decoding model ultimately depends upon optimizing a
A Bootstrap Approach to Principal Components Analysis
AKTÜKÜN, Dr. Aylin
2011-01-01
In this study, we present the process of applying bootstrap methods to principal components analysis. Using a hypothetical dataset, we show how some of the confidence intervals used in principal components analysis can be obtained with bootstrap methods. We carried out all the bootstrap procedures in the article with a program written in the Mathematica language. Keywords: Principal components analysis, Bootstrap, Bootstrap quantiles, Bootstrap confidence intervals, Mathematica. ABSTRACT In this paper, we apply ...
Higher-order accuracy of multiscale-double bootstrap for testing regions
Shimodaira, Hidetoshi
2013-01-01
We consider hypothesis testing for the null hypothesis being represented as an arbitrary-shaped region in the parameter space. We compute an approximate p-value by counting how many times the null hypothesis holds in bootstrap replicates. This frequency, known as bootstrap probability, is widely used in evolutionary biology, but often reported as biased in the literature. Based on the asymptotic theory of bootstrap confidence intervals, there have been some new attempts for adjusting the bias...
A Parametric Bootstrap Using the First Four Moments of the Residuals
Treyens, Pierre-Eric
2007-01-01
We consider linear regression models and we suppose that the disturbances are either Gaussian or non-Gaussian. Until now, within the framework of the bootstrap, it was thought that the error in rejection probability (ERP) had the same rate of convergence under the parametric bootstrap as under the nonparametric bootstrap. For linear data generating processes (DGPs) we show in this paper that this assertion is false if the skewness and/or kurtosis coefficients of the distribution of the disturbances are nonnull. ...
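A minimal parametric bootstrap for a linear regression looks like the following (a Gaussian sketch only: the paper's estimator also matches the skewness and kurtosis of the residuals, which is not reproduced here, and the DGP is illustrative).

```python
import numpy as np

rng = np.random.default_rng(1)

def ols(X, y):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Hypothetical DGP: y = 1 + 0.5 x + Gaussian noise (values illustrative).
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.normal(scale=1.0, size=n)

beta_hat = ols(X, y)
resid = y - X @ beta_hat
sigma_hat = resid.std(ddof=X.shape[1])

# Parametric (Gaussian) bootstrap: redraw disturbances from N(0, sigma_hat^2),
# regenerate y from the fitted model, and re-estimate the slope each time.
slopes = np.array([
    ols(X, X @ beta_hat + rng.normal(scale=sigma_hat, size=n))[1]
    for _ in range(1000)
])

# Percentile interval for the slope under the Gaussian parametric bootstrap.
lo, hi = np.quantile(slopes, [0.025, 0.975])
```

Replacing the Gaussian draw with one calibrated to the residuals' first four moments is exactly the refinement the paper studies.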
Bootstrap Co-integration Rank Testing: The Effect of Bias-Correcting Parameter Estimates
Cavaliere, Giuseppe; Taylor, A. M. Robert; Trenkler, Carsten
2013-01-01
In this paper we investigate bootstrap-based methods for bias-correcting the first-stage parameter estimates used in some recently developed bootstrap implementations of the co-integration rank tests of Johansen (1996). In order to do so we adapt the framework of Kilian (1998) which estimates the bias in the original parameter estimates using the average bias in the corresponding parameter estimates taken across a large number of auxiliary bootstrap replications. A number of possible imp...
Dale Poirier
2008-01-01
This paper provides Bayesian rationalizations for White’s heteroskedastic consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.
A Bootstrap Approach to an Affordable Exploration Program
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that requires investing in technologies that can be used to build a space infrastructure from very modest initial capabilities. Human exploration has had a history of flight programs with high development and operational costs. Since Apollo, human exploration has had very constrained budgets, and they are expected to be constrained in the future. Due to their high operations costs it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics cost. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of resources available. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows and becomes self-sustaining and eventually begins producing the energy, products and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials. These technologies will exploit the space environment, minimize dependencies, and
Remuestreo Bootstrap y Jackknife en confiabilidad: Caso Exponencial y Weibull
Directory of Open Access Journals (Sweden)
Javier Ramírez-Montoya
2016-01-01
Full Text Available The Bootstrap-t and the Jackknife delete-I and delete-II resampling methods are compared using the non-parametric Kaplan-Meier and Nelson-Aalen estimators, which are frequently used in practice, taking into account different censoring percentages, sample sizes and times of interest. The comparison is carried out via simulation, using the mean squared error.
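A minimal delete-1 jackknife, one of the resampling schemes compared above, can be written as follows (the data and statistic are illustrative, not the paper's censored Kaplan-Meier setting; for the sample mean the jackknife standard error reduces exactly to the textbook s/√n, which makes a convenient sanity check).

```python
import numpy as np

rng = np.random.default_rng(2)

def jackknife_se(data, stat):
    """Delete-1 jackknife standard error of an arbitrary statistic."""
    n = len(data)
    # Leave-one-out replicates of the statistic.
    reps = np.array([stat(np.delete(data, i)) for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((reps - reps.mean()) ** 2))

# Illustrative exponential lifetimes, echoing the paper's reliability setting.
sample = rng.exponential(scale=10.0, size=80)

# For the sample mean, the jackknife SE equals s / sqrt(n) exactly.
se_jack = jackknife_se(sample, np.mean)
se_formula = sample.std(ddof=1) / np.sqrt(len(sample))
```

Delete-II replaces the single deleted observation with pairs, at a combinatorial cost; the paper's comparison is about which scheme behaves best under censoring.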
Bootstrapped Oblivious Transfer and Secure Two-Party Function Computation
Wang, Ye
2009-01-01
We propose an information theoretic framework for the secure two-party function computation (SFC) problem and introduce the notion of SFC capacity. We study and extend string oblivious transfer (OT) to sample-wise OT. We propose an efficient, perfectly private OT protocol utilizing the binary erasure channel or source. We also propose the bootstrap string OT protocol which provides disjoint (weakened) privacy while achieving a multiplicative increase in rate, thus trading off security for rate. Finally, leveraging our OT protocol, we construct a protocol for SFC and establish a general lower bound on SFC capacity of the binary erasure channel and source.
Bootstrap framework: a web design tool
Peltomäki, Veera
2014-01-01
Mobile devices with small screens are growing in popularity for browsing the Internet. At the same time, modern game consoles and smart TVs are becoming common among consumers, so the Internet can also be browsed on devices with high display resolutions. Responsive web design answers today's requirements, in which users expect a consistent experience from sites regardless of the device. This thesis examines the choice of the responsive Bootstrap framework for web desi...
Comparing groups randomization and bootstrap methods using R
Zieffler, Andrew S; Long, Jeffrey D
2011-01-01
A hands-on guide to using R to carry out key statistical practices in educational and behavioral sciences research. Computing has become an essential part of the day-to-day practice of statistical work, broadening the types of questions that can now be addressed by research scientists applying newly derived data analytic techniques. Comparing Groups: Randomization and Bootstrap Methods Using R emphasizes the direct link between scientific research questions and data analysis. Rather than relying on mathematical calculations, this book focuses on conceptual explanations and the use of statistica
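The kind of group comparison the book teaches can be sketched as a percentile bootstrap for a difference in means (in Python rather than the book's R; group labels and effect size are illustrative).

```python
import numpy as np

rng = np.random.default_rng(3)

def bootstrap_mean_diff(a, b, n_boot=4000, alpha=0.05):
    """Percentile bootstrap CI for the difference in group means (a - b)."""
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        # Resample each group independently, with replacement.
        diffs[i] = (rng.choice(a, size=len(a)).mean()
                    - rng.choice(b, size=len(b)).mean())
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

# Hypothetical scores for two groups (names and values are illustrative).
treatment = rng.normal(loc=52.0, scale=10.0, size=60)
control = rng.normal(loc=47.0, scale=10.0, size=60)
lo, hi = bootstrap_mean_diff(treatment, control)
```

A randomization test, the book's other main tool, uses the same loop but permutes group labels instead of resampling within groups.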
The S-matrix Bootstrap II: Two Dimensional Amplitudes
Paulos, Miguel F; Toledo, Jonathan; van Rees, Balt C; Vieira, Pedro
2016-01-01
We consider constraints on the S-matrix of any gapped, Lorentz invariant quantum field theory in 1 + 1 dimensions due to crossing symmetry and unitarity. In this way we establish rigorous bounds on the cubic couplings of a given theory with a fixed mass spectrum. In special cases we identify interesting integrable theories saturating these bounds. Our analytic bounds match precisely with numerical bounds obtained in a companion paper where we consider massive QFT in an AdS box and study boundary correlators using the technology of the conformal bootstrap.
Convergence rates of empirical block length selectors for block bootstrap
Nordman, Daniel J.; Lahiri, Soumendra N.
2014-01-01
We investigate the accuracy of two general non-parametric methods for estimating optimal block lengths for block bootstraps with time series – the first proposed in the seminal paper of Hall, Horowitz and Jing (Biometrika 82 (1995) 561–574) and the second from Lahiri et al. (Stat. Methodol. 4 (2007) 292–321). The relative performances of these general methods have been unknown and, to provide a comparison, we focus on rates of convergence for these block length selectors for the moving block ...
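A moving block bootstrap in miniature (block length and AR(1) parameters are illustrative; the papers above concern data-driven choice of the block length, which is simply fixed by hand here): overlapping blocks of the series are resampled and concatenated, preserving short-range dependence within each block.

```python
import numpy as np

rng = np.random.default_rng(4)

def moving_block_bootstrap(x, block_len, rng):
    """One moving-block bootstrap replicate of the series x."""
    n = len(x)
    # Starting indices of all overlapping blocks of length block_len.
    n_blocks = n - block_len + 1
    starts = rng.integers(0, n_blocks, size=int(np.ceil(n / block_len)))
    pieces = [x[s:s + block_len] for s in starts]
    return np.concatenate(pieces)[:n]

# An AR(1) series as a stand-in for dependent data (phi = 0.6, illustrative).
n = 500
x = np.empty(n)
x[0] = 0.0
eps = rng.normal(size=n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + eps[t]

# Bootstrap standard error of the sample mean with block length 25.
means = np.array([moving_block_bootstrap(x, 25, rng).mean() for _ in range(500)])
se_block = means.std(ddof=1)
```

For positively autocorrelated data this SE exceeds the naive iid-bootstrap value s/√n, which is exactly why block length selection matters.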
A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
Liang, Faming
2013-03-01
The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
Resampling technique in the orthogonal direction for down-looking Synthetic Aperture Imaging Ladar
Li, Guangyuan; Sun, Jianfeng; Lu, Zhiyong; Zhang, Ning; Cai, Guangyu; Sun, Zhiwei; Liu, Liren
2015-09-01
The implementation of down-looking Synthetic Aperture Imaging Ladar (SAIL) uses quadratic phase history reconstruction in the travel direction and linear phase modulation reconstruction in the orthogonal direction. The linear phase modulation in the orthogonal direction is generated by the shift of two cylindrical lenses in the two polarization-orthogonal beams. Fast movement of the two cylindrical lenses is therefore necessary for airborne down-looking SAIL to match the aircraft flight speed and to realize the compression in the orthogonal direction, but the rapid starts and stops of the cylindrical lenses severely strain the motor and make the motion trajectory non-uniform. To reduce this strain and obtain a smoother trajectory, we drive the motor along a sinusoidal curve, a more realistic movement, and through a resampling interpolation imaging algorithm we transform the nonlinear phase to linear phase, obtaining good reconstruction results for point and area targets in the laboratory. The influences on imaging quality at different sampling positions when the motor makes a sinusoidal motion, and the necessity of the algorithm, are analyzed. Finally, we compare the resolution obtained in the two cases.
Automotive FMCW Radar-enhanced Range Estimation via a Local Resampling Fourier Transform
Directory of Open Access Journals (Sweden)
Cailing Wang
2016-02-01
Full Text Available In complex traffic scenarios, more accurate measurement and discrimination for an automotive frequency-modulated continuous-wave (FMCW) radar is required for intelligent robots, driverless cars and driver-assistance systems. A more accurate range estimation method based on a local resampling Fourier transform (LRFT) for a FMCW radar is developed in this paper. Radar signal correlation in the phase space yields a higher signal-to-noise ratio (SNR) for more accurate ranging, and the LRFT, which acts on a local neighbourhood as a refinement step, can achieve a more accurate target range. The rough range is estimated through conditional pulse compression (PC) and then, around the initial rough estimation, a refined estimation through the LRFT in the local region achieves greater precision. Furthermore, the LRFT algorithm is tested in numerous simulations and physical system experiments, which show that the LRFT algorithm achieves a more precise range estimation than traditional FFT-based algorithms, especially for lower-bandwidth signals.
Reduced bias and threshold choice in the extremal index estimation through resampling techniques
Gomes, Dora Prata; Neves, Manuela
2013-10-01
In Extreme Value Analysis there are a few parameters of particular interest, among which we refer to the extremal index, a measure of extreme events clustering. It is of great interest for initially dependent samples, the common situation in many practical applications. Most semi-parametric estimators of this parameter show the same behavior: nice asymptotic properties but a high variance for small values of k, the number of upper order statistics used in the estimation, and a high bias for large values of k. The Mean Square Error, a measure that encompasses bias and variance, usually shows a very sharp plot, requiring an adequate choice of k. Starting from classical extremal index estimators considered in the literature, the emphasis is now given to deriving reduced-bias estimators with more stable paths, obtained through resampling techniques. An adaptive algorithm for estimating the level k, in order to obtain a reliable estimate of the extremal index, is used. This algorithm has shown good results, but some improvements are still required. A simulation study illustrates the properties of the estimators and the performance of the proposed adaptive algorithm.
An EMD based method for detrending RR interval series without resampling
Institute of Scientific and Technical Information of China (English)
曾超; 蒋奇云; 陈朝阳; 徐敏
2015-01-01
Slow trends in the RR interval (RRI) series should be removed in the preprocessing step to get a reliable result of heart rate variability (HRV) analysis. Re-sampling is required to convert the unevenly sampled RRI series into an evenly sampled time series when using the widely accepted smoothness priors approach (SPA). Noise is introduced in this process and the information quality is thus compromised. Empirical mode decomposition (EMD) and its variants were introduced to directly process the unevenly sampled RRI series. Besides, an RR interval model was proposed to facilitate the introduction of standard metrics for the evaluation of the detrending performance. Based on standard metrics including signal-to-noise ratio in dB (ISNR), mean square error (EMS), and percent root square difference (DPRS), the effectiveness of detrending methods in RR interval analysis was determined. Results demonstrate that the complementary ensemble EMD (CEEMD, a variant of EMD) based method has a higher ISNR, a lower EMS and a lower DPRS, as well as a better RRI series detrending performance, compared with the SPA method, which would in turn lead to a more accurate HRV analysis.
Pseudocontact Shift-Driven Iterative Resampling for 3D Structure Determinations of Large Proteins.
Pilla, Kala Bharath; Otting, Gottfried; Huber, Thomas
2016-01-29
Pseudocontact shifts (PCSs) induced by paramagnetic lanthanides produce pronounced effects in nuclear magnetic resonance spectra, which are easily measured and which deliver valuable long-range structure restraints. Even sparse PCS data greatly enhance the success rate of 3D (3-dimensional) structure predictions of proteins by the modeling program Rosetta. The present work extends this approach to 3D structures of larger proteins, comprising more than 200 residues, which are difficult to model by Rosetta without additional experimental restraints. The new algorithm improves the fragment assembly method of Rosetta by utilizing PCSs generated from paramagnetic lanthanide ions attached at four different sites as the only experimental restraints. The sparse PCS data are utilized at multiple stages, to identify native-like local structures, to rank the best structural models and to rebuild the fragment libraries. The fragment libraries are refined iteratively until convergence. The PCS-driven iterative resampling algorithm is strictly data dependent and shown to generate accurate models for a benchmark set of eight different proteins, ranging from 100 to 220 residues, using solely PCSs of backbone amide protons.
Two new data-dependent choices of m when applying the m-out-of-n bootstrap to hypothesis testing
Allison, James Samuel; Santana, Leonard; Swanepoel, Jan Willem Hendrik
2011-01-01
The traditional non-parametric bootstrap (referred to as the n-out-of-n bootstrap) is a widely applicable and powerful tool for statistical inference, but in important situations it can fail. It is well known that by using a bootstrap sample of size m, different from n, the resulting m-out-of-n bootstrap provides a method for rectifying the traditional bootstrap inconsistency. Moreover, recent studies have shown that interesting cases exist where it is better to use the m-out-of-n bootstrap i...
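The classic failure case motivating the m-out-of-n bootstrap can be reproduced directly (a sketch only: the paper's data-dependent choices of m are not implemented, and m = √n is used purely for illustration). For the sample maximum of Uniform(0, θ), resampling n points reproduces the observed maximum with probability 1 − (1 − 1/n)ⁿ → 1 − 1/e, so the n-out-of-n bootstrap distribution is badly degenerate; drawing m ≪ n points repairs this.

```python
import numpy as np

rng = np.random.default_rng(5)

n, theta = 1000, 1.0
sample = rng.uniform(0, theta, size=n)
mx = sample.max()

def boot_max(sample, m, n_boot=2000):
    """Bootstrap replicates of the sample maximum using resamples of size m."""
    return np.array([rng.choice(sample, size=m).max() for _ in range(n_boot)])

reps_n = boot_max(sample, m=n)            # n-out-of-n bootstrap
reps_m = boot_max(sample, m=int(n**0.5))  # m-out-of-n with m = sqrt(n)

# The n-out-of-n replicates pile up on the observed maximum,
# while the m-out-of-n replicates spread out below it.
frac_at_max_n = (reps_n == mx).mean()
frac_at_max_m = (reps_m == mx).mean()
```

The first fraction sits near 1 − 1/e ≈ 0.63, while the second is close to zero, which is the inconsistency (and its repair) that the data-dependent choices of m are designed to exploit.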
Learning robust cell signalling models from high throughput proteomic data
Koch, Mitchell; Broom, Bradley M.; Subramanian, Devika
2009-01-01
We propose a framework for learning robust Bayesian network models of cell signalling from high-throughput proteomic data. We show that model averaging using Bayesian bootstrap resampling generates more robust structures than procedures that learn structures using all of the data. We also develop an algorithm for ranking the importance of network features using bootstrap resample data. We apply our algorithms to derive the T-cell signalling network from the flow cytometry data of Sachs et al....
Current drive and sustain experiments with the bootstrap current in JT-60
International Nuclear Information System (INIS)
The current drive and sustain experiments with the neoclassical bootstrap current are performed in the JT-60 tokamak. It is shown that up to 80% of the total plasma current is driven by the bootstrap current in the extremely high βp regime (βp = 3.2) and the current drive product Ip(bootstrap)·n̄·Rp up to 4.4 x 10^19 MA m^-2 has been attained with the bootstrap current. The experimental resistive loop voltages are compared with calculations using the neoclassical resistivity with and without the bootstrap current and the Spitzer resistivity for a wide range of plasma current (Ip = 0.5 - 2 MA) and poloidal beta (βp = 0.1 - 3.2). The calculated resistive loop voltage is consistent with the neoclassical prediction including the bootstrap current. Current sustain with the bootstrap current is tested by terminating the Ip feedback control during the high power neutral beam heating. An enhancement of the L/R decay time beyond that expected from the plasma resistivity with measured Te and Zeff has been confirmed experimentally, supporting the large non-inductive current in the plasma, and is consistent with the neoclassical prediction. A new technique to calculate the bootstrap current in the multi-collisionality regime for finite aspect ratio tokamaks has been developed. The neoclassical bootstrap current is calculated directly through the force balance equations between viscous and friction forces according to the Hirshman-Sigmar theory. The bootstrap current driven by the fast ion component is also included. Ballooning stability of the high βp plasma is analyzed using current profiles including the bootstrap current. The plasma pressure is close to the ballooning limit in high βp discharges. (author)
Interaction of bootstrap-current-driven magnetic islands
International Nuclear Information System (INIS)
The formation and interaction of fluctuating neoclassical pressure gradient driven magnetic islands is examined. The interaction of magnetic islands produces a stochastic region around the separatrices of the islands. This interaction causes the island pressure profile to be broadened, reducing the island bootstrap current and drive for the magnetic island. A model is presented that describes the magnetic topology as a bath of interacting magnetic islands with low to medium poloidal mode number (m ≈ 3-30). The islands grow by the bootstrap current effect and damp due to the flattening of the pressure profile near the island separatrix caused by the interaction of the magnetic islands. The effect of this sporadic growth and decay of the islands ("magnetic bubbling") is not normally addressed in theories of plasma transport due to magnetic fluctuations. The nature of the transport differs from statistical approaches to magnetic turbulence since the radial step size of the plasma transport is now given by the characteristic island width. This model suggests that tokamak experiments have relatively short-lived, coherent, long wavelength magnetic oscillations present in the steep pressure-gradient regions of the plasma. 42 refs
Bootstrap embedding: An internally consistent fragment-based method
Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy
2016-08-01
Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments "embedded" in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed "Bootstrap Embedding," a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.
A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series
Directory of Open Access Journals (Sweden)
Fernando Luiz Cyrino Oliveira
2014-01-01
Full Text Available The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with huge participation of hydro plants. Such strong dependency on hydrological regimes implies uncertainties related to the energetic planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. This paper shows the problems in fitting these models with the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. This is followed by a proposal of a new approach to set both the model order and the parameter estimation of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, lead to a better use of water resources in the energy operation planning.
MEAN SQUARED ERRORS OF BOOTSTRAP VARIANCE ESTIMATORS FOR U-STATISTICS
Mizuno, Masayuki; Maesono, Yoshihiko
2011-01-01
In this paper, we obtain an asymptotic representation of the bootstrap variance estimator for a class of U-statistics. Using the representation of the estimator, we will obtain a mean squared error of the variance estimator until the order n^. Also we compare the bootstrap and the jackknife variance estimators, theoretically.
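A plain bootstrap variance estimator for a degree-2 U-statistic can be sketched as follows (the kernel, here the Gini mean difference, and the data are illustrative; none of the paper's asymptotic representation or jackknife comparison is reproduced).

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(6)

def gini_mean_difference(x):
    """Degree-2 U-statistic with kernel h(a, b) = |a - b|, averaged over pairs."""
    pairs = np.array(list(combinations(x, 2)))
    return np.abs(pairs[:, 0] - pairs[:, 1]).mean()

def bootstrap_variance(x, stat, n_boot=1000):
    """Bootstrap estimate of Var(stat): resample with replacement, recompute."""
    reps = np.array([stat(rng.choice(x, size=len(x))) for _ in range(n_boot)])
    return reps.var(ddof=1)

# Illustrative standard normal sample of size 60.
x = rng.normal(size=60)
v_boot = bootstrap_variance(x, gini_mean_difference)
```

The jackknife alternative compared in the paper replaces the resampling loop with leave-one-out replicates of the same kernel average.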
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Cui, Zhongmin; Kolen, Michael J.
2008-01-01
This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
Bootstrapping to Test for Nonzero Population Correlation Coefficients Using Univariate Sampling
Beasley, William Howard; DeShea, Lise; Toothaker, Larry E.; Mendoza, Jorge L.; Bard, David E.; Rodgers, Joseph Lee
2007-01-01
This article proposes 2 new approaches to test a nonzero population correlation (ρ): the hypothesis-imposed univariate sampling bootstrap (HI) and the observed-imposed univariate sampling bootstrap (OI). The authors simulated correlated populations with various combinations of normal and skewed variates. With α…
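A univariate-sampling null bootstrap for a correlation can be sketched as follows (this captures the general idea only; the article's HI and OI constructions impose the hypothesis in more refined ways, and the data here are illustrative). Resampling x and y independently enforces ρ = 0 in the bootstrap world, giving a null distribution for r.

```python
import numpy as np

rng = np.random.default_rng(7)

def univariate_null_bootstrap(x, y, n_boot=4000):
    """Null distribution of r by resampling x and y independently (rho = 0)."""
    n = len(x)
    reps = np.empty(n_boot)
    for i in range(n_boot):
        xs = rng.choice(x, size=n)
        ys = rng.choice(y, size=n)
        reps[i] = np.corrcoef(xs, ys)[0, 1]
    return reps

# Correlated toy data (population rho about 0.6; values illustrative).
n = 80
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)
r_obs = np.corrcoef(x, y)[0, 1]

null_reps = univariate_null_bootstrap(x, y)
# Two-sided bootstrap p-value for H0: rho = 0.
p_value = (np.abs(null_reps) >= abs(r_obs)).mean()
```

Bivariate (row-wise) resampling, by contrast, preserves the observed correlation and so cannot generate a null distribution; that distinction is the starting point of the HI/OI proposals.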
Bootstrapping the Small Sample Critical Values of the Rescaled Range Statistic
Marwan Izzeldin; Anthony Murphy
2000-01-01
Finite sample critical values of the rescaled range or R/S statistic may be obtained by bootstrapping. The empirical size and power performance of these critical values is good. Using the post blackened, moving block bootstrap helps to replicate the time dependencies in the original data. The Monte Carlo results show that the asymptotic critical values in Lo (1991) should not be used.
Institute of Scientific and Technical Information of China (English)
2000-01-01
In this paper, the author studies the asymptotic accuracies of the one-term Edgeworth expansions and the bootstrap approximation for the studentized MLE from a randomly censored exponential population. It is shown that the Edgeworth expansions and the bootstrap approximation are asymptotically close to the exact distribution of the studentized MLE with a rate.
Fast Generation of Ensembles of Cosmological N-Body Simulations via Mode-Resampling
Energy Technology Data Exchange (ETDEWEB)
Schneider, M D; Cole, S; Frenk, C S; Szapudi, I
2011-02-14
We present an algorithm for quickly generating multiple realizations of N-body simulations to be used, for example, for cosmological parameter estimation from surveys of large-scale structure. Our algorithm uses a new method to resample the large-scale (Gaussian-distributed) Fourier modes in a periodic N-body simulation box in a manner that properly accounts for the nonlinear mode-coupling between large and small scales. We find that our method for adding new large-scale mode realizations recovers the nonlinear power spectrum to sub-percent accuracy on scales larger than about half the Nyquist frequency of the simulation box. Using 20 N-body simulations, we obtain a power spectrum covariance matrix estimate that matches the estimator from Takahashi et al. (from 5000 simulations) with < 20% errors in all matrix elements. Comparing the rates of convergence, we determine that our algorithm requires ≈8 times fewer simulations to achieve a given error tolerance in estimates of the power spectrum covariance matrix. The degree of success of our algorithm indicates that we understand the main physical processes that give rise to the correlations in the matter power spectrum. Namely, the large-scale Fourier modes modulate both the degree of structure growth through the variation in the effective local matter density and also the spatial frequency of small-scale perturbations through large-scale displacements. We expect our algorithm to be useful for noise modeling when constraining cosmological parameters from weak lensing (cosmic shear) and galaxy surveys, rescaling summary statistics of N-body simulations for new cosmological parameter values, and any applications where the influence of Fourier modes larger than the simulation size must be accounted for.
Montzka, C.; Moradkhani, H.; Han, X.; Hendricks Franssen, H. J.; Puetz, T.; Vereecken, H.
2014-12-01
An adequate description of soil hydraulic properties is essential for a good performance of hydrological forecasts and soil water fluxes. So far, several studies have shown that data assimilation can reduce the parameter uncertainty by considering soil moisture observations. However, these observations and also the model forcings were recorded with a specific measurement error. It seems a logical step to base state updating and parameter estimation on observations made at multiple time steps, in order to reduce the influence of outliers at single time steps given measurement errors and unknown model forcings. Such outliers could result in erroneous state estimation as well as inadequate parameters. This has been one of the reasons to use a smoothing technique as implemented for Bayesian data assimilation methods such as the Ensemble Kalman Filter (i.e. the Ensemble Kalman Smoother). In this contribution we present a Particle Smoother (SIR-PS) with sequential smoothing of particle weights for state and parameter resampling within a time window, as opposed to the single time step data assimilation used in filtering techniques. This can be seen as an intermediate variant between a parameter estimation technique using global optimization, with estimation of single parameter sets valid for the whole period, and sequential Monte Carlo techniques, with estimation of parameter sets evolving from one time step to another. The aims are i) to improve the soil moisture forecast by estimating hydraulic parameters, ii) to reduce the impact of single erroneous model inputs/observations by a smoothing method, and iii) to evaluate the performance of the SIR-PS as opposed to the SIR-PF using different ensemble and smoothing window sizes. In order to validate the performance of the proposed method for real-world conditions, experimental data obtained from a two-year lysimeter study were used.
Resampling method for applying density-dependent habitat selection theory to wildlife surveys.
Directory of Open Access Journals (Sweden)
Olivia Tardy
Full Text Available Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists of randomly placing blocks over the survey area and dividing those blocks into two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is repeated 100 times. Different functional forms of isodars can be investigated by relating animal abundance to differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection
Resampling method for applying density-dependent habitat selection theory to wildlife surveys.
Tardy, Olivia; Massé, Ariane; Pelletier, Fanie; Fortin, Daniel
2015-01-01
Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists of randomly placing blocks over the survey area and dividing those blocks into two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is repeated 100 times. Different functional forms of isodars can be investigated by relating animal abundance to differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection over large
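The block-placement step described above can be sketched on a synthetic survey grid. The grid values, block dimensions, and the left/right split are illustrative assumptions; only the procedure (random blocks, two adjacent equal-size sub-blocks, 100 repetitions) follows the abstract.

```python
import random

random.seed(1)

# Synthetic 20 x 20 survey grid of animal counts per cell (illustrative).
grid = [[random.randint(0, 5) for _ in range(20)] for _ in range(20)]

def paired_abundances(grid, block_h=2, block_w=4, n_blocks=100):
    """Randomly place n_blocks rectangular blocks on the grid, split each
    into two adjacent sub-blocks of equal size, and sum counts in each."""
    rows, cols = len(grid), len(grid[0])
    half = block_w // 2
    pairs = []
    for _ in range(n_blocks):
        r = random.randrange(rows - block_h + 1)
        c = random.randrange(cols - block_w + 1)
        left = sum(grid[r + i][c + j]
                   for i in range(block_h) for j in range(half))
        right = sum(grid[r + i][c + half + j]
                    for i in range(block_h) for j in range(half))
        pairs.append((left, right))
    return pairs

pairs = paired_abundances(grid)
```

Each (left, right) pair would then be regressed, together with sub-block habitat covariates, to fit a candidate isodar.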
A bootstrap method for estimating uncertainty of water quality trends
Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura
2015-01-01
Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is in the EGRETci R-package.
Bootstrapping Security Policies for Wearable Apps Using Attributed Structural Graphs.
González-Tablas, Ana I; Tapiador, Juan E
2016-05-11
We address the problem of bootstrapping security and privacy policies for newly-deployed apps in wireless body area networks (WBAN) composed of smartphones, sensors and other wearable devices. We introduce a framework to model such a WBAN as an undirected graph whose vertices correspond to devices, apps and app resources, while edges model structural relationships among them. This graph is then augmented with attributes capturing the features of each entity together with user-defined tags. We then adapt available graph-based similarity metrics to find the closest app to a new one to be deployed, with the aim of reusing, and possibly adapting, its security policy. We illustrate our approach through a detailed smartphone ecosystem case study. Our results suggest that the scheme can provide users with a reasonably good policy that is consistent with the user's security preferences implicitly captured by policies already in place.
Bootstrapping Pure Quantum Gravity in AdS3
Bae, Jin-Beom; Lee, Sungjay
2016-01-01
The three-dimensional pure quantum gravity with negative cosmological constant is supposed to be dual to the extremal conformal field theory of central charge $c=24k$ in two dimensions. We employ the conformal bootstrap method to analyze the extremal CFTs, and find numerical evidence for the non-existence of the extremal CFTs for sufficiently large central charge ($k \ge 20$). We also explore near-extremal CFTs, a small modification of extremal ones, and find similar evidence for their non-existence at large central charge. This indicates that, under the assumption of holomorphic factorization, pure gravity in weakly curved AdS$_3$ does not exist as a consistent quantum theory.
Bootstrapping Security Policies for Wearable Apps Using Attributed Structural Graphs
Directory of Open Access Journals (Sweden)
Ana I. González-Tablas
2016-05-01
Full Text Available We address the problem of bootstrapping security and privacy policies for newly-deployed apps in wireless body area networks (WBAN) composed of smartphones, sensors and other wearable devices. We introduce a framework to model such a WBAN as an undirected graph whose vertices correspond to devices, apps and app resources, while edges model structural relationships among them. This graph is then augmented with attributes capturing the features of each entity together with user-defined tags. We then adapt available graph-based similarity metrics to find the closest app to a new one to be deployed, with the aim of reusing, and possibly adapting, its security policy. We illustrate our approach through a detailed smartphone ecosystem case study. Our results suggest that the scheme can provide users with a reasonably good policy that is consistent with the user's security preferences implicitly captured by policies already in place.
Bootstrapping Mixed Correlators in the 3D Ising Model
Kos, Filip; Simmons-Duffin, David
2014-01-01
We study the conformal bootstrap for systems of correlators involving non-identical operators. The constraints of crossing symmetry and unitarity for such mixed correlators can be phrased in the language of semidefinite programming. We apply this formalism to the simplest system of mixed correlators in 3D CFTs with a $\\mathbb{Z}_2$ global symmetry. For the leading $\\mathbb{Z}_2$-odd operator $\\sigma$ and $\\mathbb{Z}_2$-even operator $\\epsilon$, we obtain numerical constraints on the allowed dimensions $(\\Delta_\\sigma, \\Delta_\\epsilon)$ assuming that $\\sigma$ and $\\epsilon$ are the only relevant scalars in the theory. These constraints yield a small closed region in $(\\Delta_\\sigma, \\Delta_\\epsilon)$ space compatible with the known values in the 3D Ising CFT.
Higgs critical exponents and conformal bootstrap in four dimensions
Antipin, Oleg; Mølgaard, Esben; Sannino, Francesco
2015-06-01
We investigate relevant properties of composite operators emerging in non-supersymmetric, four-dimensional gauge-Yukawa theories with interacting conformal fixed points within a precise framework. The theories investigated in this work are structurally similar to the standard model of particle interactions, but differ by developing perturbative interacting fixed points. We investigate the physical properties of the singlet and the adjoint composite operators quadratic in the Higgs field, and discover, via a direct computation, that the singlet anomalous dimension is substantially larger than the adjoint one. Where possible, the numerical bootstrap results are compared with our precise four-dimensional conformal field theory findings. To accomplish this, it was necessary to calculate explicitly the crossing symmetry relations for the global symmetry group SU(N) × SU(N).
Higgs Critical Exponents and Conformal Bootstrap in Four Dimensions
Antipin, Oleg; Sannino, Francesco
2014-01-01
Within a precise framework, we investigate relevant properties of composite operators emerging in nonsupersymmetric, four-dimensional gauge-Yukawa theories with interacting conformal fixed points. The theories investigated in this work are structurally similar to the standard model of particle interactions, but differ from the standard model by developing perturbative interacting fixed points. We investigate the physical properties of the singlet and the adjoint composite operators quadratic in the Higgs field. We show that, in the Veneziano limit, and at the highest known order in perturbation theory, the singlet sector decouples from the other operators. This fact allows us to test the numerical bootstrap constraints against precise four dimensional conformal field theoretical results.
Bootstrap and the physical values of $\\pi N$ resonance parameters
Semenov-Tian-Shansky, Kirill M; Vereshagin, Vladimir V
2007-01-01
This is the 6th paper in the series developing the formalism to manage the effective scattering theory of strong interactions. Relying on the theoretical scheme suggested in our previous publications, we concentrate here on the practical aspect and apply our technique to the elastic pion-nucleon scattering amplitude. We test numerically the pion-nucleon spectrum sum rules that follow from the tree-level bootstrap constraints. We show how these constraints can be used to estimate the tensor and vector $NN\rho$ coupling constants. Finally, we demonstrate that the tree-level low energy expansion coefficients computed in the framework of our approach show good agreement with known experimental data. These results allow us to claim that the extended perturbation scheme is quite reasonable from the computational point of view.
Spanning Trees and bootstrap reliability estimation in correlation based networks
Tumminello, M; Lillo, F; Micciché, S; Mantegna, R N
2006-01-01
We introduce a new technique to associate a spanning tree to the average linkage cluster analysis. We term this tree the Average Linkage Minimum Spanning Tree. We also introduce a technique to associate a value of reliability to links of correlation based graphs by using bootstrap replicas of data. Both techniques are applied to the portfolio of the 300 most capitalized stocks traded at the New York Stock Exchange during the time period 2001-2003. We show that the Average Linkage Minimum Spanning Tree recognizes economic sectors and sub-sectors as communities in the network slightly better than the Minimum Spanning Tree does. We also show that the average reliability of links in the Minimum Spanning Tree is slightly greater than the average reliability of links in the Average Linkage Minimum Spanning Tree.
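The idea of link reliability from bootstrap replicas can be sketched without the full MST machinery: resample the time index with replacement, recompute the correlation, and count how often a candidate link survives a threshold. The synthetic series, the threshold of 0.5, and the replica count are assumptions of this sketch, not the paper's settings.

```python
import random

random.seed(7)

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    dx = sum((a - mx) ** 2 for a in x) ** 0.5
    dy = sum((b - my) ** 2 for b in y) ** 0.5
    return num / (dx * dy)

# Synthetic returns: series 0 and 1 share a common factor; series 2 is noise.
n = 200
factor = [random.gauss(0, 1) for _ in range(n)]
series = [
    [f + random.gauss(0, 0.5) for f in factor],
    [f + random.gauss(0, 0.5) for f in factor],
    [random.gauss(0, 1) for _ in range(n)],
]

def link_reliability(series, i, j, n_boot=200, threshold=0.5):
    """Fraction of bootstrap replicas (time indices resampled with
    replacement) in which corr(series[i], series[j]) exceeds threshold."""
    m = len(series[0])
    hits = 0
    for _ in range(n_boot):
        idx = [random.randrange(m) for _ in range(m)]
        xi = [series[i][t] for t in idx]
        xj = [series[j][t] for t in idx]
        if pearson(xi, xj) > threshold:
            hits += 1
    return hits / n_boot

r01 = link_reliability(series, 0, 1)   # genuinely correlated pair
r02 = link_reliability(series, 0, 2)   # factor vs. pure noise
```

In the paper the counted event is a link's presence in the MST of each replica's correlation matrix; the thresholded correlation above stands in for that membership test.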
Performance of Bootstrap MCEWMA: Study case of Sukuk Musyarakah data
Safiih, L. Muhamad; Hila, Z. Nurul
2014-07-01
Sukuk Musyarakah is one of several instruments of Islamic bond investment in Malaysia; this form of sukuk is based on restructuring a conventional bond into a Syariah-compliant bond. Syariah compliance rests on the prohibition of any influence of usury, i.e., a guaranteed or fixed return. Daily sukuk returns are therefore non-fixed and, statistically, form a time series that is dependent and autocorrelated. Such data pose a crucial problem in both statistics and finance. Sukuk returns can be viewed statistically through their volatility: high volatility describes dramatic price changes and marks the bond as risky. However, this problem has received far less attention from researchers than conventional bonds have. The MCEWMA chart in Statistical Process Control (SPC) is mainly used to monitor autocorrelated data, and its application to daily returns of securities investment data has gained widespread attention among statisticians. However, this chart is often affected by inaccurate estimation, whether of the base model or of its limits, producing large errors and a high probability of signalling an out-of-control process, i.e., false alarms. To overcome this problem, a bootstrap approach is used in this study: it is hybridized with the MCEWMA base model to construct a new chart, the Bootstrap MCEWMA (BMCEWMA) chart. The hybrid model is applied to the daily returns of sukuk Musyarakah for Rantau Abang Capital Bhd. The BMCEWMA base model proved more effective than the MCEWMA, with smaller estimation error, shorter confidence intervals and fewer false alarms. In other words, the hybrid chart reduces variability, as shown by the smaller error and false alarm rate. We conclude that the BMCEWMA outperforms the MCEWMA.
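The core move of a bootstrapped EWMA-type chart is to derive control limits from resampled data rather than from a parametric formula. A simplified sketch follows: it uses a plain EWMA and iid resampling on synthetic returns, whereas the MCEWMA handles autocorrelation through its base model; the smoothing constant, series, and quantile levels are assumptions of this sketch.

```python
import random

random.seed(3)

def ewma(data, lam=0.2, start=0.0):
    """Exponentially weighted moving average; returns the full EWMA path."""
    z, path = start, []
    for x in data:
        z = lam * x + (1 - lam) * z
        path.append(z)
    return path

# Synthetic in-control daily returns (zero mean) -- illustrative only.
returns = [random.gauss(0.0, 0.01) for _ in range(300)]

def bootstrap_ewma_limits(data, lam=0.2, n_boot=500, alpha=0.05):
    """Percentile-bootstrap control limits for the terminal EWMA statistic:
    resample the series with replacement, recompute the EWMA, and take
    empirical quantiles of its final value."""
    finals = []
    for _ in range(n_boot):
        sample = [random.choice(data) for _ in range(len(data))]
        finals.append(ewma(sample, lam)[-1])
    finals.sort()
    lo = finals[int(alpha / 2 * n_boot)]
    hi = finals[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

lo, hi = bootstrap_ewma_limits(returns)
```

New EWMA values falling outside (lo, hi) would be flagged as out-of-control signals.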
Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; Manna, Piero; Terribile, Fabio
2013-04-01
Digital soil mapping procedures are widely used to build two-dimensional continuous maps of several pedological attributes. Our work addressed a regression kriging (RK) technique and a bootstrapped artificial neural network approach in order to evaluate and compare (i) the accuracy of prediction, (ii) the suitability for inclusion in automatic engines (e.g. as web processing services), and (iii) the time needed for calibrating models and for making predictions. Regression kriging is perhaps the most widely used geostatistical technique in the digital soil mapping literature. Here we applied EBLUP regression kriging, as it is deemed the most statistically sound RK flavor by pedometricians. An unusual multi-parametric and nonlinear machine learning approach was also employed, called BAGAP (Bootstrap aggregating Artificial neural networks with Genetic Algorithms and Principal component regression). BAGAP combines a selected set of weighted neural nets having specified characteristics to yield an ensemble response. The purpose of applying these two particular models is to ascertain whether, and by how much, a more cumbersome machine learning method can deliver more accurate and precise predictions. Aware of the difficulty of handling objects based on EBLUP-RK as well as BAGAP when they are embedded in environmental applications, we explore their suitability for being wrapped within Web Processing Services. Two further aspects are considered for an exhaustive evaluation and comparison: automation and computation time with and without high-performance computing.
Francq, Bernard G; Cartiaux, Olivier
2016-09-10
Resecting bone tumors requires good cutting accuracy to reduce the occurrence of local recurrence. This issue is considerably reduced with a navigated technology. The estimation of extreme proportions is challenging, especially with small or moderate sample sizes. When no success is observed, the commonly used binomial proportion confidence interval is not suitable, while the rule of three provides a simple solution. Unfortunately, these approaches are unable to differentiate between different unobserved events. Different delta methods and bootstrap procedures are compared in univariate and linear mixed models with simulations and real data, assuming normality. The delta method on the z-score and the parametric bootstrap provide similar results, but the delta method requires the estimation of the covariance matrix of the estimates. In mixed models, the observed Fisher information matrix with unbounded variance components should be preferred. The parametric bootstrap, easier to apply, outperforms the delta method for larger sample sizes but may be time-costly. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26990871
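The rule of three mentioned above has a one-line form: with zero events observed in n independent trials, an approximate 95% upper confidence bound on the event probability is 3/n. A minimal sketch comparing it with the exact zero-event bound (both standard results; the n = 50 example is illustrative):

```python
def rule_of_three(n):
    """Approximate 95% upper confidence bound for an event probability
    when 0 events are observed in n independent trials."""
    return 3.0 / n

def exact_zero_event_bound(n, conf=0.95):
    """Exact one-sided upper bound: solve (1 - p)**n = 1 - conf for p."""
    return 1.0 - (1.0 - conf) ** (1.0 / n)

n = 50
approx = rule_of_three(n)           # 0.06
exact = exact_zero_event_bound(n)   # close to the approximation
```

The approximation comes from (1 - p)^n ≈ exp(-np) and exp(-3) ≈ 0.05, so it tightens as n grows.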
Correlation Attenuation Due to Measurement Error: A New Approach Using the Bootstrap Procedure
Padilla, Miguel A.; Veprinsky, Anna
2012-01-01
Issues with correlation attenuation due to measurement error are well documented. More than a century ago, Spearman proposed a correction for attenuation. However, this correction has seen very little use since it can potentially inflate the true correlation beyond one. In addition, very little confidence interval (CI) research has been done for…
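Spearman's correction referred to above divides the observed correlation by the geometric mean of the two reliabilities, and its known flaw is easy to exhibit: with modest reliabilities the "corrected" value can exceed 1. The numeric values below are illustrative, not from the paper.

```python
def disattenuate(r_xy, rel_x, rel_y):
    """Spearman's correction for attenuation: estimate the true-score
    correlation from an observed correlation and the two reliabilities."""
    return r_xy / (rel_x * rel_y) ** 0.5

# With perfectly reliable measures the correction changes nothing...
unchanged = disattenuate(0.50, 1.0, 1.0)

# ...but with modest reliabilities the result can exceed 1.0, which is
# the weakness motivating bootstrap-based interval estimates.
inflated = disattenuate(0.60, 0.70, 0.49)
```

A bootstrap CI, as the abstract proposes, would resample the raw scores and recompute this quotient per replica, letting the interval reflect that instability directly.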
Consonant Inventories in the Spontaneous Speech of Young Children: A Bootstrapping Procedure
Van Severen, Lieve; Van Den Berg, Renate; Molemans, Inge; Gillis, Steven
2012-01-01
Consonant inventories are commonly drawn to assess the phonological acquisition of toddlers. However, the spontaneous speech data that are analysed often vary substantially in size and composition. Consequently, comparisons between children and across studies are fundamentally hampered. This study aims to examine the effect of sample size on the…
BOOTSTRAP TECHNIQUE FOR ROC ANALYSIS: A STABLE EVALUATION OF FISHER CLASSIFIER PERFORMANCE
Institute of Scientific and Technical Information of China (English)
Xie Jigang; Liu Zhengding
2007-01-01
This paper presents a novel bootstrap-based method for Receiver Operating Characteristic (ROC) analysis of the Fisher classifier. By defining the Fisher classifier's output as a statistic, the bootstrap technique is used to obtain the sampling distributions of the outputs for the positive class and the negative class respectively. As a result, the ROC curve is a plot of all the (False Positive Rate (FPR), True Positive Rate (TPR)) pairs obtained by varying the decision threshold over the whole range of the bootstrap sampling distributions. The advantage of this method is that the bootstrap-based ROC curves are much more stable than those of the holdout or cross-validation, indicating a more stable ROC analysis of the Fisher classifier. Experiments on five publicly available data sets demonstrate the effectiveness of the proposed method.
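The class-wise bootstrap underlying such ROC analysis can be sketched on synthetic classifier scores: resample the positive and negative score samples separately and recompute the ROC summary per replica. The Gaussian score model and AUC-as-summary are assumptions of this sketch (the paper bootstraps the full curve over thresholds).

```python
import random

random.seed(11)

# Synthetic classifier outputs; positives score higher on average.
pos = [random.gauss(1.0, 1.0) for _ in range(100)]
neg = [random.gauss(0.0, 1.0) for _ in range(100)]

def auc(pos, neg):
    """Area under the ROC curve via the Mann-Whitney statistic:
    probability that a random positive outscores a random negative."""
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_aucs(pos, neg, n_boot=200):
    """Sampling distribution of AUC from class-wise bootstrap resampling."""
    out = []
    for _ in range(n_boot):
        bp = [random.choice(pos) for _ in pos]
        bn = [random.choice(neg) for _ in neg]
        out.append(auc(bp, bn))
    return out

aucs = bootstrap_aucs(pos, neg)
mean_auc = sum(aucs) / len(aucs)
```

The spread of `aucs` is exactly the kind of stability information a single holdout split cannot provide.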
Website development and mobile optimization using the Bootstrap platform
Myhrberg, Mikael
2015-01-01
The topic of this thesis is based on first-hand experience of web development, and on an earlier Bootstrap project carried out for a company in Hyvinkää. The aim was to present the Bootstrap code library and how it can be used as a platform in practical web development and mobile optimization. The goal was to make the reader understand what Bootstrap is and what the mobile optimization of websites means. In addition, a practical example project was used to show how Bootstrap...
Treyens, Pierre-Eric
2008-01-01
We consider linear regression models and suppose that the disturbances are either Gaussian or non-Gaussian. Then, by using Edgeworth expansions, we compute the exact errors in the rejection probability (ERPs) for all one-restriction tests (asymptotic and bootstrap) which can occur in these linear models. More precisely, we show that the ERP is the same for the asymptotic test as for the classical parametric bootstrap test on which it is based, as soon as the third cumulant is nonnull. On the other sid...
Sensitivity Analysis to Efficiency Scores : How to Bootstrap in Nonparametric Frontier Models
Simar, Léopold; Wilson, Paul
1995-01-01
Efficiency scores of production units are generally measured relative to an estimated production frontier. Nonparametric estimators (DEA, FDH, ... ) are based on a finite sample of observed production units. The bootstrap is one easy way to analyze the sensitivity of efficiency scores relative to the sampling variations of the estimated frontier. The main point in order to validate the bootstrap is to define a reasonable data generating process in this complex framework and to propose a reaso...
Bootstrap methods for lasso-type estimators under a moving-parameter framework
Cai, W; Lee, SMS
2012-01-01
We study the distributions of Lasso-type regression estimators in a moving-parameter asymptotic framework, and consider various bootstrap methods for estimating them accordingly. We show, in particular, that the distribution functions of Lasso-type estimators, including even those possessing the oracle properties such as the adaptive Lasso and the SCAD, cannot be consistently estimated by the bootstraps uniformly over the space of the regression parameters, especially when some of the regre...
Development of a WordPress theme based on the Twitter Bootstrap framework
Roca Escoda, Víctor
2014-01-01
Development of a custom WordPress theme for a company website, using the Twitter Bootstrap framework, which allows the page to adapt to the user's environment. Bachelor thesis for the Multimedia program.
Sensitivity Analysis of Efficiency Scores: How to Bootstrap in Nonparametric Frontier Models
Simar, Léopold; Paul W. Wilson
1998-01-01
Efficiency scores of production units are generally measured relative to an estimated production frontier. Nonparametric estimators (DEA, FDH, ...) are based on a finite sample of observed production units. The bootstrap is one easy way to analyze the sensitivity of efficiency scores relative to the sampling variations of the estimated frontier. The main point in order to validate the bootstrap is to define a reasonable data-generating process in this complex framework and to propose a re...
Eddy Lizarazu Alanez; Jose A. Villasenor Alva
2010-01-01
We use Monte Carlo simulations to study the performance of the Shin-So unit root test (DFSS) under the invariant-transformation and bootstrapping approaches. If the alternative hypothesis is a process that is stationary around a linear trend, then the parametric bootstrap test is the best in terms of statistical power. However, if we transform the observations to construct an invariant test, then the DFSS test is the best. Consequ...
Bootstrapping Multi-Parton Loop Amplitudes in QCD
Energy Technology Data Exchange (ETDEWEB)
Bern, Zvi; /UCLA; Dixon, Lance J.; /SLAC; Kosower, David A.; /Saclay, SPhT
2005-07-06
We present a new method for computing complete one-loop amplitudes, including their rational parts, in non-supersymmetric gauge theory. This method merges the unitarity method with on-shell recursion relations. It systematizes a unitarity-factorization bootstrap approach previously applied by the authors to the one-loop amplitudes required for next-to-leading order QCD corrections to the processes $e^+e^- \to Z, \gamma^* \to 4$ jets and $pp \to W + 2$ jets. We illustrate the method by reproducing the one-loop color-ordered five-gluon helicity amplitudes in QCD that interfere with the tree amplitude, namely $A_{5;1}(1^-, 2^-, 3^+, 4^+, 5^+)$ and $A_{5;1}(1^-, 2^+, 3^-, 4^+, 5^+)$. Then we describe the construction of the six- and seven-gluon amplitudes with two adjacent negative-helicity gluons, $A_{6;1}(1^-, 2^-, 3^+, 4^+, 5^+, 6^+)$ and $A_{7;1}(1^-, 2^-, 3^+, 4^+, 5^+, 6^+, 7^+)$, which uses the previously-computed logarithmic parts of the amplitudes as input. We present a compact expression for the six-gluon amplitude. No loop integrals are required to obtain the rational parts.
N=4 Superconformal Bootstrap of the K3 CFT
Lin, Ying-Hsuan; Simmons-Duffin, David; Wang, Yifan; Yin, Xi
2015-01-01
We study two-dimensional (4,4) superconformal field theories of central charge c=6, corresponding to nonlinear sigma models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N=4 superconformal blocks with c=6 and bosonic Virasoro conformal blocks with c=28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the $A_1$ N=4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence be...
N=4 Superconformal Bootstrap of the K3 CFT
CERN. Geneva
2015-01-01
We study two-dimensional (4,4) superconformal field theories of central charge c=6, corresponding to nonlinear σ models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N=4 superconformal blocks with c=6 and bosonic Virasoro conformal blocks with c=28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A1 N=4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence...
More on analytic bootstrap for O(N) models
Dey, Parijat; Kaviraj, Apratim; Sen, Kallol
2016-06-01
This note is an extension of a recent work on the analytical bootstrapping of O(N) models. An additional feature of the O(N) model is that the OPE contains trace and antisymmetric operators apart from the symmetric-traceless objects appearing in the OPE of the singlet sector. Thus, in addition to the stress tensor $T_{\mu\nu}$ and the $\phi^i\phi_i$ scalar, we also have other minimal-twist operators, such as the spin-1 current $J_\mu$ and the symmetric-traceless scalar, in the case of O(N). We determine the effect of these additional objects on the anomalous dimensions of the corresponding trace, symmetric-traceless and antisymmetric operators in the large spin sector of the O(N) model, in the limit when the spin is much larger than the twist. As an observation, we also verified that the leading order results for the large spin sector from the ɛ-expansion are an exact match with our n = 0 case. A plausible holographic setup for the special case when N = 2 is also mentioned, which mimics the calculation in the CFT.
Bootstrap equations for $\\mathcal{N}=4$ SYM with defects
Liendo, Pedro
2016-01-01
This paper focuses on the analysis of $4d$ $\mathcal{N}=4$ superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will write the Ward identities associated to two-point functions of $\tfrac{1}{2}$-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to $4d$ $\mathcal{N}=4$ superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: $4d$ $\mathcal{N}=4$ superconformal theories with a line defect, $3d$ $\mathcal{N}=4$ superconformal theories with no defect, and $OSP(4^*|4)$ superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootst...
Language bootstrapping: learning word meanings from perception-action association.
Salvi, Giampiero; Montesano, Luis; Bernardino, Alexandre; Santos-Victor, José
2012-06-01
We address the problem of bootstrapping language acquisition for an artificial system, similarly to what is observed in experiments with human infants. Our method works by associating meanings to words in manipulation tasks, as a robot interacts with objects and listens to verbal descriptions of the interactions. The model is based on an affordance network, i.e., a mapping between robot actions, robot perceptions, and the perceived effects of these actions upon objects. We extend the affordance model to incorporate spoken words, which allows us to ground the verbal symbols to the execution of actions and the perception of the environment. The model takes verbal descriptions of a task as the input and uses temporal co-occurrence to create links between speech utterances and the involved objects, actions, and effects. We show that the robot is able to form useful word-to-meaning associations, even without considering grammatical structure in the learning process and in the presence of recognition errors. These word-to-meaning associations are embedded in the robot's own understanding of its actions. Thus, they can be directly used to instruct the robot to perform tasks, and they also allow context to be incorporated into the speech recognition task. We believe that the encouraging results with our approach may afford robots the capacity to acquire language descriptors in their operating environment, as well as shed some light on how this challenging process develops in human infants.
Reynaud-Bouret, Patricia; Laurent, Béatrice
2012-01-01
Considering two independent Poisson processes, we address the question of testing equality of their respective intensities. We construct multiple testing procedures from the aggregation of single tests whose testing statistics come from model selection, thresholding and/or kernel estimation methods. The corresponding critical values are computed through a non-asymptotic wild bootstrap approach. The obtained tests are proved to be exactly of level $\alpha$, and to satisfy non-asymptotic oracle-type inequalities. From these oracle-type inequalities, we deduce that our tests are adaptive in the minimax sense over a large variety of classes of alternatives based on classical and weak Besov bodies in the univariate case, but also Sobolev and anisotropic Nikol'skii-Besov balls in the multivariate case. A simulation study furthermore shows that they perform strongly in practice.
A bootstrap based space-time surveillance model with an application to crime occurrences
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. In contrast, this study generates expected values for local hotspots from past occurrences rather than population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, which does not require population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to criminological and epidemiological surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
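The core mechanism, expected values from past occurrences plus a resampling-based significance test, can be sketched for a single surveillance cell. This is a deliberately simplified stand-in (one cell, weekly counts, plain resampling of past weeks), not the paper's full space-time model:

```python
import numpy as np

rng = np.random.default_rng(1)

# weekly incident counts for one local cell: 52 past weeks + current week
past = rng.poisson(4, size=52)
current = 13                        # count in the current surveillance window

expected = past.mean()              # expected value from past occurrences,
                                    # no population-at-risk data needed

# resample past weeks to build the null distribution of the weekly count
B = 5000
boot = rng.choice(past, size=B).astype(float)
p_value = (1 + np.sum(boot >= current)) / (B + 1)
alarm = p_value < 0.01              # flag an emerging hotspot
```

In the actual model the permutation is done jointly over space and time, so that a hotspot can emerge at any location rather than within fixed administrative units.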
The non-local bootstrap--estimation of uncertainty in diffusion MRI.
Yap, Pew-Thian; An, Hongyu; Chen, Yasheng; Shen, Dinggang
2013-01-01
Diffusion MRI is a noninvasive imaging modality that allows for the estimation and visualization of white matter connectivity patterns in the human brain. However, due to the low signal-to-noise ratio (SNR) of diffusion data, deriving useful statistics from the data is adversely affected by different sources of measurement noise. This is aggravated by the fact that the sampling distribution of the statistic of interest is often complex and unknown. In such situations, the bootstrap, owing to its distribution-independent nature, is an appealing tool for estimating the variability of almost any statistic, without relying on complicated theoretical calculations, but purely on computer simulation. In this work, we present new bootstrap strategies for estimating the variability of diffusion statistics under noise. In contrast to the residual bootstrap, which relies on a predetermined data model, or the repetition bootstrap, which requires repeated signal measurements, our approach, called the non-local bootstrap (NLB), is non-parametric and obviates the need for time-consuming multiple acquisitions. The key assumption of NLB is that local image structures recur in the image. We exploit this self-similarity via a multivariate non-parametric kernel regression framework for bootstrap estimation of uncertainty. Evaluation of NLB using a set of high-resolution diffusion-weighted images, with lower than usual SNR due to the small voxel size, indicates that NLB is markedly more robust to noise and results in more accurate inferences. PMID:24683985
International Nuclear Information System (INIS)
Data-driven learning methods for predicting the evolution of the degradation processes affecting equipment are becoming increasingly attractive in reliability and prognostics applications. Among these, we consider here Support Vector Regression (SVR), which has provided promising results in various applications. Nevertheless, the predictions provided by SVR are point estimates, whereas in order to take better informed decisions, an uncertainty assessment should also be carried out. For this, we apply the bootstrap to SVR so as to obtain confidence and prediction intervals, without having to make any assumption about probability distributions and with good performance even when only a small data set is available. The bootstrapped SVR is first verified on Monte Carlo experiments and then applied to a real case study concerning the prediction of degradation of a component from the offshore oil industry. The results obtained indicate that the bootstrapped SVR is a promising tool for providing reliable point and interval estimates, which can inform maintenance-related decisions on degrading components. - Highlights: • Bootstrap (pairs/residuals) and SVR are used as an uncertainty analysis framework. • Numerical experiments are performed to assess accuracy and coverage properties. • More bootstrap replications do not significantly improve performance. • Degradation of equipment of offshore oil wells is estimated by bootstrapped SVR. • Estimates of the scale growth rate can support maintenance-related decisions.
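The pairs-bootstrap variant mentioned in the highlights is straightforward to sketch with scikit-learn's SVR: refit the model on resampled (x, y) pairs and read the interval off the empirical distribution of predictions. The synthetic data and the hyperparameters below are illustrative, not the paper's case study:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)

# synthetic degradation-like trend plus noise (illustrative data)
X = np.linspace(0, 5, 80).reshape(-1, 1)
y = 0.5 * X.ravel() ** 1.5 + rng.normal(0, 0.1, 80)

# pairs bootstrap: refit SVR on resampled (x, y) pairs
B = 200
x_new = np.array([[4.0]])
preds = np.empty(B)
for b in range(B):
    idx = rng.integers(0, len(y), len(y))
    model = SVR(C=10.0, epsilon=0.05).fit(X[idx], y[idx])
    preds[b] = model.predict(x_new)[0]

point = preds.mean()
lo, hi = np.percentile(preds, [2.5, 97.5])  # 95% bootstrap confidence interval
```

A prediction interval (rather than a confidence interval for the regression function) additionally requires an estimate of the residual noise variance, which the paper handles via the residuals-bootstrap variant.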
Hennemann, Stefan
2012-01-01
Knowledge creation and dissemination in science and technology systems is perceived as a prerequisite for socio-economic development. The efficiency of creating new knowledge is considered to have a geographical component, i.e. some regions are more capable of scientific knowledge production than others. This article presents a method that uses a network representation of scientific interaction to assess the relative efficiency of regions with diverse boundaries in channeling knowledge through a science system. In a first step, a weighted aggregate of the betweenness centrality is produced from empirical data (aggregation). The subsequent randomization of this empirical network produces the Null-model necessary for significance testing and normalization (randomization). This step is repeated to yield higher confidence in the results (re-sampling). The results are robust estimates of the relative regional efficiency to broker knowledge, which is discussed along with cross-sectional and longitudinal empirical exa...
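The aggregation-randomization-resampling loop can be sketched with networkx: compute empirical betweenness, rewire the network degree-preservingly to build the null model, repeat, and normalize. The toy graph and the number of replicates are assumptions for illustration:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)

# toy collaboration network; nodes stand in for regional actors
G = nx.barabasi_albert_graph(60, 2, seed=3)
emp = nx.betweenness_centrality(G)          # aggregation step

# randomization: degree-preserving rewiring gives the null model
B = 50                                      # re-sampling: repeat for confidence
null = {n: [] for n in G}
for b in range(B):
    R = G.copy()
    nx.double_edge_swap(R, nswap=4 * R.number_of_edges(), max_tries=10**5)
    bc = nx.betweenness_centrality(R)
    for n in R:
        null[n].append(bc[n])

# normalized score: empirical value relative to the null distribution
z = {n: (emp[n] - np.mean(null[n])) / (np.std(null[n]) + 1e-12) for n in G}
```

In the article the empirical values are weighted aggregates over regions rather than single nodes, but the significance logic is the same.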
Institute of Scientific and Technical Information of China (English)
2012-01-01
In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front-ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with a modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M data loads' time period. We present a load-process architecture (LPA) and a runtime architecture (RA) (based on a serial polyphase structure), which have different scheduling. In LPA, N subfilters are loaded, and then M subfilters are processed at a clock rate that is a multiple of the input data rate. This is necessary to meet the output time constraint of the down-sampled data. In RA, M subfilter processes are efficiently scheduled within the N data loads' time while simultaneously loading N subfilters. This requires reduced clock rates compared with LPA, and potentially less power is consumed. A polyphase filter bank that uses different resampling factors for maximally decimated, under-decimated, over-decimated, and combined up- and down-sampled scenarios is used as a case study, and an analysis of area, time, and power for their FPGA architectures is given. For resource-optimized SDR front-ends, RA is superior for reducing operating clock rates and dynamic power consumption. RA is also superior for reducing area resources, except when indices are pre-stored in LUTs.
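The rational rate change at the heart of such channelizers (up-sample, polyphase-filter, down-sample in one fused operation) can be demonstrated in software with SciPy before committing to an FPGA schedule. The rates and tone below are illustrative:

```python
import numpy as np
from scipy.signal import resample_poly

fs_in = 48_000
n = 4800                                   # 0.1 s of signal
t = np.arange(n) / fs_in
x = np.sin(2 * np.pi * 1_000 * t)          # 1 kHz tone

# rational rate change 48 kHz -> 32 kHz: up-sample by 2, filter once
# in polyphase form, down-sample by 3 (scipy fuses all three steps)
y = resample_poly(x, up=2, down=3)
fs_out = fs_in * 2 // 3                    # 32 kHz

# the tone survives at the new rate
f_peak = np.fft.rfftfreq(len(y), 1 / fs_out)[np.argmax(np.abs(np.fft.rfft(y)))]
```

The hardware architectures in the paper differ in *when* each of the M subfilter branches of this polyphase structure is clocked, not in the mathematics of the rate change itself.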
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship of the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
[Sensitometry of Mammographic Screen-film System Using Bootstrap Aluminum Step-Wedge.].
Abe, Shinji; Imada, Ryou; Terauchi, Takashi; Fujisaki, Tatsuya; Monma, Masahiko; Nishimura, Katsuyuki; Saitoh, Hidetoshi; Mochizuki, Yasuo
2005-01-01
Recently, a few types of step-wedges for bootstrap sensitometry with a mammographic screen-film system have been proposed. In this study, the bootstrap sensitometry with the mammographic screen-film system was studied for two types of aluminum step-wedges. Characteristic X-ray energy curves were determined using mammographic and general radiographic aluminum step-wedges devised to prevent scattered X-rays generated from one step penetrating into the region of another one, and dependence of the characteristic curves on the wedges was also discussed. No difference was found in the characteristic curves due to the difference in the step-wedges for mammography and general radiography although there was a slight difference in shape at the shoulder portion for the two types of step-wedges. Therefore, it was concluded that aluminum step-wedges for mammography and general radiography could be employed in bootstrap sensitometry with the mammographic screen-film system. PMID:16479054
Closure of the Operator Product Expansion in the Non-Unitary Bootstrap
Esterlis, Ilya; Ramirez, David
2016-01-01
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the "Gliozzi" bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
The economics of bootstrapping space industries - Development of an analytic computer model
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of transport off-Earth and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
The S-matrix Bootstrap I: QFT in AdS
Paulos, Miguel F; Toledo, Jonathan; van Rees, Balt C; Vieira, Pedro
2016-01-01
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Shi, Juanjuan; Liang, Ming; Guan, Yunpeng
2016-02-01
The conventional procedure for bearing fault diagnosis under variable rotational speed generally includes prefiltering, resampling based on the shaft rotating frequency, and order spectrum analysis. However, its application is confined by three major obstacles: a) knowledge-demanding parameter determination required by prefiltering, b) unavailability of the shaft rotating frequency for resampling, as it is coupled with the instantaneous fault characteristic frequency (IFCF) by a fault characteristic coefficient (FCC) that cannot be decided without knowing which fault actually exists, and c) a complicated and error-prone resampling process. We therefore propose a new method to address these problems. The proposed method, free from prefiltering and resampling, mainly contains the following steps: a) extracting the envelope by a windowed fractal dimension (FD) transform, requiring no prefiltering; b) performing a short-time Fourier transform (STFT) on the envelope signal to get a clear time-frequency representation (TFR), from which the IFCF and the basic demodulator for generalized demodulation (GD) can be obtained; c) applying generalized demodulation to the envelope signal with the current demodulator, converting the trajectory of the current time-frequency component into a linear path parallel to the time axis; d) frequency-analyzing the demodulated signal and then reading off the amplitude at the constant frequency where the linear path is situated. Updating the demodulator by multiplying the basic demodulator by different real numbers (i.e., coefficient λ) and repeating steps c)-d) yields the resampling-free order spectrum. Based on the resulting spectrum, the final diagnosis decision can be made. The implementation of the proposed method is illustrated on simulated data. Finally, experimental data are employed to validate the effectiveness of the proposed technique.
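Step c), generalized demodulation, is the key trick: multiplying the signal by a phase term that cancels the time-varying part of the component's instantaneous frequency straightens its time-frequency trajectory into a horizontal line. A minimal sketch on a synthetic linear chirp (the chirp parameters and the known demodulator phase are assumptions; in the method the phase is estimated from the STFT):

```python
import numpy as np

fs = 1000
t = np.arange(1000) / fs                   # 1 s of signal
f0, k = 50.0, 30.0                         # start frequency and sweep rate
x = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))  # frequency sweeps 50->80 Hz

# generalized demodulation: the demodulator phase removes the time-varying
# term, mapping the component onto a line parallel to the time axis
phase = 0.5 * k * t ** 2
xd = x * np.exp(-1j * 2 * np.pi * phase)

# after demodulation the spectrum concentrates near the constant f0
spec = np.abs(np.fft.fft(xd))
freqs = np.fft.fftfreq(len(t), 1 / fs)
pos = freqs > 0
f_peak = freqs[pos][np.argmax(spec[pos])]
```

Sweeping the coefficient λ then amounts to repeating this with scaled demodulator phases and collecting the amplitude at each resulting constant frequency, which builds the order spectrum without any resampling.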
A New Regime for Studying the High Bootstrap Current Fraction Plasma
Institute of Scientific and Technical Information of China (English)
A. Isayama; Y. Kamada; K. Ushigusa; T. Fujita; T. Suzuki; X. Gao
2001-01-01
A new experimental regime has recently been studied for achieving a high bootstrap current fraction in JT-60U hydrogen discharges. A high poloidal beta (βp ≈ 3.61) plasma was obtained by high-power neutral beam injection heating in a very high edge safety factor region (Ip = 0.3 MA, Bt = 3.65 T, qeff = 25-35), and the bootstrap current fraction (fBS) was correspondingly about 40% according to ACCOME code calculations. It was observed that there were no magnetohydrodynamic instabilities to retard the increase of the βp and fBS parameters in the new regime.
Bootstrapping critical Ising model on three-dimensional real projective space
Nakayama, Yu
2016-01-01
Given conformal data on a flat Euclidean space, we use crosscap conformal bootstrap equations to numerically solve the Lee-Yang model as well as the critical Ising model on a three-dimensional real projective space. We check the rapid convergence of our bootstrap program in two dimensions against the exact solutions available. Based on this comparison, we estimate that the systematic error on the numerically solved one-point functions of the critical Ising model on a three-dimensional real projective space is less than one percent. Our method opens up a novel way to solve conformal field theories on non-trivial geometries.
Energy Technology Data Exchange (ETDEWEB)
Niehof, Jonathan T.; Morley, Steven K.
2012-01-01
We review and develop techniques to determine associations between series of discrete events. The bootstrap, a nonparametric statistical method, allows the determination of the significance of associations with minimal assumptions about the underlying processes. We find the key requirement for this method: one of the series must be widely spaced in time to guarantee the theoretical applicability of the bootstrap. If this condition is met, the calculated significance passes a reasonableness test. We conclude with some potential future extensions and caveats on the applicability of these methods. The techniques presented have been implemented in a Python-based software toolkit.
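A minimal version of the association test the toolkit implements can be sketched as follows: count how many events of one series fall close in time to events of the other, then resample one series under an independence null to judge significance. The uniform resampling null and the window width are simplifying assumptions for illustration; the toolkit's bootstrap works from the observed data:

```python
import numpy as np

rng = np.random.default_rng(4)

T = 1000.0
a = np.sort(rng.uniform(0, T, 40))     # sparse, widely spaced series
b = np.sort(rng.uniform(0, T, 200))    # denser series

def n_close(a, b, window=2.0):
    # events in b falling within +/- window of the nearest event in a
    idx = np.searchsorted(a, b)
    left = a[np.clip(idx - 1, 0, len(a) - 1)]
    right = a[np.clip(idx, 0, len(a) - 1)]
    return np.sum(np.minimum(np.abs(b - left), np.abs(b - right)) <= window)

obs = n_close(a, b)

# bootstrap the association count under an independence null
B = 2000
null = np.array([n_close(a, np.sort(rng.uniform(0, T, len(b))))
                 for _ in range(B)])
p_value = (1 + np.sum(null >= obs)) / (B + 1)
```

The wide spacing of series `a` is exactly the condition the abstract identifies: it keeps the per-event neighborhoods effectively independent, which is what justifies the bootstrap here.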
DEFF Research Database (Denmark)
Linnet, Kristian
2005-01-01
Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors.
A Novel Resampling Method for Differential Protection%一种用于差动保护的新型重采样方法
Institute of Scientific and Technical Information of China (English)
王业; 陆于平; 徐以超; 许但轩
2012-01-01
For a multi-rate source, protection devices receive digital signals with different sampling rates from the communication interface, resulting in mismatched sampling rates on the two sides of a differential protection scheme and consequent errors. This paper concentrates on resampling methods for input signals with different sampling rates in a multi-rate source, and proposes a frequency-domain transformation resampling method. First, a half-wave Fourier transform is performed on the protection sampling signal. Second, the resulting frequency spectrum is processed according to the difference in sampling rates. Third, the processed spectrum is transformed back to the time domain by the inverse transform, yielding the resampled signal. Compared with existing resampling methods, this method not only makes efficient use of the data from the waiting time of the R consecutive discriminations of sampled-value differential protection and increases the resampling precision, but also limits the resampling error to a low level when signals contain abundant high-order harmonics, while achieving zero delay for the resampled output. Simulations show that the average error rates of the resampling precision during normal system operation and during external fault removal are only 0.50% and 1.10%, respectively, both better than the traditional time-domain resampling method.
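The transform/adjust/inverse-transform pattern described in this abstract can be illustrated with an ordinary FFT standing in for the paper's half-wave Fourier transform (an assumption for brevity); the sampling rates are illustrative merging-unit rates:

```python
import numpy as np

def fft_resample(x, n_out):
    """Resample a real signal to n_out points by truncating or
    zero-padding its spectrum (frequency-domain resampling)."""
    X = np.fft.rfft(x)
    n_bins = n_out // 2 + 1
    Y = np.zeros(n_bins, dtype=complex)
    m = min(len(X), n_bins)
    Y[:m] = X[:m]
    # rescale so amplitudes are preserved at the new length
    return np.fft.irfft(Y, n_out) * (n_out / len(x))

fs_in, fs_out = 1200, 800              # different rates on the two sides
n_in = 120                             # 0.1 s window
t = np.arange(n_in) / fs_in
x = np.cos(2 * np.pi * 50 * t)         # 50 Hz power-frequency component
y = fft_resample(x, n_in * fs_out // fs_in)
```

Because the whole buffered window is transformed at once, the resampled output is available without additional interpolation delay, which is the zero-delay property the paper emphasizes.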
Directory of Open Access Journals (Sweden)
Thawatchai Onjun
2012-02-01
Full Text Available The investigation of bootstrap current formation in ITER is carried out using the BALDUR integrated predictive modeling code. The combination of the Mixed B/gB anomalous transport model and the NLCASS module together with the pedestal model is used in the BALDUR code to simulate the time evolution of temperature, density, and plasma current profiles. It was found in the simulations that without the presence of an ITB, a minimal fraction of bootstrap current (as well as low fusion performance) was achieved. The enhancement due to the ITB depends sensitively on the strength of the toroidal velocity. A sensitivity study was also carried out to optimize the bootstrap current fraction and plasma performance. It was found that the bootstrap current fraction slightly improved, while the plasma performance greatly improved with increasing NBI power or pedestal temperature. On the other hand, a higher impurity concentration resulted in a significant degradation of fusion performance, but a smaller degradation in bootstrap current.
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We employ the replica method of statistical physics to study the average-case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models based on Gaussian processes, we discuss bootstrap estimates for learning curves.
Giving the Boot to the Bootstrap: How Not to Learn the Natural Numbers
Rips, Lance J.; Asmuth, Jennifer; Bloomfield, Amber
2006-01-01
According to one theory about how children learn the concept of natural numbers, they first determine that "one", "two", and "three" denote the size of sets containing the relevant number of items. They then make the following inductive inference (the Bootstrap): The next number word in the counting series denotes the size of the sets you get by…
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile intervals, and…
Bootstrapping realized volatility and realized beta under a local Gaussianity assumption
DEFF Research Database (Denmark)
Hounyo, Ulrich
The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency...
Parametric bootstrap methods for testing multiplicative terms in GGE and AMMI models.
Forkman, Johannes; Piepho, Hans-Peter
2014-09-01
The genotype main effects and genotype-by-environment interaction effects (GGE) model and the additive main effects and multiplicative interaction (AMMI) model are two common models for analysis of genotype-by-environment data. These models are frequently used by agronomists, plant breeders, geneticists and statisticians for analysis of multi-environment trials. In such trials, a set of genotypes, for example, crop cultivars, are compared across a range of environments, for example, locations. The GGE and AMMI models use singular value decomposition to partition genotype-by-environment interaction into an ordered sum of multiplicative terms. This article deals with the problem of testing the significance of these multiplicative terms in order to decide how many terms to retain in the final model. We propose parametric bootstrap methods for this problem. Models with fixed main effects, fixed multiplicative terms and random normally distributed errors are considered. Two methods are derived: a full and a simple parametric bootstrap method. These are compared with the alternatives of using approximate F-tests and cross-validation. In a simulation study based on four multi-environment trials, both bootstrap methods performed well with regard to Type I error rate and power. The simple parametric bootstrap method is particularly easy to use, since it only involves repeated sampling of standard normally distributed values. This method is recommended for selecting the number of multiplicative terms in GGE and AMMI models. The proposed methods can also be used for testing components in principal component analysis.
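The "simple parametric bootstrap" the abstract recommends only requires repeated sampling of standard normal matrices and a singular value decomposition. A minimal sketch for testing the first multiplicative term of a double-centred interaction matrix (dimensions, signal strength, and the exact test statistic are illustrative; the paper's procedure tests terms sequentially):

```python
import numpy as np

rng = np.random.default_rng(5)

# genotype-by-environment interaction residuals, g genotypes x e environments
g, e = 12, 8
noise = rng.normal(size=(g, e))
signal = 2.0 * np.outer(rng.normal(size=g), rng.normal(size=e)) / np.sqrt(g)
Z = noise + signal                       # one true multiplicative term + noise
Z = Z - Z.mean(0) - Z.mean(1)[:, None] + Z.mean()   # double-centre

sv = np.linalg.svd(Z, compute_uv=False)
stat = sv[0] ** 2 / np.sum(sv ** 2)      # share explained by the first term

# simple parametric bootstrap: repeated sampling of standard normal matrices
B = 2000
null = np.empty(B)
for b in range(B):
    N = rng.normal(size=(g, e))
    N = N - N.mean(0) - N.mean(1)[:, None] + N.mean()
    s = np.linalg.svd(N, compute_uv=False)
    null[b] = s[0] ** 2 / np.sum(s ** 2)

p_value = (1 + np.sum(null >= stat)) / (B + 1)
```

A small p-value indicates the first multiplicative term should be retained; the test is then repeated on the residual after removing that term to decide about the second term, and so on.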
The use of vector bootstrapping to improve variable selection precision in Lasso models.
Laurin, Charles; Boomsma, Dorret; Lubke, Gitta
2016-08-01
The Lasso is a shrinkage regression method that is widely used for variable selection in statistical genetics. Commonly, K-fold cross-validation is used to fit a Lasso model. This is sometimes followed by using bootstrap confidence intervals to improve precision in the resulting variable selections. Nesting cross-validation within bootstrapping could provide further improvements in precision, but this has not been investigated systematically. We performed simulation studies of Lasso variable selection precision (VSP) with and without nesting cross-validation within bootstrapping. Data were simulated to represent genomic data under a polygenic model as well as under a model with effect sizes representative of typical GWAS results. We compared these approaches to each other as well as to software defaults for the Lasso. Nested cross-validation had the most precise variable selection at small effect sizes. At larger effect sizes, there was no advantage to nesting. We illustrated the nested approach with empirical data comprising SNPs and SNP-SNP interactions from the most significant SNPs in a GWAS of borderline personality symptoms. In the empirical example, we found that the default Lasso selected low-reliability SNPs and interactions which were excluded by bootstrapping. PMID:27248122
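Nesting K-fold cross-validation inside each bootstrap replicate, as studied here, can be sketched with scikit-learn's `LassoCV`; variables are then ranked by how often they are selected across replicates. The simulated data, the number of replicates, and the 80% selection threshold are illustrative choices, not the paper's settings:

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(6)

n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [1.0, 0.8, 0.6]              # three true signals
y = X @ beta + rng.normal(0, 1, n)

# nest K-fold cross-validation inside each bootstrap replicate
B = 30
sel = np.zeros(p)
for b in range(B):
    idx = rng.integers(0, n, n)          # bootstrap sample of rows
    fit = LassoCV(cv=5).fit(X[idx], y[idx])
    sel += (fit.coef_ != 0)

freq = sel / B                           # bootstrap selection frequency
stable = np.where(freq >= 0.8)[0]        # keep variables selected in >=80%
```

Low-reliability predictors, like the spurious SNPs in the empirical example, tend to be selected only in a minority of replicates and are filtered out by the frequency threshold.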
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
Holoien, Thomas W -S; Wechsler, Risa H
2016-01-01
We describe two new open source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program for using Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms, with functionality not available in other XD tools. It allows the user to select between the AstroML (Vanderplas et al. 2012; Ivezic et al. 2015) and Bovy et al. (2011) fitting methods and is compatible with scikit-learn machine learning algorithms (Pedregosa et al. 2011). Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model conditioned on known values of other parameters. EmpiriciSN is an example application of this functionality that can be used for fitting an XDGMM model to observed supernova/host datas...
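The conditioning functionality highlighted here reduces, per Gaussian component, to the standard conditional-Gaussian formulas. A single-component, two-dimensional sketch (all numbers illustrative; XDGMM applies this to every mixture component and reweights them):

```python
import numpy as np

# one bivariate Gaussian: condition the unknown y on an observed x
mu = np.array([1.0, 2.0])
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])

def condition(mu, cov, x_obs):
    """p(y | x) for a 2D Gaussian with x first, y second."""
    mu_x, mu_y = mu
    s_xx, s_xy, s_yy = cov[0, 0], cov[0, 1], cov[1, 1]
    m = mu_y + s_xy / s_xx * (x_obs - mu_x)   # conditional mean
    v = s_yy - s_xy ** 2 / s_xx               # conditional variance
    return m, v

m, v = condition(mu, cov, x_obs=2.0)   # predict y given x = 2
```

Sampling from the conditioned mixture is then just drawing a component by its reweighted responsibility and sampling from its conditional Gaussian, which is how re-sampled supernova/host populations are generated.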
Narayan, Manjari; Allen, Genevera I
2016-01-01
Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches: R^2, based on resampling and random effects test statistics, and R^3, which additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R^2 and R^3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940
Directory of Open Access Journals (Sweden)
Manjari eNarayan
2016-04-01
Full Text Available Many complex brain disorders such as Autism Spectrum Disorders exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches --- R^2 based on resampling and random effects test statistics, and R^3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R^2 and R^3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in Autism Spectrum Disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices.
Saarelma, S.; Günter, S.; Kurki-Suonio, T.; Zehrfeld, H.-P.
2000-05-01
An ELMy ASDEX Upgrade plasma equilibrium is reconstructed taking into account the bootstrap current. The peeling mode stability of the equilibrium is numerically analysed using the GATO [1] code, and it is found that the bootstrap current can drive the plasma peeling mode unstable. A high-n ballooning mode stability analysis of the equilibria revealed that, while destabilizing the peeling modes, the bootstrap current has a stabilizing effect on the ballooning modes. A combination of these two instabilities is a possible explanation for the type I ELM phenomenon. A triangularity scan showed that increasing triangularity stabilizes the peeling modes and can produce ELM-free periods observed in the experiments.
A treatment procedure for Gemini North/NIFS data cubes: application to NGC 4151
Menezes, R B; Ricci, T V
2014-01-01
We present a detailed procedure for treating data cubes obtained with the Near-Infrared Integral Field Spectrograph (NIFS) of the Gemini North telescope. This process includes the following steps: correction of the differential atmospheric refraction, spatial re-sampling, Butterworth spatial filtering, 'instrumental fingerprint' removal and Richardson-Lucy deconvolution. The clearer contours of the structures obtained with the spatial re-sampling, the high spatial-frequency noise removed with the Butterworth spatial filtering, the removed 'instrumental fingerprints' (which take the form of vertical stripes along the images) and the improvement of the spatial resolution obtained with the Richardson-Lucy deconvolution result in images with a considerably higher quality. An image of the Br{\\gamma} emission line from the treated data cube of NGC 4151 allows the detection of individual ionized-gas clouds (almost undetectable without the treatment procedure) of the narrow-line region of this galaxy, which are also ...
Institute of Scientific and Technical Information of China (English)
郑红; 隋强强; 孙玉泉
2012-01-01
Missile warning and tracking for aircraft poses several problems, such as 2D modeling of missile motion, the nonlinearity of the motion model, and the non-Gaussian nature of the interference. The motion pattern of a proportionally navigated missile projected from 3D space onto the camera imaging plane was studied, and a motion model for the 2D projection of the proportionally navigated missile was established. Since the velocity and acceleration in the model are nonlinear, and the random interference encountered by missiles (e.g., wind direction, wind force, cyclones, and air flow) is non-Gaussian, a particle filter was used to track the missiles. To address the particle degeneration problem in the tracking procedure, the ordered weight residual resampling particle filter (OWRR-PF) method is presented, which alleviates particle degeneration and improves tracking accuracy. Experimental results demonstrate that the tracking accuracy of the OWRR-PF method is improved by about 70% over the standard particle filter and by about 15% over the residual resampling particle filter.
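The residual resampling step that OWRR-PF builds on can be sketched generically; this is a minimal plain residual resampler in NumPy (the paper's ordered-weight variant is not reproduced here, and the function name is illustrative):

```python
import numpy as np

def residual_resample(weights, rng=None):
    """Residual resampling: deterministically copy each particle
    floor(N*w_i) times, then fill the remaining slots by multinomial
    sampling on the residual weights."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    counts = np.floor(n * w).astype(int)       # deterministic copies
    residual = n * w - counts                  # leftover mass per particle
    n_rest = n - counts.sum()
    if n_rest > 0:
        residual = residual / residual.sum()
        counts += rng.multinomial(n_rest, residual)
    return np.repeat(np.arange(n), counts)     # indices into the particle set
```

The deterministic copies reduce resampling variance relative to plain multinomial resampling, which is why residual schemes mitigate particle degeneration.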
Sheng, Chunyang; Zhao, Jun; Wang, Wei; Leung, Henry
2013-07-01
Prediction intervals that provide estimated values as well as the corresponding reliability are applied to nonlinear time series forecasting. However, constructing reliable prediction intervals for noisy time series is still a challenge. In this paper, a bootstrapping reservoir computing network ensemble (BRCNE) is proposed and a simultaneous training method based on Bayesian linear regression is developed. In addition, the structural parameters of the BRCNE, that is, the number of reservoir computing networks and the reservoir dimension, are determined off-line by 0.632 bootstrap cross-validation. To verify the effectiveness of the proposed method, two kinds of time series data, namely the multi-superimposed oscillator problem with additive noise and practical gas flow data from the steel industry, are employed. The experimental results indicate that the proposed approach has satisfactory performance on prediction intervals for practical applications.
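The 0.632 bootstrap error estimate used for structure selection combines training error with out-of-bag bootstrap error. A minimal generic sketch (the function names and the trivial mean-predictor used in the usage example are illustrative, not from the paper):

```python
import numpy as np

def bootstrap_632_error(X, y, fit, predict, loss, B=50, rng=None):
    """0.632 bootstrap estimate of prediction error:
    err_632 = 0.368 * training error + 0.632 * out-of-bag error."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(y)
    model = fit(X, y)
    err_train = loss(y, predict(model, X))
    oob_losses = []
    for _ in range(B):
        idx = rng.integers(0, n, n)              # bootstrap sample (with replacement)
        oob = np.setdiff1d(np.arange(n), idx)    # observations left out of the draw
        if oob.size == 0:
            continue
        m = fit(X[idx], y[idx])
        oob_losses.append(loss(y[oob], predict(m, X[oob])))
    err_oob = float(np.mean(oob_losses))
    return 0.368 * err_train + 0.632 * err_oob
```

The 0.632 weight reflects that a bootstrap sample contains on average about 63.2% of the distinct original observations, so the out-of-bag error is pessimistic and the training error optimistic.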
Confidence intervals for parameters of IWD based on MLE and bootstrap
Directory of Open Access Journals (Sweden)
Mostafa MohieEl-Din
2013-12-01
Full Text Available In this paper, we study the joint confidence regions for the parameters of the inverse Weibull distribution (IWD) from the point of view of record values. Based on this censoring scheme, approximate confidence intervals and percentile bootstrap confidence intervals, as well as an approximate joint confidence region for the parameters of the IWD, are developed. One application of the joint confidence regions of the parameters is to find confidence bounds for functions of the parameters. Joint confidence regions for the parameters of the extreme value distribution are also discussed. Numerical examples with a real data set and simulated data illustrate the proposed method, and a simulation study is performed to compare the proposed joint confidence regions. Keywords: IWD, Progressively First-Failure Censored Scheme, MLE Confidence Intervals, Bootstrap Confidence Intervals.
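A percentile bootstrap confidence interval of the general kind developed here can be sketched as follows; this toy version resamples the data and re-applies an arbitrary estimator, and does not implement the paper's record-value/censoring scheme (function name illustrative):

```python
import numpy as np

def percentile_bootstrap_ci(data, estimator, alpha=0.05, B=2000, rng=None):
    """Percentile bootstrap CI: re-estimate the parameter on B resamples
    (drawn with replacement) and take the alpha/2 and 1-alpha/2
    empirical quantiles of the bootstrap estimates."""
    rng = np.random.default_rng(0) if rng is None else rng
    data = np.asarray(data)
    n = len(data)
    stats = np.array([estimator(data[rng.integers(0, n, n)]) for _ in range(B)])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])
```

For a maximum-likelihood setting one would pass the MLE routine as `estimator`; here any statistic of the sample works.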
DEFF Research Database (Denmark)
Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio;
2014-01-01
Industrial applications of computed tomography (CT) for dimensional metrology on various components are increasing fast, owing to a number of favorable properties such as the capability of non-destructive internal measurements. Uncertainty evaluation is however more complex than in conventional measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.
Applications of threshold models and the weighted bootstrap for Hungarian precipitation data
Varga, László; Rakonczai, Pál; Zempléni, András
2016-05-01
This paper presents applications of the peaks-over-threshold methodology for both the univariate and the recently introduced bivariate case, combined with a novel bootstrap approach. We compare the proposed bootstrap methods to the more traditional profile likelihood. We have investigated 63 years of the European Climate Assessment daily precipitation data for five Hungarian grid points, first separately for the summer and winter months, then aiming at the detection of possible changes by investigating 20-year moving windows. We show that significant changes can be observed in both the univariate and the bivariate cases, the most recent period being the most dangerous in several cases, as some return values have increased substantially. We illustrate these effects by bivariate coverage regions.
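The univariate peaks-over-threshold step can be sketched as follows. This sketch fits the Generalized Pareto distribution to the exceedances by the method of moments and inverts it with the standard POT return-level formula; it is a stand-in for the likelihood-based fits and the bootstrap/profile-likelihood intervals the paper compares (function name illustrative):

```python
import numpy as np

def pot_return_level(data, threshold, n_obs_per_return):
    """Peaks-over-threshold: fit a Generalized Pareto distribution (GPD)
    to exceedances over `threshold` by the method of moments, then invert
    for the level expected to be exceeded once per `n_obs_per_return`
    observations."""
    data = np.asarray(data, float)
    exc = data[data > threshold] - threshold
    zeta = exc.size / data.size                   # exceedance probability
    m, s2 = exc.mean(), exc.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / s2)                 # MOM shape estimate
    sigma = 0.5 * m * (m * m / s2 + 1.0)          # MOM scale estimate
    p = n_obs_per_return * zeta                   # expected exceedance count
    if abs(xi) < 1e-9:                            # exponential-tail limit
        return threshold + sigma * np.log(p)
    return threshold + sigma / xi * (p ** xi - 1.0)
```

A bootstrap CI for the return level would re-run this on resampled data; method-of-moments is used here only to keep the sketch dependency-free.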
Vaze, Rahul
2012-01-01
Iterated localization is considered, where each node of a network needs to get localized (find its location on the 2-D plane) when initially only a subset of nodes have their location information. The iterated localization process proceeds as follows. Starting with a subset of nodes that have their location information, possibly using global positioning system (GPS) devices, any other node gets localized if it has three or more localized nodes in its radio range. The newly localized nodes are included in the subset of nodes that have their location information for the next iteration. This process is allowed to continue until no new node can be localized. The problem is to find the minimum size of the initially localized subset to start with so that the whole network is localized with high probability. There are intimate connections between iterated localization and bootstrap percolation, which is well studied in statistical physics. Using results known in bootstrap percolation, we find a sufficient condition on t...
Langel, Steven E.; Khanafseh, Samer M.; Pervan, Boris
2016-06-01
Differential carrier phase applications that utilize cycle resolution need the probability density function of the baseline estimate to quantify its region of concentration. For the integer bootstrap estimator, the density function has an analytical definition that enables probability calculations given perfect statistical knowledge of measurement and process noise. This paper derives a method to upper bound the tail probability of the integer bootstrapped GNSS baseline when the measurement and process noise correlation functions are unknown, but can be upper and lower bounded. The tail probability is shown to be a non-convex function of a vector of conditional variances, whose feasible region is a convex polytope. We show how to solve the non-convex optimization problem globally by discretizing the polytope into small hyper-rectangular elements, and demonstrate the method for a static baseline estimation problem.
A Simple Voltage Controlled Oscillator Using Bootstrap Circuits and NOR-RS Flip Flop
Chaikla, Amphawan; Pongswatd, Sawai; Sasaki, Hirofumi; Fujimoto, Kuniaki; Yahara, Mitsutoshi
This paper presents a simple and successful design for a voltage controlled oscillator. The proposed circuit is based on the use of two identical bootstrap circuits and a NOR-RS Flip Flop to generate wide-tunable sawtooth and square waves. Increasing control voltage linearly increases the output oscillation frequency. Experimental results verifying the performances of the proposed circuit are in agreement with the calculated values.
Forward Kinematic Analysis of Tip-Tilt-Piston Parallel Manipulator using Secant-Bootstrap Method
Majidian, A.; Amani, A.; Golipour, M.; Amraei, A.
2014-01-01
This paper deals with the application of the Secant-Bootstrap Method (SBM) to solve the closed-form forward kinematics of a new three-degree-of-freedom (DOF) parallel manipulator with inextensible limbs and base-mounted actuators. The manipulator has higher resolution and precision than existing three-DOF mechanisms with extensible limbs. This methodology has been utilized to obtain approximate solutions for the nonlinear kinematic equations of the Tip-Tilt-Piston (T.T.P) parallel manipulator. T...
Bootstrap and Higher-Order Expansion Validity When Instruments May Be Weak
Moreira, Marcelo J.; Jack R. Porter; Gustavo A. Suarez
2004-01-01
It is well-known that size-adjustments based on Edgeworth expansions for the t-statistic perform poorly when instruments are weakly correlated with the endogenous explanatory variable. This paper shows, however, that the lack of Edgeworth expansions and bootstrap validity are not tied to the weak instrument framework, but instead depends on which test statistic is examined. In particular, Edgeworth expansions are valid for the score and conditional likelihood ratio approaches, even when the i...
Cross-Sectional Dependence Robust Block Bootstrap Panel Unit Root Tests
Palm, F.C.; Smeekes, S.; Urbain, J.R.Y.J.
2008-01-01
In this paper we consider the issue of unit root testing in cross-sectionally dependent panels. We consider panels that may be characterized by various forms of cross-sectional dependence, including (but not exclusive to) the popular common factor framework. We consider block bootstrap versions of the group-mean Im, Pesaran, and Shin (2003) and the pooled Levin, Lin, and Chu (2002) unit root coefficient DF-tests for panel data, originally proposed for a setting of no cross-sectional dependence bey...
Vergé, Christelle; Del Moral, Pierre; Moulines, Eric; Olsson, Jimmy
2014-01-01
Particle island models (Vergé et al., 2013) provide a means of parallelization of sequential Monte Carlo methods, and in this paper we present novel convergence results for algorithms of this sort. In particular we establish a central limit theorem - as the number of islands and the common size of the islands tend jointly to infinity - of the double bootstrap algorithm with possibly adaptive selection on the island level. For this purpose we introduce a notion of archipelagos of weighted is...
Structure Constants and Integrable Bootstrap in Planar N=4 SYM Theory
Basso, Benjamin; Komatsu, Shota; Vieira, Pedro
2015-01-01
We introduce a non-perturbative framework for computing structure constants of single-trace operators in the N=4 SYM theory at large N. Our approach features new vertices, with hexagonal shape, that can be patched together into three- and possibly higher-point correlators. These newborn hexagons are more elementary and easier to deal with than the three-point functions. Moreover, they can be entirely constructed using integrability, by means of a suitable bootstrap program. In this letter, we...
Double bootstrap confidence intervals in the two-stage DEA approach
Chronopoulos, D.K.; Girardone, C.; Nankervis, J.C.
2015-01-01
Contextual factors usually assume an important role in determining firms' productive efficiencies. Nevertheless, identifying them in a regression framework might be complicated. The problem arises from the efficiencies being correlated with each other when estimated by Data Envelopment Analysis, rendering standard inference methods invalid. Simar and Wilson (2007) suggest the use of bootstrap algorithms that allow for valid statistical inference in this context. This article extends their wor...
Sieve bootstrap t-tests on long-run average parameters
Fuertes, A.
2008-01-01
Panel estimators can provide consistent measures of a long-run average parameter even if the individual regressions are spurious. However, the t-test on this parameter is fraught with problems because the limit distribution of the test statistic is non-standard and rather complicated, particularly in panels with mixed (non-)stationary errors. A sieve bootstrap framework is suggested to approximate the distribution of the t-statistic. An extensive Monte Carlo study demonstrates that the bootst...
Bootstrap generation and evaluation of an fMRI simulation database
Bellec, Pierre; Perlbarg, Vincent; Evans, Alan C.
2009-01-01
Computer simulations have played a critical role in functional magnetic resonance imaging (fMRI) research, notably in the validation of new data analysis methods. Many approaches have been used to generate fMRI simulations, but there is currently no generic framework to assess how realistic each one of these approaches may be. In this paper, a statistical technique called parametric bootstrap was used to generate a simulation database that mimicked the parameters found in a real database, whi...
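A parametric bootstrap in this spirit can be sketched on a deliberately tiny model; here an AR(1) process stands in for the paper's fMRI generative model (the function and the model choice are illustrative only):

```python
import numpy as np

def parametric_bootstrap_ar1(series, n_sims, rng=None):
    """Parametric bootstrap: fit a simple generative model (an AR(1)
    here, as a minimal stand-in for a realistic simulation model) to
    real data, then draw surrogate datasets from the fitted model."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(series, float)
    x = x - x.mean()
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])  # AR(1) coefficient
    resid = x[1:] - phi * x[:-1]
    sigma = resid.std(ddof=1)                             # innovation scale
    sims = np.zeros((n_sims, len(x)))
    for b in range(n_sims):
        for t in range(1, len(x)):
            sims[b, t] = phi * sims[b, t - 1] + rng.normal(0.0, sigma)
    return phi, sims
```

The point mirrored from the abstract: the surrogate database inherits parameters estimated from real data, so its realism can be assessed against that data.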
DEFF Research Database (Denmark)
Hounyo, Ulrich
We propose a bootstrap method for estimating the distribution (and functionals of it, such as the variance) of various integrated covariance matrix estimators. In particular, we first adapt the wild blocks of blocks bootstrap method suggested for the pre-averaged realized volatility estimator to a... the finite sample properties of the existing first-order asymptotic theory. We illustrate its practical use on high-frequency equity data...
Tsagris, Michael; Elmatzoglou, Ioannis; Frangos, Christos C.
2015-01-01
Little attention has been given to the correlation coefficient when data come from discrete or continuous non-normal populations. In this article, we consider the efficiency of two correlation coefficients which are from the same family, Pearson's and Spearman's estimators. Two discrete bivariate distributions were examined: the Poisson and the Negative Binomial. The comparison between these two estimators took place using classical and bootstrap techniques for the construction of confidence ...
Tsagris, Michail; Elmatzoglou, Ioannis; Frangos, Christos C.
2012-01-01
Little attention has been given to the correlation coefficient when data come from discrete or continuous non-normal populations. In this article, we consider the efficiency of two correlation coefficients which are from the same family, Pearson's and Spearman's estimators. Two discrete bivariate distributions were examined: the Poisson and the Negative Binomial. The comparison between these two estimators took place using classical and bootstrap techniques for the construction of confidence ...
Violation of KNO scaling and the NBD phenomenon in the framework of the statistical bootstrap model
Kokoulina, E. S.; Kuvshinov, V. I.
1991-05-01
The connection between multiplicity distributions at three stages (partonic, hadronization, and hadronic) is considered. An interpretation of the LoPHD parameter is found. It is shown that, under specific hypotheses on the form of the mass spectrum, the statistical bootstrap model leads to the negative binomial distribution (NBD) at the hadronic stage of development of the multiple production process, with specific analytic dependences of the parameters of the NBD.
Keith, A. M.; Henrys, P.; Rowe, R. L.; McNamara, N. P.
2015-01-01
Understanding the consequences of different land uses for the soil system is important to better inform decisions based on sustainability. The ability to assess change in soil properties, throughout the soil profile, is a critical step in this process. We present an approach to examine differences in soil depth profiles between land uses using bootstrapped Loess regressions (BLR). This non-parametric approach is data-driven, unconstrained by distributional ...
International Nuclear Information System (INIS)
The following applications of the statistical bootstrap model (SBM) to hadron production in hadron-hadron collisions are discussed: (1) the independent fireball production model (IFPM); (2) single-particle distributions in the IFPM (thermodynamic model); (3) production of heavy particles and pairs of heavy particles; (4) two-particle distributions and correlations in the IFPM and the Bose effect; (5) large p⊥ phenomena and fireball models; (6) N-anti-N annihilation at rest in the SBM. (author)
Simulation of bootstrap current in 2D and 3D ideal magnetic fields in tokamaks
Raghunathan, M.; Graves, J. P.; Cooper, W. A.; Pedro, M.; Sauter, O.
2016-09-01
We aim to simulate the bootstrap current for a MAST-like spherical tokamak using two approaches for magnetic equilibria including externally caused 3D effects such as resonant magnetic perturbations (RMPs), the effect of toroidal ripple, and intrinsic 3D effects such as non-resonant internal kink modes. The first approach relies on known neoclassical coefficients in ideal MHD equilibria, using the Sauter (Sauter et al 1999 Phys. Plasmas 6 2834) expression valid for all collisionalities in axisymmetry, and the second approach being the quasi-analytic Shaing–Callen (Shaing and Callen 1983 Phys. Fluids 26 3315) model in the collisionless regime for 3D. Using the ideal free-boundary magnetohydrodynamic code VMEC, we compute the flux-surface averaged bootstrap current density, with the Sauter and Shaing–Callen expressions for 2D and 3D ideal MHD equilibria including an edge pressure barrier with the application of resonant magnetic perturbations, and equilibria possessing a saturated non-resonant 1/1 internal kink mode with a weak internal pressure barrier. We compare the applicability of the self-consistent iterative model on the 3D applications and discuss the limitations and advantages of each bootstrap current model for each type of equilibrium.
Conformal Bootstrap Approach to O(N) Fixed Points in Five Dimensions
Bae, Jin-Beom
2014-01-01
Whether an O(N)-invariant conformal field theory exists in five dimensions, with its implications for higher-spin holography, has been much debated. We find an affirmative result on this question by utilizing the conformal bootstrap approach. In solving the crossing symmetry condition, we propose a new approach based on specifying the low-lying spectrum distribution. We find the traditional one-gap bootstrap is not suited, since the nontrivial fixed point expected from the large-N expansion sits deep in the interior (not at a boundary or kink) of the allowed solution region. We propose a two-gap bootstrap that specifies the scaling dimensions of the two lowest scalar operators. The approach carves out a vast region of lower scaling dimensions and universally features two tips. We find that the sought-for nontrivial fixed point now sits at one of the tips, while the Gaussian fixed point sits at the other tip. The scaling dimensions of scalar operators fit well with expectations based on the large-N expansion. We also find indication that t...
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture.
Stable bootstrap-current driven equilibria for low aspect ratio tokamaks
Energy Technology Data Exchange (ETDEWEB)
Miller, R.L.; Lin-Liu, Y.R.; Turnbull, A.D.; Chan, V.S. [General Atomics, San Diego, CA (United States); Pearlstein, L.D. [Lawrence Livermore National Lab., CA (United States); Sauter, O.; Villard, L. [Ecole Polytechnique Federale, Lausanne (Switzerland). Centre de Recherche en Physique des Plasma (CRPP)
1996-09-01
Low aspect ratio tokamaks can potentially provide a high ratio of plasma pressure to magnetic pressure β and high plasma current I at a modest size, ultimately leading to a high power density compact fusion power plant. For the concept to be economically feasible, bootstrap current must be a major component of the plasma current. A high value of the Troyon factor β_N and strong shaping are required to allow simultaneous operation at high β and high bootstrap current fraction. Ideal magnetohydrodynamic stability of a range of equilibria at aspect ratio 1.4 is systematically explored by varying the pressure profile and shape. The pressure and current profiles are constrained in such a way as to assure complete bootstrap current alignment. Both β_N and β are defined in terms of the vacuum toroidal field. Equilibria with β_N ≥ 8 and β ≈ 35% to 55% exist which are stable to n = ∞ ballooning modes, and stable to n = 0, 1, 2, 3 kink modes with a conducting wall. The dependence of β and β_N with respect to aspect ratio is also considered. (author) 9 figs., 14 refs.
Y-90 PET imaging for radiation theragnosis using bootstrap event resampling
Energy Technology Data Exchange (ETDEWEB)
Nam, Taewon; Woo, Sangkeun; Min, Gyungju; Kim, Jimin; Kang, Joohyun; Lim, Sangmoo; Kim, Kyeongmin [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)
2013-05-15
Surgical resection is the most effective way to preserve liver function, but most treatment of unresectable hepatocellular carcinoma (HCC) is palliative. Yttrium-90 (Y-90) has therefore come into use as a treatment, since it can be delivered directly to tumors and produces greater radiation exposure of the tumors than external radiation. Recently, Y-90 has received much interest and has been studied by many researchers. Y-90 imaging has most commonly been performed with gamma cameras, but PET imaging is required because of their low sensitivity and resolution. The purpose of this study was to assess the statistical characteristics of the data and to improve the count rate of the image, and hence image quality, using a non-parametric bootstrap method. PET data were improved using the non-parametric bootstrap method, as verified by improved uniformity and SNR. Uniformity improved more under low count rate conditions (i.e., Y-90) in the phantom study, and uniformity and SNR improved by 15.6% and 33.8%, respectively, in the mouse study. The bootstrap method applied to the PET data in this study increased the count rate of the PET image, so the acquisition time can consequently be reduced. It is expected to improve diagnostic performance.
Institute of Scientific and Technical Information of China (English)
CHAN Kung-Sik; TONG Howell; STENSETH Nils Chr
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of the continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, the continuous TAR models are additive autoregressive models, being piecewise linear in one lag, and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important insights on the structure of the prey-predator system. We propose to assess the uncertainty in the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block being drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjarvi, Northern Finland.
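The moving block bootstrap that the NBB combines can be sketched as follows; unlike the NBB, this basic version uses a fixed block length rather than random, data-driven block sizes (function name illustrative):

```python
import numpy as np

def moving_block_bootstrap(series, block_len, rng=None):
    """Moving block bootstrap: rebuild a series of the same length by
    concatenating blocks drawn (with replacement) from all overlapping
    length-l blocks of the original series, preserving short-range
    dependence within blocks."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(series)
    n = len(x)
    starts = rng.integers(0, n - block_len + 1,
                          size=int(np.ceil(n / block_len)))
    blocks = [x[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]
```

The block length trades off dependence preservation (longer blocks) against resampling variability (more distinct blocks).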
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC
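For the Clayton copula, the bootstrap side of such a comparison can be sketched via the closed-form link θ = 2τ/(1 − τ) between the Clayton parameter and Kendall's τ. This inversion-of-τ estimator is simpler than the likelihood-based fitting a full study would use, and the function names are illustrative:

```python
import numpy as np

def kendall_tau(u, v):
    """O(n^2) Kendall's tau (tau-a); fine for small samples."""
    n = len(u)
    s = 0.0
    for i in range(n):
        s += np.sum(np.sign(u[i] - u[i + 1:]) * np.sign(v[i] - v[i + 1:]))
    return 2.0 * s / (n * (n - 1))

def clayton_theta_ci(u, v, B=500, alpha=0.05, rng=None):
    """Percentile bootstrap CI for the Clayton parameter: resample pairs
    with replacement, estimate tau on each resample, and map it through
    theta = 2*tau / (1 - tau)."""
    rng = np.random.default_rng(0) if rng is None else rng
    u, v = np.asarray(u), np.asarray(v)
    n = len(u)
    thetas = []
    for _ in range(B):
        idx = rng.integers(0, n, n)
        tau = kendall_tau(u[idx], v[idx])
        thetas.append(2.0 * tau / (1.0 - tau))
    return np.quantile(thetas, [alpha / 2, 1 - alpha / 2])
```

Resampling the (u, v) pairs jointly is what preserves the dependence structure the copula parameter measures.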
Comparative study of re-sampling methods for imbalanced data sets
Institute of Scientific and Technical Information of China (English)
王晓娟; 郭躬德
2011-01-01
Study of imbalanced data sets is currently a hot research topic in data mining. Among the many approaches to imbalanced data sets, re-sampling is a key research direction, and many different re-sampling methods exist. Ten different re-sampling methods are chosen here for a comparative study. The experiments yield some useful conclusions: over-sampling methods perform better than under-sampling methods on different data sets, and over-sampling also outperforms an ensemble of over-sampling and under-sampling.
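Random over-sampling, the simplest representative of the over-sampling family discussed, can be sketched as follows (a generic illustration; the paper's ten specific methods are not reproduced):

```python
import numpy as np

def random_oversample(X, y, rng=None):
    """Random over-sampling: duplicate minority-class rows (sampling
    with replacement) until every class matches the majority-class
    count, balancing the class distribution."""
    rng = np.random.default_rng(0) if rng is None else rng
    X, y = np.asarray(X), np.asarray(y)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    Xs, ys = [X], [y]
    for c, cnt in zip(classes, counts):
        if cnt < n_max:
            idx = np.flatnonzero(y == c)
            extra = rng.choice(idx, n_max - cnt, replace=True)
            Xs.append(X[extra])
            ys.append(y[extra])
    return np.concatenate(Xs), np.concatenate(ys)
```

Under-sampling would instead discard majority-class rows; the abstract's conclusion is that duplicating minority rows tends to work better in their experiments.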
Bee, Marco; Gazzini, Amedeo
2004-01-01
The aim of this paper is to test the profitability of simple technical trading rules in the Italian stock market. By means of a recently developed bootstrap methodology we assess whether technical rules based on moving averages are capable of producing excess returns with respect to the buy-and-hold strategy. We find that in most cases the rules are profitable and the excess return is statistically significant. However, the well-known problem of data-snooping, which seems to be con...
Bootstrapping O(N) vector models with four supercharges in 3 ≤ d ≤ 4
Chester, Shai M.; Iliesiu, Luca V.; Pufu, Silviu S.; Yacoby, Ran
2016-05-01
We analyze the conformal bootstrap constraints in theories with four supercharges and a global O(N) × U(1) flavor symmetry in 3 ≤ d ≤ 4 dimensions. In particular, we consider the 4-point function of O(N)-fundamental chiral operators Z_i that have no chiral primary in the O(N)-singlet sector of their OPE. We find features in our numerical bounds that nearly coincide with the theory of N + 1 chiral superfields with superpotential W = X ∑_{i=1}^{N} Z_i^2, as well as general bounds on SCFTs where ∑_{i=1}^{N} Z_i^2 vanishes in the chiral ring.
Structure Constants and Integrable Bootstrap in Planar N=4 SYM Theory
Basso, Benjamin; Vieira, Pedro
2015-01-01
We introduce a non-perturbative framework for computing structure constants of single-trace operators in the N=4 SYM theory at large N. Our approach features new vertices, with hexagonal shape, that can be patched together into three- and possibly higher-point correlators. These newborn hexagons are more elementary and easier to deal with than the three-point functions. Moreover, they can be entirely constructed using integrability, by means of a suitable bootstrap program. In this letter, we present our main results and conjectures for these vertices, and match their predictions for the three-point functions with both weak and strong coupling data available in the literature.
Progress Toward Steady State Tokamak Operation Exploiting the high bootstrap current fraction regime
Ren, Q.
2015-11-01
Recent DIII-D experiments have advanced the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Fully noninductive plasmas with extremely high values of the poloidal beta, βp ≥ 4, have been sustained at βT ≥ 2% for long durations with excellent energy confinement quality (H98y,2 ≥ 1.5) and internal transport barriers (ITBs) generated at large minor radius (≥ 0.6) in all channels (Te, Ti, ne, VTf). Large bootstrap fraction (fBS ~ 80%) has been obtained with high βp. ITBs have been shown to be compatible with steady state operation. Because of the unusually large ITB radius, normalized pressure is not limited to low βN values by internal ITB-driven modes. βN up to ~4.3 has been obtained by optimizing the plasma-wall distance. The scenario is robust against several variations, including replacing some on-axis with off-axis neutral beam injection (NBI), adding electron cyclotron (EC) heating, and reducing the NBI torque by a factor of 2. This latter observation is particularly promising for extension of the scenario to EAST, where maximum power is obtained with balanced NBI injection, and to a reactor, expected to have low rotation. However, modeling of this regime has provided new challenges to state-of-the-art modeling capabilities: quasilinear models can dramatically underpredict the electron transport, and the Sauter bootstrap current can be insufficient. The analysis shows first-principles NEO is in good agreement with experiments for the bootstrap current calculation, and ETG modes with a larger saturated amplitude or EM modes may provide the missing electron transport. Work supported in part by the US DOE under DE-FC02-04ER54698, DE-AC52-07NA
Bootstrap inference for pre-averaged realized volatility based on non-overlapping returns
DEFF Research Database (Denmark)
Gonçalves, Sílvia; Hounyo, Ulrich; Meddahi, Nour
The main contribution of this paper is to propose bootstrap methods for realized volatility-like estimators defined on pre-averaged returns. In particular, we focus on the pre-averaged realized volatility estimator proposed by Podolskij and Vetter (2009). This statistic can be written (up to a bias correction term) as the (scaled) sum of squared pre-averaged returns, where the pre-averaging is done over all possible non-overlapping blocks of consecutive observations. Pre-averaging reduces the influence of the noise and allows for realized volatility estimation on the pre-averaged returns. The non...
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
Traditional/asymptotic confidence estimation has limited applicability, since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case-study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus...
Directory of Open Access Journals (Sweden)
Ristya Widi Endah Yani
2008-12-01
Full Text Available Background: Bootstrap is a computer-simulation-based method that provides accuracy estimates for inferential statistical parameters. Purpose: This article describes a study using secondary data (n = 30) to illustrate the bootstrap method as an estimator for the linear regression test, implemented in MINITAB 13, SPSS 13, and MacroMINITAB. Methods: The bootstrap regression procedure is as follows: obtain β̂ and Ŷ from OLS (ordinary least squares); compute the residuals ε_i = Y_i − Ŷ_i; choose the number of bootstrap repetitions B; draw n samples with replacement from the ε_i to obtain ε*_i; form Y*_i = Ŷ_i + ε*_i; and compute β̂* from this bootstrap sample. If fewer than B repetitions have been completed, return to the step of drawing n samples with replacement from the ε_i; otherwise, take the bootstrap estimate of β̂ as the average of the β̂* values over the B bootstrap samples. Result: The bootstrap results were similar to the linear regression equation obtained with the OLS method (α = 5%). The resulting regression equation for caries was caries = 1.90 + 2.02 (OHI-S), indicating that each one-unit increase in OHI-S corresponds to a caries increase of 2.02 units. Conclusion: The analysis was conducted with B = 10,500 and 10 iterations.
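The residual-bootstrap steps listed in the Methods above can be sketched as follows. This is a minimal illustration in plain Python (rather than MINITAB/SPSS); the OHI-S-like data are hypothetical, not the study's records:

```python
import random

def ols_simple(x, y):
    # Ordinary least squares for y = b0 + b1*x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
         sum((xi - mx) ** 2 for xi in x)
    b0 = my - b1 * mx
    return b0, b1

def residual_bootstrap(x, y, B=500, seed=0):
    # Steps from the abstract: fit OLS, compute residuals e_i = y_i - yhat_i,
    # resample the residuals with replacement, rebuild y*_i = yhat_i + e*_i,
    # refit, and average the B bootstrap coefficient estimates.
    rng = random.Random(seed)
    b0, b1 = ols_simple(x, y)
    yhat = [b0 + b1 * xi for xi in x]
    resid = [yi - yh for yi, yh in zip(y, yhat)]
    boots = []
    for _ in range(B):
        estar = [rng.choice(resid) for _ in x]
        ystar = [yh + e for yh, e in zip(yhat, estar)]
        boots.append(ols_simple(x, ystar))
    mean_b0 = sum(b[0] for b in boots) / B
    mean_b1 = sum(b[1] for b in boots) / B
    return (b0, b1), (mean_b0, mean_b1)

# Hypothetical OHI-S vs. caries-like data (illustrative only).
x = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
y = [2.9, 4.0, 4.9, 6.1, 6.9, 8.0]
ols, boot = residual_bootstrap(x, y)
```

As in the abstract, the bootstrap coefficient averages should land very close to the plain OLS estimates when the residuals are well behaved.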
Energy Technology Data Exchange (ETDEWEB)
Andrade, Maria Celia Ramos; Ludwig, Gerson Otto [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Lab. Associado de Plasma]. E-mail: mcr@plasma.inpe.br
2004-07-01
Different bootstrap current formulations are implemented in a self-consistent equilibrium calculation obtained from a direct variational technique in fixed boundary tokamak plasmas. The total plasma current profile is supposed to have contributions of the diamagnetic, Pfirsch-Schlueter, and the neoclassical Ohmic and bootstrap currents. The Ohmic component is calculated in terms of the neoclassical conductivity, compared here among different expressions, and the loop voltage determined consistently in order to give the prescribed value of the total plasma current. A comparison among several bootstrap current models for different viscosity coefficient calculations and distinct forms for the Coulomb collision operator is performed for a variety of plasma parameters of the small aspect ratio tokamak ETE (Experimento Tokamak Esferico) at the Associated Plasma Laboratory of INPE, in Brazil. We have performed this comparison for the ETE tokamak so that the differences among all the models reported here, mainly regarding plasma collisionality, can be better illustrated. The dependence of the bootstrap current ratio upon some plasma parameters in the frame of the self-consistent calculation is also analysed. We emphasize in this paper what we call the Hirshman-Sigmar/Shaing model, valid for all collisionality regimes and aspect ratios, and a fitted formulation proposed by Sauter, which has the same range of validity but is faster to compute than the previous one. The advantages or possible limitations of all these different formulations for the bootstrap current estimate are analysed throughout this work. (author)
Douven, Igor; Kelp, Christoph
2013-01-01
According to a much discussed argument, reliabilism is defective for making knowledge too easy to come by. In a recent paper, Weisberg aims to show that this argument relies on a type of reasoning that is rejectable on independent grounds. We argue that the blanket rejection that Weisberg recommends
Directory of Open Access Journals (Sweden)
Gu Xun
2007-03-01
Full Text Available Abstract Background Phylogenetically related miRNAs (miRNA families) convey important information about the function and evolution of miRNAs. Due to the special sequence features of miRNAs, pairwise sequence identity between miRNA precursors alone is often inadequate for unequivocally judging the phylogenetic relationships between miRNAs. Most current methods for miRNA classification rely heavily on manual inspection and lack measurements of the reliability of the results. Results In this study, we designed an analysis pipeline (the Phylogeny-Bootstrap-Cluster (PBC) pipeline) to identify miRNA families based on branch stability in the bootstrap trees derived from overlapping genome-wide miRNA sequence sets. We tested the PBC analysis pipeline with the miRNAs from six animal species: H. sapiens, M. musculus, G. gallus, D. rerio, D. melanogaster, and C. elegans. The resulting classification was compared with the miRNA families defined in miRBase. The two classifications were largely consistent. Conclusion The PBC analysis pipeline is an efficient method for classifying large numbers of heterogeneous miRNA sequences. It requires minimal human involvement and provides measurements of the reliability of the classification results.
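The idea of measuring branch stability by bootstrapping can be sketched generically: resample alignment columns with replacement and count how often a grouping recurs. The sketch below is not the PBC pipeline itself, only the underlying bootstrap-support idea, with tiny made-up "precursor" sequences:

```python
import random

# Toy aligned "precursor" sequences (hypothetical, equal length).
seqs = {
    "mir-a1": "ACGTACGTACGTACGTACGT",
    "mir-a2": "ACGTACGAACGTACGTACGA",
    "mir-b1": "TTTTGGGGCCCCAAAATTTT",
}

def dist(s1, s2, cols):
    # Mismatch fraction over a chosen set of alignment columns.
    return sum(s1[c] != s2[c] for c in cols) / len(cols)

def nearest(name, cols):
    # Nearest neighbour of `name` under the column-restricted distance.
    others = [k for k in seqs if k != name]
    return min(others, key=lambda k: dist(seqs[name], seqs[k], cols))

def branch_support(name, partner, B=500, seed=2):
    # Bootstrap over alignment columns: resample columns with replacement,
    # recompute the nearest neighbour, and count how often the pairing holds.
    rng = random.Random(seed)
    L = len(seqs[name])
    hits = 0
    for _ in range(B):
        cols = [rng.randrange(L) for _ in range(L)]
        if nearest(name, cols) == partner:
            hits += 1
    return hits / B

support = branch_support("mir-a1", "mir-a2")
```

Here the two "mir-a" sequences differ at only two columns, so their pairing survives essentially every column resample, i.e. the support is close to 1.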
The Index of Biological Integrity and the bootstrap revisited: an example from Minnesota streams
Dolph, Christine L.; Sheshukov, Aleksey Y.; Chizinski, Christopher J.; Vondracek, Bruce C.; Wilson, Bruce
2010-01-01
Multimetric indices, such as the Index of Biological Integrity (IBI), are increasingly used by management agencies to determine whether surface water quality is impaired. However, important questions about the variability of these indices have not been thoroughly addressed in the scientific literature. In this study, we used a bootstrap approach to quantify variability associated with fish IBIs developed for streams in two Minnesota river basins. We further placed this variability into a management context by comparing it to impairment thresholds currently used in water quality determinations for Minnesota streams. We found that 95% confidence intervals ranged as high as 40 points for IBIs scored on a 0–100 point scale. However, on average, 90% of IBI scores calculated from bootstrap replicate samples for a given stream site yielded the same impairment status as the original IBI score. We suggest that sampling variability in IBI scores is related to both the number of fish and the number of rare taxa in a field collection. A comparison of the effects of different scoring methods on IBI variability indicates that a continuous scoring method may reduce the amount of bias in IBI scores.
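The bootstrap approach described above, resampling individual fish to put a confidence interval around an index score, can be sketched as follows. The single-metric "index" and the fish sample are hypothetical simplifications, not the Minnesota IBI:

```python
import random

def ibi_like_score(fish):
    # Toy single-metric "index": percentage of individuals belonging to
    # intolerant taxa, scaled to 0-100. A real IBI combines many metrics.
    intolerant = {"sculpin", "darter", "brook_trout"}
    return 100.0 * sum(1 for f in fish if f in intolerant) / len(fish)

def bootstrap_ci(fish, B=2000, alpha=0.05, seed=1):
    # Percentile bootstrap: resample individual fish with replacement,
    # recompute the score, and take the alpha/2 and 1-alpha/2 quantiles.
    rng = random.Random(seed)
    scores = sorted(
        ibi_like_score([rng.choice(fish) for _ in fish]) for _ in range(B)
    )
    lo = scores[int((alpha / 2) * B)]
    hi = scores[int((1 - alpha / 2) * B) - 1]
    return lo, hi

# Hypothetical field collection of 60 fish.
sample = ["sculpin"] * 12 + ["darter"] * 8 + ["carp"] * 25 + ["sucker"] * 15
point = ibi_like_score(sample)
lo, hi = bootstrap_ci(sample)
```

Comparing (lo, hi) against a fixed impairment threshold mirrors the paper's question of whether sampling variability alone could flip a site's impairment status.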
Bootstrapping Mixed Correlators in the Five Dimensional Critical O(N) Models
Li, Zhijin
2016-01-01
We use the conformal bootstrap approach to explore $5D$ CFTs with $O(N)$ global symmetry, which contain $N$ scalars $\\phi_i$ transforming as $O(N)$ vector. Specifically, we study multiple four-point correlators of the leading $O(N)$ vector $\\phi_i$ and the $O(N)$ singlet $\\sigma$. The crossing symmetry of the four-point functions and the unitarity condition provide nontrivial constraints on the scaling dimensions ($\\Delta_\\phi$, $\\Delta_\\sigma$) of $\\phi_i$ and $\\sigma$. With reasonable assumptions on the gaps between scaling dimensions of $\\phi_i$ ($\\sigma$) and the next $O(N)$ vector (singlet) scalar, we are able to isolate the scaling dimensions $(\\Delta_\\phi$, $\\Delta_\\sigma)$ in small islands. In particular, for large $N=500$, the isolated region is highly consistent with the result obtained from large $N$ expansion. We also study the interacting $O(N)$ CFTs for $1\\leqslant N\\leqslant100$. Isolated regions on $(\\Delta_\\phi,\\Delta_\\sigma)$ plane are obtained using conformal bootstrap program with lower or...
Cuyabano, B C D; Su, G; Rosa, G J M; Lund, M S; Gianola, D
2015-10-01
This study compared the accuracy of genome-enabled prediction models using individual single nucleotide polymorphisms (SNP) or haplotype blocks as covariates when using either a single breed or a combined population of Nordic Red cattle. The main objective was to compare predictions of breeding values of complex traits using a combined training population with haplotype blocks, with predictions using a single breed as training population and individual SNP as predictors. To compare the prediction reliabilities, bootstrap samples were taken from the test data set. With the bootstrapped samples of prediction reliabilities, we built and graphed confidence ellipses to allow comparisons. Finally, measures of statistical distances were used to calculate the gain in predictive ability. Our analyses are innovative in the context of assessment of predictive models, allowing a better understanding of prediction reliabilities and providing a statistical basis to effectively calibrate whether one prediction scenario is indeed more accurate than another. An ANOVA indicated that use of haplotype blocks produced significant gains mainly when Bayesian mixture models were used but not when Bayesian BLUP was fitted to the data. Furthermore, when haplotype blocks were used to train prediction models in a combined Nordic Red cattle population, we obtained up to a statistically significant 5.5% average gain in prediction accuracy, over predictions using individual SNP and training the model with a single breed. PMID:26233439
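The bootstrap comparison of prediction reliabilities can be sketched generically: resample test individuals and recompute each model's predictive correlation, then look at the distribution of the differences. The phenotypes and predictions below are hypothetical, not the Nordic cattle data:

```python
import random

def pearson(a, b):
    # Pearson correlation; returns 0.0 for degenerate (zero-variance) input.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    if va == 0 or vb == 0:
        return 0.0
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / (va * vb) ** 0.5

def bootstrap_accuracy_gain(y, pred_a, pred_b, B=1000, seed=3):
    # Resample test individuals with replacement and recompute each model's
    # predictive correlation; the distribution of differences (model B minus
    # model A) shows whether the gain is stable across resamples.
    rng = random.Random(seed)
    n = len(y)
    gains = []
    for _ in range(B):
        s = [rng.randrange(n) for _ in range(n)]
        ys = [y[i] for i in s]
        gains.append(pearson([pred_b[i] for i in s], ys)
                     - pearson([pred_a[i] for i in s], ys))
    return gains

# Hypothetical phenotypes and two prediction sets (illustrative only);
# pred_hap is deliberately closer to y than pred_snp.
y        = [1.0, 2.1, 2.9, 4.2, 5.1, 5.8, 7.2, 8.0, 8.9, 10.1]
pred_snp = [1.5, 1.8, 3.5, 3.9, 5.6, 5.2, 7.9, 7.4, 9.5, 9.3]
pred_hap = [1.1, 2.0, 3.0, 4.0, 5.2, 5.9, 7.1, 8.1, 9.0, 10.0]
gains = bootstrap_accuracy_gain(y, pred_snp, pred_hap)
share_positive = sum(g > 0 for g in gains) / len(gains)
```

A gain that is positive in the vast majority of bootstrap resamples is the kind of evidence the paper formalizes with confidence ellipses and statistical distances.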
Directory of Open Access Journals (Sweden)
Mohammad Reza Marami Milani
2015-09-01
Full Text Available This study analyzes the linear relationship between climate variables and milk components in Iran, applying bootstrapping to include and assess the uncertainty. The climate parameters, Temperature Humidity Index (THI) and Equivalent Temperature Index (ETI), are computed from the NASA Modern-Era Retrospective Analysis for Research and Applications (NASA-MERRA) reanalysis (2002–2010). Milk data for fat, protein (measured on a fresh-matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique, applied to the original time series to generate statistically equivalent surrogate samples. As a result, despite the short time series and the related uncertainties, an interesting pattern in the relationships between milk components and the climate parameters is visible. In spring, only a weak dependency of milk yield on climate variations is apparent, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature and THI. We suggest this methodology for studies of the impacts of climate change on agriculture, environment, and food when only short-term data are available.
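A bootstrap confidence interval for a climate-milk regression slope can be sketched with a pairs (case) bootstrap, which resamples whole observations so the x-y dependence is preserved. The (THI, fat%) pairs below are hypothetical, not the study's records:

```python
import random

def slope(pairs):
    # OLS slope of y on x.
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = sum((x - mx) ** 2 for x, _ in pairs)
    return num / den

def pairs_bootstrap_ci(pairs, B=2000, alpha=0.05, seed=4):
    # Case (pairs) bootstrap: resample whole (THI, fat%) observations with
    # replacement and take percentile bounds of the resampled slopes.
    rng = random.Random(seed)
    slopes = sorted(slope([rng.choice(pairs) for _ in pairs])
                    for _ in range(B))
    return slopes[int((alpha / 2) * B)], slopes[int((1 - alpha / 2) * B) - 1]

# Hypothetical (THI, milk fat %) pairs: fat declines as THI rises.
data = [(55, 4.1), (58, 4.0), (60, 3.9), (63, 3.9), (65, 3.8),
        (68, 3.7), (70, 3.6), (72, 3.6), (75, 3.5), (78, 3.4)]
point = slope(data)
lo, hi = pairs_bootstrap_ci(data)
```

If the whole interval (lo, hi) lies below zero, the negative fat-THI relationship is stable under resampling, which is the kind of uncertainty statement the study builds.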
Impurities in a non-axisymmetric plasma: Transport and effect on bootstrap current
Energy Technology Data Exchange (ETDEWEB)
Mollén, A., E-mail: albertm@chalmers.se [Department of Applied Physics, Chalmers University of Technology, Göteborg (Sweden); Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Landreman, M. [Institute for Research in Electronics and Applied Physics, University of Maryland, College Park, Maryland 20742 (United States); Smith, H. M.; Helander, P. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Braun, S. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); German Aerospace Center, Institute of Engineering Thermodynamics, Pfaffenwaldring 38-40, D-70569 Stuttgart (Germany)
2015-11-15
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21, 042503 (2014)] which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/ν-scaling of the inter-species radial transport coefficient at low collisionality, arising due to the field term in the inter-species collision operator, and which is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We also use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in plasma effective charge Z{sub eff} of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
Application of non-parametric Bootstrap method to gamma spectrum analysis
International Nuclear Information System (INIS)
Background: In gamma spectral measurement, if the sample activity or the detection efficiency of the detector is low, the most common way to reduce the statistical fluctuation of the measurement data is to increase the measurement time and the detector dimensions. Purpose: Considering economic factors as well as matching constraints with other nuclear electronics devices, both the size of the detector and the measurement time are limited. In this case, processing gamma spectrum data with mathematical methods to reduce statistical fluctuations, allowing fast and accurate analysis of radionuclides, has received widespread attention at home and abroad. Methods: The basic principles of the non-parametric bootstrap method were described and applied to laboratory gamma spectrum data processing. The gamma spectra of 241Am, 137Cs, and 60Co were measured over different time periods with NaI(Tl) detectors. Results: The non-parametric bootstrap method was used to process the gamma spectra measured over short times, and the results were compared with gamma spectra measured over long times under the same conditions; the calculated spectra agreed well with the measured spectra. Conclusion: This provides a feasible technique for quickly measuring gamma spectra at low activity levels. (authors)
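The abstract does not spell out the exact resampling scheme, but the core non-parametric bootstrap idea for a counting spectrum can be sketched as follows: treat the channel counts as an empirical distribution of detected events, resample the events with replacement, and use the replicate spectra to estimate per-channel statistical uncertainty. The toy spectrum is hypothetical, not real 137Cs data:

```python
import random

def bootstrap_spectrum_errors(spectrum, B=200, seed=5):
    # Non-parametric bootstrap on a counting spectrum: expand channel counts
    # into a list of events, resample events with replacement B times, and
    # compute the per-channel mean and standard error over the replicates.
    rng = random.Random(seed)
    n_ch = len(spectrum)
    events = [ch for ch, c in enumerate(spectrum) for _ in range(c)]
    total = len(events)
    reps = []
    for _ in range(B):
        counts = [0] * n_ch
        for _ in range(total):
            counts[events[rng.randrange(total)]] += 1
        reps.append(counts)
    mean = [sum(r[ch] for r in reps) / B for ch in range(n_ch)]
    se = [(sum((r[ch] - mean[ch]) ** 2 for r in reps) / (B - 1)) ** 0.5
          for ch in range(n_ch)]
    return mean, se

# Hypothetical short-measurement spectrum: a noisy photopeak near channel 5.
measured = [3, 5, 9, 20, 41, 60, 38, 18, 8, 4]
mean, se = bootstrap_spectrum_errors(measured)
```

Each replicate conserves the total number of events, so the replicate mean tracks the measured spectrum while `se` quantifies the channel-by-channel fluctuation of the short measurement.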
Salleh, Mad Ithnin; Ismail, Shariffah Nur Illiana Syed; Habidin, Nurul Fadly; Latip, Nor Azrin Md; Ishak, Salomawati
2014-12-01
The Malaysian government has put greater attention on enhancing productivity in the Technical and Vocational Education and Training (TVET) sector to increase the development of a skilled workforce by the year 2020. The implementation of the National Higher Education Strategic Plan (NHESP) in 2007 led to changes in the Malaysian polytechnics sector. The study of efficiency and productivity makes it possible to identify scope for improvement, so that institutions can perform in a more efficient and effective manner. This paper aims to assess the efficiency and productivity of the 24 polytechnic main campuses existing as of 2007. It applies bootstrapped Malmquist indices to investigate the effects of the NHESP on technical efficiency and productivity changes in the Malaysian polytechnics individually over 2007-2010. This method enables a more robust analysis of technical efficiency and productivity changes among polytechnics, since the bootstrap simulation is capable of identifying whether or not the computed productivity changes are statistically significant. This paper finds that the overall mean efficiency score demonstrates significant growth. In addition, the sector as a whole underwent positive productivity growth at the frontier during the post-NHESP period, except in 2009-2010. The productivity growth during the post-NHESP period was driven mainly by technological change. The empirical results indicate that during the post-NHESP period all polytechnics showed significant TFP growth. This finding shows the NHESP's contribution to the positive growth in the overall performance of Malaysia's polytechnics sector.
Institute of Scientific and Technical Information of China (English)
CHAN, Kung-Sik; TONG, Howell; STENSETH, Nils Chr.
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, the continuous TAR models are additive autoregressive models, being piecewise linear in one lag and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important insights on the structure of the prey-predator system. We propose to assess the uncertainty on the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block being drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjärvi, Northern Finland.
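The NBB builds on the moving block bootstrap, which can be sketched as follows. This is the plain fixed-block-length version only; the NBB additionally draws random block sizes and conditions each block on the realized bootstrap past. The abundance series is hypothetical:

```python
import random

def moving_block_bootstrap(series, block_len=3, seed=6):
    # Plain moving-block bootstrap: draw overlapping blocks of fixed length
    # uniformly from the series and concatenate them until the original
    # length is reached, preserving short-range serial dependence.
    rng = random.Random(seed)
    n = len(series)
    starts = n - block_len + 1
    out = []
    while len(out) < n:
        s = rng.randrange(starts)
        out.extend(series[s:s + block_len])
    return out[:n]

# Toy yearly "rodent abundance" series (illustrative only).
abundance = [5, 9, 14, 7, 3, 6, 12, 15, 8, 4, 5, 10]
resampled = moving_block_bootstrap(abundance)
```

Resampling blocks rather than single observations is what lets this family of bootstraps respect the autocorrelation that a TAR model is meant to capture.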
Braun, M
1995-01-01
The bootstrap condition is generalized to n reggeized gluons. As a result it is demonstrated that the intercept generated by n reggeized gluons cannot be lower than the one for n = 2. Arguments are presented that in the limit $N_c \rightarrow \infty$ the bootstrap condition reduces the n-gluon chain with interacting neighbours to a single BFKL pomeron. In this limit the leading contribution from n gluons corresponds to n/2 non-interacting BFKL pomerons (the n/2-pomeron cut). The sum over n leads to a unitary $\gamma^{\ast}\gamma$ amplitude of the eikonal form.