Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice between parametric and nonparametric analysis, especially with non-normal data. Some methodologists have questioned the validity of parametric tests in these instances and suggested nonparametric tests; in contrast, other methodologists have found nonparametric tests too conservative and less powerful and thus preferred parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has a small-sample limitation. We used a pooled resampling method in the nonparametric bootstrap test that may overcome the problems small samples pose for hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling against the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except the Cauchy and extreme variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also performed better than the alternatives, and the nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance results for non-normal data in small-sample studies. Copyright © 2017 John Wiley & Sons, Ltd.
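The pooled-resampling idea described above can be sketched in a few lines: under the null hypothesis both groups come from one distribution, so both bootstrap samples are drawn with replacement from the pooled data, and the observed t statistic is compared against the resampled null distribution. This is an illustrative sketch, not the authors' implementation, and the measurements are hypothetical.

```python
import random
import statistics

def pooled_bootstrap_t_test(x, y, n_boot=2000, seed=0):
    """Two-sided bootstrap t-test with pooled resampling: under H0 both
    groups share one distribution, so both bootstrap samples are drawn
    with replacement from the pooled data."""
    rng = random.Random(seed)

    def t_stat(a, b):
        # Welch-type t statistic
        se = (statistics.variance(a) / len(a)
              + statistics.variance(b) / len(b)) ** 0.5
        if se == 0.0:
            return 0.0  # degenerate resample; count as non-extreme
        return (statistics.mean(a) - statistics.mean(b)) / se

    t_obs = t_stat(x, y)
    pooled = list(x) + list(y)
    exceed = sum(
        1 for _ in range(n_boot)
        if abs(t_stat([rng.choice(pooled) for _ in x],
                      [rng.choice(pooled) for _ in y])) >= abs(t_obs)
    )
    return t_obs, exceed / n_boot

# hypothetical measurements for two small groups
x = [5.1, 4.8, 6.0, 5.5, 5.9]
y = [3.2, 2.9, 3.8, 3.5, 3.1]
t_obs, p_value = pooled_bootstrap_t_test(x, y)
```

The p-value is simply the fraction of pooled resamples whose |t| is at least as extreme as the observed one, so no distributional assumption is needed.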
International Nuclear Information System (INIS)
Iwi, G.; Millard, R.K.; Palmer, A.M.; Preece, A.W.; Saunders, M.
1999-01-01
Bootstrap resampling provides a versatile and reliable statistical method for estimating the accuracy of quantities which are calculated from experimental data. It is an empirically based method, in which large numbers of simulated datasets are generated by computer from existing measurements, so that approximate confidence intervals of the derived quantities may be obtained by direct numerical evaluation. A simple introduction to the method is given via a detailed example of estimating 95% confidence intervals for cumulated activity in the thyroid following injection of 99mTc-sodium pertechnetate, using activity-time data from 23 subjects. The application of the approach to estimating confidence limits for the self-dose to the kidney following injection of the 99mTc-DTPA organ imaging agent, based on uptake data from 19 subjects, is also illustrated. Results are then given for estimates of doses to the foetus following administration of 99mTc-sodium pertechnetate for clinical reasons during pregnancy, averaged over 25 subjects. The bootstrap method is well suited for applications in radiation dosimetry including uncertainty, reliability and sensitivity analysis of dose coefficients in biokinetic models, but it can also be applied in a wide range of other biomedical situations. (author)
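The procedure the abstract describes (generate many simulated datasets by resampling the measurements, then read approximate confidence limits directly off the resulting empirical distribution) can be sketched as follows. The activity values below are hypothetical stand-ins, not the study's data.

```python
import random
import statistics

def bootstrap_percentile_ci(data, stat, n_boot=5000, alpha=0.05, seed=1):
    """(1 - alpha) percentile-bootstrap confidence interval for stat(data)."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# hypothetical per-subject cumulated activities (arbitrary units)
activities = [1.8, 2.1, 2.4, 1.9, 2.6, 2.2, 2.0, 2.3, 1.7, 2.5]
lo, hi = bootstrap_percentile_ci(activities, statistics.mean)
```

Because the interval is read off the sorted bootstrap replicates, it needs no normality assumption and works for any smooth statistic, not just the mean.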
A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING
Temel, Tugrul T.
2001-01-01
This paper adapts an existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test makes it possible to approximate the errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica code for the test algorithm.
Genetic divergence among cupuaçu accessions by multiscale bootstrap resampling
Directory of Open Access Journals (Sweden)
Vinicius Silva dos Santos
2015-06-01
This study aimed at investigating the genetic divergence of eighteen accessions of cupuaçu trees based on fruit morphometric traits, and at comparing usual methods of cluster analysis with the proposed multiscale bootstrap resampling methodology. The data were obtained from an experiment conducted in Tomé-Açu city (PA, Brazil), arranged in a completely randomized design with eighteen cupuaçu accessions and 10 repetitions, from 2004 to 2011. Genetic parameters were estimated by the restricted maximum likelihood/best linear unbiased prediction (REML/BLUP) methodology. The predicted breeding values were used in the study of genetic divergence through Unweighted Pair Group Method with Arithmetic Mean (UPGMA) hierarchical clustering and Tocher's optimization method based on the standardized Euclidean distance. Clustering consistency and the optimal number of clusters in the UPGMA method were verified by the cophenetic correlation coefficient (CCC) and Mojena's criterion, respectively, besides the multiscale bootstrap resampling technique. The UPGMA clustering with and without multiscale bootstrap resulted in four and five clusters, respectively, while Tocher's method resulted in seven clusters. The multiscale bootstrap resampling technique proved efficient for assessing the consistency of clustering in hierarchical methods and, consequently, the optimal number of clusters.
The bootstrap and Bayesian bootstrap method in assessing bioequivalence
International Nuclear Information System (INIS)
Wan Jianping; Zhang Kongsheng; Chen Hui
2009-01-01
The parametric method for assessing individual bioequivalence (IBE) relies on the assumption that the PK responses are normally distributed; the bootstrap provides a nonparametric alternative for evaluating IBE. In 2001, the United States Food and Drug Administration (FDA) proposed a draft guidance. The purpose of this article is to evaluate the IBE between a test drug and a reference drug by the bootstrap and Bayesian bootstrap methods. We study the power of the bootstrap test procedures and of the parametric test procedures in FDA (2001), and find that the Bayesian bootstrap method performs best.
Comparison of parametric and bootstrap method in bioequivalence test.
Ahn, Byung-Jin; Yim, Dong-Seok
2009-10-01
The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods, we performed repeated estimation on bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver. 3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets differed slightly from the nonparametric 90% CIs obtained from BE tests on the resampled datasets. Histograms and density curves of formulation effects obtained from the resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow a Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80-125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.
Assessment of resampling methods for causality testing: A note on the US inflation behavior.
Papana, Angeliki; Kyrtsou, Catherine; Kugiumtzis, Dimitris; Diks, Cees
2017-01-01
Different resampling methods for the null hypothesis of no Granger causality are assessed in the setting of multivariate time series, taking into account that the driving-response coupling is conditioned on the other observed variables. As appropriate test statistic for this setting, the partial transfer entropy (PTE), an information and model-free measure, is used. Two resampling techniques, time-shifted surrogates and the stationary bootstrap, are combined with three independence settings (giving a total of six resampling methods), all approximating the null hypothesis of no Granger causality. In these three settings, the level of dependence is changed, while the conditioning variables remain intact. The empirical null distribution of the PTE, as the surrogate and bootstrapped time series become more independent, is examined along with the size and power of the respective tests. Additionally, we consider a seventh resampling method by contemporaneously resampling the driving and the response time series using the stationary bootstrap. Although this case does not comply with the no causality hypothesis, one can obtain an accurate sampling distribution for the mean of the test statistic since its value is zero under H0. Results indicate that as the resampling setting gets more independent, the test becomes more conservative. Finally, we conclude with a real application. More specifically, we investigate the causal links among the growth rates for the US CPI, money supply and crude oil. Based on the PTE and the seven resampling methods, we consistently find that changes in crude oil cause inflation conditioning on money supply in the post-1986 period. However, this relationship cannot be explained on the basis of traditional cost-push mechanisms.
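Of the resampling schemes compared above, the stationary bootstrap is the easiest to sketch: blocks of geometrically distributed length are pasted together, so the resample preserves short-range dependence on average. This is a minimal illustration of the scheme, not the paper's code, and the mean block length is an assumed tuning parameter.

```python
import random

def stationary_bootstrap(series, mean_block_len=5.0, seed=2):
    """One stationary-bootstrap resample: blocks of geometrically
    distributed length (mean mean_block_len) are pasted together,
    wrapping around the end of the series."""
    rng = random.Random(seed)
    n = len(series)
    p = 1.0 / mean_block_len      # probability of starting a new block
    idx = rng.randrange(n)
    out = []
    for _ in range(n):
        out.append(series[idx])
        # continue the current block, or jump to a new random start
        idx = rng.randrange(n) if rng.random() < p else (idx + 1) % n
    return out

resampled = stationary_bootstrap(list(range(100)))
```

Repeating this many times and recomputing the test statistic on each resample yields the empirical null distribution against which the observed statistic is compared.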
Winter Holts Oscillatory Method: A New Method of Resampling in Time Series.
Directory of Open Access Journals (Sweden)
Muhammad Imtiaz Subhani
2016-12-01
The core proposition behind this research is to create innovative methods of bootstrapping that can be applied to time series data. In order to find new methods of bootstrapping, various methods were reviewed. Data on automotive sales, market shares and net exports of the top 10 countries, which include China, Europe, the United States of America (USA), Japan, Germany, South Korea, India, Mexico, Brazil, Spain and Canada, from 2002 to 2014, were collected through various sources, including UN Comtrade, Index Mundi and the World Bank. The findings of this paper confirm that bootstrapping for resampling through Winters forecasting by the oscillation and average methods gives more robust results than Winters forecasting by any general method.
On a linear method in bootstrap confidence intervals
Directory of Open Access Journals (Sweden)
Andrea Pallini
2007-10-01
A linear method for the construction of asymptotic bootstrap confidence intervals is proposed. We approximate asymptotically pivotal and non-pivotal quantities, which are smooth functions of means of n independent and identically distributed random variables, by using a sum of n independent smooth functions of the same analytical form. Errors are of order Op(n^(-3/2)) and Op(n^(-2)), respectively. The linear method allows a straightforward approximation of bootstrap cumulants, by considering the set of n independent smooth functions as an original random sample to be resampled with replacement.
Orthogonal projections and bootstrap resampling procedures in the study of infraspecific variation
Directory of Open Access Journals (Sweden)
Luiza Carla Duarte
1998-12-01
The effect of an increase in quantitative continuous characters resulting from indeterminate growth upon the analysis of population differentiation was investigated using, as an example, a set of continuous characters measured as distance variables in 10 populations of a rodent species. The data, before and after correction for allometric size effects using orthogonal projections, were analyzed with a parametric bootstrap resampling procedure applied to canonical variate analysis. The variance component of the distance measures attributable to indeterminate growth within the populations was found to be substantial, although the ordination of the populations was not affected, as evidenced by the relative and absolute positions of the centroids. The covariance pattern of the distance variables used to infer the nature of the morphological differences was strongly influenced by indeterminate growth. The uncorrected data produced a misleading picture of morphological differentiation by indicating that groups of populations differed in size. However, the data corrected for allometric effects clearly demonstrated that populations differed morphologically both in size and shape. These results are discussed in terms of the analysis of morphological differentiation among populations and the definition of infraspecific geographic units.
Chuan, Zun Liang; Ismail, Noriszura; Shinyie, Wendy Ling; Lit Ken, Tan; Fam, Soo-Fen; Senawi, Azlyna; Yusoff, Wan Nur Syahidah Wan
2018-04-01
Due to the limited historical precipitation records, agglomerative hierarchical clustering algorithms are widely used to extrapolate information from gauged to ungauged precipitation catchments, yielding a more reliable projection of extreme hydro-meteorological events such as extreme precipitation events. However, accurately identifying the optimum number of homogeneous precipitation catchments based on the dendrogram produced by agglomerative hierarchical algorithms is very subjective. The main objective of this study is to propose an efficient regionalized algorithm to identify the homogeneous precipitation catchments for non-stationary precipitation time series. The homogeneous precipitation catchments are identified using the average-linkage hierarchical clustering algorithm combined with multiscale bootstrap resampling, with the uncentered correlation coefficient as the similarity measure. The regionalized homogeneous precipitation is consolidated using the K-sample Anderson-Darling nonparametric test. The analysis shows that the proposed regionalized algorithm performs better than the agglomerative hierarchical clustering algorithms proposed in previous studies.
Resampling methods in Microsoft Excel® for estimating reference intervals.
Theodorsson, Elvar
2015-01-01
Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel®, in version 2010 or later, includes built-in functions that lend themselves well to this purpose, including recommended interpolation procedures for estimating the 2.5th and 97.5th percentiles. The purpose of this paper is to introduce the reader to resampling estimation techniques in general, and to using Microsoft Excel® 2010 for estimating reference intervals in particular. Parametric methods are preferable to resampling methods when the distribution of observations in the reference sample is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be drawn from the results of measurement of the reference samples.
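A minimal resampling estimate of a 2.5-97.5 percentile reference interval, in the spirit of the spreadsheet procedure described above, might look like the following. The reference values and the linear-interpolation percentile rule are illustrative assumptions; the paper's Excel formulas are not reproduced here.

```python
import random

def percentile(sorted_v, q):
    """Linear-interpolation percentile of an already sorted list."""
    pos = q * (len(sorted_v) - 1)
    i = int(pos)
    frac = pos - i
    if i + 1 < len(sorted_v):
        return sorted_v[i] * (1.0 - frac) + sorted_v[i + 1] * frac
    return sorted_v[i]

def bootstrap_reference_interval(values, n_boot=1000, seed=3):
    """2.5-97.5 percentile reference interval, averaged over
    bootstrap resamples of the reference values."""
    rng = random.Random(seed)
    lows, highs = [], []
    for _ in range(n_boot):
        res = sorted(rng.choice(values) for _ in values)
        lows.append(percentile(res, 0.025))
        highs.append(percentile(res, 0.975))
    return sum(lows) / n_boot, sum(highs) / n_boot

# hypothetical reference sample of 40 measurements
values = [4.0 + 0.1 * (i % 10) + 0.01 * i for i in range(40)]
low_limit, high_limit = bootstrap_reference_interval(values)
```

Averaging the percentile estimates over resamples stabilizes the limits when, as here, only about 40 reference values are available.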
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among the several methods available to generate spatially correlated resamples, we selected one based on the LU decomposition and used several examples to illustrate the approach. The first is a synthetic, isotropic, exhaustive sample following a normal distribution; the second is also synthetic but follows a non-Gaussian random field; and the third is an empirical sample consisting of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, the distributions of estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from the bootstrap. © 2010 International Association for Mathematical Geosciences.
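The LU-decomposition approach mentioned above can be sketched as follows: factor the covariance matrix implied by a covariance model and multiply the factor by independent standard normal deviates to obtain one spatially correlated resample. The exponential covariance model, its range, and the five collinear points are illustrative assumptions, not the paper's examples.

```python
import math
import random

def cholesky(a):
    """Lower-triangular factor L with L L^T = a (a positive definite)."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

def correlated_resample(L, rng):
    """One spatially correlated Gaussian resample: multiply the
    triangular factor L by independent standard normal deviates."""
    eps = [rng.gauss(0.0, 1.0) for _ in L]
    return [sum(L[i][k] * eps[k] for k in range(i + 1))
            for i in range(len(L))]

# assumed exponential covariance between 5 points on a line, range 3
n = 5
cov = [[math.exp(-abs(i - j) / 3.0) for j in range(n)] for i in range(n)]
L = cholesky(cov)
rng = random.Random(8)
sample = correlated_resample(L, rng)
```

Each call produces one resample honoring the assumed spatial covariance; repeating the call and re-estimating the semivariogram each time gives the bootstrap distribution of the semivariogram estimates.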
Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method
Directory of Open Access Journals (Sweden)
Yi-Ming Hu
2013-01-01
The hydrological frequency analysis (HFA) is the foundation for hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series of the population distribution is extremely important for the estimation reliability of the hydrological design value or quantile. However, for most hydrological extreme data obtained in practical applications, the sample size is usually small, for example, about 40-50 years in China. Generally, samples with small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on the bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. By the bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of the design value is constructed; based on the sampling distribution, the uncertainty of quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimate of a design value but also a quantitative evaluation of the uncertainties of the estimate.
Pavlov, Igor Y.; Wilson, Andrew R.; Delgado, Julio C.
2010-01-01
Reference intervals (RI) play a key role in the clinical interpretation of laboratory test results. Numerous articles are devoted to analyzing and discussing various methods of RI determination. The two most widely used approaches are the parametric method, which assumes data normality, and a nonparametric, rank-based procedure. The decision about which method to use is usually made arbitrarily. The goal of this study was to demonstrate that using a resampling approach for the comparison of RI determination techniques could help researchers select the right procedure. Three methods of RI calculation—parametric, transformed parametric, and quantile-based bootstrapping—were applied to multiple random samples drawn from 81 values of complement factor B observations and from a computer-simulated normally distributed population. It was shown that differences in RI between legitimate methods could be 20% or even more. The transformed parametric method was found to be the best method for the calculation of RI of non-normally distributed factor B estimations, producing an unbiased RI and the lowest confidence limits and interquartile ranges. For a simulated Gaussian population, parametric calculations, as expected, were the best; quantile-based bootstrapping produced biased results at low sample sizes, and the transformed parametric method generated heavily biased RI. The resampling approach could help compare different RI calculation methods. An algorithm showing a resampling procedure for choosing the appropriate method for RI calculations is included. PMID:20554803
Green, Michael; Ohlsson, Mattias
2007-01-01
Estimation of the generalization performance for classification within the medical applications domain is always an important task. In this study we focus on artificial neural network ensembles as the machine learning technique. We present a numerical comparison between five common resampling techniques: k-fold cross-validation (CV), holdout using three cutoffs, and bootstrap, evaluated on five different data sets. The results show that CV together with holdout 0.25 and 0.50 are the best resampling…
Confidence Limits for the Indirect Effect: Distribution of the Product and Resampling Methods
MacKinnon, David P.; Lockwood, Chondra M.; Williams, Jason
2010-01-01
The most commonly used method to test an indirect effect is to divide the estimate of the indirect effect by its standard error and compare the resulting z statistic with a critical value from the standard normal distribution. Confidence limits for the indirect effect are also typically based on critical values from the standard normal distribution. This article uses a simulation study to demonstrate that confidence limits are imbalanced because the distribution of the indirect effect is normal only in special cases. Two alternatives for improving the performance of confidence limits for the indirect effect are evaluated: (a) a method based on the distribution of the product of two normal random variables, and (b) resampling methods. In Study 1, confidence limits based on the distribution of the product are more accurate than methods based on an assumed normal distribution but confidence limits are still imbalanced. Study 2 demonstrates that more accurate confidence limits are obtained using resampling methods, with the bias-corrected bootstrap the best method overall. PMID:20157642
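The resampling approach evaluated in Study 2 can be sketched as a percentile bootstrap of the product of two regression slopes. For brevity this sketch uses the simple slope of Y on M for b, whereas a full mediation model adjusts b for X; the data are simulated with assumed effect sizes, not the study's.

```python
import random

def slope(x, y):
    """OLS slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def indirect_effect_ci(x, m, y, n_boot=2000, alpha=0.05, seed=4):
    """Percentile-bootstrap CI for the indirect effect a*b, where
    a = slope of m on x and b = slope of y on m (simple slopes only)."""
    rng = random.Random(seed)
    n = len(x)
    reps = []
    for _ in range(n_boot):
        s = [rng.randrange(n) for _ in range(n)]  # resample cases
        bx = [x[i] for i in s]
        bm = [m[i] for i in s]
        by = [y[i] for i in s]
        reps.append(slope(bx, bm) * slope(bm, by))
    reps.sort()
    return (reps[int(alpha / 2 * n_boot)],
            reps[int((1 - alpha / 2) * n_boot) - 1])

# simulated mediation data: x -> m -> y (hypothetical effect sizes)
gen = random.Random(0)
x = [0.4 * i for i in range(25)]
m = [0.5 * xi + gen.gauss(0.0, 0.5) for xi in x]
y = [0.7 * mi + gen.gauss(0.0, 0.5) for mi in m]
lo, hi = indirect_effect_ci(x, m, y)
```

Because the bootstrap distribution of a*b is allowed to be skewed, the resulting limits need not be symmetric around the point estimate, which is exactly the imbalance the article documents for normal-theory limits.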
Bootstrapping the economy -- a non-parametric method of generating consistent future scenarios
Müller, Ulrich A; Bürgi, Roland; Dacorogna, Michel M
2004-01-01
The fortune and the risk of a business venture depend on the future course of the economy. There is a strong demand for economic forecasts and scenarios that can be applied to planning and modeling. While there is an ongoing debate on modeling economic scenarios, the bootstrapping (or resampling) approach presented here has several advantages. As a non-parametric method, it directly relies on past market behavior rather than on debatable assumptions about models and parameters. Simultaneous dep…
Sediment Curve Uncertainty Estimation Using GLUE and Bootstrap Methods
Directory of Open Access Journals (Sweden)
aboalhasan fathabadi
2017-02-01
Introduction: In order to implement watershed practices that decrease soil erosion effects, the sediment output of the watershed needs to be estimated. The sediment rating curve is the most conventional tool used to estimate sediment. Owing to sampling errors and short records, there are uncertainties in estimating sediment using sediment rating curves. In this research, the bootstrap and the Generalized Likelihood Uncertainty Estimation (GLUE) resampling techniques were used to calculate suspended sediment loads from sediment rating curves. Materials and Methods: The total drainage area of the Sefidrood watershed is about 560000 km2. In this study, uncertainty in suspended sediment rating curves was estimated at four stations, namely Motorkhane, Miyane Tonel Shomare 7, Stor and Glinak, constructed on the Ayghdamosh, Ghrangho, GhezelOzan and Shahrod rivers, respectively. Data were randomly divided into a training set (80 percent) and a test set (20 percent) by Latin hypercube random sampling. Different suspended sediment rating curve equations were fitted to log-transformed values of sediment concentration and discharge, and the best-fit models were selected based on the lowest root mean square error (RMSE) and the highest coefficient of determination (R2). In the GLUE methodology, different parameter sets were sampled randomly from a prior probability distribution. For each station, using the sampled parameter sets and the selected suspended sediment rating curve equation, suspended sediment concentration values were estimated several times (100000 to 400000 times). With respect to the likelihood function and a certain subjective threshold, parameter sets were divided into behavioral and non-behavioral sets. Finally, using the behavioral parameter sets, the 95% confidence intervals for suspended sediment concentration due to parameter uncertainty were estimated. In the bootstrap methodology, observed suspended sediment and discharge vectors were resampled with replacement B (set to…
A bootstrapping method for development of Treebank
Zarei, F.; Basirat, A.; Faili, H.; Mirain, M.
2017-01-01
Using statistical approaches alongside the traditional methods of natural language processing (NLP) can significantly improve both the quality and the performance of several NLP tasks. The effective use of these approaches depends on the availability of informative, accurate and detailed corpora on which the learners are trained. This article introduces a bootstrapping method for developing annotated corpora based on a complex and linguistically rich elementary structure called a supertag. To this end, a hybrid method for supertagging is proposed that combines the generative and discriminative methods of supertagging. The method was applied to a subset of the Wall Street Journal (WSJ) corpus in order to annotate its sentences with a set of linguistically motivated elementary structures of the English XTAG grammar, which uses a lexicalised tree-adjoining grammar formalism. The empirical results confirm that the bootstrapping method provides a satisfactory way of annotating English sentences with the mentioned structures. The experiments show that the method can automatically annotate about 20% of the WSJ with an F-measure of about 80%, which is 12% higher than the F-measure of the XTAG Treebank automatically generated by the approach proposed by Basirat and Faili [(2013). Bridge the gap between statistical and hand-crafted grammars. Computer Speech and Language, 27, 1085-1104].
Statistical error estimation of the Feynman-α method using the bootstrap method
International Nuclear Information System (INIS)
Endo, Tomohiro; Yamamoto, Akio; Yagi, Takahiro; Pyeon, Cheol Ho
2016-01-01
The applicability of the bootstrap method is investigated for estimating the statistical error of the Feynman-α method, which is one of the subcritical measurement techniques based on reactor noise analysis. In the Feynman-α method, the statistical error can be simply estimated from multiple measurements of reactor noise; however, this requires additional measurement time to repeat the measurements. Using a resampling technique called the 'bootstrap method', the standard deviation and confidence interval of measurement results obtained by the Feynman-α method can be estimated as the statistical error, using only a single measurement of reactor noise. In order to validate our proposed technique, we carried out a passive measurement of reactor noise without any external source, i.e. with only the inherent neutron source from spontaneous fission and (α,n) reactions in nuclear fuels, at the Kyoto University Criticality Assembly. Through the actual measurement, it is confirmed that the bootstrap method is applicable for approximately estimating the statistical error of measurement results obtained by the Feynman-α method. (author)
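The idea of estimating the statistical error from a single measurement can be sketched as follows: treat the series of gate counts as the sample, resample it with replacement, and take the spread of the recomputed Feynman Y values as the error estimate. This simple i.i.d. resampling ignores gate-to-gate correlation in real reactor noise, so it only illustrates the principle; the counts are hypothetical.

```python
import random
import statistics

def feynman_y(counts):
    """Feynman Y: variance-to-mean ratio of gate counts, minus one."""
    return statistics.variance(counts) / statistics.mean(counts) - 1.0

def bootstrap_std(counts, stat, n_boot=1000, seed=5):
    """Bootstrap estimate of the statistical error (standard deviation)
    of stat(counts) from a single series of gate counts."""
    rng = random.Random(seed)
    reps = [stat([rng.choice(counts) for _ in counts])
            for _ in range(n_boot)]
    return statistics.stdev(reps)

# hypothetical counts registered in successive time gates
counts = [11, 14, 9, 13, 12, 10, 15, 12, 11, 13,
          14, 10, 12, 13, 11, 12, 14, 9, 13, 12]
y_value = feynman_y(counts)
y_error = bootstrap_std(counts, feynman_y)
```

The appeal, as in the paper, is that no repeated measurement is needed: the error estimate comes entirely from resampling the single recorded series.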
The Local Fractional Bootstrap
DEFF Research Database (Denmark)
Bennedsen, Mikkel; Hounyo, Ulrich; Lunde, Asger
We introduce a bootstrap procedure for high-frequency statistics of Brownian semistationary processes. More specifically, we focus on a hypothesis test on the roughness of sample paths of Brownian semistationary processes, which uses an estimator based on a ratio of realized power variations. Our new resampling method, the local fractional bootstrap, relies on simulating an auxiliary fractional Brownian motion that mimics the fine properties of high-frequency differences of the Brownian semistationary process under the null hypothesis. We prove the first-order validity of the bootstrap method, and in simulations we observe that the bootstrap-based hypothesis test provides considerable finite-sample improvements over an existing test that is based on a central limit theorem. This is important when studying the roughness properties of time series data; we illustrate this by applying the bootstrap method…
A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
Liang, Faming; Cheng, Yichen; Song, Qifan; Park, Jincheol; Yang, Ping
2013-01-01
large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate
The wild tapered block bootstrap
DEFF Research Database (Denmark)
Hounyo, Ulrich
In this paper, a new resampling procedure, called the wild tapered block bootstrap, is introduced as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on dependent heterogeneous data. The method consists in tapering each overlapping block...... of the series first, then applying the standard wild bootstrap for independent and heteroscedastically distributed observations to overlapping tapered blocks in an appropriate way. It preserves the favorable bias and mean squared error properties of the tapered block bootstrap, which is the state-of-the-art block......-order asymptotic validity of the tapered block bootstrap as well as the wild tapered block bootstrap approximation to the actual distribution of the sample mean is also established when data are assumed to satisfy a near-epoch dependent condition. The consistency of the bootstrap variance estimator for the sample...
Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'.
de Nijs, Robin
2015-07-21
In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed by a direct numerical simulation in Matlab. Not only Poisson resampling, but also two direct redrawing methods were investigated. Redrawing methods were based on a Poisson and a Gaussian distribution. Mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all methods, and compared to the theoretical values for a Poisson distribution. Statistical parameters showed the same behavior as in the original note and showed the superiority of the Poisson resampling method. Rounding off before saving of the half count image had a severe impact on counting statistics for counts below 100. Only Poisson resampling was not affected by this, while Gaussian redrawing was less affected by it than Poisson redrawing. Poisson resampling is the method of choice, when simulating half-count (or less) images from full-count images. It simulates correctly the statistical properties, also in the case of rounding off of the images.
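A minimal numerical sketch of the three approaches compared in this comment, using a synthetic image in place of the Co-57 flood-source data (the image size and count level are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
full = rng.poisson(50.0, size=(64, 64))  # synthetic full-count image

# Poisson resampling (binomial thinning): each recorded count survives with p = 1/2
half_thin = rng.binomial(full, 0.5)

# Direct redrawing alternatives: redraw each pixel from a Poisson or a Gaussian
half_pois = rng.poisson(full / 2.0)
half_gauss = np.clip(np.rint(rng.normal(full / 2.0, np.sqrt(full / 2.0))), 0, None)

ratios = [img.mean() / full.mean() for img in (half_thin, half_pois, half_gauss)]
```

Only the thinned image is guaranteed pixel-by-pixel consistent with the acquired counts (half_thin never exceeds full), one way to see why thinning reproduces the counting statistics of a genuinely shorter acquisition.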
Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'
DEFF Research Database (Denmark)
de Nijs, Robin
2015-01-01
In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed...... by a direct numerical simulation in Matlab. Not only Poisson resampling, but also two direct redrawing methods were investigated. Redrawing methods were based on a Poisson and a Gaussian distribution. Mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all...... methods, and compared to the theoretical values for a Poisson distribution. Statistical parameters showed the same behavior as in the original note and showed the superiority of the Poisson resampling method. Rounding off before saving of the half count image had a severe impact on counting statistics...
Resampling: An optimization method for inverse planning in robotic radiosurgery
International Nuclear Information System (INIS)
Schweikard, Achim; Schlaefer, Alexander; Adler, John R. Jr.
2006-01-01
By design, the range of beam directions in conventional radiosurgery is constrained to an isocentric array. However, the recent introduction of robotic radiosurgery dramatically increases the flexibility of targeting, and as a consequence, beams need be neither coplanar nor isocentric. Such a nonisocentric design permits a large number of distinct beam directions to be used in one single treatment. These major technical differences provide an opportunity to improve upon the well-established principles for treatment planning used with GammaKnife or LINAC radiosurgery. With this objective in mind, our group has developed over the past decade an inverse planning tool for robotic radiosurgery. This system first computes a set of beam directions, and then during an optimization step, weights each individual beam. Optimization begins with a feasibility query, the answer to which is derived through linear programming. This approach offers the advantage of completeness and avoids local optima. Final beam selection is based on heuristics. In this report we present and evaluate a new strategy for utilizing the advantages of linear programming to improve beam selection. Starting from an initial solution, a heuristically determined set of beams is added to the optimization problem, while beams with zero weight are removed. This process is repeated to sample a set of beams much larger than in typical optimization. Experimental results indicate that the planning approach efficiently finds acceptable plans and that resampling can further improve its efficiency.
Resampling Methods Improve the Predictive Power of Modeling in Class-Imbalanced Datasets
Directory of Open Access Journals (Sweden)
Paul H. Lee
2014-09-01
In the medical field, many outcome variables are dichotomized, and the two possible values of a dichotomized variable are referred to as classes. A dichotomized dataset is class-imbalanced if it consists mostly of one class, and the performance of common classification models on this type of dataset tends to be suboptimal. To tackle such a problem, resampling methods, including oversampling and undersampling, can be used. This paper aims at illustrating the effect of resampling methods using the National Health and Nutrition Examination Survey (NHANES) wave 2009–2010 dataset. A total of 4677 participants aged ≥20 without self-reported diabetes and with valid blood test results were analyzed. The Classification and Regression Tree (CART) procedure was used to build a classification model on undiagnosed diabetes, where a participant demonstrated evidence of diabetes according to WHO diabetes criteria. Exposure variables included demographics and socio-economic status. CART models were fitted using a randomly selected 70% of the data (training dataset), and the area under the receiver operating characteristic curve (AUC) was computed using the remaining 30% of the sample for evaluation (testing dataset). CART models were fitted using the training dataset, the oversampled training dataset, the weighted training dataset, and the undersampled training dataset. In addition, resampling case-to-control ratios of 1:1, 1:2, and 1:4 were examined. The effect of resampling methods on the performance of other extensions of CART (random forests and generalized boosted trees) was also examined. CARTs fitted on the oversampled (AUC = 0.70) and undersampled training data (AUC = 0.74) yielded a better classification power than that on the training data (AUC = 0.65). Resampling could also improve the classification power of random forests and generalized boosted trees. To conclude, applying resampling methods in a class-imbalanced dataset improved the classification power of CART, random forests, and generalized boosted trees.
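A stdlib-only sketch of the over- and undersampling step described above (the function name, the 1:1 default ratio, and the toy data are illustrative assumptions, not the paper's code):

```python
import random

def resample_cases(cases, controls, ratio=1, mode="under", seed=0):
    """Balance a class-imbalanced dataset to a given case-to-control ratio.

    mode="under": randomly drop controls; mode="over": redraw cases with replacement.
    """
    rng = random.Random(seed)
    if mode == "under":
        controls = rng.sample(controls, min(len(controls), ratio * len(cases)))
    else:  # oversample the minority class with replacement
        cases = [rng.choice(cases) for _ in range(len(controls) // ratio)]
    return cases, controls

cases = list(range(100))           # minority class, e.g. undiagnosed diabetes
controls = list(range(100, 1000))  # majority class
under_c, under_k = resample_cases(cases, controls, ratio=1, mode="under")
over_c, over_k = resample_cases(cases, controls, ratio=1, mode="over")
```

A classifier is then fitted on the balanced training data, while the untouched testing data keep the original imbalance so that the AUC reflects real-world performance.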
DEFF Research Database (Denmark)
Huang, Shaojun; Mathe, Laszlo; Teodorescu, Remus
2013-01-01
Two existing methods to implement resampling modulation technique for modular multilevel converter (MMC) (the sampling frequency is a multiple of the carrier frequency) are: the software solution (using a microcontroller) and the hardware solution (using FPGA). The former has a certain level...
Assessment of Resampling Methods for Causality Testing: A note on the US Inflation Behavior
Papana, A.; Kyrtsou, C.; Kugiumtzis, D.; Diks, C.
2017-01-01
Different resampling methods for the null hypothesis of no Granger causality are assessed in the setting of multivariate time series, taking into account that the driving-response coupling is conditioned on the other observed variables. As appropriate test statistic for this setting, the partial
Testing for Granger Causality in the Frequency Domain: A Phase Resampling Method.
Liu, Siwei; Molenaar, Peter
2016-01-01
This article introduces phase resampling, an existing but rarely used surrogate data method for making statistical inferences of Granger causality in frequency domain time series analysis. Granger causality testing is essential for establishing causal relations among variables in multivariate dynamic processes. However, testing for Granger causality in the frequency domain is challenging due to the nonlinear relation between frequency domain measures (e.g., partial directed coherence, generalized partial directed coherence) and time domain data. Through a simulation study, we demonstrate that phase resampling is a general and robust method for making statistical inferences even with short time series. With Gaussian data, phase resampling yields satisfactory type I and type II error rates in all but one condition we examine: when a small effect size is combined with an insufficient number of data points. Violations of normality lead to slightly higher error rates but are mostly within acceptable ranges. We illustrate the utility of phase resampling with two empirical examples involving multivariate electroencephalography (EEG) and skin conductance data.
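A hedged sketch of the core phase-resampling operation, surrogate generation for a single simulated series (multivariate details, such as how cross-spectra are handled for causality testing, are omitted):

```python
import numpy as np

def phase_surrogate(x, rng):
    """Phase-randomized surrogate: keep the amplitude spectrum, scramble phases."""
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spec.size)
    new_spec = np.abs(spec) * np.exp(1j * phases)
    new_spec[0] = spec[0]          # keep the mean (zero-frequency) term
    if n % 2 == 0:
        new_spec[-1] = spec[-1]    # keep the real Nyquist term for even n
    return np.fft.irfft(new_spec, n=n)

rng = np.random.default_rng(2)
x = rng.normal(size=256)
s = phase_surrogate(x, rng)
```

Repeating this under the null of no coupling yields a surrogate distribution against which an observed frequency-domain causality statistic can be compared.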
Yang, Yang; DeGruttola, Victor
2012-06-22
Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients.
Computerized statistical analysis with bootstrap method in nuclear medicine
International Nuclear Information System (INIS)
Zoccarato, O.; Sardina, M.; Zatta, G.; De Agostini, A.; Barbesti, S.; Mana, O.; Tarolo, G.L.
1988-01-01
Statistical analysis of data samples involves some hypotheses about the features of the data themselves. The accuracy of these hypotheses can influence the results of statistical inference. Among the new methods of computer-aided statistical analysis, the bootstrap method appears to be one of the most powerful, thanks to its ability to reproduce many artificial samples starting from a single original sample and because it works without hypotheses about the data distribution. The authors applied the bootstrap method to two typical situations of a Nuclear Medicine Department. The determination of the normal range of serum ferritin, as assessed by radioimmunoassay and defined by the mean value ±2 standard deviations, starting from an experimental sample of small dimension, shows an unacceptable lower limit (ferritin plasmatic levels below zero). On the contrary, the results obtained by elaborating 5000 bootstrap samples give an interval of values (10.95 ng/ml - 72.87 ng/ml) corresponding to the normal ranges commonly reported. Moreover, the authors applied the bootstrap method in evaluating the possible error associated with the correlation coefficient determined between left ventricular ejection fraction (LVEF) values obtained by first-pass radionuclide angiocardiography with 99mTc and 195mAu. The results obtained indicate a high degree of statistical correlation and give the range of r² values to be considered acceptable for this type of study.
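An illustrative recreation of the ferritin normal-range idea with simulated right-skewed data (the lognormal parameters and sample size are assumptions, not the original assay data):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical small, right-skewed serum ferritin sample (ng/ml)
ferritin = rng.lognormal(mean=3.4, sigma=0.6, size=20)

# Parametric normal range mean +/- 2 SD can yield an impossible negative lower limit
naive_low = ferritin.mean() - 2.0 * ferritin.std(ddof=1)

# Bootstrap alternative: pool many resamples, take percentile limits
pooled = np.concatenate([rng.choice(ferritin, ferritin.size, replace=True)
                         for _ in range(5000)])
boot_low, boot_high = np.percentile(pooled, [2.5, 97.5])
```

The bootstrap limits can never leave the support of the observed data, so the lower limit stays positive even when the parametric ±2 SD limit does not.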
Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters
Kim, T.; Kim, Y. S.
2017-12-01
The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data assumes that observation data have statistical stationarity, and a parametric method considering the parameters of a probability distribution is applied. A parametric method requires sufficient collection of reliable data; in Korea, however, snowfall observations must be supplemented to compensate for insufficient data, because the number of days with snowfall observations and the mean maximum daily snowfall depth are decreasing due to climate change. In this study, we conducted frequency analysis for snowfall using the Bootstrap method and the SIR algorithm, resampling methods that can overcome the problems of insufficient data. For the 58 meteorological stations distributed evenly across Korea, the probabilistic snowfall depth was estimated by non-parametric frequency analysis using the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth obtained by frequency analysis decreased at most stations, and the rates of change at most stations were found to be consistent between the parametric and non-parametric frequency analyses. This study shows that the resampling methods can perform frequency analysis of snowfall depth from insufficient observed samples, and can be applied to the interpretation of other natural disasters with seasonal characteristics, such as summer typhoons. Acknowledgment. This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
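A minimal non-parametric sketch of bootstrapping a return level from a short record (the gamma-distributed annual-maximum series and the 20-year return period are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
annual_max = rng.gamma(shape=3.0, scale=8.0, size=30)  # hypothetical annual max snowfall (cm)

T = 20               # return period in years
q = 1.0 - 1.0 / T    # non-exceedance probability

# Non-parametric bootstrap distribution of the T-year quantile
levels = np.array([
    np.quantile(rng.choice(annual_max, annual_max.size, replace=True), q)
    for _ in range(3000)
])
estimate = levels.mean()
ci = np.percentile(levels, [2.5, 97.5])
```

The spread of the bootstrap distribution makes the cost of a short record visible: with only 30 years of data, the interval around the 20-year return level is wide.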
Mirror bootstrap method for testing hypotheses of one mean
Varvak, Anna
2012-01-01
The general philosophy for bootstrap or permutation methods for testing hypotheses is to simulate the variation of the test statistic by generating the sampling distribution which assumes both that the null hypothesis is true, and that the data in the sample is somehow representative of the population. This philosophy is inapplicable for testing hypotheses for a single parameter like the population mean, since the two assumptions are contradictory (e.g., how can we assume both that the mean o...
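For contrast, here is the conventional shift-based bootstrap test of a single mean that this philosophy describes: the sample is shifted so the null is true, then resampled (the data and replicate count are illustrative; this is the standard approach the abstract critiques, not the paper's proposed mirror method):

```python
import numpy as np

def boot_test_mean(x, mu0, B=5000, seed=0):
    """Conventional bootstrap test of H0: mean == mu0 by shifting data to the null."""
    rng = np.random.default_rng(seed)
    shifted = x - x.mean() + mu0          # impose the null on the resampling population
    t_obs = abs(x.mean() - mu0)
    t_boot = np.array([
        abs(rng.choice(shifted, x.size, replace=True).mean() - mu0)
        for _ in range(B)
    ])
    return (t_boot >= t_obs).mean()       # two-sided bootstrap p-value

rng = np.random.default_rng(5)
p_null = boot_test_mean(rng.normal(0.0, 1.0, size=40), mu0=0.0)
p_alt = boot_test_mean(rng.normal(1.0, 1.0, size=40), mu0=0.0)
```

The tension the abstract points at is visible in the first line of the function: the same data are asked both to represent the population and, after shifting, to satisfy the null.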
Soybean yield modeling using bootstrap methods for small samples
Energy Technology Data Exchange (ETDEWEB)
Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.
2016-11-01
One of the problems that occurs when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not need to guess or know the probability distribution that generated the original sample. In this work we used a small set of soybean yield data and physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points, and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, construct the confidence intervals of the parameters, and identify the points that had great influence on the estimated parameters. (Author)
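A case (pairs) bootstrap sketch for a regression slope with a deliberately small sample (the simulated soil covariate and yield, and the replicate count, are assumptions, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 25                                       # deliberately small sample
x = rng.uniform(0.0, 10.0, n)                # stand-in for a soil property
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, n)  # stand-in for soybean yield

def ols_slope(x, y):
    # Ordinary least squares slope from a degree-1 polynomial fit
    return np.polyfit(x, y, 1)[0]

# Case bootstrap: resample (x, y) pairs, refit, collect slope estimates
idx = np.arange(n)
slopes = np.array([
    ols_slope(x[b], y[b])
    for b in (rng.choice(idx, n, replace=True) for _ in range(2000))
])
lo, hi = np.percentile(slopes, [2.5, 97.5])  # 95% percentile interval for the slope
```

A covariate whose interval excludes zero survives variable selection; observations whose removal shifts the bootstrap distribution markedly flag influential points.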
Bootstrap Determination of the Co-integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A.M.Robert
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio [PLR] co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates...... of the underlying VAR model which obtain under the reduced rank null hypothesis. They propose methods based on an i.i.d. bootstrap re-sampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate...... the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap re-sampling scheme, when time-varying behaviour is present in either the conditional or unconditional variance of the innovations. We show that the bootstrap PLR tests are asymptotically correctly sized and...
Resampling nucleotide sequences with closest-neighbor trimming and its comparison to other methods.
Directory of Open Access Journals (Sweden)
Kouki Yonezawa
A large number of nucleotide sequences of various pathogens are available in public databases. The growth of the datasets has resulted in an enormous increase in computational costs. Moreover, due to differences in surveillance activities, the number of sequences found in databases varies from one country to another and from year to year. Therefore, it is important to study resampling methods to reduce the sampling bias. A novel algorithm, called the closest-neighbor trimming method, that resamples a given number of sequences from a large nucleotide sequence dataset was proposed. The performance of the proposed algorithm was compared with other algorithms by using the nucleotide sequences of human H3N2 influenza viruses. We compared the closest-neighbor trimming method with the naive hierarchical clustering algorithm and [Formula: see text]-medoids clustering algorithm. Genetic information accumulated in public databases contains sampling bias. The closest-neighbor trimming method can thin out densely sampled sequences from a given dataset. Since nucleotide sequences are among the most widely used materials for life sciences, we anticipate that applying our algorithm to various datasets will help reduce sampling bias.
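A naive quadratic-time sketch of the trimming idea: repeatedly delete one member of the current closest pair until the requested number of sequences remains (the Hamming distance and toy sequences are illustrative; the published algorithm is engineered for large datasets):

```python
from itertools import combinations

def hamming(a, b):
    # Number of mismatched positions between two equal-length sequences
    return sum(x != y for x, y in zip(a, b))

def closest_neighbor_trim(seqs, k):
    """Thin a dataset to k sequences by repeatedly dropping one member of the
    closest pair, so near-duplicate (densely sampled) sequences go first."""
    seqs = list(seqs)
    while len(seqs) > k:
        i, j = min(combinations(range(len(seqs)), 2),
                   key=lambda p: hamming(seqs[p[0]], seqs[p[1]]))
        del seqs[j]  # drop one of the two closest sequences
    return seqs

data = ["ACGT", "ACGA", "ACGG", "TTTT", "TTTA", "GGCC"]
kept = closest_neighbor_trim(data, 3)  # near-duplicates of ACGT and TTTT are thinned
```

The distinct sequence GGCC survives while the clusters of near-identical sequences are reduced, which is exactly the bias-reduction behavior the abstract describes.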
Bootstrap embedding: An internally consistent fragment-based method
Energy Technology Data Exchange (ETDEWEB)
Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy [Department of Chemistry, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States)
2016-08-21
Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments “embedded” in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed “Bootstrap Embedding,” a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.
Comparing groups randomization and bootstrap methods using R
Zieffler, Andrew S; Long, Jeffrey D
2011-01-01
A hands-on guide to using R to carry out key statistical practices in educational and behavioral sciences research. Computing has become an essential part of the day-to-day practice of statistical work, broadening the types of questions that can now be addressed by research scientists applying newly derived data analytic techniques. Comparing Groups: Randomization and Bootstrap Methods Using R emphasizes the direct link between scientific research questions and data analysis. Rather than relying on mathematical calculations, this book focuses on conceptual explanations and
Methods of soil resampling to monitor changes in the chemical concentrations of forest soils
Lawrence, Gregory B.; Fernandez, Ivan J.; Hazlett, Paul W.; Bailey, Scott W.; Ross, Donald S.; Villars, Thomas R.; Quintana, Angelica; Ouimet, Rock; McHale, Michael; Johnson, Chris E.; Briggs, Russell D.; Colter, Robert A.; Siemion, Jason; Bartlett, Olivia L.; Vargas, Olga; Antidormi, Michael; Koppers, Mary Margaret
2016-01-01
Recent soils research has shown that important chemical soil characteristics can change in less than a decade, often as the result of broad environmental changes. Repeated sampling to monitor these changes in forest soils is a relatively new practice that is not well documented in the literature and has only recently been broadly embraced by the scientific community. The objective of this protocol is therefore to synthesize the latest information on methods of soil resampling in a format that can be used to design and implement a soil monitoring program. Successful monitoring of forest soils requires that a study unit be defined within an area of forested land that can be characterized with replicate sampling locations. A resampling interval of 5 years is recommended, but if monitoring is done to evaluate a specific environmental driver, the rate of change expected in that driver should be taken into consideration. Here, we show that sampling of the profile can be done by horizon where boundaries can be clearly identified and horizons are sufficiently thick to remove soil without contamination from horizons above or below. Otherwise, sampling can be done by depth interval. Archiving of samples for future reanalysis is a key step in avoiding analytical bias and providing the opportunity for additional analyses as new questions arise.
Methods of Soil Resampling to Monitor Changes in the Chemical Concentrations of Forest Soils.
Lawrence, Gregory B; Fernandez, Ivan J; Hazlett, Paul W; Bailey, Scott W; Ross, Donald S; Villars, Thomas R; Quintana, Angelica; Ouimet, Rock; McHale, Michael R; Johnson, Chris E; Briggs, Russell D; Colter, Robert A; Siemion, Jason; Bartlett, Olivia L; Vargas, Olga; Antidormi, Michael R; Koppers, Mary M
2016-11-25
Recent soils research has shown that important chemical soil characteristics can change in less than a decade, often as the result of broad environmental changes. Repeated sampling to monitor these changes in forest soils is a relatively new practice that is not well documented in the literature and has only recently been broadly embraced by the scientific community. The objective of this protocol is therefore to synthesize the latest information on methods of soil resampling in a format that can be used to design and implement a soil monitoring program. Successful monitoring of forest soils requires that a study unit be defined within an area of forested land that can be characterized with replicate sampling locations. A resampling interval of 5 years is recommended, but if monitoring is done to evaluate a specific environmental driver, the rate of change expected in that driver should be taken into consideration. Here, we show that sampling of the profile can be done by horizon where boundaries can be clearly identified and horizons are sufficiently thick to remove soil without contamination from horizons above or below. Otherwise, sampling can be done by depth interval. Archiving of samples for future reanalysis is a key step in avoiding analytical bias and providing the opportunity for additional analyses as new questions arise.
A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series
Directory of Open Access Journals (Sweden)
Fernando Luiz Cyrino Oliveira
2014-01-01
The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with huge participation of hydro plants. Such strong dependency on hydrological regimes implies uncertainties related to energy planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. This paper shows the problems in fitting these models by the current system, particularly the identification of the autoregressive order “p” and the corresponding parameter estimation. This is followed by a proposal of a new approach to set both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the Bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, led to a better use of water resources in energy operation planning.
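A hedged sketch of using resampling to judge whether an autoregressive lag deserves to stay in the model: a moving-block bootstrap confidence interval for the lag-1 coefficient of a simulated AR(1) series (the series, block length, and decision rule are illustrative assumptions, not the PBMOM procedure itself):

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical monthly inflow anomalies following an AR(1) with phi = 0.6
n, phi = 240, 0.6
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def lag1(series):
    # Sample lag-1 autocorrelation
    return np.corrcoef(series[:-1], series[1:])[0, 1]

# Moving-block bootstrap (block length 12) of the lag-1 coefficient
L = 12
starts = np.arange(n - L + 1)
boots = np.array([
    lag1(np.concatenate([x[s:s + L]
                         for s in rng.choice(starts, size=n // L, replace=True)]))
    for _ in range(1000)
])
lo, hi = np.percentile(boots, [2.5, 97.5])
keep_lag = not (lo <= 0.0 <= hi)  # retain the lag if the interval excludes zero
```

Applying such an interval lag by lag gives a data-driven order "p", rather than relying on asymptotic significance bands that are unreliable for short monthly records.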
Introductory statistics and analytics a resampling perspective
Bruce, Peter C
2014-01-01
A concise, thoroughly class-tested primer that features basic statistical concepts in the context of analytics, resampling, and the bootstrap. A uniquely developed presentation of key statistical topics, Introductory Statistics and Analytics: A Resampling Perspective provides an accessible approach to statistical analytics, resampling, and the bootstrap for readers with various levels of exposure to basic probability and statistics. Originally class-tested at one of the first online learning companies in the discipline, www.statistics.com, the book primarily focuses on application
Optical Flow of Small Objects Using Wavelets, Bootstrap Methods, and Synthetic Discriminant Filters
National Research Council Canada - National Science Library
Hewer, Gary
1997-01-01
...) targets in highly cluttered and noisy environments. In this paper, we present a novel wavelet detection algorithm which incorporates adaptive CFAR detection statistics using the bootstrap method...
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, G.; Rahbek, Anders; Taylor, A.M.R.
2014-01-01
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates...... of the underlying vector autoregressive (VAR) model which obtain under the reduced rank null hypothesis. They propose methods based on an independent and identically distributed (i.i.d.) bootstrap resampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co......-integrated VAR model with i.i.d. innovations. In this paper we investigate the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap resampling scheme, when time-varying behavior is present in either the conditional or unconditional variance of the innovations. We...
Moreno, Claudia E.; Guevara, Roger; Sánchez-Rojas, Gerardo; Téllez, Dianeis; Verdú, José R.
2008-01-01
Environmental assessment at the community level in highly diverse ecosystems is limited by taxonomic constraints and statistical methods requiring true replicates. Our objective was to show how diverse systems can be studied at the community level using higher taxa as biodiversity surrogates, and resampling methods to allow comparisons. To illustrate this we compared the abundance, richness, evenness and diversity of the litter fauna in a pine-oak forest in central Mexico among seasons, sites and collecting methods. We also assessed changes in the abundance of trophic guilds and evaluated the relationships between community parameters and litter attributes. With the direct search method we observed differences in the rate of taxa accumulation between sites. Bootstrap analysis showed that abundance varied significantly between seasons and sampling methods, but not between sites. In contrast, diversity and evenness were significantly higher at the managed than at the non-managed site. Tree regression models show that abundance varied mainly between seasons, whereas taxa richness was affected by litter attributes (composition and moisture content). The abundance of trophic guilds varied among methods and seasons, but overall we found that parasitoids, predators and detrivores decreased under management. Therefore, although our results suggest that management has positive effects on the richness and diversity of litter fauna, the analysis of trophic guilds revealed a contrasting story. Our results indicate that functional groups and resampling methods may be used as tools for describing community patterns in highly diverse systems. Also, higher taxa surrogacy could be seen as a preliminary approach when it is not possible to identify the specimens at a low taxonomic level in a reasonable period of time and in a context of limited financial resources, but further studies are needed to test whether the results are specific to a system or whether they are general.
A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
Liang, Faming
2013-03-01
The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
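As a rough sketch of the core idea in the record above (a small random subsample drives each stochastic-approximation update, so the full dataset is never processed at once), the following toy illustration estimates the mean and variance of a Gaussian model. The function name and simulated data are invented for illustration and are not taken from the paper.

```python
import numpy as np

def subsample_sa_estimate(data, subsample_size=50, n_iter=4000, seed=0):
    """Stochastic-approximation estimate of a Gaussian mean/variance:
    each iteration touches only a small random subsample, mimicking the
    paper's strategy of avoiding any full-data computation."""
    rng = np.random.default_rng(seed)
    mu, m2 = 0.0, 0.0
    for t in range(1, n_iter + 1):
        sub = rng.choice(data, size=subsample_size, replace=False)
        gain = 1.0 / t                          # Robbins-Monro step size
        mu += gain * (sub.mean() - mu)          # track E[X]
        m2 += gain * ((sub ** 2).mean() - m2)   # track E[X^2]
    return mu, m2 - mu ** 2

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=2.0, size=10_000)
mu_hat, var_hat = subsample_sa_estimate(data)
```

With decreasing gains the updates average out the subsample noise, so the estimates settle near the full-data moments without ever visiting the whole dataset in one pass.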
Bootstrap confidence intervals for three-way methods
Kiers, Henk A.L.
Results from exploratory three-way analysis techniques such as CANDECOMP/PARAFAC and Tucker3 analysis are usually presented without giving insight into uncertainties due to sampling. Here a bootstrap procedure is proposed that produces percentile intervals for all output parameters. Special
Efficient p-value evaluation for resampling-based tests
Yu, K.; Liang, F.; Ciampa, J.; Chatterjee, N.
2011-01-01
The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated
Warton, David I; Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)-common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
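A minimal sketch of the PIT-residual construction for discrete data described above, here for Poisson counts with a randomized PIT so the residuals are uniform under the fitted model. This is an illustration of the general idea, not the authors' implementation, and the fitted means and counts are simulated.

```python
import numpy as np
from scipy.stats import poisson

def pit_residuals(y, mu, rng):
    """Randomized PIT residuals for Poisson counts y with fitted means mu:
    u is drawn uniformly on (F(y-1), F(y)), so u ~ Uniform(0,1) when the
    model is correct -- the (asymptotically) pivotal property."""
    lo = poisson.cdf(y - 1, mu)
    hi = poisson.cdf(y, mu)
    return lo + rng.uniform(size=len(y)) * (hi - lo)

rng = np.random.default_rng(0)
mu = rng.uniform(1.0, 10.0, size=5000)   # fitted means (simulated)
y = rng.poisson(mu)                      # observed counts
u = pit_residuals(y, mu, rng)

# Resampled residuals map back through the quantile function,
# giving one PIT-trap-style resample of the data:
boot_u = rng.choice(u, size=len(u), replace=True)
y_star = poisson.ppf(boot_u, mu).astype(int)
```

Because the residuals live on a common Uniform(0,1) scale, they can be resampled even though the raw counts have different distributions at each observation.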
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang; Yu, Zhuqing; Huang, Jianhua Z.
2013-01-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap
Study of the separate exposure method for bootstrap sensitometry on X-ray cine film
International Nuclear Information System (INIS)
Matsuda, Eiji; Sanada, Taizo; Hitomi, Go; Kakuba, Koki; Kangai, Yoshiharu; Ishii, Koushi
1997-01-01
We developed a new method for bootstrap sensitometry that obtained the characteristic curve from a wide range, with a smaller number of aluminum steps than the conventional bootstrap method. In this method, the density-density curve was obtained from standard and multiplied exposures to the aluminum step wedge and used for bootstrap manipulation. The curve was acquired from two regions separated and added together, e.g., lower and higher photographic density regions. In this study, we evaluated the usefulness of a new cinefluorography method in comparison with N.D. filter sensitometry. The shape of the characteristic curve and the gradient curve obtained with the new method were highly similar to those obtained with N.D. filter sensitometry. Also, the average gradient obtained with the new bootstrap sensitometry method was not significantly different from that obtained by the N.D. filter method. The study revealed that the reliability of the characteristic curve was improved by increasing the measured value used to calculate the density-density curve. This new method was useful for obtaining a characteristic curve with a sufficient density range, and the results suggested that this new method could be applied to specific systems to which the conventional bootstrap method is not applicable. (author)
Bootstrap-Based Inference for Cube Root Consistent Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed.
Efficient p-value evaluation for resampling-based tests
Yu, K.
2011-01-05
The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated calculations of the test statistic on a large number of simulated data sets for its significance level assessment, and thus it could become very computationally intensive. Here, we propose an efficient p-value evaluation procedure by adapting the stochastic approximation Markov chain Monte Carlo algorithm. The new procedure can be used easily for estimating the p-value for any resampling-based test. We show through numeric simulations that the proposed procedure can be 100-500,000 times as efficient (in terms of computing time) as the standard resampling-based procedure when evaluating a test statistic with a small p-value (e.g. less than 10^(-6)). With its computational burden reduced by this proposed procedure, the versatile resampling-based test would become computationally feasible for a much wider range of applications. We demonstrate the application of the new method by applying it to a large-scale genetic association study of prostate cancer.
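For contrast with the accelerated procedure in the record above, the standard resampling-based p-value it improves upon can be sketched as a plain permutation test on a difference in means (data simulated for illustration). The sketch makes the computational burden evident: resolving a p-value near 10^(-6) this way would require millions of permutations.

```python
import numpy as np

def perm_pvalue(x, y, n_perm=10_000, seed=0):
    """Standard resampling-based p-value: permute the group labels and
    count how often the permuted statistic is as extreme as observed."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    obs = abs(x.mean() - y.mean())
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        hits += abs(pooled[:len(x)].mean() - pooled[len(x):].mean()) >= obs
    return (hits + 1) / (n_perm + 1)   # add-one rule avoids p = 0

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100)
y = rng.normal(1.0, 1.0, 100)   # true shift of one standard deviation
p = perm_pvalue(x, y)
```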
Energy Technology Data Exchange (ETDEWEB)
Niehof, Jonathan T.; Morley, Steven K.
2012-01-01
We review and develop techniques to determine associations between series of discrete events. The bootstrap, a nonparametric statistical method, allows the determination of the significance of associations with minimal assumptions about the underlying processes. We find the key requirement for this method: one of the series must be widely spaced in time to guarantee the theoretical applicability of the bootstrap. If this condition is met, the calculated significance passes a reasonableness test. We conclude with some potential future extensions and caveats on the applicability of these methods. The techniques presented have been implemented in a Python-based software toolkit.
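A toy sketch of testing the association between two discrete event series, in the spirit of the record above. Here significance is assessed against surrogate series drawn uniformly in time, which is a simplification: the paper's actual procedure bootstraps the observed series, and all names and data below are invented for illustration.

```python
import numpy as np

def association_significance(times_a, times_b, window, t_max,
                             n_surr=2000, seed=0):
    """Count events in series A falling within `window` of any event in
    series B, and compare with counts obtained from surrogate B series
    drawn uniformly on [0, t_max] (a simplified null distribution)."""
    rng = np.random.default_rng(seed)

    def n_assoc(a, b):
        return sum(bool(np.any(np.abs(b - t) <= window)) for t in a)

    obs = n_assoc(times_a, times_b)
    null = np.array([n_assoc(times_a, rng.uniform(0, t_max, len(times_b)))
                     for _ in range(n_surr)])
    p = (np.sum(null >= obs) + 1) / (n_surr + 1)
    return obs, p

rng = np.random.default_rng(4)
series_b = np.sort(rng.uniform(0, 1000, 40))        # widely spaced events
series_a = series_b[:25] + rng.normal(0, 0.5, 25)   # A closely follows B
obs, p = association_significance(series_a, series_b, window=2.0, t_max=1000)
```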
DEFF Research Database (Denmark)
Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio
2014-01-01
measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study, the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.
USEFULNESS OF BOOTSTRAPPING IN PORTFOLIO MANAGEMENT
Directory of Open Access Journals (Sweden)
Boris Radovanov
2012-12-01
This paper compares the in-sample and out-of-sample performance of the resampled efficiency technique, patented by Richard Michaud and Robert Michaud (1999), with traditional mean-variance portfolio selection, introduced by Harry Markowitz (1952). Based on Monte Carlo simulation, the data (sample) generation process drives the algorithms using both parametric and nonparametric bootstrap techniques. Resampled efficiency provides a way to use uncertain information without the need for constraints in portfolio optimization. The parametric bootstrap process starts with a parametric model specification, for which we apply the Capital Asset Pricing Model; after the specified model is estimated, the series of residuals is used in the resampling process. The nonparametric bootstrap, on the other hand, divides the series of price returns into new series of blocks, each containing a predetermined number of consecutive price returns. This procedure enables a smooth resampling process and preserves the original structure of the data series.
Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.
2016-01-01
Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
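A minimal sketch of the bootstrap SNR lower-bound idea described above: resample trials with replacement, average them into an ERP, and take the lower quantile of the resulting SNR distribution. The SNR definition used here (ERP RMS over the noise expected to remain in the average) is one plausible choice, not necessarily the paper's exact estimator, and the data are simulated.

```python
import numpy as np

def snr_lower_bound(trials, n_boot=1000, alpha=0.05, seed=0):
    """Bootstrap lower confidence bound on ERP waveform SNR: resample
    trials with replacement, average them into an ERP, and take the ratio
    of ERP amplitude (RMS) to the noise expected to survive averaging."""
    rng = np.random.default_rng(seed)
    n_trials = trials.shape[0]
    snrs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n_trials, size=n_trials)
        erp = trials[idx].mean(axis=0)
        resid = trials[idx] - erp                     # trial-level residuals
        noise_in_avg = resid.std() / np.sqrt(n_trials)
        snrs[b] = erp.std() / noise_in_avg
    return np.quantile(snrs, alpha)   # the record's SNR_LB

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 200)
signal = np.sin(2 * np.pi * 3 * t)                    # "evoked response"
trials = signal + rng.normal(0, 2.0, size=(80, 200))  # 80 noisy trials
snr_lb = snr_lower_bound(trials)
```

A subject would then be excluded if `snr_lb` fell below a chosen criterion, replacing visual inspection with a quantitative rule.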
Bootstrapping conformal field theories with the extremal functional method.
El-Showk, Sheer; Paulos, Miguel F
2013-12-13
The existence of a positive linear functional acting on the space of (differences between) conformal blocks has been shown to rule out regions in the parameter space of conformal field theories (CFTs). We argue that at the boundary of the allowed region the extremal functional contains, in principle, enough information to determine the dimensions and operator product expansion (OPE) coefficients of an infinite number of operators appearing in the correlator under analysis. Based on this idea we develop the extremal functional method (EFM), a numerical procedure for deriving the spectrum and OPE coefficients of CFTs lying on the boundary (of solution space). We test the EFM by using it to rederive the low lying spectrum and OPE coefficients of the two-dimensional Ising model based solely on the dimension of a single scalar quasiprimary--no Virasoro algebra required. Our work serves as a benchmark for applications to more interesting, less known CFTs in the near future.
Loop equations and bootstrap methods in the lattice
Directory of Open Access Journals (Sweden)
Peter D. Anderson
2017-08-01
Pure gauge theories can be formulated in terms of Wilson loops by means of the loop equation. In the large-N limit this equation closes in the expectation value of single loops. In particular, using the lattice as a regulator, it becomes a well-defined equation for a discrete set of loops. In this paper we study different numerical approaches to solving this equation. Previous ideas gave good results in the strong coupling region. Here we propose an alternative method based on the observation that certain matrices ρˆ of Wilson loop expectation values are positive definite. They also have unit trace (ρˆ⪰0, Trρˆ=1); in fact they can be defined as reduced density matrices in the space of open loops after tracing over color indices, and can be used to define an entropy associated with the loss of information due to such a trace, SWL=−Tr[ρˆlnρˆ]. The condition that such matrices are positive definite allows us to study the weak coupling region, which is relevant for the continuum limit. In the exactly solvable case of two dimensions this approach gives very good results by considering just a few loops. In four dimensions it gives good results in the weak coupling region and is therefore complementary to the strong coupling expansion. We compare the results with standard Monte Carlo simulations.
Improved inference in the evaluation of mutual fund performance using panel bootstrap methods
Blake, David; Caulfield, Tristan; Ioannidis, Christos; Tonks, I P
2014-01-01
Two new methodologies are introduced to improve inference in the evaluation of mutual fund performance against benchmarks. First, the benchmark models are estimated using panel methods with both fund and time effects. Second, the non-normality of individual mutual fund returns is accounted for by using panel bootstrap methods. We also augment the standard benchmark factors with fund-specific characteristics, such as fund size. Using a dataset of UK equity mutual fund returns, we find that fun...
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
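A minimal sketch of the histogram-comparison test described above, using the Euclidean distance (the statistic the study found most suitable) and pooled resampling to build the null distribution. The toy samples below stand in for the cloud-object histograms; everything is illustrative, not the paper's data or exact procedure.

```python
import numpy as np

def histogram_bootstrap_test(a, b, bins, n_boot=2000, seed=0):
    """Bootstrap test for a difference between two histograms: the
    Euclidean distance between normalized histograms is the statistic,
    and resamples from the pooled data provide its null distribution."""
    rng = np.random.default_rng(seed)

    def dist(u, v):
        hu, _ = np.histogram(u, bins=bins, density=True)
        hv, _ = np.histogram(v, bins=bins, density=True)
        return float(np.sqrt(((hu - hv) ** 2).sum()))

    obs = dist(a, b)
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_boot):
        x = rng.choice(pooled, size=len(a), replace=True)
        y = rng.choice(pooled, size=len(b), replace=True)
        hits += dist(x, y) >= obs
    return (hits + 1) / (n_boot + 1)

rng = np.random.default_rng(0)
bins = np.linspace(-4, 8, 25)
same_p = histogram_bootstrap_test(rng.normal(0, 1, 400),
                                  rng.normal(0, 1, 400), bins)
diff_p = histogram_bootstrap_test(rng.normal(0, 1, 400),
                                  rng.normal(2, 1, 400), bins)
```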
Niska, Christoffer
2014-01-01
Practical and instruction-based, this concise book will take you from understanding what Bootstrap is, to creating your own Bootstrap theme in no time! If you are an intermediate front-end developer or designer who wants to learn the secrets of Bootstrap, this book is perfect for you.
Gray bootstrap method for estimating frequency-varying random vibration signals with small samples
Directory of Open Access Journals (Sweden)
Wang Yanqing
2014-04-01
During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and an assumed typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. GBM is then applied to estimating data from a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in the analysis of the test data. The results show that GBM is superior for estimating dynamic signals with small samples, and its estimated reliability is shown to be 100% at the given confidence level.
Application of the bootstrap method to radiolabeled antibody dosimetry from planar images
International Nuclear Information System (INIS)
Papenfuss, T.; Saunder, T.H.; Schleyer, P.J.; O'Keefe, G.J.; Scott, A.M.
2002-01-01
Planar imaging dosimetry of radiolabeled antibody treatment uses the MIRD schema to compute the dose to an organ from the calculated activity in that and other organs. The calculated activity in an organ is a function of the average count rates in the organ, a standard and an appropriate background measurement. The geometric mean of conjugate averages, together with an attenuation factor, is used to provide an approximate attenuation correction. It is sometimes desirable to know the variance of the activity in an organ in order to apply weighted least squares regression to the data. This is particularly important when incorporating more accurate data from autoradiography of biopsied tissue into the analysis, but is also useful when some data points have low signal. While the geometric mean image can be used to calculate the variance of an organ's count rate, it is difficult to calculate the variance of the activity, since the organ in question, the standard and the background all contribute to it. Bootstrap methods are Monte Carlo techniques that can be used to estimate parameters from data and to determine the accuracy of the estimation. By bootstrapping pixel values in the organ, background and standard ROIs, it is possible to calculate many realisations of the organ activity and calculate its variance. As an example, bootstrapping is applied to the pharmacodynamic analysis of 131I-huA33 in colon. The data include planar whole body images, and autoradiographs and planar images of a resected colon. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc
Use of Monte Carlo Bootstrap Method in the Analysis of Sample Sufficiency for Radioecological Data
International Nuclear Information System (INIS)
Silva, A. N. C. da; Amaral, R. S.; Araujo Santos Jr, J.; Wilson Vieira, J.; Lima, F. R. de A.
2015-01-01
There are operational difficulties in obtaining samples for radioecological studies. Population data may no longer be available during the study, and obtaining new samples may not be possible. These problems sometimes force the researcher to work with a small number of data points. It is therefore difficult to know whether the number of samples will be sufficient to estimate the desired parameter, which makes the analysis of sample sufficiency critical. Classical statistical methods are not well suited to analyzing sample sufficiency in radioecology, because naturally occurring radionuclides are randomly distributed in soil, and outliers and gaps with missing values commonly arise. The present work applies the Monte Carlo bootstrap method to the analysis of sample sufficiency, with quantitative estimation of a single variable: the specific activity of a natural radioisotope present in plants. The pseudo-population was a small sample of 14 values of the specific activity of 226Ra in forage palm (Opuntia spp.). A computational procedure to calculate the number of sample values was implemented in the R software. The resampling process with replacement took the 14 values of the original sample and produced 10,000 bootstrap samples in each round. The estimated average θ was then calculated for samples with 2, 5, 8, 11 and 14 randomly selected values. The results showed that if the researcher works with only 11 sample values, the average parameter will be within the confidence interval with 90% probability. (Author)
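The resampling scheme in the record above can be sketched as follows. The 14 activity values here are invented placeholders (the paper's actual 226Ra data are not reproduced in the record), and the criterion reported is the width of the bootstrap confidence interval at each candidate sample size.

```python
import numpy as np

def bootstrap_sufficiency(sample, subsizes, n_boot=5000, conf=0.90, seed=0):
    """For each candidate sample size m, bootstrap the mean from m values
    (drawn with replacement) and report the width of the `conf` confidence
    interval -- a gauge of whether m values would suffice."""
    rng = np.random.default_rng(seed)
    q = [(1 - conf) / 2, 1 - (1 - conf) / 2]
    widths = {}
    for m in subsizes:
        means = np.array([rng.choice(sample, size=m, replace=True).mean()
                          for _ in range(n_boot)])
        lo, hi = np.quantile(means, q)
        widths[m] = hi - lo
    return widths

# Hypothetical specific-activity values (Bq/kg) standing in for the
# paper's 14-value pseudo-population:
activity = np.array([12.1, 9.8, 14.3, 11.0, 10.5, 13.7, 9.2,
                     15.1, 12.8, 10.9, 11.6, 13.0, 9.9, 12.4])
widths = bootstrap_sufficiency(activity, subsizes=[2, 5, 8, 11, 14])
```

The interval narrows as the subsample size grows; the researcher picks the smallest size whose interval is acceptably tight at the chosen confidence level.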
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and it requires specialised statistical methods to properly account for its particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations originating from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
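The key device in the record above (re-imputing nondetects inside every bootstrap resample so that imputation uncertainty propagates) can be sketched in a deliberately simplified form: a univariate mean with uniform imputation below the detection limit, setting aside the compositional log-ratio machinery and the model-based imputation the paper actually recommends. All data are simulated.

```python
import numpy as np

def bootstrap_with_nondetects(x, detection_limit, n_boot=2000, seed=0):
    """Bootstrap the mean of a concentration variable whose nondetects
    (coded as NaN) are re-imputed inside every resample, so imputation
    uncertainty propagates into the interval. Imputation here is a
    simple uniform draw below the detection limit."""
    rng = np.random.default_rng(seed)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        res = rng.choice(x, size=len(x), replace=True)
        nd = np.isnan(res)
        res[nd] = rng.uniform(0.0, detection_limit, size=nd.sum())
        stats[b] = res.mean()
    return stats.mean(), np.quantile(stats, [0.025, 0.975])

rng = np.random.default_rng(3)
conc = rng.lognormal(mean=0.0, sigma=0.7, size=120)  # concentrations
dl = 0.5
conc[conc < dl] = np.nan     # censor values below the detection limit
est, (lo, hi) = bootstrap_with_nondetects(conc, dl)
```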
Bootstrapping phylogenies inferred from rearrangement data
Directory of Open Access Journals (Sweden)
Lin Yu
2012-08-01
Background: Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. Results: We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions: Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its
Bootstrapping phylogenies inferred from rearrangement data.
Lin, Yu; Rajan, Vaibhav; Moret, Bernard Me
2012-08-29
Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its support values follow a similar scale and its receiver
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
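A minimal sketch of the cluster bootstrap described above: whole clusters, not individual observations, are resampled with replacement, so within-cluster dependence survives into every resample. The percentile interval for a simple overall mean stands in for GEE-based inference, and the clustered data are simulated.

```python
import numpy as np

def cluster_bootstrap_ci(values, cluster_ids, n_boot=2000, alpha=0.05, seed=0):
    """Percentile CI for the overall mean that resamples whole clusters
    with replacement, preserving within-cluster dependence, instead of
    resampling individual observations."""
    rng = np.random.default_rng(seed)
    clusters = [values[cluster_ids == c] for c in np.unique(cluster_ids)]
    k = len(clusters)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        pick = rng.integers(0, k, size=k)   # resample cluster labels
        stats[b] = np.concatenate([clusters[i] for i in pick]).mean()
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(1)
# 40 clusters of 5 observations sharing a cluster-level random effect
cluster_ids = np.arange(40).repeat(5)
effects = rng.normal(0.0, 1.0, size=40).repeat(5)
values = 10.0 + effects + rng.normal(0.0, 0.5, size=200)
lo, hi = cluster_bootstrap_ci(values, cluster_ids)
```

Resampling individual observations here would ignore the shared cluster effects and produce intervals that are too narrow; resampling clusters keeps that dependence intact.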
International Nuclear Information System (INIS)
Castedo Echeverri, Alejandro; Harling, Benedict von; Serone, Marco
2016-06-01
We study the numerical bounds obtained using a conformal-bootstrap method where different points in the plane of conformal cross ratios z and anti z are sampled. In contrast to the most commonly used method based on derivatives evaluated at the symmetric point z = anti z = 1/2, we can consistently "integrate out" higher-dimensional operators and get a reduced, simpler, and faster-to-solve set of bootstrap equations. We test this "effective" bootstrap by studying the 3D Ising and O(n) vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the O(n) vector models, with n=2,3,4, which have not yet been computed using bootstrap techniques.
Energy Technology Data Exchange (ETDEWEB)
Castedo Echeverri, Alejandro [SISSA, Trieste (Italy); INFN, Trieste (Italy); Harling, Benedict von [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Serone, Marco [SISSA, Trieste (Italy); INFN, Trieste (Italy); ICTP, Trieste (Italy)
2016-06-15
We study the numerical bounds obtained using a conformal-bootstrap method where different points in the plane of conformal cross ratios z and anti z are sampled. In contrast to the most commonly used method based on derivatives evaluated at the symmetric point z = anti z = 1/2, we can consistently "integrate out" higher-dimensional operators and get a reduced, simpler, and faster-to-solve set of bootstrap equations. We test this "effective" bootstrap by studying the 3D Ising and O(n) vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the O(n) vector models, with n=2,3,4, which have not yet been computed using bootstrap techniques.
Fourier transform resampling: Theory and application
International Nuclear Information System (INIS)
Hawkins, W.G.
1996-01-01
One of the most challenging problems in medical imaging is the development of reconstruction algorithms for nonstandard geometries. This work focuses on the application of Fourier analysis to the problem of resampling or rebinning. Conventional resampling methods utilizing some form of interpolation almost always result in a loss of resolution in the tomographic image. Fourier Transform Resampling (FTRS) offers potential improvement because the Modulation Transfer Function (MTF) of the process behaves like an ideal low pass filter. The MTF, however, is nonstationary if the coordinate transformation is nonlinear. FTRS may be viewed as a generalization of the linear coordinate transformations of standard Fourier analysis. Simulated MTFs were obtained by projecting point sources at different transverse positions in the flat fan beam detector geometry. These MTFs were compared to the closed form expression for FTRS. Excellent agreement was obtained for frequencies at or below the estimated cutoff frequency. The resulting FTRS algorithm is applied to simulations with symmetric fan beam geometry, an elliptical orbit and uniform attenuation, with a normalized root mean square error (NRMSE) of 0.036. Also, a Tc-99m point source study (1 cm dia., placed in air 10 cm from the COR) for a circular fan beam acquisition was reconstructed with a hybrid resampling method. The FWHM of the hybrid resampling method was 11.28 mm and compares favorably with a direct reconstruction (FWHM: 11.03 mm)
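The core idea behind spectrum-domain resampling (it acts as an ideal low-pass filter, unlike spatial interpolation) can be illustrated in the simplest 1-D periodic case. This sketch ignores the Nyquist-bin subtlety when downsampling and the nonstationary-MTF issues for nonlinear coordinate transformations that the record discusses.

```python
import numpy as np

def fourier_resample(x, n_out):
    """Resample a uniformly sampled periodic signal by zero-padding (or
    truncating) its Fourier spectrum -- an ideal low-pass operation,
    whereas spatial interpolation blurs the result."""
    X = np.fft.rfft(x)
    n_in = len(x)
    Y = np.zeros(n_out // 2 + 1, dtype=complex)
    m = min(len(X), len(Y))
    Y[:m] = X[:m]                       # copy the retained spectrum
    return np.fft.irfft(Y, n=n_out) * (n_out / n_in)  # keep amplitudes

t = np.linspace(0.0, 1.0, 64, endpoint=False)
x = np.sin(2 * np.pi * 4 * t)           # band-limited test signal
y = fourier_resample(x, 128)            # upsample 64 -> 128 points
t2 = np.linspace(0.0, 1.0, 128, endpoint=False)
err = np.max(np.abs(y - np.sin(2 * np.pi * 4 * t2)))
```

For a band-limited periodic signal like this one, the spectral method reproduces the finer sampling essentially exactly, which is the resolution advantage the abstract attributes to FTRS.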
Kantar, E.; Deviren, B.; Keskin, M.
2011-11-01
We present a study, within the scope of econophysics, of the hierarchical structure of 98 of the largest international companies, including 18 of the largest Turkish companies, namely the Banks, Automobile, Software-hardware, Telecommunication Services, Energy and Oil-Gas sectors, viewed as a network of interacting companies. We analyze the daily time series data of the Boerse-Frankfurt and Istanbul Stock Exchange. We examine the topological properties among the companies over the period 2006-2010 by using hierarchical structure methods (the minimal spanning tree (MST) and the hierarchical tree (HT)). The period is divided into three subperiods, namely 2006-2007, 2008 (the year of the global economic crisis), and 2009-2010, in order to test various time-windows and observe temporal evolution. We carry out bootstrap analyses to associate a value of statistical reliability to the links of the MSTs and HTs. We also use average linkage clustering analysis (ALCA) in order to better observe the cluster structure. From these studies, we find that the interactions among the Banks/Energy sectors and the other sectors were reduced after the global economic crisis; hence the effects of the Banks and Energy sectors on the correlations of all companies were decreased. Telecommunication Services were also greatly affected by the crisis. We also observed that the Automobile and Banks sectors, including Turkish companies as well as some companies from the USA, Japan and Germany, were strongly correlated with each other in all periods.
Bhaumik, Snig
2015-01-01
If you are a web developer who designs and develops websites and pages using HTML, CSS, and JavaScript, but have very little familiarity with Bootstrap, this is the book for you. Previous experience with HTML, CSS, and JavaScript will be helpful, while knowledge of jQuery would be an extra advantage.
DEFF Research Database (Denmark)
Davis, Jerrold I.; Stevenson, Dennis W.; Petersen, Gitte
2004-01-01
elements of Xyridaceae. A comparison was conducted of jackknife and bootstrap values, as computed using strict-consensus (SC) and frequency-within-replicates (FWR) approaches. Jackknife values tend to be higher than bootstrap values, and for each of these methods support values obtained with the FWR...
Directory of Open Access Journals (Sweden)
Campbell Michael J
2004-12-01
Full Text Available Abstract Health-Related Quality of Life (HRQoL) measures are becoming increasingly used in clinical trials as primary outcome measures. Investigators are now asking statisticians for advice on how to analyse studies that have used HRQoL outcomes. HRQoL outcomes, like the SF-36, are usually measured on an ordinal scale. However, most investigators assume that there exists an underlying continuous latent variable that measures HRQoL, and that the actual measured outcomes (the ordered categories) reflect contiguous intervals along this continuum. The ordinal scaling of HRQoL measures means they tend to generate data that have discrete, bounded and skewed distributions. Thus, standard methods of analysis such as the t-test and linear regression that assume Normality and constant variance may not be appropriate. For this reason, conventional statistical advice would suggest that non-parametric methods be used to analyse HRQoL data. The bootstrap is one such computer-intensive non-parametric method for analysing data. We used the bootstrap for hypothesis testing and the estimation of standard errors and confidence intervals for parameters, in four datasets (which illustrate the different aspects of study design). We then compared and contrasted the bootstrap with standard methods of analysing HRQoL outcomes. The standard methods included t-tests, linear regression, summary measures and General Linear Models. Overall, in the datasets we studied, using the SF-36 outcome, bootstrap methods produce results similar to conventional statistical methods. This is likely because the t-test and linear regression are robust to the violations of assumptions that HRQoL data are likely to cause (i.e. non-Normality). While particular to our datasets, these findings are likely to generalise to other HRQoL outcomes, which have discrete, bounded and skewed distributions. Future research with other HRQoL outcome measures, interventions and populations, is required to
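The percentile-bootstrap confidence interval the abstract applies to HRQoL outcomes can be sketched in a few lines. The data below are synthetic skewed, bounded scores standing in for an outcome like the SF-36, not real trial data:

```python
import random
import statistics

rng = random.Random(42)

# Synthetic skewed, bounded (0-100) scores standing in for an HRQoL outcome.
scores = [min(100, max(0, int(rng.expovariate(1 / 20)))) for _ in range(60)]

def bootstrap_ci(data, stat=statistics.mean, n_boot=5000, alpha=0.05, rng=rng):
    """Percentile-bootstrap confidence interval for a statistic.

    Resample the data with replacement, recompute the statistic each time,
    and read the interval off the sorted replicates.
    """
    reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

lo, hi = bootstrap_ci(scores)
```

No Normality or constant-variance assumption is needed; the sampling distribution of the mean is approximated directly from the resamples, which is why the method suits discrete, bounded, skewed outcomes.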
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
Buonaccorsi, John P; Romeo, Giovanni; Thoresen, Magne
2018-03-01
When fitting regression models, measurement error in any of the predictors typically leads to biased coefficients and incorrect inferences. A plethora of methods have been proposed to correct for this. Obtaining standard errors and confidence intervals using the corrected estimators can be challenging and, in addition, there is concern about remaining bias in the corrected estimators. The bootstrap, which is one option to address these problems, has received limited attention in this context. It has usually been employed by simply resampling observations, which, while suitable in some situations, is not always formally justified. In addition, the simple bootstrap does not allow for estimating bias in non-linear models, including logistic regression. Model-based bootstrapping, which can potentially estimate bias and is robust both to the original sampling design and to non-constant measurement error variance, has received limited attention. However, it faces challenges that are not present in handling regression models with no measurement error. This article develops new methods for model-based bootstrapping when correcting for measurement error in logistic regression with replicate measures. The methodology is illustrated using two examples, and a series of simulations are carried out to assess and compare the simple and model-based bootstrap methods, as well as other standard methods. While not always perfect, the model-based approaches offer some distinct improvements over the other methods. © 2017, The International Biometric Society.
Ultrafast Approximation for Phylogenetic Bootstrap
Bui Quang Minh; Nguyen, Thi; von Haeseler, Arndt
Nonparametric bootstrap has been a widely used tool in phylogenetic analysis to assess the clade support of phylogenetic trees. However, with the rapidly growing amount of data, this task remains a computational bottleneck. Recently, approximation methods such as the RAxML rapid bootstrap (RBS) and
The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.
Rodgers, J L
1999-10-01
A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
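The three cells of Rodgers' taxonomy can be made concrete on a toy two-group dataset (made-up numbers): the bootstrap samples with replacement at full size, the jackknife samples without replacement at size n-1, and the randomization test permutes the whole pooled sample.

```python
import random
import statistics

rng = random.Random(0)
a = [4.1, 5.0, 6.2, 5.5, 4.8]   # group A (toy data)
b = [5.9, 6.4, 7.1, 6.8]        # group B (toy data)

# Bootstrap: resample n observations WITH replacement within each group,
# building an empirical sampling distribution of the mean difference.
boot_diffs = []
for _ in range(2000):
    ra = rng.choices(a, k=len(a))
    rb = rng.choices(b, k=len(b))
    boot_diffs.append(statistics.mean(rb) - statistics.mean(ra))

# Jackknife: leave one observation out (WITHOUT replacement, size n - 1).
jack_means = [statistics.mean(a[:i] + a[i + 1:]) for i in range(len(a))]

# Randomization test: permute group labels over the pooled sample
# (WITHOUT replacement, replacing the whole sample) to test H0: no difference.
observed = statistics.mean(b) - statistics.mean(a)
pooled = a + b
n_perm = 2000
count = 0
for _ in range(n_perm):
    perm = rng.sample(pooled, k=len(pooled))   # a random permutation
    diff = statistics.mean(perm[len(a):]) - statistics.mean(perm[:len(a)])
    if diff >= observed:
        count += 1
p_value = count / n_perm
```

All three produce an empirical distribution that replaces a theoretical reference distribution, which is exactly the common goal the taxonomy emphasizes; they differ only along the two sampling axes.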
Directory of Open Access Journals (Sweden)
Dropkin Greg
2009-12-01
Full Text Available Abstract Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0 - 20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and Bias-corrected accelerated (BCa) methods and simulation of the Likelihood Ratio Test lead to Confidence Intervals for Excess Relative Risk (ERR) and tests against the linear model. Results The linear model shows significant large, positive values of ERR for liver and urinary cancers at latencies from 37 - 43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86) and across broad latency ranges. Confidence Intervals for ERR are comparable using Bootstrap and Likelihood Ratio Test methods and BCa 95% Confidence Intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0 - 20 mSv and 5 - 500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5 - 500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and Likelihood-based confidence intervals are broadly comparable and ERR is strictly positive by bootstrap methods for all 5 cancers. Except for the pancreas, similar estimates of
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from, all with their own formulas, tables, diagrams, and general properties, continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...
An approximate analytical approach to resampling averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, M.
2004-01-01
Using a novel reformulation, we develop a framework to compute approximate resampling data averages analytically. The method avoids multiple retraining of statistical models on the samples. Our approach uses a combination of the replica "trick" of statistical physics and the TAP approach for approximate Bayesian inference. We demonstrate our approach on regression with Gaussian processes. A comparison with averages obtained by Monte-Carlo sampling shows that our method achieves good accuracy.
Efficient bootstrap estimates for tail statistics
Breivik, Øyvind; Aarnes, Ole Johan
2017-03-01
Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
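The reason the top-entries shortcut works is that the bootstrap distribution of a tail statistic depends only on the highest order statistics. A minimal sketch for the extreme case, the sample maximum, compares Monte Carlo full-sample bootstrapping against the exact distribution P(max* ≤ x₍ᵢ₎) = (i/n)ⁿ (toy integer data, not the paper's implementation):

```python
import random

rng = random.Random(1)
data = sorted(range(1, 101))          # toy sample, n = 100
n = len(data)

# Full-sample bootstrap of the maximum (the "expensive" route).
boot_maxima = [max(rng.choices(data, k=n)) for _ in range(5000)]

# Exact bootstrap distribution of the max uses only the top order statistics:
# P(max* <= x_(i)) = (i/n)^n, so the exact bootstrap mean of the maximum is
exact_mean = sum(x * ((i / n) ** n - ((i - 1) / n) ** n)
                 for i, x in enumerate(data, start=1))

# In practice virtually all replicate maxima come from a small top subset,
# so resampling only that subset suffices for tail statistics.
top20 = set(data[-20:])
```

The probability that a replicate maximum falls below the 20th-largest entry is (0.8)¹⁰⁰ ≈ 2×10⁻¹⁰, which is why bootstrapping from the top subset reproduces the full-sample answer at a fraction of the cost.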
Directory of Open Access Journals (Sweden)
Müller Kai F
2005-10-01
Full Text Available Abstract Background For parsimony analyses, the most common way to estimate confidence is by resampling plans (nonparametric bootstrap, jackknife) and Bremer support (decay indices). The recent literature reveals that parameter settings that are quite commonly employed are not those that are recommended by theoretical considerations and by previous empirical studies. The optimal search strategy to be applied during resampling was previously addressed solely via standard search strategies available in PAUP*. The question of a compromise between search extensiveness and improved support accuracy for Bremer support received even less attention. A set of experiments was conducted on different datasets to find an empirical cut-off point at which increased search extensiveness does not significantly change Bremer support and jackknife or bootstrap proportions any more. Results For the number of replicates needed for accurate estimates of support in resampling plans, a diagram is provided that helps to address the question whether apparently different support values really differ significantly. It is shown that the use of random addition cycles and parsimony ratchet iterations during bootstrapping does not translate into higher support, nor does any extension of the search extensiveness beyond the rather moderate effort of TBR (tree bisection and reconnection) branch swapping plus saving one tree per replicate. Instead, in case of very large matrices, saving more than one shortest tree per iteration and using a strict consensus tree of these yields decreased support compared to saving only one tree. This can be interpreted as a small risk of overestimating support but should be more than compensated by other factors that counteract an enhanced type I error. With regard to Bremer support, a rule of thumb can be derived stating that not much is gained relative to the surplus computational effort when searches are extended beyond 20 ratchet iterations per
Internal validation of risk models in clustered data: a comparison of bootstrap schemes
Bouwmeester, W.; Moons, K.G.M.; Kappen, T.H.; van Klei, W.A.; Twisk, J.W.R.; Eijkemans, M.J.C.; Vergouwe, Y.
2013-01-01
Internal validity of a risk model can be studied efficiently with bootstrapping to assess possible optimism in model performance. Assumptions of the regular bootstrap are violated when the development data are clustered. We compared alternative resampling schemes in clustered data for the estimation
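The key point of a cluster bootstrap scheme is that whole clusters, not individual observations, are the resampling units, so within-cluster correlation is preserved. A minimal sketch with made-up centre data (not the authors' validation code):

```python
import random

rng = random.Random(7)

# Clustered data: observations nested within centres (toy numbers).
clusters = {
    "centre_A": [2.1, 2.5, 1.9],
    "centre_B": [3.0, 2.8],
    "centre_C": [1.5, 1.7, 1.6, 1.8],
}

def cluster_bootstrap(clusters, rng):
    """Resample whole clusters with replacement, keeping members together.

    This respects the within-cluster correlation that an observation-level
    (regular) bootstrap would break when the development data are clustered.
    """
    ids = list(clusters)
    drawn = rng.choices(ids, k=len(ids))
    return [obs for cid in drawn for obs in clusters[cid]]

resample = cluster_bootstrap(clusters, rng)
```

An observation-level bootstrap would mix members across centres; drawing cluster IDs with replacement instead is the standard repair when the regular bootstrap's independence assumption is violated.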
Larriba, Yolanda; Rueda, Cristina; Fernández, Miguel A; Peddada, Shyamal D
2018-01-01
Motivation: Gene-expression data obtained from high-throughput technologies are subject to various sources of noise, and accordingly the raw data are pre-processed before formal analysis. Normalization of the data is a key pre-processing step, since it removes systematic variations across arrays. There are numerous normalization methods available in the literature. Based on our experience, in the context of oscillatory systems, such as cell-cycle, circadian clock, etc., the choice of the normalization method may substantially impact the determination of a gene to be rhythmic. Thus rhythmicity of a gene can purely be an artifact of how the data were normalized. Since the determination of rhythmic genes is an important component of modern toxicological and pharmacological studies, it is important to determine truly rhythmic genes that are robust to the choice of a normalization method. Results: In this paper we introduce a rhythmicity measure and a bootstrap methodology to detect rhythmic genes in an oscillatory system. Although the proposed methodology can be used for any high-throughput gene expression data, in this paper we illustrate the proposed methodology using several publicly available circadian clock microarray gene-expression datasets. We demonstrate that the choice of normalization method has very little effect on the proposed methodology. Specifically, for any pair of normalization methods considered in this paper, the resulting values of the rhythmicity measure are highly correlated. Thus it suggests that the proposed measure is robust to the choice of a normalization method. Consequently, the rhythmicity of a gene is potentially not a mere artifact of the normalization method used. Lastly, as demonstrated in the paper, the proposed bootstrap methodology can also be used for simulating data for genes participating in an oscillatory system using a reference dataset.
Availability: A user friendly code implemented in R language can be downloaded from http://www.eio.uva.es/~miguel/robustdetectionprocedure.html.
Magno, Alexandre
2013-01-01
A practical, step-by-step tutorial on developing websites for mobile using Bootstrap. This book is for anyone who wants to get acquainted with the new features available in Bootstrap 3 and who wants to develop websites with the mobile-first feature of Bootstrap. The reader should have a basic knowledge of Bootstrap as a frontend framework.
Image re-sampling detection through a novel interpolation kernel.
Hilal, Alaa
2018-06-01
Image re-sampling involved in re-size and rotation transformations is an essential element block in a typical digital image alteration. Fortunately, traces left from such processes are detectable, proving that the image has undergone a re-sampling transformation. Within this context, we present in this paper two original contributions. First, we propose a new re-sampling interpolation kernel. It depends on five independent parameters that control its amplitude, angular frequency, standard deviation, and duration. We then demonstrate its capacity to imitate the behavior of the most frequent interpolation kernels used in digital image re-sampling applications. Second, the proposed model is used to characterize and detect the correlation coefficients involved in re-sampling transformations. The involved process includes a minimization of an error function using the gradient method. The proposed method is assessed over a large database of 11,000 re-sampled images. Additionally, it is implemented within an algorithm in order to assess images that had undergone complex transformations. Obtained results demonstrate better performance and reduced processing time when compared to a reference method, validating the suitability of the proposed approaches. Copyright © 2018 Elsevier B.V. All rights reserved.
Hunink, Maria; Bult, J.R.; De Vries, J; Weinstein, MC
1998-01-01
Purpose. To illustrate the use of a nonparametric bootstrap method in the evaluation of uncertainty in decision models analyzing cost-effectiveness. Methods. The authors reevaluated a previously published cost-effectiveness analysis that used a Markov model comparing initial percutaneous
Storyboarding: A Method for Bootstrapping the Design of Computer-Based Educational Tasks
Jones, Ian
2008-01-01
There has been a recent call for the use of more systematic thought experiments when investigating learning. This paper presents a storyboarding method for capturing and sharing initial ideas and their evolution in the design of a mathematics learning task. The storyboards produced can be considered as "virtual data" created by thought experiments…
Energy Technology Data Exchange (ETDEWEB)
Rastelli, Leonardo [C.N. Yang Institute for Theoretical Physics, Stony Brook University, Stony Brook, NY (United States)
2016-04-15
This contribution collects background material and references for the overview talk that was delivered in the 21st European string workshop and 3rd COST MP1210 meeting, 'The String Theory Universe'. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang; Huang, Jianhua Z.
2010-01-01
, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
The bootstrap and edgeworth expansion
Hall, Peter
1992-01-01
This monograph addresses two quite different topics, in the belief that each can shed light on the other. Firstly, it lays the foundation for a particular view of the bootstrap. Secondly, it gives an account of Edgeworth expansion. Chapter 1 is about the bootstrap, with almost no mention of Edgeworth expansion; Chapter 2 is about Edgeworth expansion, with scarcely a word about the bootstrap; and Chapters 3 and 4 bring these two themes together, using Edgeworth expansion to explore and develop the properties of the bootstrap. The book is aimed at a graduate-level audience with some exposure to the methods of theoretical statistics. However, technical details are delayed until the last chapter (entitled "Details of Mathematical Rigour"), and so a mathematically able reader without knowledge of the rigorous theory of probability will have no trouble understanding the first four-fifths of the book. The book simultaneously fills two gaps in the literature; it provides a very readable graduate level account of t...
Energy Technology Data Exchange (ETDEWEB)
Cauthen, Katherine Regina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lambert, Gregory Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Finley, Patrick D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ross, David [US Dept. of Veterans Affairs, Washington, DC (United States); Chartier, Maggie [US Dept. of Veterans Affairs, Washington, DC (United States); Davey, Victoria J. [US Dept. of Veterans Affairs, Washington, DC (United States)
2015-10-01
There is mounting evidence that alcohol use is significantly linked to lower HCV treatment response rates in interferon-based therapies, though some of the evidence is conflicting. Furthermore, although health care providers recommend reducing or abstaining from alcohol use prior to treatment, many patients do not succeed in doing so. The goal of this meta-analysis was to systematically review and summarize the English-language literature up through January 30, 2015 regarding the relationship between alcohol use and HCV treatment outcomes, among patients who were not required to abstain from alcohol use in order to receive treatment. Seven pertinent articles studying 1,751 HCV-infected patients were identified. Log-ORs of HCV treatment response for heavy alcohol use and light alcohol use were calculated and compared. We employed a hierarchical Bayesian meta-analytic model to accommodate the small sample size. The summary estimate for the log-OR of HCV treatment response was -0.775 with a 95% credible interval of (-1.397, -0.236). The results of the Bayesian meta-analysis are slightly more conservative compared to those obtained from a bootstrapped, random effects model. We found evidence of heterogeneity (Q = 14.489, p = 0.025), accounting for 60.28% of the variation among log-ORs. Meta-regression to capture the sources of this heterogeneity did not identify any of the covariates investigated as significant. This meta-analysis confirms that heavy alcohol use is associated with decreased HCV treatment response compared to lighter levels of alcohol use. Further research is required to characterize the mechanism by which alcohol use affects HCV treatment response.
PARTICLE FILTER BASED VEHICLE TRACKING APPROACH WITH IMPROVED RESAMPLING STAGE
Directory of Open Access Journals (Sweden)
Wei Leong Khong
2014-02-01
Full Text Available Optical sensors based vehicle tracking can be widely implemented in traffic surveillance and flow control. The vast development of video surveillance infrastructure in recent years has drawn the current research focus towards vehicle tracking using high-end and low-cost optical sensors. However, tracking vehicles via such sensors can be challenging due to the high probability of changing vehicle appearance and illumination, besides occlusion and overlapping incidents. The particle filter has been proven as an approach which can overcome the nonlinear and non-Gaussian situations caused by cluttered backgrounds and occlusion incidents. Unfortunately, the conventional particle filter approach encounters particle degeneracy, especially during and after occlusion. Sampling importance resampling (SIR) is an important step to overcome the drawback of the particle filter, but SIR faces the problem of sample impoverishment when heavy particles are statistically selected many times. In this work, a genetic algorithm is proposed for the particle filter resampling stage, so that the estimated position converges faster to the real position of the target vehicle under various occlusion incidents. The experimental results show that the improved particle filter with genetic algorithm resampling manages to increase the tracking accuracy while reducing the particle sample size in the resampling stage.
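For context on the SIR step the abstract improves upon, below is a sketch of systematic resampling, a standard low-variance implementation of the particle-filter resampling stage (this is the baseline scheme, not the authors' genetic-algorithm variant):

```python
import random

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw, then evenly spaced pointers.

    Each particle i is selected roughly n * w_i times, which reduces the
    variance of multinomial resampling in SIR particle filters.
    """
    n = len(weights)
    total = sum(weights)
    # Evenly spaced positions with a single random offset in [0, 1/n).
    positions = [(rng.random() + i) / n for i in range(n)]
    indices = []
    cumulative, i = weights[0] / total, 0
    for pos in positions:
        # Advance through the cumulative weights until pos is covered.
        while pos > cumulative and i < n - 1:
            i += 1
            cumulative += weights[i] / total
        indices.append(i)
    return indices

rng = random.Random(3)
weights = [0.01, 0.01, 0.9, 0.04, 0.04]   # one dominant particle
idx = systematic_resample(weights, rng)
```

Because the dominant particle is copied close to n times its weight, repeated selection of heavy particles is exactly the sample-impoverishment effect the genetic-algorithm resampling in the paper is designed to mitigate.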
Bootstrapping pronunciation models
CSIR Research Space (South Africa)
Davel, M
2006-07-01
Full Text Available -scarce language. During the procedure known as ‘bootstrapping’, a model is improved iteratively via a controlled series of increments, at each stage using the previous model to generate the next. This self- improving circularity distinguishes bootstrapping...-to-phoneme rules (the second representation) can be used to identify possible errors that require re-verification. In contrast, during the bootstrapping of acoustic models for speech recognition, both representations are amenable to automated analysis...
Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund
2016-02-18
In Multivariate Statistical Process Control, when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
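The general idea of bootstrapping control limits from Normal Operating Conditions data can be sketched as follows. Everything here is synthetic and simplified (raw Gaussian "contributions" instead of PCA-based ones, and a simple percentile limit), so it illustrates the strategy rather than the paper's method:

```python
import random
import statistics

rng = random.Random(11)

# Toy "training" contributions under Normal Operating Conditions (NOC):
# rows = historical batches, columns = process variables (all synthetic).
n_batches, n_vars = 50, 4
training = [[rng.gauss(0, 1) for _ in range(n_vars)] for _ in range(n_batches)]

def percentile(sorted_vals, q):
    """Simple empirical percentile of a pre-sorted list."""
    k = min(len(sorted_vals) - 1, int(q * len(sorted_vals)))
    return sorted_vals[k]

def bootstrap_upper_limits(training, n_boot=1000, q=0.95, rng=rng):
    """Per-variable upper confidence limits from bootstrap re-sampling of
    the NOC batches: average the resampled 95th percentiles."""
    n = len(training)
    limits = []
    for v in range(len(training[0])):
        col = [row[v] for row in training]
        reps = [percentile(sorted(rng.choices(col, k=n)), q)
                for _ in range(n_boot)]
        limits.append(statistics.mean(reps))
    return limits

limits = bootstrap_upper_limits(training)

# Judge a new run: variables whose contribution exceeds the limit are flagged.
new_run = [0.1, 0.3, 5.0, -0.2]       # variable 2 is deliberately faulty
flagged = [v for v, (c, l) in enumerate(zip(new_run, limits)) if c > l]
```

Only the variable whose contribution genuinely exceeds historical NOC behavior is flagged, rather than whichever variable happens to contribute most, which is the false-reading problem the abstract describes.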
International Nuclear Information System (INIS)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-02-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models. A theory that saturates this bound is not known yet.
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
DEFF Research Database (Denmark)
Hounyo, Ulrich; Varneskov, Rasmus T.
We provide a new resampling procedure - the local stable bootstrap - that is able to mimic the dependence properties of realized power variations for pure-jump semimartingales observed at different frequencies. This allows us to propose a bootstrap estimator and inference procedure for the activity index of the underlying process, β, as well as a bootstrap test for whether it obeys a jump-diffusion or a pure-jump process, that is, of the null hypothesis H₀: β=2 against the alternative H₁: β<2. ... bootstrap power variations, activity index estimator, and diffusion test for H₀. Moreover, the finite sample size and power properties of the proposed diffusion test are compared to those of benchmark tests using Monte Carlo simulations. Unlike existing procedures, our bootstrap test is correctly sized in general settings. Finally, we illustrate use...
Energy Technology Data Exchange (ETDEWEB)
Rejon-Barrera, Fernando [Institute for Theoretical Physics, University of Amsterdam, Science Park 904, Postbus 94485, 1090 GL, Amsterdam (Netherlands); Robbins, Daniel [Department of Physics, Texas A&M University, TAMU 4242, College Station, TX 77843 (United States)
2016-01-22
We work out all of the details required for implementation of the conformal bootstrap program applied to the four-point function of two scalars and two vectors in an abstract conformal field theory in arbitrary dimension. This includes a review of which tensor structures make appearances, a construction of the projectors onto the required mixed symmetry representations, and a computation of the conformal blocks for all possible operators which can be exchanged. These blocks are presented as differential operators acting upon the previously known scalar conformal blocks. Finally, we set up the bootstrap equations which implement crossing symmetry. Special attention is given to the case of conserved vectors, where several simplifications occur.
Early Stop Criterion from the Bootstrap Ensemble
DEFF Research Database (Denmark)
Hansen, Lars Kai; Larsen, Jan; Fog, Torben L.
1997-01-01
This paper addresses the problem of generalization error estimation in neural networks. A new early stop criterion based on a Bootstrap estimate of the generalization error is suggested. The estimate does not require the network to be trained to the minimum of the cost function, as required by other methods based on asymptotic theory. Moreover, in contrast to methods based on cross-validation, which require data to be left out for testing and thus bias the estimate, the Bootstrap technique does not have this disadvantage. The potential of the suggested technique is demonstrated on various time...
Directory of Open Access Journals (Sweden)
Elias Chaibub Neto
In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably faster for small sample sizes and considerably faster for moderate ones. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling.
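The multinomial-weighting idea described in this abstract can be sketched in a few lines of NumPy (the paper's own implementation is in R; the variable names and the choice of the sample mean as the statistic are ours, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)      # observed sample
n, B = x.size, 10_000        # sample size and number of bootstrap replications

# Multinomial formulation: rather than building B resampled copies of x, draw
# how many times each observation would appear in each resample, turn the
# counts into weights, and get every bootstrap sample mean in one matrix product.
W = rng.multinomial(n, np.full(n, 1.0 / n), size=B) / n   # (B, n) weight matrix
boot_means = W @ x                                        # all B replications at once

# Equivalent loop-based bootstrap, for comparison
loop_means = np.array([x[rng.integers(0, n, n)].mean() for _ in range(B)])
```

Both `boot_means` and `loop_means` approximate the same bootstrap distribution; the matrix form simply trades the loop for one multinomial draw and one matrix product.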
Bootstrapping language acquisition.
Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark
2017-07-01
The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Iliesiu, Luca [Joseph Henry Laboratories, Princeton University, Princeton, NJ 08544 (United States); Kos, Filip; Poland, David [Department of Physics, Yale University, New Haven, CT 06520 (United States); Pufu, Silviu S. [Joseph Henry Laboratories, Princeton University, Princeton, NJ 08544 (United States); Simmons-Duffin, David [School of Natural Sciences, Institute for Advanced Study, Princeton, NJ 08540 (United States); Yacoby, Ran [Joseph Henry Laboratories, Princeton University, Princeton, NJ 08544 (United States)
2016-03-17
We study the conformal bootstrap for a 4-point function of fermions 〈ψψψψ〉 in 3D. We first introduce an embedding formalism for 3D spinors and compute the conformal blocks appearing in fermion 4-point functions. Using these results, we find general bounds on the dimensions of operators appearing in the ψ×ψ OPE, and also on the central charge C_T. We observe features in our bounds that coincide with scaling dimensions in the Gross-Neveu models at large N. We also speculate that other features could coincide with a fermionic CFT containing no relevant scalar operators.
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different 'x' and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
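The set-size experiment in this abstract can be imitated on synthetic data: bootstrap subsets of increasing size x from a "full cohort" and watch the 5th/95th percentile limits stabilize toward the full-cohort values. The cohort here is simulated Gaussian data with invented parameters, not the study's VF measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical cohort: 500 visual-field mean sensitivities (dB); the
# distribution and its parameters are invented for illustration only
cohort = rng.normal(30.0, 2.0, size=500)

def bootstrapped_limits(data, set_size, n_resamples=1000):
    """Average 5th/95th percentile limits over bootstrap subsets of size set_size."""
    idx = rng.integers(0, data.size, size=(n_resamples, set_size))
    samples = data[idx]
    return (np.percentile(samples, 5, axis=1).mean(),
            np.percentile(samples, 95, axis=1).mean())

ground_truth = (np.percentile(cohort, 5), np.percentile(cohort, 95))
for set_size in (30, 60, 150, 300):   # compare against the full-cohort limits
    print(set_size, bootstrapped_limits(cohort, set_size), ground_truth)
```

As x grows, the averaged bootstrap limits approach the full-cohort percentiles, which is the criterion the study uses to pick a sufficient normative cohort size.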
More N=4 superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C.
2017-08-01
In this long overdue second installment, we continue to develop the conformal bootstrap program for N=4 superconformal field theories (SCFTs) in four dimensions via an analysis of the correlation function of four stress-tensor supermultiplets. We review analytic results for this correlator and make contact with the SCFT/chiral algebra correspondence of Beem et al. [Commun. Math. Phys. 336, 1359 (2015), 10.1007/s00220-014-2272-x]. We demonstrate that the constraints of unitarity and crossing symmetry require the central charge c to be greater than or equal to 3/4 in any interacting N=4 SCFT. We apply numerical bootstrap methods to derive upper bounds on scaling dimensions and operator product expansion coefficients for several low-lying, unprotected operators as a function of the central charge. We interpret our bounds in the context of N=4 super Yang-Mills theories, formulating a series of conjectures regarding the embedding of the conformal manifold, parametrized by the complexified gauge coupling, into the space of scaling dimensions and operator product expansion coefficients. Our conjectures assign a distinguished role to points on the conformal manifold that are self-dual under a subgroup of the S-duality group. This paper contains a more detailed exposition of a number of results previously reported in Beem et al. [Phys. Rev. Lett. 111, 071601 (2013), 10.1103/PhysRevLett.111.071601] in addition to new results.
Adams, Jenny; Cheng, Dunlei; Lee, John; Shock, Tiffany; Kennedy, Kathleen; Pate, Scotty
2014-07-01
Physical fitness testing is a common tool for motivating employees with strenuous occupations to reach and maintain a minimum level of fitness. Nevertheless, the use of such tests can be hampered by several factors, including required compliance with US antidiscrimination laws. The Highland Park (Texas) Department of Public Safety implemented testing in 1991, but no single test adequately evaluated its sworn employees, who are cross-trained and serve as police officers and firefighters. In 2010, the department's fitness experts worked with exercise physiologists from Baylor Heart and Vascular Hospital to develop and evaluate a single test that would be equitable regardless of race/ethnicity, disability, sex, or age >50 years. The new test comprised a series of exercises to assess overall fitness, followed by two sequences of job-specific tasks related to firefighting and police work, respectively. The study group of 50 public safety officers took the test; raw data (e.g., the number of repetitions performed or the time required to complete a task) were collected during three quarterly testing sessions. The statistical bootstrap method was then used to determine the levels of performance that would correlate with 0, 1, 2, or 3 points for each task. A sensitivity analysis was done to determine the overall minimum passing score of 17 points. The new physical fitness test and scoring system have been incorporated into the department's policies and procedures as part of the town's overall employee fitness program.
Bootstrapping Kernel-Based Semiparametric Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael
by accommodating a non-negligible bias. A noteworthy feature of the assumptions under which the result is obtained is that reliance on a commonly employed stochastic equicontinuity condition is avoided. The second main result shows that the bootstrap provides an automatic method of correcting for the bias even when it is non-negligible.
Inverse bootstrapping conformal field theories
Li, Wenliang
2018-01-01
We propose a novel approach to study conformal field theories (CFTs) in general dimensions. In the conformal bootstrap program, one usually searches for consistent CFT data that satisfy crossing symmetry. In the new method, we reverse the logic and interpret manifestly crossing-symmetric functions as generating functions of conformal data. Physical CFTs can be obtained by scanning the space of crossing-symmetric functions. By truncating the fusion rules, we are able to concentrate on the low-lying operators and derive some approximate relations for their conformal data. It turns out that the free scalar theory, the 2d minimal model CFTs, the ϕ⁴ Wilson-Fisher CFT, the Lee-Yang CFTs and the Ising CFTs are consistent with the universal relations from the minimal fusion rule ϕ₁ × ϕ₁ = I + ϕ₂ + T, where ϕ₁, ϕ₂ are scalar operators, I is the identity operator and T is the stress tensor.
Directory of Open Access Journals (Sweden)
Rizwan Raheem Ahmed
2017-03-01
The objective of this research is to examine the effect of business ethics (BE) and intellectual capital (IC) on organizational performance (OP). In order to run this study, a conceptual model was designed based on the literature review, and the employees of knowledge-based organizations in the pharmaceutical sector were surveyed using a closed-ended questionnaire. Modern successful and thriving organizations are those that create IC and convert it into applicable methods to improve their activities and performance within the boundaries of BE. This research is exploratory and quantitative in nature: 400 responses were directly gathered from the employees of the pharmaceutical industry through a five-point scaled questionnaire. This research examined the direct and indirect effects of BE and IC on the OP. Structural equation modeling (SEM), descriptive statistics, correlation, and multiple regression techniques were used to analyze the impact of IC and BE on the performance. The bootstrapping method is employed in order to test the mediating effect of variables. Two-step SEM was applied to the models to regress the cause-and-effect relation. The findings depicted that BE and IC have a very significant effect on the performance of pharmaceutical organizations. General BE, ethics in finance, ethics in human resource management, and ethics in sales and marketing have a direct and significant impact on the OP. Human capital, structural capital and relational capital have a significant indirect (mediating) effect on the performance of the pharmaceutical industry. Finally, it has been concluded from the results of the research study that IC is the major contributor to the OP as a mediating variable, with a defined set of principles of BE, in the pharmaceutical sector of Pakistan.
Directory of Open Access Journals (Sweden)
Rizwan Raheem Ahmed
2018-05-01
The objective of this research article is to examine the role of Pakistan’s pharmaceutical industry in job creation opportunities, with the sacred intention of eradicating poverty and expanding economic activities. This research is quantitative in nature, and the data were directly gathered through closed-ended questionnaires from 300 respondents. Besides the predictors, four mediating variables have also been taken into consideration that contribute indirectly to job creation opportunities. Bootstrapping and normal theory methods have been employed in order to examine the impact of the predictors and mediating variables. The results of this research confirmed that the pharmaceutical industry plays a vital role in job creation in Pakistan. It is further concluded that the pharmaceutical industry has a direct and significant impact on job creation by providing indigenous and direct job opportunities in sales, marketing, and other supporting departments for both skilled and unskilled workers. The pharmaceutical industry also provides indirect job opportunities through other industries which are closely linked with it, such as pharmaceutical distributors, dealers, retailers, wholesalers, the hotel industry, and the event management industry. It is also determined that the pharmaceutical industry acts like a knowledge- and skills-imparting institution. Therefore, skills-based training and organizational learning are major mediating variables that transform unskilled people into human assets, which further triggers future job prospects. The pharmaceutical industry is one of the biggest industries in Pakistan, providing plentiful opportunities for new jobs with consistent growth. Thus, mediating variables such as motivation and interpersonal influence also played an active role in new job creation
Bootstrapping quarks and gluons
Energy Technology Data Exchange (ETDEWEB)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity ±1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Bootstrapping quarks and gluons
International Nuclear Information System (INIS)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity ±1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Bootstrap Dynamical Symmetry Breaking
Directory of Open Access Journals (Sweden)
Wei-Shu Hou
2013-01-01
Despite the emergence of a 125 GeV Higgs-like particle at the LHC, we explore the possibility of dynamical electroweak symmetry breaking by strong Yukawa coupling of very heavy new chiral quarks Q. Taking the 125 GeV object to be a dilaton with suppressed couplings, we note that the Goldstone bosons G exist as longitudinal modes V_L of the weak bosons and would couple to Q with Yukawa coupling λ_Q. With m_Q ≳ 700 GeV from LHC, the strong λ_Q ≳ 4 could lead to deeply bound QQ̄ states. We postulate that the leading “collapsed state,” the color-singlet (heavy isotriplet, pseudoscalar) QQ̄ meson π₁, is G itself, and a gap equation without Higgs is constructed. Dynamical symmetry breaking is effected via strong λ_Q, generating m_Q while self-consistently justifying treating G as massless in the loop; hence, “bootstrap.” Solving such a gap equation, we find that m_Q should be several TeV, or λ_Q ≳ 4π, and would become much heavier if there is a light Higgs boson. For such heavy chiral quarks, we find analogy with the π-N system, by which we conjecture the possible annihilation phenomena of QQ̄ → nV_L with high multiplicity, the search of which might be aided by Yukawa-bound QQ̄ resonances.
Bootstrap Power of Time Series Goodness-of-Fit Tests
Directory of Open Access Journals (Sweden)
Sohail Chand
2013-10-01
In this article, we examine the power of various versions of the Box-Pierce statistic and the Cramér-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms are provided for the power calculations, and a comparison is made between the semiparametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation reverses for the Cramér-von Mises test. Moreover, we find that the dynamic bootstrap method is better than the fixed-design bootstrap method.
Speckle reduction in digital holography with resampling ring masks
Zhang, Wenhui; Cao, Liangcai; Jin, Guofan
2018-01-01
One-shot digital holographic imaging has the advantages of high stability and low temporal cost. However, the reconstruction is affected by speckle noise. A resampling ring-mask method in the spectrum domain is proposed for speckle reduction. The useful spectrum of one hologram is divided into several sub-spectra by ring masks. In the reconstruction, the angular spectrum transform is applied to guarantee calculation accuracy, as it involves no approximation. N reconstructed amplitude images are calculated from the corresponding sub-spectra. Thanks to speckle's random distribution, superimposing these N uncorrelated amplitude images leads to a final reconstructed image with lower speckle noise. Normalized relative standard deviation values of the reconstructed image are used to evaluate the reduction of speckle. The effect of the method on the spatial resolution of the reconstructed image is also quantitatively evaluated. Experimental and simulation results prove the feasibility and effectiveness of the proposed method.
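The ring-mask partition described in this abstract can be sketched with NumPy. This is a minimal toy, not the paper's method: a plain inverse FFT stands in for the angular spectrum propagation, the random "hologram" is a placeholder, and all names are ours:

```python
import numpy as np

def ring_mask_reconstruct(spectrum, n_rings=4):
    """Split a centered hologram spectrum into concentric ring masks,
    reconstruct one amplitude image per sub-spectrum, and average the
    uncorrelated amplitudes to suppress speckle."""
    h, w = spectrum.shape
    yy, xx = np.mgrid[:h, :w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    # half-open radial bins partition the whole spectrum exactly once
    edges = np.linspace(0.0, radius.max() + 1.0, n_rings + 1)
    amplitudes = [
        np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * ((radius >= lo) & (radius < hi)))))
        for lo, hi in zip(edges[:-1], edges[1:])
    ]
    return np.mean(amplitudes, axis=0)

# toy usage on a random field standing in for a recorded hologram
field = np.random.default_rng(2).normal(size=(64, 64))
spectrum = np.fft.fftshift(np.fft.fft2(field))
image = ring_mask_reconstruct(spectrum)
```

Because the masks are disjoint, the N sub-reconstructions carry independent speckle realizations, and averaging their amplitudes lowers the speckle contrast at some cost in spatial resolution, which is the trade-off the paper quantifies.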
International Nuclear Information System (INIS)
Wang, Guodong; He, Zhen; Xue, Li; Cui, Qingan; Lv, Shanshan; Zhou, Panpan
2017-01-01
Factors which significantly affect product reliability are of great interest to reliability practitioners. This paper proposes a bootstrap-based methodology for identifying significant factors when both location and scale parameters of the smallest extreme value distribution vary over experimental factors. An industrial thermostat experiment is presented, analyzed, and discussed as an illustrative example. The analysis results show that 1) misspecifying a constant scale parameter may lead to identifying spurious effects; 2) the important factors identified by different bootstrap methods (i.e., percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping) differ; 3) the number of factors that significantly affect the 10th percentile lifetime is smaller than the number of important factors identified at the 63.21st percentile. - Highlights: • Product reliability is improved by design of experiments when both scale and location parameters of the smallest extreme value distribution vary with experimental factors. • A bootstrap-based methodology is proposed to identify important factors which significantly affect the 100pth lifetime percentile. • Bootstrap confidence intervals associated with experimental factors are obtained using three bootstrap methods (i.e., percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping). • The important factors identified by different bootstrap methods differ. • The number of factors that significantly affect the 10th percentile is smaller than the number of important factors identified at the 63.21st percentile.
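A core building block of this abstract's methodology is a bootstrap confidence interval for a lifetime percentile under the smallest extreme value (Gumbel-min) distribution. The sketch below shows only the plain percentile-bootstrap variant on simulated lifetimes with invented parameters; the paper additionally uses bias-corrected and BCa intervals:

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical lifetimes from a smallest extreme value distribution with
# location mu = 100 and scale sigma = 10, via inverse-CDF sampling:
# F(x) = 1 - exp(-exp((x - mu) / sigma))  =>  x = mu + sigma*log(-log(1 - U))
mu, sigma, n = 100.0, 10.0, 40
lifetimes = mu + sigma * np.log(-np.log(1.0 - rng.uniform(size=n)))

def percentile_boot_ci(data, p, n_boot=5000, alpha=0.05):
    """Plain percentile-bootstrap CI for the 100p-th lifetime percentile."""
    idx = rng.integers(0, data.size, size=(n_boot, data.size))
    boot_stats = np.percentile(data[idx], 100 * p, axis=1)
    return np.percentile(boot_stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

b10_lo, b10_hi = percentile_boot_ci(lifetimes, 0.10)       # CI for the B10 life
char_lo, char_hi = percentile_boot_ci(lifetimes, 0.6321)   # characteristic life
```

The 63.21st percentile is singled out because, for this distribution, it equals the location parameter mu (the characteristic life), which is why the paper contrasts factor effects at the 10th and 63.21st percentiles.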
Bootstrapping pronunciation dictionaries: practical issues
CSIR Research Space (South Africa)
Davel, MH
2005-09-01
Bootstrapping techniques are an efficient way to develop electronic pronunciation dictionaries, but require fast system response to be practical for medium-to-large lexicons. In addition, user errors are inevitable during this process...
Bootstrapping in language resource generation
CSIR Research Space (South Africa)
Davel, MH
2003-11-01
... by Schultz [1]. Bootstrapping approaches are applicable to various language resource development tasks, specifically where an automated mechanism can be defined to convert between various representations of the data considered. In the above ex...
Automated modal parameter estimation using correlation analysis and bootstrap sampling
Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.
2018-02-01
The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences by the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis. The final step of the algorithm is a fuzzy c-means clustering procedure applied to
Bootstrap inference when using multiple imputation.
Schomaker, Michael; Heumann, Christian
2018-04-16
Many modern estimators require bootstrapping to calculate confidence intervals because either no analytic standard error is available or the distribution of the parameter of interest is nonsymmetric. It remains, however, unclear how to obtain valid bootstrap inference when dealing with multiple imputation to address missing data. We present 4 methods that are intuitively appealing, easy to implement, and combine bootstrap estimation with multiple imputation. We show that 3 of the 4 approaches yield valid inference, but that the performance of the methods varies with respect to the number of imputed data sets and the extent of missingness. Simulation studies reveal the behavior of our approaches in finite samples. A topical analysis from HIV treatment research, which determines the optimal timing of antiretroviral treatment initiation in young children, demonstrates the practical implications of the 4 methods in a sophisticated and realistic setting. This analysis suffers from missing data and uses the g-formula for inference, a method for which no standard errors are available. Copyright © 2018 John Wiley & Sons, Ltd.
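One of the intuitive combination schemes of the kind this abstract studies is "bootstrap, then impute": resample the incomplete data, run multiple imputation inside each bootstrap replicate, and read the confidence interval off the bootstrap distribution. The sketch below uses a deliberately crude hot-deck imputation and invented data; it illustrates the ordering of the two procedures, not the paper's specific methods or recommendation:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(10.0, 2.0, size=200)
x[rng.random(x.size) < 0.2] = np.nan     # ~20% missing completely at random

def mi_mean(sample, m=5):
    """Toy multiple imputation: hot-deck fill from observed values, m times,
    then pool the m point estimates (Rubin's rule for the point estimate)."""
    observed = sample[~np.isnan(sample)]
    estimates = []
    for _ in range(m):
        filled = sample.copy()
        miss = np.isnan(filled)
        filled[miss] = rng.choice(observed, size=miss.sum())
        estimates.append(filled.mean())
    return np.mean(estimates)

# "Bootstrap, then impute": resample the incomplete data, run MI inside each
# bootstrap replicate, and take percentiles of the bootstrap distribution.
boot = np.array([mi_mean(x[rng.integers(0, x.size, x.size)]) for _ in range(500)])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
```

The alternative ordering, "impute, then bootstrap" within each completed data set, is another of the variants the paper compares; the two can behave differently as the number of imputations and the extent of missingness change.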
Bootstrap prediction and Bayesian prediction under misspecified models
Fushiki, Tadayoshi
2005-01-01
We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang
2010-10-01
Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general purpose approach to statistical inferences, the bootstrap has found wide applications in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to the inference approach based on the asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
Bootstrapping pre-averaged realized volatility under market microstructure noise
DEFF Research Database (Denmark)
Hounyo, Ulrich; Goncalves, Sílvia; Meddahi, Nour
The main contribution of this paper is to propose a bootstrap method for inference on integrated volatility based on the pre-averaging approach of Jacod et al. (2009), where the pre-averaging is done over all possible overlapping blocks of consecutive observations. The overlapping nature of the pre-averaged returns implies that these are kn-dependent with kn growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the "blocks of blocks" bootstrap method suggested by Politis and Romano (1992) (and further studied by Bühlmann and Künsch (1995)) is valid only when volatility is constant. The failure of the blocks of blocks bootstrap is due to the heterogeneity of the squared pre-averaged returns when volatility is stochastic. To preserve both the dependence and the heterogeneity of squared pre-averaged returns, we propose a novel procedure...
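For readers unfamiliar with blockwise bootstraps, a minimal moving-block bootstrap for the mean of a kn-dependent series is sketched below; the series (a moving average of iid noise), block length, and sample sizes are illustrative assumptions, not the paper's setup. The point is that resampling whole blocks preserves serial dependence that a naive iid bootstrap would destroy.

```python
import numpy as np

rng = np.random.default_rng(5)

# A kn-dependent toy series: moving average of iid noise, standing in
# for the overlapping pre-averaged returns described in the abstract
n, kn = 1000, 5
e = rng.normal(size=n + kn)
x = np.array([e[t:t + kn].mean() for t in range(n)])

def block_bootstrap_se(x, block_len, n_boot=1000):
    """Moving-block bootstrap SE of the sample mean: draw overlapping
    blocks at random and concatenate until the original length."""
    m = x.size
    starts = np.arange(m - block_len + 1)
    n_blocks = int(np.ceil(m / block_len))
    means = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.choice(starts, size=n_blocks, replace=True)
        means[b] = np.concatenate([x[i:i + block_len] for i in idx])[:m].mean()
    return means.std()

naive_se = x.std() / np.sqrt(n)        # ignores the kn-dependence
block_se = block_bootstrap_se(x, block_len=20)
print(f"naive iid SE: {naive_se:.4f}, block bootstrap SE: {block_se:.4f}")
```

On positively autocorrelated data like this the block bootstrap SE exceeds the naive iid SE, which understates the uncertainty of the mean.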
Bootstrapping a time series model
International Nuclear Information System (INIS)
Son, M.S.
1984-01-01
The bootstrap is a methodology for estimating standard errors. The idea is to use a Monte Carlo simulation experiment based on a nonparametric estimate of the error distribution. The main objective of this dissertation was to demonstrate the use of the bootstrap to attach standard errors to coefficient estimates and multi-period forecasts in a second-order autoregressive model fitted by least squares and maximum likelihood estimation. A secondary objective was to present the bootstrap in the context of two econometric equations describing the unemployment rate and individual income tax in the state of Oklahoma. As it turns out, the conventional asymptotic formulae (both the least squares and maximum likelihood estimates) for estimating standard errors appear to overestimate the true standard errors. But there are two problems: 1) the first two observations y_1 and y_2 have been fixed, and 2) the residuals have not been inflated. After these two factors are considered in the trial and bootstrap experiment, both the conventional maximum likelihood and bootstrap estimates of the standard errors appear to be performing quite well. At present, there does not seem to be a good rule of thumb for deciding when the conventional asymptotic formulae will give acceptable results.
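A generic residual bootstrap for AR(2) coefficient standard errors, of the kind the abstract describes, might look as follows. This is a textbook sketch, not the dissertation's code: it holds the first two observations fixed (as the abstract notes) but omits the residual inflation the abstract also mentions, and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(2) series: y_t = 0.5*y_{t-1} - 0.3*y_{t-2} + e_t
n, phi1, phi2 = 300, 0.5, -0.3
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + rng.normal()

def fit_ar2(y):
    """Least-squares AR(2) fit; returns coefficients and residuals."""
    X = np.column_stack([y[1:-1], y[:-2]])
    coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
    return coef, y[2:] - X @ coef

coef_hat, resid = fit_ar2(y)
resid = resid - resid.mean()           # centre residuals before resampling

# Residual bootstrap: rebuild the series from resampled residuals, refit,
# and take the spread of the refitted coefficients as their standard error
boot_coefs = []
for _ in range(300):
    e_star = rng.choice(resid, size=n, replace=True)
    y_star = np.zeros(n)
    y_star[:2] = y[:2]                 # first two observations held fixed
    for t in range(2, n):
        y_star[t] = coef_hat[0]*y_star[t-1] + coef_hat[1]*y_star[t-2] + e_star[t]
    boot_coefs.append(fit_ar2(y_star)[0])
se = np.std(boot_coefs, axis=0)
print("bootstrap SEs of (phi1, phi2):", np.round(se, 3))
```

Multi-period forecasts can be bootstrapped the same way by extending each rebuilt series past time n and collecting the forecast paths.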
Bias Correction with Jackknife, Bootstrap, and Taylor Series
Jiao, Jiantao; Han, Yanjun; Weissman, Tsachy
2017-01-01
We analyze the bias correction methods using jackknife, bootstrap, and Taylor series. We focus on the binomial model, and consider the problem of bias correction for estimating $f(p)$, where $f \in C[0,1]$ is arbitrary. We characterize the supremum norm of the bias of general jackknife and bootstrap estimators for any continuous functions, and demonstrate that in the delete-$d$ jackknife, different values of $d$ may lead to drastically different behavior. We show that in the binomial ...
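For intuition, the delete-1 jackknife bias correction for estimating $f(p)$ in the binomial model can be sketched on the case $f(p)=p^2$, where the plug-in estimator has bias $p(1-p)/n$ and the correction happens to be exact. The sample sizes and the Monte Carlo check are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 50, 0.3
f = lambda q: q**2                     # target functional f(p)

def jackknife_corrected(x):
    """Delete-1 jackknife bias correction for f(mean of Bernoulli trials)."""
    m = x.size
    theta = f(x.mean())
    loo = np.array([f(np.delete(x, i).mean()) for i in range(m)])
    return theta - (m - 1) * (loo.mean() - theta)

# Monte Carlo check: the plug-in estimator of p^2 is biased upward by
# p(1-p)/n; the jackknife removes this bias exactly for quadratic f
plug, jack = [], []
for _ in range(2000):
    x = (rng.random(n) < p).astype(float)
    plug.append(f(x.mean()))
    jack.append(jackknife_corrected(x))
print(f"true f(p)={f(p):.4f}  plug-in mean={np.mean(plug):.4f}  "
      f"jackknife mean={np.mean(jack):.4f}")
```

For non-smooth $f$ the paper's point applies: the choice of $d$ in the delete-$d$ jackknife can change the behavior drastically, and no single correction dominates.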
Neoclassical bootstrap current and transport in optimized stellarator configurations
International Nuclear Information System (INIS)
Maassberg, H.; Lotz, W.; Nuehrenberg, J.
1993-01-01
The neoclassical bootstrap current properties of optimized stellarators are analyzed in the relevant mean-free-path regimes and compared with the neoclassical transport properties. Two methods---global Monte Carlo simulation [Phys. Fluids 31, 2984 (1988)] and local analysis with the drift kinetic equation solver code [Phys. Fluids B 1, 563 (1989)]---are employed and good agreement is obtained. Full consistency with the elimination of the bootstrap current and favorable neoclassical transport is found.
Mattfeldt, Torsten
2011-04-01
Computer-intensive methods may be defined as data analytical procedures involving a huge number of highly repetitive computations. We mention resampling methods with replacement (bootstrap methods), resampling methods without replacement (randomization tests) and simulation methods. The resampling methods are based on simple and robust principles and are largely free from distributional assumptions. Bootstrap methods may be used to compute confidence intervals for a scalar model parameter and for summary statistics from replicated planar point patterns, and for significance tests. For some simple models of planar point processes, point patterns can be simulated by elementary Monte Carlo methods. The simulation of models with more complex interaction properties usually requires more advanced computing methods. In this context, we mention simulation of Gibbs processes with Markov chain Monte Carlo methods using the Metropolis-Hastings algorithm. An alternative to simulations on the basis of a parametric model consists of stochastic reconstruction methods. The basic ideas behind the methods are briefly reviewed and illustrated by simple worked examples in order to encourage novices in the field to use computer-intensive methods. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.
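The bootstrap confidence interval for a scalar summary statistic mentioned above reduces, in its simplest percentile form, to a few lines. The data (a skewed sample) and the statistic (the median) here are illustrative choices, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.exponential(scale=2.0, size=100)   # skewed illustrative sample

# Percentile bootstrap: resample with replacement, recompute the statistic,
# and read the CI off the empirical quantiles of the replicates
reps = np.array([np.median(rng.choice(data, size=data.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(reps, [2.5, 97.5])
print(f"median = {np.median(data):.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The same loop works for summary statistics of replicated point patterns by resampling whole replicates instead of individual observations.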
VOYAGER 1 SATURN MAGNETOMETER RESAMPLED DATA 9.60 SEC
National Aeronautics and Space Administration — This data set includes Voyager 1 Saturn encounter magnetometer data that have been resampled at a 9.6 second sample rate. The data set is composed of 6 columns: 1)...
VOYAGER 2 JUPITER MAGNETOMETER RESAMPLED DATA 48.0 SEC
National Aeronautics and Space Administration — This data set includes Voyager 2 Jupiter encounter magnetometer data that have been resampled at a 48.0 second sample rate. The data set is composed of 6 columns: 1)...
NAIP Aerial Imagery (Resampled), Salton Sea - 2005 [ds425]
California Natural Resource Agency — NAIP 2005 aerial imagery that has been resampled from 1-meter source resolution to approximately 30-meter resolution. This is a mosaic composed from several NAIP...
Georgopoulos, A. P.; Tan, H.-R. M.; Lewis, S. M.; Leuthold, A. C.; Winskowski, A. M.; Lynch, J. K.; Engdahl, B.
2010-02-01
Traumatic experiences can produce post-traumatic stress disorder (PTSD) which is a debilitating condition and for which no biomarker currently exists (Institute of Medicine (US) 2006 Posttraumatic Stress Disorder: Diagnosis and Assessment (Washington, DC: National Academies)). Here we show that the synchronous neural interactions (SNI) test which assesses the functional interactions among neural populations derived from magnetoencephalographic (MEG) recordings (Georgopoulos A P et al 2007 J. Neural Eng. 4 349-55) can successfully differentiate PTSD patients from healthy control subjects. Externally cross-validated, bootstrap-based analyses yielded >90% overall accuracy of classification. In addition, all but one of 18 patients who were not receiving medications for their disease were correctly classified. Altogether, these findings document robust differences in brain function between the PTSD and control groups that can be used for differential diagnosis and which possess the potential for assessing and monitoring disease progression and effects of therapy.
Estimating variability in functional images using a synthetic resampling approach
International Nuclear Information System (INIS)
Maitra, R.; O'Sullivan, F.
1996-01-01
Functional imaging of biologic parameters like in vivo tissue metabolism is made possible by Positron Emission Tomography (PET). Many techniques, such as mixture analysis, have been suggested for extracting such images from dynamic sequences of reconstructed PET scans. Methods for assessing the variability in these functional images are of scientific interest. The nonlinearity of the methods used in the mixture analysis approach makes analytic formulae for estimating variability intractable. The usual resampling approach is infeasible because of the prohibitive computational effort in simulating a number of sinogram datasets, applying image reconstruction, and generating parametric images for each replication. Here we introduce an approach that approximates the distribution of the reconstructed PET images by a Gaussian random field and generates synthetic realizations in the imaging domain. This eliminates the reconstruction steps in generating each simulated functional image and is therefore practical. Results of experiments done to evaluate the approach on a model one-dimensional problem are very encouraging. Post-processing of the estimated variances is seen to improve the accuracy of the estimation method. Mixture analysis is used to estimate functional images; however, the suggested approach is general enough to extend to other parametric imaging methods.
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.…
RELATIVE ORIENTATION AND MODIFIED PIECEWISE EPIPOLAR RESAMPLING FOR HIGH RESOLUTION SATELLITE IMAGES
Directory of Open Access Journals (Sweden)
K. Gong
2017-05-01
High-resolution optical satellite sensors have entered a new era in the last few years, because satellite stereo images at half-meter or even 30 cm resolution are available. Nowadays, high resolution satellite image data have been commonly used for Digital Surface Model (DSM) generation and 3D reconstruction. It is common that the Rational Polynomial Coefficients (RPCs) provided by the vendors have rough precision and there is no ground control information available to refine the RPCs. Therefore, we present two relative orientation methods that use corresponding image points only: the first method uses quasi ground control information, which is generated from the corresponding points and rough RPCs, for the bias-compensation model; the second method estimates the relative pointing errors on the matching image and removes this error by an affine model. Both methods need no ground control information and are applied to the entire image. To get very dense point clouds, the Semi-Global Matching (SGM) method is an efficient tool. However, epipolar constraints are required before the matching process. In most conditions, satellite images have very large dimensions, so epipolar geometry generation and image resampling are usually carried out in small tiles. This paper also presents a modified piecewise epipolar resampling method for the entire image without tiling. The quality of the proposed relative orientation and epipolar resampling methods is evaluated, and finally sub-pixel accuracy has been achieved in our work.
Bootstrapping N=3 superconformal theories
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena; Liendo, Pedro [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Meneghelli, Carlo [Stony Brook Univ., Stony Brook, NY (United States). Simons Center for Geometry and Physics; Mitev, Vladimir [Mainz Univ. (Germany). PRISMA Cluster of Excellence
2016-12-15
We initiate the bootstrap program for N=3 superconformal field theories (SCFTs) in four dimensions. The problem is considered from two fronts: the protected subsector described by a 2d chiral algebra, and crossing symmetry for half-BPS operators whose superconformal primaries parametrize the Coulomb branch of N=3 theories. With the goal of describing a protected subsector of a family of N=3 SCFTs, we propose a new 2d chiral algebra with super Virasoro symmetry that depends on an arbitrary parameter, identified with the central charge of the theory. Turning to the crossing equations, we work out the superconformal block expansion and apply standard numerical bootstrap techniques in order to constrain the CFT data. We obtain bounds valid for any theory but also, thanks to input from the chiral algebra results, we are able to exclude solutions with N=4 supersymmetry, allowing us to zoom in on a specific N=3 SCFT.
Bootstrapping N=3 superconformal theories
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena; Liendo, Pedro [DESY Hamburg, Theory Group,Notkestrasse 85, D-22607 Hamburg (Germany); Meneghelli, Carlo [Simons Center for Geometry and Physics,Stony Brook University, Stony Brook, NY 11794-3636 (United States); Mitev, Vladimir [PRISMA Cluster of Excellence, Institut für Physik,JGU Mainz, Staudingerweg 7, 55128 Mainz (Germany)
2017-04-06
We initiate the bootstrap program for N=3 superconformal field theories (SCFTs) in four dimensions. The problem is considered from two fronts: the protected subsector described by a 2d chiral algebra, and crossing symmetry for half-BPS operators whose superconformal primaries parametrize the Coulomb branch of N=3 theories. With the goal of describing a protected subsector of a family of N=3 SCFTs, we propose a new 2d chiral algebra with super Virasoro symmetry that depends on an arbitrary parameter, identified with the central charge of the theory. Turning to the crossing equations, we work out the superconformal block expansion and apply standard numerical bootstrap techniques in order to constrain the CFT data. We obtain bounds valid for any theory but also, thanks to input from the chiral algebra results, we are able to exclude solutions with N=4 supersymmetry, allowing us to zoom in on a specific N=3 SCFT.
On a generalized bootstrap principle
International Nuclear Information System (INIS)
Corrigan, E.; Sasaki, R.; Dorey, P.E.
1993-01-01
The S-matrices for non-simply-laced affine Toda field theories are considered in the context of a generalized bootstrap principle. The S-matrices, and in particular their poles, depend on a parameter whose range lies between the Coxeter numbers of dual pairs of the corresponding non-simply-laced algebras. It is proposed that only odd order poles in the physical strip with positive coefficients throughout this range should participate in the bootstrap. All other singularities have an explanation in principle in terms of a generalized Coleman-Thun mechanism. Besides the S-matrices introduced by Delius, Grisaru and Zanon, the missing case (F_4^(1), E_6^(2)) is also considered and provides many interesting examples of pole generation. (author)
Bootstrapping N=2 chiral correlators
Lemos, Madalena; Liendo, Pedro
2016-01-01
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
Bootstrapping N=2 chiral correlators
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Liendo, Pedro [Humboldt-Univ. Berlin (Germany). IMIP
2015-12-15
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
Bootstrapping N=2 chiral correlators
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena [DESY Hamburg, Theory Group,Notkestrasse 85, D-22607 Hamburg (Germany); Liendo, Pedro [IMIP, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany)
2016-01-07
We apply the numerical bootstrap program to chiral operators in four-dimensional N=2 SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of N=2 SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
Beckers, J.; Weerts, A.; Tijdeman, E.; Welles, E.; McManamon, A.
2013-12-01
To provide reliable and accurate seasonal streamflow forecasts for water resources management, several operational hydrologic agencies and hydropower companies around the world use the Extended Streamflow Prediction (ESP) procedure. The ESP in its original implementation does not accommodate any additional information that the forecaster may have about expected deviations from climatology in the near future. Several attempts have been made to improve the skill of the ESP forecast, especially for areas affected by teleconnections (e.g., ENSO, PDO), via selection (Hamlet and Lettenmaier, 1999) or weighting schemes (Werner et al., 2004; Wood and Lettenmaier, 2006; Najafi et al., 2012). A disadvantage of such schemes is that they lead to a reduction of the signal-to-noise ratio of the probabilistic forecast. To overcome this, we propose a resampling method conditional on climate indices to generate the meteorological time series used in the ESP. The method can be used to generate a large number of meteorological ensemble members in order to improve the statistical properties of the ensemble. The effectiveness of the method was demonstrated in a real-time operational hydrologic seasonal forecast system for the Columbia River basin operated by the Bonneville Power Administration. The forecast skill of the k-nn resampler was tested against the original ESP for three basins at the long-range seasonal time scale. The BSS and CRPSS were used to compare the results to those of the original ESP method. Positive forecast skill scores were found for the resampler method conditioned on different indices for the prediction of spring peak flows in the Dworshak and Hungry Horse basins. For the Libby Dam basin, however, no improvement of skill was found. The proposed resampling method is a promising practical approach that can add skill to ESP forecasts at the seasonal time scale. Further improvement is possible by fine tuning the method and selecting the most
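A toy version of a k-nn resampler conditioned on a climate index — resampling whole historical years, with closer neighbours in index space weighted more heavily — might look like this. The record, the 1/rank weights, and the neighbour count are invented for illustration and are not the operational Columbia River setup.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented historical record: one climate-index value and one daily
# precipitation trace per year (40 years x 90 days)
years = 40
index = rng.normal(size=years)                   # e.g. an ENSO-like index
factor = np.clip(1 + 0.3 * index[:, None], 0.1, None)
precip = rng.gamma(2.0, 1.5, size=(years, 90)) * factor

def knn_resample(current_index, k=10, n_members=500):
    """Resample whole historical years among the k nearest neighbours in
    index space, weighting closer neighbours more heavily (1/rank)."""
    order = np.argsort(np.abs(index - current_index))[:k]
    w = 1.0 / np.arange(1, k + 1)
    picks = rng.choice(order, size=n_members, p=w / w.sum())
    return precip[picks]             # ensemble of meteorological traces

# A strongly positive index shifts the ensemble above plain climatology
ensemble = knn_resample(current_index=1.5)
print("ensemble mean seasonal total:", round(ensemble.sum(axis=1).mean(), 1))
print("climatology mean seasonal total:", round(precip.sum(axis=1).mean(), 1))
```

Because members are drawn with replacement from the k neighbours, the ensemble can be made arbitrarily large without shrinking to the handful of analog years a pure selection scheme would keep.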
How to Bootstrap Anonymous Communication
DEFF Research Database (Denmark)
Jakobsen, Sune K.; Orlandi, Claudio
2015-01-01
formal study in this direction. To solve this problem, we introduce the concept of anonymous steganography: think of a leaker Lea who wants to leak a large document to Joe the journalist. Using anonymous steganography Lea can embed this document in innocent looking communication on some popular website...... defining anonymous steganography, - A construction showing that anonymous steganography is possible (which uses recent results in circuits obfuscation), - A lower bound on the number of bits which are needed to bootstrap anonymous communication.
How to Bootstrap Anonymous Communication
DEFF Research Database (Denmark)
Jakobsen, Sune K.; Orlandi, Claudio
2015-01-01
formal study in this direction. To solve this problem, we introduce the concept of anonymous steganography: think of a leaker Lea who wants to leak a large document to Joe the journalist. Using anonymous steganography Lea can embed this document in innocent looking communication on some popular website...... defining anonymous steganography, - A construction showing that anonymous steganography is possible (which uses recent results in circuits obfuscation), - A lower bound on the number of bits which are needed to bootstrap anonymous communication....
Mobile first design : using Bootstrap
Bhusal, Bipin
2017-01-01
The aim of this project was to design and build a website for a company based in Australia. The business offers remedial massage therapy to its clients. It is a small business which takes reservations by phone calls and messages. The business currently has a temporary website designed with Wix, a cloud-based web development platform. The new website was built with responsive design using Bootstrap. This website was intended for the customers using mobile internet browsers. This design is...
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Directory of Open Access Journals (Sweden)
Khang Jie Liew
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Energy Technology Data Exchange (ETDEWEB)
Gazut, St
2007-03-15
This thesis addresses the problem of the construction of surrogate models in numerical simulation. Whenever numerical experiments are costly, the simulation model is complex and difficult to use. It is important then to select the numerical experiments as efficiently as possible in order to minimize their number. In statistics, the selection of experiments is known as optimal experimental design. In the context of numerical simulation where no measurement uncertainty is present, we describe an alternative approach based on statistical learning theory and re-sampling techniques. The surrogate models are constructed using neural networks and the generalization error is estimated by leave-one-out, cross-validation and bootstrap. It is shown that the bootstrap can control over-fitting and extend the concept of leverage to surrogate models that are non-linear in their parameters. The thesis describes an iterative method called LDR (Learner Disagreement from experiment Re-sampling), based on active learning using several surrogate models constructed on bootstrap samples. The method consists in adding new experiments where the predictors constructed from bootstrap samples disagree most. We compare the LDR method with other methods of experimental design such as D-optimal selection. (author)
Physics issues of high bootstrap current tokamaks
International Nuclear Information System (INIS)
Ozeki, T.; Azumi, M.; Ishii, Y.
1997-01-01
Physics issues of a tokamak plasma with a hollow current profile produced by a large bootstrap current are discussed based on experiments in JT-60U. An internal transport barrier for both ions and electrons was obtained just inside the radius of zero magnetic shear in JT-60U. Analysis of the toroidal ITG microinstability by toroidal particle simulation shows that weak and negative shear reduces the toroidal coupling and suppresses the ITG mode. A hard beta limit was observed in JT-60U negative shear experiments. Ideal MHD mode analysis shows that the n = 1 pressure-driven kink mode is a plausible candidate. One of the methods to improve the beta limit against the kink mode is to widen the negative shear region, which can induce a broader pressure profile resulting in a higher beta limit. The TAE mode for the hollow current profile is less unstable than that for the monotonic current profile. The reason is that the continuum gaps near the zero shear region are not aligned when the radius of q_min is close to the region of high ∇n_e. Finally, a method for stable start-up of a plasma with a hollow current profile is described, and stable sustainment of a steady-state plasma with high bootstrap current is discussed. (Author)
Yu, Ling-Yuan; Chen, Zhen-Zhen; Zheng, Fang-Qiang; Shi, Ai-Ju; Guo, Ting-Ting; Yeh, Bao-Hua; Chi, Hsin; Xu, Yong-Yu
2013-02-01
The life table of the green lacewing, Chrysopa pallens (Rambur), was studied at 22 degrees C, a photoperiod of 15:9 (L:D) h, and 80% relative humidity in the laboratory. The raw data were analyzed using the age-stage, two-sex life table. The intrinsic rate of increase (r), the finite rate of increase (lambda), the net reproduction rate (R0), and the mean generation time (T) of Ch. pallens were 0.1258 d^-1, 1.1340 d^-1, 241.4 offspring, and 43.6 d, respectively. For the estimation of the means, variances, and SEs of the population parameters, we compared the jackknife and bootstrap techniques. Although similar values of the means and SEs were obtained with both techniques, significant differences were observed in the frequency distribution and variances of all parameters. The jackknife technique will result in a zero net reproductive rate upon the omission of a male, an immature death, or a nonreproductive female. This result represents, however, a contradiction because an intrinsic rate of increase exists in this situation. Therefore, we suggest that the jackknife technique should not be used for the estimation of population parameters. In predator-prey interactions, the nonpredatory egg and pupal stages of the predator are time refuges for the prey, and the pest population can grow during these times. In this study, a population projection based on the age-stage, two-sex life table is used to determine the optimal interval between releases to fill the predation gaps and maintain the predatory capacity of the control agent.
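The jackknife/bootstrap comparison for a cohort parameter can be illustrated on the simplest case, the mean lifetime offspring count (a stand-in for R0), using a hypothetical cohort that, like the abstract's, contains non-reproductive individuals. The numbers are invented; the life-table machinery of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical cohort: lifetime offspring per individual; males and
# immature deaths contribute zeros (numbers invented for illustration)
offspring = np.array([0]*10 + [20, 25, 30, 18, 22, 27, 24, 19, 26, 29], float)
n = offspring.size
R0 = offspring.mean()                  # cohort net reproductive rate

# Jackknife: omit one individual at a time
jack = np.array([np.delete(offspring, i).mean() for i in range(n)])
jack_se = np.sqrt((n - 1) / n * np.sum((jack - jack.mean())**2))

# Bootstrap: resample the cohort with replacement
boot = np.array([rng.choice(offspring, n, replace=True).mean()
                 for _ in range(2000)])
boot_se = boot.std(ddof=1)

# Similar means and SEs here, but the frequency distributions differ:
# the jackknife has only n distinct replicates, the bootstrap thousands.
print(f"R0={R0:.2f}  jackknife SE={jack_se:.2f}  bootstrap SE={boot_se:.2f}")
```

For the simple mean the two SEs nearly coincide; the abstract's objection arises for derived parameters like r, where omitting a single reproductive observation can push a jackknife replicate to a degenerate value.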
Dahm, T.; Heimann, S.; Isken, M.; Vasyura-Bathke, H.; Kühn, D.; Sudhaus, H.; Kriegerowski, M.; Daout, S.; Steinberg, A.; Cesca, S.
2017-12-01
Seismic source and moment tensor waveform inversion is often ill-posed or non-unique if station coverage is poor or signals are weak. Therefore, the interpretation of moment tensors can become difficult if the full model space is not explored, including all its trade-offs and uncertainties. This is especially true for non-double-couple components of weak or shallow earthquakes, as for instance found in volcanic, geothermal or mining environments. We developed a bootstrap-based probabilistic optimization scheme (Grond), which is based on pre-calculated Green's function full waveform databases (e.g. the fomosto tool, doi.org/10.5880/GFZ.2.1.2017.001). Grond is able to efficiently explore the full model space, the trade-offs and the uncertainties of source parameters. The program is highly flexible with respect to the adaptation to specific problems, the design of objective functions, and the diversity of empirical datasets. It uses an integrated, robust waveform data processing based on a newly developed Python toolbox for seismology (Pyrocko, see Heimann et al., 2017, http://doi.org/10.5880/GFZ.2.1.2017.001), and allows for visual inspection of many aspects of the optimization problem. Grond has been applied to the CMT moment tensor inversion using W-phases, to nuclear explosions in Korea, to meteorite atmospheric explosions, to volcano-tectonic events during caldera collapse and to intra-plate volcanic and tectonic crustal events. Grond can be used to simultaneously optimize seismological waveforms, amplitude spectra and static displacements of geodetic data such as InSAR and GPS (e.g. KITE, Isken et al., 2017, http://doi.org/10.5880/GFZ.2.1.2017.002). We present examples of Grond optimizations to demonstrate the advantage of a full exploration of source parameter uncertainties for interpretation.
Nonparametric bootstrap analysis with applications to demographic effects in demand functions.
Gozalo, P L
1997-12-01
"A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt
Directory of Open Access Journals (Sweden)
Mattia C F Prosperi
2010-10-01
Phylogenetic methods produce hierarchies of molecular species, inferring knowledge about taxonomy and evolution. However, there is not yet a consensus methodology that provides a crisp partition of taxa, desirable when considering the problem of intra/inter-patient quasispecies classification or infection transmission event identification. We introduce the threshold bootstrap clustering (TBC), a new methodology for partitioning molecular sequences that does not require a phylogenetic tree estimation. The TBC is an incremental partition algorithm, inspired by the stochastic Chinese restaurant process, and takes advantage of resampling techniques and models of sequence evolution. TBC uses as input a multiple alignment of molecular sequences and its output is a crisp partition of the taxa into an automatically determined number of clusters. By varying initial conditions, the algorithm can produce different partitions. We describe a procedure that selects a prime partition among a set of candidate ones and calculates a measure of cluster reliability. TBC was successfully tested for the identification of type-1 human immunodeficiency and hepatitis C virus subtypes, and compared with previously established methodologies. It was also evaluated in the problem of HIV-1 intra-patient quasispecies clustering, and for transmission cluster identification, using a set of sequences from patients with known transmission event histories. TBC has been shown to be effective for the subtyping of HIV and HCV, and for identifying intra-patient quasispecies. To some extent, the algorithm was also able to infer clusters corresponding to events of infection transmission. The computational complexity of TBC is quadratic in the number of taxa, lower than other established methods; in addition, TBC has been enhanced with a measure of cluster reliability. The TBC can be useful to characterise molecular quasispecies in a broad context.
Prosperi, Mattia C F; De Luca, Andrea; Di Giambenedetto, Simona; Bracciale, Laura; Fabbiani, Massimiliano; Cauda, Roberto; Salemi, Marco
2010-10-25
A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages...
Statistical bootstrap approach to hadronic matter and multiparticle reactions
International Nuclear Information System (INIS)
Ilgenfritz, E.M.; Kripfganz, J.; Moehring, H.J.
1977-01-01
The authors present the main ideas behind the statistical bootstrap model and recent developments within this model related to the description of fireball cascade decay. Mathematical methods developed in this model might be useful in other phenomenological schemes of strong interaction physics; they are described in detail. The present status of applications of the model to various hadronic reactions is discussed. When discussing the relations of the statistical bootstrap model to other models of hadron physics the authors point out possibly fruitful analogies and dynamical mechanisms which are modelled by the bootstrap dynamics under definite conditions. This offers interpretations for the critical temperature typical for the model and indicates further fields of application. (author)
Accelerated spike resampling for accurate multiple testing controls.
Harrison, Matthew T
2013-02-01
Controlling for multiple hypothesis tests using standard spike resampling techniques often requires prohibitive amounts of computation. Importance sampling techniques can be used to accelerate the computation. The general theory is presented, along with specific examples for testing differences across conditions using permutation tests and for testing pairwise synchrony and precise lagged-correlation between many simultaneously recorded spike trains using interval jitter.
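The baseline that such accelerated techniques speed up is the plain Monte Carlo permutation test for a difference across conditions. A minimal sketch in Python follows; the function name, toy data, and resample count are illustrative assumptions, not taken from the paper:

```python
import random
import statistics

def permutation_test(x, y, n_resamples=2000, seed=0):
    """Two-sided Monte Carlo permutation test for a difference in means."""
    rng = random.Random(seed)
    pooled = list(x) + list(y)
    observed = abs(statistics.mean(x) - statistics.mean(y))
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)  # random relabelling of the pooled observations
        xs, ys = pooled[:len(x)], pooled[len(x):]
        if abs(statistics.mean(xs) - statistics.mean(ys)) >= observed:
            hits += 1
    # Add-one correction: the observed labelling counts as one permutation.
    return (hits + 1) / (n_resamples + 1)
```

Correcting for many such tests (e.g. one per neuron pair) multiplies this cost by the number of hypotheses, which is the computational burden the importance-sampling approach addresses.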
Stability of response characteristics of a Delphi panel: application of bootstrap data expansion
Directory of Open Access Journals (Sweden)
Cole Bryan R
2005-12-01
Full Text Available Abstract Background Delphi surveys with panels of experts in a particular area of interest have been widely utilized in the fields of clinical medicine, nursing practice, medical education and healthcare services. Despite this wide applicability of the Delphi methodology, there is no clear identification of what constitutes a sufficient number of Delphi survey participants to ensure stability of results. Methods The study analyzed the response characteristics from the first round of a Delphi survey conducted with 23 experts in healthcare quality and patient safety. The panel members had similar training and subject matter understanding of the Malcolm Baldrige Criteria for Performance Excellence in Healthcare. The raw data from the first round sampling, which usually contains the largest diversity of responses, were augmented via bootstrap sampling to obtain computer-generated results for two larger samples obtained by sampling with replacement. Response characteristics (mean, trimmed mean, standard deviation and 95% confidence intervals) for 54 survey items were compared for the responses of the 23 actual study participants and two computer-generated samples of 1000 and 2000 resampling iterations. Results The results from this study indicate that the response characteristics of a small expert panel in a well-defined knowledge area are stable in light of augmented sampling. Conclusion Panels of similarly trained experts (who possess a general understanding in the field of interest) provide effective and reliable utilization of a small sample from a limited number of experts in a field of study to develop reliable criteria that inform judgment and support effective decision-making.
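The bootstrap data expansion described above can be sketched in a few lines: resample the panel's responses with replacement and summarise the replicate means. The panel ratings and function name below are hypothetical, not the study's data:

```python
import random
import statistics

def bootstrap_summary(responses, n_boot=1000, seed=0):
    """Resample a small panel with replacement; summarise the replicate means.

    Returns the bootstrap mean and a percentile 95% confidence interval.
    """
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        replicate = [rng.choice(responses) for _ in responses]
        means.append(statistics.mean(replicate))
    means.sort()
    lo, hi = means[int(0.025 * n_boot)], means[int(0.975 * n_boot) - 1]
    return statistics.mean(means), (lo, hi)

# Hypothetical first-round Likert ratings from a 12-member panel on one item.
panel = [4, 5, 3, 4, 4, 5, 2, 4, 3, 5, 4, 4]
center, (lo, hi) = bootstrap_summary(panel)
```

Repeating this per survey item and comparing the resulting intervals against the raw-panel statistics is the stability check the study performs.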
Bootstrapping realized volatility and realized beta under a local Gaussianity assumption
DEFF Research Database (Denmark)
Hounyo, Ulrich
The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency...... context, as recently explained by Mykland and Zhang (2009). Our main contributions are as follows. First, we show that the local Gaussian bootstrap is first-order consistent when used to estimate the distributions of realized volatility and realized betas. Second, we show that the local Gaussian bootstrap...... matches accurately the first four cumulants of realized volatility, implying that this method provides third-order refinements. This is in contrast with the wild bootstrap of Gonçalves and Meddahi (2009), which is only second-order correct. Third, we show that the local Gaussian bootstrap is able...
Optimal plot size in the evaluation of papaya scions: proposal and comparison of methods
Directory of Open Access Journals (Sweden)
Humberto Felipe Celanti
Full Text Available ABSTRACT Evaluating the quality of scions is extremely important and it can be done by characteristics of shoots and roots. This experiment evaluated the height of the aerial part, stem diameter, number of leaves, petiole length and length of roots of papaya seedlings. Analyses were performed from a blank trial with 240 seedlings of "Golden Pecíolo Curto". The determination of the optimum plot size was done by applying the method of maximum curvature, the method of maximum curvature of the coefficient of variation, and a newly proposed method, which incorporates bootstrap resampling simulation into the maximum curvature method. According to the results obtained, five is the optimal number of seedlings of papaya "Golden Pecíolo Curto" per plot. The proposed method of bootstrap simulation with replacement provides optimal plot sizes equal to or larger than those of the maximum curvature method, and the same plot size as the maximum curvature method of the coefficient of variation.
The N=2 superconformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Beem, Christopher [Institute for Advanced Study, Einstein Drive,Princeton, NJ 08540 (United States); Lemos, Madalena [C. N. Yang Institute for Theoretical Physics, Stony Brook University,Stony Brook, NY 11794-3840 (United States); Liendo, Pedro [IMIP, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Rastelli, Leonardo [C. N. Yang Institute for Theoretical Physics, Stony Brook University,Stony Brook, NY 11794-3840 (United States); Rees, Balt C. van [Theory Group, Physics Department, CERN,CH-1211 Geneva 23 (Switzerland)
2016-03-29
In this work we initiate the conformal bootstrap program for N=2 superconformal field theories in four dimensions. We promote an abstract operator-algebraic viewpoint in order to unify the description of Lagrangian and non-Lagrangian theories, and formulate various conjectures concerning the landscape of theories. We analyze in detail the four-point functions of flavor symmetry current multiplets and of N=2 chiral operators. For both correlation functions we review the solution of the superconformal Ward identities and describe their superconformal block decompositions. This provides the foundation for an extensive numerical analysis discussed in the second half of the paper. We find a large number of constraints for operator dimensions, OPE coefficients, and central charges that must hold for any N=2 superconformal field theory.
Efficient bootstrap with weakly dependent processes
Bravo, Francesco; Crudu, Federico
The efficient bootstrap methodology is developed for overidentified moment condition models with weakly dependent observations. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics and the J-statistic for overidentifying restrictions.
A 'bootstrapped' Teaching/Learning Procedure
Odusina Odusote, Olusogo
1998-04-01
Erasing preconceived antiphysics ideas held by nonscience/nonmajor physics students has elicited diverse teaching methods. Introductory general physics courses at the college level have been taught by a 'bootstrap' approach: a concise treatment of the syllabus by the teacher in about half of the course duration, with brief exercises and examples. Students are then introduced to real-life situations - toys, home appliances, sports, disasters, etc. - and the embedded physics concepts are discussed. Usually this generates a feeling of deja vu, which elicits a desire for more. Each application usually encompasses topics in a broad range of the syllabus. The other half of the course is used by students to work individually or in groups on assigned and graded homework and essays, with guidance from the lecture notes and the teacher/supervisor. An end-of-course examination shows an increase in the success rate.
DEFF Research Database (Denmark)
Hounyo, Ulrich
to a general class of estimators of integrated covolatility. We then show the first-order asymptotic validity of this method in the multivariate context with a potential presence of jumps, dependent microstructure noise, irregularly spaced and non-synchronous data. Due to our focus on non...... covariance estimator. As an application of our results, we also consider the bootstrap for regression coefficients. We show that the wild blocks of blocks bootstrap, appropriately centered, is able to mimic both the dependence and heterogeneity of the scores, thus justifying the construction of bootstrap percentile...... intervals as well as variance estimates in this context. This contrasts with the traditional pairs bootstrap which is not able to mimic the score heterogeneity even in the simple case where no microstructure noise is present. Our Monte Carlo simulations show that the wild blocks of blocks bootstrap improves...
Bootstrap currents in stellarators and tokamaks
International Nuclear Information System (INIS)
Okamoto, Masao; Nakajima, Noriyoshi.
1990-09-01
The remarkable feature of the bootstrap current in stellarators is its strong dependence on the magnetic field configuration. Neoclassical bootstrap currents in a large helical device of torsatron/heliotron type (L = 2, M = 10, R = 4 m, B = 4 T) are evaluated in the banana (1/ν) and the plateau regimes. Various vacuum magnetic field configurations are studied with a view to minimizing the bootstrap current. It is found that in the banana regime, shifting of the magnetic axis and shaping of magnetic surfaces have a remarkable influence on the bootstrap current; a small outward shift of the magnetic axis and vertically elongated magnetic surfaces are favourable for a reduction of the bootstrap current. It is noted, however, that the ripple diffusion in the 1/ν regime has the opposite tendency to the bootstrap current; it increases with the outward shift and increases as the plasma cross section is vertically elongated. A comparison is made between bootstrap currents in stellarators and tokamaks. (author)
Stock Price Simulation Using Bootstrap and Monte Carlo
Directory of Open Access Journals (Sweden)
Pažický Martin
2017-06-01
Full Text Available In this paper, an attempt is made to assess and compare bootstrap and Monte Carlo experiments for stock price simulation. Since the future evolution of a stock price is extremely important for investors, we try to find the best method of determining the future stock price of BNP Paribas' bank. The aim of the paper is to define the value of a European and an Asian option on BNP Paribas' stock at the maturity date. Four different methods are employed for the simulation: a bootstrap experiment with a homoscedastic error term, a block bootstrap experiment with a heteroscedastic error term, a Monte Carlo simulation with a heteroscedastic error term, and a Monte Carlo simulation with a homoscedastic error term. In the last method it is necessary to model the volatility using an econometric GARCH model. The main purpose of the paper is to compare the mentioned methods and select the most reliable. Examining the difference between the classical European option and the exotic Asian option, based on the experiment results, is a further aim of this paper.
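The two homoscedastic variants can be sketched side by side: bootstrap paths resample historical log-returns directly, while the Monte Carlo paths draw Gaussian returns with matched mean and volatility. This is a minimal illustration under i.i.d. assumptions; the block bootstrap and GARCH variants are omitted, and all names and data are ours, not the paper's:

```python
import math
import random
import statistics

def simulate_terminal_prices(s0, log_returns, horizon, n_paths=1000,
                             method="bootstrap", seed=0):
    """Terminal prices from i.i.d. resampled historical log-returns
    ("bootstrap") or from Gaussian draws with matched mean and volatility
    (any other value of `method`). Homoscedastic sketch only."""
    rng = random.Random(seed)
    mu = statistics.mean(log_returns)
    sigma = statistics.stdev(log_returns)
    prices = []
    for _ in range(n_paths):
        total = 0.0
        for _ in range(horizon):
            if method == "bootstrap":
                total += rng.choice(log_returns)   # resample a past return
            else:
                total += rng.gauss(mu, sigma)      # Gaussian Monte Carlo
        prices.append(s0 * math.exp(total))
    return prices

def european_call_value(prices, strike):
    """Undiscounted expected payoff of a European call at maturity."""
    return statistics.mean(max(p - strike, 0.0) for p in prices)
```

An Asian option would instead average the price along each path, which requires storing the path rather than only its terminal value.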
Meadors, Grant David; Krishnan, Badri; Papa, Maria Alessandra; Whelan, John T.; Zhang, Yuanhao
2018-02-01
Continuous-wave (CW) gravitational waves (GWs) call for computationally-intensive methods. Low signal-to-noise ratio signals need templated searches with long coherent integration times and thus fine parameter-space resolution. Longer integration increases sensitivity. Low-mass x-ray binaries (LMXBs) such as Scorpius X-1 (Sco X-1) may emit accretion-driven CWs at strains reachable by current ground-based observatories. Binary orbital parameters induce phase modulation. This paper describes how resampling corrects binary and detector motion, yielding source-frame time series used for cross-correlation. Compared to the previous, detector-frame, templated cross-correlation method, used for Sco X-1 on data from the first Advanced LIGO observing run (O1), resampling is about 20 × faster in the costliest, most-sensitive frequency bands. Speed-up factors depend on integration time and search setup. The speed could be reinvested into longer integration with a forecast sensitivity gain, 20 to 125 Hz median, of approximately 51%, or from 20 to 250 Hz, 11%, given the same per-band cost and setup. This paper's timing model enables future setup optimization. Resampling scales well with longer integration, and at 10 × unoptimized cost could reach respectively 2.83 × and 2.75 × median sensitivities, limited by spin-wandering. Then an O1 search could yield a marginalized-polarization upper limit reaching torque-balance at 100 Hz. Frequencies from 40 to 140 Hz might be probed in equal observing time with 2 × improved detectors.
Definition of total bootstrap current in tokamaks
International Nuclear Information System (INIS)
Ross, D.W.
1995-01-01
Alternative definitions of the total bootstrap current are compared, with an analogous comparison given for the ohmic and auxiliary currents. It is argued that definitions different from those usually employed lead to simpler analyses of tokamak operating scenarios.
Quantum bootstrapping via compressed quantum Hamiltonian learning
International Nuclear Information System (INIS)
Wiebe, Nathan; Granade, Christopher; Cory, D G
2015-01-01
A major problem facing the development of quantum computers or large scale quantum simulators is that general methods for characterizing and controlling are intractable. We provide a new approach to this problem that uses small quantum simulators to efficiently characterize and learn control models for larger devices. Our protocol achieves this by using Bayesian inference in concert with Lieb–Robinson bounds and interactive quantum learning methods to achieve compressed simulations for characterization. We also show that the Lieb–Robinson velocity is epistemic for our protocol, meaning that information propagates at a rate that depends on the uncertainty in the system Hamiltonian. We illustrate the efficiency of our bootstrapping protocol by showing numerically that an 8 qubit Ising model simulator can be used to calibrate and control a 50 qubit Ising simulator while using only about 750 kilobits of experimental data. Finally, we provide upper bounds for the Fisher information that show that the number of experiments needed to characterize a system rapidly diverges as the duration of the experiments used in the characterization shrinks, which motivates the use of methods such as ours that do not require short evolution times. (fast track communication)
On uniform resampling and gaze analysis of bidirectional texture functions
Czech Academy of Sciences Publication Activity Database
Filip, Jiří; Chantler, M.J.; Haindl, Michal
2009-01-01
Roč. 6, č. 3 (2009), s. 1-15 ISSN 1544-3558 R&D Projects: GA MŠk 1M0572; GA ČR GA102/08/0593 Grant - others:EC Marie Curie(BE) 41358 Institutional research plan: CEZ:AV0Z10750506 Keywords : BTF * texture * eye tracking Subject RIV: BD - Theory of Information Impact factor: 1.447, year: 2009 http://library.utia.cas.cz/separaty/2009/RO/haindl-on uniform resampling and gaze analysis of bidirectional texture functions.pdf
Optimal resampling for the noisy OneMax problem
Liu, Jialin; Fairbank, Michael; Pérez-Liébana, Diego; Lucas, Simon M.
2016-01-01
The OneMax problem is a standard benchmark optimisation problem for a binary search space. Recent work on applying a Bandit-Based Random Mutation Hill-Climbing algorithm to the noisy OneMax Problem showed that it is important to choose a good value for the resampling number to make a careful trade off between taking more samples in order to reduce noise, and taking fewer samples to reduce the total computational cost. This paper extends that observation, by deriving an analytical expression f...
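The trade-off the paper analyses, averaging repeated noisy fitness evaluations to reduce noise versus the extra evaluations this costs, can be sketched as follows. The noise model and parameter values are illustrative assumptions:

```python
import random

def noisy_onemax(bits, noise_sd, rng):
    """OneMax fitness (number of ones) corrupted by additive Gaussian noise."""
    return sum(bits) + rng.gauss(0.0, noise_sd)

def resampled_fitness(bits, n_resamples, noise_sd, rng):
    """Average n_resamples noisy evaluations: the noise variance shrinks by
    a factor of n_resamples, at the price of n_resamples evaluations per
    search point."""
    total = sum(noisy_onemax(bits, noise_sd, rng) for _ in range(n_resamples))
    return total / n_resamples
```

A hill-climber using `resampled_fitness` makes more reliable accept/reject decisions per step but takes `n_resamples` times fewer steps within a fixed evaluation budget, which is exactly the trade-off whose optimum the paper derives analytically.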
A resampling-based meta-analysis for detection of differential gene expression in breast cancer
Directory of Open Access Journals (Sweden)
Ergul Gulusan
2008-12-01
Full Text Available Abstract Background Accuracy in the diagnosis of breast cancer and classification of cancer subtypes has improved over the years with the development of well-established immunohistopathological criteria. More recently, diagnostic gene-sets at the mRNA expression level have been tested as better predictors of disease state. However, breast cancer is heterogeneous in nature; thus extraction of differentially expressed gene-sets that stably distinguish normal tissue from various pathologies poses challenges. Meta-analysis of high-throughput expression data using a collection of statistical methodologies leads to the identification of robust tumor gene expression signatures. Methods A resampling-based meta-analysis strategy, which involves the use of resampling and application of distribution statistics in combination to assess the degree of significance in differential expression between sample classes, was developed. Two independent microarray datasets that contain normal breast, invasive ductal carcinoma (IDC, and invasive lobular carcinoma (ILC samples were used for the meta-analysis. Expression of the genes, selected from the gene list for classification of normal breast samples and breast tumors encompassing both the ILC and IDC subtypes were tested on 10 independent primary IDC samples and matched non-tumor controls by real-time qRT-PCR. Other existing breast cancer microarray datasets were used in support of the resampling-based meta-analysis. Results The two independent microarray studies were found to be comparable, although differing in their experimental methodologies (Pearson correlation coefficient, R = 0.9389 and R = 0.8465 for ductal and lobular samples, respectively. The resampling-based meta-analysis has led to the identification of a highly stable set of genes for classification of normal breast samples and breast tumors encompassing both the ILC and IDC subtypes. The expression results of the selected genes obtained through real
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM results in more kriging predictions (9/10) compared to GBM (4/10). The prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of water to migratory animals, food products, and humans.
On removing interpolation and resampling artifacts in rigid image registration.
Aganj, Iman; Yeo, Boon Thye Thomas; Sabuncu, Mert R; Fischl, Bruce
2013-02-01
We show that image registration using conventional interpolation and summation approximations of continuous integrals can generally fail because of resampling artifacts. These artifacts negatively affect the accuracy of registration by producing local optima, altering the gradient, shifting the global optimum, and making rigid registration asymmetric. In this paper, after an extensive literature review, we demonstrate the causes of the artifacts by comparing inclusion and avoidance of resampling analytically. We show the sum-of-squared-differences cost function formulated as an integral to be more accurate compared with its traditional sum form in a simple case of image registration. We then discuss aliasing that occurs in rotation, which is due to the fact that an image represented in the Cartesian grid is sampled with different rates in different directions, and propose the use of oscillatory isotropic interpolation kernels, which allow better recovery of true global optima by overcoming this type of aliasing. Through our experiments on brain, fingerprint, and white noise images, we illustrate the superior performance of the integral registration cost function in both the Cartesian and spherical coordinates, and also validate the introduced radial interpolation kernel by demonstrating the improvement in registration.
An algebraic approach to the analytic bootstrap
Energy Technology Data Exchange (ETDEWEB)
Alday, Luis F. [Mathematical Institute, University of Oxford, Andrew Wiles Building, Radcliffe Observatory Quarter, Woodstock Road, Oxford, OX2 6GG (United Kingdom); Zhiboedov, Alexander [Center for the Fundamental Laws of Nature, Harvard University, Cambridge, MA 02138 (United States)
2017-04-27
We develop an algebraic approach to the analytic bootstrap in CFTs. By acting with the Casimir operator on the crossing equation we map the problem of doing large spin sums to any desired order to the problem of solving a set of recursion relations. We compute corrections to the anomalous dimension of large spin operators due to the exchange of a primary and its descendants in the crossed channel and show that this leads to a Borel-summable expansion. We analyse higher order corrections to the microscopic CFT data in the direct channel and its matching to infinite towers of operators in the crossed channel. We apply this method to the critical O(N) model. At large N we reproduce the first few terms in the large spin expansion of the known two-loop anomalous dimensions of higher spin currents in the traceless symmetric representation of O(N) and make further predictions. At small N we present the results for the truncated large spin expansion series of anomalous dimensions of higher spin currents.
Conformal bootstrap: non-perturbative QFT's under siege
CERN. Geneva
2016-01-01
[Exceptionally in Council Chamber] Originally formulated in the 70's, the conformal bootstrap is the ambitious idea that one can use internal consistency conditions to carve out, and eventually solve, the space of conformal field theories. In this talk I will review recent developments in the field which have boosted this program to a new level. I will present a method to extract quantitative information in strongly-interacting theories, such as the 3D Ising model, the O(N) vector model and even systems without a Lagrangian formulation. I will explain how these techniques have led to the world record determination of several critical exponents. Finally, I will review exact analytical results obtained using bootstrap techniques.
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
Kumar, Sricharan; Srivastava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
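One common way to build such non-parametric prediction intervals is a residual bootstrap around a simple regression fit. The sketch below uses k-nearest-neighbour regression as a stand-in model; the estimator, names, and parameters are our illustrative assumptions, not the procedure proved in the paper:

```python
import random
import statistics

def knn_predict(x_train, y_train, x0, k=3):
    """Nearest-neighbour regression: average the k closest responses."""
    order = sorted(range(len(x_train)), key=lambda i: abs(x_train[i] - x0))
    return statistics.mean(y_train[i] for i in order[:k])

def bootstrap_prediction_interval(x_train, y_train, x0, k=3,
                                  n_boot=400, alpha=0.05, seed=0):
    """Residual-bootstrap prediction interval at x0: refit the model on
    resampled (x, y) pairs, add a resampled in-sample residual, and take
    percentiles of the resulting draws."""
    rng = random.Random(seed)
    n = len(x_train)
    residuals = [y - knn_predict(x_train, y_train, x, k)
                 for x, y in zip(x_train, y_train)]
    draws = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xb = [x_train[i] for i in idx]
        yb = [y_train[i] for i in idx]
        draws.append(knn_predict(xb, yb, x0, k) + rng.choice(residuals))
    draws.sort()
    return draws[int(alpha / 2 * n_boot)], draws[int((1 - alpha / 2) * n_boot) - 1]
```

An observed output falling outside the interval at its input would then be flagged as a candidate anomaly.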
Banks, H T; Holm, Kathleen; Robbins, Danielle
2010-11-01
We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear parameter dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches for problems involving data with several noise forms and levels. We consider both constant variance absolute error data and relative error which produces non-constant variance data in our parameter estimation formulations. We compare and contrast parameter estimates, standard errors, confidence intervals, and computational times for both bootstrapping and asymptotic theory methods.
Conference on Bootstrapping and Related Techniques
Rothe, Günter; Sendler, Wolfgang
1992-01-01
This book contains 30 selected, refereed papers from an international conference on bootstrapping and related techniques held in Trier in 1990. The purpose of the book is to inform about recent research in the area of bootstrap, jackknife and Monte Carlo tests. Addressing the novice as well as the expert, it covers theoretical as well as practical aspects of these statistical techniques. Potential users in different disciplines such as biometry, epidemiology, computer science, economics and sociology, but also theoretical researchers, should consult the book to be informed on the state of the art in this area.
Bootstrapping Density-Weighted Average Derivatives
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael
Employing the "small bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of a variety of bootstrap-based inference procedures associated with the kernel-based density-weighted averaged derivative estimator proposed by Powell, Stock, and Stoker...... (1989). In many cases validity of bootstrap-based inference procedures is found to depend crucially on whether the bandwidth sequence satisfies a particular (asymptotic linearity) condition. An exception to this rule occurs for inference procedures involving a studentized estimator employing a "robust...
A Simple Counterexample to the Bootstrap
Donald W.K. Andrews
1997-01-01
The bootstrap of the maximum likelihood estimator of the mean of a sample of iid normal random variables with mean mu and variance one is not asymptotically correct to first order when the mean is restricted to be nonnegative. The problem occurs when the true value of the mean mu equals zero. This counterexample to the bootstrap generalizes to a wide variety of estimation problems in which the true parameter may be on the boundary of the parameter space. We provide some alternatives to the bo...
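The setting of the counterexample is easy to reproduce: the restricted MLE is the sample mean truncated at zero, and the centred bootstrap replicates of it fail to mimic its sampling distribution when the true mean sits on the boundary. The sketch below sets up that estimator and its bootstrap; names and simulation settings are ours:

```python
import random
import statistics

def boundary_estimator(sample):
    """MLE of a normal mean restricted to be nonnegative: max(mean, 0)."""
    return max(statistics.mean(sample), 0.0)

def bootstrap_replicates(sample, n_boot=500, seed=0):
    """Centred bootstrap replicates of the boundary estimator."""
    rng = random.Random(seed)
    n = len(sample)
    theta_hat = boundary_estimator(sample)
    return [boundary_estimator([rng.choice(sample) for _ in range(n)]) - theta_hat
            for _ in range(n_boot)]

# With true mean 0, the sampling distribution of the estimator has a point
# mass of 1/2 at exactly zero (the half-normal limit); the centred bootstrap
# replicates generally fail to reproduce this mass, which is the first-order
# failure the paper establishes.
```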
Bootstrap percolation: a renormalisation group approach
International Nuclear Information System (INIS)
Branco, N.S.; Santos, Raimundo R. dos; Queiroz, S.L.A. de.
1984-02-01
In bootstrap percolation, sites are occupied at random with probability p, but each site is considered active only if at least m of its neighbours are also active. Within an approximate position-space renormalization group framework on a square lattice we obtain the behaviour of the critical concentration p_c and of the critical exponents ν and β for m = 0 (ordinary percolation), 1, 2 and 3. We find that the bootstrap percolation problem can be cast into different universality classes, characterized by the values of m. (author) [pt
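The culling rule described above, repeatedly deactivating sites with fewer than m active nearest neighbours until the configuration is stable, can be simulated directly. This is an illustrative simulation sketch (the paper's analysis is a renormalisation-group calculation, not a simulation); lattice size and parameters are ours:

```python
import random

def bootstrap_percolation(n, p, m, seed=0):
    """Occupy an n x n square lattice with probability p, then iteratively
    remove active sites having fewer than m active nearest neighbours.
    The removal rule is monotone, so the loop converges to the unique
    maximal stable set. m = 0 reduces to ordinary site percolation."""
    rng = random.Random(seed)
    active = {(i, j) for i in range(n) for j in range(n) if rng.random() < p}
    changed = True
    while changed:
        changed = False
        for i, j in list(active):
            nbrs = sum((i + di, j + dj) in active
                       for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            if nbrs < m:
                active.discard((i, j))
                changed = True
    return active
```

Estimating p_c for a given m then amounts to scanning p and checking for a spanning cluster among the surviving sites.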
Self-consistent ECCD calculations with bootstrap current
International Nuclear Information System (INIS)
Decker, J.; Bers, A.; Ram, A. K; Peysson, Y.
2003-01-01
To achieve high performance, steady-state operation in tokamaks, it is increasingly important to find the appropriate means for modifying and sustaining the pressure and magnetic shear profiles in the plasma. In such advanced scenarios, especially in the vicinity of an internal transport barrier, RF-induced currents have to be calculated self-consistently with the bootstrap current, thus taking into account possible synergistic effects resulting from the momentum space distortion of the electron distribution function f_e. Since RF waves can cause the distribution of electrons to become non-Maxwellian, the associated changes in parallel diffusion of momentum between trapped and passing particles can be expected to modify the bootstrap current fraction; conversely, the bootstrap current distribution function can enhance the current driven by RF waves. For this purpose, a new, fast and fully implicit solver has recently been developed to carry out computations including new and detailed evaluations of the interactions between the bootstrap current (BC) and electron cyclotron current drive (ECCD). Moreover, Ohkawa current drive (OKCD) appears to be an efficient method for driving current when the fraction of trapped particles is large. OKCD in the presence of BC is also investigated. Here, results are illustrated around projected tokamak parameters in high performance scenarios of Alcator C-MOD. It is shown that by increasing n//, the EC wave penetration into the bulk of the electron distribution is greater, and since the resonance extends up to high p// values, this situation is the usual ECCD based on the Fisch-Boozer mechanism concerning passing particles. However, because of the close vicinity of the trapped boundary at r/a=0.7, this process is counterbalanced by the Ohkawa effect, possibly leading to a negative net current. Therefore, by injecting the EC wave in the opposite toroidal direction (n// < 0), the RF current driven by OKCD may be 70% larger than that of ECCD, with a choice of EC
Bootstrap inversion for Pn wave velocity in North-Western Italy
Directory of Open Access Journals (Sweden)
C. Eva
1997-06-01
An inversion of Pn arrival times from regional-distance earthquakes (180-800 km), recorded by 94 seismic stations operating in North-Western Italy and surrounding areas, was carried out to image lateral variations of P-wave velocity at the crust-mantle boundary and to estimate the static delay time at each station. The reliability of the obtained results was assessed using both synthetic tests and the bootstrap Monte Carlo resampling technique. Numerical simulations demonstrated the existence of a trade-off between cell velocities and estimated station delay times along the edge of the model. Bootstrap inversions were carried out to determine the standard deviation of velocities and time terms. Low Pn velocity anomalies are detected beneath the outer side of the Alps (-6%) and the Western Po plain (-4%), in correspondence with two regions of strong crustal thickening and negative Bouguer anomaly. In contrast, high Pn velocities are imaged beneath the inner side of the Alps (+4%), indicating the presence of high-velocity and high-density lower crust-upper mantle. The Ligurian Sea shows high Pn velocities close to the Ligurian coastlines (+3%) and low Pn velocities (-1.5%) in the middle of the basin, in agreement with the upper-mantle velocity structure revealed by seismic refraction profiles.
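The bootstrap Monte Carlo resampling used above to attach standard deviations to inverted quantities can be sketched in a few lines. This is a generic illustration on synthetic travel-time residuals, not the authors' inversion code; the data and names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for station travel-time residuals (seconds).
residuals = rng.normal(loc=0.0, scale=0.4, size=94)

def bootstrap_std(data, estimator, n_boot=1000, rng=rng):
    """Bootstrap standard deviation of an estimator: resample the data
    with replacement, re-apply the estimator, and take the spread of
    the replicated estimates."""
    n = len(data)
    replicates = np.empty(n_boot)
    for b in range(n_boot):
        sample = data[rng.integers(0, n, size=n)]
        replicates[b] = estimator(sample)
    return float(replicates.std(ddof=1))

# Standard error of the mean residual, estimated by resampling.
se_mean = bootstrap_std(residuals, np.mean)
```

In the paper the same resample-and-re-estimate loop wraps the full tomographic inversion, so the spread of the replicated cell velocities and time terms gives their standard deviations.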
Pulling Econometrics Students up by Their Bootstraps
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
How to Bootstrap a Human Communication System
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Quadratic mass relations in topological bootstrap theory
International Nuclear Information System (INIS)
Jones, C.E.; Uschersohn, J.
1980-01-01
From the requirement of reality of discontinuities of scattering amplitudes at the spherical level of the topological bootstrap theory, a large number of mass relations for hadrons is derived. Quadratic mass formulas for the symmetry-breaking pattern of both mesons and baryons are obtained, and their relation to conventional models of symmetry breaking is briefly discussed.
A framework for bootstrapping morphological decomposition
CSIR Research Space (South Africa)
Joubert, LJ
2004-11-01
The need for a bootstrapping approach to the morphological decomposition of words in agglutinative languages such as isiZulu is motivated, and the complexities of such an approach are described. The authors then introduce a generic framework which...
Robust block bootstrap panel predictability tests
Westerlund, J.; Smeekes, S.
2013-01-01
Most panel data studies of the predictability of returns presume that the cross-sectional units are independent, an assumption that is not realistic. As a response to this, the current paper develops block bootstrap-based panel predictability tests that are valid under very general conditions. Some
Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S
2017-08-01
Joint modeling and within-cluster resampling are two approaches used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random-effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
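The within-cluster resampling idea can be sketched as follows: draw one observation per cluster so each resampled dataset consists of independent units, estimate on it, and average over resamples. This is a minimal illustration on synthetic data, with a simple mean standing in for the model fit; the names and cluster sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic clustered data with unequal cluster sizes (informative
# sizes in the motivating study; here simply unequal).
clusters = [rng.normal(loc=m, scale=1.0, size=s)
            for m, s in zip([0.0, 0.5, 1.0, 1.5], [3, 5, 8, 12])]

def within_cluster_resample(clusters, n_resamples=500, rng=rng):
    """Within-cluster resampling: draw ONE observation per cluster,
    estimate on the resulting independent sample, and average the
    estimates over many resamples."""
    estimates = []
    for _ in range(n_resamples):
        drawn = [c[rng.integers(len(c))] for c in clusters]
        estimates.append(np.mean(drawn))  # stand-in for a model fit
    return float(np.mean(estimates))

wcr_mean = within_cluster_resample(clusters)
```

Because each resample keeps one unit per cluster, every cluster contributes equally regardless of its size, which is what makes the procedure valid for cluster-specific covariates.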
Efficient generation of pronunciation dictionaries: human factors during bootstrapping
CSIR Research Space (South Africa)
Davel, MH
2004-10-01
Bootstrapping techniques have significant potential for the efficient generation of linguistic resources such as electronic pronunciation dictionaries. The authors describe a system and an approach to bootstrapping for the development...
Bootstrapping pronunciation models: a South African case study
CSIR Research Space (South Africa)
Davel, M
2006-02-27
Bootstrapping techniques can accelerate the development of language technology for new languages. The authors define a framework for the analysis of a general bootstrapping process whereby a model is improved through a controlled series...
A bootstrap based space-time surveillance model with an application to crime occurrences
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. This study instead generates expected values for local hotspots from past occurrences rather than population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
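A minimal sketch of such a history-based significance test, assuming a single surveillance cell with synthetic past counts (the paper's model works on space-time cells; everything here is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Past weekly event counts at one local cell (the historical baseline)
# and the count observed in the current surveillance window.
past_counts = np.array([2, 3, 1, 4, 2, 3, 2, 1, 3, 2])
current_count = 9

def bootstrap_p_value(past, observed, n_boot=5000, rng=rng):
    """Approximate P(count >= observed) under the history-based null:
    resample past counts with replacement and compare each draw to
    the current observation."""
    draws = past[rng.integers(0, len(past), size=n_boot)]
    return float(np.mean(draws >= observed))

p = bootstrap_p_value(past_counts, current_count)
# A small p flags the cell as an emerging hotspot.
```

The appeal of this construction is that it needs only the occurrence history, not population-at-risk data.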
International Nuclear Information System (INIS)
Lins, Isis Didier; Droguett, Enrique López; Moura, Márcio das Chagas; Zio, Enrico; Jacinto, Carlos Magno
2015-01-01
Data-driven learning methods for predicting the evolution of the degradation processes affecting equipment are becoming increasingly attractive in reliability and prognostics applications. Among these, we consider here Support Vector Regression (SVR), which has provided promising results in various applications. Nevertheless, the predictions provided by SVR are point estimates, whereas in order to take better-informed decisions, an uncertainty assessment should also be carried out. For this, we apply the bootstrap to SVR so as to obtain confidence and prediction intervals, without having to make any assumption about probability distributions and with good performance even when only a small data set is available. The bootstrapped SVR is first verified on Monte Carlo experiments and then applied to a real case study concerning the prediction of degradation of a component from the offshore oil industry. The results obtained indicate that the bootstrapped SVR is a promising tool for providing reliable point and interval estimates, which can inform maintenance-related decisions on degrading components. Highlights: • Bootstrap (pairs/residuals) and SVR are used as an uncertainty analysis framework. • Numerical experiments are performed to assess accuracy and coverage properties. • More bootstrap replications do not significantly improve performance. • Degradation of equipment of offshore oil wells is estimated by bootstrapped SVR. • Estimates of the scale growth rate can support maintenance-related decisions.
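The pairs-bootstrap construction of intervals around a point predictor can be sketched as below. For brevity, an ordinary linear least-squares fit stands in for the SVR model used in the paper, and the degradation data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic degradation history: time vs. measured degradation.
t = np.linspace(0.0, 10.0, 30)
y = 0.8 * t + rng.normal(scale=0.5, size=t.size)

def pairs_bootstrap_ci(x, y, x0, n_boot=1000, alpha=0.05, rng=rng):
    """Pairs bootstrap: resample (x_i, y_i) pairs with replacement,
    refit the predictor, and take percentile bounds of the predictions
    at x0.  A linear fit stands in for the SVR used in the paper."""
    n = len(x)
    preds = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        slope, intercept = np.polyfit(x[idx], y[idx], 1)
        preds[b] = slope * x0 + intercept
    lo, hi = np.percentile(preds, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return float(lo), float(hi)

lo, hi = pairs_bootstrap_ci(t, y, x0=12.0)  # extrapolated prediction
```

The residuals variant resamples fitted residuals instead of (x, y) pairs; both avoid distributional assumptions, which is the point made in the abstract.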
The use of the bootstrap in the analysis of case-control studies with missing data
DEFF Research Database (Denmark)
Siersma, Volkert Dirk; Johansen, Christoffer
2004-01-01
nonparametric bootstrap, bootstrap confidence intervals, missing values, multiple imputation, matched case-control study
The $(2,0)$ superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
We develop the conformal bootstrap program for six-dimensional conformal field theories with $(2,0)$ supersymmetry, focusing on the universal four-point function of stress tensor multiplets. We review the solution of the superconformal Ward identities and describe the superconformal block decomposition of this correlator. We apply numerical bootstrap techniques to derive bounds on OPE coefficients and scaling dimensions from the constraints of crossing symmetry and unitarity. We also derive analytic results for the large spin spectrum using the lightcone expansion of the crossing equation. Our principal result is strong evidence that the $A_1$ theory realizes the minimal allowed central charge $(c=25)$ for any interacting $(2,0)$ theory. This implies that the full stress tensor four-point function of the $A_1$ theory is the unique unitary solution to the crossing symmetry equation at $c=25$. For this theory, we estimate the scaling dimensions of the lightest unprotected operators appearing in the stress tensor…
Heptagons from the Steinmann cluster bootstrap
International Nuclear Information System (INIS)
Dixon, Lance J.; McLeod, Andrew J.; Drummond, James; Harrington, Thomas; Spradlin, Marcus; Papathanasiou, Georgios; Stanford Univ., CA
2016-12-01
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar N=4 supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal Q̄ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher-quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here, and the numerical results archived at NASA's Exoplanet Science Institute (NExScI), bear witness to these software improvements. This document attempts to introduce and describe the main features of, and differences between, these three data sets as a consequence of the software changes.
Heptagons from the Steinmann cluster bootstrap
Energy Technology Data Exchange (ETDEWEB)
Dixon, Lance J.; McLeod, Andrew J. [Stanford Univ., CA (United States). SLAC National Accelerator Lab.; Drummond, James [Southampton Univ. (United Kingdom). School of Physics and Astronomy; Harrington, Thomas; Spradlin, Marcus [Brown Univ., Providence, RI (United States). Dept. of Physics; Papathanasiou, Georgios [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Stanford Univ., CA (United States). SLAC National Accelerator Lab.
2016-12-15
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar N=4 supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal Q̄ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Heptagons from the Steinmann cluster bootstrap
International Nuclear Information System (INIS)
Dixon, Lance J.; Drummond, James; Papathanasiou, Georgios
2017-01-01
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar N = 4 supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal Q̄ relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Electric conductivity and bootstrap current in tokamak
International Nuclear Information System (INIS)
Mao Jianshan; Wang Maoquan
1996-12-01
A modified Ohm's law for the electric conductivity calculation is presented, in which the modified ohmic current can be compensated by the bootstrap current. A comparison of TEXT tokamak experiments with the theories shows that the modified Ohm's law is a closer approximation to the tokamak experiments than the classical and neoclassical theories and cannot lead to the absurd result of Z_eff < 1, so the extended neoclassical theory would not be necessary. (3 figs.)
Accidental symmetries and the conformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Chester, Shai M.; Giombi, Simone; Iliesiu, Luca V.; Klebanov, Igor R.; Pufu, Silviu S.; Yacoby, Ran [Joseph Henry Laboratories, Princeton University,Princeton, NJ 08544 (United States)
2016-01-19
We study an N=2 supersymmetric generalization of the three-dimensional critical O(N) vector model that is described by N+1 chiral superfields with superpotential W = g₁X∑ᵢZᵢ² + g₂X³. By combining the tools of the conformal bootstrap with results obtained through supersymmetric localization, we argue that this model exhibits a symmetry enhancement at the infrared superconformal fixed point due to g₂ flowing to zero. This example is special in that the existence of an infrared fixed point with g₁, g₂ ≠ 0, which does not exhibit symmetry enhancement, does not generally lead to any obvious unitarity violations or other inconsistencies. We do show, however, that the F-theorem excludes the models with g₁, g₂ ≠ 0 for N > 5. The conformal bootstrap provides a stronger constraint and excludes such models for N > 2. We provide evidence that the g₂ = 0 models, which have the enhanced O(N)×U(1) symmetry, come close to saturating the bootstrap bounds. We extend our analysis to fractional dimensions where we can motivate the nonexistence of the g₁, g₂ ≠ 0 models by studying them perturbatively in the 4−ϵ expansion.
Bootstrapping Relational Affordances of Object Pairs using Transfer
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert
2018-01-01
…leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn Random Forest based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach (direct bootstrapping), the state-space for a new affordance predictor is augmented with the output of previously learnt affordances. In the second approach (category based bootstrapping), we form categories that capture underlying commonalities of a pair of existing affordances and augment the state-space with this category classifier's output. In addition, we introduce a novel heuristic, which suggests how a large set of potential affordance categories can be pruned to leave only those categories which are most promising for bootstrapping future affordances. Our results show that both bootstrapping approaches outperform learning without bootstrapping…
An add-in implementation of the RESAMPLING syntax under Microsoft EXCEL.
Meineke, I
2000-10-01
The RESAMPLING syntax defines a set of powerful commands, which allow the programming of probabilistic statistical models with few, easily memorized statements. This paper presents an implementation of the RESAMPLING syntax using Microsoft EXCEL with Microsoft WINDOWS(R) as a platform. Two examples are given to demonstrate typical applications of RESAMPLING in biomedicine. Details of the implementation with special emphasis on the programming environment are discussed at length. The add-in is available electronically to interested readers upon request. The use of the add-in facilitates numerical statistical analyses of data from within EXCEL in a comfortable way.
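The flavor of RESAMPLING-style programming can be mimicked in a few functions. The command names below (urn, sample, count) are illustrative only and are not the add-in's actual API:

```python
import random

random.seed(4)

# A minimal mimic of the RESAMPLING-style vocabulary: build an urn,
# draw samples with replacement, and count/score outcomes per trial.
# These names are illustrative, not the EXCEL add-in's actual commands.

def urn(spec):
    """spec like {'heads': 1, 'tails': 1} -> list of urn elements."""
    return [item for item, n in spec.items() for _ in range(n)]

def sample(u, size):
    """Draw `size` elements from the urn with replacement."""
    return [random.choice(u) for _ in range(size)]

def count(trial, item):
    """Count occurrences of `item` in one trial."""
    return trial.count(item)

# Classic teaching example: probability of >= 8 heads in 10 fair tosses.
coin = urn({'heads': 1, 'tails': 1})
trials = 10000
hits = sum(count(sample(coin, 10), 'heads') >= 8 for _ in range(trials))
prob = hits / trials
```

The exact answer here is (45 + 10 + 1)/1024 ≈ 0.055; the few, easily memorized statements are the pedagogical point of the syntax.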
Effect of bootstrap current on MHD equilibrium beta limit in heliotron plasmas
International Nuclear Information System (INIS)
Watanabe, K.Y.
2001-01-01
The effect of bootstrap current on the beta limit of MHD equilibria is studied systematically by an iterative calculation of MHD equilibrium and the consistent bootstrap current in high beta heliotron plasmas. The LHD machine is treated as a standard configuration heliotron with an L=2 planar axis. The effects of vacuum magnetic configurations, pressure profiles and the vertical field control method are studied. The equilibrium beta limit with consistent bootstrap current is quite sensitive to the magnetic axis location for finite beta, compared with the currentless cases. For a vacuum configuration with the magnetic axis shifted inwards in the torus, even in the high beta regimes, the bootstrap current flows to increase the rotational transform, leading to an increase in the equilibrium beta limit. On the contrary, for a vacuum configuration with the magnetic axis shifted outwards in the torus, even in the low beta regimes, the bootstrap current flows so as to reduce the rotational transform; therefore, there is an acceleration of the Shafranov shift increase as beta increases, leading to a decrease in the equilibrium beta limit. The pressure profiles and vertical field control methods influence the equilibrium beta limit through the location of the magnetic axis for finite beta. These characteristics are independent of both device parameters, such as magnetic field strength, and device size in the low collisional regime. (author)
ROSETTA-ORBITER SW RPCMAG 4 CR2 RESAMPLED V3.0
National Aeronautics and Space Administration — 2010-07-30 SBN:T.Barnes Updated and DATA_SET_DESC. This dataset contains RESAMPLED DATA of the CRUISE 2 phase (CR2). (Version 3.0 is the first version archived.)
Quantifying uncertainty on sediment loads using bootstrap confidence intervals
Slaets, Johanna I. F.; Piepho, Hans-Peter; Schmitter, Petra; Hilger, Thomas; Cadisch, Georg
2017-01-01
Load estimates are more informative than constituent concentrations alone, as they allow quantification of on- and off-site impacts of environmental processes concerning pollutants, nutrients and sediment, such as soil fertility loss, reservoir sedimentation and irrigation channel siltation. While statistical models used to predict constituent concentrations have been developed considerably over the last few years, measures of uncertainty on constituent loads are rarely reported. Loads are the product of two predictions, constituent concentration and discharge, integrated over a time period, which does not make it straightforward to produce a standard error or a confidence interval. In this paper, a linear mixed model is used to estimate sediment concentrations. A bootstrap method is then developed that accounts for the uncertainty in the concentration and discharge predictions, allows temporal correlation in the constituent data, and can be used when data transformations are required. The method was tested for a small watershed in Northwest Vietnam for the period 2010-2011. The results showed that confidence intervals were asymmetric, with the highest uncertainty in the upper limit, and that a load of 6262 Mg year-1 had a 95 % confidence interval of (4331, 12 267) in 2010, and a load of 5543 Mg year-1 an interval of (3593, 8975) in 2011. Additionally, the approach demonstrated that direct estimates from the data were biased downwards compared to bootstrap median estimates. These results imply that constituent loads predicted from regression-type water quality models could frequently be underestimating sediment yields and their environmental impact.
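The load bootstrap described above can be sketched as follows, using synthetic daily concentration and discharge values and a simple percentile interval (the paper's mixed-model machinery and unit conversions are omitted; all names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic paired daily predictions over one year: sediment
# concentration (lognormal, hence skewed) and discharge.
days = 365
conc = rng.lognormal(mean=-1.0, sigma=0.8, size=days)
flow = rng.gamma(shape=2.0, scale=1.5, size=days)

def load(c, q):
    """Load = concentration x discharge summed over the period."""
    return float(np.sum(c * q))

def percentile_ci(c, q, n_boot=2000, alpha=0.05, rng=rng):
    """Resample days with replacement and recompute the load; the
    percentile interval is typically asymmetric for skewed data,
    matching the asymmetry reported in the abstract."""
    n = len(c)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        reps[b] = load(c[idx], q[idx])
    return tuple(np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

lo, hi = percentile_ci(conc, flow)
point = load(conc, flow)
```

Resampling days jointly keeps the concentration-discharge pairing intact, which is essential because the load is the product of the two predictions.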
Bootstrap inference for pre-averaged realized volatility based on non-overlapping returns
DEFF Research Database (Denmark)
Gonçalves, Sílvia; Hounyo, Ulrich; Meddahi, Nour
The main contribution of this paper is to propose bootstrap methods for realized volatility-like estimators defined on pre-averaged returns. In particular, we focus on the pre-averaged realized volatility estimator proposed by Podolskij and Vetter (2009). This statistic can be written (up to a bias…). The non-overlapping nature of the pre-averaged returns implies that these are asymptotically independent, but possibly heteroskedastic. This motivates the application of the wild bootstrap in this context. We provide a proof of the first-order asymptotic validity of this method for percentile and percentile-t intervals. Our Monte Carlo simulations show that the wild bootstrap can improve the finite-sample properties of the existing first-order asymptotic theory provided we choose the external random variable appropriately. We use empirical work to illustrate its use in practice.
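A wild bootstrap for a sum-of-squares statistic on independent but heteroskedastic returns can be sketched as below. A standard normal external variable (with E[η²] = 1) is one simple choice for illustration; the paper is precisely about choosing this variable appropriately, so treat the choice here as an assumption:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic non-overlapping pre-averaged returns: independent but
# heteroskedastic, which is what motivates the wild bootstrap here.
n = 200
sigma = 0.01 * (1.0 + 0.5 * np.sin(np.linspace(0.0, 3.0 * np.pi, n)))
returns = rng.normal(loc=0.0, scale=sigma)

def realized_volatility(r):
    return float(np.sum(r ** 2))

def wild_bootstrap(r, n_boot=2000, rng=rng):
    """Wild bootstrap: multiply each return by an external random
    variable eta with E[eta^2] = 1 (standard normal here), so every
    replicate preserves the original heteroskedasticity pattern."""
    reps = np.empty(n_boot)
    for b in range(n_boot):
        eta = rng.standard_normal(len(r))
        reps[b] = realized_volatility(eta * r)
    return reps

reps = wild_bootstrap(returns)
```

Because E[η²] = 1, the replicates are centered on the original statistic; percentile or percentile-t intervals are then read off the replicate distribution.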
Closure of the operator product expansion in the non-unitary bootstrap
Energy Technology Data Exchange (ETDEWEB)
Esterlis, Ilya [Stanford Institute for Theoretical Physics, Stanford University,Via Pueblo, Stanford, CA 94305 (United States); Fitzpatrick, A. Liam [Department of Physics, Boston University,Commonwealth Ave, Boston, MA, 02215 (United States); Ramirez, David M. [Stanford Institute for Theoretical Physics, Stanford University,Via Pueblo, Stanford, CA 94305 (United States)
2016-11-07
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in http://arxiv.org/abs/1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
Transport Barriers in Bootstrap Driven Tokamaks
Staebler, Gary
2017-10-01
Maximizing the bootstrap current in a tokamak, so that it drives a high fraction of the total current, reduces the external power required to drive current by other means. Improved energy confinement, relative to empirical scaling laws, enables a reactor to take fuller advantage of the bootstrap-driven tokamak. Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high-bootstrap-fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrate that the observed transport barrier is due to the suppression of turbulence primarily by the large Shafranov shift. ExB velocity shear does not play a significant role in the transport barrier because of the high safety factor. It will be shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear, or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure-gradient threshold for the Shafranov-shift-driven barrier formation. The ion energy transport is reduced to neoclassical, and electron energy and particle transport are reduced, but still turbulent, within the barrier. Deeper into the plasma, very large levels of electron transport are observed. The observed electron temperature profile is shown to be close to the threshold for the electron temperature gradient (ETG) mode. A large ETG-driven energy transport is qualitatively consistent with recent multi-scale gyrokinetic simulations showing that reducing the ion-scale turbulence can lead to a large increase in the electron-scale transport. A new saturation model for the quasilinear TGLF transport code, which fits these multi-scale gyrokinetic simulations, can match the data if the impact of zonal flow mixing on the ETG modes is reduced at high safety factor. This work was supported by the U.S. Department of Energy under DE-FG02-95ER54309 and DE-FC02
Bootstrap procedure in the quasinuclear quark model
International Nuclear Information System (INIS)
Anisovich, V.V.; Gerasyuta, S.M.; Keltuyala, I.V.
1983-01-01
The scattering amplitude for quarks (dressed quarks of a single flavour and three colours) is obtained by means of a bootstrap procedure with the introduction of an initial point-wise interaction due to heavy gluon exchange. The resulting quasi-nuclear model (effective short-range interaction in the S-wave states) has reasonable properties: there exist colourless meson states J^P = 0^-, 1^-; there are no bound states in coloured channels; and a virtual diquark level J^P = 1^+ appears in the coloured state 3̄_c.
Towards bootstrapping QED₃
Energy Technology Data Exchange (ETDEWEB)
Chester, Shai M.; Pufu, Silviu S. [Joseph Henry Laboratories, Princeton University,Princeton, NJ 08544 (United States)
2016-08-02
We initiate the conformal bootstrap study of Quantum Electrodynamics in 2+1 space-time dimensions (QED₃) with N flavors of charged fermions by focusing on the 4-point function of four monopole operators with the lowest unit of topological charge. We obtain upper bounds on the scaling dimension of the doubly-charged monopole operator, with and without assuming other gaps in the operator spectrum. Intriguingly, we find a (gap-dependent) kink in these bounds that comes reasonably close to the large-N extrapolation of the scaling dimensions of the singly-charged and doubly-charged monopole operators down to N=4 and N=6.
Bootstrapping Malmquist indices for Danish seiners in the North Sea and Skagerrak
DEFF Research Database (Denmark)
Hoff, Ayoe
2006-01-01
…DEA scores or related parameters. The bootstrap method for estimating confidence intervals of deterministic parameters can, however, be applied to estimate confidence intervals for DEA scores. This method is applied in the present paper for assessing TFP changes between 1987 and 1999 for the fleet…
A Mellin space approach to the conformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Gopakumar, Rajesh [International Centre for Theoretical Sciences (ICTS-TIFR),Survey No. 151, Shivakote, Hesaraghatta Hobli, Bangalore North 560 089 (India); Kaviraj, Apratim [Centre for High Energy Physics, Indian Institute of Science,C.V. Raman Avenue, Bangalore 560012 (India); Sen, Kallol [Centre for High Energy Physics, Indian Institute of Science,C.V. Raman Avenue, Bangalore 560012 (India); Kavli Institute for the Physics and Mathematics of the Universe (WPI),The University of Tokyo Institutes for Advanced Study, Kashiwa, Chiba 277-8583 (Japan); Sinha, Aninda [Centre for High Energy Physics, Indian Institute of Science,C.V. Raman Avenue, Bangalore 560012 (India)
2017-05-05
We describe in more detail our approach to the conformal bootstrap which uses the Mellin representation of CFT_d four-point functions and expands them in terms of crossing-symmetric combinations of AdS_{d+1} Witten exchange functions. We consider arbitrary external scalar operators and set up the conditions for consistency with the operator product expansion. Namely, we demand cancellation of spurious powers (of the cross ratios, in position space) which translate into spurious poles in Mellin space. We discuss two contexts in which we can immediately apply this method by imposing the simplest set of constraint equations. The first is the epsilon expansion. We mostly focus on the Wilson-Fisher fixed point as studied in an epsilon expansion about d=4. We reproduce Feynman diagram results for operator dimensions to O(ϵ³) rather straightforwardly. This approach also yields new analytic predictions for OPE coefficients to the same order, which fit nicely with recent numerical estimates for the Ising model (at ϵ=1). We also mention some leading-order results for scalar theories near three and six dimensions. The second context is a large-spin expansion, in any dimension, where we are able to reproduce and go somewhat beyond some of the results recently obtained using the (double) lightcone expansion. We also include a preliminary discussion of the numerical implementation of the above bootstrap scheme in the absence of a small parameter.
Comparison of Bootstrap Confidence Intervals Using Monte Carlo Simulations
Directory of Open Access Journals (Sweden)
Roberto S. Flowers-Cano
2018-02-01
Design of hydraulic works requires the estimation of design hydrological events by statistical inference from a probability distribution. Using Monte Carlo simulations, we compared the coverage of confidence intervals constructed with four bootstrap techniques: percentile bootstrap (BP), bias-corrected bootstrap (BC), accelerated bias-corrected bootstrap (BCA) and a modified version of the standard bootstrap (MSB). Different simulation scenarios were analyzed. In some cases, the mother distribution function was fit to the random samples that were generated. In other cases, a distribution function different from the mother distribution was fit to the samples. When the fitted distribution had three parameters and was the same as the mother distribution, the intervals constructed with the four techniques had acceptable coverage. However, the bootstrap techniques failed in several of the cases in which the fitted distribution had two parameters.
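The BP and BC constructions compared above can be sketched on a synthetic skewed sample. The BCA acceleration term and the MSB variant are omitted for brevity, and the Gumbel data are only a stand-in for annual maxima:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)

# Synthetic annual-maximum sample (skewed), as in flood-frequency work.
sample = rng.gumbel(loc=100.0, scale=25.0, size=40)

def boot_reps(data, estimator, n_boot=4000, rng=rng):
    """Bootstrap replicates of an estimator."""
    n = len(data)
    return np.array([estimator(data[rng.integers(0, n, size=n)])
                     for _ in range(n_boot)])

def percentile_ci(reps, alpha=0.05):
    """BP: take empirical quantiles of the replicates directly."""
    return tuple(np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

def bias_corrected_ci(reps, theta_hat, alpha=0.05):
    """BC: shift the quantile levels by z0, the normal quantile of the
    fraction of replicates falling below the point estimate."""
    nd = NormalDist()
    z0 = nd.inv_cdf(float(np.mean(reps < theta_hat)))
    z = nd.inv_cdf(1 - alpha / 2)
    lo_p = nd.cdf(2 * z0 - z)
    hi_p = nd.cdf(2 * z0 + z)
    return tuple(np.percentile(reps, [100 * lo_p, 100 * hi_p]))

theta_hat = float(np.mean(sample))
reps = boot_reps(sample, np.mean)
bp = percentile_ci(reps)
bc = bias_corrected_ci(reps, theta_hat)
```

When the replicate distribution is symmetric about the point estimate, z0 ≈ 0 and the BC interval collapses to the BP interval; the two differ exactly when the bootstrap distribution is biased, which is the skewed-sample case studied in the paper.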
How to bootstrap a human communication system.
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified items to a partner using repeated non-linguistic vocalization, repeated gesture, or repeated non-linguistic vocalization plus gesture (but without using their existing language system). Gesture proved more effective (measured by communication success) and more efficient (measured by the time taken to communicate) than non-linguistic vocalization across a range of item categories (emotion, object, and action). Combining gesture and vocalization did not improve performance beyond gesture alone. We experimentally demonstrate that gesture is a more effective means of bootstrapping a human communication system. We argue that gesture outperforms non-linguistic vocalization because it lends itself more naturally to the production of motivated signs. © 2013 Cognitive Science Society, Inc.
Selfconsistent RF driven and bootstrap currents
International Nuclear Information System (INIS)
Peysson, Y.
2002-01-01
This important problem, the selfconsistent calculation of the bootstrap current with RF taking into account possible synergistic effects, is addressed for the case of lower hybrid (LH) and electron cyclotron (EC) current drive by numerically solving the electron drift kinetic equation. Calculations are performed using a new, fast, and fully implicit code which solves the 3-D relativistic Fokker-Planck equation with quasilinear diffusion. These calculations take into account the perturbations to the electron distribution due to radial drifts induced by magnetic field gradient and curvature. While the synergism between bootstrap and LH-driven current does not seem to exceed 15%, it can reach 30-40% with the EC-driven current for some plasma parameters. In addition, considerable current can be generated by judiciously exploiting the Ohkawa effect in ECCD, in contrast to the usual ECCD approach, which tries to avoid it. A detailed analysis of the numerical results is presented using a simplified analytical model which incorporates the underlying physical processes. (author)
Transport barriers in bootstrap-driven tokamaks
Staebler, G. M.; Garofalo, A. M.; Pan, C.; McClenaghan, J.; Van Zeeland, M. A.; Lao, L. L.
2018-05-01
Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrates that the observed transport barrier is caused by the suppression of turbulence primarily from the large Shafranov shift. It is shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift-driven barrier formation. Two self-organized states of the internal and edge transport barrier are observed. It is shown that these two states are controlled by the interaction of the bootstrap current with magnetic shear, and the kinetic ballooning mode instability boundary. Electron-scale energy transport is predicted to be dominant in the inner 60% of the profile. Evidence is presented that energetic particle-driven instabilities could be playing a role in the thermal energy transport in this region.
arXiv The S-matrix Bootstrap I: QFT in AdS
Paulos, Miguel F.; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-21
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Using the bootstrap in a multivariate data problem: An example
International Nuclear Information System (INIS)
Glosup, J.G.; Axelrod, M.C.
1995-01-01
The use of the bootstrap in the multivariate version of the paired t-test is considered and demonstrated through an example. The problem of interest involves comparing two different techniques for measuring the chemical constituents of a sample item. The bootstrap is used to form an empirical significance level for Hotelling's one-sample T-squared statistic. The bootstrap was selected to determine empirical significance levels because the implicit assumption of multivariate normality in the classic Hotelling's one-sample test might not hold. The results of both the classic and bootstrap tests are presented and contrasted.
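A minimal sketch of this approach: compute Hotelling's one-sample T-squared on paired differences, then calibrate its significance by resampling the mean-centred differences, which enforces the null hypothesis. The 2-D data values and the degenerate-resample guard below are illustrative assumptions, not taken from the report:

```python
import random

def hotelling_t2(diffs):
    """Hotelling's one-sample T^2 for 2-D paired differences (null: mean = 0)."""
    n = len(diffs)
    m = [sum(d[i] for d in diffs) / n for i in (0, 1)]
    c = [(d[0] - m[0], d[1] - m[1]) for d in diffs]
    s00 = sum(x * x for x, _ in c) / (n - 1)
    s11 = sum(y * y for _, y in c) / (n - 1)
    s01 = sum(x * y for x, y in c) / (n - 1)
    det = s00 * s11 - s01 * s01
    if det <= 1e-12:                 # degenerate resample: score as infinite
        return float("inf")
    # quadratic form n * m' S^{-1} m via the closed-form 2x2 inverse
    return n * (m[0] * m[0] * s11 - 2 * m[0] * m[1] * s01 + m[1] * m[1] * s00) / det

def bootstrap_pvalue(diffs, B=1000, seed=7):
    """Empirical significance level: recentre the differences to enforce
    the null, resample with replacement, and compare to the observed T^2."""
    rng = random.Random(seed)
    n = len(diffs)
    m = [sum(d[i] for d in diffs) / n for i in (0, 1)]
    null = [(d[0] - m[0], d[1] - m[1]) for d in diffs]
    t_obs = hotelling_t2(diffs)
    hits = sum(hotelling_t2([rng.choice(null) for _ in range(n)]) >= t_obs
               for _ in range(B))
    return (hits + 1) / (B + 1)

# paired differences between two measurement techniques (invented values)
diffs = [(0.4, 0.1), (0.6, 0.3), (0.2, -0.1), (0.5, 0.2),
         (0.3, 0.4), (0.7, 0.0), (0.1, 0.2), (0.4, 0.3)]
print(bootstrap_pvalue(diffs))
```

Resamples with a singular covariance matrix are scored as infinite, which conservatively counts them against the null.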
Jannati, Mojtaba; Valadan Zoej, Mohammad Javad; Mokhtarzade, Mehdi
2018-03-01
This paper presents a novel approach to epipolar resampling of cross-track linear pushbroom imagery using the orbital parameters model (OPM). The backbone of the proposed method relies on modification of attitude parameters of linear array stereo imagery in such a way as to parallelize the approximate conjugate epipolar lines (ACELs) with the instantaneous base line (IBL) of the conjugate image points (CIPs). Afterward, a complementary rotation is applied in order to parallelize all the ACELs throughout the stereo imagery. The new estimated attitude parameters are evaluated based on the direction of the IBL and the ACELs. Due to the spatial and temporal variability of the IBL (respectively changes in column and row numbers of the CIPs) and the nonparallel nature of the epipolar lines in the stereo linear images, polynomials in both the column and row numbers of the CIPs are used to model the new attitude parameters. As the instantaneous positions of the sensors remain fixed, the digital elevation model (DEM) of the area of interest is not required in the resampling process. According to the experimental results obtained from two pairs of SPOT and RapidEye stereo imagery with high elevation relief, the average absolute values of the remaining vertical parallaxes of CIPs in the normalized images were 0.19 and 0.28 pixels, respectively, which confirms the high accuracy and applicability of the proposed method.
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We employ the replica method of statistical physics to study the average case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models which are based on Gaussian processes, we discuss bootstrap estimates for learning curves.
The use of vector bootstrapping to improve variable selection precision in Lasso models
Laurin, C.; Boomsma, D.I.; Lubke, G.H.
2016-01-01
The Lasso is a shrinkage regression method that is widely used for variable selection in statistical genetics. Commonly, K-fold cross-validation is used to fit a Lasso model. This is sometimes followed by using bootstrap confidence intervals to improve precision in the resulting variable selections.
Conformal bootstrap, universality and gravitational scattering
Directory of Open Access Journals (Sweden)
Steven Jackson
2015-12-01
Full Text Available We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large c and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
Parameter tolerance of the SQUID bootstrap circuit
International Nuclear Information System (INIS)
Zhang Guofeng; Dong Hui; Xie Xiaoming; Jiang Mianheng; Zhang Yi; Krause, Hans-Joachim; Braginski, Alex I; Offenhäusser, Andreas
2012-01-01
We recently demonstrated and analysed the voltage-biased SQUID bootstrap circuit (SBC) conceived to suppress the preamplifier noise contribution in the absence of flux modulation readout. Our scheme contains both the additional voltage and current feedbacks. In this study, we analysed the tolerance of the SBC noise suppression performance to spreads in SQUID and SBC circuit parameters. Analytical results were confirmed by experiments. A one-time adjustable current feedback can be used to extend the tolerance to spreads such as those caused by the integrated circuit fabrication process. This should help to improve the fabrication yield of SBC devices integrated on one chip—as required for multi-channel SQUID systems.
The ${\\mathcal N}=2$ superconformal bootstrap
Beem, Christopher; Liendo, Pedro; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
In this work we initiate the conformal bootstrap program for ${\\mathcal N}=2$ superconformal field theories in four dimensions. We promote an abstract operator-algebraic viewpoint in order to unify the description of Lagrangian and non-Lagrangian theories, and formulate various conjectures concerning the landscape of theories. We analyze in detail the four-point functions of flavor symmetry current multiplets and of ${\\mathcal N}=2$ chiral operators. For both correlation functions we review the solution of the superconformal Ward identities and describe their superconformal block decompositions. This provides the foundation for an extensive numerical analysis discussed in the second half of the paper. We find a large number of constraints for operator dimensions, OPE coefficients, and central charges that must hold for any ${\\mathcal N}=2$ superconformal field theory.
Bootstrapping the O(N) archipelago
Energy Technology Data Exchange (ETDEWEB)
Kos, Filip; Poland, David [Department of Physics, Yale University, New Haven, CT 06520 (United States); Simmons-Duffin, David [School of Natural Sciences, Institute for Advanced Study, Princeton, New Jersey 08540 (United States); Vichi, Alessandro [Theory Division, CERN, Geneva (Switzerland)
2015-11-17
We study 3d CFTs with an O(N) global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest dimension O(N) vector ϕ{sub i} and the lowest dimension O(N) singlet s, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions (Δ{sub ϕ},Δ{sub s}) to lie inside small islands. We also make rigorous determinations of current two-point functions in the O(2) and O(3) models, with applications to transport in condensed matter systems.
Bootstrapping the O(N) Archipelago
Kos, Filip; Simmons-Duffin, David; Vichi, Alessandro
2015-01-01
We study 3d CFTs with an $O(N)$ global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest dimension $O(N)$ vector $\\phi_i$ and the lowest dimension $O(N)$ singlet $s$, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions $(\\Delta_\\phi, \\Delta_s)$ to lie inside small islands. We also make rigorous determinations of current two-point functions in the $O(2)$ and $O(3)$ models, with applications to transport in condensed matter systems.
Learning web development with Bootstrap and AngularJS
Radford, Stephen
2015-01-01
Whether you know a little about Bootstrap or AngularJS, or you're a complete beginner, this book will enhance your capabilities in both frameworks and you'll build a fully functional web app. A working knowledge of HTML, CSS, and JavaScript is required to fully get to grips with Bootstrap and AngularJS.
The nonparametric bootstrap for the current status model
Groeneboom, P.; Hendrickx, K.
2017-01-01
It has been proved that direct bootstrapping of the nonparametric maximum likelihood estimator (MLE) of the distribution function in the current status model leads to inconsistent confidence intervals. We show that bootstrapping of functionals of the MLE can however be used to produce valid confidence intervals.
On transport and the bootstrap current in toroidal plasmas
International Nuclear Information System (INIS)
Connor, J.W.; Taylor, J.B.
1987-01-01
The recently reported observation of the bootstrap current in a tokamak plasma highlights the problem of reconciling this neoclassical effect with the anomalous (i.e., non-neoclassical) electron thermal transport. This Comment reviews the bootstrap current and considers the implications of a self-consistent modification of neoclassical theory based on an enhanced electron-electron interaction. (author)
Automotive FMCW Radar-Enhanced Range Estimation via a Local Resampling Fourier Transform
Directory of Open Access Journals (Sweden)
Cailing Wang
2016-02-01
Full Text Available In complex traffic scenarios, more accurate measurement and discrimination for an automotive frequency-modulated continuous-wave (FMCW) radar is required for intelligent robots, driverless cars and driver-assistant systems. A more accurate range estimation method based on a local resampling Fourier transform (LRFT) for an FMCW radar is developed in this paper. Radar signal correlation in the phase space sees a higher signal-to-noise ratio (SNR) to achieve more accurate ranging, and the LRFT - which acts on a local neighbourhood as a refinement step - can achieve a more accurate target range. The rough range is estimated through conventional pulse compression (PC) and then, around the initial rough estimation, a refined estimation through the LRFT in the local region achieves greater precision. Furthermore, the LRFT algorithm is tested in numerous simulations and physical system experiments, which show that the LRFT algorithm achieves a more precise range estimation than traditional FFT-based algorithms, especially for lower bandwidth signals.
arXiv Bootstrapping the QCD soft anomalous dimension
Almelid, Øyvind; Gardi, Einan; McLeod, Andrew; White, Chris D.
2017-09-18
The soft anomalous dimension governs the infrared singularities of scattering amplitudes to all orders in perturbative quantum field theory, and is a crucial ingredient in both formal and phenomenological applications of non-abelian gauge theories. It has recently been computed at three-loop order for massless partons by explicit evaluation of all relevant Feynman diagrams. In this paper, we show how the same result can be obtained, up to an overall numerical factor, using a bootstrap procedure. We first give a geometrical argument for the fact that the result can be expressed in terms of single-valued harmonic polylogarithms. We then use symmetry considerations as well as known properties of scattering amplitudes in collinear and high-energy (Regge) limits to constrain an ansatz of basis functions. This is a highly non-trivial cross-check of the result, and our methods pave the way for greatly simplified higher-order calculations.
Boundary and interface CFTs from the conformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Gliozzi, Ferdinando [Dipartimento di Fisica, Università di Torino,Via P. Giuria 1 I-10125 Torino (Italy); Istituto Nazionale di Fisica Nucleare - sezione di Torino,Via P. Giuria 1 I-10125 Torino (Italy); Liendo, Pedro [IMIP, Humboldt-Universität zu Berlin, IRIS Adelershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Meineri, Marco [Scuola Normale Superiore,Piazza dei Cavalieri 7 I-56126 Pisa (Italy); Istituto Nazionale di Fisica Nucleare - sezione di Pisa,Largo B. Pontecorvo, 3, 56127 Pisa (Italy); Rago, Antonio [Centre for Mathematical Sciences, Plymouth University,Drake Circus, Plymouth, PL4 8AA (United Kingdom)
2015-05-07
We explore some consequences of the crossing symmetry for defect conformal field theories, focusing on codimension one defects like flat boundaries or interfaces. We study surface transitions of the 3d Ising and other O(N) models through numerical solutions to the crossing equations with the method of determinants. In the extraordinary transition, where the low-lying spectrum of the surface operators is known, we use the bootstrap equations to obtain information on the bulk spectrum of the theory. In the ordinary transition the knowledge of the low-lying bulk spectrum allows one to calculate the scale dimension of the relevant surface operator, which compares well with known results of two-loop calculations in 3d. Estimates of various OPE coefficients are also obtained. We also analyze in 4-ϵ dimensions the renormalization group interface between the O(N) model and the free theory and check numerically the results in 3d.
Smoothed Bootstrap und seine Anwendung in parametrischen Testverfahren
Directory of Open Access Journals (Sweden)
Handschuh, Dmitri
2015-03-01
Full Text Available In empirical research, the distribution of observations is usually unknown. This creates a problem if parametric methods are to be employed. The functionality of parametric methods relies on strong parametric assumptions. If these are violated, the result of using classical parametric methods is questionable. Therefore, modifications of the parametric methods are required if the appropriateness of their assumptions is in doubt. In this article, a modification of the smoothed bootstrap is presented (using linear interpolation to approximate the distribution law suggested by the data). The application of this modification to statistical parametric methods allows taking into account deviations of the observed data distributions from the classical distribution assumptions without changing to other hypotheses, which often is implicit in using nonparametric methods. The approach is based on the Monte Carlo method and is presented using one-way ANOVA as an example. The original and the modified statistical methods lead to identical outcomes when the assumptions of the original method are satisfied. For strong violations of the distributional assumptions, the modified version of the method is generally preferable. All procedures have been implemented in SAS. Test characteristics (type 1 error, the operating characteristic curve) of the modified ANOVA are calculated.
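The linear-interpolation idea can be sketched as drawing from the piecewise-linear version of the empirical distribution function. Segment selection is taken as uniform here, which is an assumption of this sketch rather than the article's exact scheme, and the data values are invented:

```python
import random

def smoothed_draw(sorted_sample, rng):
    """Draw from the piecewise-linear interpolation of the empirical CDF:
    pick a segment between adjacent order statistics, interpolate uniformly."""
    i = rng.randrange(len(sorted_sample) - 1)
    u = rng.random()
    return sorted_sample[i] + u * (sorted_sample[i + 1] - sorted_sample[i])

def smoothed_bootstrap_means(sample, B=500, seed=1):
    """Smoothed-bootstrap distribution of the sample mean."""
    rng = random.Random(seed)
    s = sorted(sample)
    n = len(sample)
    return [sum(smoothed_draw(s, rng) for _ in range(n)) / n for _ in range(B)]

data = [3.1, 2.4, 5.6, 4.2, 3.8, 2.9, 4.9, 3.3]
means = smoothed_bootstrap_means(data)
print(min(means), max(means))
```

Unlike the ordinary bootstrap, the resampled values are not restricted to the observed data points, only to the observed range.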
Marill, Keith A; Chang, Yuchiao; Wong, Kim F; Friedman, Ari B
2017-08-01
Objectives Assessing high-sensitivity tests for mortal illness is crucial in emergency and critical care medicine. Estimating the 95% confidence interval (CI) of the likelihood ratio (LR) can be challenging when sample sensitivity is 100%. We aimed to develop, compare, and automate a bootstrapping method to estimate the negative LR CI when sample sensitivity is 100%. Methods The lowest population sensitivity that is most likely to yield sample sensitivity 100% is located using the binomial distribution. Random binomial samples generated using this population sensitivity are then used in the LR bootstrap. A free R program, "bootLR," automates the process. Extensive simulations were performed to determine how often the LR bootstrap and comparator method 95% CIs cover the true population negative LR value. Finally, the 95% CI was compared for theoretical sample sizes and sensitivities approaching and including 100% using: (1) a technique of individual extremes, (2) SAS software based on the technique of Gart and Nam, (3) the Score CI (as implemented in the StatXact, SAS, and R PropCI package), and (4) the bootstrapping technique. Results The bootstrapping approach demonstrates appropriate coverage of the nominal 95% CI over a spectrum of populations and sample sizes. Considering a study of sample size 200 with 100 patients with disease, and specificity 60%, the lowest population sensitivity with median sample sensitivity 100% is 99.31%. When all 100 patients with disease test positive, the negative LR 95% CIs are: individual extremes technique (0,0.073), StatXact (0,0.064), SAS Score method (0,0.057), R PropCI (0,0.062), and bootstrap (0,0.048). Similar trends were observed for other sample sizes. Conclusions When study samples demonstrate 100% sensitivity, available methods may yield inappropriately wide negative LR CIs. An alternative bootstrapping approach and accompanying free open-source R package were developed to yield realistic estimates easily. This
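The core of the approach can be sketched as follows: solve P(X = n) >= 0.5 for the lowest population sensitivity whose median sample sensitivity is 100% (for n = 100 this gives 0.5**(1/100) ≈ 0.9931, matching the 99.31% quoted above), then bootstrap the negative LR from binomial draws. This is a simplified illustration, not the bootLR package's full algorithm:

```python
import random

def boot_nlr_ci(n_dis, n_nodis, spec_hat, B=5000, seed=3):
    """Sketch of a bootstrap negative-LR interval when sample sensitivity
    is 100%: use the lowest population sensitivity whose median sample
    sensitivity is still 100%, i.e. P(X = n) >= 0.5  =>  p = 0.5 ** (1/n)."""
    p_sens = 0.5 ** (1.0 / n_dis)
    rng = random.Random(seed)
    nlrs = []
    for _ in range(B):
        # binomial draws for sensitivity and specificity
        sens = sum(rng.random() < p_sens for _ in range(n_dis)) / n_dis
        spec = sum(rng.random() < spec_hat for _ in range(n_nodis)) / n_nodis
        if spec > 0:
            nlrs.append((1 - sens) / spec)
    nlrs.sort()
    upper = nlrs[int(0.975 * len(nlrs)) - 1]   # percentile upper bound
    return 0.0, upper

# study of size 200: 100 diseased (all test positive), specificity 60%
lo, hi = boot_nlr_ci(n_dis=100, n_nodis=100, spec_hat=0.6)
print(lo, hi)
```

The lower bound is pinned at 0 because the observed negative LR is 0 when sensitivity is 100%.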
Montzka, Carsten; Hendricks Franssen, Harrie-Jan; Moradkhani, Hamid; Pütz, Thomas; Han, Xujun; Vereecken, Harry
2013-04-01
An adequate description of soil hydraulic properties is essential for a good performance of hydrological forecasts. So far, several studies showed that data assimilation could reduce the parameter uncertainty by considering soil moisture observations. However, these observations and also the model forcings were recorded with a specific measurement error. It seems a logical step to base state updating and parameter estimation on observations made at multiple time steps, in order to reduce the influence of outliers at single time steps given measurement errors and unknown model forcings. Such outliers could result in erroneous state estimation as well as inadequate parameters. This has been one of the reasons to use a smoothing technique as implemented for Bayesian data assimilation methods such as the Ensemble Kalman Filter (i.e. Ensemble Kalman Smoother). Recently, an ensemble-based smoother has been developed for state update with a SIR particle filter. However, this method has not been used for dual state-parameter estimation. In this contribution we present a Particle Smoother with sequential smoothing of particle weights for state and parameter resampling within a time window, as opposed to the single time step data assimilation used in filtering techniques. This can be seen as an intermediate variant between a parameter estimation technique using global optimization with estimation of single parameter sets valid for the whole period, and sequential Monte Carlo techniques with estimation of parameter sets evolving from one time step to another. The aims are i) to improve the forecast of evaporation and groundwater recharge by estimating hydraulic parameters, and ii) to reduce the impact of single erroneous model inputs/observations by a smoothing method. In order to validate the performance of the proposed method in a real world application, the experiment is conducted in a lysimeter environment.
MPBoot: fast phylogenetic maximum parsimony tree inference and bootstrap approximation.
Hoang, Diep Thi; Vinh, Le Sy; Flouri, Tomáš; Stamatakis, Alexandros; von Haeseler, Arndt; Minh, Bui Quang
2018-02-02
The nonparametric bootstrap is widely used to measure the branch support of phylogenetic trees. However, bootstrapping is computationally expensive and remains a bottleneck in phylogenetic analyses. Recently, an ultrafast bootstrap approximation (UFBoot) approach was proposed for maximum likelihood analyses. However, such an approach is still missing for maximum parsimony. To close this gap we present MPBoot, an adaptation and extension of UFBoot to compute branch supports under the maximum parsimony principle. MPBoot works for both uniform and non-uniform cost matrices. Our analyses on biological DNA and protein data showed that under uniform cost matrices, MPBoot runs on average 4.7 (DNA) to 7 times (protein data) (range: 1.2-20.7) faster than the standard parsimony bootstrap implemented in PAUP*; but 1.6 (DNA) to 4.1 times (protein data) slower than the standard bootstrap with a fast search routine in TNT (fast-TNT). However, for non-uniform cost matrices MPBoot is 5 (DNA) to 13 times (protein data) (range: 0.3-63.9) faster than fast-TNT. We note that MPBoot achieves better scores more frequently than PAUP* and fast-TNT. However, this effect is less pronounced if an intensive but slower search in TNT is invoked. Moreover, experiments on large-scale simulated data show that while both PAUP* and TNT bootstrap estimates are too conservative, MPBoot bootstrap estimates appear more unbiased. MPBoot provides an efficient alternative to the standard maximum parsimony bootstrap procedure. It shows favorable performance in terms of run time, the capability of finding a maximum parsimony tree, and high bootstrap accuracy on simulated as well as empirical data sets. MPBoot is easy-to-use, open-source and available at http://www.cibiv.at/software/mpboot .
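The resampling step underlying all of these bootstrap variants is the same: sample alignment columns with replacement and re-run the tree search on each replicate. A sketch of just the column-resampling step (toy four-taxon alignment invented for illustration; the parsimony search itself is omitted):

```python
import random

def bootstrap_alignment(alignment, rng):
    """One bootstrap replicate: resample alignment columns with replacement."""
    n_sites = len(next(iter(alignment.values())))
    cols = [rng.randrange(n_sites) for _ in range(n_sites)]
    return {taxon: "".join(seq[j] for j in cols) for taxon, seq in alignment.items()}

# toy four-taxon DNA alignment (invented)
aln = {"A": "ACGTACGT", "B": "ACGAACGT", "C": "TCGAACGA", "D": "TCGTACGA"}
rep = bootstrap_alignment(aln, random.Random(0))
print(rep)
```

Branch support is then the fraction of replicates whose inferred tree contains a given branch; UFBoot-style methods save time by approximating the per-replicate search.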
Academic Carelessness, Bootstrapping, and the Cybernetic Investigator
Directory of Open Access Journals (Sweden)
Hannah Drayson
2017-11-01
Full Text Available The following discussion is concerned with certain forms of poor practice in academic publishing that give rise to “academic urban legends.” It suggests that rather than simply consider phenomena such as poor citation practices and circular reporting as mistakes, misunderstandings, and evidence of lack of rigor, we might also read them as evidence of a particular kind of creativity—for which misunderstandings, assumptions, and failures of diligence are mechanisms by which potentially influential ideas manifest. Reflecting particularly on a critique of the debate surrounding pharmaceutical cognitive enhancement and its use by university staff and students, the following will argue that investigators within the disciplines concerned with the effects or development of these technologies are themselves implicated as potential subjects. Alongside reflections from science fiction studies that offer insights into the experiential dimension of reading and misreading, this paper offers some insights regarding how we might think of mistakes and misunderstandings as a form of bootstrapping and a source of creativity in scientific and technological development.
Analysis of cost data in a cluster-randomized, controlled trial: comparison of methods
DEFF Research Database (Denmark)
Sokolowski, Ineta; Ørnbøl, Eva; Rosendal, Marianne
studies have used non-valid analysis of skewed data. We propose two different methods to compare mean cost in two groups. Firstly, we use a non-parametric bootstrap method where the re-sampling takes place on two levels in order to take into account the cluster effect. Secondly, we proceed with a log-transformation of the cost data and apply the normal theory on these data. Again we try to account for the cluster effect. The performance of these two methods is investigated in a simulation study. The advantages and disadvantages of the different approaches are discussed. We consider health care data from a cluster-randomized intervention study in primary care to test whether the average health care costs among study patients differ between the two groups. The problems of analysing cost data are that most data are severely skewed. Median instead of mean
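The two-level re-sampling can be sketched as resampling clusters first and then patients within each chosen cluster; the cost values and cluster sizes below are invented for illustration:

```python
import random

def two_level_bootstrap_mean(clusters, B=1000, seed=5):
    """Two-level bootstrap for clustered cost data: resample clusters with
    replacement, then resample patients within each chosen cluster."""
    rng = random.Random(seed)
    means = []
    for _ in range(B):
        sample = []
        for _ in range(len(clusters)):
            cl = rng.choice(clusters)
            sample.extend(rng.choice(cl) for _ in range(len(cl)))
        means.append(sum(sample) / len(sample))
    return means

# skewed cost data grouped by practice (invented values)
clusters = [[120, 80, 950], [60, 75, 90, 2200], [300, 45], [110, 130, 95, 85]]
means = two_level_bootstrap_mean(clusters)
means.sort()
print(means[25], means[975])  # rough 95% percentile interval
```

Resampling at both levels keeps the within-cluster correlation in the bootstrap distribution, which a one-level patient resample would destroy.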
Bootstrap and fast wave current drive for tokamak reactors
International Nuclear Information System (INIS)
Ehst, D.A.
1991-09-01
Using the multi-species neoclassical treatment of Hirshman and Sigmar we study steady state bootstrap equilibria with seed currents provided by low frequency (ICRF) fast waves and with additional surface current density driven by lower hybrid waves. This study applies to reactor plasmas of arbitrary aspect ratio. In one limit the bootstrap component can supply nearly the total equilibrium current with minimal driving power (I{sub o} = 18 MA needs P{sub FW} = 15 MW, P{sub LH} = 75 MW). A computational survey of bootstrap fraction and current drive efficiency is presented. 11 refs., 8 figs
Directory of Open Access Journals (Sweden)
Engin YILDIZTEPE
2015-05-01
Full Text Available Bootstrap methodology is a modern statistical tool which enables us to make statistical inferences when the sampling distribution of the estimator is not known. Although the underlying idea is the same in all bootstrap methods, one might come across many variations in the literature. In this study, the coverage accuracy of the four most commonly used bootstrap confidence interval methods was assessed for various asymmetric and heavy-tailed distributions with an exhaustive Monte Carlo simulation. In most of the cases, it has been found that the coverage accuracy of the bootstrap percentile method is close to nominal for robust estimators of location.
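Coverage accuracy of a bootstrap interval can be checked with exactly this kind of Monte Carlo loop. The sketch below assesses the percentile interval for the median (a robust location estimator) under a skewed exponential parent; the distribution, sample size, and replication counts are chosen for illustration rather than taken from the study:

```python
import math
import random

def percentile_ci_median(sample, rng, B=200, alpha=0.10):
    """Percentile bootstrap interval for the sample median."""
    n = len(sample)
    meds = sorted(sorted(rng.choices(sample, k=n))[n // 2] for _ in range(B))
    return meds[int(B * alpha / 2)], meds[int(B * (1 - alpha / 2)) - 1]

def coverage(true_median, n=30, reps=200, seed=11):
    """Fraction of simulated samples whose interval covers the true median."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        sample = [rng.expovariate(1.0) for _ in range(n)]  # heavy right skew
        lo, hi = percentile_ci_median(sample, rng)
        hits += lo <= true_median <= hi
    return hits / reps

cov = coverage(true_median=math.log(2))  # exponential(1) median = ln 2
print(cov)
```

Comparing the observed fraction against the nominal 90% level, across distributions and estimators, is the essence of the comparison the study reports.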
Directory of Open Access Journals (Sweden)
José Antonio Castorina
2005-12-01
Full Text Available This paper examines the explanatory theory proposed by Carey for conceptual change. First, it describes the question of conceptual reorganization in cognitive psychology and Carey's position. Second, it analyzes the epistemic conditions that children's "theories" must fulfil for conceptual restructuring to be possible, as well as the forms the latter adopts. Third, it presents the findings of research verifying conceptual change among children's intuitive theories of biology. Fourth, it discusses the difficulties of other theories of conceptual change, and then formulates the features of the alternative bootstrapping mechanism and its relevance for interpreting the data of the aforementioned investigations. Finally, it evaluates the originality of the bootstrapping theory in the scene of contemporary debates; in particular, a possible rapprochement with Piaget's dialectical theses is outlined.
Directory of Open Access Journals (Sweden)
Yeqing Zhang
2018-02-01
Full Text Available For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully.
van den Broek, Egon
A practitioner’s guide to resampling for data analysis, data mining, and modeling provides a gentle and pragmatic introduction to its topics. Its supporting Web site was offline and, hence, its potential added value could not be verified. The book refrains from using advanced mathematics.
A steady-State Genetic Algorithm with Resampling for Noisy Inventory Control
Prestwich, S.; Tarim, S.A.; Rossi, R.; Hnich, B.
2008-01-01
Noisy fitness functions occur in many practical applications of evolutionary computation. A standard technique for solving these problems is fitness resampling but this may be inefficient or need a large population, and combined with elitism it may overvalue chromosomes or reduce genetic diversity.
Zhang, Yeqing; Wang, Meiling; Li, Yafeng
2018-01-01
For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301
Fast Generation of Ensembles of Cosmological N-Body Simulations via Mode-Resampling
Energy Technology Data Exchange (ETDEWEB)
Schneider, M D; Cole, S; Frenk, C S; Szapudi, I
2011-02-14
We present an algorithm for quickly generating multiple realizations of N-body simulations to be used, for example, for cosmological parameter estimation from surveys of large-scale structure. Our algorithm uses a new method to resample the large-scale (Gaussian-distributed) Fourier modes in a periodic N-body simulation box in a manner that properly accounts for the nonlinear mode-coupling between large and small scales. We find that our method for adding new large-scale mode realizations recovers the nonlinear power spectrum to sub-percent accuracy on scales larger than about half the Nyquist frequency of the simulation box. Using 20 N-body simulations, we obtain a power spectrum covariance matrix estimate that matches the estimator from Takahashi et al. (from 5000 simulations) with < 20% errors in all matrix elements. Comparing the rates of convergence, we determine that our algorithm requires approximately 8 times fewer simulations to achieve a given error tolerance in estimates of the power spectrum covariance matrix. The degree of success of our algorithm indicates that we understand the main physical processes that give rise to the correlations in the matter power spectrum. Namely, the large-scale Fourier modes modulate both the degree of structure growth through the variation in the effective local matter density and also the spatial frequency of small-scale perturbations through large-scale displacements. We expect our algorithm to be useful for noise modeling when constraining cosmological parameters from weak lensing (cosmic shear) and galaxy surveys, rescaling summary statistics of N-body simulations for new cosmological parameter values, and any applications where the influence of Fourier modes larger than the simulation size must be accounted for.
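The core idea, redrawing the Gaussian amplitudes of the largest-scale modes while keeping the rest of the realization fixed, can be sketched in one dimension. This toy is purely illustrative: the paper works with full 3-D N-body boxes and carefully handles nonlinear mode coupling, which this sketch omits:

```python
import math
import random

# Toy 1-D sketch of mode-resampling: a periodic "field" built from a few
# large-scale cosine modes, with the largest-scale amplitude redrawn.
random.seed(0)
n_modes, n_cells = 4, 64

def realize(amplitudes):
    """Build a periodic field from a handful of large-scale cosine modes."""
    return [sum(a * math.cos(2 * math.pi * (k + 1) * x / n_cells)
                for k, a in enumerate(amplitudes))
            for x in range(n_cells)]

# Gaussian mode amplitudes with power falling toward smaller scales.
base = [random.gauss(0.0, 1.0 / (k + 1)) for k in range(n_modes)]
field = realize(base)

# "Resample" only the largest-scale mode: draw a new Gaussian amplitude
# while keeping the smaller-scale modes fixed.
resampled = base[:]
resampled[0] = random.gauss(0.0, 1.0)
new_field = realize(resampled)

print(len(field), len(new_field))  # 64 64
```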
Time evolution of the bootstrap current profile in LHD plasmas
International Nuclear Information System (INIS)
Nakamura, Yuji; Kawaoto, K.; Watanabe, K.Y.
2008-10-01
The direction of the bootstrap current is inverted in the outward-shifted plasmas of the Large Helical Device (LHD). In order to verify the reliability of theoretical models of the bootstrap current in helical plasmas, rotational transform profiles are observed by Motional Stark Effect measurement in bootstrap-current-carrying plasmas of the LHD and compared with numerical simulations of the toroidal current profile including the bootstrap current. Since the toroidal current profile is not in a steady state in these plasmas, it is important that the simulations account for the inversely induced component of the toroidal current and for the finite duration of its resistive diffusion. Reasonable agreement is obtained between the rotational transform profiles measured in the experiments and those calculated in the numerical simulations. (author)
Bootstrap current of fast ions in neutral beam injection heating
International Nuclear Information System (INIS)
Huang Qianhong; Gong Xueyu; Li Xinxia; Yu Jun
2012-01-01
The bootstrap current of fast ions produced by neutral beam injection (NBI) is investigated in a large-aspect-ratio tokamak with circular cross-section under specific parameters. The bootstrap current density distribution and the total bootstrap current are reported. In addition, the beam bootstrap current always accompanies the electron return current due to the parallel momentum transfer from fast ions. With the electron return current taken into consideration, the net current density obviously decreases; at the same time, the peak of the current moves towards the central plasma. Numerical results show that the value of the net current depends sensitively not only on the angle of the NBI but also on the ratio of the velocity of fast ions to the critical velocity: the value of the net current is small for neutral beam parallel injection, but increases severalfold for perpendicular injection, and increases with increasing beam energy. (paper)
Bootstrap current of fast ions in neutral beam injection heating
International Nuclear Information System (INIS)
Huang Qianhong; Gong Xueyu; Yang Lei; Li Xinxia; Lu Xingqiang; Yu Jun
2012-01-01
The bootstrap current of fast ions produced by neutral beam injection is investigated in a large-aspect-ratio tokamak with circular cross-section under specific parameters. The bootstrap current density distribution and the total bootstrap current are computed. In addition, the beam bootstrap current is always accompanied by an electron return current due to the parallel momentum transfer from fast ions. With the electron return current taken into account, the net current density decreases markedly; at the same time, the peak of the current moves towards the plasma centre. Numerical results show that the value of the net current depends sensitively not only on the angle of the neutral beam injection but also on the ratio of the velocity of fast ions to the critical velocity: the net current is small for parallel injection but increases severalfold for perpendicular injection, and it increases with increasing beam energy. (authors)
Forecasting Model for IPTV Service in Korea Using Bootstrap Ridge Regression Analysis
Lee, Byoung Chul; Kee, Seho; Kim, Jae Bum; Kim, Yun Bae
The telecom firms in Korea are taking new steps to prepare for the next generation of convergence services, IPTV. In this paper we describe our analysis of effective methods for demand forecasting for IPTV broadcasting. We examined three scenarios based on aspects of the potential IPTV market and compared the results. The forecasting method used in this paper is a multi-generation substitution model with bootstrap ridge regression analysis.
Generalized bootstrap equations and possible implications for the NLO Odderon
Energy Technology Data Exchange (ETDEWEB)
Bartels, J. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Vacca, G.P. [INFN, Sezione di Bologna (Italy)
2013-07-15
We formulate and discuss generalized bootstrap equations in nonabelian gauge theories. They are shown to hold in the leading logarithmic approximation. Since their validity is related to the self-consistency of the Steinmann relations for inelastic production amplitudes, they can be expected to be valid also at NLO. Specializing to N=4 SYM, we show that the validity at NLO of these generalized bootstrap equations allows one to find the NLO Odderon solution with intercept exactly at one.
Solution of the statistical bootstrap with Bose statistics
International Nuclear Information System (INIS)
Engels, J.; Fabricius, K.; Schilling, K.
1977-01-01
A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density
Random resampling masks: a non-Bayesian one-shot strategy for noise reduction in digital holography.
Bianco, V; Paturzo, M; Memmolo, P; Finizio, A; Ferraro, P; Javidi, B
2013-03-01
Holographic imaging may become severely degraded by a mixture of speckle and incoherent additive noise. Bayesian approaches reduce the incoherent noise, but prior information is needed on the noise statistics. With no prior knowledge, one-shot reduction of noise is a highly desirable goal, as the recording process is simplified and made faster. Indeed, neither multiple acquisitions nor a complex setup are needed. So far, this result has been achieved at the cost of a deterministic resolution loss. Here we propose a fast non-Bayesian denoising method that avoids this trade-off by means of a numerical synthesis of a moving diffuser. In this way, only one single hologram is required as multiple uncorrelated reconstructions are provided by random complementary resampling masks. Experiments show a significant incoherent noise reduction, close to the theoretical improvement bound, resulting in image-contrast improvement. At the same time, we preserve the resolution of the unprocessed image.
White, H; Racine, J
2001-01-01
We propose tests for individual and joint irrelevance of network inputs. Such tests can be used to determine whether an input or group of inputs "belong" in a particular model, thus permitting valid statistical inference based on estimated feedforward neural-network models. The approaches employ well-known statistical resampling techniques. We conduct a small Monte Carlo experiment showing that our tests have reasonable level and power behavior, and we apply our methods to examine whether there are predictable regularities in foreign exchange rates. We find that exchange rates do appear to contain information that is exploitable for enhanced point prediction, but the nature of the predictive relations evolves through time.
International Nuclear Information System (INIS)
Ozel, M.E.; Mayer-Hasselwander, H.
1985-01-01
This paper discusses the bootstrap scheme, which fits well with many astronomical applications. It is based on the well-known sampling plan called ''sampling with replacement''. Digital computers make the method very practical for investigating various trends present in a limited set of data, which is usually a small fraction of the total population. The authors apply the method and demonstrate its feasibility. The study indicates that the discrete nature of high-energy gamma-ray data makes the bootstrap method especially attractive for gamma-ray astronomy. The present analysis shows that the ratio of pulse strengths is variable at the 99.8% confidence level.
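The "sampling with replacement" plan that this abstract describes can be sketched as a plain percentile bootstrap; the data values and the number of replicates below are illustrative, not from the paper:

```python
import random
import statistics

# Minimal percentile-bootstrap sketch: a confidence interval for the mean
# of a small sample, built by repeated sampling with replacement.
random.seed(42)
data = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.2]  # illustrative measurements

means = []
for _ in range(2000):
    # Draw len(data) items WITH replacement -- the bootstrap sampling plan.
    resample = [random.choice(data) for _ in data]
    means.append(statistics.mean(resample))

means.sort()
# 2.5th and 97.5th percentiles of the bootstrap means form a 95% interval.
lo, hi = means[int(0.025 * len(means))], means[int(0.975 * len(means))]
print(round(lo, 2), round(hi, 2))
```

The same resampling loop applied to any other statistic (a ratio of pulse strengths, say) yields a confidence interval in exactly the same way.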
Control of bootstrap current in the pedestal region of tokamaks
Energy Technology Data Exchange (ETDEWEB)
Shaing, K. C. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China); Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796 (United States); Lai, A. L. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China)
2013-12-15
The high-confinement-mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field and a sonic poloidal flow U_{p,m} that consists of the poloidal components of the E×B flow and of the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current, which is important for the equilibrium and stability of the pedestal of H-mode plasmas, is shown to have an expression different from that in the conventional theory. In the limit ‖U_{p,m}‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled by manipulating U_{p,m} and the gradient of the radial electric field. This, in turn, can control plasma stability such as edge-localized modes. Quantitative evaluations of various coefficients illustrate that the bootstrap current remains finite as ‖U_{p,m}‖ approaches infinity and indicate how to control the bootstrap current. Approximate analytic expressions for the viscous coefficients that join the results in the banana, plateau, and Pfirsch–Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
GSD: An SPSS extension command for sub-sampling and bootstrapping datasets
Directory of Open Access Journals (Sweden)
Harding, Bradley
2016-09-01
Full Text Available Statistical analyses have grown immensely since the inception of computational methods. However, many quantitative methods classes teach sampling and sub-sampling at a very abstract level despite the fact that, with the faster computers of today, these notions could be demonstrated live to the students. For this reason, we have created a simple extension module for SPSS that can sub-sample and bootstrap data: GSD (Generator of Sub-sampled Data). In this paper, we describe and show how to use the GSD module and provide short descriptions of both the sub-sampling and bootstrap methods. In addition, as this article aims to inspire instructors to introduce these concepts in statistics classes of all levels, we provide three short exercises that are ready for curriculum implementation.
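The two operations GSD exposes, sub-sampling and bootstrapping, differ only in whether cases are drawn with replacement. A minimal sketch (hypothetical function names; the actual extension is driven through SPSS syntax, not Python):

```python
import random

def subsample(data, k, rng):
    """Draw k cases WITHOUT replacement (a sub-sample: no repeats)."""
    return rng.sample(data, k)

def bootstrap(data, rng):
    """Draw len(data) cases WITH replacement (a bootstrap sample)."""
    return [rng.choice(data) for _ in data]

rng = random.Random(1)
cases = list(range(10))          # toy dataset of 10 case IDs
sub = subsample(cases, 5, rng)
boot = bootstrap(cases, rng)

print(len(sub), len(boot))        # 5 10
print(len(set(sub)) == len(sub))  # True: a sub-sample never repeats a case
```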
How many bootstrap replicates are necessary?
Pattengale, Nicholas D; Alipour, Masoud; Bininda-Emonds, Olaf R P; Moret, Bernard M E; Stamatakis, Alexandros
2010-03-01
Phylogenetic bootstrapping (BS) is a standard technique for inferring confidence values on phylogenetic trees, based on reconstructing many trees from minor variations of the input data; these trees are called replicates. BS is used with all phylogenetic reconstruction approaches, but we focus here on one of the most popular, maximum likelihood (ML). Because ML inference is so computationally demanding, it has proved too expensive to date to assess the impact of the number of replicates used in BS on the relative accuracy of the support values. For the same reason, a rather small number (typically 100) of BS replicates are computed in real-world studies. Stamatakis et al. recently introduced a BS algorithm that is 1 to 2 orders of magnitude faster than previous techniques, while yielding qualitatively comparable support values, making an experimental study possible. In this article, we propose stopping criteria, that is, thresholds computed at runtime to determine when enough replicates have been generated, and we report on the first large-scale experimental study to assess the effect of the number of replicates on the quality of support values, including the performance of our proposed criteria. We run our tests on 17 diverse real-world DNA datasets (single-gene as well as multi-gene) that include 125-2,554 taxa. We find that our stopping criteria typically stop computations after 100-500 replicates (although the most conservative criterion may continue for several thousand replicates) while producing support values that correlate at better than 99.5% with the reference values on the best ML trees. Significantly, we also find that the stopping criteria can recommend very different numbers of replicates for different datasets of comparable sizes. Our results are thus twofold: (i) they give the first experimental assessment of the effect of the number of BS replicates on the quality of support values returned through BS, and (ii) they validate our proposed stopping criteria.
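A runtime stopping rule in the spirit of the one described above can be sketched as: keep adding batches of replicates until support values computed from two independent half-samples agree closely. Everything below is a toy stand-in (real BS support comes from reconstructed trees, and the true supports, batch size, and threshold are assumptions for illustration):

```python
import math
import random

random.seed(7)
true_support = [0.95, 0.80, 0.60, 0.99, 0.70]  # hypothetical branch supports

def supports(n_reps):
    """Fraction of n_reps simulated replicates supporting each branch."""
    return [sum(random.random() < p for _ in range(n_reps)) / n_reps
            for p in true_support]

def pearson(xs, ys):
    """Plain Pearson correlation between two support vectors."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

n = 50
# Stop once supports from two independent draws of n replicates correlate
# above the (assumed) 0.995 threshold; otherwise add another batch.
while pearson(supports(n), supports(n)) < 0.995:
    n += 50
print(n >= 50)  # True
```

Note how the required n emerges at runtime rather than being fixed in advance, which matches the paper's finding that comparable datasets can need very different replicate counts.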
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
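The bootstrap-based probabilistic sensitivity analysis described above can be sketched by resampling patient-level cost and effect pairs in each arm and examining the spread of the incremental cost-effectiveness ratio (ICER). All numbers below are made up for illustration; the paper's model concerns Helicobacter pylori eradication:

```python
import random
import statistics

random.seed(3)
# Simulated (cost, effect) pairs per patient for two strategies.
treat  = [(random.gauss(1200, 150), random.gauss(0.80, 0.05)) for _ in range(200)]
compar = [(random.gauss(900, 120), random.gauss(0.70, 0.05)) for _ in range(200)]

icers = []
for _ in range(1000):
    # Bootstrap each arm: resample patients with replacement, then recompute
    # the incremental cost and effect for this Monte Carlo draw.
    t = [random.choice(treat) for _ in treat]
    c = [random.choice(compar) for _ in compar]
    d_cost = statistics.mean(x[0] for x in t) - statistics.mean(x[0] for x in c)
    d_eff = statistics.mean(x[1] for x in t) - statistics.mean(x[1] for x in c)
    icers.append(d_cost / d_eff)

icers.sort()
# The middle 95% of the bootstrap ICERs gives a probabilistic interval.
print(round(icers[25]), round(icers[975]))
```

Because the bootstrap draws from the observed patient-level data, no theoretical distribution needs to be assumed for costs or effects, which is the advantage the abstract highlights.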
Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés
2018-03-01
Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. In doing so, most previous studies have applied conventional data envelopment analysis (DEA) models. However, DEA is a deterministic method that does not allow identification of the environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. Results evidenced that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, it was found that the percentage of non-revenue water and customer density are factors influencing the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
Y-90 PET imaging for radiation theragnosis using bootstrap event resampling
International Nuclear Information System (INIS)
Nam, Taewon; Woo, Sangkeun; Min, Gyungju; Kim, Jimin; Kang, Joohyun; Lim, Sangmoo; Kim, Kyeongmin
2013-01-01
Surgical resection is the most effective way to preserve liver function, but most treatment of unresectable hepatocellular carcinoma (HCC) is palliative. Yttrium-90 (Y-90) has therefore been adopted as a treatment because it can be delivered to the tumors and results in greater radiation exposure of the tumors than external radiation. Recently, Y-90 has received much interest and has been studied by many researchers. Imaging of Y-90 has most commonly been conducted with a gamma camera, but its low sensitivity and resolution make PET imaging necessary. The purpose of this study was to assess statistical characteristics and to improve the image count rate, and thereby image quality, using a nonparametric bootstrap method. The PET data were improved by the non-parametric bootstrap method, as verified by improved uniformity and SNR. Uniformity showed greater improvement under low-count-rate conditions (i.e., Y-90) in the phantom, and uniformity and SNR improved by 15.6% and 33.8%, respectively, in the mouse. The bootstrap method applied to PET data in this study increased the count rate of the PET image, so acquisition time can be reduced. It is expected to improve diagnostic performance.
Han, Guanghui; Liu, Xiabi; Zheng, Guangyuan; Wang, Murong; Huang, Shan
2018-06-06
Ground-glass opacity (GGO) is a common CT imaging sign on high-resolution CT, which means the lesion is more likely to be malignant compared to common solid lung nodules. The automatic recognition of GGO CT imaging signs is of great importance for early diagnosis and possible cure of lung cancers. The present GGO recognition methods employ traditional low-level features and system performance improves slowly. Considering the high-performance of CNN model in computer vision field, we proposed an automatic recognition method of 3D GGO CT imaging signs through the fusion of hybrid resampling and layer-wise fine-tuning CNN models in this paper. Our hybrid resampling is performed on multi-views and multi-receptive fields, which reduces the risk of missing small or large GGOs by adopting representative sampling panels and processing GGOs with multiple scales simultaneously. The layer-wise fine-tuning strategy has the ability to obtain the optimal fine-tuning model. Multi-CNN models fusion strategy obtains better performance than any single trained model. We evaluated our method on the GGO nodule samples in publicly available LIDC-IDRI dataset of chest CT scans. The experimental results show that our method yields excellent results with 96.64% sensitivity, 71.43% specificity, and 0.83 F1 score. Our method is a promising approach to apply deep learning method to computer-aided analysis of specific CT imaging signs with insufficient labeled images.
Roberts, Steven; Martin, Michael A
2010-01-01
Concerns have been raised about findings of associations between particulate matter (PM) air pollution and mortality that have been based on a single "best" model arising from a model selection procedure, because such a strategy may ignore model uncertainty inherently involved in searching through a set of candidate models to find the best model. Model averaging has been proposed as a method of allowing for model uncertainty in this context. Our objective was to propose an extension (double BOOT) to a previously described bootstrap model-averaging procedure (BOOT) for use in time series studies of the association between PM and mortality. We compared double BOOT and BOOT with Bayesian model averaging (BMA) and a standard method of model selection [standard Akaike's information criterion (AIC)]. Actual time series data from the United States were used to conduct a simulation study to compare and contrast the performance of double BOOT, BOOT, BMA, and standard AIC. Double BOOT produced estimates of the effect of PM on mortality that had smaller root mean squared error than those produced by BOOT, BMA, and standard AIC. This performance boost resulted from estimates produced by double BOOT having smaller variance than those produced by BOOT and BMA. Double BOOT is a viable alternative to BOOT and BMA for producing estimates of the mortality effect of PM.
Improved Correction of Misclassification Bias With Bootstrap Imputation.
van Walraven, Carl
2018-07-01
Diagnostic codes used in administrative database research can create bias due to misclassification. Quantitative bias analysis (QBA) can correct for this bias and requires only code sensitivity and specificity, but it may return invalid results. Bootstrap imputation (BI) can also address misclassification bias but traditionally requires multivariate models to accurately estimate disease probability. This study compared misclassification bias correction using QBA and BI. Serum creatinine measures were used to determine severe renal failure status in 100,000 hospitalized patients. Prevalence of severe renal failure in 86 patient strata and its association with 43 covariates was determined and compared with results in which renal failure status was determined using diagnostic codes (sensitivity 71.3%, specificity 96.2%). Differences in results (misclassification bias) were then corrected with QBA or BI (using progressively more complex methods to estimate disease probability). In total, 7.4% of patients had severe renal failure. Imputing disease status with diagnostic codes exaggerated prevalence estimates [median relative change (range), 16.6% (0.8%-74.5%)] and its association with covariates [median (range) exponentiated absolute parameter estimate difference, 1.16 (1.01-2.04)]. QBA produced invalid results 9.3% of the time and increased bias in estimates of both disease prevalence and covariate associations. BI decreased misclassification bias with increasingly accurate disease probability estimates. QBA can produce invalid results and increase misclassification bias. BI avoids invalid results and can importantly decrease misclassification bias when accurate disease probability estimates are used.
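The simplest form of the QBA correction for a misclassified binary variable, and the way it can "return invalid results", can be shown directly with the sensitivity and specificity quoted above. The stratum prevalences below are assumed values for illustration:

```python
# Standard QBA correction for a misclassified binary variable:
# corrected = (observed + specificity - 1) / (sensitivity + specificity - 1).
# It goes negative ("invalid") when the observed prevalence falls below the
# false-positive rate, 1 - specificity.
def qba_corrected_prevalence(observed, sens, spec):
    return (observed + spec - 1.0) / (sens + spec - 1.0)

sens, spec = 0.713, 0.962  # code accuracy reported in the abstract

p1 = qba_corrected_prevalence(0.10, sens, spec)  # plausible stratum
p2 = qba_corrected_prevalence(0.02, sens, spec)  # below 1 - spec = 0.038

print(round(p1, 3))  # 0.092
print(p2 < 0)        # True: an invalid (negative) corrected prevalence
```

Strata whose observed prevalence sits near or below 1 − specificity are exactly where the abstract's 9.3% invalid-result rate comes from; BI sidesteps this by imputing disease status from estimated probabilities instead.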
A voltage biased superconducting quantum interference device bootstrap circuit
International Nuclear Information System (INIS)
Xie Xiaoming; Wang Huiwu; Wang Yongliang; Dong Hui; Jiang Mianheng; Zhang Yi; Krause, Hans-Joachim; Braginski, Alex I; Offenhaeusser, Andreas; Mueck, Michael
2010-01-01
We present a dc superconducting quantum interference device (SQUID) readout circuit operating in the voltage bias mode and called a SQUID bootstrap circuit (SBC). The SBC is an alternative implementation of two existing methods for suppression of room-temperature amplifier noise: additional voltage feedback and current feedback. Two circuit branches are connected in parallel. In the dc SQUID branch, an inductively coupled coil connected in series provides the bias current feedback for enhancing the flux-to-current coefficient. The circuit branch parallel to the dc SQUID branch contains an inductively coupled voltage feedback coil with a shunt resistor in series for suppressing the preamplifier noise current by increasing the dynamic resistance. We show that the SBC effectively reduces the preamplifier noise to below the SQUID intrinsic noise. For a helium-cooled planar SQUID magnetometer with a SQUID inductance of 350 pH, a flux noise of about 3 μΦ₀/√Hz and a magnetic field resolution of less than 3 fT/√Hz were obtained. The SBC leads to a convenient direct readout electronics for a dc SQUID with a wider adjustment tolerance than other feedback schemes.
Bootstrap-based Support of HGT Inferred by Maximum Parsimony
Directory of Open Access Journals (Sweden)
Nakhleh Luay
2010-05-01
Full Text Available Background: Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. Results: In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. Conclusions: We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/) and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
Bootstrap-based support of HGT inferred by maximum parsimony.
Park, Hyun Jung; Jin, Guohua; Nakhleh, Luay
2010-05-05
Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
Locality, bulk equations of motion and the conformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Kabat, Daniel [Department of Physics and Astronomy, Lehman College, City University of New York,250 Bedford Park Blvd. W, Bronx NY 10468 (United States); Lifschytz, Gilad [Department of Mathematics, Faculty of Natural Science, University of Haifa,199 Aba Khoushy Ave., Haifa 31905 (Israel)
2016-10-18
We develop an approach to construct local bulk operators in a CFT to order 1/N². Since 4-point functions are not fixed by conformal invariance we use the OPE to categorize possible forms for a bulk operator. Using previous results on 3-point functions we construct a local bulk operator in each OPE channel. We then impose the condition that the bulk operators constructed in different channels agree, and hence give rise to a well-defined bulk operator. We refer to this condition as the “bulk bootstrap.” We argue and explicitly show in some examples that the bulk bootstrap leads to some of the same results as the regular conformal bootstrap. In fact the bulk bootstrap provides an easier way to determine some CFT data, since it does not require knowing the form of the conformal blocks. This analysis clarifies previous results on the relation between bulk locality and the bootstrap for theories with a 1/N expansion, and it identifies a simple and direct way in which OPE coefficients and anomalous dimensions determine the bulk equations of motion to order 1/N².
A comparison of resampling schemes for estimating model observer performance with small ensembles
Elshahaby, Fatma E. A.; Jha, Abhinav K.; Ghaly, Michael; Frey, Eric C.
2017-09-01
In objective assessment of image quality, an ensemble of images is used to compute the 1st and 2nd order statistics of the data. Often, only a finite number of images is available, leading to the issue of statistical variability in numerical observer performance. Resampling-based strategies can help overcome this issue. In this paper, we compared different combinations of resampling schemes (the leave-one-out (LOO) and the half-train/half-test (HT/HT)) and model observers (the conventional channelized Hotelling observer (CHO), channelized linear discriminant (CLD) and channelized quadratic discriminant). Observer performance was quantified by the area under the ROC curve (AUC). For a binary classification task and for each observer, the AUC value for an ensemble size of 2000 samples per class served as a gold standard for that observer. Results indicated that each observer yielded a different performance depending on the ensemble size and the resampling scheme. For a small ensemble size, the combination [CHO, HT/HT] had more accurate rankings than the combination [CHO, LOO]. Using the LOO scheme, the CLD and CHO had similar performance for large ensembles. However, the CLD outperformed the CHO and gave more accurate rankings for smaller ensembles. As the ensemble size decreased, the performance of the [CHO, LOO] combination seriously deteriorated as opposed to the [CLD, LOO] combination. Thus, it might be desirable to use the CLD with the LOO scheme when smaller ensemble size is available.
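A minimal sketch of one of the compared combinations, a Hotelling-type linear observer evaluated with a half-train/half-test split, applied to toy Gaussian feature vectors (the data, dimensions, and ensemble sizes below are assumptions for illustration, not the paper's ensembles):

```python
import numpy as np

def auc_mann_whitney(pos, neg):
    """Nonparametric AUC estimate: P(pos score > neg score), ties counted half."""
    pos, neg = np.asarray(pos)[:, None], np.asarray(neg)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

def ht_ht_auc(signal, background, rng):
    """One half-train/half-test replicate for a Hotelling-type linear observer."""
    def halves(x):
        idx = rng.permutation(len(x))
        return x[idx[: len(x) // 2]], x[idx[len(x) // 2:]]
    s_tr, s_te = halves(signal)
    b_tr, b_te = halves(background)
    # train the template w = S^-1 (mean_signal - mean_background)
    d = s_tr.mean(0) - b_tr.mean(0)
    S = 0.5 * (np.cov(s_tr.T) + np.cov(b_tr.T)) + 1e-6 * np.eye(signal.shape[1])
    w = np.linalg.solve(S, d)
    return auc_mann_whitney(s_te @ w, b_te @ w)  # score the held-out half

rng = np.random.default_rng(0)
sig = rng.normal(0.5, 1.0, (200, 4))   # toy "signal-present" feature vectors
bkg = rng.normal(0.0, 1.0, (200, 4))   # toy "signal-absent" feature vectors
aucs = [ht_ht_auc(sig, bkg, rng) for _ in range(20)]
mean_auc = float(np.mean(aucs))        # averaged over random splits
```

Averaging over repeated random splits reduces the variability in the AUC estimate that a single split would show for a small ensemble.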
Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping.
Borra-Serrano, Irene; Peña, José Manuel; Torres-Sánchez, Jorge; Mesas-Carrascosa, Francisco Javier; López-Granados, Francisca
2015-08-12
Unmanned aerial vehicles (UAVs) combined with different spectral range sensors are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds, at very early phenological stages, are similar spectrally and in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique to create a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled-images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based-image-analysis (OBIA) implemented for the early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images captured at an altitude of 30 m, and that the RS-image data at altitudes of 60 and 100 m provided weed cover and herbicide application maps as accurate as those from UAV-images from real flights.
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data without the necessity to assume a noise model. These methods have been previously combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, the bootstrap method is in the present work incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph based algorithm. The BootGraph shows a very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline techniques.
Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji
Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve performance superior to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.
Alternative bootstrap simulation models
Pino Mejías, Rafael
1992-01-01
The fundamental characteristics of bootstrap methods are described. Several problems presented by such methods are analyzed, and two alternative methods are presented within the bootstrap method based on the simulation of samples (Efron's method II). The first presents a method which, starting from a study of the algebraic and statistical properties of the set of possible samples, uses a probabilistic criterion to detect "outlier" samples. In the sec...
Bootstrapped efficiency measures of oil blocks in Angola
International Nuclear Information System (INIS)
Barros, C.P.; Assaf, A.
2009-01-01
This paper investigates the technical efficiency of Angolan oil blocks over the period 2002-2007. A double-bootstrap data envelopment analysis (DEA) model is adopted, consisting of a DEA variable-returns-to-scale (VRS) model in the first stage, followed in the second stage by a bootstrapped truncated regression. Results showed that, on average, technical efficiency fluctuated over the study period, but deep and ultradeep oil blocks generally maintained a consistent efficiency level. Policy implications are derived.
Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2017-11-01
Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogenous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location, and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62 %, and the mean scale efficiency is 88 %, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
Energy Technology Data Exchange (ETDEWEB)
Wedenberg, Minna, E-mail: minna.wedenberg@raysearchlabs.com
2013-11-15
Purpose: To apply a statistical bootstrap analysis to assess the uncertainty in the dose–response relation for the endpoints pneumonitis and myelopathy reported in the QUANTEC review. Methods and Materials: The bootstrap method assesses the uncertainty of the estimated population-based dose–response relation due to sample variability, which reflects the uncertainty due to limited numbers of patients in the studies. A large number of bootstrap replicates of the original incidence data were produced by random sampling with replacement. The analysis requires only the dose, the number of patients, and the number of occurrences of the studied endpoint, for each study. Two dose–response models, a Poisson-based model and the Lyman model, were fitted to each bootstrap replicate using maximum likelihood. Results: The bootstrap analysis generates a family of curves representing the range of plausible dose–response relations, and the 95% bootstrap confidence intervals give an estimated upper and lower toxicity risk. The curve families for the 2 dose–response models overlap for doses included in the studies at hand but diverge beyond that, with the Lyman model suggesting a steeper slope. The resulting distributions of the model parameters indicate correlation and non-Gaussian distribution. For both data sets, the likelihood of the observed data was higher for the Lyman model in >90% of the bootstrap replicates. Conclusions: The bootstrap method provides a statistical analysis of the uncertainty in the estimated dose–response relation for myelopathy and pneumonitis. It suggests likely values of the model parameters, their confidence intervals, and how they interrelate for each model. Finally, it can be used to evaluate to what extent the data support one model over another. For both data sets considered here, the Lyman model was preferred over the Poisson-based model.
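The resampling-with-replacement step described above is, for a binary endpoint, equivalent to redrawing each study's event count from a binomial distribution. A sketch of the bootstrap confidence-interval idea, with a logistic dose-response model standing in for the Lyman and Poisson-based models and made-up incidence data:

```python
import numpy as np
from scipy.optimize import minimize

# made-up incidence data per study: (dose in Gy, patients, events)
studies = [(10, 50, 2), (20, 60, 6), (30, 40, 10), (40, 30, 14)]

def neg_log_lik(params, data):
    """Binomial negative log-likelihood of a logistic dose-response curve."""
    a, b = params
    nll = 0.0
    for dose, n, k in data:
        p = 1.0 / (1.0 + np.exp(-(a + b * dose)))
        p = min(max(p, 1e-12), 1.0 - 1e-12)
        nll -= k * np.log(p) + (n - k) * np.log(1.0 - p)
    return nll

def fit(data):
    return minimize(neg_log_lik, x0=[-3.0, 0.1], args=(data,),
                    method="Nelder-Mead").x

rng = np.random.default_rng(1)
slopes = []
for _ in range(200):
    # resampling patients with replacement within a study is equivalent
    # to redrawing its event count from Binomial(n, k/n)
    replicate = [(dose, n, rng.binomial(n, k / n)) for dose, n, k in studies]
    slopes.append(fit(replicate)[1])
lo, hi = np.percentile(slopes, [2.5, 97.5])  # 95% bootstrap CI for the slope
```

Fitting both candidate models to each replicate, as the paper does, would additionally allow counting in what fraction of replicates one model attains the higher likelihood.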
Predicting disease risk using bootstrap ranking and classification algorithms.
Directory of Open Access Journals (Sweden)
Ohad Manor
Full Text Available Genome-wide association studies (GWAS are widely used to search for genetic loci that underlie human disease. Another goal is to predict disease risk for different individuals given their genetic sequence. Such predictions could either be used as a "black box" in order to promote changes in life-style and screening for early diagnosis, or as a model that can be studied to better understand the mechanism of the disease. Current methods for risk prediction typically rank single nucleotide polymorphisms (SNPs by the p-value of their association with the disease, and use the top-associated SNPs as input to a classification algorithm. However, the predictive power of such methods is relatively poor. To improve the predictive power, we devised BootRank, which uses bootstrapping in order to obtain a robust prioritization of SNPs for use in predictive models. We show that BootRank improves the ability to predict disease risk of unseen individuals in the Wellcome Trust Case Control Consortium (WTCCC data and results in a more robust set of SNPs and a larger number of enriched pathways being associated with the different diseases. Finally, we show that combining BootRank with seven different classification algorithms improves performance compared to previous studies that used the WTCCC data. Notably, diseases for which BootRank results in the largest improvements were recently shown to have more heritability than previously thought, likely due to contributions from variants with low minimum allele frequency (MAF, suggesting that BootRank can be beneficial in cases where SNPs affecting the disease are poorly tagged or have low MAF. Overall, our results show that improving disease risk prediction from genotypic information may be a tangible goal, with potential implications for personalized disease screening and treatment.
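The core idea, aggregating per-feature ranks across bootstrap resamples so that unstable associations are down-weighted, can be sketched as follows; the mean-difference score below is a simplified stand-in for the per-SNP association statistic, and the data are synthetic:

```python
import numpy as np

def bootstrap_rank(X, y, n_boot=50, seed=0):
    """Order features by their average rank across bootstrap resamples.

    The score (absolute difference in class means) is a stand-in for the
    per-SNP association statistic; rank 0 means strongest association.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    rank_sum = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                  # resample individuals
        Xb, yb = X[idx], y[idx]
        score = np.abs(Xb[yb == 1].mean(0) - Xb[yb == 0].mean(0))
        rank_sum += score.argsort()[::-1].argsort()  # rank within replicate
    return rank_sum.argsort()                        # stable features first

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 300)              # toy case/control labels
X = rng.normal(0.0, 1.0, (300, 10))      # toy genotype-like features
X[:, 3] += 0.8 * y                       # feature 3 carries the signal
top = bootstrap_rank(X, y)
```

The top-ranked features would then feed a downstream classifier, as in the paper's pipeline.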
TRANSFORMERLESS OPERATION OF DIII-D WITH HIGH BOOTSTRAP FRACTION
International Nuclear Information System (INIS)
POLITZER, PA; HYATT, AW; LUCE, TC; MAHDAVI, MA; MURAKAMI, M; PERKINS, FW; PRATER, R; TURNBULL, AD; CASPER, TA; FERRON, JR; JAYAKUMAR, RJ; LAHAYE, RJ; LAZARUS, EA; PETTY, CC; WADE, MR
2003-01-01
The authors have initiated an experimental program to address some of the questions associated with operation of a tokamak with a high bootstrap current fraction under high-performance conditions, without assistance from a transformer. In these discharges they have maintained stationary (or slowly improving) conditions for > 2.2 s at β_N ∼ β_p ∼ 2.8. Significant current overdrive, with dI/dt > 50 kA/s and zero or negative voltage, is sustained for over 0.7 s. The overdrive condition usually ends with the appearance of MHD activity, which alters the profiles and reduces the bootstrap current. Characteristically these plasmas have 65%-80% bootstrap current, 25%-30% NBCD, and 5%-10% ECCD. Fully noninductive operation is essential for steady-state tokamaks. For efficient operation, the bootstrap current fraction must be close to 100%, allowing for a small additional (∼ 10%) external current drive capability to be used for control. In such plasmas the current and pressure profiles are tightly coupled, because J(r) is entirely determined by p(r) (or, more accurately, by the kinetic profiles). The pressure gradient in turn is determined by transport coefficients which depend on the poloidal field profile
Adaptive Kernel In The Bootstrap Boosting Algorithm In KDE ...
African Journals Online (AJOL)
This paper proposes the use of adaptive kernel in a bootstrap boosting algorithm in kernel density estimation. The algorithm is a bias reduction scheme like other existing schemes but uses adaptive kernel instead of the regular fixed kernels. An empirical study for this scheme is conducted and the findings are comparatively ...
A bootstrap invariance principle for highly nonstationary long memory processes
Kapetanios, George
2004-01-01
This paper presents an invariance principle for highly nonstationary long memory processes, defined as processes with long memory parameter lying in (1, 1.5). This principle provides the tools for showing asymptotic validity of the bootstrap in the context of such processes.
Bootstrapping the energy flow in the beginning of life
Hengeveld, R.; Fedonkin, M.A.
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in
Integrable deformations of conformal theories and bootstrap trees
International Nuclear Information System (INIS)
Mussardo, G.
1991-01-01
I present recent results in the study of massive integrable quantum field theories in (1+1) dimensions considered as perturbed conformal minimal models. The on mass-shell properties of such theories, with a particular emphasis on the bootstrap principle, are investigated. (orig.)
A Bootstrap Cointegration Rank Test for Panels of VAR Models
DEFF Research Database (Denmark)
Callot, Laurent
functions of the individual Cointegrated VARs (CVAR) models. A bootstrap based procedure is used to compute empirical distributions of the trace test statistics for these individual models. From these empirical distributions two panel trace test statistics are constructed. The satisfying small sample...
Finite-size effects for anisotropic bootstrap percolation : Logarithmic corrections
van Enter, Aernout C. D.; Hulshof, Tim
In this note we analyse an anisotropic, two-dimensional bootstrap percolation model introduced by Gravner and Griffeath. We present upper and lower bounds on the finite-size effects. We discuss the similarities with the semi-oriented model introduced by Duarte.
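The underlying dynamics, vacant sites becoming occupied once enough of their neighbours are occupied, is easy to simulate. The sketch below implements the standard two-dimensional model with periodic boundaries; the anisotropic variants studied in these papers differ only in the neighbourhood passed in:

```python
import numpy as np

def bootstrap_percolation(occupied, threshold=2,
                          neighborhood=((1, 0), (-1, 0), (0, 1), (0, -1))):
    """Run 2D bootstrap percolation to its final configuration.

    A vacant site becomes occupied once at least `threshold` of its
    neighbours are occupied; occupied sites never empty. Boundaries are
    periodic (via np.roll). Anisotropic variants change `neighborhood`.
    """
    grid = occupied.astype(bool)
    while True:
        count = np.zeros(grid.shape, dtype=int)
        for dy, dx in neighborhood:
            count += np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        new = grid | (count >= threshold)
        if (new == grid).all():
            return new
        grid = new
```

For the standard threshold-2 model a fully occupied diagonal already spans: `bootstrap_percolation(np.eye(6, dtype=bool))` fills the whole grid.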
Metastability Thresholds for Anisotropic Bootstrap Percolation in Three Dimensions
Enter, Aernout C.D. van; Fey, Anne
In this paper we analyze several anisotropic bootstrap percolation models in three dimensions. We present the order of magnitude for the metastability thresholds for a fairly general class of models. In our proofs, we use an adaptation of the technique of dimensional reduction. We find that the
Finite-Size Effects for Some Bootstrap Percolation Models
Enter, A.C.D. van; Adler, Joan; Duarte, J.A.M.S.
The consequences of Schonmann's new proof that the critical threshold is unity for certain bootstrap percolation models are explored. It is shown that this proof provides an upper bound for the finite-size scaling in these systems. Comparison with data for one case demonstrates that this scaling
A resampling-based meta-analysis for detection of differential gene expression in breast cancer
International Nuclear Information System (INIS)
Gur-Dedeoglu, Bala; Konu, Ozlen; Kir, Serkan; Ozturk, Ahmet Rasit; Bozkurt, Betul; Ergul, Gulusan; Yulug, Isik G
2008-01-01
Accuracy in the diagnosis of breast cancer and classification of cancer subtypes has improved over the years with the development of well-established immunohistopathological criteria. More recently, diagnostic gene-sets at the mRNA expression level have been tested as better predictors of disease state. However, breast cancer is heterogeneous in nature; thus extraction of differentially expressed gene-sets that stably distinguish normal tissue from various pathologies poses challenges. Meta-analysis of high-throughput expression data using a collection of statistical methodologies leads to the identification of robust tumor gene expression signatures. A resampling-based meta-analysis strategy, which involves the use of resampling and application of distribution statistics in combination to assess the degree of significance in differential expression between sample classes, was developed. Two independent microarray datasets that contain normal breast, invasive ductal carcinoma (IDC), and invasive lobular carcinoma (ILC) samples were used for the meta-analysis. Expression of the genes, selected from the gene list for classification of normal breast samples and breast tumors encompassing both the ILC and IDC subtypes were tested on 10 independent primary IDC samples and matched non-tumor controls by real-time qRT-PCR. Other existing breast cancer microarray datasets were used in support of the resampling-based meta-analysis. The two independent microarray studies were found to be comparable, although differing in their experimental methodologies (Pearson correlation coefficient, R = 0.9389 and R = 0.8465 for ductal and lobular samples, respectively). The resampling-based meta-analysis has led to the identification of a highly stable set of genes for classification of normal breast samples and breast tumors encompassing both the ILC and IDC subtypes. The expression results of the selected genes obtained through real-time qRT-PCR supported the meta-analysis results. The
Evaluation of resampling applied to UAV imagery for weed detection using OBIA
Borra, I.; Peña Barragán, José Manuel; Torres Sánchez, Jorge; López Granados, Francisca
2015-01-01
Unmanned aerial vehicles (UAVs) are an emerging technology for studying agricultural parameters, owing to their characteristics and their ability to carry sensors covering different spectral ranges. In this work, weed patches were detected and mapped at an early phenological stage by means of OBIA analysis, in order to produce maps that optimize site-specific herbicide treatment. Resampling was applied to images taken in the field from a UAV (UAV-I) to create a new image with differ...
Direct measurement of fast transients by using boot-strapped waveform averaging
Olsson, Mattias; Edman, Fredrik; Karki, Khadga Jung
2018-03-01
An approximation to coherent sampling, also known as boot-strapped waveform averaging, is presented. The method uses digital cavities to determine the condition for coherent sampling. It can be used to increase the effective sampling rate of a repetitive signal and the signal to noise ratio simultaneously. The method is demonstrated by using it to directly measure the fluorescence lifetime from Rhodamine 6G by digitizing the signal from a fast avalanche photodiode. The obtained lifetime of 4.0 ns is in agreement with the known values.
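The idea of approximating coherent sampling can be illustrated with plain equivalent-time averaging: sample a repetitive signal at a rate incommensurate with its period, then fold the samples by phase and average within phase bins. All rates and amplitudes below are assumed toy values, not the paper's setup:

```python
import numpy as np

f_sig = 1.0e6        # repetition rate of the signal, 1 MHz (assumed)
f_samp = 0.317e6     # digitizer rate, incommensurate with f_sig (assumed)
n = 20000

rng = np.random.default_rng(3)
t = np.arange(n) / f_samp
noisy = np.sin(2 * np.pi * f_sig * t) + rng.normal(0.0, 1.0, n)  # SNR ~ 1

# fold every sample onto its phase within one signal period ...
phase = (t * f_sig) % 1.0
bins = np.linspace(0.0, 1.0, 65)
idx = np.digitize(phase, bins) - 1
# ... and average within each of the 64 phase bins: the effective sampling
# rate rises and the noise per bin shrinks as 1/sqrt(samples per bin)
avg = np.array([noisy[idx == k].mean() for k in range(64)])
```

The averaged waveform recovers the sinusoid even though each raw sample is noise-dominated, which is the simultaneous gain in effective sampling rate and signal-to-noise ratio the abstract refers to.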
Construction of prediction intervals for Palmer Drought Severity Index using bootstrap
Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan
2018-04-01
In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-month) and mid-term (six-month) drought observations. The effects of the North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. The performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from the Konya closed basin located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
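A minimal sketch of a residual-based bootstrap prediction interval, using a toy AR(1) series in place of the PDSI data (the model and all values are assumptions for illustration, not the paper's specification):

```python
import numpy as np

def ar1_bootstrap_pi(series, h=1, n_boot=500, alpha=0.05, seed=0):
    """Residual-based bootstrap prediction interval for an AR(1) fit."""
    rng = np.random.default_rng(seed)
    x = np.asarray(series, dtype=float)
    x0, x1 = x[:-1], x[1:]
    phi = (x0 * x1).sum() / (x0 * x0).sum()   # least-squares AR(1) coefficient
    resid = x1 - phi * x0
    resid = resid - resid.mean()              # centre the residuals
    preds = np.empty(n_boot)
    for b in range(n_boot):
        last = x[-1]
        for _ in range(h):                         # simulate h steps ahead,
            last = phi * last + rng.choice(resid)  # resampling residuals
        preds[b] = last
    return np.percentile(preds, [100 * alpha / 2, 100 * (1 - alpha / 2)])

rng = np.random.default_rng(2)
x = [0.0]
for _ in range(400):                          # toy stationary AR(1) series
    x.append(0.6 * x[-1] + rng.normal())
lo, hi = ar1_bootstrap_pi(x, h=3)             # 95% interval, 3 steps ahead
```

Because the interval is built from resampled empirical residuals rather than a normality assumption, it remains valid when the innovations are non-Gaussian, which is the appeal of the residual bootstrap for drought indices.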
Hadronic equation of state in the statistical bootstrap model and linear graph theory
International Nuclear Information System (INIS)
Fre, P.; Page, R.
1976-01-01
Taking a statistical mechanical point of view, the statistical bootstrap model is discussed and, from a critical analysis of the bootstrap volume concept, a physical hypothesis is reached which leads immediately to the hadronic equation of state provided by the bootstrap integral equation. In this context the connection between the statistical bootstrap and the linear graph theory approach to interacting gases is also analyzed
Bootstrapping as a Resource Dependence Management Strategy and its Association with Startup Growth
T. VANACKER; S. MANIGART; M. MEULEMAN; L. SELS
2011-01-01
This paper studies the association between bootstrapping and startup growth. Bootstrapping reduces a startup’s dependence on financial investors, but may create new dependencies. Drawing upon resource dependence theory, we hypothesize that when bootstrapping does not create new strong dependencies it will benefit startup growth, especially when dependence from financial investors is high. However, when bootstrapping creates new strong dependencies it will constrain growth, especially when dep...
A Bootstrap Approach to Eigenvalue Correction
Hendrikse, A.J.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.
2009-01-01
Eigenvalue analysis is an important aspect in many data modeling methods. Unfortunately, the eigenvalues of the sample covariance matrix (sample eigenvalues) are biased estimates of the eigenvalues of the covariance matrix of the data generating process (population eigenvalues). We present a new
van de Water, S.; Kraan, A. C.; Breedveld, S.; Schillemans, W.; Teguh, D. N.; Kooy, H. M.; Madden, T. M.; Heijmen, B. J. M.; Hoogeman, M. S.
2013-10-01
This study investigates whether ‘pencil beam resampling’, i.e. iterative selection and weight optimization of randomly placed pencil beams (PBs), reduces optimization time and improves plan quality for multi-criteria optimization in intensity-modulated proton therapy, compared with traditional modes in which PBs are distributed over a regular grid. Resampling consisted of repeatedly performing: (1) random selection of candidate PBs from a very fine grid, (2) inverse multi-criteria optimization, and (3) exclusion of low-weight PBs. The newly selected candidate PBs were added to the PBs in the existing solution, causing the solution to improve with each iteration. Resampling and traditional regular grid planning were implemented into our in-house developed multi-criteria treatment planning system ‘Erasmus iCycle’. The system optimizes objectives successively according to their priorities as defined in the so-called ‘wish-list’. For five head-and-neck cancer patients and two PB widths (3 and 6 mm sigma at 230 MeV), treatment plans were generated using: (1) resampling, (2) anisotropic regular grids and (3) isotropic regular grids, while using varying sample sizes (resampling) or grid spacings (regular grid). We assessed differences in optimization time (for comparable plan quality) and in plan quality parameters (for comparable optimization time). Resampling reduced optimization time by a factor of 2.8 and 5.6 on average (7.8 and 17.0 at maximum) compared with the use of anisotropic and isotropic grids, respectively. Doses to organs-at-risk were generally reduced when using resampling, with median dose reductions ranging from 0.0 to 3.0 Gy (maximum: 14.3 Gy, relative: 0%-42%) compared with anisotropic grids and from -0.3 to 2.6 Gy (maximum: 11.4 Gy, relative: -4%-19%) compared with isotropic grids. Resampling was especially effective when using thin PBs (3 mm sigma). Resampling plans contained on average fewer PBs, energy layers and protons than anisotropic
RANDOM QUADRATIC-FORMS AND THE BOOTSTRAP FOR U-STATISTICS
DEHLING, H; MIKOSCH, T
We study the bootstrap distribution for U-statistics with special emphasis on the degenerate case. For the Efron bootstrap we give a short proof of the consistency using Mallows' metrics. We also study the i.i.d. weighted bootstrap Σ_{i<j} ξ_i ξ_j h(X_i, X_j), where (X_i) and (ξ_i) are two i.i.d. sequences,
Energy Technology Data Exchange (ETDEWEB)
Holoien, Thomas W.-S.; /Ohio State U., Dept. Astron. /Ohio State U., CCAPP /KIPAC, Menlo Park /SLAC; Marshall, Philip J.; Wechsler, Risa H.; /KIPAC, Menlo Park /SLAC
2017-05-11
We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
Embodied Language Learning and Cognitive Bootstrapping
DEFF Research Database (Denmark)
Lyon, C.E.; Nehaniv, C. L.; Saunders, Joe
2016-01-01
Co-development of action, conceptualization and social interaction mutually scaffold and support each other within a virtuous feedback cycle in the development of human language in children. Within this framework, the purpose of this article is to bring together diverse but complementary accounts...... of research methods that jointly contribute to our understanding of cognitive development and in particular, language acquisition in robots. Thus, we include research pertaining to developmental robotics, cognitive science, psychology, linguistics and neuroscience, as well as practical computer science...... the humanoid robot iCub are reported, while human learning relevant to developmental robotics has also contributed useful results. Disparate approaches are brought together via common underlying design principles. Without claiming to model human language acquisition directly, we are nonetheless inspired...
A proof of fulfillment of the strong bootstrap condition
International Nuclear Information System (INIS)
Fadin, V.S.; Papa, A.
2002-01-01
It is shown that the kernel of the BFKL equation for the octet color state of two Reggeized gluons satisfies the strong bootstrap condition in the next-to-leading order. This condition is much more restrictive than the one obtained from the requirement of the Reggeized form for the elastic scattering amplitudes in the next-to-leading approximation. It is necessary, however, for self-consistency of the assumption of the Reggeized form of the production amplitudes in multi-Regge kinematics, which are used in the derivation of the BFKL equation. The fulfillment of the strong bootstrap condition for the kernel opens the way to a rigorous proof of the BFKL equation in the next-to-leading approximation. (author)
Combined RF current drive and bootstrap current in tokamaks
International Nuclear Information System (INIS)
Schultz, S. D.; Bers, A.; Ram, A. K.
1999-01-01
By calculating radio frequency current drive (RFCD) and the bootstrap current in a consistent kinetic manner, we find synergistic effects in the total noninductive current density in tokamaks [1]. We include quasilinear diffusion in the Drift Kinetic Equation (DKE) in order to generalize neoclassical theory to highly non-Maxwellian electron distributions due to RFCD. The parallel plasma current is evaluated numerically with the help of the FASTEP Fokker-Planck code [2]. Current drive efficiency is found to be significantly affected by neoclassical effects, even in cases where only circulating electrons interact with the waves. Predictions of the current drive efficiency are made for lower hybrid and electron cyclotron wave current drive scenarios in the presence of bootstrap current
Bootstrap bound for conformal multi-flavor QCD on lattice
Energy Technology Data Exchange (ETDEWEB)
Nakayama, Yu [Department of Physics, Rikkyo University,Toshima, Tokyo 171-8501 (Japan); Kavli Institute for the Physics and Mathematics of the Universe (WPI), University of Tokyo,5-1-5 Kashiwanoha, Kashiwa, Chiba 277-8583 (Japan)
2016-07-08
The recent work by Iha et al. shows an upper bound on mass anomalous dimension γ{sub m} of multi-flavor massless QCD at the renormalization group fixed point from the conformal bootstrap in SU(N{sub F}){sub V} symmetric conformal field theories under the assumption that the fixed point is realizable with the lattice regularization based on staggered fermions. We show that the almost identical but slightly stronger bound applies to the regularization based on Wilson fermions (or domain wall fermions) by studying the conformal bootstrap in SU(N{sub f}){sub L}×SU(N{sub f}){sub R} symmetric conformal field theories. For N{sub f}=8, our bound implies γ{sub m}<1.31 to avoid dangerously irrelevant operators that are not compatible with the lattice symmetry.
VENUS+δf - A bootstrap current calculation module for 3D configurations
International Nuclear Information System (INIS)
Isaev, M.Yu.; Brunner, S.; Cooper, W.A.; Tran, T.M.; Bergmann, A.; Beidler, C.D.; Geiger, J.; Maassberg, H.; Nuehrenberg, J.; Schmidt, M.
2005-01-01
We present a new 3D code VENUS+δf for neoclassical transport calculations in nonaxisymmetric toroidal systems. Numerical drift orbits from the original VENUS code and the δf method for tokamak transport calculations are combined. The first results obtained with VENUS+δf are compared with neoclassical theory for different collisional regimes in a JT-60 tokamak test case with monoenergetic particles and with a Maxwellian distribution. Benchmarks with DKES code results for the bootstrap current in the W7X configuration as well as further VENUS+δf developments are discussed. (author)
Noncritical String Liouville Theory and Geometric Bootstrap Hypothesis
Hadasz, Leszek; Jaskólski, Zbigniew
The applications of the existing Liouville theories for the description of the longitudinal dynamics of noncritical Nambu-Goto string are analyzed. We show that the recently developed DOZZ solution to the Liouville theory leads to the cut singularities in tree string amplitudes. We propose a new version of the Polyakov geometric approach to Liouville theory and formulate its basic consistency condition — the geometric bootstrap equation. Also in this approach the tree amplitudes develop cut singularities.
Higgs Critical Exponents and Conformal Bootstrap in Four Dimensions
DEFF Research Database (Denmark)
Antipin, Oleg; Mølgaard, Esben; Sannino, Francesco
2015-01-01
We investigate relevant properties of composite operators emerging in nonsupersymmetric, four-dimensional gauge-Yukawa theories with interacting conformal fixed points within a precise framework. The theories investigated in this work are structurally similar to the standard model of particle int...... bootstrap results are then compared to precise four dimensional conformal field theoretical results. To accomplish this, it was necessary to calculate explicitly the crossing symmetry relations for the global symmetry group SU($N$)$\\times$SU($N$)....
Truncatable bootstrap equations in algebraic form and critical surface exponents
Energy Technology Data Exchange (ETDEWEB)
Gliozzi, Ferdinando [Dipartimento di Fisica, Università di Torino andIstituto Nazionale di Fisica Nucleare - sezione di Torino,Via P. Giuria 1, Torino, I-10125 (Italy)
2016-10-10
We describe examples of drastic truncations of conformal bootstrap equations encoding much more information than that obtained by a direct numerical approach. A three-term truncation of the four point function of a free scalar in any space dimensions provides algebraic identities among conformal block derivatives which generate the exact spectrum of the infinitely many primary operators contributing to it. In boundary conformal field theories, we point out that the appearance of free parameters in the solutions of bootstrap equations is not an artifact of truncations, rather it reflects a physical property of permeable conformal interfaces which are described by the same equations. Surface transitions correspond to isolated points in the parameter space. We are able to locate them in the case of 3d Ising model, thanks to a useful algebraic form of 3d boundary bootstrap equations. It turns out that the low-lying spectra of the surface operators in the ordinary and the special transitions of 3d Ising model form two different solutions of the same polynomial equation. Their interplay yields an estimate of the surface renormalization group exponents, y{sub h}=0.72558(18) for the ordinary universality class and y{sub h}=1.646(2) for the special universality class, which compare well with the most recent Monte Carlo calculations. Estimates of other surface exponents as well as OPE coefficients are also obtained.
Sabourin, Jeremy; Nobel, Andrew B.; Valdar, William
2014-01-01
Genomewide association studies sometimes identify loci at which both the number and identities of the underlying causal variants are ambiguous. In such cases, statistical methods that model effects of multiple SNPs simultaneously can help disentangle the observed patterns of association and provide information about how those SNPs could be prioritized for follow-up studies. Current multi-SNP methods, however, tend to assume that SNP effects are well captured by additive genetics; yet when genetic dominance is present, this assumption translates to reduced power and faulty prioritizations. We describe a statistical procedure for prioritizing SNPs at GWAS loci that efficiently models both additive and dominance effects. Our method, LLARRMA-dawg, combines a group LASSO procedure for sparse modeling of multiple SNP effects with a resampling procedure based on fractional observation weights; it estimates for each SNP the robustness of association with the phenotype both to sampling variation and to competing explanations from other SNPs. In producing a SNP prioritization that best identifies underlying true signals, we show that: our method easily outperforms a single marker analysis; when additive-only signals are present, our joint model for additive and dominance is equivalent to or only slightly less powerful than modeling additive-only effects; and, when dominance signals are present, even in combination with substantial additive effects, our joint model is unequivocally more powerful than a model assuming additivity. We also describe how performance can be improved through calibrated randomized penalization, and discuss how dominance in ungenotyped SNPs can be incorporated through either heterozygote dosage or multiple imputation. PMID:25417853
Interval estimation methods of the mean in small sample situation and the results' comparison
International Nuclear Information System (INIS)
Wu Changli; Guo Chunying; Jiang Meng; Lin Yuangen
2009-01-01
The methods of the sample mean's interval estimation, namely the classical method, the Bootstrap method, the Bayesian Bootstrap method, the Jackknife method and the spread method of the Empirical Characteristic distribution function are described. Numerical calculation on the samples' mean intervals is carried out where the numbers of the samples are 4, 5, 6 respectively. The results indicate the Bootstrap method and the Bayesian Bootstrap method are much more appropriate than others in small sample situation. (authors)
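A minimal Python sketch of the (non-Bayesian) bootstrap interval for a small-sample mean, one of the methods compared above; the percentile variant is shown and names are illustrative.

```python
import numpy as np

def bootstrap_mean_ci(sample, n_boot=10000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the sample mean.
    Minimal sketch; works for very small samples such as n = 4-6."""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample, dtype=float)
    n = len(sample)
    # Resample with replacement and record the mean of each replicate.
    idx = rng.integers(0, n, size=(n_boot, n))
    boot_means = sample[idx].mean(axis=1)
    lo, hi = np.quantile(boot_means, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Example: a sample of n = 5 observations.
lo, hi = bootstrap_mean_ci([4.2, 5.1, 3.8, 6.0, 4.9])
```

The Bayesian bootstrap differs only in replacing the integer resampling counts with Dirichlet-distributed observation weights.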
A Bootstrap Metropolis-Hastings Algorithm for Bayesian Analysis of Big Data.
Liang, Faming; Kim, Jinsu; Song, Qifan
2016-01-01
Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structures. However, their computer-intensive nature, which typically require a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for how to tame powerful MCMC methods to be used for big data analysis; that is to replace the full data log-likelihood by a Monte Carlo average of the log-likelihoods that are calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset in iterations, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining with reversible jump MCMC and simulated annealing, respectively.
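The core BMH move, replacing the full-data log-likelihood in the Metropolis-Hastings acceptance ratio by a Monte Carlo average of log-likelihoods over bootstrap samples, can be sketched for a toy normal-mean problem. This is a hedged reading of the idea, not the authors' implementation; all names and tuning values are illustrative.

```python
import numpy as np

def bmh_sample(data, n_iter=3000, k=5, sigma_prop=0.3, seed=1):
    """Toy bootstrap Metropolis-Hastings sampler for the mean of
    unit-variance normal data with a flat prior. The full-data
    log-likelihood is replaced by the average log-likelihood over
    k bootstrap samples (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    n = len(data)
    # Draw the k bootstrap samples once, up front; in BMH their
    # log-likelihoods could be evaluated in parallel.
    boots = [rng.choice(data, size=n, replace=True) for _ in range(k)]

    def avg_loglik(theta):
        # Monte Carlo average of the bootstrap-sample log-likelihoods.
        return np.mean([-0.5 * np.sum((b - theta) ** 2) for b in boots])

    theta, chain = data.mean(), []
    for _ in range(n_iter):
        prop = theta + sigma_prop * rng.standard_normal()
        # Standard MH accept/reject with the averaged log-likelihood.
        if np.log(rng.uniform()) < avg_loglik(prop) - avg_loglik(theta):
            theta = prop
        chain.append(theta)
    return np.array(chain)
```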
Diks, C.; Fang, H.
2017-01-01
The information-theoretical concept transfer entropy is an ideal measure for detecting conditional independence, or Granger causality in a time series setting. The recent literature indeed witnesses an increased interest in applications of entropy-based tests in this direction. However, those tests
Analyzing Repeated Measures Marginal Models on Sample Surveys with Resampling Methods
Directory of Open Access Journals (Sweden)
James D. Knoke
2005-12-01
Packaged statistical software for analyzing categorical, repeated measures marginal models on sample survey data with binary covariates does not appear to be available. Consequently, this report describes a customized SAS program which accomplishes such an analysis on survey data with jackknifed replicate weights for which the primary sampling unit information has been suppressed for respondent confidentiality. First, the program employs the Macro Language and the Output Delivery System (ODS) to estimate the means and covariances of indicator variables for the response variables, taking the design into account. Then, it uses PROC CATMOD and ODS, ignoring the survey design, to obtain the design matrix and hypothesis test specifications. Finally, it enters these results into another run of CATMOD, which performs automated direct input of the survey design specifications and accomplishes the appropriate analysis. This customized SAS program can be employed, with minor editing, to analyze general categorical, repeated measures marginal models on sample surveys with replicate weights. The results of our analysis accounting for the survey design are then compared to the results of two alternate analyses of the same data. This comparison confirms that such alternate analyses, which do not properly account for the design, do not produce useful results.
Methods of soil resampling to monitor changes in the chemical concentrations of forest soils
Gregory B. Lawrence; Ivan J. Fernandez; Paul W. Hazlett; Scott W. Bailey; Donald S. Ross; Thomas R. Villars; Angelica Quintana; Rock Ouimet; Michael R. McHale; Chris E. Johnson; Russell D. Briggs; Robert A. Colter; Jason Siemion; Olivia L. Bartlett; Olga Vargas; Michael R. Antidormi; Mary M. Koppers
2016-01-01
Recent soils research has shown that important chemical soil characteristics can change in less than a decade, often the result of broad environmental changes. Repeated sampling to monitor these changes in forest soils is a relatively new practice that is not well documented in the literature and has only recently been broadly embraced by the scientific community. The...
Directory of Open Access Journals (Sweden)
Gu Xun
2007-03-01
Background Phylogenetically related miRNAs (miRNA families) convey important information about the function and evolution of miRNAs. Due to the special sequence features of miRNAs, pair-wise sequence identity between miRNA precursors alone is often inadequate for unequivocally judging the phylogenetic relationships between miRNAs. Most of the current methods for miRNA classification rely heavily on manual inspection and lack measurements of the reliability of the results. Results In this study, we designed an analysis pipeline (the Phylogeny-Bootstrap-Cluster (PBC) pipeline) to identify miRNA families based on branch stability in the bootstrap trees derived from overlapping genome-wide miRNA sequence sets. We tested the PBC analysis pipeline with the miRNAs from six animal species, H. sapiens, M. musculus, G. gallus, D. rerio, D. melanogaster, and C. elegans. The resulting classification was compared with the miRNA families defined in miRBase. The two classifications were largely consistent. Conclusion The PBC analysis pipeline is an efficient method for classifying large numbers of heterogeneous miRNA sequences. It requires minimum human involvement and provides measurements of the reliability of the classification results.
Check of the bootstrap conditions for the gluon Reggeization
International Nuclear Information System (INIS)
Papa, A.
2000-01-01
The property of gluon Reggeization plays an essential role in the derivation of the Balitsky-Fadin-Kuraev-Lipatov (BFKL) equation for the cross sections at high energy √s in perturbative QCD. This property has been proved to all orders of perturbation theory in the leading logarithmic approximation and it is assumed to be valid also in the next-to-leading logarithmic approximation, where it has been checked only to the first three orders of perturbation theory. From s-channel unitarity, however, very stringent 'bootstrap' conditions can be derived which, if fulfilled, leave no doubts that gluon Reggeization holds
A SQUID Bootstrap Circuit with a Large Parameter Tolerance
International Nuclear Information System (INIS)
Zhang Guo-Feng; Kong Xiang-Yan; Xie Xiao-Ming; Zhang Yi; Krause Hans-Joachim; Offenhäusser Andreas
2013-01-01
The voltage-biased SQUID bootstrap circuit (SBC) was recently introduced as an effective means to reduce the preamplifier noise contribution. We analyze the tolerances of the SBC noise suppression performance to spreads in SQUID and SBC circuit parameters. It is found that the tolerance to spread, mainly caused by the integrated circuit fabrication process, could be extended by a one-time adjustable current feedback. A helium-cooled niobium SQUID with a loop inductance of 350 pH is employed to experimentally verify the analysis. From this work, design criteria for fully integrated SBC devices with a high yield can be derived
A Bootstrap Approach to Computing Uncertainty in Inferred Oil and Gas Reserve Estimates
International Nuclear Information System (INIS)
Attanasi, Emil D.; Coburn, Timothy C.
2004-01-01
This study develops confidence intervals for estimates of inferred oil and gas reserves based on bootstrap procedures. Inferred reserves are expected additions to proved reserves in previously discovered conventional oil and gas fields. Estimates of inferred reserves accounted for 65% of the total oil and 34% of the total gas assessed in the U.S. Geological Survey's 1995 National Assessment of oil and gas in US onshore and State offshore areas. When the same computational methods used in the 1995 Assessment are applied to more recent data, the 80-year (from 1997 through 2076) inferred reserve estimates for pre-1997 discoveries located in the lower 48 onshore and state offshore areas amounted to a total of 39.7 billion barrels of oil (BBO) and 293 trillion cubic feet (TCF) of gas. The 90% confidence interval about the oil estimate derived from the bootstrap approach is 22.4 BBO to 69.5 BBO. The comparable 90% confidence interval for the inferred gas reserve estimate is 217 TCF to 413 TCF. The 90% confidence interval describes the uncertainty that should be attached to the estimates. It also provides a basis for developing scenarios to explore the implications for energy policy analysis
Off-critical statistical models: factorized scattering theories and bootstrap program
International Nuclear Information System (INIS)
Mussardo, G.
1992-01-01
We analyze those integrable statistical systems which originate from some relevant perturbations of the minimal models of conformal field theories. When only massive excitations are present, the systems can be efficiently characterized in terms of the relativistic scattering data. We review the general properties of the factorizable S-matrix in two dimensions with particular emphasis on the bootstrap principle. The classification program of the allowed spins of conserved currents and of the non-degenerate S-matrices is discussed and illustrated by means of some significant examples. The scattering theories of several massive perturbations of the minimal models are fully discussed. Among them are the Ising model, the tricritical Ising model, the Potts models, the series of the non-unitary minimal models M{sub 2,2n+3}, the non-unitary model M{sub 3,5} and the scaling limit of the polymer system. The ultraviolet limit of these massive integrable theories can be exploited by the thermodynamic Bethe ansatz; in particular, the central charge of the original conformal theories can be recovered from the scattering data. We also consider the numerical method based on the so-called truncated conformal space approach, which confirms the theoretical results and allows a direct measurement of the scattering data, i.e. the masses and the S-matrix of the particles in bootstrap interaction. The problem of computing the off-critical correlation functions is discussed in terms of the form-factor approach
On Current Drive and Wave Induced Bootstrap Current in Toroidal Plasmas
International Nuclear Information System (INIS)
Hellsten, T.; Johnson, T.
2008-01-01
A comprehensive treatment of wave-particle interactions in toroidal plasmas including collisional relaxation, applicable to heating or anomalous wave-induced transport, has been obtained by using Monte Carlo operators satisfying quasi-neutrality. This approach enables a self-consistent treatment of wave-particle interactions applicable to the banana regime in neoclassical theory. It allows an extension into a regime with large temperature and density gradients, losses and transport of particles by wave-particle interactions, making the method applicable to transport barriers. It is found that at large gradients the relationship between radial electric field, parallel velocity, temperature and density gradient in the neoclassical theory is modified, such that the coefficient in front of the logarithmic ion temperature gradient, which in standard neoclassical theory is small and counteracts the electric field caused by the density gradient, now changes sign and contributes to the build-up of the radial electric field. The possibility of driving current by absorbing the waves on trapped particles has been studied, as well as how wave-particle interactions affect the bootstrap current. Two new current drive mechanisms are studied: current drive by wave-induced bootstrap current and selective detrapping into passing orbits by directed waves.
Im, Subin; Min, Soonhong
2013-04-01
Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants--members of the Arkansas Household Research Panel, with middle SES and average age of 55.6 yr. (SD = 13.9)--showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model in non-normality conditions.
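The bias-corrected bootstrap interval used above to guard parameter estimates against non-normality can be sketched for a generic statistic. This is a minimal sketch of the bias-corrected (BC) percentile method without the acceleration term (i.e. BC rather than BCa); function and variable names are illustrative.

```python
import numpy as np
from statistics import NormalDist

def bc_bootstrap_ci(sample, stat, n_boot=5000, alpha=0.05, seed=0):
    """Bias-corrected (BC) percentile bootstrap confidence interval
    for an arbitrary statistic. Minimal sketch; no acceleration term."""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample, dtype=float)
    theta_hat = stat(sample)
    boot = np.array([stat(rng.choice(sample, size=len(sample), replace=True))
                     for _ in range(n_boot)])
    nd = NormalDist()
    # Bias-correction factor from the fraction of replicates below the estimate.
    z0 = nd.inv_cdf(float(np.mean(boot < theta_hat)))
    z_alpha = nd.inv_cdf(alpha / 2)
    # Shifted percentile endpoints of the bootstrap distribution.
    lo_p = nd.cdf(2 * z0 + z_alpha)
    hi_p = nd.cdf(2 * z0 - z_alpha)
    return np.quantile(boot, [lo_p, hi_p])
```

For a symmetric bootstrap distribution z0 is near zero and the interval reduces to the plain percentile interval.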
Directory of Open Access Journals (Sweden)
Mohammad Reza Marami Milani
2015-09-01
This study analyzes the linear relationship between climate variables and milk components in Iran by applying bootstrapping to include and assess the uncertainty. The climate parameters, Temperature Humidity Index (THI) and Equivalent Temperature Index (ETI), are computed from the NASA-Modern Era Retrospective-Analysis for Research and Applications (NASA-MERRA) reanalysis (2002–2010). Milk data for fat, protein (measured on a fresh matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique. This method is applied to the original time series, generating statistically equivalent surrogate samples. As a result, despite the short time series and the related uncertainties, an interesting behavior of the relationships between milk composition and the climate parameters is visible. During spring only, a weak dependency of milk yield on climate variations is apparent, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature and THI. We suggest this methodology for studies of the impacts of climate change on agriculture, environment, and food, particularly when only short-term data are available.
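The bootstrap confidence intervals for a regression relationship described in this entry are commonly built by resampling (x, y) cases; a minimal Python sketch follows. The variable names are illustrative and not taken from the study.

```python
import numpy as np

def slope_ci_pairs_bootstrap(x, y, n_boot=5000, alpha=0.05, seed=0):
    """Pairs (case-resampling) bootstrap confidence interval for the
    slope of a simple linear regression, e.g. milk component vs. a
    climate index. Minimal sketch with illustrative names."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)           # resample (x, y) pairs
        slopes[b] = np.polyfit(x[idx], y[idx], 1)[0]  # OLS slope of replicate
    return np.quantile(slopes, [alpha / 2, 1 - alpha / 2])
```

Because pairs are resampled jointly, the interval reflects sampling variability in both the climate index and the milk variable without assuming normal residuals.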
Financial bootstrapping use in new family ventures and the impact on venture growth
Helleboogh, David; LAVEREN, Eddy; LYBAERT, Nadine
2010-01-01
This paper contributes to the general knowledge of bootstrap financing among new family ventures in two ways. Firstly, this research reveals which human capital characteristics of the owner-manager have an impact on financial bootstrapping use. The empirical results indicate that the use of bootstrapping techniques does not depend upon the family business founder's education, but that it is a skill which is absorbed from self-employed parents or during the founder's prior work and management...
Financial bootstrapping use in family ventures and the impact on start-up growth
Helleboogh, D.; Laveren, E.; LYBAERT, Nadine
2010-01-01
This paper contributes to the general knowledge of bootstrap financing among new family ventures in two ways. Firstly, this research reveals which human capital characteristics of the owner-manager have an impact on financial bootstrapping use. The empirical results indicate that the use of bootstrapping techniques does not depend upon the family business founder's education, but that it is a skill which is absorbed from self-employed parents or during the founder's prior work and management e...
Bootstrapping the (A1, A2) Argyres-Douglas theory
Cornagliotto, Martina; Lemos, Madalena; Liendo, Pedro
2018-03-01
We apply bootstrap techniques in order to constrain the CFT data of the (A1, A2) Argyres-Douglas theory, which is arguably the simplest of the Argyres-Douglas models. We study the four-point function of its single Coulomb branch chiral ring generator and put numerical bounds on the low-lying spectrum of the theory. Of particular interest is an infinite family of semi-short multiplets labeled by the spin ℓ. Although the conformal dimensions of these multiplets are protected, their three-point functions are not. Using the numerical bootstrap we impose rigorous upper and lower bounds on their values for spins up to ℓ = 20. Through a recently obtained inversion formula, we also estimate them for sufficiently large ℓ, and the comparison of both approaches shows consistent results. We also give a rigorous numerical range for the OPE coefficient of the next operator in the chiral ring, and estimates for the dimension of the first R-symmetry neutral non-protected multiplet for small spin.
Interaction of bootstrap-current-driven magnetic islands
International Nuclear Information System (INIS)
Hegna, C.C.; Callen, J.D.
1991-10-01
The formation and interaction of fluctuating neoclassical pressure gradient driven magnetic islands is examined. The interaction of magnetic islands produces a stochastic region around the separatrices of the islands. This interaction causes the island pressure profile to be broadened, reducing the island bootstrap current and drive for the magnetic island. A model is presented that describes the magnetic topology as a bath of interacting magnetic islands with low to medium poloidal mode number (m ≃ 3-30). The islands grow by the bootstrap current effect and damp due to the flattening of the pressure profile near the island separatrix caused by the interaction of the magnetic islands. The effect of this sporadic growth and decay of the islands (''magnetic bubbling'') is not normally addressed in theories of plasma transport due to magnetic fluctuations. The nature of the transport differs from statistical approaches to magnetic turbulence since the radial step size of the plasma transport is now given by the characteristic island width. This model suggests that tokamak experiments have relatively short-lived, coherent, long wavelength magnetic oscillations present in the steep pressure-gradient regions of the plasma. 42 refs
Bootstrap equations for N=4 SYM with defects
Energy Technology Data Exchange (ETDEWEB)
Liendo, Pedro [IMIP, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Meneghelli, Carlo [Simons Center for Geometry and Physics, Stony Brook University,Stony Brook, NY 11794-3636 (United States)
2017-01-27
This paper focuses on the analysis of 4d N=4 superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will obtain the Ward identities associated to two-point functions of (1/2)-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to 4d N=4 superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: 4d N=4 superconformal theories with a line defect, 3d N=4 superconformal theories with no defect, and OSP(4{sup ∗}|4) superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootstrap equations become polynomial constraints on the CFT data. We derive these truncated equations and initiate the study of their solutions.
Bootstrapping non-commutative gauge theories from L∞ algebras
Blumenhagen, Ralph; Brunner, Ilka; Kupriyanov, Vladislav; Lüst, Dieter
2018-05-01
Non-commutative gauge theories with a non-constant NC-parameter are investigated. As a novel approach, we propose that such theories should admit an underlying L∞ algebra, that governs not only the action of the symmetries but also the dynamics of the theory. Our approach is well motivated from string theory. We recall that such field theories arise in the context of branes in WZW models and briefly comment on its appearance for integrable deformations of AdS5 sigma models. For the SU(2) WZW model, we show that the earlier proposed matrix valued gauge theory on the fuzzy 2-sphere can be bootstrapped via an L∞ algebra. We then apply this approach to the construction of non-commutative Chern-Simons and Yang-Mills theories on flat and curved backgrounds with non-constant NC-structure. More concretely, up to the second order, we demonstrate how derivative and curvature corrections to the equations of motion can be bootstrapped in an algebraic way from the L∞ algebra. The appearance of a non-trivial A∞ algebra is discussed, as well.
Bootstrap equations for N=4 SYM with defects
International Nuclear Information System (INIS)
Liendo, Pedro; Meneghelli, Carlo
2017-01-01
This paper focuses on the analysis of 4d N=4 superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will obtain the Ward identities associated to two-point functions of (1/2)-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to 4d N=4 superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: 4d N=4 superconformal theories with a line defect, 3d N=4 superconformal theories with no defect, and OSP(4*|4) superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootstrap equations become polynomial constraints on the CFT data. We derive these truncated equations and initiate the study of their solutions.
Thermal energy and bootstrap current in fusion reactor plasmas
International Nuclear Information System (INIS)
Becker, G.
1993-01-01
For DT fusion reactors with prescribed alpha particle heating power P_α, plasma volume V and burn temperature T_i ≳ 10 keV, specific relations for the thermal energy content, bootstrap current, central plasma pressure and other quantities are derived. It is shown that imposing P_α and V makes these relations independent of the magnitudes of the density and temperature, i.e. they only depend on P_α, V and shape factors or profile parameters. For model density and temperature profiles, analytic expressions for these shape factors and for the factor C_bs in the bootstrap current formula I_bs ∼ C_bs (a/R)^{1/2} β_p I_p are given. In the design of next-step devices and fusion reactors, the fusion power is a fixed quantity. Prescribing the alpha particle heating power and plasma volume results in specific relations which can be helpful for interpreting computer simulations and for the design of fusion reactors. (author) 5 refs
Application of Robust Regression and Bootstrap in Productivity Analysis of GERD Variable in EU27
Directory of Open Access Journals (Sweden)
Dagmar Blatná
2014-06-01
Full Text Available The GERD is one of Europe 2020 headline indicators being tracked within the Europe 2020 strategy. The headline indicator is the 3% target for the GERD to be reached within the EU by 2020. Eurostat defines “GERD” as total gross domestic expenditure on research and experimental development in a percentage of GDP. GERD depends on numerous factors of a general economic background, namely of employment, innovation and research, science and technology. The values of these indicators vary among the European countries, and consequently the occurrence of outliers can be anticipated in corresponding analyses. In such a case, a classical statistical approach – the least squares method – can be highly unreliable, the robust regression methods representing an acceptable and useful tool. The aim of the present paper is to demonstrate the advantages of robust regression and applicability of the bootstrap approach in regression based on both classical and robust methods.
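The bootstrap-in-regression idea the abstract above advocates can be sketched with a minimal pairs (case) bootstrap for a least-squares slope. This is an illustrative sketch with made-up data, not the paper's robust-regression pipeline; all names and values are hypothetical:

```python
import random

def ols_slope(xs, ys):
    # least-squares slope for a single predictor
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

def bootstrap_slope_ci(xs, ys, reps=2000, alpha=0.05, seed=1):
    # pairs (case) bootstrap: resample (x, y) pairs with replacement,
    # refit the slope each time, and take percentile limits
    rng = random.Random(seed)
    n = len(xs)
    slopes = []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)]
        bx = [xs[i] for i in idx]
        if len(set(bx)) < 2:  # degenerate resample: slope undefined, skip it
            continue
        slopes.append(ols_slope(bx, [ys[i] for i in idx]))
    slopes.sort()
    m = len(slopes)
    return slopes[int(alpha / 2 * m)], slopes[int((1 - alpha / 2) * m) - 1]

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 13.9, 16.1]
lo, hi = bootstrap_slope_ci(xs, ys)
print(lo < 2.0 < hi)  # the underlying slope (about 2) should fall inside the interval
```

The percentile interval requires no distributional assumption on the residuals, which is why bootstrapping pairs is attractive when outliers make least-squares standard errors unreliable.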
DMSP SSM/I Daily and Monthly Polar Gridded Bootstrap Sea Ice Concentrations
National Aeronautics and Space Administration — DMSP SSM/I Daily and Monthly Polar Gridded Bootstrap Sea Ice Concentrations in polar stereographic projection currently include Defense Meteorological Satellite...
Energy confinement of tokamak plasma with consideration of bootstrap current effect
International Nuclear Information System (INIS)
Yuan Ying; Gao Qingdi
1992-01-01
Based on the η_i-mode induced anomalous transport model of Lee et al., the energy confinement of tokamak plasmas with auxiliary heating is investigated with consideration of the bootstrap current effect. The results indicate that the energy confinement time increases with plasma current and tokamak major radius, and decreases with heating power, toroidal field and minor radius. This is in reasonable agreement with the Kaye-Goldston empirical scaling law. The bootstrap current always leads to an improvement of energy confinement and a contraction of the inversion radius. When γ, the ratio of the bootstrap current to the total plasma current, is small, the part of the energy confinement time contributed by the bootstrap current is about γ/2.
Inferring microevolution from museum collections and resampling: lessons learned from Cepaea
Directory of Open Access Journals (Sweden)
Małgorzata Ożgo
2017-10-01
Full Text Available Natural history collections are an important and largely untapped source of long-term data on evolutionary changes in wild populations. Here, we utilize three large geo-referenced sets of samples of the common European land-snail Cepaea nemoralis stored in the collection of Naturalis Biodiversity Center in Leiden, the Netherlands. Resampling of these populations allowed us to gain insight into changes occurring over 95, 69, and 50 years. Cepaea nemoralis is polymorphic for the colour and banding of the shell; the mode of inheritance of these patterns is known, and the polymorphism is under both thermal and predatory selection. At two sites the general direction of changes was towards lighter shells (yellow and less heavily banded, which is consistent with predictions based on on-going climatic change. At one site no directional changes were detected. At all sites there were significant shifts in morph frequencies between years, and our study contributes to the recognition that short-term changes in the states of populations often exceed long-term trends. Our interpretation was limited by the few time points available in the studied collections. We therefore stress the need for natural history collections to routinely collect large samples of common species, to allow much more reliable hind-casting of evolutionary responses to environmental change.
MapReduce particle filtering with exact resampling and deterministic runtime
Thiyagalingam, Jeyarajan; Kekempanos, Lykourgos; Maskell, Simon
2017-12-01
Particle filtering is a numerical Bayesian technique that has great potential for solving sequential estimation problems involving non-linear and non-Gaussian models. Since the estimation accuracy achieved by particle filters improves as the number of particles increases, it is natural to consider as many particles as possible. MapReduce is a generic programming model that makes it possible to scale a wide variety of algorithms to Big data. However, despite the application of particle filters across many domains, little attention has been devoted to implementing particle filters using MapReduce. In this paper, we describe an implementation of a particle filter using MapReduce. We focus on the component that would otherwise be a bottleneck to parallel execution, the resampling component. We devise a new implementation of this component, which requires no approximations, has O(N) spatial complexity and deterministic O((log N)^2) time complexity. Results demonstrate the utility of this new component and culminate in consideration of a particle filter with 2^24 particles being distributed across 512 processor cores.
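The deterministic-runtime resampling idea above can be illustrated with the classic systematic resampler, which draws a single uniform offset and makes one O(N) pass over the cumulative weights. This is a simplified single-machine sketch, not the paper's MapReduce implementation:

```python
def systematic_resample(weights, u0):
    # Systematic resampling: one uniform offset u0 in [0, 1), then a
    # single O(N) sweep over the cumulative weights. The runtime is
    # deterministic, unlike multinomial resampling.
    n = len(weights)
    total = sum(weights)
    cdf, acc = [], 0.0
    for w in weights:
        acc += w / total
        cdf.append(acc)
    indices, i = [], 0
    for k in range(n):
        u = (k + u0) / n  # evenly spaced probe points
        while cdf[i] < u:
            i += 1
        indices.append(i)
    return indices

# the heavy particle (weight 0.7) receives three of the four offspring
print(systematic_resample([0.1, 0.1, 0.7, 0.1], u0=0.5))
```

Because every particle's offspring count is fixed once `u0` is drawn, the work per particle is bounded, which is the property a parallel (e.g. MapReduce) implementation needs to guarantee its runtime.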
Integrating Multiple Microarray Data for Cancer Pathway Analysis Using Bootstrapping K-S Test
Directory of Open Access Journals (Sweden)
Bing Han
2009-01-01
Full Text Available Previous applications of microarray technology for cancer research have mostly focused on identifying genes that are differentially expressed between a particular cancer and normal cells. In a biological system, genes perform different molecular functions and regulate various biological processes via interactions with other genes, thus forming a variety of complex networks. Therefore, it is critical to understand the relationships (e.g., interactions) between genes across different types of cancer in order to gain insights into the molecular mechanisms of cancer. Here we propose an integrative method based on the bootstrapping Kolmogorov-Smirnov test and a large set of microarray data produced with various types of cancer to discover common molecular changes in cells from normal state to cancerous state. We evaluate our method using three key pathways related to cancer and demonstrate that it is capable of finding meaningful alterations in gene relations.
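A bootstrapped two-sample Kolmogorov-Smirnov test of the kind named above can be sketched as follows, resampling both groups from the pooled data under the null hypothesis. The data here are hypothetical toy values, not the paper's microarray expression sets:

```python
import bisect
import random

def ks_stat(a, b):
    # two-sample KS statistic: largest gap between the empirical CDFs
    sa, sb = sorted(a), sorted(b)
    ecdf = lambda s, x: bisect.bisect_right(s, x) / len(s)
    return max(abs(ecdf(sa, x) - ecdf(sb, x)) for x in sa + sb)

def bootstrap_ks_pvalue(a, b, reps=1000, seed=0):
    # Null distribution by bootstrap: draw both groups with replacement
    # from the pooled sample, i.e. as if both came from one distribution,
    # and count how often the resampled statistic reaches the observed one.
    rng = random.Random(seed)
    pooled = list(a) + list(b)
    obs = ks_stat(a, b)
    hits = sum(
        ks_stat([rng.choice(pooled) for _ in a],
                [rng.choice(pooled) for _ in b]) >= obs
        for _ in range(reps)
    )
    return hits / reps

p = bootstrap_ks_pvalue([1, 2, 3, 4, 5, 6, 7, 8], [11, 12, 13, 14, 15, 16, 17, 18])
print(p)  # a small p-value: the two samples are clearly separated
```

The attraction of the bootstrap here is that the null distribution is built from the data themselves, so the test needs no asymptotic approximation for small group sizes.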
Current drive and sustain experiments with the bootstrap current in JT-60
International Nuclear Information System (INIS)
Kikuchi, Mitsuru; Azumi, Masafumi; Tani, Keiji; Tsuji, Shunji; Kubo, Hirotaka
1989-11-01
The current drive and sustain experiments with the neoclassical bootstrap current are performed in the JT-60 tokamak. It is shown that up to 80% of the total plasma current is driven by the bootstrap current in the extremely high β_p regime (β_p = 3.2), and the current drive product I_p(bootstrap) n̄_e R_p of up to 4.4 x 10^19 MA m^-2 has been attained with the bootstrap current. The experimental resistive loop voltages are compared with calculations using the neoclassical resistivity with and without the bootstrap current, and the Spitzer resistivity, for a wide range of plasma current (I_p = 0.5-2 MA) and poloidal beta (β_p = 0.1-3.2). The calculated resistive loop voltage is consistent with the neoclassical prediction including the bootstrap current. Current sustain with the bootstrap current is tested by terminating the I_p feedback control during high power neutral beam heating. An enhancement of the L/R decay time beyond that expected from the plasma resistivity with measured T_e and Z_eff has been confirmed experimentally, supporting the presence of a large non-inductive current in the plasma, consistent with the neoclassical prediction. A new technique to calculate the bootstrap current in the multi-collisionality regime for finite aspect ratio tokamaks has been developed. The neoclassical bootstrap current is calculated directly through the force balance equations between viscous and friction forces according to the Hirshman-Sigmar theory. The bootstrap current driven by the fast ion component is also included. Ballooning stability of the high β_p plasma is analyzed using current profiles including the bootstrap current. The plasma pressure is close to the ballooning limit in high β_p discharges. (author)
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped narrow the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals, so as to better serve efficiency improvement and related decision making. © The Author(s) 2015.
DEFF Research Database (Denmark)
Thorndahl, Søren; Korup Andersen, Aske; Larsen, Anders Badsberg
2017-01-01
Continuous and long rainfall series are a necessity in rural and urban hydrology for analysis and design purposes. Local historical point rainfall series often cover several decades, which makes it possible to estimate rainfall means at different timescales, and to assess return periods of extreme...... includes climate changes projected to a specific future period. This paper presents a framework for resampling of historical point rainfall series in order to generate synthetic rainfall series, which has the same statistical properties as an original series. Using a number of key target predictions...... for the future climate, such as winter and summer precipitation, and representation of extreme events, the resampled historical series are projected to represent rainfall properties in a future climate. Climate-projected rainfall series are simulated by brute force randomization of model parameters, which leads...
Integral equations of hadronic correlation functions: a functional-bootstrap approach
Manesis, E K
1974-01-01
A reasonable 'microscopic' foundation of the Feynman hadron-liquid analogy is offered, based on a class of models for hadron production. In an external field formalism, the equivalence (complementarity) of the exclusive and inclusive descriptions of hadronic reactions is specifically expressed in a functional-bootstrap form, and integral equations between inclusive and exclusive correlation functions are derived. Using the latest CERN-ISR data on the two-pion inclusive correlation function, and assuming rapidity translational invariance for the exclusive one, the simplest integral equation is solved in the 'central region' and an exclusive correlation length in rapidity predicted. An explanation is also offered for the unexpected similarity observed between π⁺π⁻ and π⁻π⁻ inclusive correlations. (31 refs).
Performance of Bootstrap MCEWMA: Study case of Sukuk Musyarakah data
Safiih, L. Muhamad; Hila, Z. Nurul
2014-07-01
Sukuk Musyarakah is one of several Islamic bond investment instruments in Malaysia; this sukuk is essentially a restructuring of the conventional bond into a Syariah-compliant bond. Syariah compliance rests on the prohibition of any element of usury, i.e. of a benefit or fixed return. Accordingly, daily sukuk returns are non-fixed, and statistically the returns form a time series that is dependent and autocorrelated. Such data pose a difficult problem in both statistics and finance. Sukuk returns can be characterized statistically by their volatility: high volatility reflects dramatic price changes and marks the bond as risky. Nevertheless, this problem has received far less attention from researchers than conventional bonds have. In Statistical Process Control (SPC), the MCEWMA chart is mainly used to monitor autocorrelated data, and its application to daily returns of securities investment data has gained widespread attention among statisticians. However, this chart is prone to inaccurate estimation, whether of its base model or of its limits, producing large errors and a high probability of signalling an out-of-control process as a false alarm. To overcome this problem, a bootstrap approach is used in this study: it is hybridised with the MCEWMA base model to construct a new chart, the Bootstrap MCEWMA (BMCEWMA) chart. The hybrid model, BMCEWMA, is applied to daily returns of sukuk Musyarakah for Rantau Abang Capital Bhd. The BMCEWMA base model proved more effective than the original MCEWMA, showing smaller estimation error, shorter confidence intervals and fewer false alarms. In other words, the hybrid chart reduces variability, as shown by the smaller error and false-alarm rate. We conclude that the application of BMCEWMA is better than that of MCEWMA.
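The general idea of hybridising bootstrap limits onto an EWMA-type chart can be sketched as follows. This is a simplified illustration with made-up return data; it does not reproduce the authors' MCEWMA base model or its autocorrelation handling:

```python
import random

def ewma(series, lam=0.2):
    # exponentially weighted moving average, initialised at the first value
    z, out = series[0], []
    for x in series:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

def bootstrap_ewma_limits(in_control, lam=0.2, reps=500, alpha=0.01, seed=3):
    # Resample the in-control data with replacement, recompute the EWMA
    # statistic each time, and use the pooled tail quantiles of those
    # statistics as the chart's control limits.
    rng = random.Random(seed)
    n = len(in_control)
    values = []
    for _ in range(reps):
        sample = [rng.choice(in_control) for _ in range(n)]
        values.extend(ewma(sample, lam))
    values.sort()
    m = len(values)
    return values[int(alpha / 2 * m)], values[int((1 - alpha / 2) * m) - 1]

returns = [-1.0, 0.5, -0.3, 0.8, -0.6, 0.2, -0.4, 0.7, 0.1, -0.9, 0.3, -0.2]
lcl, ucl = bootstrap_ewma_limits(returns)
print(lcl < 0 < ucl)  # limits bracket zero for roughly centred returns
```

Because the limits come from resampled data rather than a normality assumption, they adapt to skewed or heavy-tailed return distributions, which is the motivation for bootstrapping control limits in the first place.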
EBW-Bootstrap Current Synergy in the National Spherical Torus Experiment (NSTX)
International Nuclear Information System (INIS)
Harvey, R.W.; Taylor, G.
2005-01-01
Current driven by electron Bernstein waves (EBW) and by the electron bootstrap effect are calculated separately and concurrently with a kinetic code, to determine the degree of synergy between them. A target β = 40% NSTX plasma is examined. A simple bootstrap model in the CQL3D Fokker-Planck code is used in these studies: the transiting electron distributions are connected in velocity-space at the trapped-passing boundary to trapped-electron distributions which are displaced radially by a half-banana width outwards/inwards for the co-/counter-passing regions. This model agrees well with standard bootstrap current calculations, over the outer 60% of the plasma radius. Relatively small synergy net bootstrap current is obtained for EBW power up to 4 MW. Locally, bootstrap current density increases in proportion to increased plasma pressure, and this effect can significantly affect the radial profile of driven current
Kent, Robert; Belitz, Kenneth; Fram, Miranda S.
2014-01-01
The Priority Basin Project (PBP) of the Groundwater Ambient Monitoring and Assessment (GAMA) Program was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The GAMA-PBP began sampling, primarily public supply wells in May 2004. By the end of February 2006, seven (of what would eventually be 35) study units had been sampled over a wide area of the State. Selected wells in these first seven study units were resampled for water quality from August 2007 to November 2008 as part of an assessment of temporal trends in water quality by the GAMA-PBP. The initial sampling was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within the seven study units. In the 7 study units, 462 wells were selected by using a spatially distributed, randomized grid-based method to provide statistical representation of the study area. Wells selected this way are referred to as grid wells or status wells. Approximately 3 years after the initial sampling, 55 of these previously sampled status wells (approximately 10 percent in each study unit) were randomly selected for resampling. The seven resampled study units, the total number of status wells sampled for each study unit, and the number of these wells resampled for trends are as follows, in chronological order of sampling: San Diego Drainages (53 status wells, 7 trend wells), North San Francisco Bay (84, 10), Northern San Joaquin Basin (51, 5), Southern Sacramento Valley (67, 7), San Fernando–San Gabriel (35, 6), Monterey Bay and Salinas Valley Basins (91, 11), and Southeast San Joaquin Valley (83, 9). The groundwater samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], pesticides, and pesticide degradates), constituents of special interest (perchlorate, N
Directory of Open Access Journals (Sweden)
Yongbin Liu
2017-01-01
Full Text Available Envelope spectrum analysis is a simple, effective, and classic method for bearing fault identification. However, in the wayside acoustic health monitoring system, owing to the high relative moving speed between the railway vehicle and the wayside mounted microphone, the recorded signal is embedded with the Doppler effect, which brings in shift and expansion of the bearing fault characteristic frequency (FCF. What is more, the background noise is relatively heavy, which makes it difficult to identify the FCF. To solve these two problems, this study introduces solutions for the wayside acoustic fault diagnosis of train bearings based on Doppler effect reduction using the improved time-domain interpolation resampling (TIR method and diagnosis-relevant information enhancement using the Weighted-Correlation-Coefficient-Guided Stochastic Resonance (WCCSR method. First, the traditional TIR method is improved by incorporating kinematic parameter estimation based on time-frequency analysis and curve fitting. Based on the estimated parameters, the Doppler effect is then easily removed using TIR. Second, WCCSR is employed to enhance the diagnosis-relevant periodic signal component in the obtained Doppler-free signal. Finally, building on these two procedures, the local fault is identified using envelope spectrum analysis. Simulated and experimental cases have verified the effectiveness of the proposed method.
N=4 superconformal bootstrap of the K3 CFT
Energy Technology Data Exchange (ETDEWEB)
Lin, Ying-Hsuan; Shao, Shu-Heng [Jefferson Physical Laboratory, Harvard University,17 Oxford Street, Cambridge, MA 02138 (United States); Simmons-Duffin, David [School of Natural Sciences, Institute for Advanced Study,1 Einstein Drive, Princeton, NJ 08540 (United States); Wang, Yifan [Center for Theoretical Physics, Massachusetts Institute of Technology,77 Massachusetts Ave, Cambridge, MA 02139 (United States); Yin, Xi [Jefferson Physical Laboratory, Harvard University,17 Oxford Street, Cambridge, MA 02138 (United States)
2017-05-23
We study two-dimensional (4,4) superconformal field theories of central charge c=6, corresponding to nonlinear sigma models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N=4 superconformal blocks with c=6 and bosonic Virasoro conformal blocks with c=28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A_1 N=4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence between a class of BPS N=2 superconformal blocks and Virasoro conformal blocks in two dimensions, and an upper bound on the four-point functions of operators of sufficiently low scaling dimension in three and four dimensional CFTs.
N=4 Superconformal Bootstrap of the K3 CFT
CERN. Geneva
2015-01-01
We study two-dimensional (4,4) superconformal field theories of central charge c=6, corresponding to nonlinear σ models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N=4 superconformal blocks with c=6 and bosonic Virasoro conformal blocks with c=28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A1 N=4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence...
More on analytic bootstrap for O(N) models
Energy Technology Data Exchange (ETDEWEB)
Dey, Parijat; Kaviraj, Apratim; Sen, Kallol [Centre for High Energy Physics, Indian Institute of Science,C.V. Raman Avenue, Bangalore 560012 (India)
2016-06-22
This note is an extension of a recent work on the analytical bootstrapping of O(N) models. An additional feature of the O(N) model is that the OPE contains trace and antisymmetric operators apart from the symmetric-traceless objects appearing in the OPE of the singlet sector. Thus, in addition to the stress tensor (T_{μν}) and the ϕ_iϕ^i scalar, we also have other minimal twist operators, such as the spin-1 current J_μ and the symmetric-traceless scalar, in the case of O(N). We determine the effect of these additional objects on the anomalous dimensions of the corresponding trace, symmetric-traceless and antisymmetric operators in the large spin sector of the O(N) model, in the limit when the spin is much larger than the twist. As an observation, we also verified that the leading order results for the large spin sector from the ϵ-expansion are an exact match with our n=0 case. A plausible holographic setup for the special case when N=2 is also mentioned, which mimics the calculation in the CFT.
Non-abelian binding energies from the lightcone bootstrap
Energy Technology Data Exchange (ETDEWEB)
Li, Daliang [Department of Physics, Yale University,New Haven, CT 06511 (United States); Department of Physics and Astronomy, Johns Hopkins University,Baltimore, MD 21218 (United States); Meltzer, David [Department of Physics, Yale University,New Haven, CT 06511 (United States); Poland, David [Department of Physics, Yale University,New Haven, CT 06511 (United States); School of Natural Sciences, Institute for Advanced Study,Princeton, NJ 08540 (United States)
2016-02-23
We analytically study the lightcone limit of the conformal bootstrap for 4-point functions containing scalars charged under global symmetries. We show the existence of large spin double-twist operators in various representations of the global symmetry group. We then compute their anomalous dimensions in terms of the central charge C_T, current central charge C_J, and the OPE coefficients of low dimension scalars. In AdS, these results correspond to the binding energy of two-particle states arising from the exchange of gravitons, gauge bosons, and light scalar fields. Using unitarity and crossing symmetry, we show that gravity is universal and attractive among different types of two-particle states, while the gauge binding energy can have either sign as determined by the representation of the two-particle state, with universal ratios fixed by the symmetry group. We apply our results to 4D N=1 SQCD and the 3D O(N) vector models. We also show that in a unitary CFT, if the current central charge C_J stays finite when the global symmetry group becomes infinitely large, such as the N→∞ limit of the O(N) vector model, then the theory must contain an infinite number of higher spin currents.
Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Guiseppe, Cavaliere; Rahbæk, Anders; Taylor, A.M. Robert
with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....
DEFF Research Database (Denmark)
Peng, Yi; Knadel, Maria; Greve, Mette Balslev
2016-01-01
geographically closest sampling points. The SOC prediction resulted in R2: 0.76; RMSE: 4.02 %; RPD: 1.59; RPIQ: 0.35. The results for clay prediction were also successful (R2: 0.84; RMSE: 2.36 %; RPD: 2.35; RPIQ: 2.88). For SOC predictions, over 90% of soil samples were well predicted compared...... samples) for soils from each 7-km grid sampling point in the country. In the resampling and modelling process, each target sample was predicted by a specific model which was calibrated using geographically closest soil spectra. The geographically closest 20, 30, 40, and 50 sampling points (profiles) were...
On the Consistency of Bootstrap Testing for a Parameter on the Boundary of the Parameter Space
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Heino Bohn; Rahbek, Anders
2017-01-01
It is well known that with a parameter on the boundary of the parameter space, such as in the classic cases of testing for a zero location parameter or no autoregressive conditional heteroskedasticity (ARCH) effects, the classic nonparametric bootstrap – based on unrestricted parameter estimates...... – leads to inconsistent testing. In contrast, we show here that for the two aforementioned cases, a nonparametric bootstrap test based on parameter estimates obtained under the null – referred to as ‘restricted bootstrap’ – is indeed consistent. While the restricted bootstrap is simple to implement...... in practice, novel theoretical arguments are required in order to establish consistency. In particular, since the bootstrap is analysed both under the null hypothesis and under the alternative, non-standard asymptotic expansions are required to deal with parameters on the boundary. Detailed proofs...
Using the Bootstrap Concept to Build an Adaptable and Compact Subversion Artifice
National Research Council Canada - National Science Library
Lack, Lindsey
2003-01-01
.... Early tiger teams recognized the possibility of this design and compared it to the two-card bootstrap loader used in mainframes since both exhibit the characteristics of compactness and adaptability...
Directory of Open Access Journals (Sweden)
Pagni Marco
2005-08-01
Full Text Available Abstract Background Whole-genome sequencing projects are rapidly producing an enormous number of new sequences. Consequently almost every family of proteins now contains hundreds of members. It has thus become necessary to develop tools that classify protein sequences automatically, quickly and reliably. The difficulty of this task is intimately linked to the mechanism by which protein sequences diverge, i.e. by simultaneous residue substitutions, insertions and/or deletions and whole domain reorganisations (duplications/swapping/fusion). Results Here we present a novel approach, which is based on random sampling of sub-sequences (probes out of a set of input sequences. The probes are compared to the input sequences, after a normalisation step; the results are used to partition the input sequences into homogeneous groups of proteins. In addition, this method provides information on diagnostic parts of the proteins. The performance of this method is challenged by two data sets. The first one contains the sequences of prokaryotic lyases that could be arranged as a multiple sequence alignment. The second one contains all proteins from Swiss-Prot Release 36 with at least one Src homology 2 (SH2 domain – a classical example for proteins with modular architecture. Conclusion The outcome of our method is robust, highly reproducible as shown using bootstrap and resampling validation procedures. The results are essentially coherent with the biology. This method depends solely on well-established publicly available software and algorithms.
Bootstrap finance: the art of start-ups.
Bhide, A
1992-01-01
Entrepreneurship is more popular than ever: courses are full, policymakers emphasize new ventures, managers yearn to go off on their own. Would-be founders often misplace their energies, however. Believing in a "big money" model of entrepreneurship, they spend a lot of time trying to attract investors instead of using wits and hustle to get their ideas off the ground. A study of 100 of the 1989 Inc. "500" list of fastest growing U.S. start-ups attests to the value of bootstrapping. In fact, what it takes to start a business often conflicts with what venture capitalists require. Investors prefer solid plans, well-defined markets, and track records. Entrepreneurs are heavy on energy and enthusiasm but may be short on credentials. They thrive in rapidly changing environments where uncertain prospects may scare off established companies. Rolling with the punches is often more important than formal plans. Striving to adhere to investors' criteria can diminish the flexibility--the try-it, fix-it approach--an entrepreneur needs to make a new venture work. Seven principles are basic for successful start-ups: get operational fast; look for quick break-even, cash-generating projects; offer high-value products or services that can sustain direct personal selling; don't try to hire the crack team; keep growth in check; focus on cash; and cultivate banks early. Growth and change are the start-up's natural environment. But change is also the reward for success: just as ventures grow, their founders usually have to take a fresh look at everything again: roles, organization, even the very policies that got the business up and running.
Nonparametric bootstrap procedures for predictive inference based on recursive estimation schemes
Corradi, Valentina; Swanson, Norman R.
2005-01-01
Our objectives in this paper are twofold. First, we introduce block bootstrap techniques that are (first order) valid in recursive estimation frameworks. Thereafter, we present two examples where predictive accuracy tests are made operational using our new bootstrap procedures. In one application, we outline a consistent test for out-of-sample nonlinear Granger causality, and in the other we outline a test for selecting amongst multiple alternative forecasting models, all of which are possibl...
A Bootstrap Neural Network Based Heterogeneous Panel Unit Root Test: Application to Exchange Rates
Christian de Peretti; Carole Siani; Mario Cerrato
2010-01-01
This paper proposes a bootstrap artificial neural network based panel unit root test in a dynamic heterogeneous panel context. An application to a panel of bilateral real exchange rate series with the US Dollar from the 20 major OECD countries is provided to investigate Purchasing Power Parity (PPP). The combination of neural networks and bootstrapping significantly changes the findings of the economic study in favour of PPP.
Analytic description of tokamak equilibrium sustained by high fraction bootstrap current
International Nuclear Information System (INIS)
Shi Bingren
2002-01-01
Recently, to save current drive power and to obtain a more favorable confinement merit for a tokamak reactor, equilibria sustained by a large fraction of bootstrap current have attracted great interest both theoretically and experimentally. A powerful expansion technique and the tokamak ordering are used to expand the Grad-Shafranov equation into a series of ordinary differential equations that allow for different sets of input parameters. The fully bootstrap-current-sustained tokamak equilibria are then solved analytically.
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15), but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and the results demonstrate that the developmental trajectory of bootstrapping differs from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of basic working memory slave systems during childhood. Copyright © 2013 Elsevier Inc. All rights reserved.
Wang, Yi; Zheng, Tong; Zhao, Ying; Jiang, Jiping; Wang, Yuanyuan; Guo, Liang; Wang, Peng
2013-12-01
In this paper, a bootstrapped wavelet neural network (BWNN) was developed for predicting monthly ammonia nitrogen (NH4+-N) and dissolved oxygen (DO) in the Harbin region of northeast China. The Morlet wavelet basis function (WBF) was employed as the nonlinear activation function of a traditional three-layer artificial neural network (ANN) structure. Prediction intervals (PI) were constructed according to the uncertainties calculated from the model structure and data noise. Performance of the BWNN model was also compared with four different models: traditional ANN, WNN, bootstrapped ANN, and an autoregressive integrated moving average model. The results showed that BWNN could handle the severely fluctuating and non-seasonal time series data of water quality, and it produced better performance than the other four models. The uncertainty from data noise was smaller than that from the model structure for NH4+-N; conversely, the uncertainty from data noise was larger for the DO series. Besides, total uncertainties in the low-flow period were the biggest due to complicated processes during the freeze-up period of the Songhua River. Further, a data missing-refilling scheme was designed, and better performance of BWNNs was observed for structural data missing (SD) than for incidental data missing (ID). For both ID and SD, the temporal method was satisfactory for filling the NH4+-N series, whereas spatial imputation was fit for the DO series. This filling-and-forecasting BWNN method was applied to other areas suffering "real" data missing, and the results demonstrated its efficiency. Thus, the methods introduced here will help managers to make informed decisions.
PCA-based bootstrap confidence interval tests for gene-disease association involving multiple SNPs
Directory of Open Access Journals (Sweden)
Xue Fuzhong
2010-01-01
Abstract Background: Genetic association studies are currently the primary vehicle for identification and characterization of disease-predisposing variant(s) and usually involve multiple single-nucleotide polymorphisms (SNPs). However, SNP-wise association tests raise concerns over multiple testing. Haplotype-based methods have the advantage of being able to account for correlations between neighbouring SNPs, yet assuming Hardy-Weinberg equilibrium (HWE) and the potentially large number of degrees of freedom can harm their statistical power and robustness. Approaches based on principal component analysis (PCA) are preferable in this regard, but their performance varies with the method of extracting principal components (PCs). Results: The PCA-based bootstrap confidence interval test (PCA-BCIT), which directly uses the PC scores to assess gene-disease association, was developed and evaluated for three ways of extracting PCs: cases only (CAES), controls only (COES), and cases and controls combined (CES). Extraction of PCs with COES is preferred to that with CAES or CES. Performance of the test was examined via simulations as well as analyses of data on rheumatoid arthritis and heroin addiction; the test maintains the nominal level under the null hypothesis and showed performance comparable to the permutation test. Conclusions: PCA-BCIT is a valid and powerful method for assessing gene-disease association involving multiple SNPs.
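The PC-score-plus-bootstrap idea in this abstract can be sketched generically. The block below is a simplified reading, not the authors' code: the genotypes are simulated, loadings are derived from controls only (COES-style), and a percentile interval is built for the case-control difference in mean first-PC score. Note that the sign of a principal component is arbitrary, so only whether the interval excludes zero is meaningful.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated genotypes: 0/1/2 minor-allele counts at 6 SNPs (hypothetical data).
controls = rng.binomial(2, 0.30, size=(150, 6)).astype(float)
cases = rng.binomial(2, 0.45, size=(150, 6)).astype(float)

# COES-style extraction: derive PC loadings from controls only, score both groups.
mu, sd = controls.mean(axis=0), controls.std(axis=0)
_, _, vt = np.linalg.svd((controls - mu) / sd, full_matrices=False)
score = lambda g: ((g - mu) / sd) @ vt[0]   # first PC score (sign arbitrary)

# Bootstrap the case-control difference in mean PC1 score.
diffs = []
for _ in range(2000):
    b_cases = cases[rng.integers(0, len(cases), len(cases))]
    b_ctrls = controls[rng.integers(0, len(controls), len(controls))]
    diffs.append(score(b_cases).mean() - score(b_ctrls).mean())
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"95% bootstrap CI for the mean PC1 difference: ({lo:.3f}, {hi:.3f})")
# Association would be declared when the interval excludes zero.
```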
Changeover Inference: Estimating the Relationship Between DT and OT Data
National Research Council Canada - National Science Library
Dippery, Kevin
1997-01-01
... which has undergone developmental testing. Using a re-sampling method called the Bootstrap, the sampling variance and standard error of the changeover factor are calculated, as are confidence intervals for the OT failure rate of a new system...
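The re-sampling step described here (bootstrap estimates of sampling variance, standard error, and confidence intervals for a rate) can be sketched generically. The failure rates below are invented for illustration; the DT/OT changeover factor itself is not reproduced:

```python
import random
import statistics

def bootstrap_se_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=42):
    """Nonparametric bootstrap: resample the data with replacement, recompute
    the statistic, and summarize its sampling distribution."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    se = statistics.stdev(reps)             # bootstrap standard error
    lo = reps[int(alpha / 2 * n_boot)]      # percentile CI bounds
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return se, (lo, hi)

# Invented failure rates standing in for operational-test observations.
failure_rates = [0.12, 0.08, 0.15, 0.10, 0.09, 0.14, 0.11, 0.07]
se, (lo, hi) = bootstrap_se_ci(failure_rates)
print(f"bootstrap SE = {se:.4f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```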
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Morten Ørregaard; Taylor, A.M. Robert
Empirical evidence from time series methods which assume the usual I(0)/I(1) paradigm suggests that the efficient market hypothesis, stating that spot and futures prices of a commodity should cointegrate with a unit slope on futures prices, does not hold. However, these statistical methods...... fractionally integrated model we are able to find a body of evidence in support of the efficient market hypothesis for a number of commodities. Our new tests are wild bootstrap implementations of score-based tests for the order of integration of a fractionally integrated time series. These tests are designed...... principle do. A Monte Carlo simulation study demonstrates that very significant improvements in finite-sample behaviour can be obtained by the bootstrap vis-à-vis the corresponding asymptotic tests in both heteroskedastic and homoskedastic environments.
Directory of Open Access Journals (Sweden)
Chipperfield James O.
2015-09-01
Record linkage is the act of bringing together records that are believed to belong to the same unit (e.g., a person or business) from two or more files. Record linkage is not an error-free process and can lead to linking a pair of records that do not belong to the same unit. This occurs because the linking fields on the files, which ideally would uniquely identify each unit, are often imperfect. There has been an explosion of record linkage applications, particularly involving government agencies and the field of health, yet there has been little work on making correct inference using such linked files. Naively treating a linked file as if it were linked without errors can lead to biased inferences. This article develops a method of making inferences for cross-tabulated variables when record linkage is not an error-free process. In particular, it develops a parametric bootstrap approach to estimation which can accommodate the sophisticated probabilistic record linkage techniques that are widely used in practice (e.g., 1-1 linkage). The article demonstrates the effectiveness of this method in a simulation and in a real application.
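A minimal sketch of the parametric-bootstrap idea: fit a model, simulate replicate datasets from the fitted model, and re-estimate on each replicate. The sketch uses a simple Poisson-rate model on invented counts, not the article's record-linkage model:

```python
import math
import random
import statistics

def parametric_bootstrap_poisson(counts, n_boot=2000, seed=7):
    """Parametric bootstrap: fit the model (here, a Poisson rate), simulate
    replicate datasets from the fitted model, and re-estimate on each one."""
    rng = random.Random(seed)
    lam_hat = statistics.mean(counts)   # MLE of the Poisson rate
    n = len(counts)

    def draw_poisson(lam):
        # Knuth's multiplicative method; adequate for small rates.
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    reps = sorted(statistics.mean([draw_poisson(lam_hat) for _ in range(n)])
                  for _ in range(n_boot))
    return lam_hat, (reps[int(0.025 * n_boot)], reps[int(0.975 * n_boot) - 1])

# Invented event counts standing in for a cross-tabulated cell.
lam, (lo, hi) = parametric_bootstrap_poisson([3, 5, 2, 4, 6, 3, 4, 5, 2, 4])
print(f"lambda_hat = {lam:.2f}, 95% parametric-bootstrap CI = ({lo:.2f}, {hi:.2f})")
```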
Obraztsov, S. M.; Konobeev, Yu. V.; Birzhevoy, G. A.; Rachkov, V. I.
2006-12-01
The dependence of the mechanical properties of ferritic/martensitic (F/M) steels on irradiation temperature is of interest because these steels are used as structural materials for fast reactors, fusion reactors, and accelerator-driven systems. Experimental data demonstrating temperature peaks in the physical and mechanical properties of neutron-irradiated pure iron, nickel, vanadium, and austenitic stainless steels are available in the literature. The lack of such information for F/M steels forces one to apply computational mathematical-statistical modeling methods. The bootstrap procedure is one such method, allowing the necessary statistical characteristics to be obtained using only a sample of limited size. In the present work this procedure is used to model the frequency distribution histograms of ultimate-strength temperature peaks in pure iron and the Russian F/M steels EP-450 and EP-823. Results of fitting sums of Lorentz or Gauss functions to the calculated distributions are presented. It is concluded that there are two temperature peaks of the ultimate strength in EP-450 steel (at 360 and 390 °C) and a single peak at 390 °C in EP-823.
Yan, Hong; Song, Xiangzhong; Tian, Kuangda; Chen, Yilin; Xiong, Yanmei; Min, Shungeng
2018-02-01
A novel method based on mid-infrared (MIR) spectroscopy, which enables the determination of Chlorantraniliprole in Abamectin within minutes, is proposed. We further evaluate the prediction ability of four wavelength selection methods: the bootstrapping soft shrinkage approach (BOSS), Monte Carlo uninformative variable elimination (MCUVE), genetic algorithm partial least squares (GA-PLS), and competitive adaptive reweighted sampling (CARS). The results showed that the BOSS method obtained the lowest root mean squared error of cross-validation (RMSECV) (0.0245) and root mean squared error of prediction (RMSEP) (0.0271), as well as the highest coefficient of determination of cross-validation (Q²cv) (0.9998) and coefficient of determination of the test set (Q²test) (0.9989), demonstrating that mid-infrared spectroscopy can be used to detect Chlorantraniliprole in Abamectin conveniently. Meanwhile, a suitable wavelength selection method (BOSS) is essential for conducting a component spectral analysis.
Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; Manna, Piero; Terribile, Fabio
2013-04-01
Digital soil mapping procedures are widespread used to build two-dimensional continuous maps about several pedological attributes. Our work addressed a regression kriging (RK) technique and a bootstrapped artificial neural network approach in order to evaluate and compare (i) the accuracy of prediction, (ii) the susceptibility of being included in automatic engines (e.g. to constitute web processing services), and (iii) the time cost needed for calibrating models and for making predictions. Regression kriging is maybe the most widely used geostatistical technique in the digital soil mapping literature. Here we tried to apply the EBLUP regression kriging as it is deemed to be the most statistically sound RK flavor by pedometricians. An unusual multi-parametric and nonlinear machine learning approach was accomplished, called BAGAP (Bootstrap aggregating Artificial neural networks with Genetic Algorithms and Principal component regression). BAGAP combines a selected set of weighted neural nets having specified characteristics to yield an ensemble response. The purpose of applying these two particular models is to ascertain whether and how much a more cumbersome machine learning method could be much promising in making more accurate/precise predictions. Being aware of the difficulty to handle objects based on EBLUP-RK as well as BAGAP when they are embedded in environmental applications, we explore the susceptibility of them in being wrapped within Web Processing Services. Two further kinds of aspects are faced for an exhaustive evaluation and comparison: automaticity and time of calculation with/without high performance computing leverage.
Lee, Soojeong; Rajan, Sreeraman; Jeon, Gwanggil; Chang, Joon-Hyuk; Dajani, Hilmi R; Groza, Voicu Z
2017-06-01
Blood pressure (BP) is one of the most important vital indicators and plays a key role in determining the cardiovascular activity of patients. This paper proposes a hybrid approach consisting of nonparametric bootstrap (NPB) and machine learning techniques to obtain the characteristic ratios (CR) used in the blood pressure estimation algorithm, to improve the accuracy of systolic blood pressure (SBP) and diastolic blood pressure (DBP) estimates, and to obtain confidence intervals (CI). The NPB technique is used to circumvent the requirement of a large sample set for obtaining the CI. A mixture of Gaussian densities is assumed for the CRs, and a Gaussian mixture model (GMM) is chosen to estimate the SBP and DBP ratios. The K-means clustering technique is used to obtain the mixture order of the Gaussian densities. The proposed approach achieves grade "A" under the British Society of Hypertension testing protocol and is superior to the conventional approach based on the maximum amplitude algorithm (MAA), which uses fixed CR ratios. The proposed approach also yields a lower mean error (ME) and standard deviation of the error (SDE) in the estimates when compared to the conventional MAA method. In addition, CIs obtained through the proposed hybrid approach are also narrower, with a lower SDE. Combining the NPB technique with the GMM provides a methodology to derive individualized characteristic ratios. The results show that the proposed approach enhances the accuracy of SBP and DBP estimation and provides narrower confidence intervals for the estimates. Copyright © 2015 Elsevier Ltd. All rights reserved.
Electron Bernstein wave-bootstrap current synergy in the National Spherical Torus Experiment
International Nuclear Information System (INIS)
Harvey, R.W.; Taylor, G.
2005-01-01
Current driven by electron Bernstein waves (EBW) and by the electron bootstrap effect are calculated separately and concurrently with a kinetic code to determine the degree of synergy between them. A target β=40% NSTX [M. Ono, S. Kaye, M. Peng et al., Proceedings of the 17th IAEA Fusion Energy Conference, edited by M. Spak (IAEA, Vienna, Austria, 1999), Vol. 3, p. 1135] plasma is examined. A simple bootstrap model in the collisional-quasilinear CQL3D Fokker-Planck code (National Technical Information Service document No. DE93002962) is used in these studies: the transiting electron distributions are connected in velocity space at the trapped-passing boundary to trapped-electron distributions that are displaced radially by a half-banana-width outwards/inwards for the co-passing/counter-passing regions. This model agrees well with standard bootstrap current calculations over the outer 60% of the plasma radius. Relatively small synergy in the net bootstrap current is obtained for EBW power up to 4 MW. Locally, the bootstrap current density increases in proportion to the increased plasma pressure, and this effect can significantly affect the radial profile of the driven current.
DEFF Research Database (Denmark)
Awan, Mehmood-Ur-Rehman; Le Moullec, Yannick; Koch, Peter
2012-01-01
In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front-ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M data-load's time period. We present a load...
International Nuclear Information System (INIS)
Nemov, V V; Kalyuzhnyj, V N; Kasilov, S V; Drevlak, M; Nuehrenberg, J; Kernbichler, W; Reiman, A; Monticello, D
2004-01-01
For the magnetic field of the Wendelstein 7-X (W7-X) standard high-mirror configuration, computed by the PIES code taking into account real coil geometry, neoclassical transport and bootstrap current are analysed in the 1/ν regime using methods based on integration along magnetic field lines in a given magnetic field. The β = 0 and ⟨β⟩ = 1% cases are studied. The results are compared to the corresponding results for the vacuum magnetic field directly produced by the modular coils. A significant advantage of W7-X over a conventional stellarator, resulting from reduced neoclassical transport and reduced bootstrap current, follows from the computations, although the neoclassical transport is somewhat larger than that previously obtained for the ideal W7-X model configuration.
The economics of bootstrapping space industries - Development of an analytic computer model
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of transport off-Earth and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case-study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus...)... be used to detect outliers in the data, since the outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as an alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in case of the Q-statistic...
Introduction of Bootstrap Current Reduction in the Stellarator Optimization Using the Algorithm DAB
International Nuclear Information System (INIS)
Castejón, F.; Gómez-Iglesias, A.; Velasco, J. L.
2015-01-01
This work introduces a new optimization criterion into the DAB (Distributed Asynchronous Bees) code. With this new criterion, DAB now includes the equilibrium and Mercier stability criteria, the minimization of B×∇B, which ensures the reduction of neoclassical transport and improved confinement of fast particles, and the reduction of bootstrap current. We started from a neoclassically optimised configuration of the helias type and imposed the reduction of bootstrap current. The obtained configuration presents only a modest reduction of the total bootstrap current, but the local current density is reduced along the minor radius. Further investigations are being developed to understand the reason for this modest improvement.
Inference for Local Distributions at High Sampling Frequencies: A Bootstrap Approach
DEFF Research Database (Denmark)
Hounyo, Ulrich; Varneskov, Rasmus T.
of "large" jumps. Our locally dependent wild bootstrap (LDWB) accommodate issues related to the stochastic scale and jumps as well as account for a special block-wise dependence structure induced by sampling errors. We show that the LDWB replicates first and second-order limit theory from the usual...... empirical process and the stochastic scale estimate, respectively, as well as an asymptotic bias. Moreover, we design the LDWB sufficiently general to establish asymptotic equivalence between it and and a nonparametric local block bootstrap, also introduced here, up to second-order distribution theory....... Finally, we introduce LDWB-aided Kolmogorov-Smirnov tests for local Gaussianity as well as local von-Mises statistics, with and without bootstrap inference, and establish their asymptotic validity using the second-order distribution theory. The finite sample performance of CLT and LDWB-aided local...
Aspect Ratio Scaling of Ideal No-wall Stability Limits in High Bootstrap Fraction Tokamak Plasmas
International Nuclear Information System (INIS)
Menard, J.E.; Bell, M.G.; Bell, R.E.; Gates, D.A.; Kaye, S.M.; LeBlanc, B.P.; Maingi, R.; Sabbagh, S.A.; Soukhanovskii, V.; Stutman, D.
2003-01-01
Recent experiments in the low aspect ratio National Spherical Torus Experiment (NSTX) [M. Ono et al., Nucl. Fusion 40 (2000) 557] have achieved normalized beta values twice the conventional tokamak limit at low internal inductance and with significant bootstrap current. These experimental results have motivated a computational re-examination of the plasma aspect ratio dependence of ideal no-wall magnetohydrodynamic stability limits. These calculations find that the profile-optimized no-wall stability limit in high bootstrap fraction regimes is well described by a nearly aspect ratio invariant normalized beta parameter utilizing the total magnetic field energy density inside the plasma. However, the scaling of normalized beta with internal inductance is found to be strongly aspect ratio dependent at sufficiently low aspect ratio. These calculations and detailed stability analyses of experimental equilibria indicate that the nonrotating plasma no-wall stability limit has been exceeded by as much as 30% in NSTX in a high bootstrap fraction regime
Extended theory of main ion and impurity rotation and bootstrap current in a shear layer
International Nuclear Information System (INIS)
Kim, Y.B.; Hinton, F.L.; St. John, H.; Taylor, T.S.; Wroblewski, D.
1993-11-01
In this paper, standard neoclassical theory has been extended into the shear layer. Main-ion and impurity-ion rotation velocities and the bootstrap current within the H-mode shear layer are discussed. Inside the H-mode shear layer, standard neoclassical theory is not valid, since the ion poloidal gyroradius becomes comparable to the pressure-gradient and electric-field-gradient scale lengths. To allow for an arbitrary ratio of ρθi/Ln and ρθi/LEr, a new kinetic theory of the main ion species within the electric field shear layer has been developed under the assumption that ρθi/Ro is still small. As a consequence, both the impurity flows and the bootstrap current have to be modified. Modified expressions for the impurity flows and the bootstrap current are presented, neglecting the ion temperature gradient. Comparisons with DIII-D measurements are also discussed.
Carnegie, Nicole Bohme
2011-04-15
The incidence of new infections is a key measure of the status of the HIV epidemic, but accurate measurement of incidence is often constrained by limited data. Karon et al. (Statist. Med. 2008; 27:4617–4633) developed a model to estimate the incidence of HIV infection from surveillance data with biologic testing for recent infection for newly diagnosed cases. This method has been implemented by public health departments across the United States and is behind the new national incidence estimates, which are about 40 per cent higher than previous estimates. We show that the delta method approximation given for the variance of the estimator is incomplete, leading to an inflated variance estimate. This contributes to the generation of overly conservative confidence intervals, potentially obscuring important differences between populations. We demonstrate via simulation that an innovative model-based bootstrap method using the specified model for the infection and surveillance process improves confidence interval coverage and adjusts for the bias in the point estimate. Confidence interval coverage is about 94–97 per cent after correction, compared with 96–99 per cent before. The simulated bias in the estimate of incidence ranges from −6.3 to +14.6 per cent under the original model but is consistently under 1 per cent after correction by the model-based bootstrap. In an application to data from King County, Washington in 2007 we observe correction of 7.2 per cent relative bias in the incidence estimate and a 66 per cent reduction in the width of the 95 per cent confidence interval using this method. We provide open-source software to implement the method that can also be extended for alternate models.
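The model-based bootstrap bias correction described above can be illustrated on a much simpler problem. The sketch below is not the HIV-incidence model: it uses the plug-in variance of simulated Gaussian data as the biased estimator, simulates replicate datasets from the fitted model, and subtracts the estimated bias from the point estimate.

```python
import random
import statistics

def model_based_bias_correction(data, n_boot=4000, seed=3):
    """Model-based bootstrap: simulate replicate datasets from the fitted
    (Gaussian) model, estimate the estimator's bias as the mean displacement
    of the replicates from the point estimate, and subtract it."""
    rng = random.Random(seed)
    n = len(data)
    mu = statistics.mean(data)
    theta_hat = sum((x - mu) ** 2 for x in data) / n   # biased plug-in variance

    def replicate():
        # Simulate a dataset from the fitted model and re-apply the estimator.
        sim = [rng.gauss(mu, theta_hat ** 0.5) for _ in range(n)]
        m = statistics.mean(sim)
        return sum((x - m) ** 2 for x in sim) / n

    bias = statistics.mean(replicate() for _ in range(n_boot)) - theta_hat
    return theta_hat, theta_hat - bias

gen = random.Random(9)
data = [gen.gauss(0.0, 1.0) for _ in range(12)]  # invented sample
theta_hat, corrected = model_based_bias_correction(data)
print(f"plug-in = {theta_hat:.3f}, bias-corrected = {corrected:.3f}")
```

Here the plug-in variance underestimates by roughly a factor (n-1)/n, so the corrected estimate comes out larger, mirroring how the bootstrap adjusts for the bias in the incidence point estimate.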
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects of this area is continuously advancing to make these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability, and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction in message length.
Improving Web Learning through model Optimization using Bootstrap for a Tour-Guide Robot
Directory of Open Access Journals (Sweden)
Rafael León
2012-09-01
We review Web Mining techniques and describe a bootstrap statistics methodology applied to pattern-model classifier optimization and verification for supervised learning in Tour-Guide Robot knowledge-repository management. It is virtually impossible to test Web page classifiers and many other Internet applications thoroughly with purely empirical data, due to the need for human intervention to generate training sets and test sets. We propose using the computer-based bootstrap paradigm to design a test environment where classifiers are checked with better reliability.
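Bootstrap-based verification of a classifier, as described in this abstract, can be sketched by resampling (label, prediction) pairs with replacement and building a percentile interval for accuracy. The labels and predictions below are invented for illustration:

```python
import random

def bootstrap_accuracy_ci(y_true, y_pred, n_boot=2000, alpha=0.05, seed=5):
    """Bootstrap check of a classifier: resample (label, prediction) pairs
    with replacement and build a percentile interval for accuracy."""
    rng = random.Random(seed)
    pairs = list(zip(y_true, y_pred))
    n = len(pairs)
    accs = sorted(
        sum(t == p for t, p in (pairs[rng.randrange(n)] for _ in range(n))) / n
        for _ in range(n_boot))
    return accs[int(alpha / 2 * n_boot)], accs[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical page labels vs. classifier output (1 = relevant to the guide task).
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1]
y_pred = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 0, 1, 1]
lo, hi = bootstrap_accuracy_ci(y_true, y_pred)
print(f"95% bootstrap CI for accuracy: ({lo:.2f}, {hi:.2f})")
```

The interval width conveys how reliable the observed accuracy is for a test set of this size, which is the kind of check a labeled-by-hand evaluation set otherwise cannot provide cheaply.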
Coal consumption and economic growth nexus: Evidence from bootstrap panel Granger causality test
Directory of Open Access Journals (Sweden)
Anoruo Emmanuel
2017-01-01
This paper explores the causal relationship between coal consumption and economic growth for a panel of 15 African countries using a bootstrap panel Granger causality test. Specifically, the paper uses the Phillips-Perron unit root test to ascertain the order of integration of the coal consumption and economic growth series, and a bootstrap panel Granger causality test to determine the direction of causality between them. The results provide evidence of unidirectional causality running from economic growth to coal consumption. This finding implies that coal conservation measures may be implemented with little or no adverse impact on economic growth for the sample countries as a group.
Directory of Open Access Journals (Sweden)
Hojin Moon
2006-08-01
Full Text Available A computational tool for testing for a dose-related trend and/or a pairwise difference in the incidence of an occult tumor via an age-adjusted bootstrap-based poly-k test and the original poly-k test is presented in this paper. The poly-k test (Bailer and Portier, 1988) is a survival-adjusted Cochran-Armitage test, which achieves robustness to effects of differential mortality across dose groups. The original poly-k test is asymptotically standard normal under the null hypothesis. However, the asymptotic normality is not valid if there is a deviation from the tumor onset distribution that is assumed in this test. Our age-adjusted bootstrap-based poly-k test assesses the significance of assumed asymptotic normal tests and investigates the empirical distribution of the original poly-k test statistic using an age-adjusted bootstrap method. A tumor of interest is an occult tumor for which the time to onset is not directly observable. Since most animal carcinogenicity studies are designed with a single terminal sacrifice, the present tool is applicable to rodent tumorigenicity assays that have a single terminal sacrifice. The present tool takes input information simply from a user screen and reports testing results back to the screen through a user interface. The computational tool is implemented in C/C++ and is applied to analyze a real data set as an example. Our tool enables the FDA and the pharmaceutical industry to implement a statistical analysis of tumorigenicity data from animal bioassays via our age-adjusted bootstrap-based poly-k test and the original poly-k test, which has been adopted by the National Toxicology Program as its standard statistical test.
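For reference, the original poly-k statistic is a Cochran-Armitage trend test in which tumor-free animals dying before terminal sacrifice at time t contribute only the fractional weight (t/t_max)^k to the group size. A minimal sketch (variable names are our own; the age-adjusted bootstrap version described above additionally resamples animals within dose groups to obtain an empirical null distribution):

```python
import numpy as np

def poly_k_weights(time, tumor, t_max, k=3):
    # Tumor-bearing animals count fully; tumor-free animals that die at
    # time t < t_max contribute the fractional weight (t / t_max)^k.
    w = (np.asarray(time, dtype=float) / t_max) ** k
    w[np.asarray(tumor, dtype=bool)] = 1.0
    return w

def poly_k_trend(doses, times, tumors, t_max, k=3):
    # Survival-adjusted Cochran-Armitage trend statistic: the usual CA
    # formula with each group size replaced by its poly-k adjusted n*.
    doses, times, tumors = map(np.asarray, (doses, times, tumors))
    d = np.array(sorted(set(doses.tolist())), dtype=float)
    n_adj = np.array([poly_k_weights(times[doses == v], tumors[doses == v],
                                     t_max, k).sum() for v in d])
    x = np.array([tumors[doses == v].sum() for v in d], dtype=float)
    pbar = x.sum() / n_adj.sum()
    num = np.sum(d * x) - pbar * np.sum(d * n_adj)
    var = pbar * (1 - pbar) * (np.sum(n_adj * d ** 2)
                               - np.sum(n_adj * d) ** 2 / n_adj.sum())
    return num / np.sqrt(var)   # approximately N(0,1) under the null
```

Large positive values of the statistic indicate an increasing dose-related trend in tumor incidence.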
Directory of Open Access Journals (Sweden)
Prescott C. Ensign
2016-01-01
Full Text Available This case study chronicles the timeline of a new venture – Keenga Research. Keenga Research has a novel proposition that it is seeking to introduce to the market. The business concept is to ask entrepreneurs to review the venture capital (VC) firm that funded them. Reviews of VC firms would then be developed and marketed to those interested (funds, and perhaps enterprises seeking funding). What makes this case unique is that Keenga Research was a lean start-up. Bootstrapping is a situation in which the entrepreneur chooses to fund the venture with his/her own personal resources. It involves self-funding (family and friends), tight monitoring of expenses, and maintaining control of ownership and management (Winborg & Landstrom, 2001; Perry, Chandler, Yao, & Wolff, 2011; Winborg, 2015). The lean start-up approach favors experimentation over elaborate planning, customer feedback over intuition, and iterative design over traditional big upfront research and development. This case study requires the reader to consider a number of the basic challenges facing all entrepreneurs and new ventures. Is the concept marketable? Can the concept be developed and brought to market in a timely manner? Will the product generate revenue? How? When? What are the commitments of the entrepreneurs? Have they considered the major challenges to be faced? Since this venture involved gathering and developing research information and then creating an online platform, Keenga Research faced significant concept-to-market challenges. The research method used in this case study is first-person participant observation and interviews. One of the authors was a team member, so the contextual details come from direct observation and first-hand knowledge. This method of research is often used in anthropology, sociology, and social psychology, where an investigator studies the group by sharing in its activities. The other author provided an objective and conceptual perspective for analyzing
International Nuclear Information System (INIS)
Margalet, S. D.; Cooper, W. A.; Volpe, F.; Castejon, F.
2005-01-01
In magnetic confinement devices, the inhomogeneity of the confining magnetic field along a magnetic field line generates the trapping of particles within local magnetic wells. One of the consequences of the trapped particles is the generation of a current, known as the bootstrap current (BC), whose direction depends on the nature of the magnetic trapping. The BC provides an extra contribution to the poloidal component of the confining magnetic field. The variation of the poloidal component alters the winding of the magnetic field lines around the flux surfaces, quantified by the rotational transform. When the rotational transform reaches low rational values, it can trigger the generation of ideal MHD instabilities. Therefore, the BC may be responsible for the destabilisation of the configuration [1]. Having established the potentially dangerous implications of the BC, principally in reactor prototypes, a method to compensate its harmful effects is proposed. It consists of modelling the current driven by externally launched ECWs within the plasma to compensate the effects of the BC. This method is flexible enough to allow the identification of the appropriate scenarios in which to generate the required CD, depending on the nature of the confining magnetic field and the specific plasma parameters of the configuration. Both the BC and the CD calculations are included in a self-consistent scheme which leads to the computation of a stable BC+CD-consistent MHD equilibrium. This procedure is applied in this paper to simulate the CD required to stabilise QAS and QHS reactor prototypes. The estimation of the input power required and the effect of the driven current on the final equilibrium of the system is performed for several relevant scenarios and wave polarisations, providing various options of stabilising driven currents. (Author)
On the definition of Pfirsch-Schlueter and bootstrap currents in toroidal systems
International Nuclear Information System (INIS)
Coronado, M.; Wobig, H.
1992-01-01
In the plasma physics literature there appear two different definitions of the Pfirsch-Schlueter current. One of them is predominantly used in equilibrium calculations and satisfies the condition I_T = 0. The other definition appears commonly in transport calculations and requires that the surface average of the dot product of the Pfirsch-Schlueter current density with the magnetic field vanish, i.e., ⟨J_PS · B⟩ = 0. The difference between the definitions is a surface function. Within the framework of the moment equation approach, the total parallel current is completely determined through a surface average of Ohm's law; thus different definitions of the Pfirsch-Schlueter current imply different expressions for the bootstrap current. Understanding the different implications of these two definitions is of particular importance when designing toroidal devices with minimized Pfirsch-Schlueter current or studying tokamaks with optimized bootstrap current. In this paper the definitions of the Pfirsch-Schlueter and bootstrap currents, as well as the expressions for the corresponding Pfirsch-Schlueter diffusion flux, are analyzed and discussed for the case of axisymmetric and nonaxisymmetric plasmas. Although in cases like a current-free stellarator or a large-aspect-ratio tokamak both definitions are equivalent, they are in general different, and in order to avoid misunderstandings it is therefore important to use only one. The most appropriate definition is I_T = 0. In this paper the equations for determining the bootstrap current within the framework of the fluid equations are also analyzed
Collisionality dependence of Mercier stability in LHD equilibria with bootstrap currents
International Nuclear Information System (INIS)
Ichiguchi, Katsuji.
1997-02-01
The Mercier stability of plasmas carrying bootstrap currents with different plasma collisionality is studied in the Large Helical Device (LHD). In the LHD configuration, the direction of the bootstrap current depends on the collisionality of the plasma through the change in the sign of the geometrical factor. When the beta value is raised by increasing the density of the plasma with a fixed low temperature, the plasma becomes more collisional and the collisionality approaches the plateau regime. In this case, the bootstrap current can flow in the direction that decreases the rotational transform. Then, the large Shafranov shift enhances the magnetic well and the magnetic shear, and therefore the Mercier stability is improved. On the other hand, when the beta value is raised by increasing the temperature of the plasma with a fixed low density, the plasma collisionality is reduced, entering the 1/ν collisionality regime, and the bootstrap current flows so as to increase the rotational transform, which is unfavorable for the Mercier stability. Hence, the beta value should be raised by increasing the density rather than the temperature in order to obtain a high beta plasma. (author)
Maximum non-extensive entropy block bootstrap for non-stationary processes
Czech Academy of Sciences Publication Activity Database
Bergamelli, M.; Novotný, Jan; Urga, G.
2015-01-01
Roč. 91, 1/2 (2015), s. 115-139 ISSN 0001-771X R&D Projects: GA ČR(CZ) GA14-27047S Institutional support: RVO:67985998 Keywords : maximum entropy * bootstrap * Monte Carlo simulations Subject RIV: AH - Economics
Dekkers, G.J.M.; Nelissen, J.H.M.
2001-01-01
We look at the contribution of various income components to income inequality and the changes in this in Belgium. Starting from the Shorrocks decomposition, we apply bootstrapping to construct confidence intervals for both the annual decomposition and the changes over time. It appears that the
Barrera, Begoña Barrios; Figalli, Alessio; Valdinoci, Enrico
2012-01-01
We prove that $C^{1,\\alpha}$ $s$-minimal surfaces are automatically $C^\\infty$. For this, we develop a new bootstrap regularity theory for solutions of integro-differential equations of very general type, which we believe is of independent interest.
Abrupt change in mean using block bootstrap and avoiding variance estimation
Czech Academy of Sciences Publication Activity Database
Peštová, Barbora; Pešta, M.
2018-01-01
Roč. 33, č. 1 (2018), s. 413-441 ISSN 0943-4062 Grant - others:GA ČR(CZ) GJ15-04774Y Institutional support: RVO:67985807 Keywords : Block bootstrap * Change in mean * Change point * Hypothesis testing * Ratio type statistics * Robustness Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.434, year: 2016
Low mass diffractive dissociation in a simple t-dependent dual bootstrap model
International Nuclear Information System (INIS)
Bishari, M.
1978-08-01
The smallness of inelastic diffractive dissociation is explicitly demonstrated, in the framework of the '1/N dual unitarization' scheme, by incorporating a Deck type mechanism with the crucial planar bootstrap equation. Although both inelastic and elastic pomeron couplings are of the same order in 1/N, the origin for their smallness, however, is not identical. (author)
Aslan, Alper; Destek, Mehmet Akif; Okumus, Ilyas
2018-01-01
This study aims to examine the validity of the inverted U-shaped Environmental Kuznets Curve by investigating the relationship between economic growth and environmental pollution for the period from 1966 to 2013 in the USA. Previous studies were based on the assumption of parameter stability, i.e., that the estimated parameters do not change over the full sample. This study uses a bootstrap rolling window estimation method to detect possible changes in causal relations and also to obtain the parameters for sub-sample periods. The results show that the parameter of economic growth has an increasing trend in the 1982-1996 sub-sample periods and a decreasing trend in the 1996-2013 sub-sample periods. Therefore, the existence of an inverted U-shaped Environmental Kuznets Curve is confirmed in the USA.
Preacher, Kristopher J; Hayes, Andrew F
2008-08-01
Hypotheses involving mediation are common in the behavioral sciences. Mediation exists when a predictor affects a dependent variable indirectly through at least one intervening variable, or mediator. Methods to assess mediation involving multiple simultaneous mediators have received little attention in the methodological literature despite a clear need. We provide an overview of simple and multiple mediation and explore three approaches that can be used to investigate indirect processes, as well as methods for contrasting two or more mediators within a single model. We present an illustrative example, assessing and contrasting potential mediators of the relationship between the helpfulness of socialization agents and job satisfaction. We also provide SAS and SPSS macros, as well as Mplus and LISREL syntax, to facilitate the use of these methods in applications.
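The bootstrap approach popularized for such indirect effects resamples cases with replacement, re-estimates the mediation paths each time, and reads a confidence interval off the percentiles of the resampled indirect effects. A single-mediator sketch follows (the authors' macros handle multiple simultaneous mediators and contrasts; the function names and defaults here are our own assumptions):

```python
import numpy as np

def indirect_effect(x, m, y):
    # a: effect of X on M;  b: effect of M on Y controlling for X
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b            # indirect effect of X on Y through M

def percentile_boot_ci(x, m, y, B=5000, alpha=0.05, seed=0):
    # Percentile bootstrap confidence interval for the indirect effect
    rng = np.random.default_rng(seed)
    n = len(x)
    boots = np.empty(B)
    for i in range(B):
        idx = rng.integers(0, n, n)     # resample cases with replacement
        boots[i] = indirect_effect(x[idx], m[idx], y[idx])
    lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return indirect_effect(x, m, y), (lo, hi)
```

If the interval excludes zero, the indirect effect is judged significant without assuming the normal sampling distribution that the product-of-coefficients (Sobel) test requires.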
Automatic shape model building based on principal geodesic analysis bootstrapping
DEFF Research Database (Denmark)
Dam, Erik B; Fletcher, P Thomas; Pizer, Stephen M
2008-01-01
iteration are used. Thereby, we gradually capture the shape variation in the training collection better and better. Convergence of the method is explicitly enforced. The method is evaluated on collections of artificial training shapes where the expected shape mean and modes of variation are known by design...
The Bootstrap Current and Neutral Beam Current Drive in DIII-D
International Nuclear Information System (INIS)
Politzer, P.A.
2005-01-01
Noninductive current drive is an essential part of the implementation of the DIII-D Advanced Tokamak program. For an efficient steady-state tokamak reactor, the plasma must provide close to 100% bootstrap fraction (f_bs). For noninductive operation of DIII-D, current drive by injection of energetic neutral beams [neutral beam current drive (NBCD)] is also important. DIII-D experiments have reached ∼80% bootstrap current in stationary discharges without inductive current drive. The remaining current is ∼20% NBCD. This is achieved at β_N ≈ β_p > 3, but at relatively high q_95 (∼10). In lower q_95 Advanced Tokamak plasmas, f_bs ∼ 0.6 has been reached in essentially noninductive plasmas. The phenomenology of high β_p and β_N plasmas without current control is being studied. These plasmas display a relaxation oscillation involving repetitive formation and collapse of an internal transport barrier. The frequency and severity of these events increase with increasing β, limiting the achievable average β and causing modulation of the total current as well as the pressure. Modeling of both bootstrap and NBCD currents is based on neoclassical theory. Measurements of the total bootstrap and NBCD current agree with calculations. A recent experiment based on the evolution of the transient voltage profile after an L-H transition shows that the more recent bootstrap current models accurately describe the plasma behavior. The profiles and the parametric dependences of the local neutral beam-driven current density have not yet been compared with theory
DEFF Research Database (Denmark)
Linnet, Kristian
2005-01-01
Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors
Simultaneous confidence bands for the integrated hazard function
Dudek, Anna; Gocwin, Maciej; Leskow, Jacek
2006-01-01
The construction of simultaneous confidence bands for the integrated hazard function is considered. The Nelson-Aalen estimator is used. Simultaneous confidence bands based on bootstrap methods are presented, and two methods of constructing such bands are proposed. The weird bootstrap method is used for resampling. Simulations are made to compare the actual coverage probability of the bootstrap and the asymptotic simultaneous confidence bands. It is shown that the equal-tai...
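The general recipe — estimate the cumulative hazard with Nelson-Aalen, then calibrate a simultaneous band from the bootstrap distribution of the supremum deviation — can be sketched as follows. For brevity this sketch uses ordinary (Efron) case resampling in place of the weird bootstrap the abstract refers to; names and defaults are our own assumptions.

```python
import numpy as np

def nelson_aalen(time, event, grid):
    # Cumulative hazard estimate A_hat(t) = sum_{t_i <= t} d_i / n_i
    order = np.argsort(time)
    t, e = np.asarray(time)[order], np.asarray(event)[order]
    n = len(t)
    at_risk = n - np.arange(n)          # risk set just before each ordered time
    cumhaz = np.cumsum(e / at_risk)     # censorings (event=0) add nothing
    # Step-function lookup on the requested grid
    idx = np.searchsorted(t, grid, side="right") - 1
    return np.where(idx >= 0, cumhaz[np.clip(idx, 0, None)], 0.0)

def bootstrap_band(time, event, grid, B=1000, alpha=0.05, seed=0):
    # Simultaneous band A_hat(t) ± c, with c the (1 - alpha) quantile of
    # sup_t |A*_b(t) - A_hat(t)| over bootstrap resamples of the data.
    rng = np.random.default_rng(seed)
    time, event = np.asarray(time), np.asarray(event)
    A = nelson_aalen(time, event, grid)
    n = len(time)
    sups = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)
        sups[b] = np.max(np.abs(nelson_aalen(time[idx], event[idx], grid) - A))
    c = np.quantile(sups, 1 - alpha)
    return A, A - c, A + c
```

Unlike pointwise intervals, the band covers the whole hazard curve over the grid simultaneously with the stated confidence.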
Varouchakis, Emmanouil; Hristopulos, Dionissios
2015-04-01
Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross validation analysis. For de-trended data the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements in 15 monitoring locations for the time period 1981 to 2010. The space-time trend function is approximated using a physical law that governs the groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs
Introduction to probability and statistics for ecosystem managers simulation and resampling
Haas, Timothy C
2013-01-01
Explores computer-intensive probability and statistics for ecosystem management decision making Simulation is an accessible way to explain probability and stochastic model behavior to beginners. This book introduces probability and statistics to future and practicing ecosystem managers by providing a comprehensive treatment of these two areas. The author presents a self-contained introduction for individuals involved in monitoring, assessing, and managing ecosystems and features intuitive, simulation-based explanations of probabilistic and statistical concepts. Mathematical programming details are provided for estimating ecosystem model parameters with Minimum Distance, a robust and computer-intensive method. The majority of examples illustrate how probability and statistics can be applied to ecosystem management challenges. There are over 50 exercises - making this book suitable for a lecture course in a natural resource and/or wildlife management department, or as the main text in a program of self-stud...
Educating Bootstrapping : Financial decision making processes in Create Business Incubator
Nosov, Igor; Hamraev, Rustam
2009-01-01
Recently, small businesses have attracted much attention from scholars and businessmen, since the significance of these businesses is considered essential in a rapidly changing business environment from the perspective of wealth and job creation. At the same time, it is well known that most infant entrepreneurs are constrained by a shortage of financial resources for the development and growth of their business. Some entrepreneurs meet the need for resources by applying particular methods of financia...
International Nuclear Information System (INIS)
Saarelma, S.; Kurki-Suonio, T.; Guenter, S.; Zehrfeld, H.-P.
2000-01-01
An ELMy ASDEX Upgrade plasma equilibrium is reconstructed taking into account the bootstrap current. The peeling mode stability of the equilibrium is numerically analysed using the GATO [1] code, and it is found that the bootstrap current can drive the plasma peeling mode unstable. A high-n ballooning mode stability analysis of the equilibria revealed that, while destabilizing the peeling modes, the bootstrap current has a stabilizing effect on the ballooning modes. A combination of these two instabilities is a possible explanation for the type I ELM phenomenon. A triangularity scan showed that increasing triangularity stabilizes the peeling modes and can produce ELM-free periods observed in the experiments. (author)
Dunham, Kylee; Grand, James B.
2016-01-01
We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state-space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors, using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut
2017-05-01
This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
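A simplified version of this approach — draw a large parametric bootstrap sample of the ratio from the fitted bivariate normal, then take a nonparametric order-statistic upper tolerance limit of that sample — can be sketched as follows. This omits the bootstrap calibration step the paper adds to improve accuracy, and all names and defaults are our own assumptions.

```python
import math
import numpy as np

def binom_cdf(k, n, p):
    # P(Bin(n, p) <= k), summed in log space to avoid overflow
    total = 0.0
    for i in range(k + 1):
        log_term = (math.lgamma(n + 1) - math.lgamma(i + 1)
                    - math.lgamma(n - i + 1)
                    + i * math.log(p) + (n - i) * math.log1p(-p))
        total += math.exp(log_term)
    return min(total, 1.0)

def ratio_upper_tolerance(x, y, p=0.95, gamma=0.95, N=2000, seed=0):
    # Parametric bootstrap: simulate N ratios from the fitted bivariate
    # normal, then take the nonparametric (p, gamma) upper tolerance limit.
    rng = np.random.default_rng(seed)
    mu = np.array([np.mean(x), np.mean(y)])
    cov = np.cov(x, y)
    sample = rng.multivariate_normal(mu, cov, size=N)
    r = np.sort(sample[:, 0] / sample[:, 1])
    # smallest order-statistic rank k with P(Bin(N, p) <= k - 1) >= gamma
    k = int(math.ceil(N * p))
    while binom_cdf(k - 1, N, p) < gamma:
        k += 1
    return r[k - 1]
```

The returned value is claimed to exceed at least a proportion p of the ratio distribution with confidence gamma, up to the error in the fitted parameters that the paper's calibration step corrects.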
Stationary, high bootstrap fraction plasmas in DIII-D without inductive current control
International Nuclear Information System (INIS)
Politzer, P.A.; Hyatt, A.W.; Luce, T.C.; Prater, R.; Turnbull, A.D.; Ferron, J.R.; Greenfield, C.M.; La Haye, R.J.; Petty, C.C.; Perkins, F.W.; Brennan, D.P.; Lazarus, E.A.; Jayakumar, J.; Wade, M.R.
2005-01-01
We have initiated an experimental program to address some of the questions associated with operation of a tokamak with a high bootstrap current fraction under high performance conditions, without assistance from a transformer. In these discharges, stationary (or slowly improving) conditions are maintained for > 3.7 s at β_N ∼ β_p ≤ 3.3. The achievable current and pressure are limited by a relaxation oscillation involving growth and collapse of an ITB at ρ ≥ 0.6. The pressure gradually increases and the current profile broadens throughout the discharge. Eventually the plasma reaches a more stable, high confinement (H_89P ∼ 3) state. Characteristically these plasmas have 65%-85% bootstrap current, 15%-30% NBCD, and 0%-10% ECCD. (author)
Considerations on ECFH current drive and bootstrap current for W VII-X
International Nuclear Information System (INIS)
Gasparino, U.; Maassberg, H.
1988-01-01
Low shear is characteristic of all proposed Wendelstein VII-X configurations. To avoid low harmonic rational numbers within the rotational transform profile, the current contribution to the rotational transform, Δt_a ∝ I/B, should typically be less than 10%. This leads to an upper limit of 50 kA (at B = 2.5 T) for the tolerable net toroidal current. A considerable net toroidal current (bootstrap current) is expected from neoclassical theory in the plateau and low-collisionality regimes. Both the radial transport and the bootstrap current densities depend sensitively on the magnetic configuration (see A. Montvai, this workshop). In the case of an axisymmetric configuration with dimensions and plasma parameters as predicted for the high-β regime of W VII-X (⟨β⟩ ∼ 5%), this current (∼ 0.5-1 MA) would dominate the rotational transform profile. This requires a reduction of the magnitude of the bootstrap current to some % of the value of an equivalent tokamak. This reduction must act on the current profile itself and should not be merely obtained by having two channels of currents of different sign at different radii. Due to the possibility of controlling absorbed power and driven current profiles, electron cyclotron waves are a natural candidate for current profile control. Linear calculations show the possibility of driving a counteracting current with a profile similar to the bootstrap one. For ⟨β⟩ ∼ 5% conditions, however, the optimum current drive efficiency (η ∼ 10 kA per MW) is far too low to make ECF current drive suitable
Stationary high confinement plasmas with large bootstrap current fraction in JT-60U
International Nuclear Information System (INIS)
Sakamoto, Y.; Fujita, T.; Ide, S.; Isayama, A.; Takechi, M.; Suzuki, T.; Takenaga, H.; Oyama, N.; Kamada, Y.
2005-01-01
This paper reports the results of progress in stationary discharges with a large bootstrap current fraction in JT-60U towards steady-state tokamak operation. In the weak shear plasma regime, high-β_p ELMy H-mode discharges have been optimized under nearly full non-inductive current drive conditions by the large bootstrap current fraction (f_BS ∼ 45%) and the beam driven current fraction (f_BD ∼ 50%), which was sustained for 5.8 s in the stationary condition. This duration corresponds to ∼26τ_E and ∼2.8τ_R, and was limited by the pulse length of the negative-ion-based neutral beams. A high confinement enhancement factor H_89 ∼ 2.2 (HH_98y2 ∼ 1.0) was obtained and the profiles of current and pressure reached the stationary condition. In the reversed shear plasma regime, a large bootstrap current fraction (f_BS ∼ 75%) has been sustained for 7.4 s under nearly full non-inductive current drive conditions. This duration corresponds to ∼16τ_E and ∼2.7τ_R. A high confinement enhancement factor H_89 ∼ 3.0 (HH_98y2 ∼ 1.7) was also sustained, and the profiles of current and pressure reached the stationary condition. The large bootstrap current and the off-axis beam driven current sustained this reversed q profile. This duration was limited only by the duration of the neutral beam injection
Syntactic bootstrapping in children with Down syndrome: the impact of bilingualism.
Cleave, Patricia L; Kay-Raining Bird, Elizabeth; Trudeau, Natacha; Sutton, Ann
2014-01-01
The purpose of the study was to add to our knowledge of bilingual learning in children with Down syndrome (DS) using a syntactic bootstrapping task. Four groups of children and youth matched on non-verbal mental age participated. There were 14 bilingual participants with DS (DS-B, mean age 12;5), 12 monolingual participants with DS (DS-M, mean age 10;10), 9 bilingual typically developing children (TD-B, mean age 4;1) and 11 monolingual typically developing children (TD-M, mean age 4;1). The participants completed a computerized syntactic bootstrapping task involving unfamiliar nouns and verbs. The syntactic cues employed were "a" for the nouns and "ing" for the verbs. Performance was better on nouns than verbs. There was also a main effect for group. Follow-up t-tests revealed that there were no significant differences between the TD-M and TD-B groups or between the DS-M and DS-B groups. However, the DS-M group performed more poorly than the TD-M group, with a large effect size. Analyses at the individual level revealed a similar pattern of results. There was evidence that Down syndrome impacted performance; there was no evidence that bilingualism negatively affected the syntactic bootstrapping skills of individuals with DS. These results from a dynamic language task are consistent with those of previous studies that used static or product measures. Thus, the results are consistent with the position that parents should be supported in their decision to provide bilingual input to their children with DS. Readers of this article will identify (1) research evidence regarding bilingual development in children with Down syndrome and (2) syntactic bootstrapping skills in monolingual and bilingual children who are typically developing or who have Down syndrome. Copyright © 2014 Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Oikawa, T.; Suzuki, T.; Isayama, A.; Hayashi, N.; Fujita, T.; Naito, O.; Tuda, T.; Kurita, G.
2005-01-01
Evolution of the current density profile associated with magnetic island formation in a neoclassical tearing mode plasma is measured for the first time in JT-60U by using a motional Stark effect diagnostic. As the island grows, the current density profile turns flat at the radial region of the island and a hollow structure appears at the rational surface. As the island shrinks the deformed region becomes narrower and finally diminishes after the disappearance of the island. In a quiescent plasma without magnetohydrodynamic instabilities, on the other hand, no deformation is observed. The observed deformation in the current density profile associated with the tearing mode is reproduced in a time dependent transport simulation assuming the reduction of the bootstrap current in the radial region of the island. Comparison of the measurement with a calculated steady-state solution also shows that the reduction and recovery of the bootstrap current at the island explains the temporal behaviours of the current density and safety factor profiles. From the experimental observation and simulations, we reach the conclusion that the bootstrap current decreases within the island O-point
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Stable equilibria for bootstrap-current-driven low aspect ratio tokamaks
International Nuclear Information System (INIS)
Miller, R.L.; Lin-Liu, Y.R.; Turnbull, A.D.; Chan, V.S.; Pearlstein, L.D.; Sauter, O.; Villard, L.
1997-01-01
Low aspect ratio tokamaks (LATs) can potentially provide a high ratio of plasma pressure to magnetic pressure β and high plasma current I at a modest size. This opens up the possibility of a high-power density compact fusion power plant. For the concept to be economically feasible, bootstrap current must be a major component of the plasma current, which requires operating at high βp. A high value of the Troyon factor βN and strong shaping are required to allow simultaneous operation at high β and high bootstrap fraction. Ideal magnetohydrodynamic stability of a range of equilibria at aspect ratio 1.4 is systematically explored by varying the pressure profile and shape. The pressure and current profiles are constrained in such a way as to assure complete bootstrap current alignment. Both βN and β are defined in terms of the vacuum toroidal field. Equilibria with βN ≥ 8 and β ∼ 35%–55% exist that are stable to n=∞ ballooning modes. The highest β case is shown to be stable to n=0,1,2,3 kink modes with a conducting wall. Copyright 1997 American Institute of Physics.
Energy Technology Data Exchange (ETDEWEB)
Schmitt, J. C.; Talmadge, J. N.; Anderson, D. T. [Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Hanson, J. D. [Department of Physics, Auburn University, Auburn, Alabama 36849 (United States)
2014-09-15
The bootstrap current for three electron cyclotron resonance heated plasma scenarios in a quasihelically symmetric stellarator (the Helically Symmetric Experiment) is analyzed and compared to predictions of the neoclassical transport code PENTA. The three conditions correspond to 50 kW input power with a resonance that is off-axis, 50 kW on-axis heating, and 100 kW on-axis heating. When the heating location was moved from off-axis to on-axis with 50 kW heating power, the stored energy and the extrapolated steady-state current were both observed to increase. When the on-axis heating power was increased from 50 kW to 100 kW, the stored energy continued to increase while the bootstrap current slightly decreased. This trend is qualitatively in agreement with the calculations, which indicate that a large positive electric field for the 100 kW case was driving the current negative in a small region close to the magnetic axis, accounting for the decrease in the total integrated current. This trend in the calculations is only observed when momentum conservation between particle species is included. Without momentum conservation, the calculated bootstrap current increases monotonically. We show that the magnitude of the bootstrap current as calculated by PENTA agrees better with the experiment when momentum conservation between plasma species is included in the calculation. The total current was observed in all cases to flow in a direction to unwind the transform, unlike in a tokamak, in which the bootstrap current adds to the transform. The 3-D inductive response of the plasma is simulated to predict the evolution of the current profile during the discharge. The 3-D equilibrium reconstruction code V3FIT is used to reconstruct profiles of the plasma pressure and current constrained by measurements with a set of magnetic diagnostics. The reconstructed profiles are consistent with the measured plasma pressure profile and the simulated current profile when the
arXiv The S-matrix Bootstrap II: Two Dimensional Amplitudes
Paulos, Miguel F.; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-22
We consider constraints on the S-matrix of any gapped, Lorentz invariant quantum field theory in 1 + 1 dimensions due to crossing symmetry and unitarity. In this way we establish rigorous bounds on the cubic couplings of a given theory with a fixed mass spectrum. In special cases we identify interesting integrable theories saturating these bounds. Our analytic bounds match precisely with numerical bounds obtained in a companion paper where we consider massive QFT in an AdS box and study boundary correlators using the technology of the conformal bootstrap.
Validation of neoclassical bootstrap current models in the edge of an H-mode plasma.
Wade, M R; Murakami, M; Politzer, P A
2004-06-11
Analysis of the parallel electric field E(parallel) evolution following an L-H transition in the DIII-D tokamak indicates the generation of a large negative pulse near the edge which propagates inward, indicative of the generation of a noninductive edge current. Modeling indicates that the observed E(parallel) evolution is consistent with a narrow current density peak generated in the plasma edge. Very good quantitative agreement is found between the measured E(parallel) evolution and that expected from neoclassical theory predictions of the bootstrap current.
Progress Toward Steady State Tokamak Operation Exploiting the high bootstrap current fraction regime
Ren, Q.
2015-11-01
Recent DIII-D experiments have advanced the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long pulse steady-state tokamak operation. Fully noninductive plasmas with extremely high values of the poloidal beta, βp ≥ 4, have been sustained at βT ≥ 2% for long durations with excellent energy confinement quality (H98y,2 ≥ 1.5) and internal transport barriers (ITBs) generated at large minor radius (≥ 0.6) in all channels (Te, Ti, ne, VTf). A large bootstrap fraction (fBS ~ 80%) has been obtained at high βp. ITBs have been shown to be compatible with steady state operation. Because of the unusually large ITB radius, normalized pressure is not limited to low βN values by internal ITB-driven modes. βN up to ~4.3 has been obtained by optimizing the plasma-wall distance. The scenario is robust against several variations, including replacing some on-axis with off-axis neutral beam injection (NBI), adding electron cyclotron (EC) heating, and reducing the NBI torque by a factor of 2. This latter observation is particularly promising for extension of the scenario to EAST, where maximum power is obtained with balanced NBI injection, and to a reactor, expected to have low rotation. However, modeling of this regime has provided new challenges to state-of-the-art modeling capabilities: quasilinear models can dramatically underpredict the electron transport, and the Sauter bootstrap current can be insufficient. The analysis shows that first-principles NEO is in good agreement with experiments for the bootstrap current calculation and that ETG modes with a larger saturated amplitude or EM modes may provide the missing electron transport. Work supported in part by the US DOE under DE-FC02-04ER54698, DE-AC52-07NA
Selecting the Most Economic Project under Uncertainty Using Bootstrap Technique and Fuzzy Simulation
Directory of Open Access Journals (Sweden)
Kamran Shahanaghi
2012-01-01
This article relaxes the usual assumption that the membership function of a fuzzy set is pre-determined, and proposes a hybrid technique for selecting the most economic project among alternative projects under fuzzy interest rates. Net present worth (NPW) serves as the economic indicator. The article challenges the assumption that large samples are available for determining the membership function, and shows that some alternative techniques may be less accurate. To give a robust solution, bootstrapping combined with fuzzy simulation is suggested, and a numerical example is given and analyzed.
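The selection step described in this abstract can be illustrated with a minimal bootstrap sketch. This is not the authors' hybrid fuzzy-simulation method; it only shows the resampling half of the idea under stated assumptions: treat a small sample of observed interest rates as the empirical distribution, resample it to obtain an NPW distribution per project, and pick the project with the higher mean NPW. All cash flows and rates below are hypothetical.

```python
import random

def npw(cash_flows, rate):
    # Net present worth: discount each year's cash flow back to t = 0.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def bootstrap_npw(cash_flows, rate_sample, n_boot=2000, seed=42):
    # Resample the observed interest rates with replacement and recompute
    # NPW each time, giving an empirical NPW distribution for the project.
    rng = random.Random(seed)
    draws = sorted(npw(cash_flows, rng.choice(rate_sample))
                   for _ in range(n_boot))
    mean = sum(draws) / n_boot
    lo, hi = draws[int(0.025 * n_boot)], draws[int(0.975 * n_boot)]
    return mean, (lo, hi)

# Hypothetical data: initial outlay of 100 followed by five annual returns,
# and a small sample of historically observed interest rates.
project_a = [-100, 30, 30, 30, 30, 30]
project_b = [-100, 10, 20, 30, 40, 55]
rates = [0.04, 0.05, 0.06, 0.07, 0.08]

mean_a, ci_a = bootstrap_npw(project_a, rates)
mean_b, ci_b = bootstrap_npw(project_b, rates)
best = "A" if mean_a > mean_b else "B"
```

The percentile interval `(lo, hi)` conveys the uncertainty that a single point estimate of NPW would hide, which is the motivation for resampling rather than fixing one interest rate.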
Performance of mutual equity funds in Brazil – A bootstrap analysis
Directory of Open Access Journals (Sweden)
Marco Antonio Laes
2014-09-01
This article reports a study on the performance of mutual equity funds in Brazil from January 2002 to August 2012. For the analyses, Carhart's four-factor model is used as the benchmark for performance, and bootstrap procedures are applied to separate skill from luck. The results show that the returns of the best performers are due more to luck than to the skill of their managers. For the bottom-ranked funds, on the contrary, there is statistical evidence that their poor performance is caused mainly by bad management rather than by bad luck. It is also shown that the largest funds perform better than small or middle-sized funds.
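The skill-versus-luck logic in the abstract above can be sketched as a bootstrap under the null of zero alpha: refit the factor model, rebuild pseudo-returns with alpha forced to zero, and see how often luck alone reproduces the observed alpha. The sketch uses a single market factor rather than Carhart's four, and synthetic data; `luck_pvalue`, the seed values, and the 120-month sample are illustrative assumptions, not the paper's procedure.

```python
import random

def ols_alpha(returns, factor):
    # One-factor OLS: returns = alpha + beta * factor + eps.
    n = len(returns)
    mx = sum(factor) / n
    my = sum(returns) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(factor, returns))
            / sum((x - mx) ** 2 for x in factor))
    alpha = my - beta * mx
    resid = [y - alpha - beta * x for x, y in zip(factor, returns)]
    return alpha, beta, resid

def luck_pvalue(returns, factor, n_boot=1000, seed=7):
    # Bootstrap under the null of zero skill: rebuild pseudo-returns with
    # alpha set to 0, resampling residuals with replacement, and count how
    # often luck alone yields an alpha at least as large as the observed one.
    alpha_hat, beta, resid = ols_alpha(returns, factor)
    rng = random.Random(seed)
    count = 0
    for _ in range(n_boot):
        pseudo = [beta * x + rng.choice(resid) for x in factor]
        a_b, _, _ = ols_alpha(pseudo, factor)
        if a_b >= alpha_hat:
            count += 1
    return alpha_hat, count / n_boot

# Synthetic monthly data: a market factor and a fund with genuine alpha.
rng = random.Random(0)
factor = [rng.gauss(0.01, 0.04) for _ in range(120)]
fund = [0.004 + 1.1 * x + rng.gauss(0, 0.01) for x in factor]
alpha_hat, p = luck_pvalue(fund, factor)
```

A small `p` says the estimated alpha is unlikely to arise from luck alone; applied fund by fund, this is the kind of ranking exercise the abstract describes.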
A condition for small bootstrap current in three-dimensional toroidal configurations
Energy Technology Data Exchange (ETDEWEB)
Mikhailov, M. I., E-mail: mikhaylov-mi@nrcki.ru [National Russian Research Center Kurchatov Institute (Russian Federation); Nührenberg, J.; Zille, R. [Max-Planck-Institut für Plasmaphysik (Germany)
2016-11-15
It is shown that, if the maximum of the magnetic field strength on a magnetic surface in a three-dimensional magnetic confinement configuration with stellarator symmetry constitutes a line that is orthogonal to the field lines and crosses the symmetry line, then the bootstrap current density is smaller compared to that in quasi-axisymmetric (qa) [J. Nührenberg et al., in Proc. of Joint Varenna−Lausanne Int. Workshop on Theory of Fusion Plasmas, Varenna, 1994, p. 3] and quasi-helically (qh) symmetric [J. Nührenberg and R. Zille, Phys. Lett. A 129, 113 (1988)] configurations.
Benchmarking of a T-wave alternans detection method based on empirical mode decomposition.
Blanco-Velasco, Manuel; Goya-Esteban, Rebeca; Cruz-Roldán, Fernando; García-Alberola, Arcadi; Rojo-Álvarez, José Luis
2017-07-01
T-wave alternans (TWA) is a fluctuation of the ST-T complex occurring on an every-other-beat basis in the surface electrocardiogram (ECG). It has been shown to be an informative risk stratifier for sudden cardiac death, though the lack of a gold standard to benchmark detection methods has promoted the use of synthetic signals. This work proposes a novel signal model to study the performance of TWA detection. Additionally, the methodological validation of a denoising technique based on empirical mode decomposition (EMD), which is used here along with the spectral method (SM), is also tackled. The proposed test bed system is based on the following guidelines: (1) use of open source databases to enable experimental replication; (2) use of real ECG signals and physiological noise; (3) inclusion of randomized TWA episodes. Both sensitivity (Se) and specificity (Sp) are separately analyzed. Also, a nonparametric hypothesis test based on bootstrap resampling is used to determine whether the presence of the EMD block actually improves the performance. The results show an outstanding specificity when the EMD block is used, even in very noisy conditions (0.96 compared to 0.72 for SNR = 8 dB), which is always superior to that of the conventional SM alone. Regarding the sensitivity, the EMD method also outperforms in noisy conditions (0.57 compared to 0.46 for SNR = 8 dB), while it decreases in noiseless conditions. The proposed test setting, designed to analyze the performance, guarantees that the actual physiological variability of the cardiac system is reproduced. The use of the EMD-based block in a noisy environment enables the identification of most patients with fatal arrhythmias. Copyright © 2017 Elsevier B.V. All rights reserved.
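A bootstrap hypothesis test of the kind mentioned above (does adding a denoising block improve a detection rate?) can be sketched as a paired resampling of cases: resample the same records for both detectors and check how often the performance difference vanishes. The 96/72 success counts below are hypothetical, chosen only to echo the specificity figures quoted in the abstract, and the pairing of outcomes is an illustrative assumption.

```python
import random

def paired_bootstrap_pvalue(hits_a, hits_b, n_boot=2000, seed=1):
    # Paired bootstrap test for a difference in detection rates.
    # hits_a / hits_b are 0/1 outcomes of two detectors on the SAME cases;
    # we resample case indices with replacement and count how often the
    # resampled rate difference drops to zero or below.
    assert len(hits_a) == len(hits_b)
    n = len(hits_a)
    rng = random.Random(seed)
    obs = (sum(hits_a) - sum(hits_b)) / n
    at_or_below_zero = 0
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        diff = sum(hits_a[i] - hits_b[i] for i in idx) / n
        if diff <= 0:
            at_or_below_zero += 1
    return obs, at_or_below_zero / n_boot

# Hypothetical outcomes on 100 noisy records: detector A (with denoising)
# succeeds on 96, detector B (without) on 72, with overlapping successes.
hits_a = [1] * 96 + [0] * 4
hits_b = [1] * 72 + [0] * 28
obs, p = paired_bootstrap_pvalue(hits_a, hits_b)
```

Resampling cases rather than detectors preserves the within-case correlation between the two methods, which is what makes the paired design more powerful than comparing two independent proportions.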
Koçak, Emrah; Şarkgüneşi, Aykut
2018-01-01
The pollution haven hypothesis (PHH), defined as foreign direct investment inducing a rise in the pollution level of the hosting country, has lately been a subject of discussion in the field of economics. Within the scope of this discussion, this study aims to examine the potential impact of foreign direct investments on CO2 emissions in Turkey over the 1974-2013 period using the environmental Kuznets curve (EKC) model. For this purpose, the Maki (Econ Model 29(5):2011-2015, 2012) structural break cointegration test, the Stock and Watson (Econometrica 61:783-820, 1993) dynamic ordinary least squares (DOLS) estimator, and the Hacker and Hatemi-J (J Econ Stud 39(2):144-160, 2012) bootstrap test for causality are used. The results indicate the existence of a long-term equilibrium relationship between FDI, economic growth, energy usage, and CO2 emissions. Within this relationship, in Turkey, (1) the potential impact of FDI on CO2 emissions is positive, showing that the PHH is valid in Turkey. (2) Moreover, this is not a one-way relationship; changes in CO2 emissions also affect FDI inflows. (3) The results also provide evidence for the existence of the EKC hypothesis in Turkey. Within the frame of these findings, the study draws several policy conclusions and presents various suggestions.
Energy Technology Data Exchange (ETDEWEB)
Andrade, Maria Celia Ramos; Ludwig, Gerson Otto [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Lab. Associado de Plasma]. E-mail: mcr@plasma.inpe.br
2004-07-01
Different bootstrap current formulations are implemented in a self-consistent equilibrium calculation obtained from a direct variational technique in fixed boundary tokamak plasmas. The total plasma current profile is supposed to have contributions of the diamagnetic, Pfirsch-Schlueter, and the neoclassical Ohmic and bootstrap currents. The Ohmic component is calculated in terms of the neoclassical conductivity, compared here among different expressions, and the loop voltage determined consistently in order to give the prescribed value of the total plasma current. A comparison among several bootstrap current models for different viscosity coefficient calculations and distinct forms for the Coulomb collision operator is performed for a variety of plasma parameters of the small aspect ratio tokamak ETE (Experimento Tokamak Esferico) at the Associated Plasma Laboratory of INPE, in Brazil. We have performed this comparison for the ETE tokamak so that the differences among all the models reported here, mainly regarding plasma collisionality, can be better illustrated. The dependence of the bootstrap current ratio upon some plasma parameters in the frame of the self-consistent calculation is also analysed. We emphasize in this paper what we call the Hirshman-Sigmar/Shaing model, valid for all collisionality regimes and aspect ratios, and a fitted formulation proposed by Sauter, which has the same range of validity but is faster to compute than the previous one. The advantages or possible limitations of all these different formulations for the bootstrap current estimate are analysed throughout this work. (author)
Energy Technology Data Exchange (ETDEWEB)
Castejón, F.; Gómez-Iglesias, A.; Velasco, J. L.
2015-07-01
This work is devoted to introducing a new optimization criterion in the DAB (Distributed Asynchronous Bees) code. With this new criterion, DAB now includes the equilibrium and Mercier stability criteria, the minimization of the B×grad(B) criterion, which ensures the reduction of neoclassical transport and the improvement of fast-particle confinement, and the reduction of bootstrap current. We have started from a neoclassically optimised configuration of the Helias type and imposed the reduction of bootstrap current. The obtained configuration presents only a modest reduction of the total bootstrap current, but the local current density is reduced along the minor radius. Further investigations are being carried out to understand the reason for this modest improvement.
How efficient are Greek hospitals? A case study using a double bootstrap DEA approach.
Kounetas, Kostas; Papathanassopoulos, Fotis
2013-12-01
The purpose of this study was to measure Greek hospital performance using different input-output combinations, and to identify the factors that influence their efficiency, thus providing policy makers with valuable input for the decision-making process. Using a unique dataset, we estimated the productive efficiency of each hospital through a bootstrapped data envelopment analysis (DEA) approach. In a second stage, we explored, using a bootstrapped truncated regression, the impact of environmental factors on hospitals' technical and scale efficiency. Our results reveal that over 80% of the examined hospitals appear to have a technical efficiency lower than 0.8, while the majority appear to be scale efficient. Moreover, efficiency performance differed with the inclusion of medical examinations as an additional variable. On the other hand, bed occupancy ratio appeared to affect both technical and scale efficiency in a rather interesting way, while the adoption of advanced medical equipment and the type of hospital improve scale and technical efficiency, respectively. The findings of this study on Greek hospitals' performance are not encouraging. Furthermore, our results raise questions regarding the number of hospitals that should operate, and which type of hospital is more efficient. Finally, the results indicate the role of medical equipment in performance, confirming its misallocation in healthcare expenditure.
Impurities in a non-axisymmetric plasma: Transport and effect on bootstrap current
Energy Technology Data Exchange (ETDEWEB)
Mollén, A., E-mail: albertm@chalmers.se [Department of Applied Physics, Chalmers University of Technology, Göteborg (Sweden); Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Landreman, M. [Institute for Research in Electronics and Applied Physics, University of Maryland, College Park, Maryland 20742 (United States); Smith, H. M.; Helander, P. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Braun, S. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); German Aerospace Center, Institute of Engineering Thermodynamics, Pfaffenwaldring 38-40, D-70569 Stuttgart (Germany)
2015-11-15
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21, 042503 (2014)] which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/ν-scaling of the inter-species radial transport coefficient at low collisionality, arising due to the field term in the inter-species collision operator, and which is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We also use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in plasma effective charge Z{sub eff} of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
Modality specificity and integration in working memory: Insights from visuospatial bootstrapping.
Allen, Richard J; Havelka, Jelena; Falcon, Thomas; Evans, Sally; Darling, Stephen
2015-05-01
The question of how meaningful associations between verbal and spatial information might be utilized to facilitate working memory performance is potentially highly instructive for models of memory function. The present study explored how separable processing capacities within specialized domains might each contribute to this, by examining the disruptive impacts of simple verbal and spatial concurrent tasks on young adults' recall of visually presented digit sequences encountered either in a single location or within a meaningful spatial "keypad" configuration. The previously observed advantage for recall in the latter condition (the "visuospatial bootstrapping effect") consistently emerged across 3 experiments, indicating use of familiar spatial information in boosting verbal memory. The magnitude of this effect interacted with concurrent activity; articulatory suppression during encoding disrupted recall to a greater extent when digits were presented in single locations (Experiment 1), while spatial tapping during encoding had a larger impact on the keypad condition and abolished the visuospatial bootstrapping advantage (Experiment 2). When spatial tapping was performed during recall (Experiment 3), no task by display interaction was observed. Outcomes are discussed within the context of the multicomponent model of working memory, with a particular emphasis on cross-domain storage in the episodic buffer (Baddeley, 2000). (c) 2015 APA, all rights reserved.
Analytic bounds and emergence of AdS{sub 2} physics from the conformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Mazáč, Dalimil [Perimeter Institute for Theoretical Physics,Waterloo, ON N2L 2Y5 (Canada); Department of Physics and Astronomy, University of Waterloo,ON N2L 3G1 (Canada)
2017-04-26
We study analytically the constraints of the conformal bootstrap on the low-lying spectrum of operators in field theories with global conformal symmetry in one and two spacetime dimensions. We introduce a new class of linear functionals acting on the conformal bootstrap equation. In 1D, we use the new basis to construct extremal functionals leading to the optimal upper bound on the gap above identity in the OPE of two identical primary operators of integer or half-integer scaling dimension. We also prove an upper bound on the twist gap in 2D theories with global conformal symmetry. When the external scaling dimensions are large, our functionals provide a direct point of contact between crossing in a 1D CFT and scattering of massive particles in large AdS{sub 2}. In particular, CFT crossing can be shown to imply that appropriate OPE coefficients exhibit an exponential suppression characteristic of massive bound states, and that the 2D flat-space S-matrix should be analytic away from the real axis.
Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G
2018-03-01
Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
International Nuclear Information System (INIS)
Chu, Hsiao-Ping; Chang Tsangyao
2012-01-01
This study applies bootstrap panel Granger causality to test whether energy consumption promotes economic growth using data from the G-6 countries over the period 1971–2010. Both nuclear and oil consumption data are used in this study. Regarding the nuclear consumption-economic growth nexus, nuclear consumption causes economic growth in Japan, the UK, and the US; economic growth causes nuclear consumption in the US; nuclear consumption and economic growth show no causal relation in Canada, France, and Germany. Regarding the oil consumption-economic growth nexus, we find that there is one-way causality from economic growth to oil consumption only in the US, and that oil consumption does not Granger-cause economic growth in the G-6 countries except Germany and Japan. Our results have important policy implications for the G-6 countries within the context of economic development. - Highlights: ► Bootstrap panel Granger causality tests whether energy consumption promotes economic growth. ► Data from the G-6 countries for both nuclear and oil consumption are used. ► Results have important policy implications within the context of economic development.
A neurocomputational theory of how explicit learning bootstraps early procedural learning.
Paul, Erick J; Ashby, F Gregory
2013-01-01
It is widely accepted that human learning and memory is mediated by multiple memory systems that are each best suited to different requirements and demands. Within the domain of categorization, at least two systems are thought to facilitate learning: an explicit (declarative) system depending largely on the prefrontal cortex, and a procedural (non-declarative) system depending on the basal ganglia. Substantial evidence suggests that each system is optimally suited to learn particular categorization tasks. However, it remains unknown precisely how these systems interact to produce optimal learning and behavior. In order to investigate this issue, the present research evaluated the progression of learning through simulation of categorization tasks using COVIS, a well-known model of human category learning that includes both explicit and procedural learning systems. Specifically, the model's parameter space was thoroughly explored in procedurally learned categorization tasks across a variety of conditions and architectures to identify plausible interaction architectures. The simulation results support the hypothesis that one-way interaction between the systems occurs such that the explicit system "bootstraps" learning early on in the procedural system. Thus, the procedural system initially learns a suboptimal strategy employed by the explicit system and later refines its strategy. This bootstrapping could be from cortical-striatal projections that originate in premotor or motor regions of cortex, or possibly by the explicit system's control of motor responses through basal ganglia-mediated loops.
A symbol of uniqueness: the cluster bootstrap for the 3-loop MHV heptagon
Energy Technology Data Exchange (ETDEWEB)
Drummond, J.M. [School of Physics & Astronomy, University of Southampton, Highfield, Southampton, SO17 1BJ (United Kingdom); Theory Division, Physics Department, CERN, CH-1211 Geneva 23 (Switzerland); LAPTh, CNRS, Université de Savoie, F-74941 Annecy-le-Vieux Cedex (France); Papathanasiou, G. [LAPTh, CNRS, Université de Savoie, F-74941 Annecy-le-Vieux Cedex (France); Spradlin, M. [Department of Physics, Brown University, Providence, RI 02912 (United States)
2015-03-16
Seven-particle scattering amplitudes in planar super-Yang-Mills theory are believed to belong to a special class of generalised polylogarithm functions called heptagon functions. These are functions with physical branch cuts whose symbols may be written in terms of the 42 cluster A-coordinates on Gr(4,7). Motivated by the success of the hexagon bootstrap programme for constructing six-particle amplitudes we initiate the systematic study of the symbols of heptagon functions. We find that there is exactly one such symbol of weight six which satisfies the MHV last-entry condition and is finite in the 7∥6 collinear limit. This unique symbol is both dihedral and parity-symmetric, and remarkably its collinear limit is exactly the symbol of the three-loop six-particle MHV amplitude, although none of these properties were assumed a priori. It must therefore be the symbol of the three-loop seven-particle MHV amplitude. The simplicity of its construction suggests that the n-gon bootstrap may be surprisingly powerful for n>6.
International Nuclear Information System (INIS)
King, Martin; Rodgers, Zachary; Giger, Maryellen L.; Bardo, Dianna M. E.; Patel, Amit R.
2010-01-01
Purpose: In cardiac computed tomography (CT), important clinical indices, such as the coronary calcium score and the percentage of coronary artery stenosis, are often adversely affected by motion artifacts. As a result, the expert observer must decide whether or not to use these indices during image interpretation. Computerized methods potentially can be used to assist in these decisions. In a previous study, an artificial neural network (ANN) regression model provided assessability (image quality) indices of calcified plaque images from the software NCAT phantom that were highly agreeable with those provided by expert observers. The method predicted assessability indices based on computer-extracted features of the plaque. In the current study, the ANN-predicted assessability indices were used to identify calcified plaque images with diagnostic calcium scores (based on mass) from a physical dynamic cardiac phantom. The basic assumption was that better quality images were associated with more accurate calcium scores. Methods: A 64-channel CT scanner was used to obtain 500 calcified plaque images from a physical dynamic cardiac phantom at different heart rates, cardiac phases, and plaque locations. Two expert observers independently provided separate sets of assessability indices for each of these images. Separate sets of ANN-predicted assessability indices tailored to each observer were then generated within the framework of a bootstrap resampling scheme. For each resampling iteration, the absolute calcium score error between the calcium scores of the motion-contaminated plaque image and its corresponding stationary image served as the ground truth in terms of indicating images with diagnostic calcium scores. The performances of the ANN-predicted and observer-assigned indices in identifying images with diagnostic calcium scores were then evaluated using ROC analysis. Results: Assessability indices provided by the first observer and the corresponding ANN performed
Energy Technology Data Exchange (ETDEWEB)
King, Martin; Rodgers, Zachary; Giger, Maryellen L.; Bardo, Dianna M. E.; Patel, Amit R. [Department of Radiology, Committee on Medical Physics, University of Chicago, 5841 South Maryland Avenue, MC 2026, Chicago, Illinois 60637 (United States); Department of Diagnostic Radiology, Oregon Health and Science University, 3181 Southwest Sam Jackson Park Road, Portland, Oregon 97239 (United States); Department of Medicine, University of Chicago, 5841 South Maryland Avenue, MC 5084, Chicago, Illinois 60637 (United States)
2010-11-15
Purpose: In cardiac computed tomography (CT), important clinical indices, such as the coronary calcium score and the percentage of coronary artery stenosis, are often adversely affected by motion artifacts. As a result, the expert observer must decide whether or not to use these indices during image interpretation. Computerized methods potentially can be used to assist in these decisions. In a previous study, an artificial neural network (ANN) regression model provided assessability (image quality) indices of calcified plaque images from the software NCAT phantom that were highly agreeable with those provided by expert observers. The method predicted assessability indices based on computer-extracted features of the plaque. In the current study, the ANN-predicted assessability indices were used to identify calcified plaque images with diagnostic calcium scores (based on mass) from a physical dynamic cardiac phantom. The basic assumption was that better quality images were associated with more accurate calcium scores. Methods: A 64-channel CT scanner was used to obtain 500 calcified plaque images from a physical dynamic cardiac phantom at different heart rates, cardiac phases, and plaque locations. Two expert observers independently provided separate sets of assessability indices for each of these images. Separate sets of ANN-predicted assessability indices tailored to each observer were then generated within the framework of a bootstrap resampling scheme. For each resampling iteration, the absolute calcium score error between the calcium scores of the motion-contaminated plaque image and its corresponding stationary image served as the ground truth in terms of indicating images with diagnostic calcium scores. The performances of the ANN-predicted and observer-assigned indices in identifying images with diagnostic calcium scores were then evaluated using ROC analysis. Results: Assessability indices provided by the first observer and the corresponding ANN performed
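The bootstrap resampling scheme described in the Methods (resample the image set with replacement, recompute the ROC statistic on each draw, and summarize over iterations) can be sketched as follows. All data here are synthetic stand-ins: the binary labels play the role of "has a diagnostic calcium score" indicators and the scores play the role of assessability indices; the AUC is computed with the Mann-Whitney formulation.

```python
import numpy as np

def auc(scores, labels):
    """Mann-Whitney AUC: probability a positive case outranks a negative one."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

rng = np.random.default_rng(0)
n = 200
labels = rng.integers(0, 2, n)           # 1 = "diagnostic" image (synthetic)
scores = labels + rng.normal(0, 1.0, n)  # assessability index, higher = better quality

point = auc(scores, labels)
boot = np.array([
    auc(scores[idx], labels[idx])
    for idx in (rng.integers(0, n, n) for _ in range(2000))
    if labels[idx].min() != labels[idx].max()   # need both classes in the resample
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"AUC = {point:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```

The percentile interval over resamples is what lets the study compare observer-assigned and ANN-predicted indices without a distributional assumption on the AUC.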
International Nuclear Information System (INIS)
Porto, Paolo; Walling, Des E.; Alewell, Christine; Callegari, Giovanni; Mabit, Lionel; Mallimo, Nicola; Meusburger, Katrin; Zehringer, Markus
2014-01-01
Soil erosion and both its on-site and off-site impacts are increasingly seen as a serious environmental problem across the world. The need for an improved evidence base on soil loss and soil redistribution rates has directed attention to the use of fallout radionuclides, and particularly ¹³⁷Cs, for documenting soil redistribution rates. This approach possesses important advantages over more traditional means of documenting soil erosion and soil redistribution. However, one key limitation of the approach is the time-averaged or lumped nature of the estimated erosion rates. In nearly all cases, these will relate to the period extending from the main period of bomb fallout to the time of sampling. Increasing concern for the impact of global change, particularly that related to changing land use and climate change, has frequently directed attention to the need to document changes in soil redistribution rates within this period. Re-sampling techniques, which should be distinguished from repeat-sampling techniques, have the potential to meet this requirement. As an example, the use of a re-sampling technique to derive estimates of the mean annual net soil loss from a small (1.38 ha) forested catchment in southern Italy is reported. The catchment was originally sampled in 1998 and samples were collected from points very close to the original sampling points again in 2013. This made it possible to compare the estimate of mean annual erosion for the period 1954–1998 with that for the period 1999–2013. The availability of measurements of sediment yield from the catchment for parts of the overall period made it possible to compare the results provided by the ¹³⁷Cs re-sampling study with the estimates of sediment yield for the same periods. In order to compare the estimates of soil loss and sediment yield for the two different periods, it was necessary to establish the uncertainty associated with the individual estimates. In the absence of a generally accepted procedure
International Nuclear Information System (INIS)
Takei, Nahoko; Tsutsui, Hiroaki; Tsuji-Iio, Shunji; Shimada, Ryuichi; Nakamura, Yukiharu; Kawano, Yasunori; Ozeki, Takahisa; Tobita, Kenji; Sugihara, Masayoshi
2004-01-01
Axisymmetric MHD simulation using the Tokamak Simulation Code demonstrated detailed disruption dynamics triggered by a crash of the internal transport barrier in high bootstrap current, high β, reversed shear plasmas. Self-consistent time evolutions of the ohmic current, bootstrap current, and induced loop voltage profiles inside the disrupting plasma were shown from the viewpoint of disruption characterization and mitigation. In contrast with positive shear plasmas, a particular feature of high bootstrap current reversed shear plasma disruptions was computed to be a significant change of the plasma current profile, caused by resistive diffusion of the electric field induced by the crash of the internal transport barrier in a region wider than the barrier itself. The simulation results were used to discuss the fastest plasma current quench on record observed in JT-60U reversed shear plasma disruptions. (author)
A simulation study on burning profile tailoring of steady state, high bootstrap current tokamaks
International Nuclear Information System (INIS)
Nakamura, Y.; Takei, N.; Tobita, K.; Sakamoto, Y.; Fujita, T.; Fukuyama, A.; Jardin, S.C.
2007-01-01
From the aspect of fusion burn control in a steady state DEMO plant, the significant challenges are to maintain a high-power burning state of ∼3-5 GW without burning instability, hitherto well known as ''thermal stability'', and also to keep the desired burning profile relevant to the internal transport barrier (ITB) that generates high bootstrap current. The paper presents a simulation model of burning stability coupled with self-ignited fusion burn and the structure formation of the ITB. A self-consistent simulation, including a model for improved core energy confinement, has pointed out that in a high-power fusion DEMO plant there is a close, nonlinear interplay between the fusion burnup and the current source of non-inductive, ITB-generated bootstrap current. Consequently, quite distinct from usual plasma controls under simulated burning conditions at lower power (<<1 GW), the self-ignited fusion burn at a high-power burning state of ∼3-5 GW becomes so strongly self-organized that no external means except fuelling can provide effective control of the stable fusion burn. It is also demonstrated that externally applied, inductive current perturbations can be used to control both the location and strength of the ITB in a fully noninductive tokamak discharge. We find that ITB structures formed with broad noninductive current sources such as LHCD are more readily controlled than those formed by localized sources such as ECCD. The physics of the inductive current is well known. Consequently, we believe that the controllability of the ITB is generic and does not depend on the details of the transport model (as long as it can form an ITB for a sufficiently reversed magnetic shear q-profile). Through this external control of the magnetic shear profile, we can maintain the ITB strength that is otherwise prone to deteriorate when the bootstrap current increases. These distinguishing capabilities of inductive current perturbation provide steady
A MONTE-CARLO METHOD FOR ESTIMATING THE CORRELATION EXPONENT
Mikosch, T; Wang, QA
We propose a Monte Carlo method for estimating the correlation exponent of a stationary ergodic sequence. The estimator can be considered as a bootstrap version of the classical Hill estimator. A simulation study shows that the method yields reasonable estimates.
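The paper's estimator targets the correlation exponent of a stationary ergodic sequence; as a generic illustration of the idea (a bootstrap wrapped around the classical Hill estimator of a tail/scaling exponent), here is a sketch on synthetic Pareto data. The sample size, tail index, and choice of k are arbitrary assumptions for the example.

```python
import numpy as np

def hill(x, k):
    """Hill estimator of the tail index from the k largest order statistics."""
    xs = np.sort(x)[::-1]                     # descending order statistics
    return k / np.sum(np.log(xs[:k] / xs[k])) # k / sum of log-excesses

rng = np.random.default_rng(1)
alpha_true = 2.0
x = rng.pareto(alpha_true, 1000) + 1.0        # classical Pareto(alpha), support [1, inf)
k = 100

est = hill(x, k)
boot = np.array([hill(rng.choice(x, size=x.size, replace=True), k)
                 for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Hill estimate = {est:.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```

Resampling the series and re-applying the Hill estimator gives an empirical sampling distribution for the exponent, which is the sense in which the proposed estimator is "a bootstrap version of the classical Hill estimator".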
Bootstrapping in a language of thought: a formal model of numerical concept learning.
Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D
2012-05-01
In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful representational system. We provide an implemented model that is powerful enough to learn number word meanings and other related conceptual systems from naturalistic data. The model shows that bootstrapping can be made computationally and philosophically well-founded as a theory of number learning. Our approach demonstrates how learners may combine core cognitive operations to build sophisticated representations during the course of development, and how this process explains observed developmental patterns in number word learning. Copyright © 2011 Elsevier B.V. All rights reserved.
Bootstrap calculation of the dynamical quark mass in QCD4 at finite temperature
International Nuclear Information System (INIS)
Cabo, A.; Kalashnikov, O.K.; Veliev, E.Kh.
1988-01-01
Nonperturbative calculations of the dynamical quark mass m(T) are given in QCD4, based on the bootstrap solution of the Schwinger-Dyson equation for the quark Green function at finite temperatures. A closed nonlinear equation is obtained for m(T) whose solution is found under some simplifying assumptions. We used a particular approximation for the effective charge and the nonperturbative expressions of the gluon magnetic and electric masses. The singular behavior of m(T) is established and its parameters are determined numerically. The singularity found is shown to correctly reproduce the chiral phase transition and the temperature limits obtained for m(T) are qualitatively correct. The complete phase diagram of QCD4 in the (μ,T) plane is briefly discussed. (orig.)
Directory of Open Access Journals (Sweden)
Moh Ainol Yaqin
2018-05-01
The information system Integrated Advice Planning was built with the CodeIgniter framework and the Bootstrap front-end framework, which gives it a responsive interface. The system is a service offered as one solution for e-Government. The Advice Planning service optimizes public services in the licensing sector and the management of the agency. The licensing service takes the form of consultation on the design and location of buildings in accordance with the Spatial and Regional Plan of a local government. The licensing process must be completed by prospective investors, either individually or on behalf of a company, together with the supporting infrastructure around the investment location. The information system provides online submission of Advice Planning applications and is expected to make interaction between the government and the community in the region easy, friendly, comfortable, transparent, and cheap.
Oil consumption and output: What causes what? Bootstrap panel causality for 49 countries
International Nuclear Information System (INIS)
Chu, Hsiao-Ping
2012-01-01
This study examines the growth, conservation, neutrality and feedback hypotheses for 49 countries during the period from 1970 to 2010 using panel causality analysis: this technique accounts for both dependence and heterogeneity across the countries. The results provide evidence as to the direction of causality between oil consumption and output and are consistent with the neutrality hypothesis for 24 countries, the growth hypothesis for 5 countries, the conservation hypothesis for 13 countries, and the feedback hypothesis for 7 countries. The findings provide important policy implications for the 49 countries under study. - Highlights: ► Bootstrap panel causality for 49 countries. ► Examines the “growth, conservation, neutrality and feedback” hypotheses for 49 countries during the period from 1970 to 2010.
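The panel causality machinery used in the study (which accounts for cross-sectional dependence and heterogeneity across 49 countries) is involved; its core ingredient, a residual-bootstrap test of Granger non-causality for a single country, can be sketched as below. This is a one-lag, fixed-regressor simplification rather than the paper's procedure, and the simulated "oil consumption" and "output" series are purely illustrative.

```python
import numpy as np

def granger_boot_p(x, y, n_boot=500, rng=None):
    """Residual-bootstrap p-value for 'x Granger-causes y' with one lag."""
    rng = rng or np.random.default_rng(0)
    Y, ylag, xlag = y[1:], y[:-1], x[:-1]

    def tstat(Y):
        X = np.column_stack([np.ones_like(ylag), ylag, xlag])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        s2 = resid @ resid / (len(Y) - 3)
        cov = s2 * np.linalg.inv(X.T @ X)
        return abs(beta[2]) / np.sqrt(cov[2, 2])   # t-stat on the x-lag coefficient

    t_obs = tstat(Y)
    # restricted model (no x term) generates data under the non-causality null
    Xr = np.column_stack([np.ones_like(ylag), ylag])
    br, *_ = np.linalg.lstsq(Xr, Y, rcond=None)
    resid_r = Y - Xr @ br
    count = sum(
        tstat(Xr @ br + rng.choice(resid_r, resid_r.size, replace=True)) >= t_obs
        for _ in range(n_boot)
    )
    return count / n_boot

rng = np.random.default_rng(5)
T = 200
x = rng.normal(size=T)                      # "oil consumption" (synthetic)
y = np.zeros(T)
for t in range(1, T):                       # "output" driven by lagged x
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

p = granger_boot_p(x, y, rng=rng)
print(f"bootstrap p-value for x -> y causality: {p:.3f}")
```

A small p-value rejects non-causality, i.e. supports the growth hypothesis for that country; repeating this per country (with cross-sectional corrections) is the spirit of the panel test.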
Bootstrapping hypercubic and hypertetrahedral theories in three dimensions arXiv
Stergiou, Andreas
There are three generalizations of the Platonic solids that exist in all dimensions, namely the hypertetrahedron, the hypercube, and the hyperoctahedron, with the latter two being dual. Conformal field theories with the associated symmetry groups as global symmetries can be argued to exist in $d=3$ spacetime dimensions if the $\varepsilon=4-d$ expansion is valid when $\varepsilon\to1$. In this paper hypercubic and hypertetrahedral theories are studied with the non-perturbative numerical conformal bootstrap. In the $N=3$ cubic case it is found that a bound with a kink is saturated by a solution with properties that cannot be reconciled with the $\varepsilon$ expansion of the cubic theory. Possible implications for cubic magnets and structural phase transitions are discussed. For the hypertetrahedral theory evidence is found that the non-conformal window that is seen with the $\varepsilon$ expansion exists in $d=3$ as well, and a rough estimate of its extent is given.
On the optimization of a steady-state bootstrap-reactor
International Nuclear Information System (INIS)
Polevoy, A.R.; Martynov, A.A.; Medvedev, S.Yu.
1993-01-01
A commercial fusion tokamak reactor may be economically acceptable only for a low recirculating power fraction r_0 ≡ P_CD/P_α and a high bootstrap current fraction I_BS/I > 0.9 to sustain the steady-state operation mode at high plasma densities ⟨n⟩ > 1.5×10²⁰ m⁻³ fulfilling the divertor conditions. This paper presents approximate expressions for the optimal set of reactor parameters for I_BS/I ∼ 1, based on self-consistent plasma simulations with the 1.5D ASTRA code. A linear MHD stability analysis for ideal n=1 kink and ballooning modes has been carried out to determine the conditions of stabilization for bootstrap steady-state tokamak reactor (BSSTR) configurations. (author) 10 refs., 1 tab
Bootstrapping O(N) vector models with four supercharges in 3≤d≤4
Energy Technology Data Exchange (ETDEWEB)
Chester, Shai M.; Iliesiu, Luca V.; Pufu, Silviu S.; Yacoby, Ran [Joseph Henry Laboratories, Princeton University,Washington Road, Princeton, NJ 08544 (United States)
2016-05-17
We analyze the conformal bootstrap constraints in theories with four supercharges and a global O(N)×U(1) flavor symmetry in 3≤d≤4 dimensions. In particular, we consider the 4-point function of O(N)-fundamental chiral operators Z{sub i} that have no chiral primary in the O(N)-singlet sector of their OPE. We find features in our numerical bounds that nearly coincide with the theory of N+1 chiral super-fields with superpotential W=X∑{sub i=1}{sup N}Z{sub i}{sup 2}, as well as general bounds on SCFTs where ∑{sub i=1}{sup N}Z{sub i}{sup 2} vanishes in the chiral ring.
Directory of Open Access Journals (Sweden)
Rodolfo Gordillo-Orquera
2018-06-01
Efficient energy management is strongly dependent on determining the adequate power contracts among the ones offered by different electricity suppliers. This topic takes special relevance in healthcare buildings, where noticeable amounts of energy are required to generate an adequate health environment for patients and staff. In this paper, a convex optimization method is scrutinized to give a straightforward analysis of the optimal power levels to be contracted while minimizing the electricity bill cost in a time-of-use pricing scheme. In addition, a sensitivity analysis is carried out on the constraints in the optimization problems, which are analyzed in terms of both their empirical distribution and their bootstrap-estimated statistical distributions to create a simple-to-use tool for this purpose, the so-called mosaic-distribution. The evaluation of the proposed method was carried out with five-year consumption data on two different kinds of healthcare buildings: a large one, Hospital Universitario de Fuenlabrada, and a primary care center, Centro de Especialidades el Arroyo, both located in Fuenlabrada (Madrid), Spain. The analysis of the resulting optimization shows that the annual savings achieved vary moderately, ranging from −0.22% to +27.39%, depending on the analyzed year profile and the healthcare building type. The analysis introducing the mosaic-distribution to represent the sensitivity score also provides operative information to evaluate the convenience of implementing energy saving measures. All this information is useful for managers to determine the appropriate power levels for next year's contract renewal and to consider whether to implement demand response mechanisms in healthcare buildings.
International Nuclear Information System (INIS)
Gates, D.A.
2003-01-01
Long-pulse, high-beta scenarios have been established on the National Spherical Torus Experiment (NSTX). β_t (≡ 2μ₀⟨p⟩/B_t0²) ∼ 35% has been achieved during transient discharges. The machine improvements that led to these results, including error field reduction and high-temperature bakeout of plasma-facing components, are described. The highest-β_t plasmas have high triangularity (δ = 0.8) and elongation (κ = 2.0) at low aspect ratio A ≡ R/a = 1.4. The strong shaping permits large values of normalized current, I_N (≡ I_p/(aB_t0)) ≈ 6, while maintaining moderate values of q_95 = 4. Long-pulse discharges up to 1 s in duration have been achieved with substantial bootstrap current. The total noninductive current drive can be as high as 60%, comprised of 50% bootstrap current and ∼10% neutral-beam current drive. The confinement enhancement factor H_89P is in excess of 2.7. β_N · H_89P ≳ 15 has been maintained for 8 τ_E ∼ 1.6 τ_CR, where τ_CR is the relaxation time of the first radial moment of the toroidal current density. The ion temperature for these plasmas is significantly higher than that predicted by neoclassical theory
Cheng, Zhaohui; Cai, Miao; Tao, Hongbing; He, Zhifei; Lin, Xiaojun; Lin, Haifeng; Zuo, Yuling
2016-01-01
Objective: Township hospitals (THs) are important components of the three-tier rural healthcare system of China. However, the efficiency and productivity of THs have been questioned since the healthcare reform was implemented in 2009. The objective of this study is to analyse the efficiency and productivity changes in THs before and after the reform process. Setting and participants: A total of 48 sample THs were selected from the Xiaogan Prefecture in Hubei Province from 2008 to 2014. Outcome measures: First, bootstrapping data envelopment analysis (DEA) was performed to estimate the technical efficiency (TE), pure technical efficiency (PTE) and scale efficiency (SE) of the sample THs during the period. Second, the bootstrapping Malmquist productivity index was used to calculate the productivity changes over time. Results: The average TE, PTE and SE of the sample THs over the 7-year period were 0.5147, 0.6373 and 0.7080, respectively. The average TE and PTE increased from 2008 to 2012 but declined considerably after 2012. In general, the sample THs experienced a negative shift in productivity from 2008 to 2014. The negative change was 2.14%, which was attributed to a 23.89% decrease in technological changes (TC). The sample THs experienced a positive productivity shift from 2008 to 2012 but experienced deterioration from 2012 to 2014. Conclusions: There was considerable space for TE improvement in the sample THs since the average TE was relatively low. From 2008 to 2014, the sample THs experienced a decrease in productivity, and the adverse alteration in TC should be emphasised. In the context of healthcare reform, the factors that influence TE and productivity of THs are complex. Results suggest that numerous quantitative and qualitative studies are necessary to explore the reasons for the changes in TE and productivity. PMID:27836870
Bootstrapping mixed correlators in the five dimensional critical O(N) models
Energy Technology Data Exchange (ETDEWEB)
Li, Zhijin; Su, Ning [George P. and Cynthia W. Mitchell Institute for Fundamental Physics and Astronomy,Texas A& M University, College Station, TX 77843 (United States)
2017-04-18
We use the conformal bootstrap approach to explore 5D CFTs with O(N) global symmetry, which contain N scalars ϕ{sub i} transforming as O(N) vector. Specifically, we study multiple four-point correlators of the leading O(N) vector ϕ{sub i} and the O(N) singlet σ. The crossing symmetry of the four-point functions and the unitarity condition provide nontrivial constraints on the scaling dimensions (Δ{sub ϕ}, Δ{sub σ}) of ϕ{sub i} and σ. With reasonable assumptions on the gaps between scaling dimensions of ϕ{sub i} (σ) and the next O(N) vector ϕ{sub i}{sup ′} (singlet σ{sup ′}) scalar, we are able to isolate the scaling dimensions (Δ{sub ϕ}, Δ{sub σ}) in small islands. In particular, for large N=500, the isolated region is highly consistent with the result obtained from large N expansion. We also study the interacting O(N) CFTs for 1≤N≤100. Isolated regions on (Δ{sub ϕ},Δ{sub σ}) plane are obtained using conformal bootstrap program with lower order of derivatives Λ; however, they disappear after increasing Λ. For N=100, no solution can be found with Λ=25 under the assumptions on the scaling dimensions of next O(N) vector Δ{sub ϕ{sub i{sup ′}}}≥5.0 (singlet Δ{sub σ{sup ′}}≥3.3). These islands are expected to be corresponding to interacting but nonunitary O(N) CFTs. Our results suggest a lower bound on the critical value N{sub c}>100, below which the interacting O(N) CFTs turn into nonunitary.
Czech Academy of Sciences Publication Activity Database
Kyselý, Jan
2010-01-01
Vol. 101, 3-4 (2010), pp. 345-361 ISSN 0177-798X R&D Projects: GA AV ČR KJB300420801 Institutional research plan: CEZ:AV0Z30420517 Keywords: bootstrap * extreme value analysis * confidence intervals * heavy-tailed distributions * precipitation amounts Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 1.684, year: 2010
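A minimal version of the kind of bootstrap confidence interval studied in this record, a percentile interval for a high quantile of heavy-tailed "precipitation" data, can be sketched as follows. The lognormal sample and the quantile level are assumptions for illustration; the study's point is precisely that such intervals can behave poorly for heavy-tailed distributions, so this is the baseline method, not a recommendation.

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic heavy-tailed "daily precipitation amounts" (mm)
x = rng.lognormal(mean=1.0, sigma=1.2, size=300)

est = np.quantile(x, 0.95)   # point estimate of the 95th percentile
boot = np.array([np.quantile(rng.choice(x, x.size, replace=True), 0.95)
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])   # percentile bootstrap 95% CI
print(f"q95 = {est:.1f} mm, 95% CI = ({lo:.1f}, {hi:.1f}) mm")
```

For heavy-tailed data the upper tail of the bootstrap distribution is driven by a few extreme observations, which is why coverage of such intervals deserves the dedicated study referenced above.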
International Nuclear Information System (INIS)
Li, Ke; Lin, Boqiang
2017-01-01
This paper proposes a total-factor energy consumption performance index (TEPI) for measuring China's energy efficiency across 30 provinces during the period 1997 to 2012. The TEPI is derived by solving an improved non-radial data envelopment analysis (DEA) model, which is based on an energy distance function. The production possibility set is constructed by combining the super-efficiency and sequential DEA models to avoid “discriminating power problem” and “technical regress”. In order to explore the impacts of technological progress on TEPI and perform statistical inferences on the results, a two-stage double bootstrap approach is adopted. The important findings are that China's energy technology innovation produces a negative effect on TEPI, while technology import and imitative innovation produce positive effects on TEPI. Thus, the main contribution of TEPI improvement is technology import. These conclusions imply that technology import especially foreign direct investment (FDI) is important for imitative innovation and can improve China's energy efficiency. In the long run, as the technical level of China approaches to the frontier, energy technology innovation and its wide adoption become a sustained way to improve energy efficiency. Therefore, it is urgent for China to introduce measures such as technology translation and spillover policies as well as energy pricing reforms to support energy technology innovation. - Highlights: • A total-factor energy consumption performance index (TEPI) is introduced. • Three types of technological progress have various effects on TEPI. • FDI is the main contributor of TEPI improvement. • An improved DEA calculation method is introduced. • A two-stage double-bootstrap non-radial DEA model is used.
International Nuclear Information System (INIS)
Duan, Na; Guo, Jun-Peng; Xie, Bai-Chen
2016-01-01
Highlights: • Evaluate the energy and CO_2 emission performance of China’s thermal power industry. • Perform statistical inferences for the estimates of efficiency and productivity indexes. • There exist differences between the energy and CO_2 emission performance. • Technological progress is the main driving force for productivity improvement. - Abstract: A scientific evaluation of the energy efficiency and CO_2 emission performance of the thermal power industry could not only provide valuable information for reducing energy consumption and carbon emissions but also serve as a tool to estimate the effectiveness of relevant policy reforms. Considering the opposite effects of energy conservation and carbon emission reduction on generation cost, this study respectively measures the energy and CO_2 emission performance of the thermal power industries in China’s 30 provincial administrative regions during the period 2005–2012 from both static and dynamic perspectives. We implement the bootstrap method for the directional distance function to correct the possible estimate bias and test the significance of productivity changes where the weak disposability of undesirable outputs is also integrated. The empirical analysis leads to the following conclusions. The bootstrapping results could provide us with much valuable information because the initial estimates might result from sampling noise rather than reveal the real variations. In addition, some differences do exist between the energy and CO_2 emission performance of China’s thermal power industry. Furthermore, technological progress is the main driving force for energy and CO_2 emission productivity improvement and it works better for the former.
Al-Mudhafar, W. J.
2013-12-01
Precise prediction of rock facies leads to adequate reservoir characterization by improving the porosity-permeability relationships to estimate the properties in non-cored intervals. It also helps to accurately identify the spatial facies distribution to build an accurate reservoir model for optimal future reservoir performance. In this paper, facies estimation has been done through multinomial logistic regression (MLR) with respect to the well logs and core data in a well in the upper sandstone formation of the South Rumaila oil field. The independent variables are gamma ray, formation density, water saturation, shale volume, log porosity, core porosity, and core permeability. Firstly, a robust sequential imputation algorithm has been used to impute the missing data. This algorithm starts from a complete subset of the dataset and sequentially estimates the missing values in an incomplete observation by minimizing the determinant of the covariance of the augmented data matrix. Then, the observation is added to the complete data matrix and the algorithm continues with the next observation with missing values. The MLR has been chosen to maximize the likelihood and minimize the standard error for the nonlinear relationships between facies and the core and log data. The MLR is used to predict the probabilities of the different possible facies given each independent variable by constructing a linear predictor function having a set of weights that are linearly combined with the independent variables using a dot product. A beta distribution of facies has been considered as prior knowledge, and the resulting predicted probability (posterior) has been estimated from the MLR based on Bayes' theorem, which relates the predicted probability (posterior) to the conditional probability and the prior knowledge. To assess the statistical accuracy of the model, the bootstrap should be carried out to estimate extra-sample prediction error by randomly
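The bootstrap estimate of extra-sample prediction error mentioned at the end of this abstract (fit on a bootstrap resample, evaluate on the observations left out of it, and average over resamples) can be sketched as follows. For self-containment the classifier is a simple nearest-centroid stand-in rather than the paper's multinomial logistic regression, and the two-class synthetic "facies" data are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
y = rng.integers(0, 2, n)                      # two synthetic "facies" classes
X = rng.normal(y[:, None] * 1.5, 1.0, (n, 2))  # class-shifted log features

def fit(Xtr, ytr):
    # nearest-centroid classifier (stand-in for multinomial logistic regression)
    return np.array([Xtr[ytr == c].mean(axis=0) for c in (0, 1)])

def predict(cent, Xte):
    d = ((Xte[:, None, :] - cent[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

errs = []
for _ in range(200):
    idx = rng.integers(0, n, n)              # bootstrap resample
    oob = np.setdiff1d(np.arange(n), idx)    # out-of-bag observations
    if oob.size == 0 or len(np.unique(y[idx])) < 2:
        continue
    cent = fit(X[idx], y[idx])
    errs.append((predict(cent, X[oob]) != y[oob]).mean())

err_boot = float(np.mean(errs))
print(f"leave-one-out bootstrap misclassification error ~ {err_boot:.3f}")
```

Because each error is measured on observations the model never saw, this estimates extra-sample (out-of-sample) error rather than the optimistic resubstitution error.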
Alden, Caroline B.; Ghosh, Subhomoy; Coburn, Sean; Sweeney, Colm; Karion, Anna; Wright, Robert; Coddington, Ian; Rieker, Gregory B.; Prasad, Kuldeep
2018-03-01
Advances in natural gas extraction technology have led to increased activity in the production and transport sectors in the United States and, as a consequence, an increased need for reliable monitoring of methane leaks to the atmosphere. We present a statistical methodology in combination with an observing system for the detection and attribution of fugitive emissions of methane from distributed potential source location landscapes such as natural gas production sites. We measure long (> 500 m), integrated open-path concentrations of atmospheric methane using a dual frequency comb spectrometer and combine measurements with an atmospheric transport model to infer leak locations and strengths using a novel statistical method, the non-zero minimum bootstrap (NZMB). The new statistical method allows us to determine whether the empirical distribution of possible source strengths for a given location excludes zero. Using this information, we identify leaking source locations (i.e., natural gas wells) through rejection of the null hypothesis that the source is not leaking. The method is tested with a series of synthetic data inversions with varying measurement density and varying levels of model-data mismatch. It is also tested with field observations of (1) a non-leaking source location and (2) a source location where a controlled emission of 3.1 × 10-5 kg s-1 of methane gas is released over a period of several hours. This series of synthetic data tests and outdoor field observations using a controlled methane release demonstrates the viability of the approach for the detection and sizing of very small leaks of methane across large distances (4+ km2 in synthetic tests). The field tests demonstrate the ability to attribute small atmospheric enhancements of 17 ppb to the emitting source location against a background of combined atmospheric (e.g., background methane variability) and measurement uncertainty of 5 ppb (1σ), when measurements are averaged over 2 min. The
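A toy version of the non-zero minimum bootstrap idea, flagging a source as leaking only when the bootstrap distribution of its estimated strength excludes zero, might look like the following. The retrieval noise level, sample sizes, and percentile threshold are assumptions for illustration; the actual method operates on full atmospheric-transport inversions rather than direct strength estimates.

```python
import numpy as np

def excludes_zero(strengths, n_boot=2000, alpha=0.01, rng=None):
    """Zero-exclusion test: True if the lower alpha-percentile of the
    bootstrap distribution of the mean estimated strength is above zero."""
    rng = rng or np.random.default_rng(0)
    boot = np.array([rng.choice(strengths, strengths.size, replace=True).mean()
                     for _ in range(n_boot)])
    return bool(np.percentile(boot, 100 * alpha) > 0)

rng = np.random.default_rng(4)
noise = 5e-6                              # hypothetical 1-sigma retrieval noise (kg/s)
quiet = rng.normal(0.0, noise, 50)        # noise-only "non-leaking well"
leaky = rng.normal(3.1e-5, noise, 50)     # release matching the 3.1e-5 kg/s field test

# the quiet well's bootstrap distribution typically straddles zero,
# while the leaky well's lies well above it
print(excludes_zero(quiet, rng=rng), excludes_zero(leaky, rng=rng))
```

Rejecting the null "the source is not leaking" only when zero is excluded is what gives the method its control over false leak attributions.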
Economic policy uncertainty and housing returns in Germany: Evidence from a bootstrap rolling window
Directory of Open Access Journals (Sweden)
David Su
2016-06-01
Full Text Available The purpose of this investigation is to research the causal link between economic policy uncertainty (EPU) and housing returns (HR) in Germany. In the estimated vector autoregressive models, we test their stability and find that the short-run relationship between HR and EPU is unstable. As a result, a time-varying approach (a bootstrap rolling window causality test) is utilized to revisit the dynamic causal link, and we find that EPU has no impact on HR due to the stability of the real estate market in Germany. HR does not have significant effects on EPU in most time periods. However, significant feedback effects (both positive and negative) are found from HR to EPU in several sub-periods, which indicates that the causal link from HR to EPU varies over time. The empirical results do not support the general equilibrium model of government policy choices, indicating that EPU does not play a role in the real estate market. The basic conclusion is that the real estate market shows its stability due to the social welfare nature and the rational institutional arrangement of real estate in Germany, and the real estate market also shows its importance in that it has a significant effect on economic policy choices in some periods when negative external shocks occur.
Estimating the efficiency from Brazilian banks: a bootstrapped Data Envelopment Analysis (DEA)
Directory of Open Access Journals (Sweden)
Ana Elisa Périco
2016-01-01
Full Text Available Abstract The Brazilian banking sector went through several changes in its structure over the past few years. Such changes are related to mergers and acquisitions, as well as a greater market opening to foreign banks. The objective of this paper is to analyze, by applying bootstrap DEA, the efficiency of banks in Brazil in 2010-2013. The methodology was applied to the 30 largest banking organizations under a financial intermediation approach. In that model, the resources entering a bank in the form of deposits and total assets are classified as inputs, and besides these, manual labor is also considered a resource capable of generating results. For the output variable, credit operations represent the most appropriate alternative, considering the role of the bank as a financial intermediary. In this work, the question of the best classification among retail banks and banks specialized in credit has little relevance, because the segments were analyzed separately. The results presented here point to an average level of efficiency for the large Brazilian banks in the period. This scenario requires efforts to reduce expenses but also to increase revenues.
Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid
2015-12-01
This study investigates the relationship between energy consumption and carbon dioxide emissions in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emissions in both bivariate and multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences, which are insensitive to the time span as well as the lag length used. The empirical results show that there is unidirectional causality running from energy consumption to carbon emissions in both the bivariate model and the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.
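The Meboot idea can be sketched compactly: draw a replicate from a piecewise-linear quantile function built on the order statistics, then shuffle it back into the original time order so it retains the shape, and hence the dependence structure, of the series. The endpoint padding and the toy series below are illustrative simplifications, not the exact Meboot algorithm:

```python
import numpy as np

def meboot_replicate(x, rng):
    """One maximum-entropy bootstrap replicate (simplified sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    order = np.argsort(x)          # ranks of the original observations
    xs = x[order]                  # order statistics
    # interval limits: midpoints between order statistics, padded ends
    # (the padding constant d is an illustrative choice)
    d = np.mean(np.abs(np.diff(xs)))
    z = np.concatenate(([xs[0] - d], (xs[:-1] + xs[1:]) / 2, [xs[-1] + d]))
    # sorted uniforms mapped through the piecewise-linear quantile function
    u = np.sort(rng.uniform(size=n))
    new_sorted = np.interp(u, np.linspace(0, 1, n + 1), z)
    # restore the original time ordering (rank-preserving shuffle)
    replicate = np.empty(n)
    replicate[order] = new_sorted
    return replicate

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=50))     # a non-stationary toy series
boot = meboot_replicate(series, rng)
# the replicate has the same ranks, hence the same broad shape
assert (np.argsort(boot) == np.argsort(series)).all()
```

Because each replicate preserves the ranks of the observed series, statistics recomputed on replicates reflect the data's dependence structure rather than an i.i.d. shuffle, which is what lets the method sidestep unit-root and cointegration pretests.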
Bootstrap-DEA analysis of BRICS’ energy efficiency based on small sample data
International Nuclear Information System (INIS)
Song, Ma-Lin; Zhang, Lin-Ling; Liu, Wei; Fisher, Ron
2013-01-01
Highlights: ► The BRICS’ economies have flourished with increasing energy consumption. ► Analyses and comparisons of energy efficiency are conducted among the BRICS. ► As a whole, BRICS energy efficiency is low but shows a growing trend. ► The BRICS should adopt relevant energy policies based on their own conditions. - Abstract: As representatives of many emerging economies, the BRICS’ economies have developed greatly in recent years. Meanwhile, the proportion of world energy consumption attributable to the BRICS has increased. It is therefore important to analyze and compare energy efficiency among them. This paper first utilizes a Super-SBM model to measure and calculate the energy efficiency of the BRICS, then analyzes their present status and development trend. Further, the bootstrap is applied to modify the DEA values derived from small sample data, and finally the relationship between energy efficiency and carbon emissions is measured. Results show that the energy efficiency of the BRICS as a whole is low but increasing quickly. Also, the relationship between energy efficiency and carbon emissions varies from country to country because of their different energy structures. The governments of the BRICS should make relevant energy policies according to their own conditions.
A Bootstrapping Model of Frequency and Context Effects in Word Learning.
Kachergis, George; Yu, Chen; Shiffrin, Richard M
2017-04-01
Prior research has shown that people can learn many nouns (i.e., word-object mappings) from a short series of ambiguous situations containing multiple words and objects. For successful cross-situational learning, people must approximately track which words and referents co-occur most frequently. This study investigates the effects of allowing some word-referent pairs to appear more frequently than others, as is true in real-world learning environments. Surprisingly, high-frequency pairs are not always learned better, but can also boost learning of other pairs. Using a recent associative model (Kachergis, Yu, & Shiffrin, 2012), we explain how mixing pairs of different frequencies can bootstrap late learning of the low-frequency pairs based on early learning of higher frequency pairs. We also manipulate contextual diversity, the number of pairs a given pair appears with across training, since it is naturalistically confounded with frequency. The associative model has competing familiarity and uncertainty biases, and their interaction is able to capture the individual and combined effects of frequency and contextual diversity on human learning. Two other recent word-learning models do not account for the behavioral findings. Copyright © 2016 Cognitive Science Society, Inc.
Spherical bootstrap calculation of qqq-baryon and multiquark-hadron masses
International Nuclear Information System (INIS)
Balazs, L.A.P.; Nicolescu, B.
1979-12-01
Recently a way of implementing the dual-topological unitarization program has been found, in which baryons and other multiquark hadrons are put on the sphere and appear at the same topological-complexity level as ordinary anti-qq mesons. This permits one to have a lowest-order 'spherical bootstrap', within which unitarity, duality and crossing can be consistently satisfied. In the present paper, this framework is used to calculate hadron masses by imposing duality on an infinite sum of ladder graphs generated from spherical unitarity. By making a certain simple dynamical approximation, an explicit generic Regge-trajectory formula is derived for any given process. If one then makes certain reasonable dynamical assumptions and requires simultaneous consistency for entire sets of processes, it is possible to calculate the masses of all the lowest states and the Regge trajectories associated with each of them. The only arbitrary parameter is the mass of the rho, which merely serves to set the mass scale.
Using i2b2 to Bootstrap Rural Health Analytics and Learning Networks.
Harris, Daniel R; Baus, Adam D; Harper, Tamela J; Jarrett, Traci D; Pollard, Cecil R; Talbert, Jeffery C
2016-08-01
We demonstrate that the open-source i2b2 (Informatics for Integrating Biology and the Bedside) data model can be used to bootstrap rural health analytics and learning networks. These networks promote communication and research initiatives by providing the infrastructure necessary for sharing data and insights across a group of healthcare and research partners. Data integration remains a crucial challenge in connecting rural healthcare sites with a common data sharing and learning network due to the lack of interoperability and standards within electronic health records. The i2b2 data model acts as a point of convergence for disparate data from multiple healthcare sites. A consistent and natural data model for healthcare data is essential for overcoming integration issues, but challenges such as those caused by weak data standardization must still be addressed. We describe our experience in the context of building the West Virginia/Kentucky Health Analytics and Learning Network, a collaborative, multi-state effort connecting rural healthcare sites.
Mass transport and the bootstrap current from Ohm's law in steady-state tokamaks
International Nuclear Information System (INIS)
Kim, J.-S.; Greene, J.M.
1989-01-01
The consequences of mass conservation and Ohm's law are examined for steady-state Tokamaks. In a Tokamak, magnetofluid-dynamic waves rapidly equilibrate pressure and toroidal field along magnetic surfaces. As a result, the detailed current distribution is determined by the flux-surface-averaged poloidal and toroidal currents. The electrons that carry the plasma current are impeded in their motion by interactions with ions, which is resistivity and its generalizations, and by interactions with electrons, which is viscosity and its generalizations. The important viscous terms arise from the interaction between trapped and untrapped electrons, and so viscosity acts by impeding poloidal current. When the viscous coefficient is properly chosen, the results of neoclassical theory are recovered. The neoclassical viscous coefficient is here regarded as less likely than Spitzer conductivity to be experimentally relevant in a turbulent Tokamak. Thus, the toroidal Ohm's law is regarded as being more reliable than the poloidal Ohm's law. A combination of toroidal and poloidal Ohm's law, namely the component parallel to the magnetic field, eliminates the influence of plasma fueling, and directly relates the bootstrap current and the pressure gradient. The latter is the usual relation, but, since i
Brandic, Ivona; Music, Dejan; Dustdar, Schahram
Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In the case of Cloud Computing, users pay for the usage of computing power provided as a service. Beforehand they can negotiate specific functional and non-functional requirements relevant for the application execution. However, providing computing power as a service poses various research challenges. On the one hand, dynamic, versatile, and adaptable services are required, which can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services considering negotiation bootstrapping and service mediation achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware and representing important prerequisites for the establishment of autonomic Cloud services.
Energy Technology Data Exchange (ETDEWEB)
Secchi, Piercesare [MOX, Department of Mathematics, Polytechnic of Milan (Italy); Zio, Enrico [Department of Energy, Polytechnic of Milan, Via Ponzio 34/3, 20133 Milano (Italy)], E-mail: enrico.zio@polimi.it; Di Maio, Francesco [Department of Energy, Polytechnic of Milan, Via Ponzio 34/3, 20133 Milano (Italy)
2008-12-15
For licensing purposes, safety cases of Nuclear Power Plants (NPPs) must be presented at the Regulatory Authority with the necessary confidence on the models used to describe the plant safety behavior. In principle, this requires the repetition of a large number of model runs to account for the uncertainties inherent in the model description of the true plant behavior. The present paper propounds the use of bootstrapped Artificial Neural Networks (ANNs) for performing the numerous model output calculations needed for estimating safety margins with appropriate confidence intervals. Account is given both to the uncertainties inherent in the plant model and to those introduced by the ANN regression models used for performing the repeated safety parameter evaluations. The proposed framework of analysis is first illustrated with reference to a simple analytical model and then to the estimation of the safety margin on the maximum fuel cladding temperature reached during a complete group distribution header blockage scenario in a RBMK-1500 nuclear reactor. The results are compared with those obtained by a traditional parametric approach.
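The core of the bootstrapped-regression idea is easy to sketch: refit the surrogate model on resampled training sets and read a confidence interval off the spread of the ensemble's predictions. In this minimal sketch a cubic polynomial stands in for the ANN regressor, and the "plant" response data are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "plant" response: safety parameter vs. one input, with noise
x = np.linspace(0.0, 1.0, 60)
y = 400.0 + 150.0 * x**2 + rng.normal(scale=10.0, size=x.size)

# bootstrap ensemble: refit the surrogate on resampled training sets
# (a cubic polynomial stands in for the ANN regressor of the paper)
B = 500
x_new = np.array([0.25, 0.5, 0.75])
preds = np.empty((B, x_new.size))
for b in range(B):
    i = rng.integers(0, x.size, size=x.size)      # resample with replacement
    preds[b] = np.polyval(np.polyfit(x[i], y[i], deg=3), x_new)

point = preds.mean(axis=0)
lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)  # 95% bootstrap band
for xi, p, l, h in zip(x_new, point, lo, hi):
    print(f"x = {xi:.2f}: prediction {p:6.1f}, CI [{l:6.1f}, {h:6.1f}]")
```

The band reflects both data noise and refitting variability, which is the sense in which the ensemble accounts for the uncertainty introduced by the regression model itself.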
The lightcone bootstrap and the spectrum of the 3d Ising CFT
Energy Technology Data Exchange (ETDEWEB)
Simmons-Duffin, David [School of Natural Sciences, Institute for Advanced Study, Princeton, New Jersey 08540 (United States); Walter Burke Institute for Theoretical Physics, Caltech, Pasadena, California 91125 (United States)
2017-03-15
We compute numerically the dimensions and OPE coefficients of several operators in the 3d Ising CFT, and then try to reverse-engineer the solution to crossing symmetry analytically. Our key tool is a set of new techniques for computing infinite sums of SL(2,ℝ) conformal blocks. Using these techniques, we solve the lightcone bootstrap to all orders in an asymptotic expansion in large spin, and suggest a strategy for going beyond the large spin limit. We carry out the first steps of this strategy for the 3d Ising CFT, deriving analytic approximations for the dimensions and OPE coefficients of several infinite families of operators in terms of the initial data {Δ_σ,Δ_ϵ,f_σ_σ_ϵ,f_ϵ_ϵ_ϵ,c_T}. The analytic results agree with numerics to high precision for about 100 low-twist operators (correctly accounting for O(1) mixing effects between large-spin families). Plugging these results back into the crossing equations, we obtain approximate analytic constraints on the initial data.
Directory of Open Access Journals (Sweden)
Milenka Ocampo
2009-01-01
Full Text Available This paper shows, on the basis of empirical evidence, that the differences between public and private education have accentuated the existing income inequality in Bolivia over time. First, a descriptive analysis of the differences in quality, infrastructure, and coverage between public and private education is carried out. Then, using microeconometric techniques and bootstrap hypothesis tests, it is verified that the differences between the two types of education are significant in explaining part of the income inequality generated between 1999 and 2006 in Bolivia. Additionally, the Bootstrap Moon method is applied to ensure the robustness of the results.
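A bootstrap hypothesis test of the kind mentioned above can be sketched as a two-sample comparison under a pooled null: resample both groups from the pooled data so that equality of means holds by construction. The earnings figures below are toy values, not the Bolivian survey data:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical log-earnings for two schooling groups (toy data only)
public = rng.normal(loc=7.8, scale=0.6, size=120)
private = rng.normal(loc=8.1, scale=0.7, size=80)
observed = private.mean() - public.mean()

# bootstrap test of H0: equal means -- resample both groups from the
# pooled sample, so the null hypothesis holds in the resampling world
pooled = np.concatenate([public, private])
B = 5000
diffs = np.empty(B)
for b in range(B):
    a = rng.choice(pooled, size=public.size)
    c = rng.choice(pooled, size=private.size)
    diffs[b] = c.mean() - a.mean()

p_value = np.mean(np.abs(diffs) >= abs(observed))   # two-sided p-value
print(f"difference {observed:.3f}, bootstrap p = {p_value:.4f}")
```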
DEFF Research Database (Denmark)
Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.
2012-01-01
A new way of parametric bootstrapping allows similar construction of confidence ellipses applicable to all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach will be applied to Multiple Factor Analysis r...... in different studies performed on the same set of products. In addition, the graphical display of confidence ellipses eases interpretation and communication of results....
Directory of Open Access Journals (Sweden)
Enrico Zio
2008-01-01
Full Text Available In the present work, the uncertainties affecting the safety margins estimated from thermal-hydraulic code calculations are captured quantitatively by resorting to the order statistics and the bootstrap technique. The proposed framework of analysis is applied to the estimation of the safety margin, with its confidence interval, of the maximum fuel cladding temperature reached during a complete group distribution blockage scenario in a RBMK-1500 nuclear reactor.
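The combination of order statistics and the bootstrap can be illustrated as follows: Wilks' order-statistic argument gives a distribution-free 95%/95% bound on a safety parameter from 59 code runs, and a bootstrap adds a confidence interval around the sample quantile itself. The temperatures below are toy numbers, not RBMK-1500 results:

```python
import numpy as np

rng = np.random.default_rng(0)

# 59 hypothetical code runs of a safety parameter (toy temperatures, K)
runs = rng.normal(loc=900.0, scale=25.0, size=59)

# Wilks order-statistic bound: with n = 59 runs the sample maximum is a
# one-sided 95%/95% upper bound on the 95th percentile, because the
# chance that all 59 runs fall below that percentile is 0.95**59 < 0.05
wilks_bound = runs.max()
assert 0.95 ** 59 < 0.05

# bootstrap confidence interval for the sample 95th percentile itself
B = 2000
q = np.array([np.percentile(rng.choice(runs, size=runs.size), 95)
              for _ in range(B)])
lo, hi = np.percentile(q, [2.5, 97.5])
print(f"Wilks 95/95 bound {wilks_bound:.1f} K;"
      f" bootstrap 95% CI for the 95th pct [{lo:.1f}, {hi:.1f}] K")
```

The order-statistic bound needs no distributional assumption at all; the bootstrap interval then quantifies how variable the quantile estimate is across resamples of the same 59 runs.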
Fisher, Anthony C; McCulloch, Daphne L; Borchert, Mark S; Garcia-Filion, Pamela; Fink, Cassandra; Eleuteri, Antonio; Simpson, David M
2015-08-01
Pattern electroretinograms (PERGs) have inherently low signal-to-noise ratios and can be difficult to detect when degraded by pathology or noise. We compare an objective system for automated PERG analysis with expert human interpretation in children with optic nerve hypoplasia (ONH) with PERGs ranging from clear to undetectable. PERGs were recorded uniocularly with chloral hydrate sedation in children with ONH (aged 3.5-35 months). Stimuli were reversing checks of four sizes focused using an optical system incorporating the cycloplegic refraction. Forty PERG records were analysed; 20 selected at random and 20 from eyes with good vision (fellow eyes or eyes with mild ONH) from over 300 records. Two experts identified P50 and N95 of the PERGs after manually deleting trials with movement artefact, slow-wave EEG (4-8 Hz) or other noise from raw data for 150 check reversals. The automated system first identified present/not-present responses using a magnitude-squared coherence criterion and then, for responses confirmed as present, estimated the P50 and N95 cardinal positions as the turning points in local third-order polynomials fitted in the -3 dB bandwidth [0.25 … 45] Hz. Confidence limits were estimated from bootstrap re-sampling with replacement. The automated system uses an interactive Internet-available webpage tool (see http://clinengnhs.liv.ac.uk/esp_perg_1.htm). The automated system detected 28 PERG signals above the noise level (p ≤ 0.05 for H0). Good subjective quality ratings were indicative of significant PERGs; however, poor subjective quality did not necessarily predict non-significant signals. P50 and N95 implicit times showed good agreement between the two experts and between experts and the automated system. For the N95 amplitude measured to P50, the experts differed by an average of 13% consistent with differing interpretations of peaks within noise, while the automated amplitude measure was highly correlated with the expert measures but was
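The bootstrap-with-replacement step for the cardinal points can be sketched as follows: resample the individual sweeps, re-average, refit a local third-order polynomial, and take its turning point. The waveform, window width, and noise level below are illustrative assumptions, not the published system:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy PERG-like sweeps: a single peak near t = 50 ms plus noise
t = np.linspace(0.0, 100.0, 101)                    # time base in ms
template = np.exp(-0.5 * ((t - 50.0) / 8.0) ** 2)   # hypothetical waveform
trials = template + rng.normal(scale=0.5, size=(150, t.size))

def peak_latency(avg):
    """Peak as the turning point of a local third-order polynomial fit."""
    i = np.argmax(avg)
    sl = slice(max(i - 10, 0), min(i + 11, t.size))
    c = np.polyfit(t[sl], avg[sl], deg=3)
    r = np.roots(np.polyder(c))                     # roots of the derivative
    r = r[np.abs(r.imag) < 1e-8].real
    if r.size == 0:                                 # degenerate fit: fall back
        return t[i]
    return r[np.argmin(np.abs(r - t[i]))]           # root nearest the maximum

# resample sweeps with replacement, re-average, re-estimate the peak
B = 400
lat = np.array([peak_latency(trials[rng.integers(0, 150, 150)].mean(axis=0))
                for _ in range(B)])
lo, hi = np.percentile(lat, [2.5, 97.5])
print(f"peak latency {lat.mean():.1f} ms, 95% bootstrap CI [{lo:.1f}, {hi:.1f}]")
```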
Calia, Clara; Darling, Stephen; Havelka, Jelena; Allen, Richard J
2018-05-01
Immediate serial recall of digits is better when the digits are shown by highlighting them in a familiar array, such as a phone keypad, compared with presenting them serially in a single location, a pattern referred to as "visuospatial bootstrapping." This pattern implies the establishment of temporary links between verbal and spatial working memory, alongside access to information in long-term memory. However, the role of working memory control processes like those implied by the "Central Executive" in bootstrapping has not been directly investigated. Here, we report a study addressing this issue, focusing on executive processes of attentional shifting. Tasks in which information has to be sequenced are thought to be heavily dependent on shifting. Memory for digits presented in keypads versus single locations was assessed under two secondary task load conditions, one with and one without a sequencing requirement, and hence differing in the degree to which they invoke shifting. Results provided clear evidence that multimodal binding (visuospatial bootstrapping) can operate independently of this form of executive control process.
Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.
2017-02-01
Condition monitoring in permanent magnet synchronous machines has gained interest due to their increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance. Usually, in such applications the access to the generator is complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows using vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, it is necessary to use special sampling techniques to apply spectral analysis of mechanical vibrations. In this work, a resampling technique based on order tracking without measuring the rotor position is proposed. To synchronize sampling with rotor position, an estimation of the rotor position obtained from the angle of the voltage vector is proposed. This angle is obtained from a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experimental results obtained from a permanent magnet synchronous generator. Results with single point defects in the outer race of a bearing under variable speed and load conditions are presented.
Jones, Adam G
2015-11-01
Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (i.e. I(s) - the variance in relative mating success), the opportunity for selection (i.e. I - the variance in relative reproductive success) and the Bateman gradient (i.e. β(ss) - the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach for the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate. BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
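The bootstrap side of the approach described above, confidence intervals on the Bateman gradient obtained by resampling individuals, can be sketched in a few lines. The mating and reproductive success data are simulated for illustration, and the simple least-squares slope here does not reproduce BATEMANATER's maximum-likelihood correction for incomplete sampling:

```python
import numpy as np

rng = np.random.default_rng(0)

# simulated mating success (ms) and reproductive success (rs) for 80 males
ms = rng.poisson(lam=2.0, size=80)
rs = 10 * ms + rng.poisson(lam=5.0, size=80)       # hypothetical relation

def bateman_gradient(m, r):
    """Least-squares slope of relative rs on relative ms."""
    return np.polyfit(m / m.mean(), r / r.mean(), deg=1)[0]

obs = bateman_gradient(ms, rs)

# nonparametric bootstrap over individuals
B = 2000
idx = rng.integers(0, ms.size, size=(B, ms.size))
grads = np.array([bateman_gradient(ms[i], rs[i]) for i in idx])
lo, hi = np.percentile(grads, [2.5, 97.5])
print(f"Bateman gradient {obs:.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```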
1997-09-05
that cross the path; no ray need ever have followed the exact path previously. ... resampling techniques, such as Monte-Carlo iterations or bootstrapping. IV. Disclaimer: A historical U.S. explosion has been used in this study solely ... diagnostic cluster population characteristics. The method can be applied to obtain "bootstrap" ground-truth explosion waveforms for testing
Lépine, Aurélia; Vassall, Anna; Chandrashekar, Sudhashree
2015-01-01
In 2004, the largest HIV prevention project (Avahan) conducted globally was implemented in India. Avahan was implemented by NGOs supported by state lead partners in order to provide HIV prevention services to high-risk population groups. In 2007, most of the NGOs reached full coverage. Using a panel data set of the NGOs that implemented Avahan, we investigate the level of technical efficiency as well as the drivers of technical inefficiency by using the double bootstrap procedure developed by Simar & Wilson (2007). Unlike the traditional two-stage method, this method allows valid inference in the presence of measurement error and serial correlation. We find that over the 4 years, Avahan NGOs could have reduced the level of inputs by 43% given the level of outputs reached. We find that efficiency of the project has increased over time. Results indicate that the main drivers of inefficiency come from the characteristics of the state lead partner, the NGOs and the catchment area. These organisational factors are important to explicitly consider and assess when designing and implementing HIV prevention programmes and in setting benchmarks in order to optimise the use and allocation of resources. C14, I1.
International Nuclear Information System (INIS)
Wesseh, Presley K.; Zoumara, Babette
2012-01-01
This contribution investigates causal interdependence between energy consumption and economic growth in Liberia and proposes application of a bootstrap methodology. To better reflect causality, employment is incorporated as an additional variable. The study demonstrates evidence of distinct bidirectional Granger causality between energy consumption and economic growth. Additionally, the results show that employment in Liberia Granger-causes economic growth, irrespective of the short run or long run. Evidence from a Monte Carlo experiment reveals that the asymptotic Granger causality test suffers from a size distortion problem for Liberian data, suggesting that the bootstrap technique employed in this study is more appropriate. Given the empirical results, implications are that energy expansion policies such as energy subsidies or low energy tariffs would be necessary to cope with demand exerted as a result of economic growth in Liberia. Furthermore, Liberia might have the performance of its employment generation on the economy partly determined by adequate energy. Therefore, it seems fully justified that a quick shift towards energy production based on clean energy sources may significantly slow down economic growth in Liberia. Hence, the government’s target to implement a long-term strategy to make Liberia a carbon neutral country, and eventually less carbon dependent by 2050, is understandable. - Highlights: ► Causality between energy consumption and economic growth in Liberia investigated. ► There is bidirectional causality between energy consumption and economic growth. ► Energy expansion policies are necessary to cope with demand from economic growth. ► The asymptotic Granger causality test suffers from a size distortion problem for Liberian data. ► The bootstrap methodology employed in our study is more appropriate.
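A residual-based bootstrap Granger causality test of the kind preferred here over its asymptotic counterpart can be sketched as follows (one lag, simulated data; the actual study uses Liberian macro series and a richer specification):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy series in which x Granger-causes y through one lag
n = 200
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def gc_stat(y, x):
    """F-type statistic for 'lagged x helps predict y' (one lag)."""
    Y = y[1:]
    Xr = np.column_stack([np.ones(len(Y)), y[:-1]])          # restricted
    Xu = np.column_stack([Xr, x[:-1]])                       # unrestricted
    rss_r = np.sum((Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - Xu @ np.linalg.lstsq(Xu, Y, rcond=None)[0]) ** 2)
    return (rss_r - rss_u) / (rss_u / (len(Y) - Xu.shape[1]))

obs = gc_stat(y, x)

# bootstrap null: regenerate y from the restricted (no-x) model with
# resampled residuals, so x has no predictive content by construction
beta = np.linalg.lstsq(np.column_stack([np.ones(n - 1), y[:-1]]), y[1:],
                       rcond=None)[0]
resid = y[1:] - (beta[0] + beta[1] * y[:-1])
B = 500
stats = np.empty(B)
for b in range(B):
    yb = np.zeros(n)
    e = rng.choice(resid, size=n)
    for t in range(1, n):
        yb[t] = beta[0] + beta[1] * yb[t - 1] + e[t]
    stats[b] = gc_stat(yb, x)

p = np.mean(stats >= obs)
print(f"stat = {obs:.2f}, bootstrap p = {p:.3f}")
```

The bootstrap p-value compares the observed statistic against a null distribution generated from the data themselves, which is what protects the test against the small-sample size distortion of the asymptotic version.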
Energy Technology Data Exchange (ETDEWEB)
Hager, Robert, E-mail: rhager@pppl.gov; Chang, C. S., E-mail: cschang@pppl.gov [Princeton Plasma Physics Laboratory, P.O. Box 451, Princeton, New Jersey 08543 (United States)
2016-04-15
As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator—that conserves mass, momentum, and energy—is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. A new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.
International Nuclear Information System (INIS)
Narayan, Paresh Kumar; Prasad, Arti
2008-01-01
The goal of this paper is to examine any causal effects between electricity consumption and real GDP for 30 OECD countries. We use a bootstrapped causality testing approach and unravel evidence in favour of electricity consumption causing real GDP in Australia, Iceland, Italy, the Slovak Republic, the Czech Republic, Korea, Portugal, and the UK. The implication is that electricity conservation policies will negatively impact real GDP in these countries. However, for the rest of the 22 countries our findings suggest that electricity conservation policies will not affect real GDP.
Directory of Open Access Journals (Sweden)
Ettore Marubini
2014-01-01
Full Text Available This paper presents a robust two-stage procedure for identification of outlying observations in regression analysis. The exploratory stage identifies leverage points and vertical outliers through a robust distance estimator based on Minimum Covariance Determinant (MCD. After deletion of these points, the confirmatory stage carries out an Ordinary Least Squares (OLS analysis on the remaining subset of data and investigates the effect of adding back in the previously deleted observations. Cut-off points pertinent to different diagnostics are generated by bootstrapping and the cases are definitely labelled as good-leverage, bad-leverage, vertical outliers and typical cases. The procedure is applied to four examples.
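The two-stage logic, flag suspect cases with a robust distance and then calibrate the cut-off by bootstrapping, can be sketched as follows. For brevity a median/MAD fit replaces the paper's MCD-based robust distance estimator, and the data and cut-off rule are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy regression data with two planted outliers
x = rng.uniform(0, 10, 60)
y = 2.0 * x + 1.0 + rng.normal(scale=1.0, size=60)
x[0], y[0] = 9.5, 5.0      # bad-leverage point
y[1] += 12.0               # vertical outlier

# stage 1 (simplified): robust residuals from a crude median/MAD fit
slope = np.median((y - np.median(y)) / (x - np.median(x) + 1e-9))
resid = y - (np.median(y) + slope * (x - np.median(x)))
mad = 1.4826 * np.median(np.abs(resid - np.median(resid)))
rz = np.abs(resid - np.median(resid)) / mad        # robust |z|-scores

# stage 2: bootstrap-style cut-off from clean (null) normal samples
B = 1000
cuts = np.empty(B)
for b in range(B):
    e = rng.normal(size=60)
    m = 1.4826 * np.median(np.abs(e - np.median(e)))
    cuts[b] = np.percentile(np.abs(e - np.median(e)) / m, 97.5)
cutoff = np.median(cuts)

flagged = np.where(rz > cutoff)[0]
print("flagged cases:", flagged)
```

Calibrating the cut-off by simulation rather than by a fixed tabled value is what lets the procedure adapt its diagnostics to the sample size at hand.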
Fourier Descriptor Analysis and Unification of Voice Range Profile Contours: Method and Applications
Pabon, Peter; Ternstrom, Sten; Lamarche, Anick
2011-01-01
Purpose: To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. Method: A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the…
An assessment of particle filtering methods and nudging for climate state reconstructions
S. Dubinkina (Svetlana); H. Goosse
2013-01-01
Using the climate model of intermediate complexity LOVECLIM in an idealized framework, we assess three data-assimilation methods for reconstructing the climate state. The methods are a nudging, a particle filter with sequential importance resampling, and a nudging proposal particle
African Journals Online (AJOL)
Oyeyemi, GM. Vol 14, No 2 (2008) - Articles: Comparison of bootstrap and jackknife methods of re-sampling in estimating population parameters. ISSN: 1118-0579.
Explanation of Two Anomalous Results in Statistical Mediation Analysis
Fritz, Matthew S.; Taylor, Aaron B.; MacKinnon, David P.
2012-01-01
Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special…
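The difference between the percentile and bias-corrected bootstrap tests discussed above comes down to one extra step: shifting the percentile points by the normal quantile z0 of the fraction of replicates falling below the point estimate. A minimal sketch for an indirect (mediation) effect, with simulated data and toy effect sizes, is:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# simulated mediation data: X -> M -> Y, with toy effects a = 0.5, b = 0.4
n = 100
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)
Y = 0.4 * M + rng.normal(size=n)

def indirect(X, M, Y):
    """Indirect effect a*b: M on X, then Y on M controlling for X."""
    a = np.polyfit(X, M, 1)[0]
    design = np.column_stack([np.ones(len(X)), X, M])
    b = np.linalg.lstsq(design, Y, rcond=None)[0][2]
    return a * b

obs = indirect(X, M, Y)

# nonparametric bootstrap over cases
B = 2000
idx = rng.integers(0, n, size=(B, n))
ab = np.array([indirect(X[i], M[i], Y[i]) for i in idx])

pct = np.percentile(ab, [2.5, 97.5])          # plain percentile interval

# bias-corrected (BC) interval: shift the percentile points by z0
nd = NormalDist()
z0 = nd.inv_cdf(np.mean(ab < obs))            # bias-correction constant
za = nd.inv_cdf(0.025)
bc = np.percentile(ab, [100 * nd.cdf(2 * z0 + za),
                        100 * nd.cdf(2 * z0 - za)])
print(f"indirect effect {obs:.3f}; percentile {pct.round(3)}; BC {bc.round(3)}")
```

When z0 is nonzero the BC interval is pulled away from the plain percentile interval, which is exactly the mechanism the cited studies implicate in the elevated Type I error rates of the bias-corrected tests.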
Housing price forecastability: A factor analysis
DEFF Research Database (Denmark)
Bork, Lasse; Møller, Stig Vinther
of the model stays high at longer horizons. The estimated factors are strongly statistically significant according to a bootstrap resampling method which takes into account that the factors are estimated regressors. The simple three-factor model also contains substantial out-of-sample predictive power...
Research Article Special Issue
African Journals Online (AJOL)
2018-01-15
Jan 15, 2018 ... In doing so, survey questions from previous studies were adopted and customized to collect data. .... bootstrapping method (1000 resamples) was employed [35]. ... commitment from shareholders, customers, suppliers and community. ... clusters in Latin America. ... job satisfaction in a cross-cultural context.
International Nuclear Information System (INIS)
Zhang, X.M.; Wan, B.N.
2005-01-01
Significant improvements of plasma performance after ICRF boronization have been achieved over the full range of HT-7 operation parameters. The electron power balance is analyzed in steady-state ohmic discharges of the HT-7 tokamak. The ratio of the total radiation power to the ohmic input power increases with increasing central line-averaged electron density, but decreases with plasma current; it is clearly reduced after wall conditioning. The electron heat diffusivity χe deduced from the power balance analysis is reduced throughout the main plasma after boronization, and χe decreases with increasing central line-averaged electron density in the parameter range of our study. After boronization, the plasma current profile is broadened and a higher current can be obtained more easily in the HT-7 tokamak experiment. The increase in bootstrap current after boronization is expected to explain these phenomena: the plasma pressure gradient and the electron temperature near the boundary are larger than before, and these factors raise the ratio of bootstrap current to total plasma current from a few percent to above 10%
International Nuclear Information System (INIS)
Huang Qian-Hong; Gong Xue-Yu; Lu Xing-Qiang; Yu Jun; Cao Jin-Jia
2015-01-01
The density profile of fast ions arising from a tangentially injected diffuse neutral beam in a tokamak plasma is calculated. The effects of the mean free path and the beam tangency radius on the density profile are discussed under typical HL-2A plasma parameters. The results show that the profile of fast ions is strongly peaked at the center of the plasma when the mean free path at the maximum deuteron density is larger than the minor radius, while the peak value decreases when this mean free path exceeds twice the minor radius, owing to beam transmission loss. Moreover, the bootstrap current of fast ions for various mean free paths at the maximum deuteron density is calculated, and its density is shown to be closely related to the deposition of the neutral beam. With the electron return current considered, the net current density decreases markedly. Meanwhile, the peak central fast ion density increases as the beam tangency radius approaches the major radius, and the net bootstrap current increases rapidly with increasing beam tangency radius. (paper)
Directory of Open Access Journals (Sweden)
Simionescu, Mihaela
2014-12-01
Full Text Available The necessity of improving forecast accuracy has grown in the context of the current economic crisis, but few researchers have so far been interested in finding empirical strategies to improve their predictions. In this article, for the inflation rate forecasts on the horizon 2010-2012, we proved that the one-step-ahead forecasts based on updated AR(2) models for Romania and ARMA(1,1) models for Bulgaria could be substantially improved by generating new predictions using the Monte Carlo method and the bootstrap technique to simulate the models' coefficients. In this article we introduced a new methodology for constructing the forecasts, using the limits of the bias-corrected-accelerated (BCa) bootstrap intervals for the initial data series of the variable to predict. After evaluating the accuracy of the new forecasts, we found that all the proposed strategies improved the initial AR(2) and ARMA(1,1) forecasts. These techniques also improved the predictions of forecasting experts made for Romania and the forecasts of the European Commission made for Bulgaria. Our own method based on the lower limits of the BCa intervals generated the best forecasts. In the forecasting process based on ARMA models, an uncertainty analysis was introduced by calculating, under the hypothesis of a normal distribution, the probability that the predicted value exceeds a critical value. For 2013, in both countries, we anticipate a decrease in the degree of uncertainty for the annual inflation rate.
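The BCa interval construction the abstract relies on is available off the shelf. A minimal sketch, not the authors' procedure: `scipy.stats.bootstrap` with `method='BCa'` applied to a synthetic stand-in series (the paper's inflation data are not reproduced here).

```python
import numpy as np
from scipy.stats import bootstrap

# Hypothetical stand-in for a monthly inflation-rate series:
# 36 observations around 5% with standard deviation 1.5.
rng = np.random.default_rng(42)
inflation = rng.normal(5.0, 1.5, size=36)

# Bias-corrected-and-accelerated (BCa) bootstrap interval for the mean.
res = bootstrap((inflation,), np.mean, confidence_level=0.95,
                method='BCa', random_state=rng)
low, high = res.confidence_interval
```

In the article's strategy, the lower limit of such an interval (here `low`) is itself used as an input series for generating the improved forecasts.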
Directory of Open Access Journals (Sweden)
Alan Delgado de Oliveira
Full Text Available ABSTRACT In this paper, we provide an empirical discussion of the differences among several scenario tree-generation approaches for stochastic programming. We consider the classical Monte Carlo sampling and Moment matching methods. Moreover, we test the Resampled average approximation, which is an adaptation of Monte Carlo sampling, with Monte Carlo under a naive allocation strategy as the benchmark. We test the empirical effects of each approach on the stability of the problem objective function and the initial portfolio allocation, using a multistage stochastic chance-constrained asset-liability management (ALM) model as the application. The Moment matching and Resampled average approximation are more stable than the other two strategies.
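The difference between plain Monte Carlo scenario generation and a simple moment-matching variant can be sketched in one dimension (an illustration only; the paper's multistage tree construction is more involved): moment matching rescales a raw draw so its first two sample moments hit the targets exactly.

```python
import numpy as np

def monte_carlo_scenarios(mu, sigma, n, rng):
    """Plain Monte Carlo sampling of return scenarios."""
    return rng.normal(mu, sigma, size=n)

def moment_matched_scenarios(mu, sigma, n, rng):
    """Rescale a Monte Carlo draw so the sample mean and standard
    deviation match the targets exactly (a simple moment-matching variant)."""
    x = rng.normal(mu, sigma, size=n)
    return mu + (x - x.mean()) * sigma / x.std()
```

This is one reason moment matching tends to stabilize the objective value across scenario sets: the sampling noise in the low-order moments, which drives much of the in-sample variation, is removed by construction.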
Pang, Yi; Rong, Junchen; Su, Ning
2016-12-01
We consider φ³ theory in 6 − 2ε dimensions with F4 global symmetry. The beta function is calculated up to 3 loops, and a stable unitary IR fixed point is observed. The anomalous dimensions of operators quadratic or cubic in φ are also computed. We then employ the conformal bootstrap technique to study the fixed point predicted by the perturbative approach. For each putative scaling dimension of φ (Δ_φ), we obtain the corresponding upper bound on the scaling dimension of the second-lowest scalar primary in the 26 representation (Δ_26^2nd) which appears in the OPE of φ × φ. In D = 5.95, we observe a sharp peak on the upper-bound curve located at the Δ_φ value predicted by the 3-loop computation. In D = 5, we observe a weak kink on the upper-bound curve at (Δ_φ, Δ_26^2nd) = (1.6, 4).
El-Showk, Sheer; Poland, David; Rychkov, Slava; Simmons-Duffin, David; Vichi, Alessandro
2014-01-01
We use the conformal bootstrap to perform a precision study of the operator spectrum of the critical 3d Ising model. We conjecture that the 3d Ising spectrum minimizes the central charge c in the space of unitary solutions to crossing symmetry. Because extremal solutions to crossing symmetry are uniquely determined, we are able to precisely reconstruct the first several Z2-even operator dimensions and their OPE coefficients. We observe that a sharp transition in the operator spectrum occurs at the 3d Ising dimension Delta_sigma=0.518154(15), and find strong numerical evidence that operators decouple from the spectrum as one approaches the 3d Ising point. We compare this behavior to the analogous situation in 2d, where the disappearance of operators can be understood in terms of degenerate Virasoro representations.